neural-fortran

A parallel neural net microframework. Read the paper here.

Features

  • Dense, fully connected neural layers
  • Convolutional and max-pooling layers (experimental, forward propagation only)
  • Stochastic and mini-batch gradient descent for back-propagation
  • Data-based parallelism
  • Several activation functions

Available layer types

| Layer type | Constructor name | Supported input layers | Rank of output array | Forward pass | Backward pass |
|---|---|---|---|---|---|
| Input (1-d and 3-d) | input | n/a | 1, 3 | n/a | n/a |
| Dense (fully-connected) | dense | input (1-d) | 1 | ✓ | ✓ |
| Convolutional (2-d) | conv2d | input (3-d), conv2d, maxpool2d | 3 | ✓ | ✗ |
| Max-pooling (2-d) | maxpool2d | input (3-d), conv2d, maxpool2d | 3 | ✓ | ✗ |
| Flatten | flatten | input (3-d), conv2d, maxpool2d | 1 | ✓ | ✗ |
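
As a rough sketch of how the constructors above compose into a model (assuming the nf module exports them under these names, as in the bundled examples; exact signatures may vary between versions), a small dense network can be declared like this:

program dense_net_sketch
  ! Minimal sketch: build a fully-connected network from the
  ! constructors listed in the table above. Layer sizes are arbitrary.
  use nf, only: network, input, dense
  implicit none
  type(network) :: net

  ! 1-d input of size 3 -> hidden dense layer of 5 -> output dense layer of 2
  net = network([input(3), dense(5), dense(2)])

end program dense_net_sketch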

Getting started

Get the code:

git clone https://github.com/modern-fortran/neural-fortran
cd neural-fortran

Dependencies

Required dependencies are:

  • A Fortran compiler
  • HDF5 (provided by your OS package manager or built from source)
  • h5fortran, json-fortran (both handled by neural-fortran's build systems, no need for a manual install)
  • fpm or CMake for building the code

Optional dependencies are:

  • OpenCoarrays (for parallel execution with GFortran)
  • BLAS or MKL (for external matmul calls)

Compilers tested include:

  • gfortran-9.4.0
  • ifort-2021.4
  • ifx-2021.4

Building with fpm

Building in serial mode

With gfortran, the following will create an optimized build of neural-fortran:

fpm build \
  --profile release \
  --flag "-fno-frontend-optimize -I$HDF5INC -L$HDF5LIB"

HDF5 is now a required dependency, so you have to provide it to fpm. The above command assumes that the HDF5INC and HDF5LIB environment variables are set to the include and library paths, respectively, of your HDF5 install. The -fno-frontend-optimize flag disables some frontend optimizations that may be harmful when building neural-fortran.
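
For example, on a Debian or Ubuntu system with the serial HDF5 development package installed, the two variables could be set roughly as follows (the paths are illustrative and depend on your system and HDF5 build):

export HDF5INC=/usr/include/hdf5/serial
export HDF5LIB=/usr/lib/x86_64-linux-gnu/hdf5/serial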

Building in parallel mode

If you use GFortran and want to run neural-fortran in parallel, you must first install OpenCoarrays. Once installed, use the compiler wrappers caf and cafrun to build and execute in parallel, respectively:

fpm build \
  --compiler caf \
  --profile release \
  --flag "-fno-frontend-optimize -I$HDF5INC -L$HDF5LIB"
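
To then launch a parallel run through fpm, you can hand the cafrun launcher to fpm's --runner option; for example, something like the following should start the MNIST example on 4 images (shown as a sketch; adjust the flags to match your build):

fpm run \
  --example mnist \
  --compiler caf \
  --profile release \
  --flag "-fno-frontend-optimize -I$HDF5INC -L$HDF5LIB" \
  --runner "cafrun -n 4"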

Testing with fpm

fpm test \
  --profile release \
  --flag "-fno-frontend-optimize -I$HDF5INC -L$HDF5LIB"

For the time being, you need to specify the same compiler flags to fpm test as you did in fpm build so that fpm knows it should use the same build profile.

See Fortran Package Manager for more info on fpm.

Building with CMake

Building in serial mode

mkdir build
cd build
cmake .. -DSERIAL=1
make

Tests and examples will be built in the bin/ directory.

Building in parallel mode

If you use GFortran and want to run neural-fortran in parallel, you must first install OpenCoarrays. Once installed, use the compiler wrappers caf and cafrun to build and execute in parallel, respectively:

FC=caf cmake ..
make
cafrun -n 4 bin/mnist # run MNIST example on 4 cores

Building with a different compiler

If you want to build with a different compiler, such as Intel Fortran, set the HDF5_ROOT environment variable to the root path of an HDF5 installation built with that compiler, and specify FC when issuing cmake:

FC=ifort cmake ..

for a parallel build of neural-fortran, or

FC=ifort cmake .. -DSERIAL=1

for a serial build.

Building with BLAS or MKL

To use an external BLAS or MKL library for matmul calls, run cmake like this:

cmake .. -DBLAS=-lblas

where the value of -DBLAS should point to the desired BLAS implementation, which has to be available in the linking path. This option is currently available only with gfortran.
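
For example, to link Intel MKL through its single dynamic runtime library instead of a reference BLAS (assuming MKL is installed and visible to the linker), a value along these lines should work:

cmake .. -DBLAS=-lmkl_rt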

Building in debug mode

To build with debugging flags enabled, type:

cmake .. -DCMAKE_BUILD_TYPE=debug

Running tests with CMake

Type:

ctest

to run the tests.

Examples

The easiest way to get a sense of how to use neural-fortran is to look at the examples, listed below in increasing order of complexity:

  1. simple: Approximating a simple, constant data relationship
  2. sine: Approximating a sine function
  3. mnist: Hand-written digit recognition using the MNIST dataset
  4. cnn: Creating a simple CNN using input, conv2d, maxpool2d, flatten, and dense layers, and running its forward pass.
  5. mnist_from_keras: Loading a pre-trained Keras model from an HDF5 file.

The examples also show you the extent of the public API that's meant to be used in applications, i.e. anything from the nf module.
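
For orientation, the core of the simplest example amounts to a loop along these lines (a schematic sketch only; the forward/backward/update method names follow the bundled example programs and may differ in other versions):

program training_sketch
  ! Schematic training loop: repeatedly run the forward pass, compute
  ! gradients against a target, and update the weights. Values are arbitrary.
  use nf, only: network, input, dense
  implicit none
  type(network) :: net
  real :: x(3), y(2)
  integer :: n

  net = network([input(3), dense(5), dense(2)])
  x = [0.2, 0.4, 0.6]
  y = [0.12, 0.34]

  do n = 1, 500
    call net % forward(x)   ! propagate the input through all layers
    call net % backward(y)  ! back-propagate the error for target y
    call net % update(1.)   ! apply the gradients with learning rate 1
  end do

end program training_sketch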

The MNIST example uses curl to download the dataset, so make sure it's installed on your system. Most Linux distributions ship it out of the box. The dataset is downloaded only the first time you run the example in any given directory.

If you're on Windows or otherwise don't have curl, download mnist.tar.gz directly and unpack it in the directory in which you will run the example program.
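
Once downloaded, unpack the archive in the directory from which you'll run the example, for instance:

tar xzf mnist.tar.gz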

API documentation

API documentation can be generated with FORD. Assuming you have FORD installed on your system, run

ford ford.md

from the neural-fortran top-level directory to generate the API documentation in doc/html. Point your browser to doc/html/index.html to read it.

Contributors

Thanks to all open-source contributors to neural-fortran.
