A deep learning compiler and inference framework.
Here are a few of the many ways to get started:
- Try deepC with a Colab Notebook
- Install it on Ubuntu or Raspbian (or any other Debian derivative) with `pip install deepC`
- Compile an ONNX model: read this article or watch this video
- Use deepC with a Dockerfile
See more examples in the tutorials directory.
The deepC compiler and inference framework is designed to enable and run deep neural networks by focusing on the features of custom AI accelerators such as microcontrollers, eFPGAs, and CPUs, and of embedded devices such as the Raspberry Pi, Odroid, Arduino, SparkFun Edge, RISC-V boards, mobile phones, and x86 and ARM laptops, among others.
deepC also offers an ahead-of-time compiler that produces optimized executables, built on the LLVM compiler toolchain and specialized for deep neural networks, with ONNX as the front end.
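The core idea of ahead-of-time compilation is that a trained graph is specialized into native code once, before deployment, so no interpreter has to run on the target device. Below is a minimal pure-Python sketch of that idea; the op names and graph format are invented for illustration and are not deepC's actual IR or API:

```python
# Conceptual sketch of ahead-of-time compilation: a tiny "graph" of ops
# is specialized into a single composed callable before any input arrives.
# The op set and graph format here are illustrative only, not deepC's real IR.

OPS = {
    "add_bias": lambda x, c: [v + c for v in x],
    "relu":     lambda x, _: [max(v, 0.0) for v in x],
    "scale":    lambda x, c: [v * c for v in x],
}

def aot_compile(graph):
    """Fold a list of (op, constant) nodes into one callable, ahead of time."""
    def compiled(x):
        for op, const in graph:
            x = OPS[op](x, const)
        return x
    return compiled

# "Compile" once, then run many times on the device.
model = aot_compile([("add_bias", -1.0), ("relu", None), ("scale", 2.0)])
print(model([0.5, 2.0, -3.0]))  # [0.0, 2.0, 0.0]
```

In a real AOT compiler like deepC, the analogous step emits LLVM IR and then machine code for the target, rather than composing Python closures.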
The main components of deepC are designed to represent and optimize common deep learning networks in a high-level graph IR, and to transform the computation graph to minimize memory utilization, optimize data layout, and fuse computation patterns for different hardware backends.
Read more in the high-level design document.
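To make the fusion idea concrete, here is a toy optimization pass over a linear op sequence: it merges runs of adjacent element-wise ops into one fused node, so their intermediate results never need a memory buffer. The op and node names are invented for illustration and do not reflect deepC's real graph IR:

```python
# Toy graph-optimization pass: collapse consecutive element-wise ops into
# a single fused node so intermediates stay in registers, not memory.
# Op names are illustrative only, not deepC's actual IR.

ELEMENTWISE = {"add", "mul", "relu"}

def fuse_elementwise(nodes):
    """Rewrite a flat op list, fusing runs of 2+ element-wise ops."""
    fused, run = [], []

    def flush():
        if len(run) > 1:
            fused.append("fused(" + "+".join(run) + ")")
        elif run:
            fused.append(run[0])
        run.clear()

    for op in nodes:
        if op in ELEMENTWISE:
            run.append(op)
        else:
            flush()
            fused.append(op)
    flush()
    return fused

print(fuse_elementwise(["conv", "add", "relu", "conv", "mul"]))
# ['conv', 'fused(add+relu)', 'conv', 'mul']
```

Production compilers run passes like this over a full graph IR with dataflow edges, but the payoff is the same: fewer kernel launches and fewer intermediate tensors in memory.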
Build and start modifying dnnCompiler locally from source code with the following steps.

Follow these steps to install the prerequisites:

```bash
sudo apt-get update
sudo apt-get install build-essential python3.6-dev python3-pip swig doxygen clang-format clang clang-8 llvm-8 llvm-8-dev protobuf-compiler libprotoc-dev
sudo pip3 install numpy==1.15.0 onnx==1.5.0
```
Once you are done, build dnnCompiler:

```bash
git clone https://github.com/ai-techsystems/dnnCompiler.git
cd dnnCompiler
make
```
Make sure you have the prerequisites listed below.
Once you are done, build dnnCompiler inside a Docker container:

```bash
git clone https://github.com/ai-techsystems/dnnCompiler.git
cd dnnCompiler
python buildDocker.py
```
To format the source tree with clang-format before committing, run:

```bash
find include src swig -name \*.h -print0 -o -name \*.cpp -print0 | xargs -0 -P8 -n1 clang-format -i
```
A successful build produces output like this:

```
make -C src
make[1]: Entering directory 'dnnCompiler/src'
make -C core
make[2]: Entering directory 'dnnCompiler/src/core'
compiling broadcast.cpp
/usr/bin/g++ -O3 -Wall -std=c++14 -fPIC -march=native -msse2 \
    -isystem ./packages/eigen-eigen-323c052e1731 -I./include \
    -c broadcast.cpp -o obj/broadcast.o
compiling tensor.cpp
...
...
/usr/bin/g++ -shared ./obj/dnnc_swig.o ./obj/dnnc_pyutils.o ./obj/dnnc_api.o -o lib/libdnnc.so
ln -s -f lib/libdnnc.so _dnnc.so
/usr/bin/python3 ../test/swig/basic.py
```
| Supported Architectures | Status |
|---|---|
| Arm | ✔️ |
| Armv7 | ✔️ |
| Arm64 | ✔️ |
| AMD64 | ✔️ |
| ppc64le | ✔️ |
| Supported OS | Distributions | Status |
|---|---|---|
| Linux | Ubuntu 18.04 | ✔️ |
| Linux | CentOS 6 | ✔️ |
| Linux | Arch Linux | ✔️ |
| Linux | Manjaro | ✔️ |
| Windows | 1803 and above | ✔️ |
| Mac OS | Sierra and above | ✔️ |
dnnCompiler adopts the Apache committer model; we aim to create an open-source project that is maintained and owned by the community. Check out the Contributor Guide.
We acknowledge the efforts of predecessor projects such as LLVM and ONNX in making this project a reality.
dnnCompiler is targeted towards devices with a small form factor, like microcontrollers, which are part of all sorts of household devices: think appliances, cars, and toys. In fact, around 30 billion microcontroller-powered devices are produced each year. They're cheap, require very little energy, and are very reliable.
By bringing deep learning models to tiny microcontrollers, we can boost the intelligence of billions of devices that we use in our lives, without relying on expensive hardware or reliable internet connections. Imagine smart appliances that can adapt to your daily routine, intelligent industrial sensors that understand the difference between problems and normal operation, and magical toys that can help kids learn in fun and delightful ways.
🚧 Project under development. Stay tuned. We plan to release the first version in Nov. 2019.
This project exists thanks to all the people who contribute. [Contribute].
Become a financial contributor and help us sustain our community. [Contribute]
Support this project with your organization. Your logo will show up here with a link to your website. [Contribute]