
pysonar's Introduction

PySonar

Decentralized Machine Learning Client


Sonar is a smart contract library that lets data scientists publish models they want trained to the ModelRepository, and lets anyone pick up those models and train them on their personal data.

You can find a working proof of concept in the notebooks directory.

Setup

Using Docker

Using Docker is the easiest way to get this running.

  1. Run docker-compose up. This will launch IPFS, the in-memory fake Ethereum blockchain with the smart contract, OpenMined mine.js, and the Jupyter notebooks.
  2. Open the Jupyter notebooks at http://localhost:8888.
  3. Step through the notebook and check the output of the earlier docker-compose up for information on what is happening.

Usage

Bootstrap environment

Before running the demo there are a couple of prerequisites you need to install.

Base libraries

Before installing the Python packages you need to make sure your system has the basic math libraries required for the encryption operations (the phe library), plus npm:

  • mpc: arithmetic of complex numbers with arbitrarily high precision and correct rounding of the result
  • mpfr: multiple-precision floating-point computations
  • gmp: GNU multiple-precision arithmetic library
  • npm: Node package manager

For macOS with Homebrew just run:

brew install libmpc mpfr gmp

For Debian-based Linux run:

apt-get install libgmp3-dev libmpfr-dev libmpc-dev python3-dev

Then install nvm and Node.js v8:

curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.5/install.sh | bash
nvm install v8

Solidity

The solidity tools are required to compile the contract of our demo. See installing solidity for instructions for your platform.

Truffle

In order to import ABIs you'll need to install truffle.

sudo npm install -g truffle

IPFS

As the neural network itself is too large to host on the blockchain, you need IPFS to host the model files. For installation see the IPFS installation page or run:

brew install ipfs

After installation is complete run ipfs init to initialize your local IPFS system.

PIP packages

Make sure you have a clean Python 3 install, then install all required packages:

pip install -r requirements.txt

PIP package maintenance

PySonar uses pip-tools (https://github.com/jazzband/pip-tools) to help maintain its pip packages.

To update all packages, periodically re-run

pip-compile --upgrade

Build local libraries

First, build and install the sonar package:

python setup.py install

Then make sure you also have the syft package properly installed. Head over to the repository and follow its instructions.

Import Smart Contract ABI

The interface for our Sonar smart contract is distributed via an npm package. You can import the ModelRepository.abi file to your local environment by running

make import-abi

which will place the file at abis/ModelRepository.abi.
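To sanity-check the imported ABI, you can parse it with Python's json module. The fragment below is a minimal, hypothetical ABI entry for illustration; the real file is the one `make import-abi` writes to abis/ModelRepository.abi.

```python
import json

# Minimal, hypothetical ABI fragment; the real abis/ModelRepository.abi
# is produced by `make import-abi` and contains the full contract interface.
abi_json = """
[{"name": "getNumModels", "type": "function",
  "inputs": [], "outputs": [{"name": "", "type": "uint256"}]}]
"""

abi = json.loads(abi_json)
function_names = [e["name"] for e in abi if e.get("type") == "function"]
print(function_names)  # ['getNumModels']
```

With web3.py, a list parsed this way can be passed as the abi argument when constructing a contract object.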

Start

Once all installation steps are complete, set up your local mock environment:

# start the ipfs daemon in the background
ipfs daemon &
# run local ethereum mock
testrpc -a 1000

Now open a second shell, start the notebook, and follow its instructions:

jupyter notebook notebooks

Known issues

  • there have been reports of the brew installation of solidity not working properly

If you experience any problems while running this demo, please create a GitHub issue and help us improve.

License

Apache-2.0 by OpenMined contributors

pysonar's People

Contributors

4mber, alex-kattathra-johnson, amiteshp, anoff, axelhodler, davidrhodus, ecliptik, gavinuhma, iamtrask, jsn5, karthiktsaliki, kevinahuber, lucaslopes, olveirap, sagivo, saintograph, samsontmr, swaroopch


pysonar's Issues

Docker One-Liner on Tendermint

User Story A: As a user of our Docker One-Liner to run the entire OpenMined Platform, I want the ability to instead have a one-liner which downloads a docker image running the OpenMined platform on the Ethermint Tendermint blockchain.

Acceptance Criteria:

  • A single docker command exists with exactly the same functionality as the Ethereum Docker one-liner, except that the test blockchain being run is the Tendermint/Ethermint blockchain.

Federated Learning Demos

User Story: As a user of PySonar, I want the ability to run the Decentralized Learning Demo (https://github.com/OpenMined/PySonar/blob/master/notebooks/Sonar%20-%20Decentralized%20Learning%20Demo.ipynb) on Machine Learning Models other than those packaged in PySyft. However, I still want the gradients to be encrypted using Paillier Homomorphic Encryption.

[Example of Paillier Homomorphic Encryption:] https://github.com/OpenMined/PySyft/blob/master/notebooks/Syft%20-%20Paillier%20Homomorphic%20Encryption%20Example.ipynb

Acceptance Criteria:

  • Rebuild the demo from the notebook above using a Tensorflow model instead of Syft's Linear Classifier
  • Rebuild the demo from the notebook above using a PyTorch model instead of Syft's Linear Classifier
  • Rebuild the demo from the notebook above using a Keras model instead of Syft's Linear Classifier
  • Rebuild the demo from the notebook above using a DyNet model instead of Syft's Linear Classifier
  • In all demos above, encrypt the gradients using Paillier Homomorphic Encryption. By "gradients" I mean the total change in weights as a result of local training. Basically, during Step 2 of the model above, the Anonymous Patient downloads the model (say, a Tensorflow model), trains it on local data, then computes how much the weights changed (gradient = new_weights - old_weights). This gradient should then be converted to an encrypted PaillierTensor and uploaded to the blockchain.
  • Step 3 should decrypt the uploaded gradient when evaluating
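To make the last two criteria concrete, here is a sketch of the encrypt-then-decrypt round trip using a toy, textbook Paillier implementation in pure Python. The tiny fixed primes and the helper names are for illustration only; the real demos would use the phe library with production key sizes and fixed-point encoding of float weights.

```python
import random
from math import gcd

def keygen(p=999983, q=1000003):
    # Tiny fixed primes for illustration only; NOT secure.
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    mu = pow(lam, -1, n)                          # valid because g = n + 1
    return n, (lam, mu, n)

def encrypt(n, m):
    n2 = n * n
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    # With g = n + 1: g^m mod n^2 == 1 + m*n (mod n^2)
    return ((1 + m * n) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    n2 = n * n
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

pub_n, priv = keygen()

# "Gradient" = total change in weights from local training (integer-scaled
# here; real float weights would need a fixed-point encoding, as phe uses).
old_weights = [3, 14, 15]
new_weights = [4, 12, 18]
gradient = [nw - ow for nw, ow in zip(new_weights, old_weights)]

# Encrypt each component before upload; negatives are represented mod n.
enc_gradient = [encrypt(pub_n, g % pub_n) for g in gradient]

# Step 3: the model owner decrypts the uploaded gradient for evaluation.
decoded = [decrypt(priv, c) for c in enc_gradient]
decoded = [d if d <= pub_n // 2 else d - pub_n for d in decoded]
print(decoded)  # [1, -2, 3]
```

Paillier is additively homomorphic, so encrypted gradients from several miners can also be summed as ciphertexts (by multiplying them mod n²) before a single decryption.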

OpenMined Grid - Keras Demo

Helium - "OpenMined Grid" Prototype - Keras

Background: One of Sonar's main value propositions to the OpenMined ecosystem is the ability to create a marketplace for the training of ML models. This is a hybrid market for compute and data, as both the available computing power AND the available training data is distributed throughout the marketplace.

In this work, we wish to make independent progress on building the "compute marketplace" functionality of Sonar and PySonar, such that spare GPU time can be bought and sold in the marketplace for the purpose of training deep neural networks using Keras.

Part 1: User Stories

  • User Story A: As a Keras AI Researcher I want the ability to submit un-trained Neural Models to be trained by members of the OpenMined Grid. Furthermore, instead of renting access to a specific machine, I instead want to push my model to a queue which automatically allocates it to a machine of the appropriate size and cost given my constraints. I should be able to BOTH submit a model to this queue AND receive the trained model from the queue in a Jupyter Notebook. Finally, in this setup, I am also submitting all training data for the model to be trained on alongside the model to be trained. I provide the model, the training data, and I am willing to pay only for the compute power to train the model.
  • User Story B: As a GPU Owner, I want the ability to rent out spare time on my GPU to the OpenMined grid, such that I can earn a passive income in a secure way facilitating the training of Machine Learning models. I'm interested in doing this because while I love using my GPU, most of the time I don't use it, and when I do I (frankly) wish that I had 10 GPUs. I'm interested in participating in the OpenMined platform because I want to rent out the use of my 1 GPU so that I can occasionally have access to 10 or 100 at a time when I'm doing my research. It is my hope that I will still break even on the cost/revenue of this tradeoff while giving me the flexibility to scale up compute as needed to support my projects.

Part 2: Mockup

The following notebook shows an example of how OpenMined Grid could work. However, while the code is plausible, please consider it only for inspiration. Furthermore, the model in the notebook uses PySyft for Machine Learning, whereas this ticket uses Keras.

Part 3: Acceptance Criteria

  • A demo Jupyter notebook that trains a Keras model on a locally hosted OpenMined Grid blockchain. It should include instructions (at the top of the notebook) for how to setup your local ecosystem (all dependencies and commands) to be able to run the notebook itself.

Slow

https://github.com/OpenMined/sonar/blob/master/notebooks/Sonar%20-%20Decentralized%20Model%20Training%20Simulation%20(local%20blockchain).ipynb

The training process is quite slow. There's lots of low hanging fruit to make it faster, and I'd like to capture the issue here. The main bottlenecks as I see them are:

  1. Serialization and deserialization to IPFS (this is the big one)
  2. Blockchain Smart Contract Ops
  3. Homomorphic Encryption and Decryption
  4. Homomorphic Operations
  5. Deep Learning Algos Themselves

There are plenty of redundancies in the current code.

(1) simply happens more than is necessary, and it's usually entire Python objects (instead of, say, just the weights). Furthermore, it doesn't get cached in the Python clients and is instead re-serialized every time it's called (very expensive).
(2) many of the smart contract ops iterate through entire lists to handle gradients and such instead of constant time lookups.
(3) FV uses a server that wraps R which wraps C++... the obvious thing here is to wrap the C++ directly in Python. That should speed things up considerably.
(3) FV also has native support for vectors and matrices which we ignore for the sake of simplicity. Using them should make ops much faster than wrapping at the scalar level (which is what we currently use)
(4) if you can see faster ways to implement these, I'm all ears (perhaps by re-implementing Paillier in BLAS). There's also a GPU implementation of YASHE that's pretty cool.
(5) Since addition and multiplication by a constant are the fastest operations in HE, better mini-batching should allow for considerable speedups.
(5) the use of Momentum should also be quite useful given that all the gradients are public.

... this is the most obvious low-hanging fruit to me. I'd love to see more added below as you come across it.
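As a sketch of point (1), caching the serialized weights by value avoids re-pickling the same object on every IPFS call. The helper and cache below are hypothetical, not part of the current code:

```python
import hashlib
import pickle

# Hypothetical helper: memoize serialization of the weights alone, instead
# of re-pickling the entire model object on every IPFS submission.
_cache = {}  # weights tuple -> (sha256 digest, pickled bytes)

def serialize_weights(weights):
    key = tuple(weights)
    if key not in _cache:
        blob = pickle.dumps(list(weights), protocol=pickle.HIGHEST_PROTOCOL)
        _cache[key] = (hashlib.sha256(blob).hexdigest(), blob)
    return _cache[key]

digest, blob = serialize_weights([0.1, -0.2, 0.3])
digest2, _ = serialize_weights([0.1, -0.2, 0.3])  # cache hit: no re-pickle
assert digest == digest2
assert pickle.loads(blob) == [0.1, -0.2, 0.3]
```

The digest doubles as a stable identifier, so the client could also skip re-uploading to IPFS when the weights have not changed.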

docker-compose is broken in PySonar and mine.js

I am trying to run the demo notebooks in PySonar. However, docker-compose up fails with:
ERROR: Service 'pysonar-notebooks' failed to build: The command 'pip3 install scipy' returned a non-zero code: 1
for both the docker in PySonar and mine.js.
Please note that I have scipy installed already.

EncodingError: Object encoding error: Can't pickle <class 'syft.nn.linear.LinearClassifier'>: import of module 'syft.nn.linear' failed

When running the jupyter notebook, I ran into this error when in
Step 1: Cure Diabetes Inc Initializes a Model and Provides a Bounty. The ethereum testrpc and ipfs daemon are running in the docker container built by docker-compose.yml in Mine.js. Here's the full stack trace.

---------------------------------------------------------------------------
PicklingError                             Traceback (most recent call last)
~/.envs/openmined/lib/python3.6/site-packages/ipfsapi-0.4.1-py3.6.egg/ipfsapi/encoding.py in encode(self, obj)
    375         try:
--> 376             return pickle.dumps(obj)
    377         except pickle.PicklingError as error:

PicklingError: Can't pickle <class 'syft.nn.linear.LinearClassifier'>: import of module 'syft.nn.linear' failed

During handling of the above exception, another exception occurred:

EncodingError                             Traceback (most recent call last)
<ipython-input-18-0db73993ecda> in <module>()
     10                        target_error = 10000
     11                       )
---> 12 model_id = repo.submit_model(diabetes_model)

~/.envs/openmined/lib/python3.6/site-packages/sonar-0.1.0-py3.6.egg/sonar/contracts.py in submit_model(self, model)
    151         """
    152 
--> 153         ipfs_address = self.ipfs.add_pyobj(model.syft_obj)
    154         deploy_trans = self.get_transaction(model.owner,value=self.web3.toWei(model.bounty,'ether')).addModel([ipfs_address[0:32],ipfs_address[32:]],model.initial_error,model.target_error)
    155         return self.call.getNumModels()-1

~/.envs/openmined/lib/python3.6/site-packages/ipfsapi-0.4.1-py3.6.egg/ipfsapi/client.py in add_pyobj(self, py_obj, **kwargs)
   1999             str : Hash of the added IPFS object
   2000         """
-> 2001         return self.add_bytes(encoding.Pickle().encode(py_obj), **kwargs)
   2002 
   2003     def get_pyobj(self, multihash, **kwargs):

~/.envs/openmined/lib/python3.6/site-packages/ipfsapi-0.4.1-py3.6.egg/ipfsapi/encoding.py in encode(self, obj)
    376             return pickle.dumps(obj)
    377         except pickle.PicklingError as error:
--> 378             raise exceptions.EncodingError('pickle', error)
    379 
    380 

EncodingError: Object encoding error: Can't pickle <class 'syft.nn.linear.LinearClassifier'>: import of module 'syft.nn.linear' failed

Failed to run demo notebook

Hello there,
I ran into some errors when running cells in the notebook:
File "/Users/anaconda/lib/python2.7/site-packages/phe/command_line.py", line 117: print(serialised, file=output): SyntaxError: invalid syntax
Has anyone run into a similar error?

Error on docker-compose up

The error is:
ERROR: Cannot locate specified Dockerfile: Dockerfile.notebooks
When Dockerfile.notebooks was renamed to Dockerfile, the docker-compose.yml file wasn't updated accordingly.
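A hedged sketch of the fix, assuming the service definition in docker-compose.yml looks roughly like the following (service name taken from other reports above; the exact layout of the real file may differ):

```yaml
services:
  pysonar-notebooks:
    build:
      context: .
      # was: dockerfile: Dockerfile.notebooks -- the file has been renamed
      dockerfile: Dockerfile
```

Alternatively, the dockerfile key can be dropped entirely, since Compose defaults to a file named Dockerfile.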

KeyError: 'result'

I set up PySonar locally and am trying to run the demo notebook ('Sonar - Decentralizing Learning Demo').

When I run the cell under the heading 'Step 1: Cure Diabetes Inc Initializes a Model and Provides a Bounty' I get the following error:

(screenshot of the KeyError: 'result' traceback, 2017-11-10)

Would someone be able to help debug this?

Helium - OpenMined on Tendermint via Ethermint

Helium : OpenMined on Tendermint

Background: Everything we do with Ethereum, we want to do on Tendermint for 20x the speed. This project is about building the 2 big demos from Hydrogen (A notebook and a docker one-liner) to use Ethermint instead of Ethereum as the underlying blockchain.

Part 1: User Stories

  • User Story A: As a user of our Docker One-Liner to run the entire OpenMined Platform, I want the ability to instead have a one-liner which downloads a docker image running the OpenMined platform on the Ethermint Tendermint blockchain.
  • User Story B: As a developer of OpenMined, I want the ability to spin up Sonar and PySonar on Tendermint using the Ethermint platform (not using Docker... just by running it locally)

Reduce docker image size

The notebook image is huge and takes ages to load. We need to get its size down.
Foremost we should try to reduce the Python dependencies like scipy and scikit-learn.

We also have to look into ways to distribute the PySyft/PySonar libraries to other components. Base images and building from source feels wrong.

/cc @iamtrask @theoriginalalex

Error handling for end of stack

Current handling:

InsufficientDataBytes Traceback (most recent call last)
/usr/lib/python3.6/site-packages/web3/contract.py in call_contract_function(contract, function_name, transaction, *args, **kwargs)
828 try:
--> 829 output_data = decode_abi(output_types, return_data)
830 except DecodingError as e:

/usr/lib/python3.6/site-packages/eth_abi/abi.py in decode_abi(types, data)
108 stream = BytesIO(data)
--> 109 return decoder(stream)

/usr/lib/python3.6/site-packages/eth_abi/decoding.py in call(self, stream)
101 def call(self, stream):
--> 102 return self.decode(stream)
103

/usr/lib/python3.6/site-packages/eth_utils/functional.py in inner(*args, **kwargs)
32 def inner(*args, **kwargs):
---> 33 return callback(fn(*args, **kwargs))
34

/usr/lib/python3.6/site-packages/eth_abi/decoding.py in decode(cls, stream)
139 else:
--> 140 yield decoder(stream)
141

/usr/lib/python3.6/site-packages/eth_abi/decoding.py in call(self, stream)
101 def call(self, stream):
--> 102 return self.decode(stream)
103

/usr/lib/python3.6/site-packages/eth_abi/decoding.py in decode(cls, stream)
164 def decode(cls, stream):
--> 165 raw_data = cls.read_data_from_stream(stream)
166 data, padding_bytes = cls.split_data_and_padding(raw_data)

/usr/lib/python3.6/site-packages/eth_abi/decoding.py in read_data_from_stream(cls, stream)
246 cls.data_byte_size,
--> 247 len(data),
248 )

InsufficientDataBytes: Tried to read 32 bytes. Only got 0 bytes

The above exception was the direct cause of the following exception:

BadFunctionCallOutput Traceback (most recent call last)
in ()
5 target_error = 10000
6 )
----> 7 model_id = repo.submit_model(diabetes_model)

/usr/lib/python3.6/site-packages/sonar-0.1.0-py3.6.egg/sonar/contracts.py in submit_model(self, model)
157 deploy_tx.addModel(IPFSAddress().to_ethereum(ipfs_address),
158 model.initial_error, model.target_error)
--> 159 return self.call.getNumModels() - 1
160
161 def submit_gradient(self, from_addr, model_id, grad):

/usr/lib/python3.6/site-packages/web3/contract.py in call_contract_function(contract, function_name, transaction, *args, **kwargs)
849 )
850 )
--> 851 raise_from(BadFunctionCallOutput(msg), e)
852
853 normalized_data = [

/usr/lib/python3.6/site-packages/web3/utils/exception_py3.py in raise_from(my_exception, other_exception)
1 def raise_from(my_exception, other_exception):
----> 2 raise my_exception from other_exception

BadFunctionCallOutput: Could not decode contract function call getNumModels return data 0x for output_types ['uint256']

https://slack-files.com/T6963A864-F7MTFL2KW-e7a934a640
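One possible fix sketch: treat the empty return payload (0x) from an empty or un-migrated contract as "zero models" instead of letting the decode error propagate. The wrapper below is hypothetical; in the real client the callable would be self.call.getNumModels and the caught exception would be web3's BadFunctionCallOutput.

```python
def get_num_models(call_fn, default=0):
    """Invoke a zero-argument contract call, returning `default` when the
    node answers with an empty payload that cannot be ABI-decoded."""
    try:
        return call_fn()
    except Exception as err:  # web3 raises BadFunctionCallOutput here
        if "Could not decode contract function call" in str(err):
            return default
        raise

# Simulated calls standing in for self.call.getNumModels:
def healthy():
    return 3

def empty_contract():
    raise RuntimeError(
        "Could not decode contract function call getNumModels "
        "return data 0x for output_types ['uint256']")

print(get_num_models(healthy))         # 3
print(get_num_models(empty_contract))  # 0
```

String-matching the error message is brittle; catching the specific web3 exception type would be cleaner once the dependency is pinned.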

improve tooling

It should be easier to get the demo running. Possible solution: package all the prerequisites into a docker container that hosts the latest version of the notebook and expose the jupyter port to the host.

truffle compile truffle migrate fails

I am trying to run the PySonar demo on Decentralized Model Training (Local) by bootstrapping the environment. I have successfully installed all dependencies. However, I fail to get the ModelRepository address after running truffle compile and truffle migrate (the commands run without an error but without displaying anything).

Any help resolving this is appreciated.

dockerize the notebook

The (main) demo notebook within pysonar should be dockerized to allow running the platform using only docker and no local installations.

Acceptance Criteria:

  • Main notebook is automatically initialised when starting the container
  • (default) jupyter port is exposed to host system
  • container connects to blockchain container (see Sonar)
  • runs its own IPFS node

docker-compose up command is not running with version 3

System: Ubuntu 17.04

Steps: Run the "docker-compose up" command in the PySonar repository.

Result:

$ sudo docker-compose up

ERROR: Version in "./docker-compose.yml" is unsupported. You might be seeing this error because you're using the wrong Compose file version. Either specify a version of "2" (or "2.0") and place your service definitions under the services key, or omit the version key and place your service definitions at the root of the file to use version 1.
For more on the Compose file format versions, see https://docs.docker.com/compose/compose-file/

Workaround:
Changed the version from 3 to 2 in the docker-compose.yml file present in the PySonar directory, and now it is able to run successfully.

Purpose:
Reporting here so that we can solve this issue in the future if it comes up.

OpenMined Grid - DyNet Demo

Helium - "OpenMined Grid" Prototype - DyNet

Background: One of Sonar's main value propositions to the OpenMined ecosystem is the ability to create a marketplace for the training of ML models. This is a hybrid market for compute and data, as both the available computing power AND the available training data is distributed throughout the marketplace.

In this work, we wish to make independent progress on building the "compute marketplace" functionality of Sonar and PySonar, such that spare GPU time can be bought and sold in the marketplace for the purpose of training deep neural networks using DyNet.

Part 1: User Stories

  • User Story A: As a DyNet AI Researcher I want the ability to submit un-trained Neural Models to be trained by members of the OpenMined Grid. Furthermore, instead of renting access to a specific machine, I instead want to push my model to a queue which automatically allocates it to a machine of the appropriate size and cost given my constraints. I should be able to BOTH submit a model to this queue AND receive the trained model from the queue in a Jupyter Notebook. Finally, in this setup, I am also submitting all training data for the model to be trained on alongside the model to be trained. I provide the model, the training data, and I am willing to pay only for the compute power to train the model.
  • User Story B: As a GPU Owner, I want the ability to rent out spare time on my GPU to the OpenMined grid, such that I can earn a passive income in a secure way facilitating the training of Machine Learning models. I'm interested in doing this because while I love using my GPU, most of the time I don't use it, and when I do I (frankly) wish that I had 10 GPUs. I'm interested in participating in the OpenMined platform because I want to rent out the use of my 1 GPU so that I can occasionally have access to 10 or 100 at a time when I'm doing my research. It is my hope that I will still break even on the cost/revenue of this tradeoff while giving me the flexibility to scale up compute as needed to support my projects.

Part 2: Mockup

The following notebook shows an example of how OpenMined Grid could work. However, while the code is plausible, please consider it only for inspiration. Furthermore, the model in the notebook uses PySyft for Machine Learning, whereas this ticket uses DyNet.

Part 3: Acceptance Criteria

  • A demo Jupyter notebook that trains a DyNet model on a locally hosted OpenMined Grid blockchain. It should include instructions (at the top of the notebook) for how to setup your local ecosystem (all dependencies and commands) to be able to run the notebook itself.

docker-compose currently broken

Running docker-compose up fails with

pysonar-notebooks_1  | Traceback (most recent call last):
pysonar-notebooks_1  |   File "/usr/bin/jupyter-notebook", line 11, in <module>
pysonar-notebooks_1  |     sys.exit(main())
pysonar-notebooks_1  |   File "/usr/lib/python3.6/site-packages/jupyter_core/application.py", line 267, in launch_instance
pysonar-notebooks_1  |     return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
pysonar-notebooks_1  |   File "/usr/lib/python3.6/site-packages/traitlets/config/application.py", line 657, in launch_instance
pysonar-notebooks_1  |     app.initialize(argv)
pysonar-notebooks_1  |   File "<decorator-gen-7>", line 2, in initialize
pysonar-notebooks_1  |   File "/usr/lib/python3.6/site-packages/traitlets/config/application.py", line 87, in catch_config_error
pysonar-notebooks_1  |     return method(app, *args, **kwargs)
pysonar-notebooks_1  |   File "/usr/lib/python3.6/site-packages/notebook/notebookapp.py", line 1296, in initialize
pysonar-notebooks_1  |     self.init_webapp()
pysonar-notebooks_1  |   File "/usr/lib/python3.6/site-packages/notebook/notebookapp.py", line 1120, in init_webapp
pysonar-notebooks_1  |     self.http_server.listen(port, self.ip)
pysonar-notebooks_1  |   File "/usr/lib/python3.6/site-packages/tornado/tcpserver.py", line 142, in listen
pysonar-notebooks_1  |     sockets = bind_sockets(port, address=address)
pysonar-notebooks_1  |   File "/usr/lib/python3.6/site-packages/tornado/netutil.py", line 197, in bind_sockets
pysonar-notebooks_1  |     sock.bind(sockaddr)
pysonar-notebooks_1  | OSError: [Errno 99] Address not available
pysonar_pysonar-notebooks_1 exited with code 1

Issue was found by adamchen in Slack and reproduced by me.

running docker-compose up in the openmined/mine.js repo works fine.

Helium - "OpenMined Grid" Prototype

Helium : "OpenMined Grid" Prototype

Background: One of Sonar's main value propositions to the OpenMined ecosystem is the ability to create a marketplace for the training of ML models. This is a hybrid market for compute and data, as both the available computing power AND the available training data is distributed throughout the marketplace.

In this work, we wish to make independent progress on building the "compute marketplace" functionality of Sonar, such that spare GPU time can be bought and sold in the marketplace for the purpose of training deep neural networks. Furthermore, we wish for this compute marketplace to be generic to any framework.

Part 1: User Stories

  • User Story A: As an AI Researcher I want the ability to submit un-trained Neural Models to be trained by members of the OpenMined Grid. Furthermore, instead of renting access to a specific machine, I instead want to push my model to a queue which automatically allocates my model to a machine of the appropriate size and cost given my constraints. I should be able to BOTH submit a model to this queue AND receive the trained model from the queue in a Jupyter Notebook. Finally, in this setup, I am also submitting all training data for the model to be trained on alongside the model to be trained. I provide the model, the training data, and I am willing to pay only for the compute power to train the model.
  • User Story B: As a GPU Owner, I want the ability to rent out spare time on my GPU to the OpenMined grid, such that I can earn a passive income in a secure way facilitating the training of Machine Learning models. I'm interested in doing this because while I love using my GPU, most of the time I don't use it, and when I do I (frankly) wish that I had 10 GPUs. I'm interested in participating in the OpenMined platform because I want to rent out the use of my 1 GPU so that I can occasionally have access to 10 or 100 at a time when I'm doing my research. It is my hope that I will still break even on the cost/revenue of this tradeoff while giving me the flexibility to scale up compute as needed to support my projects.

Part 2: Mockup

The following notebook shows an example of how OpenMined Grid could work. However, while the code is plausible, please consider it only for inspiration. Furthermore, the model in the notebook uses PySyft for Machine Learning; we want external Deep Learning frameworks to also be compatible.

OpenMined Grid - Tensorflow Demo

Helium - "OpenMined Grid" Prototype - Tensorflow

Background: One of Sonar's main value propositions to the OpenMined ecosystem is the ability to create a marketplace for the training of ML models. This is a hybrid market for compute and data, as both the available computing power AND the available training data is distributed throughout the marketplace.

In this work, we wish to make independent progress on building the "compute marketplace" functionality of Sonar and PySonar, such that spare GPU time can be bought and sold in the marketplace for the purpose of training deep neural networks using Tensorflow.

Part 1: User Stories

  • User Story A: As a Tensorflow AI Researcher I want the ability to submit un-trained Neural Models to be trained by members of the OpenMined Grid. Furthermore, instead of renting access to a specific machine, I instead want to push my model to a queue which automatically allocates it to a machine of the appropriate size and cost given my constraints. I should be able to BOTH submit a model to this queue AND receive the trained model from the queue in a Jupyter Notebook. Finally, in this setup, I am also submitting all training data for the model to be trained on alongside the model to be trained. I provide the model, the training data, and I am willing to pay only for the compute power to train the model.
  • User Story B: As a GPU Owner, I want the ability to rent out spare time on my GPU to the OpenMined grid, such that I can earn a passive income in a secure way facilitating the training of Machine Learning models. I'm interested in doing this because while I love using my GPU, most of the time I don't use it, and when I do I (frankly) wish that I had 10 GPUs. I'm interested in participating in the OpenMined platform because I want to rent out the use of my 1 GPU so that I can occasionally have access to 10 or 100 at a time when I'm doing my research. It is my hope that I will still break even on the cost/revenue of this tradeoff while giving me the flexibility to scale up compute as needed to support my projects.

Part 2: Mockup

The following notebook shows an example of how OpenMined Grid could work. However, while the code is plausible, please consider it only for inspiration. Furthermore, the model in the notebook uses PySyft for Machine Learning, whereas this ticket uses Tensorflow.

Part 3: Acceptance Criteria

  • A demo Jupyter notebook that trains a Tensorflow model on a locally hosted OpenMined Grid blockchain. It should include instructions (at the top of the notebook) for how to setup your local ecosystem (all dependencies and commands) to be able to run the notebook itself.

Update Circle CI docker tags

After removing the develop branch, the way the tagging works needs to be updated:

the latest commit is published as the edge image, all git tags are published as same-named Docker tags, and the most recent tag is additionally aliased as latest (the default image)

OpenMined Grid - PyTorch Demo

Helium - "OpenMined Grid" Prototype - PyTorch

Background: One of Sonar's main value propositions to the OpenMined ecosystem is the ability to create a marketplace for the training of ML models. This is a hybrid market for compute and data, as both the available computing power AND the available training data is distributed throughout the marketplace.

In this work, we wish to make independent progress on building the "compute marketplace" functionality of Sonar and PySonar, such that spare GPU time can be bought and sold in the marketplace for the purpose of training deep neural networks using PyTorch.

Part 1: User Stories

  • User Story A: As a PyTorch AI Researcher I want the ability to submit un-trained Neural Models to be trained by members of the OpenMined Grid. Furthermore, instead of renting access to a specific machine, I instead want to push my model to a queue which automatically allocates it to a machine of the appropriate size and cost given my constraints. I should be able to BOTH submit a model to this queue AND receive the trained model from the queue in a Jupyter Notebook. Finally, in this setup, I am also submitting all training data for the model to be trained on alongside the model to be trained. I provide the model, the training data, and I am willing to pay only for the compute power to train the model.
  • User Story B: As a GPU Owner, I want the ability to rent out spare time on my GPU to the OpenMined grid, such that I can earn a passive income in a secure way facilitating the training of Machine Learning models. I'm interested in doing this because while I love using my GPU, most of the time I don't use it, and when I do I (frankly) wish that I had 10 GPUs. I'm interested in participating in the OpenMined platform because I want to rent out the use of my 1 GPU so that I can occasionally have access to 10 or 100 at a time when I'm doing my research. It is my hope that I will still break even on the cost/revenue of this tradeoff while giving me the flexibility to scale up compute as needed to support my projects.

Part 2: Mockup

The following notebook shows an example of how OpenMined Grid could work. However, while the code is plausible, please consider it only for inspiration. Furthermore, the model in the notebook uses PySyft for Machine Learning, whereas this ticket uses PyTorch.

Part 3: Acceptance Criteria

  • A demo Jupyter notebook that trains a PyTorch model on a locally hosted OpenMined Grid blockchain. It should include instructions (at the top of the notebook) for how to setup your local ecosystem (all dependencies and commands) to be able to run the notebook itself.

PySonar Runnable on Tendermint (Ethermint)

User Story: As a developer of OpenMined, I want the ability to spin up Sonar and PySonar on Tendermint using the Ethermint platform (not using Docker... just by running it locally)

Acceptance Criteria:

  • a demo notebook exists wherein I can run the demo located in this notebook on a local Tendermint Ethermint blockchain
  • much like this notebook, the notebook created should include full instructions for launching the demo (including where to download Ethermint deps, etc.)
