
morb's Introduction

Morb: a modular RBM implementation in Theano

![Morb logo](http://github.com/benanne/morb/raw/master/morblogo.png)

Introduction

Morb is a toolbox for building and training Restricted Boltzmann Machine models in Theano. It is intended to be modular, so that a variety of different models can be built from their elementary parts. A second goal is for it to be extensible, so that new algorithms and techniques can be plugged in easily.

The elementary parts in question are different types of units, which can be connected with different types of parameters. A schematic diagram of the architecture is shown below.

A unit type defines the distribution of that unit. For example, binary units are Bernoulli distributed. Several unit types are available, and new ones can be defined easily.
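For instance, a binary unit's conditional distribution is a Bernoulli whose mean is the sigmoid of the unit's activation. Here is a minimal numpy sketch of that sampling step (illustrative only; Morb's actual implementation builds symbolic Theano expressions):

```python
import numpy

def sigmoid(x):
    return 1.0 / (1.0 + numpy.exp(-x))

def sample_binary_units(activations, rng):
    # binary units are Bernoulli distributed: the activation is squashed
    # through a sigmoid to get p(unit = 1), then each unit is sampled
    p = sigmoid(activations)
    return (rng.uniform(size=p.shape) < p).astype(p.dtype)

rng = numpy.random.RandomState(0)
samples = sample_binary_units(numpy.array([-100.0, 0.0, 100.0]), rng)
# a very negative activation gives p close to 0, a very positive one p close to 1
```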

The different types of parameters form the trainable part of the model. These include biases, regular weights, convolutional weights and third order weights, amongst others. New parameter types can be defined by specifying the terms they contribute to the activations of each of the units they tie, the term they contribute to the model energy function, and the gradient of the energy function with respect to the parameters.
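To make those three ingredients concrete for the regular weights W tying visibles v and hiddens h, the contributions follow from the standard RBM energy function. A minimal numpy sketch (the function names are illustrative, not Morb's API):

```python
import numpy

def energy_term(v, W, h):
    # term the weights contribute to the model energy: -v^T W h
    return -numpy.dot(v, numpy.dot(W, h))

def activation_term_visibles(W, h):
    # term contributed to the visible activations: W h
    return numpy.dot(W, h)

def activation_term_hiddens(W, v):
    # term contributed to the hidden activations: W^T v
    return numpy.dot(W.T, v)

def energy_gradient_W(v, h):
    # gradient of the energy term with respect to W: -v h^T
    return -numpy.outer(v, h)

v = numpy.array([1.0, 0.0])      # two visibles
h = numpy.array([1.0])           # one hidden
W = numpy.array([[2.0], [3.0]])  # 2 x 1 weight matrix
```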

To train the model, one has to specify how the parameters should be updated in each step of the training process. This is possible by defining updaters, which can be composed. For example, one can combine a contrastive divergence updater with a weight decay updater and a sparsity regularisation updater. Momentum can also be applied with a momentum updater, which encapsulates another updater. Some updaters, like the contrastive divergence updater, calculate parameter updates from statistics obtained from training data.
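A rough sketch of this composition idea, with plain numbers standing in for Morb's symbolic Theano expressions (the function names here are hypothetical, not Morb's API): each updater contributes a term, the terms are summed, and a momentum updater wraps the result.

```python
def cd_update(stats):
    # contrastive divergence: positive-phase minus negative-phase statistics
    return stats['positive'] - stats['negative']

def decay_update(param, weight_decay):
    # weight decay: push the parameter towards zero
    return -weight_decay * param

def momentum_update(velocity, inner_update, momentum=0.9):
    # momentum encapsulates another update: blend it with the previous step
    return momentum * velocity + inner_update

stats = {'positive': 1.0, 'negative': 0.25}
param, velocity = 2.0, 1.0
combined = cd_update(stats) + decay_update(param, 0.1)  # CD term plus decay term
step = momentum_update(velocity, combined)              # momentum wraps the combination
```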

Finally, a trainer is used to compile the symbolic parameter update expressions into a training function.

Schematic diagram of Morb's RBM architecture

Example

Below is a simple example, in which an RBM with binary visibles and binary hiddens is trained on an unspecified dataset using one-step contrastive divergence (CD-1), with some weight decay.

from morb import base, units, parameters, stats, updaters, trainers, monitors
import numpy
import theano.tensor as T

## define hyperparameters
learning_rate = 0.01
weight_decay = 0.02
minibatch_size = 32
epochs = 50

## load dataset
data = ...

## construct RBM model
rbm = base.RBM()

rbm.v = units.BinaryUnits(rbm) # visibles
rbm.h = units.BinaryUnits(rbm) # hiddens

## initial parameter values (shapes are illustrative)
n_visible, n_hidden = 784, 100
initial_W = numpy.random.normal(0, 0.01, (n_visible, n_hidden))
initial_bv = numpy.zeros(n_visible)
initial_bh = numpy.zeros(n_hidden)

rbm.W = parameters.ProdParameters(rbm, [rbm.v, rbm.h], initial_W) # weights
rbm.bv = parameters.BiasParameters(rbm, rbm.v, initial_bv) # visible bias
rbm.bh = parameters.BiasParameters(rbm, rbm.h, initial_bh) # hidden bias

## define a variable map that maps the 'input' units to Theano variables
initial_vmap = { rbm.v: T.matrix('v') }

## compute symbolic CD-1 statistics
s = stats.cd_stats(rbm, initial_vmap, visible_units=[rbm.v], hidden_units=[rbm.h], k=1)

## create an updater for each parameter variable
umap = {}
for variable in [rbm.W.W, rbm.bv.b, rbm.bh.b]:
    new_value = variable + learning_rate * (updaters.CDUpdater(rbm, variable, s) - weight_decay * updaters.DecayUpdater(variable))
    umap[variable] = new_value

## monitor reconstruction cost during training
mse = monitors.reconstruction_mse(s, rbm.v)
 
## train the model
t = trainers.MinibatchTrainer(rbm, umap)
train = t.compile_function(initial_vmap, mb_size=minibatch_size, monitors=[mse])

for epoch in range(epochs):
    costs = [m for m in train({ rbm.v: data })]
    print "MSE = %.4f" % numpy.mean(costs)

Disclaimer

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.


morb's Issues

the batch size in the image (2) at run time is different than at build time (10) for the ConvOp

I can't figure out how to correct this error:

ValueError: the batch size in the image (2) at run time is different than at build time (10) for the ConvOp.
Apply node that caused the error: ConvOp{('imshp', (1, 28, 28)),('kshp', (28, 28)),('nkern', 50),('bsize', 10),('dx', 1),('dy', 1),('out_mode', 'valid'),('unroll_batch', 5),('unroll_kern', 2),('unroll_patch', False),('imshp_logical', (1, 28, 28)),('kshp_logical', (28, 28)),('kshp_logical_top_aligned', True)}(Subtensor{int64:int64:}.0, Subtensor{::, ::, ::-1, ::-1}.0)
Inputs shapes: [(2, 1, 28, 28), (50, 1, 28, 28)]
Inputs strides: [(6272, 6272, 224, 8), (6272, 6272, -224, -8)]

I set mb_size=1 and it works, but I'd like to work with larger minibatches, so can you please help me out here?

Picklable models?

First off, thanks for this awesome library! I'm new to deep learning and Theano, but your library makes it so much easier.

I was wondering if there's a way to save (pickle) MORB models? I tried modifying example_basic.py by adding the following lines at the end:

import cPickle
with open('model.pkl', 'wb') as f:
    cPickle.dump(rbm, f, protocol=cPickle.HIGHEST_PROTOCOL)

but I get the following trace:

...
Epoch 49
MSE = 0.0026
Took 6.69 seconds
Traceback (most recent call last):
  File "example_basic.py", line 61, in <module>
    cPickle.dump(rbm, f, protocol=cPickle.HIGHEST_PROTOCOL)
cPickle.PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed

If I remove the protocol=cPickle.HIGHEST_PROTOCOL argument I get:

Epoch 49
MSE = 0.0026
Took 6.96 seconds
Traceback (most recent call last):
  File "example_basic.py", line 61, in <module>
    cPickle.dump(rbm, f)
  File "/<removed path for privacy>/python2.7/copy_reg.py", line 70, in _reduce_ex
    raise TypeError, "can't pickle %s objects" % base.__name__
TypeError: can't pickle function objects

Is there another way to save MORB models for later use?

Thanks,
Jorge

Monitors should be plain old Python functions

The Monitor base class doesn't do anything useful - monitors should just be Python functions that potentially take a StatsCollector and a Units instance as arguments. Should it turn out to be necessary to make them 'richer', it would be better to use annotations, which is more in line with Samplers and ActivationFunctions.
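The proposal above can be sketched as follows; the function below is a hypothetical plain-function monitor (numpy stand-ins for the symbolic quantities a StatsCollector would provide, not Morb's actual API):

```python
import numpy

def reconstruction_mse(data, reconstruction):
    # a monitor as a plain function: compare the data to its reconstruction
    # and return a single scalar, the mean squared error
    return numpy.mean((data - reconstruction) ** 2)

mse = reconstruction_mse(numpy.zeros(4), numpy.ones(4))
```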

MSE values increase when I train a Gaussian-binary RBM

Hi,
Thank you for your post, it helps me a lot.
I tried to train a Gaussian-binary RBM on my data, which consists of real numbers ranging from -100 to 100, but the value of the MSE monitor keeps increasing. I know RBM training uses the free energy function to compute gradients, so I want to ask: is it okay for the MSE monitor value to keep increasing during training?

Best,
Amos

ImportError: No module named 'base'

Problem: I get an import error from morb/__init__.py when I execute 'from morb import base, units, parameters, stats, updaters, trainers, monitors'

  • I downloaded all the code files. When I run the code, it raises the import error.
  • I then changed 'import base' to 'from . import base' in morb/__init__.py. That no longer raises the ImportError, but it raises a ModuleNotFoundError instead.
ImportError information: 
 File "D:mydir\example.py", line 1, in <module>
    from morb import base, units, parameters, stats, updaters, trainers, monitors
  File "D:mydir\morb\__init__.py", line 1, in <module>
    import base
ImportError: No module named 'base'
ModuleNotFoundError information: 
 File "D:mydir\example.py", line 1, in <module>
    from morb import base, units, parameters, stats, updaters, trainers, monitors
  File "D:mydir\morb\__init__.py", line 7, in <module>
    from .import updaters
  File "D:mydir\morb\updaters.py", line 2, in <module>
    import samplers
ModuleNotFoundError: No module named 'samplers'
