
TLearn Ruby


Recurrent Neural Network library for Ruby, built on the tlearn neural network simulator (http://crl.ucsd.edu/innate/tlearn.html).

To see an example run:

rake example:train
rake example:fitness

Seriously? Just use FANN!

TLearn supports recurrent networks (http://en.wikipedia.org/wiki/Recurrent_neural_network); FANN does not. Recurrent networks maintain state, enabling the context of previous inputs to affect subsequent outputs.

While there have been attempts to add recurrent networks to FANN (http://leenissen.dk/fann/forum/viewtopic.php?t=47), these still sit on a dead branch that was never merged into master.

Installing TLearn

gem install tlearn

Usage

require 'tlearn'

tlearn = TLearn::Run.new(:number_of_nodes => 86,
                         :output_nodes    => 41..46,
                         :linear          => 47..86,
                         :weight_limit    => 1.00,
                         :connections     => [{1..81   => 0},
                                              {1..40   => :i1..:i77},
                                              {41..46  => 1..40},
                                              {1..40   => 47..86},
                                              {47..86  => [1..40, {:max => 1.0, :min => 1.0}, :fixed, :one_to_one]}])
                 
  
training_data = [{[0] * 77 => [1, 0, 0, 0, 0, 0]},
                 {[1] * 77 => [0, 0, 0, 0, 0, 1]}]
  
tlearn.train(training_data, sweeps = 200)

tlearn.fitness([0] * 77, sweeps = 200)
# => ["0.016", "0.013", "0.022", "0.020", "0.463", "0.467"]

You will often separate the training and fitness processes. By specifying a working directory when training, you can reuse the trained network for later fitness evaluations.

tlearn.train(training_data, sweeps = 200, working_dir='/training_session/')
tlearn.fitness([0] * 77, sweeps = 200, working_dir = '/training_session/')
# => ["0.016", "0.013", "0.022", "0.020", "0.463", "0.467"]

Configuring TLearn (What the heck does all that config mean?)

Yes, it's complicated configuring this thing. Let's work through the different configuration options:

#Total number of nodes in the network (not counting input).
:number_of_nodes => 86

#The nodes that are used for output.
:output_nodes => 41..46

#nodes 1-10 are linear. Linear nodes output the inner product of the input and weight vectors.
:linear_nodes => 1..10

#nodes 1-10 are bipolar. Bipolar nodes produce output ranging continuously from -1 to +1.
:bipolar_nodes => 1..10

#weights between nodes will not exceed 1.00
:weight_limit => 1.00
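As a rough illustration of the node types above, here is a Ruby sketch (not the gem's actual C implementation; the tanh choice for bipolar nodes is an assumption — the README only states the output range):

```ruby
# Linear node: output is the inner product of the input and weight vectors.
def linear_activation(inputs, weights)
  inputs.zip(weights).sum { |i, w| i * w }
end

# Bipolar node: output squashed continuously into -1..+1.
# (tanh is assumed here purely for illustration.)
def bipolar_activation(inputs, weights)
  Math.tanh(linear_activation(inputs, weights))
end

linear_activation([1.0, 0.5], [0.2, 0.4])  # => 0.4
bipolar_activation([1.0, 0.5], [0.2, 0.4]) # => ~0.3799, always within -1..+1
```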

Connections

Here we specify how all of our nodes are connected, the architecture of the neural network.

We use ranges to specify connections between nodes:

1..3 => 4..6

Indicates connections:

node 1 <- node 4 
node 1 <- node 5
node 1 <- node 6

node 2 <- node 4
node 2 <- node 5
node 2 <- node 6

node 3 <- node 4
node 3 <- node 5
node 3 <- node 6

Note that the nodes specified first (1..3) are the destination nodes; the second range (4..6) specifies the source nodes. The source nodes feed into the destination nodes.
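The expansion in the table above can be sketched in plain Ruby (a hypothetical helper, not part of the tlearn gem):

```ruby
# Expand a `destinations => sources` range pair into individual
# [destination, source] connections, as in the table above.
def expand_connections(destinations, sources)
  destinations.flat_map do |dest|
    sources.map { |src| [dest, src] }
  end
end

expand_connections(1..3, 4..6)
# => [[1, 4], [1, 5], [1, 6], [2, 4], [2, 5], [2, 6], [3, 4], [3, 5], [3, 6]]
```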

Make sure you feed the input nodes into the network.

#nodes 1-6 are fed by input nodes 1-10.
1..6 => :i1..:i10

We can also add constraints to the different connections:

#nodes 1-6 connections with nodes 7-9 will have weights never less than 1 or greater than 10.
1..6 => [7..9, {:min => 1, :max => 10}]

#nodes 1-6 are fed from node 0 (node 0 is always the bias node).
1..6 => 0

#nodes 1-6 connections with nodes 7-10 are fixed at their initialization values and will not change throughout learning.
1..6 => [7..10, :fixed]

#nodes 1-6 connections with nodes 7-9 are fixed at weight 2 and will not change throughout learning.
1..6 => [7..9, {:min => 2, :max => 2}]

#nodes 1-3 connections with nodes 7-9 are fixed at weight 1.
1..3 => [7..9, {:min => 1.0, :max => 1.0}, :fixed, :one_to_one]

one_to_one changes the way connections are mapped. Instead of each destination node connecting to every source node, there is a 1-1 mapping between nodes:

For example:

node 1 -> node 7
node 2 -> node 8
node 3 -> node 9
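The one-to-one mapping above is just a pairwise zip of the two ranges; a hypothetical helper to illustrate:

```ruby
# With :one_to_one, the Nth destination node connects only to the
# Nth source node, rather than to every source node.
def one_to_one_connections(destinations, sources)
  destinations.zip(sources.to_a)
end

one_to_one_connections(1..3, 7..9)
# => [[1, 7], [2, 8], [3, 9]]
```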

There is also the TLearn manual if you want to read more:

http://blog.josephwilk.net/uploads/tlearn.pdf

C TLearn Library

The original TLearn C library was written by:

  • Kim Plunkett
  • Jeffrey L. Elman

Contributors

License

(The MIT License)

Copyright (c) 2012-2016 Joseph Wilk

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


tlearn-rb's Issues

FileUtils not loaded by default in Ruby

FileUtils is not loaded by default, so I have an error like this:

`clear_entire_training_data': uninitialized constant TLearn::RunTLearn::FileUtils (NameError)

Adding require 'fileutils' solves the problem, but it should be present in the gem to avoid having to add it each time.

Dynamic output size

Currently the maximum output size is fixed at 255.
It should be dynamic, based on the config.

Either load config earlier or malloc during method.

Spec fail on Travis

Do you have an idea why TLearn fails on Travis (and on my Ubuntu 12.04)?

I know the TLearn C library is quite old. What environment are you using to test? (Mac?)

ERROR: Can't open .cf file: No such file or directory

Cloning the tlearn-rb repo and running rake gives me the following error:

~/.rvm/rubies/ruby-1.9.3-p448/bin/ruby -S rspec spec_integration/tlearn_ext_spec.rb spec_integration/tlearn_spec.rb
..ERROR: Can't open .cf file: No such file or directory

Any pointers?

No 'new' method in 0.0.6

Hi,

I have installed TLearn 0.0.6 with RubyGems. When I tried your example (in the README) I get this :

`<main>': undefined method `new' for TLearn:Module (NoMethodError)

make tlearn fails

$ gem install tlearn
# Building native extensions.  This could take a while...
# ERROR:  Error installing tlearn:
#   ERROR: Failed to build gem native extension.

#     ~/.rbenv/versions/2.1.2/bin/ruby extconf.rb
# creating Makefile

# make "DESTDIR=" clean

# make "DESTDIR="
# compiling activate.c
# activate.c:10:8: warning: extra tokens at end of #endif directive [-Wextra-tokens]
# #endif  EXP_TABLE
#         ^
#         //
# activate.c:19:1: warning: 'extern' ignored on this declaration [-Wmissing-declarations]
# extern  struct  cf {
# ^
# activate.c:28:1: warning: 'extern' ignored on this declaration [-Wmissing-declarations]
# extern  struct  nf {
# ^
# activate.c:42:1: warning: type specifier missing, defaults to 'int' [-Wimplicit-int]
# act_nds(aold,amem,anew,awt,local,atarget)
# ^~~~~~~
# activate.c:119:9: warning: implicitly declaring library function 'exit' with type 'void (int) __attribute__((noreturn))'
#                                                                 exit(1);
#                                                                 ^
# activate.c:119:9: note: please include the header <stdlib.h> or explicitly provide a declaration for 'exit'
# activate.c:220:1: warning: control may reach end of non-void function [-Wreturn-type]
# }
# ^
#6 warnings generated.
# compiling arrays.c
# arrays.c:49:5: warning: implicit declaration of function 'free' is invalid in C99 [-Wimplicit-function-declaration]
#     free(error_values);
#     ^
# arrays.c:72:1: warning: type specifier missing, defaults to 'int' [-Wimplicit-int]
# make_arrays()
# ^~~~~~~~~~~
# arrays.c:84:3: warning: implicitly declaring library function 'exit' with type 'void (int) __attribute__((noreturn))'
#                 exit(1);
#                 ^
# arrays.c:84:3: note: please include the header <stdlib.h> or explicitly provide a declaration for 'exit'
# arrays.c:124:60: warning: format specifies type 'int' but the argument has type 'unsigned long' [-Wformat]
#                 printf("wt malloc failed--needed %d bytes for pointers", nn*sizeof(float *));
#                                                  ~~                      ^~~~~~~~~~~~~~~~~~
#                                                  %lu
# arrays.c:197:1: warning: control may reach end of non-void function [-Wreturn-type]
# }
# ^
# arrays.c:199:1: warning: type specifier missing, defaults to 'int' [-Wimplicit-int]
# free_parrays(){
# ^~~~~~~~~~~~
# arrays.c:213:1: warning: control reaches end of non-void function [-Wreturn-type]
# }
# ^
# arrays.c:215:1: warning: type specifier missing, defaults to 'int' [-Wimplicit-int]
# make_parrays()
# ^~~~~~~~~~~~
# arrays.c:261:1: warning: control may reach end of non-void function [-Wreturn-type]
# }
# ^
#9 warnings generated.
# compiling compute.c
# compute.c:27:1: warning: 'extern' ignored on this declaration [-Wmissing-declarations]
# extern  struct  nf {
# ^
# compute.c:40:1: warning: type specifier missing, defaults to 'int' [-Wimplicit-int]
# comp_errors(aold,atarget,aerror,e,ce_e)
# ^~~~~~~~~~~
# compute.c:65:4: warning: implicitly declaring library function 'exit' with type 'void (int) __attribute__((noreturn))'
#                         exit(1);
#                         ^
# compute.c:65:4: note: please include the header <stdlib.h> or explicitly provide a declaration for 'exit'
# compute.c:110:1: warning: control may reach end of non-void function [-Wreturn-type]
# }
# ^
# compute.c:113:1: warning: type specifier missing, defaults to 'int' [-Wimplicit-int]
# comp_deltas(apold,apnew,awt,adwt,aold,anew,aerror)
# ^~~~~~~~~~~
# compute.c:215:2: error: non-void function 'comp_deltas' should return a value [-Wreturn-type]
#         return;
#         ^
# compute.c:218:1: warning: type specifier missing, defaults to 'int' [-Wimplicit-int]
# comp_backprop(awt,adwt,aold,amem,atarget,aerror,local)
# ^~~~~~~~~~~~~
# compute.c:401:2: error: non-void function 'comp_backprop' should return a value [-Wreturn-type]
#         return;
#         ^
#6 warnings and 2 errors generated.
# make: *** [compute.o] Error 1

# make failed, exit code 2

# Gem files will remain installed in ~/.rbenv/versions/2.1.2/lib/ruby/gems/2.1.0/gems/tlearn-0.0.8 for inspection.
# Results logged to ~/.rbenv/versions/2.1.2/lib/ruby/gems/2.1.0/extensions/x86_64-darwin-13/2.1.0-static/tlearn-0.0.8/gem_make.out

error

irb
irb(main):001:0> require "tlearn"
Traceback (most recent call last):
        10: from /usr/local/bin/irb:23:in `<main>'
         9: from /usr/local/bin/irb:23:in `load'
         8: from /usr/local/share/gems/gems/irb-1.2.4/exe/irb:11:in `<top (required)>'
         7: from (irb):1
         6: from /usr/local/share/ruby/site_ruby/rubygems/core_ext/kernel_require.rb:156:in `require'
         5: from /usr/local/share/ruby/site_ruby/rubygems/core_ext/kernel_require.rb:168:in `rescue in require'
         4: from /usr/local/share/ruby/site_ruby/rubygems/core_ext/kernel_require.rb:168:in `require'
         3: from /home/user/.gem/ruby/2.7.0/gems/tlearn-0.0.8/lib/tlearn.rb:22:in `<top (required)>'
         2: from /usr/local/share/ruby/site_ruby/rubygems/core_ext/kernel_require.rb:92:in `require'
         1: from /usr/local/share/ruby/site_ruby/rubygems/core_ext/kernel_require.rb:92:in `require'
TypeError (no implicit conversion of nil into String)
