dann's People

Contributors

allcontributors[bot], and1can, danielbaumert, dependabot[bot], labnann, matiasvlevi, sharkace, shirrsho


dann's Issues

[🔷 Feature request ]: More manual tests & examples

Feature

More manual tests in test/manual-tests/browser and test/manual-tests/node.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Description

Examples have to be simple and easy to understand. You can use the templates test/manual-tests/browser/empty-example and test/manual-tests/node/empty-example to create a new one.
Maybe they could also be run as tests alongside the mocha unit tests.

[🔷 Feature request ]: RNNs for Dannjs

Feature

Recurrent Neural Networks for Dannjs.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other
  • Rann (New category for RNNs)

Description

Adding a new class for RNNs would be interesting. The class would likely be called Rann, for Recurrent Artificial Neural Network, as opposed to the Dann acronym, which stands for Deep Artificial Neural Network. This would be a pretty time-consuming feature to implement, and it would require lots of testing before we can publish the feature to the master branch. Creating another testing/dev branch for this feature may be necessary.

Examples

These are very early examples and might look completely different once we implement the feature.
This would create the RNN:

const rnn = new Rann(input_neurons, hidden_neurons, output_neurons);

We could feed a Rann model a set of sequences:

let rnn = new Rann(2, 20, 2);
// Train on a set of sequences followed by the target sequence.
rnn.train(
  [
    [1, 2],
    [3, 4],
    [5, 6],
    [7, 8]
  ],
  [9, 10]
);
// Feed the start of a sequence; the model predicts what comes next.
rnn.feed([
  [1, 2],
  [3, 4]
]);
// would return [5, 6]

Note

This is, of course, early speculation about a big addition to the library. It might take quite some time to create & test, and usability might change a whole lot throughout development. I'm assigning this issue to myself because I want to start working on it, but help and criticism are welcome.

[🔷 Feature request ]: Batch Back-Propagation

Feature

Batch Back-Propagation / Training

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Description

In batch back-propagation, we split the full input set into smaller batches. During training, the model feeds forward through all the input/target pairs of a batch without changing the weights or biases, only accumulating the gradients (I am not sure about the exact math behind it). Once the batch is complete, the weights and biases are updated based on an overall picture of the batch rather than a single input. This could help the model find the right weights and biases much faster than single-input back-propagation (see the self-contained sketch after the example below).

Examples

Here is how I would expect the final method to look.

// say the architecture is 1 -> 2 -> 2

let input = [1];
let target = [1, 0];
nn.backpropagate(input, target);
// for one input/target pair

input = [[1], [-1], [-5], [5], [2]];
target = [[1, 0], [0, 1], [0, 1], [1, 0], [1, 0]];

nn.backpropagate(input, target);
// the expression stays the same; internally it just needs to check
// Array.isArray(input[0]) to decide whether to batch train.
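
To illustrate the mechanics outside of Dann, here is a minimal, self-contained sketch of mini-batch training on a single linear neuron (plain JavaScript, not Dann internals): gradients are accumulated across the batch, and the weights are updated once at the end.

// Mini-batch gradient descent on y = w * x + b with MSE loss.
function batchStep(model, inputs, targets, lr) {
  let gradW = 0;
  let gradB = 0;
  for (let i = 0; i < inputs.length; i++) {
    const pred = model.w * inputs[i] + model.b;
    const err = pred - targets[i]; // dLoss/dPred for 0.5 * err^2
    gradW += err * inputs[i];      // accumulate, do not update yet
    gradB += err;
  }
  // One update per batch, averaged over all input/target pairs.
  model.w -= (lr * gradW) / inputs.length;
  model.b -= (lr * gradB) / inputs.length;
}

const model = { w: 0, b: 0 };
for (let epoch = 0; epoch < 200; epoch++) {
  batchStep(model, [1, 2, 3, 4], [2, 4, 6, 8], 0.05);
}
console.log(model.w, model.b); // approaches w = 2, b = 0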

Additional context

I think this is available in all major ML libraries because of its efficiency; it would also help in creating distributed training capabilities.

[🔷 Feature request ]: Aliases for feedForward & backpropagate

Feature

We could have aliases for the feedForward & backpropagate methods of the Dann class.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Description

These aliases would make the method names a little more uniform when we add the Rann class for RNNs, since feedForward & backpropagate would not be the most accurate terms for RNNs. See the issue on RNNs here.

Feedforward

Is located in src/classes/dann/methods/feedForward.js

backpropagate

Is located in src/classes/dann/methods/backpropagate.js

Examples

These would be the aliases:

  • Dann.prototype.backpropagate → Dann.prototype.train
  • Dann.prototype.feedForward → Dann.prototype.feed
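
A minimal sketch of how these aliases could be defined (simply pointing the new names at the existing methods, so both spellings keep working):

// Both names reference the same function objects.
Dann.prototype.train = Dann.prototype.backpropagate;
Dann.prototype.feed = Dann.prototype.feedForward;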

Note

It is important to note that we do not want to remove the backpropagate & feedForward names. I think having machine learning terms for methods helps users grasp what the neural network is doing, since they can look up the terms.

Small datasets should be included in the library

Small datasets typically used for testing should be included in the library as an easy way to access them (so you wouldn't need to define an XOR dataset just to test your neural network).
Datasets such as:

  • XOR
  • Counting in binary
  • Even/Odd classification in binary
  • Any other set of small data with a pattern to detect

[🔶 Change request ]: Activations should be case insensitive

Change

We could use toLocaleLowerCase so that activation functions can be specified without worrying about capitalization.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Examples

These statements would all be valid if we ignore capitalization:

nn.addHiddenLayer(x, 'leakyrelu');
nn.addHiddenLayer(x, 'LEAKYRELU');
nn.addHiddenLayer(x, 'leakyReLU');

Additional context

This also means all activation names need to be in lowercase, and when we add a new activation with Add.activation, we should also convert the input name to lowercase (a sketch follows the list below).

Changes should be in these methods:

  • Layer.stringToFunc parses the activation names into an object containing the derivative & the activation.
  • Add.activation adds new activations, and it also needs to respect the case insensitivity.
  • Other Dann methods using activation strings will need some minor adjustments.
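
Here is a sketch of the normalization, assuming the activations are stored in a lookup table keyed by lowercase names (the table shape below is illustrative, not Dann's actual internals):

const activations = {
  leakyrelu: {
    func: (x) => (x > 0 ? x : 0.01 * x),
    der: (x) => (x > 0 ? 1 : 0.01)
  }
};

function stringToFunc(name) {
  // Normalize once so every capitalization maps to the same entry.
  return activations[name.toLocaleLowerCase()];
}

stringToFunc('leakyReLU'); // same entry as stringToFunc('leakyrelu')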

[🔷 Feature request ]: Inline docs example linter

Feature

Linter for documentation examples.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Description

A grunt task that would lint the examples in the inline documentation & overwrite them in between the <code> tags.
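
As a rough sketch of how this could work (the task name, glob pattern, and regex below are illustrative assumptions, not existing project code), the task could extract the examples and run them through ESLint's Linter API; the overwriting step would then write the cleaned code back between the <code> tags:

const { Linter } = require('eslint');

module.exports = function (grunt) {
  grunt.registerTask('lint-doc-examples', function () {
    const linter = new Linter();
    for (const file of grunt.file.expand('src/**/*.js')) {
      const source = grunt.file.read(file);
      const blocks = source.match(/<code>([\s\S]*?)<\/code>/g) || [];
      for (const block of blocks) {
        const code = block.replace(/<\/?code>/g, '');
        for (const msg of linter.verify(code, { parserOptions: { ecmaVersion: 2020 } })) {
          grunt.log.warn(file + ': ' + msg.message);
        }
      }
    }
  });
};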

[🔶 Change request ]: Mobile friendly documentation

Change

We could adapt the documentation for mobile devices.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Examples

Here is an example of what could be done to the documentation on mobile:
[image: proposed mobile layout for the documentation]

Additional context

We are using handlebars with yui-doc for the documentation. Even if you have never used yui-doc or handlebars before, these tasks only require CSS knowledge, since the content/elements don't change.

The documentation templates are located in docs/yuidoc-dannjs-theme/partials/, and all the CSS is located in docs/yuidoc-dannjs-theme/assets/css/.

[🔷 Feature request ]: Derivative of Softmax

Feature

Softmax activation function.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Description

Here is the softmax function I wrote not so long ago:

/**
* Softmax function
* @method softmax
* @param z An array of numbers (vector)
* @return An array of numbers (vector)
**/
function softmax(z) {
  let ans = [];
  let denom = 0;
  for (let j = 0; j < z.length; j++) {
    denom += Math.exp(z[j]);
  }
  for (let i = 0; i < z.length; i++) {
    let top = Math.exp(z[i]);
    ans.push(top / denom);
  }
  return ans;
}

This function is not implemented in the repository yet.

For this function to work in a neural network, we would need to write its derivative. This might be a difficult task, since this function takes in & outputs vectors, represented as arrays.

These two functions would need to be implemented in src/core/functions/actfuncs.js.

For this function to work with a Dann model, we would also need to change how activations are handled, since softmax expects a vector instead of a single number value. I could work on that once the derivative is implemented.
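
For reference, here is a sketch of what the derivative could look like. Since softmax maps a vector to a vector, its derivative is a Jacobian matrix whose entry (i, j) is s_i * (delta_ij - s_j), where s = softmax(z):

// Sketch: Jacobian of softmax, an n x n matrix for an n-element input.
function softmaxDerivative(z) {
  const s = softmax(z);
  const jacobian = [];
  for (let i = 0; i < s.length; i++) {
    jacobian.push([]);
    for (let j = 0; j < s.length; j++) {
      jacobian[i].push(i === j ? s[i] * (1 - s[j]) : -s[i] * s[j]);
    }
  }
  return jacobian;
}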

[🔷 Feature request ]: Can we add a Change Log / Release Notes?

Feature

Unless I've missed it, there doesn't seem to be any public-facing documentation that describes the changes introduced in a given release.

E.g., comparing v2.4.0 to v2.4.1c it looks like some of the recent code-level changes may include:

  • renaming the various named activation/loss functions to be all lower case (but backward compatible to the mixedCase names because of the name.toLocaleLowerCase() calls added in parallel?)
  • minor changes to the minimized toFunction output (maybe a bug fix?)
  • a new saveLoss configuration option (that populates nn.losses?)

I suspect there are no "breaking" changes here (and didn't notice anything obviously broken when I upgraded locally) but it would be helpful to have some context for what has changed beyond spelunking in the diff between release tags. (Also if saveLoss does what I think it does that would be neat to add to the log(options) documentation.)

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Description

(See above)

Examples

For an elaborate example of this kind of documentation, see Electron.js's release notes, but for what it's worth I'd be satisfied with a simple CHANGES.md file or whatever in the root directory that describes:

a. breaking changes,
b. new features, and
c. bugs fixed

in a few short bullet points (the "bugs fixed" one is probably optional, since "fixed in release N" is probably added to the actual bug report anyway, and if anyone knows or cares about a specific issue they can probably find that info there).

It doesn't need to be exhaustive or especially detailed; I'm just looking for a clue from the contributors about the intended or expected impact of the changes in a given release.

Additional Context

Thanks for this library, BTW. I don't mean to appear ungrateful, but even rudimentary release notes would make it easier to upgrade versions with confidence, and it's hard for someone more on the "API consumer" side of things to contribute to that. (It's probably best for the contributors directly involved with the changes, or at least the troubleshooting, to capture that info.)

Function that converts a Dann model to dannData JSON object

Dann.save() is not using any function to convert a Dann model to a savable JSON object; the conversion happens directly in the Dann.save() function, which is not optimal. Creating a Dann function that outputs a JSON object, to be called in Dann.save(), would result in cleaner code. The function could also be called on its own to use the JSON object in another way. The decoder equivalent of this function is Dann.applyToModel(jsonObject). Documentation on this function is still missing; it would be relevant to add it when creating the converter function.
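
A sketch of what the converter could look like (the method name and the field names below are hypothetical, not the actual dannData format):

// Hypothetical converter; Dann.save() would call this instead of
// building the object inline. Field names are illustrative only.
Dann.prototype.toDataObject = function () {
  return {
    arch: this.arch,
    weights: this.weights,
    biases: this.biases,
    lr: this.lr
  };
};

// In Dann.save():
// const dannData = this.toDataObject();
// ...write dannData to a file or trigger a browser download...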

[🔷 Feature request ]: XOR multiple inputs

Feature

A function that would create an XOR dataset with X number of inputs.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Description

We currently have a static 2-input XOR dataset for testing/example purposes. We also have a makeBinary function that creates a dataset of binary digits with X bits, so you can create a custom dataset to test a neural network. What if XOR had a similar function allowing for the creation of a 3- or 4-input XOR?

Examples

const dataset = makeXOR(3);
console.log(dataset);
[
 {
  input: [0, 0, 0],
  output: [0]
 },
 {
  input: [0, 0, 1],
  output: [1]
 },
 {
  input: [0, 1, 0],
  output: [1]
 },
 {
  input: [0, 1, 1],
  output: [0]
 },
 //...
 {
  input: [1, 1, 1],
  output: [1]
 }
]

This is a 3-input XOR truth table for reference:

[image: 3-input XOR truth table]
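
A minimal sketch of such a generator: an n-input XOR outputs 1 exactly when an odd number of its inputs are 1 (parity), so we can enumerate all 2^n input combinations.

// Sketch: build an n-input XOR (parity) dataset.
function makeXOR(n) {
  const dataset = [];
  for (let i = 0; i < (1 << n); i++) {
    const input = [];
    let parity = 0;
    for (let b = n - 1; b >= 0; b--) {
      const bit = (i >> b) & 1;
      input.push(bit);
      parity ^= bit; // XOR of all bits so far
    }
    dataset.push({ input: input, output: [parity] });
  }
  return dataset;
}

console.log(makeXOR(3)); // 8 input/output pairs, as shown above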

[🔷 Feature request ]: Browser unit tests

Feature

Unit tests in a browser environment.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Description

Running the mocha unit tests in the browser.
We currently have unit tests with mocha that run in the command line with node. Having the same tests run in a browser environment would eliminate potential errors in the future.
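
A sketch of how the browser runner could be bootstrapped, assuming a test page that loads mocha.js, chai.js, the dann build, and the spec files via script tags (file names are illustrative):

// In the test page, after the script tags have loaded:
mocha.setup('bdd');   // register the describe()/it() globals
// ...spec files loaded here register their suites...
mocha.run();          // render results into the #mocha element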

[🔷 Feature request ]: numToString, the reverse of stringToNum

Feature

A static function that takes in an array of values and returns a string, just like Rann.stringToNum reversed. This method should be added to the RNN branch.

Type

  • Dann
  • Matrix
  • Layer
  • Rann
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Examples

Existing stringToNum method

Rann.stringToNum('hey');
// Returns [0.7604166666666666, 0.7291666666666666, 0.9375]

numToString method

Rann.numToString([0.7604166666666666, 0.7291666666666666, 0.9375]);
// Should return 'hey'
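
The sample values above are consistent with each character being encoded as (charCode - 31) / 96; assuming that encoding (inferred from the example output, not confirmed against the source), a sketch of the reverse mapping:

// Sketch: invert the assumed (charCode - 31) / 96 encoding.
Rann.numToString = function (values) {
  return values
    .map((v) => String.fromCharCode(Math.round(v * 96) + 31))
    .join('');
};

Rann.numToString([0.7604166666666666, 0.7291666666666666, 0.9375]); // 'hey'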

Additional context

Be sure to commit changes to the RNN branch.

[🔷 Feature request ]: Sine wave dataset for Rann

Feature

Segmented sine wave dataset for Rann.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • tests & examples
  • Other

Description

Just like we have the XOR & binary digit testing datasets for Dann models, it would be nice to have a testing dataset for the upcoming Rann model. The changes would have to be applied to the origin/RNN branch. The method creating the dataset should be referenced in the module.exports in src/io/exports.js.

The source file for this method should be in src/core/datasets/

Context

We train a Rann model this way: we feed an array of sequences to the Rann model. The sequence lengths must be the same as the number of input neurons the Rann model has.

rnn.train([
  [0, 1],
  [2, 3],
  [4, 5]
]);

We could technically store a sine wave as an array of sequences to later train a model with:

let data = [
  [0, /* sine values... */],
  [/* sine values continued... */],
  [/* sine values continued... */],
  [/* sine values continued... */]
];

Example

Here is an example of how the method could work.

let dataset = makeSinWave(sequence_length, total_length, resolution);
console.log(dataset);
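
Here is a sketch of a possible implementation, assuming the parameters mean: total_length samples of the sine function, one every resolution radians, split into sequences of sequence_length values (these semantics are an assumption):

// Sketch: sample sin(x) and split the samples into equal-length sequences.
function makeSinWave(sequenceLength, totalLength, resolution) {
  const dataset = [];
  let sequence = [];
  for (let i = 0; i < totalLength; i++) {
    sequence.push(Math.sin(i * resolution));
    if (sequence.length === sequenceLength) {
      dataset.push(sequence);
      sequence = [];
    }
  }
  return dataset;
}

console.log(makeSinWave(2, 6, 0.5));
// [ [0, 0.479...], [0.841..., 0.997...], [0.909..., 0.598...] ]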

[🔺 Malformed arrows when logging nn.toFunction ]

Bug description

The console.log output contains a function with malformed arrow statements.

To Reproduce

// logging the following snippet toFunction will produce malformed arrow statements
// ex. 
// a[1]=(1+Math.exp(-t)=>);a[2]=(1+Math.exp(-t)=>);a[3]=(1+Math.exp(-t)=>);  

// initialize Dann with 1 input and 1 output  
let nn = new Dann(1, 1)

// Number of neuron layers in the neural net  
nn.addHiddenLayer(3, "sigmoid")
nn.addHiddenLayer(3, "sigmoid")

// how to calculate output  
nn.outputActivation("sigmoid")

// assign random weights to layers  
nn.makeWeights()

// How fast should it learn?  
nn.lr = 0.1

// mean squared error loss
nn.setLossFunction("mse")

// show info about the neural network  
nn.log()


// Training data  
for(let count=0; count < 1000; count++) {
    let randNum = Math.random()*10 - 5
    nn.backpropagate([randNum], [randNum < 0 ? 0 : 1])
}
console.log(nn.loss)

// log the function
console.log(nn.toFunction())

// Logging the function produces the following -   
// function myDannFunction(input){let w=[];w[0]=[[118.16350397261459],[125.7198197305115],[-61.353668013979465]];w[1]=[[-0.3268324018128853,-0.10547783949606436,1.2385617474541086],[-4.756040201258138,-5.530586211507047,2.0654393849840065],[-6.638909077737027,-6.373098375160245,3.4436506766914343]];w[2]=[[-0.27843549703223947,-1.8499126518203834,-2.4900563442361467]];let b=[];b[0]=[[6.431720708819896],[5.712699746115524],[-8.928168116731628]];b[1]=[[-0.8442173421026961],[-1.8347328438421329],[-1.6789862537895264]];b[2]=[[1.2994856424588022]];let c=[1,3,3,1];let a=[];a[1]=(1+Math.exp(-t)=>);a[2]=(1+Math.exp(-t)=>);a[3]=(1+Math.exp(-t)=>);let l=[];l[0]=[];for(let i=0;i<1;i++){l[0][i]=[input[i]]};for(let i=1;i<4;i++){l[i]=[];for(let j=0;j<c[i];j++){l[i][j]=[0]}};for(let m=0;m<3;m++){for(let i=0;i<w[m].length;i++){for(let j=0;j<l[m][0].length;j++){let sum=0;for(let k=0;k<w[m][0].length;k++){sum+=w[m][i][k]*l[m][k][j]};l[m+1][i][j]=sum}};for(let i=0;i<l[m+1].length;i++){for(let j=0;j<l[m+1][0].length;j++){l[m+1][i][j]=l[m+1][i][j]+b[m][i][j]}};for(let i=0;i<l[m+1].length;i++){for(let j=0;j<l[m+1][0].length;j++){l[m+1][i][j]=a[m+1](l[m+1][i][j])}}};let o=[];for(let i=0;i<1;i++){o[i]=l[3][i][0]};return o}

// passing data to the model  
nn.feedForward([25], {log: true, decimals: 3})

Expected behavior

Generate a usable function

Actual behavior

Logging produces the following statements:

a[1]=(1+Math.exp(-t)=>);
a[2]=(1+Math.exp(-t)=>);
a[3]=(1+Math.exp(-t)=>);  
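
For comparison, a well-formed serialization of the sigmoid activation would presumably look something like this (a sketch of the expected shape, not the library's actual output):

a[1] = (t) => 1 / (1 + Math.exp(-t));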

Platform

  • [ x ] Browser
  • Nodejs
