brunjlar / neural
Neural Nets in native Haskell
License: MIT License
Hi,
I'm trying to install neural 0.3.0.1 from Hackage, and I'm getting build errors.
I know that GitHub says the build is green; however, Hackage says the build is failing, here: https://hackage.haskell.org/package/neural-0.3.0.1/reports/
Any chance this could be fixed?
This is my build log:
[13 of 22] Compiling Data.FixedSize.Vector ( src/Data/FixedSize/Vector.hs, dist/dist-sandbox-50b32bd7/build/Data/FixedSize/Vector.o )
src/Data/FixedSize/Vector.hs:104:31: error:
• Couldn't match type ‘1 + n’ with ‘n + 1’
Expected type: VS.Vector (n + 1) a
Actual type: VS.Vector (1 + n) a
NB: ‘+’ is a type function, and may not be injective
• In the second argument of ‘($)’, namely ‘VS.cons x xs’
In the expression: Vector $ VS.cons x xs
In an equation for ‘cons’:
cons x (Vector xs) = Vector $ VS.cons x xs
• Relevant bindings include
xs :: VS.Vector n a (bound at src/Data/FixedSize/Vector.hs:104:16)
cons :: a -> Vector n a -> Vector (n + 1) a
(bound at src/Data/FixedSize/Vector.hs:104:1)
src/Data/FixedSize/Vector.hs:112:28: error:
• Couldn't match type ‘n + 1’ with ‘1 + n0’
Expected type: VS.Vector (1 + n0) a
Actual type: VS.Vector (n + 1) a
NB: ‘+’ is a type function, and may not be injective
The type variable ‘n0’ is ambiguous
• In the first argument of ‘VS.head’, namely ‘v’
In the expression: VS.head v
In an equation for ‘vhead’: vhead (Vector v) = VS.head v
• Relevant bindings include
v :: VS.Vector (n + 1) a
(bound at src/Data/FixedSize/Vector.hs:112:15)
vhead :: Vector (n + 1) a -> a
(bound at src/Data/FixedSize/Vector.hs:112:1)
src/Data/FixedSize/Vector.hs:120:37: error:
• Couldn't match type ‘n + 1’ with ‘1 + n’
Expected type: VS.Vector (1 + n) a
Actual type: VS.Vector (n + 1) a
NB: ‘+’ is a type function, and may not be injective
• In the first argument of ‘VS.tail’, namely ‘v’
In the second argument of ‘($)’, namely ‘VS.tail v’
In the expression: Vector $ VS.tail v
• Relevant bindings include
v :: VS.Vector (n + 1) a
(bound at src/Data/FixedSize/Vector.hs:120:15)
vtail :: Vector (n + 1) a -> Vector n a
(bound at src/Data/FixedSize/Vector.hs:120:1)
cabal: Leaving directory '/tmp/cabal-tmp-5629/neural-0.3.0.1'
cabal: Error: some packages failed to install:
neural-0.3.0.1 failed during the building phase. The exception was:
ExitFailure 1
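For what it's worth, the errors above come from the fact that GHC treats `+` on type-level naturals as an opaque type function: it will not prove `1 + n ~ n + 1` on its own, so a wrapper typed with `n + 1` cannot directly reuse an underlying `VS.cons` typed with `1 + n`. Here is a base-only toy sketch (`Vec` is a stand-in, not the library's type) that reproduces the shape of the problem; the `unsafeCoerce` is only there to keep the sketch dependency-free, and real code would instead discharge the equality with a solver plugin such as ghc-typelits-natnormalise, or pin a `vector-sized` version whose `cons` matches the expected spelling:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures, TypeOperators #-}

import GHC.TypeLits
import Unsafe.Coerce (unsafeCoerce)

-- A toy vector whose Cons is typed with `1 + n`, mirroring VS.cons above.
data Vec (n :: Nat) a where
  Nil  :: Vec 0 a
  Cons :: a -> Vec n a -> Vec (1 + n) a

-- This is the signature `neural` wants (`n + 1`). Writing `cons = Cons`
-- here is rejected with exactly the "Couldn't match type '1 + n' with
-- 'n + 1'" error from the log. The coercion below is safe only because
-- the two types are provably equal; a plugin would prove it properly.
cons :: a -> Vec n a -> Vec (n + 1) a
cons x xs = unsafeCoerce (Cons x xs)

toList :: Vec n a -> [a]
toList Nil         = []
toList (Cons x xs) = x : toList xs

main :: IO ()
main = print (toList (cons (1 :: Int) (cons 2 Nil)))  -- [1,2]
```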
Hi Lars,
Thanks for this library!
After studying the code in your MNIST.hs example, I assume that you are not yet using convolutional layers in your MNIST digit classification example. Is that correct? If so, I'd like to contribute to this project by adding that (i.e., modifying the MNIST.hs example to use convolutional layers).
Did you already have an idea in mind, as to how you'd like to proceed with this?
And, finally, I was hoping you might have time to consider these two questions:
In MNIST.hs, why is the MNISTModel type defined as:
type MNISTModel = Classifier (Matrix 28 28) 10 Img Digit
when the learning appears to work on the flattened 784-element vector, instead?
This choice seems to force mnistModel.f to come inside the learning loop and, therefore, be defined as a Diff, whereas otherwise it might have been defined as a simple function (just flattening the 2D image into a 1D vector, and remaining outside of the learning loop).
Referring to this comment:
A Classifier f n b c is a Model that classifies items of type b into categories of type c, using a component with input shape f and output shape Vector n.
Why is the apparent type redundancy required?
In other words, why not just Classifier f n (or Classifier b c)?
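For context on the second question, my reading (which may be wrong) is that the shape pair describes what the trainable component sees, while the item pair describes what the user hands in and gets out. A hypothetical simplification, NOT the library's actual types, just to illustrate why all four roles show up:

```haskell
-- Hypothetical simplification (not neural's real definitions): only
-- `network` is trainable and operates on numeric shapes; `encode` and
-- `decode` translate between user-level items and those shapes.
data Classifier f b c = Classifier
  { encode  :: b -> f Double          -- item -> input shape
  , network :: f Double -> [Double]   -- trainable part: shape -> scores
  , decode  :: [Double] -> c          -- scores -> category
  }

classify :: Classifier f b c -> b -> c
classify m = decode m . network m . encode m

-- Toy instance: "classify" a pair of Ints by the parity of their sum.
newtype Pair a = Pair (a, a)

parity :: Classifier Pair (Int, Int) Bool
parity = Classifier
  { encode  = \(x, y) -> Pair (fromIntegral x, fromIntegral y)
  , network = \(Pair (x, y)) -> [x + y]
  , decode  = \scores -> even (round (sum scores) :: Int)
  }

main :: IO ()
main = print (classify parity (1, 2), classify parity (2, 2))
```

Under this reading, `Classifier b c` alone would hide the shape the optimizer differentiates through, and `Classifier f n` alone would say nothing about the user-facing types.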
Thanks!
-db
I know it's too much to ask, but can I expect OpenCL support for AMD/NVIDIA GPU cards?
I think Haskell is best for pretty much anything, so I wish for a Haskell library that supports GPU computation and eases the level of experimentation/research I can do with neural nets!
I have no idea how to fix these errors.
The library will not build; here is my log:
Configuring neural-0.3.0.1...
Building neural-0.3.0.1...
Preprocessing library neural-0.3.0.1...
[ 1 of 22] Compiling Data.Utils.Statistics ( src/Data/Utils/Statistics.hs, dist/build/Data/Utils/Statistics.o )
[ 2 of 22] Compiling Data.Utils.Arrow ( src/Data/Utils/Arrow.hs, dist/build/Data/Utils/Arrow.o )
[ 3 of 22] Compiling Data.MyPrelude ( src/Data/MyPrelude.hs, dist/build/Data/MyPrelude.o )
[ 4 of 22] Compiling Data.Utils.Analytic ( src/Data/Utils/Analytic.hs, dist/build/Data/Utils/Analytic.o )
[ 5 of 22] Compiling Data.Utils.Cache ( src/Data/Utils/Cache.hs, dist/build/Data/Utils/Cache.o )
[ 6 of 22] Compiling Data.Utils.Stack ( src/Data/Utils/Stack.hs, dist/build/Data/Utils/Stack.o )
[ 7 of 22] Compiling Data.Utils.Traversable ( src/Data/Utils/Traversable.hs, dist/build/Data/Utils/Traversable.o )
[ 8 of 22] Compiling Data.Utils.List ( src/Data/Utils/List.hs, dist/build/Data/Utils/List.o )
[ 9 of 22] Compiling Data.Utils.Pipes ( src/Data/Utils/Pipes.hs, dist/build/Data/Utils/Pipes.o )
[10 of 22] Compiling Data.Utils.Random ( src/Data/Utils/Random.hs, dist/build/Data/Utils/Random.o )
[11 of 22] Compiling Numeric.Neural.Model ( src/Numeric/Neural/Model.hs, dist/build/Numeric/Neural/Model.o )
[12 of 22] Compiling Data.FixedSize.Class ( src/Data/FixedSize/Class.hs, dist/build/Data/FixedSize/Class.o )
[13 of 22] Compiling Data.FixedSize.Vector ( src/Data/FixedSize/Vector.hs, dist/build/Data/FixedSize/Vector.o )
src/Data/FixedSize/Vector.hs:74:16: error:
* Couldn't match type `Int'
with `finite-typelits-0.1.3.0:Data.Finite.Internal.Finite n'
Expected type: (Index (Vector n) -> a) -> Vector n a
Actual type: (finite-typelits-0.1.3.0:Data.Finite.Internal.Finite
n
-> a)
-> Vector n a
* In the expression: Vector . VS.generate
In an equation for `generate': generate = Vector . VS.generate
In the instance declaration for `FixedSize (Vector n)'
* Relevant bindings include
generate :: (Index (Vector n) -> a) -> Vector n a
(bound at src/Data/FixedSize/Vector.hs:74:5)
src/Data/FixedSize/Vector.hs:104:31: error:
* Couldn't match type `1 + n' with `n + 1'
Expected type: VS.Vector (n + 1) a
Actual type: VS.Vector (1 + n) a
NB: `+' is a type function, and may not be injective
* In the second argument of `($)', namely `VS.cons x xs'
In the expression: Vector $ VS.cons x xs
In an equation for `cons':
cons x (Vector xs) = Vector $ VS.cons x xs
* Relevant bindings include
xs :: VS.Vector n a (bound at src/Data/FixedSize/Vector.hs:104:16)
cons :: a -> Vector n a -> Vector (n + 1) a
(bound at src/Data/FixedSize/Vector.hs:104:1)
src/Data/FixedSize/Vector.hs:112:28: error:
* Couldn't match type `n + 1' with `1 + n0'
Expected type: VS.Vector (1 + n0) a
Actual type: VS.Vector (n + 1) a
NB: `+' is a type function, and may not be injective
The type variable `n0' is ambiguous
* In the first argument of `VS.head', namely `v'
In the expression: VS.head v
In an equation for `vhead': vhead (Vector v) = VS.head v
* Relevant bindings include
v :: VS.Vector (n + 1) a
(bound at src/Data/FixedSize/Vector.hs:112:15)
vhead :: Vector (n + 1) a -> a
(bound at src/Data/FixedSize/Vector.hs:112:1)
src/Data/FixedSize/Vector.hs:120:37: error:
* Couldn't match type `n + 1' with `1 + n'
Expected type: VS.Vector (1 + n) a
Actual type: VS.Vector (n + 1) a
NB: `+' is a type function, and may not be injective
* In the first argument of `VS.tail', namely `v'
In the second argument of `($)', namely `VS.tail v'
In the expression: Vector $ VS.tail v
* Relevant bindings include
v :: VS.Vector (n + 1) a
(bound at src/Data/FixedSize/Vector.hs:120:15)
vtail :: Vector (n + 1) a -> Vector n a
(bound at src/Data/FixedSize/Vector.hs:120:1)
cabal: Leaving directory '/tmp/cabal-tmp-8105/neural-0.3.0.1'
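The first error in this log (`Int` vs `Finite n` at line 74) suggests the build picked up a newer `finite-typelits`/`vector-sized` than neural-0.3.0.1 was written against: `VS.generate` now indexes with `Finite n` rather than `Int`. One plausible workaround, until the upper bounds are fixed, is to constrain the solver to older versions in a `cabal.project` (the bounds below are illustrative guesses, not tested against this release):

```cabal
-- cabal.project (version bounds are illustrative; check which versions
-- neural-0.3.0.1 was actually released against)
packages: .
constraints:
  finite-typelits < 0.1.3,
  vector-sized < 0.6
```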
Does the word "generation", as used in the reporting from this library, mean:
Referring to Lars' MNIST.hs example, and given this layout of a CNN:
c = reLULayer
. cArr (Diff toVector)
. (convolution (Proxy :: DP.Proxy 7) 3 reLULayer :: Component (Volume 28 28 1) (Volume 8 8 8))
. cArr (Diff fromMatrix)
what's the best way to make one of the 8 8x8 convolution results available to the reporting pipe?
I'd like to have the reporting pipe dump this "image" every so often, so I can see the convolution kernels "tuning" themselves to certain image features.
Also, am I correct in assuming that I'll fail if I try to brute force this by picking apart the model component, via pattern matching, due to the existentially hidden shape of the model parameter set?
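I don't know whether neural's `Component` exposes a product combinator, but the generic arrow-style pattern for this is to make the pipeline carry the intermediate activation alongside the final output, so the reporting pipe can read it from `fst`. A plain-function sketch (stand-in stages, not the library's API):

```haskell
import Control.Arrow ((&&&), (>>>))

-- Stand-ins for the convolution stage and the remaining layers:
convStage :: [Int] -> [Int]
convStage = map (* 2)

restOfNet :: [Int] -> [Int]
restOfNet = map (+ 1)

-- After the convolution, split: `id` keeps the intermediate "image",
-- `restOfNet` continues the pipeline. A reporting pipe would consume
-- the first element of the pair every so often.
tapped :: [Int] -> ([Int], [Int])
tapped = convStage >>> (id &&& restOfNet)

main :: IO ()
main = print (tapped [1, 2, 3])  -- ([2,4,6],[3,5,7])
```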
As you can see on https://hackage.haskell.org/package/neural-0.1.0.0, the description says "Please see README.md". If you upload packages to Hackage, you're expected to write a short abstract in the description field. An optional README does not replace the description field, but rather expands on it, since most tooling doesn't have access to the README, and in that context referring to an inaccessible README is rather pointless.
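A minimal example of what such a `.cabal` header could look like (wording is illustrative, not a proposed text):

```cabal
name:        neural
version:     0.3.0.1
synopsis:    Neural networks in native Haskell
description: A library for defining and training neural networks in
             pure Haskell, built on typed fixed-size vectors and
             matrices. See the README on GitHub for a tutorial and
             worked examples.
```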