yunjhongwu / TensorDecompositions.jl
A Julia implementation of tensor decomposition algorithms
License: Other
Hi,
Probably it is not your fault, but in your usage example the following line throws an error:
T = cat(3, map(x -> x * u * v', w)...) + 0.2 * randn(10, 20, 30)
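The error comes from a Julia 1.0 breaking change: cat's dimension argument moved to the dims keyword, so cat(3, xs...) became cat(xs...; dims=3). A hedged rewrite of the example (the factor sizes here are assumptions chosen to match the 10x20x30 tensor):

```julia
# Julia 1.0 version of the line above: cat(3, xs...) -> cat(xs...; dims=3).
# u, v, w sizes are assumptions matching the 10x20x30 result.
u, v, w = randn(10), randn(20), randn(30)
# Each x * u * v' is a 10x20 slice; stacking 30 of them along dims=3
# gives a 10x20x30 tensor, plus Gaussian noise.
T = cat(map(x -> x * u * v', w)...; dims=3) + 0.2 * randn(10, 20, 30)
size(T)  # (10, 20, 30)
```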
(v1.0) pkg> test TensorDecompositions
Updating registry at `~/.julia/registries/General`
Updating git-repo `[email protected]:JuliaRegistries/General.git`
Testing TensorDecompositions
Resolving package versions...
Status `/var/folders/3z/rj6y6gfx4nd3h867q_mvjnw0000b9r/T/tmpOMgpCG/Manifest.toml`
[7d9fca2a] Arpack v0.3.0
[9e28174c] BinDeps v0.8.10
[b99e7846] BinaryProvider v0.5.3
[34da2185] Compat v1.4.0
[864edb3b] DataStructures v0.14.0
[31c24e10] Distributions v0.16.4
[e1d29d7a] Missings v0.3.1
[bac558e1] OrderedCollections v1.0.2
[90014a1f] PDMats v0.9.6
[92933f4c] ProgressMeter v0.9.0
[1fd47b50] QuadGK v2.0.3
[79098fc4] Rmath v0.5.0
[a2af1166] SortingAlgorithms v0.3.1
[276daf66] SpecialFunctions v0.7.2
[2913bbd2] StatsBase v0.26.0
[4c63d2b9] StatsFuns v0.7.0
[5e0ebb24] Strided v0.2.2
[192bee09] TensorDecompositions v0.2.0+ [`~/.julia/dev/TensorDecompositions`]
[6aa20fa7] TensorOperations v1.0.0
[9d95972d] TupleTools v1.0.2
[30578b45] URIParser v0.4.0
[2a0f44e3] Base64 [`@stdlib/Base64`]
[ade2ca70] Dates [`@stdlib/Dates`]
[8bb1440f] DelimitedFiles [`@stdlib/DelimitedFiles`]
[8ba89e20] Distributed [`@stdlib/Distributed`]
[b77e0a4c] InteractiveUtils [`@stdlib/InteractiveUtils`]
[76f85450] LibGit2 [`@stdlib/LibGit2`]
[8f399da3] Libdl [`@stdlib/Libdl`]
[37e2e46d] LinearAlgebra [`@stdlib/LinearAlgebra`]
[56ddb016] Logging [`@stdlib/Logging`]
[d6f4376e] Markdown [`@stdlib/Markdown`]
[a63ad114] Mmap [`@stdlib/Mmap`]
[44cfe95a] Pkg [`@stdlib/Pkg`]
[de0858da] Printf [`@stdlib/Printf`]
[3fa0cd96] REPL [`@stdlib/REPL`]
[9a3f8284] Random [`@stdlib/Random`]
[ea8e919c] SHA [`@stdlib/SHA`]
[9e88b42a] Serialization [`@stdlib/Serialization`]
[1a1011a3] SharedArrays [`@stdlib/SharedArrays`]
[6462fe0b] Sockets [`@stdlib/Sockets`]
[2f01184e] SparseArrays [`@stdlib/SparseArrays`]
[10745b16] Statistics [`@stdlib/Statistics`]
[4607b0f0] SuiteSparse [`@stdlib/SuiteSparse`]
[8dfed614] Test [`@stdlib/Test`]
[cf7118a7] UUIDs [`@stdlib/UUIDs`]
[4ec0a83e] Unicode [`@stdlib/Unicode`]
tensorcontractmatrices(): Error During Test at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:47
Got exception outside of a @test
MethodError: no method matching contract!(::Int64, ::Array{Float64,3}, ::Type{Val{:N}}, ::Array{Float64,2}, ::Type{Val{:N}}, ::Int64, ::Array{Float64,3}, ::Tuple{Int64,Int64}, ::Tuple{Int64}, ::Tuple{Int64}, ::Tuple{Int64}, ::Tuple{Int64,Int64,Int64}, ::Type{Val{:BLAS}})
Closest candidates are:
contract!(::Any, ::AbstractArray, !Matched::Symbol, ::AbstractArray, !Matched::Symbol, ::Any, ::AbstractArray, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N) at /Users/monty/.julia/packages/TensorOperations/SOGUy/src/implementation/stridedarray.jl:196
contract!(::Any, ::AbstractArray, !Matched::Symbol, ::AbstractArray, !Matched::Symbol, ::Any, ::AbstractArray, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, !Matched::Tuple{Vararg{Int64,N}} where N) at /Users/monty/.julia/packages/TensorOperations/SOGUy/src/implementation/stridedarray.jl:85
contract!(::Any, ::AbstractArray, !Matched::Symbol, ::AbstractArray, !Matched::Symbol, ::Any, ::AbstractArray, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, !Matched::Tuple{Vararg{Int64,N}} where N, !Matched::Any) at /Users/monty/.julia/packages/TensorOperations/SOGUy/src/implementation/stridedarray.jl:85
...
Stacktrace:
[1] #tensorcontractmatrix!#3(::Bool, ::Symbol, ::Function, ::Array{Float64,3}, ::Array{Float64,3}, ::Array{Float64,2}, ::Int64) at /Users/monty/.julia/dev/TensorDecompositions/src/utils.jl:7
[2] (::getfield(TensorDecompositions, Symbol("#kw##tensorcontractmatrix!")))(::NamedTuple{(:transpose, :method),Tuple{Bool,Symbol}}, ::typeof(TensorDecompositions.tensorcontractmatrix!), ::Array{Float64,3}, ::Array{Float64,3}, ::Array{Float64,2}, ::Int64) at ./none:0
[3] #tensorcontractmatrix#8(::Bool, ::Symbol, ::Function, ::Array{Float64,3}, ::Array{Float64,2}, ::Int64) at /Users/monty/.julia/dev/TensorDecompositions/src/utils.jl:15
[4] (::getfield(TensorDecompositions, Symbol("#kw##tensorcontractmatrix")))(::NamedTuple{(:transpose, :method),Tuple{Bool,Symbol}}, ::typeof(tensorcontractmatrix), ::Array{Float64,3}, ::Array{Float64,2}, ::Int64) at ./none:0
[5] #tensorcontractmatrices#12 at /Users/monty/.julia/dev/TensorDecompositions/src/utils.jl:57 [inlined]
[6] tensorcontractmatrices at /Users/monty/.julia/dev/TensorDecompositions/src/utils.jl:53 [inlined] (repeats 2 times)
[7] macro expansion at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:49 [inlined]
[8] macro expansion at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.0/Test/src/Test.jl:1083 [inlined]
[9] macro expansion at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:48 [inlined]
[10] macro expansion at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.0/Test/src/Test.jl:1083 [inlined]
[11] top-level scope at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:3
[12] include at ./boot.jl:317 [inlined]
[13] include_relative(::Module, ::String) at ./loading.jl:1044
[14] include(::Module, ::String) at ./sysimg.jl:29
[15] include(::String) at ./client.jl:392
[16] top-level scope at none:0
[17] include at ./boot.jl:317 [inlined]
[18] include_relative(::Module, ::String) at ./loading.jl:1044
[19] include(::Module, ::String) at ./sysimg.jl:29
[20] include(::String) at ./client.jl:392
[21] top-level scope at none:0
[22] eval(::Module, ::Any) at ./boot.jl:319
[23] exec_options(::Base.JLOptions) at ./client.jl:243
[24] _start() at ./client.jl:425
Test Summary: | Pass Error Total
Utilities | 18 1 19
khatrirao() | 11 11
_row_unfold() | 3 3
_col_unfold() | 3 3
tensorcontractmatrices() | 1 1
ERROR: LoadError: LoadError: Some tests did not pass: 18 passed, 0 failed, 1 errored, 0 broken.
in expression starting at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:1
in expression starting at /Users/monty/.julia/dev/TensorDecompositions/test/runtests.jl:9
ERROR: Package TensorDecompositions errored during testing
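My reading of the MethodError in the log above (an inference from the "closest candidates" list, not something I have confirmed in the TensorOperations changelog): TensorOperations v1.0 switched contract! from Val{:N}/Val{:C} type arguments to plain Symbols :N/:C, so call sites still passing Val{:N} no longer match any method. A toy illustration of the dispatch difference, with made-up function names:

```julia
# Hypothetical stand-ins for the old and new contract! signatures:
# the old style dispatched on the type Val{:N}, the new style on the
# Symbol :N, so old-style call sites now raise a MethodError.
old_api(conjA::Type{Val{:N}}) = "dispatches on Val{:N}"
new_api(conjA::Symbol) = "dispatches on :N"

old_api(Val{:N})  # matches the pre-1.0 style signature
new_api(:N)       # matches the v1.0-style signature
# new_api(Val{:N}) would throw, like the error in the test log
```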
using TensorDecompositions gives:
ERROR: LoadError: LoadError: type UnionAll has no field parameters
Stacktrace:
Thanks for making this helpful package.
Is there a reason that this package doesn't use the precompile feature?
http://docs.julialang.org/en/release-0.4/manual/modules/#module-initialization-and-precompilation.
Was it just not standard as of Dec 2015?
If there is nothing preventing precompilation, I could enable it.
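For reference, opting in is a one-line change; in Julia 0.4-0.6 the call goes at the top of the package's main source file, before the module definition (from Julia 1.0 on, precompilation is the default). A minimal sketch with a stand-in module name:

```julia
# Opt the package into precompilation (Julia 0.4-0.6; a no-op on 1.0+,
# where precompilation is the default).
__precompile__()

# Stand-in name, not the real package module.
module MyDecompositions
greet() = "precompilable"
end

MyDecompositions.greet()
```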
The progress bars are inconvenient for parallel executions
I get the following error after updating TensorOperations to v0.4.0.
julia> using TensorDecompositions
WARNING: Base.String is deprecated, use AbstractString instead.
likely near /home/gawron/.julia/v0.4/TensorDecompositions/src/candecomp.jl:1
ERROR: LoadError: LoadError: TypeError: typealias: in parameter, expected Type{T}, got Tuple{DataType,DataType}
in include at ./boot.jl:261
in include_from_node1 at ./loading.jl:304
in include at ./boot.jl:261
in include_from_node1 at ./loading.jl:304
in require at ./loading.jl:243
while loading /home/gawron/.julia/v0.4/TensorDecompositions/src/tensorcur.jl, in expression starting on line 1
while loading /home/gawron/.julia/v0.4/TensorDecompositions/src/TensorDecompositions.jl, in expression starting on line 20
julia> versioninfo()
Julia Version 0.4.2
Commit bb73f34 (2015-12-06 21:47 UTC)
Platform Info:
System: Linux (x86_64-linux-gnu)
CPU: Intel(R) Core(TM) i5 CPU M 540 @ 2.53GHz
WORD_SIZE: 64
BLAS: libopenblas (NO_LAPACK NO_LAPACKE DYNAMIC_ARCH NO_AFFINITY Nehalem)
LAPACK: liblapack.so.3
LIBM: libopenlibm
LLVM: libLLVM-3.3
I obtain the following error while importing TensorDecompositions.
julia> using TensorDecompositions
WARNING: Base.String is deprecated, use AbstractString instead.
WARNING: Base.String is deprecated, use AbstractString instead.
ERROR: LoadError: LoadError: LoadError: MethodError: `start` has no method matching start(::Type{Symbol})
in append_any at essentials.jl:127
in include at ./boot.jl:261
in include_from_node1 at ./loading.jl:304
in require at ./loading.jl:243
in include at ./boot.jl:261
in include_from_node1 at ./loading.jl:304
in require at ./loading.jl:243
in include at ./boot.jl:261
in include_from_node1 at ./loading.jl:304
in require at ./loading.jl:243
while loading /home/gawron/.julia/v0.4/Cartesian/src/Cartesian.jl, in expression starting on line 142
while loading /home/gawron/.julia/v0.4/TensorOperations/src/TensorOperations.jl, in expression starting on line 28
while loading /home/gawron/.julia/v0.4/TensorDecompositions/src/TensorDecompositions.jl, in expression starting on line 4
julia> versioninfo()
Julia Version 0.4.0-rc4+1
Commit 296bb20* (2015-10-04 05:59 UTC)
Platform Info:
System: Linux (x86_64-linux-gnu)
CPU: Intel(R) Core(TM) i7-5820K CPU @ 3.30GHz
WORD_SIZE: 64
BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
LAPACK: libopenblas64_
LIBM: libopenlibm
LLVM: libLLVM-3.3
Is there an easy/preferred method for providing an initial guess of the factors in CANDECOMP? Something like:
# synthetic data
u, v, w = randn(10), randn(10), randn(10)
@tensor begin
    A[i, j, k] := u[i] * v[j] * w[k]
end
# add noise
A += randn(10, 10, 10)
# fit decomposition with initial guess
model = candecomp(A, 1, initial_guess=(u, v, w))
As a follow-up, I think it is worth considering whether the interface should be redone to agree with the StatisticalModel interface that the JuliaStats group has decided on (see discussion here). If I'm not mistaken, the interface would look something like:
# Fit rank-one candecomp without initial guess
model = fit(CANDECOMP(1), T)
# Or, equivalently
model = CANDECOMP(1)
fit!(model, T)
# Fit candecomp with initial guess
model = CANDECOMP(1, factors=(u,v,w))
fit!(model, T)
As described in https://discourse.julialang.org/t/ann-plans-for-removing-packages-that-do-not-yet-support-1-0-from-the-general-registry/ we are planning on removing packages that do not support 1.0 from the General registry. This package has been detected to not support 1.0 and is thus slated to be removed. The removal of packages from the registry will happen approximately a month after this issue is open.
To transition to the new Pkg system using Project.toml, see https://github.com/JuliaRegistries/Registrator.jl#transitioning-from-require-to-projecttoml.
To then tag a new version of the package, see https://github.com/JuliaRegistries/Registrator.jl#via-the-github-app.
If you believe this package has erroneously been detected as not supporting 1.0 or have any other questions, don't hesitate to discuss it here or in the thread linked at the top of this post.
Hi,
CORCONDIA is an algorithm for evaluating how well the factors produced by CP match the data. It relies on the CP algorithm producing factors that are normalised in a particular way.
In the Python and MATLAB implementations, factors are normalised by dividing by the 2-norm in iteration 1 and by max(max(factor[i]), 1) in all other iterations. In TensorDecompositions.jl, factors are normalised by dividing by sum(abs(factor[i])).
It's not stated explicitly anywhere I've looked so far, but I'm pretty sure that dividing by the max is required for CORCONDIA to give reasonable results*.
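The three scalings being compared can be sketched per factor like this (the helper names are mine, not from any of the implementations):

```julia
using LinearAlgebra

# Hypothetical helpers contrasting the normalisation schemes above,
# applied to a single factor f.
scale_norm(f)   = f ./ norm(f)             # 2-norm (Python/MATLAB, iteration 1)
scale_max(f)    = f ./ max(maximum(f), 1)  # max-based (later iterations)
scale_sumabs(f) = f ./ sum(abs.(f))        # TensorDecompositions.jl

f = [3.0, 4.0]
scale_norm(f)    # [0.6, 0.8]
scale_max(f)     # [0.75, 1.0]
scale_sumabs(f)  # [3/7, 4/7]
```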
My questions for you:
Cheers,
* I'm checking for reasonable results like so:
Given this function (pinv requires using LinearAlgebra on Julia 0.7+):
function G(X, factors)
    (reduce(kron, factors) |> pinv) * X[:]
end
G(X, factors) should yield 1 if there is exactly one component, and should exceed 1 otherwise. G comes from the original CORCONDIA paper: https://onlinelibrary.wiley.com/doi/epdf/10.1002/cem.801, Section 2.1, Equation 5.
Results from python sktensor and R multiway libraries pass this test.
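A self-contained check of the G diagnostic on a noiseless rank-one tensor. One caveat worth noting: with Julia's column-major vec ordering, vec of the outer product of u, v, w equals kron(w, v, u), so the factors are passed in reverse mode order here (an assumption on my part about the intended ordering):

```julia
using LinearAlgebra

# The G diagnostic from the issue above.
G(X, factors) = (reduce(kron, factors) |> pinv) * X[:]

# Noiseless rank-one tensor: X[i,j,k] = u[i] * v[j] * w[k].
u, v, w = randn(3), randn(4), randn(5)
X = [u[i] * v[j] * w[k] for i in 1:3, j in 1:4, k in 1:5]

# vec(X) == kron(w, v, u), so pass the factors last mode first.
g = G(X, (w, v, u))  # should be ~1 for a single exact component
```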