tensordecompositions.jl's People

Contributors

alyst, yunjhongwu

tensordecompositions.jl's Issues

Fails: package tests error on Julia 1.0

(v1.0) pkg> test TensorDecompositions
  Updating registry at `~/.julia/registries/General`
  Updating git-repo `[email protected]:JuliaRegistries/General.git`
   Testing TensorDecompositions
 Resolving package versions...
    Status `/var/folders/3z/rj6y6gfx4nd3h867q_mvjnw0000b9r/T/tmpOMgpCG/Manifest.toml`
  [7d9fca2a] Arpack v0.3.0
  [9e28174c] BinDeps v0.8.10
  [b99e7846] BinaryProvider v0.5.3
  [34da2185] Compat v1.4.0
  [864edb3b] DataStructures v0.14.0
  [31c24e10] Distributions v0.16.4
  [e1d29d7a] Missings v0.3.1
  [bac558e1] OrderedCollections v1.0.2
  [90014a1f] PDMats v0.9.6
  [92933f4c] ProgressMeter v0.9.0
  [1fd47b50] QuadGK v2.0.3
  [79098fc4] Rmath v0.5.0
  [a2af1166] SortingAlgorithms v0.3.1
  [276daf66] SpecialFunctions v0.7.2
  [2913bbd2] StatsBase v0.26.0
  [4c63d2b9] StatsFuns v0.7.0
  [5e0ebb24] Strided v0.2.2
  [192bee09] TensorDecompositions v0.2.0+ [`~/.julia/dev/TensorDecompositions`]
  [6aa20fa7] TensorOperations v1.0.0
  [9d95972d] TupleTools v1.0.2
  [30578b45] URIParser v0.4.0
  [2a0f44e3] Base64  [`@stdlib/Base64`]
  [ade2ca70] Dates  [`@stdlib/Dates`]
  [8bb1440f] DelimitedFiles  [`@stdlib/DelimitedFiles`]
  [8ba89e20] Distributed  [`@stdlib/Distributed`]
  [b77e0a4c] InteractiveUtils  [`@stdlib/InteractiveUtils`]
  [76f85450] LibGit2  [`@stdlib/LibGit2`]
  [8f399da3] Libdl  [`@stdlib/Libdl`]
  [37e2e46d] LinearAlgebra  [`@stdlib/LinearAlgebra`]
  [56ddb016] Logging  [`@stdlib/Logging`]
  [d6f4376e] Markdown  [`@stdlib/Markdown`]
  [a63ad114] Mmap  [`@stdlib/Mmap`]
  [44cfe95a] Pkg  [`@stdlib/Pkg`]
  [de0858da] Printf  [`@stdlib/Printf`]
  [3fa0cd96] REPL  [`@stdlib/REPL`]
  [9a3f8284] Random  [`@stdlib/Random`]
  [ea8e919c] SHA  [`@stdlib/SHA`]
  [9e88b42a] Serialization  [`@stdlib/Serialization`]
  [1a1011a3] SharedArrays  [`@stdlib/SharedArrays`]
  [6462fe0b] Sockets  [`@stdlib/Sockets`]
  [2f01184e] SparseArrays  [`@stdlib/SparseArrays`]
  [10745b16] Statistics  [`@stdlib/Statistics`]
  [4607b0f0] SuiteSparse  [`@stdlib/SuiteSparse`]
  [8dfed614] Test  [`@stdlib/Test`]
  [cf7118a7] UUIDs  [`@stdlib/UUIDs`]
  [4ec0a83e] Unicode  [`@stdlib/Unicode`]
tensorcontractmatrices(): Error During Test at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:47
  Got exception outside of a @test
  MethodError: no method matching contract!(::Int64, ::Array{Float64,3}, ::Type{Val{:N}}, ::Array{Float64,2}, ::Type{Val{:N}}, ::Int64, ::Array{Float64,3}, ::Tuple{Int64,Int64}, ::Tuple{Int64}, ::Tuple{Int64}, ::Tuple{Int64}, ::Tuple{Int64,Int64,Int64}, ::Type{Val{:BLAS}})
  Closest candidates are:
    contract!(::Any, ::AbstractArray, !Matched::Symbol, ::AbstractArray, !Matched::Symbol, ::Any, ::AbstractArray, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N) at /Users/monty/.julia/packages/TensorOperations/SOGUy/src/implementation/stridedarray.jl:196
    contract!(::Any, ::AbstractArray, !Matched::Symbol, ::AbstractArray, !Matched::Symbol, ::Any, ::AbstractArray, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, !Matched::Tuple{Vararg{Int64,N}} where N) at /Users/monty/.julia/packages/TensorOperations/SOGUy/src/implementation/stridedarray.jl:85
    contract!(::Any, ::AbstractArray, !Matched::Symbol, ::AbstractArray, !Matched::Symbol, ::Any, ::AbstractArray, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, ::Tuple{Vararg{Int64,N}} where N, !Matched::Tuple{Vararg{Int64,N}} where N, !Matched::Any) at /Users/monty/.julia/packages/TensorOperations/SOGUy/src/implementation/stridedarray.jl:85
    ...
  Stacktrace:
   [1] #tensorcontractmatrix!#3(::Bool, ::Symbol, ::Function, ::Array{Float64,3}, ::Array{Float64,3}, ::Array{Float64,2}, ::Int64) at /Users/monty/.julia/dev/TensorDecompositions/src/utils.jl:7
   [2] (::getfield(TensorDecompositions, Symbol("#kw##tensorcontractmatrix!")))(::NamedTuple{(:transpose, :method),Tuple{Bool,Symbol}}, ::typeof(TensorDecompositions.tensorcontractmatrix!), ::Array{Float64,3}, ::Array{Float64,3}, ::Array{Float64,2}, ::Int64) at ./none:0
   [3] #tensorcontractmatrix#8(::Bool, ::Symbol, ::Function, ::Array{Float64,3}, ::Array{Float64,2}, ::Int64) at /Users/monty/.julia/dev/TensorDecompositions/src/utils.jl:15
   [4] (::getfield(TensorDecompositions, Symbol("#kw##tensorcontractmatrix")))(::NamedTuple{(:transpose, :method),Tuple{Bool,Symbol}}, ::typeof(tensorcontractmatrix), ::Array{Float64,3}, ::Array{Float64,2}, ::Int64) at ./none:0
   [5] #tensorcontractmatrices#12 at /Users/monty/.julia/dev/TensorDecompositions/src/utils.jl:57 [inlined]
   [6] tensorcontractmatrices at /Users/monty/.julia/dev/TensorDecompositions/src/utils.jl:53 [inlined] (repeats 2 times)
   [7] macro expansion at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:49 [inlined]
   [8] macro expansion at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.0/Test/src/Test.jl:1083 [inlined]
   [9] macro expansion at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:48 [inlined]
   [10] macro expansion at /Users/osx/buildbot/slave/package_osx64/build/usr/share/julia/stdlib/v1.0/Test/src/Test.jl:1083 [inlined]
   [11] top-level scope at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:3
   [12] include at ./boot.jl:317 [inlined]
   [13] include_relative(::Module, ::String) at ./loading.jl:1044
   [14] include(::Module, ::String) at ./sysimg.jl:29
   [15] include(::String) at ./client.jl:392
   [16] top-level scope at none:0
   [17] include at ./boot.jl:317 [inlined]
   [18] include_relative(::Module, ::String) at ./loading.jl:1044
   [19] include(::Module, ::String) at ./sysimg.jl:29
   [20] include(::String) at ./client.jl:392
   [21] top-level scope at none:0
   [22] eval(::Module, ::Any) at ./boot.jl:319
   [23] exec_options(::Base.JLOptions) at ./client.jl:243
   [24] _start() at ./client.jl:425
Test Summary:              | Pass  Error  Total
Utilities                  |   18      1     19
  khatrirao()              |   11            11
  _row_unfold()            |    3             3
  _col_unfold()            |    3             3
  tensorcontractmatrices() |           1      1
ERROR: LoadError: LoadError: Some tests did not pass: 18 passed, 0 failed, 1 errored, 0 broken.
in expression starting at /Users/monty/.julia/dev/TensorDecompositions/test/test_utils.jl:1
in expression starting at /Users/monty/.julia/dev/TensorDecompositions/test/runtests.jl:9
ERROR: Package TensorDecompositions errored during testing
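
The MethodError above points at an API change in TensorOperations v1.0: its internal contract! now takes plain Symbol flags (the "closest candidates" show ::Symbol where the old Val{:N} arguments were), so the call in src/utils.jl no longer matches any method. Until the package catches up, the same mode-n contraction can be expressed through the public @tensor macro; a minimal sketch (variable names are illustrative, not from the package source):

using TensorOperations

T = randn(4, 5, 6)   # example 3-way tensor
M = randn(6, 3)      # matrix to contract against mode 3

# C[i,j,r] = sum_k T[i,j,k] * M[k,r]  -- contraction over the third mode
@tensor C[i, j, r] := T[i, j, k] * M[k, r]
size(C)              # (4, 5, 3)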

Problem with Julia 0.6

Running using TensorDecompositions gives:
ERROR: LoadError: LoadError: type UnionAll has no field parameters
Stacktrace:

After updating TensorOperations to v0.4.0 the module does not load

I get the following error after updating TensorOperations to v0.4.0.

julia> using TensorDecompositions
WARNING: Base.String is deprecated, use AbstractString instead.
  likely near /home/gawron/.julia/v0.4/TensorDecompositions/src/candecomp.jl:1
ERROR: LoadError: LoadError: TypeError: typealias: in parameter, expected Type{T}, got Tuple{DataType,DataType}
 in include at ./boot.jl:261
 in include_from_node1 at ./loading.jl:304
 in include at ./boot.jl:261
 in include_from_node1 at ./loading.jl:304
 in require at ./loading.jl:243
while loading /home/gawron/.julia/v0.4/TensorDecompositions/src/tensorcur.jl, in expression starting on line 1
while loading /home/gawron/.julia/v0.4/TensorDecompositions/src/TensorDecompositions.jl, in expression starting on line 20

julia> versioninfo()
Julia Version 0.4.2
Commit bb73f34 (2015-12-06 21:47 UTC)
Platform Info:
  System: Linux (x86_64-linux-gnu)
  CPU: Intel(R) Core(TM) i5 CPU       M 540  @ 2.53GHz
  WORD_SIZE: 64
  BLAS: libopenblas (NO_LAPACK NO_LAPACKE DYNAMIC_ARCH NO_AFFINITY Nehalem)
  LAPACK: liblapack.so.3
  LIBM: libopenlibm
  LLVM: libLLVM-3.3

In Julia 0.4.0-rc4, TensorDecompositions throws an error upon import

I obtain the following error while importing TensorDecompositions.

julia> using TensorDecompositions
WARNING: Base.String is deprecated, use AbstractString instead.
WARNING: Base.String is deprecated, use AbstractString instead.
ERROR: LoadError: LoadError: LoadError: MethodError: `start` has no method matching start(::Type{Symbol})
 in append_any at essentials.jl:127
 in include at ./boot.jl:261
 in include_from_node1 at ./loading.jl:304
 in require at ./loading.jl:243
 in include at ./boot.jl:261
 in include_from_node1 at ./loading.jl:304
 in require at ./loading.jl:243
 in include at ./boot.jl:261
 in include_from_node1 at ./loading.jl:304
 in require at ./loading.jl:243
while loading /home/gawron/.julia/v0.4/Cartesian/src/Cartesian.jl, in expression starting on line 142
while loading /home/gawron/.julia/v0.4/TensorOperations/src/TensorOperations.jl, in expression starting on line 28
while loading /home/gawron/.julia/v0.4/TensorDecompositions/src/TensorDecompositions.jl, in expression starting on line 4

julia> versioninfo()
Julia Version 0.4.0-rc4+1
Commit 296bb20* (2015-10-04 05:59 UTC)
Platform Info:
  System: Linux (x86_64-linux-gnu)
  CPU: Intel(R) Core(TM) i7-5820K CPU @ 3.30GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
  LAPACK: libopenblas64_
  LIBM: libopenlibm
  LLVM: libLLVM-3.3

Initializing the factors?

Is there an easy/preferred method for providing an initial guess of the factors in CANDECOMP? Something like:

# synthetic rank-one data
using TensorOperations
u, v, w = randn(10), randn(10), randn(10)
@tensor A[i, j, k] := u[i] * v[j] * w[k]

# add noise
A += randn(10, 10, 10)

# fit a rank-one decomposition with an initial guess (proposed interface)
model = candecomp(A, 1, initial_guess=(u, v, w))

As a follow-up, I think it is worth considering whether the interface should be reworked to agree with the StatisticalModel interface that the JuliaStats group has decided on (see discussion here). If I'm not mistaken, the interface would look something like:

# Fit rank-one candecomp without initial guess
model = fit(CANDECOMP(1), T)

# Or, equivalently
model = CANDECOMP(1)
fit!(model, T)

# Fit candecomp with initial guess
model = CANDECOMP(1, factors=(u,v,w))
fit!(model, T)
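
To make the second proposal concrete, below is a rough sketch of how such a wrapper could sit on top of the existing solver. CANDECOMPSpec and the fit method are purely illustrative names (not part of TensorDecompositions.jl), and the final call assumes candecomp accepts an initial-guess tuple of n_k × r factor matrices as its third positional argument; if the actual signature differs, that line needs adjusting.

using StatsBase
using TensorDecompositions

# Hypothetical wrapper type; not part of the package.
struct CANDECOMPSpec
    rank::Int
    factors::Union{Nothing,Tuple}   # optional initial factor matrices
end
CANDECOMPSpec(r::Integer; factors=nothing) = CANDECOMPSpec(r, factors)

# Extend StatsBase.fit: seed with the provided factors, or fall back to a
# random initial guess with matching shapes.
function StatsBase.fit(spec::CANDECOMPSpec, tnsr::AbstractArray)
    guess = spec.factors === nothing ?
            ntuple(k -> randn(size(tnsr, k), spec.rank), ndims(tnsr)) :
            spec.factors
    return candecomp(tnsr, spec.rank, guess)   # assumed positional form
end

Usage would then look like model = fit(CANDECOMPSpec(1), A), or fit(CANDECOMPSpec(1, factors=(U, V, W)), A) with n_k × 1 factor matrices for the seeded case.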

Info about upcoming removal of packages in the General registry

As described in https://discourse.julialang.org/t/ann-plans-for-removing-packages-that-do-not-yet-support-1-0-from-the-general-registry/ we are planning to remove packages that do not support 1.0 from the General registry. This package has been detected as not supporting 1.0 and is therefore slated for removal. The removal of packages from the registry will happen approximately a month after this issue is opened.

To transition to the new Pkg system using Project.toml, see https://github.com/JuliaRegistries/Registrator.jl#transitioning-from-require-to-projecttoml.
To then tag a new version of the package, see https://github.com/JuliaRegistries/Registrator.jl#via-the-github-app.

If you believe this package has erroneously been detected as not supporting 1.0 or have any other questions, don't hesitate to discuss it here or in the thread linked at the top of this post.

CP factors not normalised in a CORCONDIA-compatible way

Hi,

CORCONDIA is an algorithm for evaluating how well the factors produced by cp match the data. It relies on the cp algorithm producing factors that are normalised in a particular way.

In the Python and MATLAB implementations, factors are normalised by dividing by the 2-norm in iteration 1 and by max(max(factor[i]), 1) in all other iterations. In TensorDecompositions.jl, factors are normalised by dividing by sum(abs(factor[i])).
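
In code, and assuming column-wise normalisation of a factor matrix, the two conventions amount to something like the sketch below (function and variable names are illustrative only):

using LinearAlgebra

# Python/MATLAB convention: 2-norm on the first iteration,
# max(maximum(col), 1) on all later iterations.
normalize_corcondia(col, iter) = col ./ (iter == 1 ? norm(col) : max(maximum(col), 1))

# Convention currently used by TensorDecompositions.jl (per this issue):
normalize_l1(col) = col ./ sum(abs, col)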

It's not stated explicitly anywhere I've looked so far, but I'm pretty sure that dividing by the max is required for CORCONDIA to give reasonable results*.

My questions for you:

  1. Do you recall why you're normalising by the method you are using?
  2. Do you know why it is common to use the pseudo-max-norm?
  3. Do you know if it is legitimate to renormalise the factors for use in CORCONDIA? (I think yes)

Cheers,

* I'm checking for reasonable results like so:

Given this function:

using LinearAlgebra   # pinv lives here on Julia >= 0.7

function G(X, factors)
    (reduce(kron, factors) |> pinv) * X[:]
end

  • G(X, factors) should yield 1 if there is exactly one component.
  • No element of G(X, factors) should exceed 1.

G comes from the original CORCONDIA paper: https://onlinelibrary.wiley.com/doi/epdf/10.1002/cem.801 Section 2.1 Equation 5.

Results from the Python sktensor and R multiway libraries pass this test.
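
As a rank-one sanity check of the test above (illustrative only; it reuses the G defined earlier and reverses the mode order so that reduce(kron, factors) matches the column-major vectorisation X[:]):

u, v, w = randn(4), randn(5), randn(6)
X = [u[i] * v[j] * w[k] for i in 1:4, j in 1:5, k in 1:6]

# vec(X) == kron(w, v, u) for column-major storage, so pass the rank-one
# factor matrices in reverse mode order.
factors = [reshape(w, :, 1), reshape(v, :, 1), reshape(u, :, 1)]
g = G(X, factors)   # a single element, approximately 1.0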
