
Comments (7)

ngphuoc commented on May 18, 2024

Here is another error, possibly related to mean/abs2/broadcasting:

using AutoGrad, Knet

m = [rand(1,3), rand(1)]
x,y = rand(3,4),rand(1,4)

pred(m,x) = m[1]*x .+ m[2]

loss(m,x,y) = mean(abs2, pred(m,x) .- y)

∇ = grad(loss)
∇(m,x,y)  # OK

loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(mean.(abs2, m))

∇ = grad(loss)
∇(m,x,y)  # Error

WARNING: abs2(x::AbstractArray{T}) where T <: Number is deprecated, use abs2.(x) instead.
Stacktrace:
 [1] depwarn(::String, ::Symbol) at ./deprecated.jl:70
 [2] abs2(::Array{Float64,2}) at ./deprecated.jl:57
 [3] (::AutoGrad.##rfun#7#10{Base.#abs2})(::Array{Any,1}, ::Function, ::AutoGrad.Rec{Array{Float64,2}}, ::Vararg{AutoGrad.Rec{Array{Float64,2}},N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:124
 [4] abs2(::AutoGrad.Rec{Array{Float64,2}}) at ./<missing>:0
 [5] mean(::Base.#abs2, ::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}) at ./statistics.jl:25
 [6] loss(::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}, ::Array{Float64,2}, ::Array{Float64,2}) at ./REPL[8]:1
 [7] forward_pass(::Function, ::Tuple{Array{Array{Float64,N} where N,1},Array{Float64,2},Array{Float64,2}}, ::Array{Any,1}, ::Int64) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:88
 [8] (::AutoGrad.##gradfun#1#3{#loss,Int64})(::Array{Any,1}, ::Function, ::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
 [9] (::AutoGrad.#gradfun#2)(::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
 [10] eval(::Module, ::Any) at ./boot.jl:235
 [11] eval_user_input(::Any, ::Base.REPL.REPLBackend) at ./REPL.jl:66
 [12] macro expansion at ./REPL.jl:97 [inlined]
 [13] (::Base.REPL.##1#2{Base.REPL.REPLBackend})() at ./event.jl:73
while loading no file, in expression starting on line 0
ERROR: DimensionMismatch("dimensions must match")
Stacktrace:
 [1] promote_shape(::Tuple{Base.OneTo{Int64},Base.OneTo{Int64}}, ::Tuple{Base.OneTo{Int64}}) at ./indices.jl:84
 [2] +(::Array{Float64,2}, ::Array{Float64,1}) at ./arraymath.jl:38
 [3] (::AutoGrad.##rfun#7#10{Base.#+})(::Array{Any,1}, ::Function, ::AutoGrad.Rec{Array{Float64,2}}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:124
 [4] +(::AutoGrad.Rec{Array{Float64,2}}, ::AutoGrad.Rec{Array{Float64,1}}) at ./<missing>:0
 [5] mean(::Base.#abs2, ::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}) at ./statistics.jl:29
 [6] loss(::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}, ::Array{Float64,2}, ::Array{Float64,2}) at ./REPL[8]:1
 [7] forward_pass(::Function, ::Tuple{Array{Array{Float64,N} where N,1},Array{Float64,2},Array{Float64,2}}, ::Array{Any,1}, ::Int64) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:88
 [8] (::AutoGrad.##gradfun#1#3{#loss,Int64})(::Array{Any,1}, ::Function, ::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
 [9] (::AutoGrad.#gradfun#2)(::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39



loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(sum.(abs2.(m)))
∇ = grad(loss)
∇(m,x,y)  # Error

ERROR: MethodError: no method matching start(::AutoGrad.Broadcasted{AutoGrad.Rec{Array{Array{Float64,N} where N,1}}})
Closest candidates are:
  start(::SimpleVector) at essentials.jl:258
  start(::Base.MethodList) at reflection.jl:560
  start(::ExponentialBackOff) at error.jl:107
  ...
Stacktrace:
 [1] mapfoldl(::Base.#identity, ::Function, ::AutoGrad.Broadcasted{AutoGrad.Rec{Array{Array{Float64,N} where N,1}}}) at ./reduce.jl:67
 [2] (::##1#2)(::AutoGrad.Broadcasted{AutoGrad.Rec{Array{Array{Float64,N} where N,1}}}) at ./<missing>:0
 [3] broadcast(::Function, ::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/unfuse.jl:35
 [4] loss(::AutoGrad.Rec{Array{Array{Float64,N} where N,1}}, ::Array{Float64,2}, ::Array{Float64,2}) at ./REPL[16]:1
 [5] forward_pass(::Function, ::Tuple{Array{Array{Float64,N} where N,1},Array{Float64,2},Array{Float64,2}}, ::Array{Any,1}, ::Int64) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:88
 [6] (::AutoGrad.##gradfun#1#3{#loss,Int64})(::Array{Any,1}, ::Function, ::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39
 [7] (::AutoGrad.#gradfun#2)(::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at /home/ngphuoc/.julia/v0.6/AutoGrad/src/core.jl:39

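For reference, one way to sidestep both failing regularizer forms is to index the parameter container explicitly instead of broadcasting mean/abs2/sum over the Array{Array}. A minimal sketch, assuming the same setup as above; the l2reg helper is mine and I have not tested it on this Julia 0.6 / AutoGrad combination:

using AutoGrad, Knet

m = [rand(1,3), rand(1)]
x, y = rand(3,4), rand(1,4)
pred(m,x) = m[1]*x .+ m[2]

# Index the parameters explicitly; abs2 then broadcasts over plain
# matrices/vectors, never over the Array{Array} container itself.
l2reg(m) = (sum(abs2.(m[1])) + sum(abs2.(m[2]))) / 2
loss(m,x,y) = mean(abs2, pred(m,x) .- y) + l2reg(m)

∇ = grad(loss)
∇(m,x,y)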


CarloLucibello commented on May 18, 2024

The last two examples work for me on master in this form:

loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(sum.(abs2, m))
loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(mean.(abs2, m))
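
That is, with the setup from the first comment, either form should run as (a quick sketch):

loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(sum.(abs2, m))
∇ = grad(loss)
∇(m,x,y)  # returns the gradient list on master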


CarloLucibello commented on May 18, 2024

On Julia 0.7 and on the branch of PR #78, the first example now gives the following error:

julia> using AutoGrad, Knet, Statistics

julia> u = [rand(2,3), rand(2)]
2-element Array{Array{Float64,N} where N,1}:
 [0.920865 0.820558 0.804323; 0.873658 0.993891 0.010203]
 [0.362208, 0.69425]                                     

julia> v = [rand(1,2), rand(1)]
2-element Array{Array{Float64,N} where N,1}:
 [0.454112 0.564783]
 [0.274256]         

julia> m = Dict(:u=>u, :v=>v)
Dict{Symbol,Array{Array{Float64,N} where N,1}} with 2 entries:
  :v => Array{Float64,N} where N[[0.454112 0.564783], [0.274256]]
  :u => Array{Float64,N} where N[[0.920865 0.820558 0.804323; 0.873658 0.993891 0.010203], [0.362208, 0.69425]]

julia> x,y = rand(3,4),rand(1,4)
([0.916855 0.51435 0.784969 0.451743; 0.36581 0.904977 0.912647 0.385057; 0.828762 0.730146 0.321203 0.251844], [0.383026 0.280737 0.599814 0.245899])

julia> pred(m,x) = foldl((x,w)->w[1]*x .+ w[2], [m[:u],m[:v]], init=x)
pred (generic function with 1 method)

julia> l2(ws) = mean(mean.(abs2, ws))
l2 (generic function with 1 method)

julia> loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(l2.(collect(values(m))))
loss (generic function with 1 method)

julia> loss(m,x,y)
3.84326612440536

julia> grad(loss)(m,x,y)
ERROR: MethodError: no method matching sum_outgrads(::Array{Array{Float64,N} where N,1}, ::Array{Any,1})
Closest candidates are:
  sum_outgrads(::Nothing, ::Any) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:491
  sum_outgrads(::AbstractArray{T,N} where N, ::AbstractArray{T,N} where N) where T at /home/carlo/.julia/dev/AutoGrad/src/core.jl:478
  sum_outgrads(::Rec, ::Any) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:482
  ...
Stacktrace:
 [1] sum_outgrads(::Dict{Symbol,Array{Array{Float64,N} where N,1}}, ::AutoGrad.UngetIndex) at /home/carlo/.julia/dev/AutoGrad/src/getindex.jl:92
 [2] backward_pass(::Rec{Dict{Symbol,Array{Array{Float64,N} where N,1}}}, ::Rec{Float64}, ::Array{AutoGrad.Node,1}) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:246
 [3] (::getfield(AutoGrad, Symbol("##gradfun#1#2")){typeof(loss),Int64})(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Dict{Symbol,Array{Array{Float64,N} where N,1}}, ::Vararg{Any,N} where N) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:40
 [4] (::getfield(AutoGrad, Symbol("#gradfun#3")){getfield(AutoGrad, Symbol("##gradfun#1#2")){typeof(loss),Int64}})(::Dict{Symbol,Array{Array{Float64,N} where N,1}}, ::Vararg{Any,N} where N) at /home/carlo/.julia/dev/AutoGrad/src/core.jl:39
 [5] top-level scope at none:0

The gradient of each of the two terms of the loss

loss(m,x,y) = mean(abs2, pred(m,x) .- y)
loss(m,x,y) = mean(l2.(collect(values(m))))

is computed correctly; the problem arises when they are summed.
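
In code, the isolation looks like this (a sketch using the definitions above; loss1/loss2 are my names):

loss1(m,x,y) = mean(abs2, pred(m,x) .- y)
loss2(m,x,y) = mean(l2.(collect(values(m))))

grad(loss1)(m,x,y)  # OK
grad(loss2)(m,x,y)  # OK
grad(loss)(m,x,y)   # MethodError in sum_outgrads while merging the two gradients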


CarloLucibello commented on May 18, 2024

Relaxing the signature

sum_outgrads(a::AbstractArray{T},b::AbstractArray{T}) where T = ...

to

sum_outgrads(a::AbstractArray{T},b::AbstractArray) where T = ...

solves the problem:

julia> loss(m,x,y) = mean(abs2, pred(m,x) .- y) + mean(l2.(collect(values(m))))
loss (generic function with 1 method)

julia> grad(loss)(m,x,y)
Dict{Symbol,Array{Array{Float64,N} where N,1}} with 2 entries:
  :v => Array{Float64,N} where N[[6.68047 6.28584], [5.20279]]
  :u => Array{Float64,N} where N[[2.23497 2.9622 1.94912; 1.88972 2.05421 1.44816], [4.49943, 3.18227]]

but this is a bad hack. We have to understand why inference is failing and why we end up with an Array{Any} in this error:

ERROR: MethodError: no method matching sum_outgrads(::Array{Array{Float64,N} where N,1}, ::Array{Any,1})
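
For intuition, an accumulator of this shape (a hypothetical sketch, for illustration only, not AutoGrad's actual sum_outgrads) recurses element-wise, so the element types of the two containers never need to match exactly; requiring AbstractArray{T} on both sides is stricter than the recursion itself demands:

# Hypothetical element-wise gradient accumulator:
acc(a::Number, b::Number) = a + b
acc(a::Nothing, b) = b           # nothing stands for a zero gradient
acc(a, b::Nothing) = a
acc(a::AbstractArray, b::AbstractArray) = map(acc, a, b)  # no eltype constraint

acc([1.0, 2.0], Any[3, 4.0])     # works even with an Array{Any} argument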


denizyuret commented on May 18, 2024

Fixed in latest master, please test and close.
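
A quick regression check, reusing the Dict example from the comments above (a sketch):

g = grad(loss)(m, x, y)      # previously threw the sum_outgrads MethodError
@assert keys(g) == keys(m)   # the gradient Dict mirrors the parameter Dict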


LiWang1 commented on May 18, 2024

I have another problem; here is the code:

using DifferentialEquations
# using ForwardDiff
# using DiffEqDiffTools
using AutoGrad # for BAD

A = [1.0 0 0 -5
     4 -2 4 -3
     -4 0 0 1
     5 2 -2 4]
u0 = rand(4, 1)
tspan = (0.0, 1.0)
f(u, p, t) = p*u
prob = ODEProblem(f, u0, tspan, A)
sol = solve(prob)

x_obs = range(0, stop=1,length = 2000)
y_obs = [sol(x_obs[i]) + 0.01randn(4) for i in 1:length(x_obs)]

function solver(para)
    problem = ODEProblem(f, u0, tspan, para)
    _problem = remake(problem;u0=convert.(eltype(para),problem.u0),p=para)
    solution = solve(_problem, Tsit5())
    return (solution)
end

function loss(para)
    solution = solver(para)
    L = 0.0
    for i in 1:length(x_obs)
        L += sum((solution(x_obs[i]) .- y_obs[i]).^2)
    end
    return (L)
end

grad = AutoGrad.grad(loss)(A)

And the problem is:

Stacktrace:
 [1] setproperty!(::OrdinaryDiffEq.ODEIntegrator{Tsit5,false,Array{Float64,2},Float64,Param{Array{Float64,2}},Float64,Float64,Float64,Array{Array{Float64,2},1},ODESolution{Float64,3,Array{Array{Float64,2},1},Nothing,Nothing,Array{Float64,1},Array{Array{Array{Float64,2},1},1},ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Array{Array{Float64,2},1},Array{Float64,1},Array{Array{Array{Float64,2},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64}},DiffEqBase.DEStats},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64},OrdinaryDiffEq.DEOptions{Float64,Float64,Float64,Float64,typeof(DiffEqBase.ODE_DEFAULT_NORM),typeof(LinearAlgebra.opnorm),CallbackSet{Tuple{},Tuple{}},typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN),typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE),typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK),DataStructures.BinaryHeap{Float64,DataStructures.LessThan},DataStructures.BinaryHeap{Float64,DataStructures.LessThan},Nothing,Nothing,Int64,Array{Float64,1},Array{Float64,1},Array{Float64,1}},Array{Float64,2},Float64,Nothing}, ::Symbol, ::AutoGrad.Result{Array{Float64,2}}) at ./Base.jl:21
 [2] initialize!(::OrdinaryDiffEq.ODEIntegrator{Tsit5,false,Array{Float64,2},Float64,Param{Array{Float64,2}},Float64,Float64,Float64,Array{Array{Float64,2},1},ODESolution{Float64,3,Array{Array{Float64,2},1},Nothing,Nothing,Array{Float64,1},Array{Array{Array{Float64,2},1},1},ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Array{Array{Float64,2},1},Array{Float64,1},Array{Array{Array{Float64,2},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64}},DiffEqBase.DEStats},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64},OrdinaryDiffEq.DEOptions{Float64,Float64,Float64,Float64,typeof(DiffEqBase.ODE_DEFAULT_NORM),typeof(LinearAlgebra.opnorm),CallbackSet{Tuple{},Tuple{}},typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN),typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE),typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK),DataStructures.BinaryHeap{Float64,DataStructures.LessThan},DataStructures.BinaryHeap{Float64,DataStructures.LessThan},Nothing,Nothing,Int64,Array{Float64,1},Array{Float64,1},Array{Float64,1}},Array{Float64,2},Float64,Nothing}, ::OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64}) at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/perform_step/low_order_rk_perform_step.jl:565
 [3] #__init#335(::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Nothing, ::Bool, ::Bool, ::Bool, ::Bool, ::Nothing, ::Bool, ::Bool, ::Float64, ::Float64, ::Float64, ::Bool, ::Bool, ::Rational{Int64}, ::Nothing, ::Nothing, ::Rational{Int64}, ::Int64, ::Int64, ::Int64, ::Rational{Int64}, ::Bool, ::Int64, ::Nothing, ::Nothing, ::Int64, ::typeof(DiffEqBase.ODE_DEFAULT_NORM), ::typeof(LinearAlgebra.opnorm), ::typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN), ::typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK), ::Bool, ::Bool, ::Bool, ::Bool, ::Bool, ::Bool, ::Bool, ::Int64, ::String, ::typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE), ::Nothing, ::Bool, ::Bool, ::Bool, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(DiffEqBase.__init), ::ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem}, ::Tsit5, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Any,1}, ::Type{Val{true}}) at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/solve.jl:356
 [4] __init(::ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem}, ::Tsit5, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Any,1}, ::Type{Val{true}}) at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/solve.jl:66 (repeats 4 times)
 [5] #__solve#334 at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/solve.jl:4 [inlined]
 [6] __solve at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/solve.jl:4 [inlined]
 [7] #solve_call#433(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(DiffEqBase.solve_call), ::ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem}, ::Tsit5) at /Users/wangli/.julia/packages/DiffEqBase/4V8I6/src/solve.jl:40
 [8] solve_call at /Users/wangli/.julia/packages/DiffEqBase/4V8I6/src/solve.jl:37 [inlined]
 [9] #solve#434 at /Users/wangli/.julia/packages/DiffEqBase/4V8I6/src/solve.jl:57 [inlined]
 [10] solve(::ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem}, ::Tsit5) at /Users/wangli/.julia/packages/DiffEqBase/4V8I6/src/solve.jl:45
 [11] solver(::Param{Array{Float64,2}}) at /Users/wangli/masterthesis/models /scalable_model.jl:36
 [12] loss(::Param{Array{Float64,2}}) at /Users/wangli/masterthesis/models /scalable_model.jl:42
 [13] #differentiate#3(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(AutoGrad.differentiate), ::Function, ::Param{Array{Float64,2}}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:144
 [14] differentiate(::Function, ::Param{Array{Float64,2}}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:135
 [15] (::getfield(AutoGrad, Symbol("##gradfun#6#8")){typeof(loss),Int64,Bool})(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::getfield(AutoGrad, Symbol("#gradfun#7")){getfield(AutoGrad, Symbol("##gradfun#6#8")){typeof(loss),Int64,Bool}}, ::Array{Float64,2}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:225
 [16] (::getfield(AutoGrad, Symbol("#gradfun#7")){getfield(AutoGrad, Symbol("##gradfun#6#8")){typeof(loss),Int64,Bool}})(::Array{Float64,2}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:221
 [17] top-level scope at REPL[52]:1
 [18] eval(::Module, ::Any) at ./boot.jl:330
 [19] eval_user_input(::Any, ::REPL.REPLBackend) at /Users/sabae/buildbot/worker/package_macos64/build/usr/share/julia/stdlib/v1.2/REPL/src/REPL.jl:86
 [20] macro expansion at /Users/sabae/buildbot/worker/package_macos64/build/usr/share/julia/stdlib/v1.2/REPL/src/REPL.jl:118 [inlined]
 [21] (::getfield(REPL, Symbol("##26#27")){REPL.REPLBackend})() at ./task.jl:268
ERROR: MethodError: Cannot `convert` an object of type AutoGrad.Result{Array{Float64,2}} to an object of type Array{Float64,2}
Closest candidates are:
  convert(::Type{Array{T,N}}, ::StaticArrays.SizedArray{S,T,N,M} where M) where {T, S, N} at /Users/wangli/.julia/packages/StaticArrays/DBECI/src/SizedArray.jl:62
  convert(::Type{Array{T,N}}, ::FillArrays.Zeros{V,N,Axes} where Axes) where {T, V, N} at /Users/wangli/.julia/packages/FillArrays/LYCik/src/FillArrays.jl:320
  convert(::Type{Array{T,N}}, ::FillArrays.Ones{V,N,Axes} where Axes) where {T, V, N} at /Users/wangli/.julia/packages/FillArrays/LYCik/src/FillArrays.jl:320
  ...
Stacktrace:
 [1] #differentiate#3(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(AutoGrad.differentiate), ::Function, ::Param{Array{Float64,2}}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:148
 [2] differentiate(::Function, ::Param{Array{Float64,2}}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:135
 [3] (::getfield(AutoGrad, Symbol("##gradfun#6#8")){typeof(loss),Int64,Bool})(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::getfield(AutoGrad, Symbol("#gradfun#7")){getfield(AutoGrad, Symbol("##gradfun#6#8")){typeof(loss),Int64,Bool}}, ::Array{Float64,2}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:225
 [4] (::getfield(AutoGrad, Symbol("#gradfun#7")){getfield(AutoGrad, Symbol("##gradfun#6#8")){typeof(loss),Int64,Bool}})(::Array{Float64,2}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:221
 [5] top-level scope at REPL[52]:1
caused by [exception 1]
MethodError: Cannot `convert` an object of type AutoGrad.Result{Array{Float64,2}} to an object of type Array{Float64,2}
Closest candidates are:
  convert(::Type{Array{T,N}}, ::StaticArrays.SizedArray{S,T,N,M} where M) where {T, S, N} at /Users/wangli/.julia/packages/StaticArrays/DBECI/src/SizedArray.jl:62
  convert(::Type{Array{T,N}}, ::FillArrays.Zeros{V,N,Axes} where Axes) where {T, V, N} at /Users/wangli/.julia/packages/FillArrays/LYCik/src/FillArrays.jl:320
  convert(::Type{Array{T,N}}, ::FillArrays.Ones{V,N,Axes} where Axes) where {T, V, N} at /Users/wangli/.julia/packages/FillArrays/LYCik/src/FillArrays.jl:320
  ...
Stacktrace:
 [1] setproperty!(::OrdinaryDiffEq.ODEIntegrator{Tsit5,false,Array{Float64,2},Float64,Param{Array{Float64,2}},Float64,Float64,Float64,Array{Array{Float64,2},1},ODESolution{Float64,3,Array{Array{Float64,2},1},Nothing,Nothing,Array{Float64,1},Array{Array{Array{Float64,2},1},1},ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Array{Array{Float64,2},1},Array{Float64,1},Array{Array{Array{Float64,2},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64}},DiffEqBase.DEStats},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64},OrdinaryDiffEq.DEOptions{Float64,Float64,Float64,Float64,typeof(DiffEqBase.ODE_DEFAULT_NORM),typeof(LinearAlgebra.opnorm),CallbackSet{Tuple{},Tuple{}},typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN),typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE),typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK),DataStructures.BinaryHeap{Float64,DataStructures.LessThan},DataStructures.BinaryHeap{Float64,DataStructures.LessThan},Nothing,Nothing,Int64,Array{Float64,1},Array{Float64,1},Array{Float64,1}},Array{Float64,2},Float64,Nothing}, ::Symbol, ::AutoGrad.Result{Array{Float64,2}}) at ./Base.jl:21
 [2] initialize!(::OrdinaryDiffEq.ODEIntegrator{Tsit5,false,Array{Float64,2},Float64,Param{Array{Float64,2}},Float64,Float64,Float64,Array{Array{Float64,2},1},ODESolution{Float64,3,Array{Array{Float64,2},1},Nothing,Nothing,Array{Float64,1},Array{Array{Array{Float64,2},1},1},ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,OrdinaryDiffEq.InterpolationData{ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Array{Array{Float64,2},1},Array{Float64,1},Array{Array{Array{Float64,2},1},1},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64}},DiffEqBase.DEStats},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64},OrdinaryDiffEq.DEOptions{Float64,Float64,Float64,Float64,typeof(DiffEqBase.ODE_DEFAULT_NORM),typeof(LinearAlgebra.opnorm),CallbackSet{Tuple{},Tuple{}},typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN),typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE),typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK),DataStructures.BinaryHeap{Float64,DataStructures.LessThan},DataStructures.BinaryHeap{Float64,DataStructures.LessThan},Nothing,Nothing,Int64,Array{Float64,1},Array{Float64,1},Array{Float64,1}},Array{Float64,2},Float64,Nothing}, ::OrdinaryDiffEq.Tsit5ConstantCache{Float64,Float64}) at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/perform_step/low_order_rk_perform_step.jl:565
 [3] #__init#335(::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Nothing, ::Bool, ::Bool, ::Bool, ::Bool, ::Nothing, ::Bool, ::Bool, ::Float64, ::Float64, ::Float64, ::Bool, ::Bool, ::Rational{Int64}, ::Nothing, ::Nothing, ::Rational{Int64}, ::Int64, ::Int64, ::Int64, ::Rational{Int64}, ::Bool, ::Int64, ::Nothing, ::Nothing, ::Int64, ::typeof(DiffEqBase.ODE_DEFAULT_NORM), ::typeof(LinearAlgebra.opnorm), ::typeof(DiffEqBase.ODE_DEFAULT_ISOUTOFDOMAIN), ::typeof(DiffEqBase.ODE_DEFAULT_UNSTABLE_CHECK), ::Bool, ::Bool, ::Bool, ::Bool, ::Bool, ::Bool, ::Bool, ::Int64, ::String, ::typeof(DiffEqBase.ODE_DEFAULT_PROG_MESSAGE), ::Nothing, ::Bool, ::Bool, ::Bool, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(DiffEqBase.__init), ::ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem}, ::Tsit5, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Any,1}, ::Type{Val{true}}) at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/solve.jl:356
 [4] __init(::ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem}, ::Tsit5, ::Array{Array{Float64,2},1}, ::Array{Float64,1}, ::Array{Any,1}, ::Type{Val{true}}) at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/solve.jl:66 (repeats 4 times)
 [5] #__solve#334 at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/solve.jl:4 [inlined]
 [6] __solve at /Users/wangli/.julia/packages/OrdinaryDiffEq/xJCph/src/solve.jl:4 [inlined]
 [7] #solve_call#433(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(DiffEqBase.solve_call), ::ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem}, ::Tsit5) at /Users/wangli/.julia/packages/DiffEqBase/4V8I6/src/solve.jl:40
 [8] solve_call at /Users/wangli/.julia/packages/DiffEqBase/4V8I6/src/solve.jl:37 [inlined]
 [9] #solve#434 at /Users/wangli/.julia/packages/DiffEqBase/4V8I6/src/solve.jl:57 [inlined]
 [10] solve(::ODEProblem{Array{Float64,2},Tuple{Float64,Float64},false,Param{Array{Float64,2}},ODEFunction{false,typeof(f),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem}, ::Tsit5) at /Users/wangli/.julia/packages/DiffEqBase/4V8I6/src/solve.jl:45
 [11] solver(::Param{Array{Float64,2}}) at /Users/wangli/masterthesis/models /scalable_model.jl:36
 [12] loss(::Param{Array{Float64,2}}) at /Users/wangli/masterthesis/models /scalable_model.jl:42
 [13] #differentiate#3(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(AutoGrad.differentiate), ::Function, ::Param{Array{Float64,2}}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:144
 [14] differentiate(::Function, ::Param{Array{Float64,2}}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:135
 [15] (::getfield(AutoGrad, Symbol("##gradfun#6#8")){typeof(loss),Int64,Bool})(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::getfield(AutoGrad, Symbol("#gradfun#7")){getfield(AutoGrad, Symbol("##gradfun#6#8")){typeof(loss),Int64,Bool}}, ::Array{Float64,2}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:225
 [16] (::getfield(AutoGrad, Symbol("#gradfun#7")){getfield(AutoGrad, Symbol("##gradfun#6#8")){typeof(loss),Int64,Bool}})(::Array{Float64,2}) at /Users/wangli/.julia/packages/AutoGrad/pTNVv/src/core.jl:221
 [17] top-level scope at REPL[52]:1


denizyuret commented on May 18, 2024

This seems to be a different issue; reopening to investigate when I find time. The setproperty! in the trace suggests the ODE integrator is mutating object fields, and the type mismatch suggests the code is trying to store an AutoGrad Result where a plain Array is expected. Trying the new AutoGrad interface, if possible, would be a good next step.
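
For reference, the newer AutoGrad interface mentioned here is the Param/@diff tape API; a hedged sketch against the loss above (whether it avoids the integrator's convert error is untested):

using AutoGrad
p = Param(A)        # wrap the parameter to be differentiated
J = @diff loss(p)   # record the tape while computing the loss
g = grad(J, p)      # gradient of the loss w.r.t. A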

