julianlsolvers / NLsolve.jl
Julia solvers for systems of nonlinear equations and mixed complementarity problems
License: Other
Filing this to track a regression on master and to make people aware.
using NLsolve
function f!(x::Vector, fvec::Vector)
    fvec[1] = 1 - x[1]
    fvec[2] = 10(x[2] - x[1]^2)
end

function g!(x::Vector, fjac::Matrix)
    fjac[1,1] = -1
    fjac[1,2] = 0
    fjac[2,1] = -20x[1]
    fjac[2,2] = 10
end
df! = DifferentiableMultivariateFunction(f!, g!)
x0 = [-1.2; 1.]
@time for i in 1:10^4 nlsolve(df!, x0, method = :newton) end
On 0.4.3: 0.292429 seconds (5.31 M allocations: 123.749 MB, 10.00% gc time)
On master: 1.336571 seconds (5.53 M allocations: 101.471 MB, 2.06% gc time)
I believe this is because we are mutating a variable inside a closure, which is slow in 0.5; see JuliaLang/julia#15276
I get the following error when I run the ImplicitEuler test from here on the latest version of NLsolve. When I set the kwarg autodiff=false,
the error goes away. Inside the code it's just building a function which has matrix and element-wise multiplications and applying nlsolve with/without autodiff, so the issue should be something to do with that. It was working before the new versions of the packages were tagged.
When I get more time I'll try to isolate it a bit, but it should just be a function with matrix multiplications and autodiff. Note that this still occurs when I change the test case to use just Floats/Ints.
signal (11): Segmentation fault
_ZN4llvm13LiveVariables16HandleVirtRegUseEjPNS_17MachineBasicBlockEPNS_12MachineInstrE at /usr/lib64/llvm33/libLLVM-3.3.so (unknown line)
_ZN4llvm13LiveVariables20runOnMachineFunctionERNS_15MachineFunctionE at /usr/lib64/llvm33/libLLVM-3.3.so (unknown line)
_ZN4llvm13FPPassManager13runOnFunctionERNS_8FunctionE at /usr/lib64/llvm33/libLLVM-3.3.so (unknown line)
_ZN4llvm23FunctionPassManagerImpl3runERNS_8FunctionE at /usr/lib64/llvm33/libLLVM-3.3.so (unknown line)
_ZN4llvm19FunctionPassManager3runERNS_8FunctionE at /usr/lib64/llvm33/libLLVM-3.3.so (unknown line)
_ZN4llvm3JIT14jitTheFunctionEPNS_8FunctionERKNS_10MutexGuardE at /usr/lib64/llvm33/libLLVM-3.3.so (unknown line)
_ZN4llvm3JIT24runJITOnFunctionUnlockedEPNS_8FunctionERKNS_10MutexGuardE at /usr/lib64/llvm33/libLLVM-3.3.so (unknown line)
_ZN4llvm3JIT20getPointerToFunctionEPNS_8FunctionE at /usr/lib64/llvm33/libLLVM-3.3.so (unknown line)
unknown function (ip: 0x7f634d34079c)
jl_trampoline at /usr/bin/../lib64/julia/libjulia.so (unknown line)
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
rhs! at /home/crackauc/.julia/v0.4/DifferentialEquations/src/fem/femSolvers.jl:318
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
anonymous at /home/crackauc/.julia/v0.4/DifferentialEquations/src/fem/femSolvers.jl:380
anonymous at /home/crackauc/.julia/v0.4/NLsolve/src/autodiff.jl:3
chunk_mode_jacobian! at /home/crackauc/.julia/v0.4/ForwardDiff/src/jacobian.jl:191
jacobian! at /home/crackauc/.julia/v0.4/ForwardDiff/src/jacobian.jl:65
jacobian! at /home/crackauc/.julia/v0.4/ForwardDiff/src/jacobian.jl:71
julia_jacobian!_24101 at (unknown line)
jlcall_jacobian!_24101 at (unknown line)
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
jacobian! at /home/crackauc/.julia/v0.4/ForwardDiff/src/jacobian.jl:71
julia_jacobian!_24076 at (unknown line)
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
anonymous at /home/crackauc/.julia/v0.4/NLsolve/src/autodiff.jl:13
trust_region_ at /home/crackauc/.julia/v0.4/NLsolve/src/trust_region.jl:87
trust_region at /home/crackauc/.julia/v0.4/NLsolve/src/trust_region.jl:187
nlsolve at /home/crackauc/.julia/v0.4/NLsolve/src/nlsolve_func_defs.jl:24
jlcall___nlsolve#0___23923 at (unknown line)
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
julia_nlsolve_23855 at (unknown line)
nlsolve at /home/crackauc/.julia/v0.4/NLsolve/src/nlsolve_func_defs.jl:70
jlcall___nlsolve#2___23852 at (unknown line)
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
julia_nlsolve_23851 at (unknown line)
solve at /home/crackauc/.julia/v0.4/DifferentialEquations/src/fem/femSolvers.jl:380
jlcall___solve#42___23381 at (unknown line)
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
julia_solve_23380 at (unknown line)
unknown function (ip: 0x7f634d3866a3)
unknown function (ip: 0x7f634d385ae1)
unknown function (ip: 0x7f634d3857ed)
unknown function (ip: 0x7f634d386d4d)
unknown function (ip: 0x7f634d387149)
unknown function (ip: 0x7f634d39b92f)
unknown function (ip: 0x7f634d39c239)
jl_load_file_string at /usr/bin/../lib64/julia/libjulia.so (unknown line)
include_string at /home/crackauc/.julia/v0.4/CodeTools/src/eval.jl:28
jlcall_include_string_22952 at (unknown line)
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
unknown function (ip: 0x7f634d3866a3)
unknown function (ip: 0x7f634d385ae1)
unknown function (ip: 0x7f634d39b507)
jl_toplevel_eval_in at /usr/bin/../lib64/julia/libjulia.so (unknown line)
include_string at /home/crackauc/.julia/v0.4/CodeTools/src/eval.jl:32
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
anonymous at /home/crackauc/.julia/v0.4/Atom/src/eval.jl:39
withpath at /home/crackauc/.julia/v0.4/Requires/src/require.jl:37
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
withpath at /home/crackauc/.julia/v0.4/Atom/src/eval.jl:53
jl_apply_generic at /usr/bin/../lib64/julia/libjulia.so (unknown line)
anonymous at /home/crackauc/.julia/v0.4/Atom/src/eval.jl:61
unknown function (ip: 0x7f634d38d05b)
unknown function (ip: (nil))
The latest tag is pretty old and there have been some depwarn fixes since then. It would be nice to have a new tag so that the warnings aren't thrown (mostly for users of dependencies). Thanks!
I've tried to solve the following simple linear complementarity problem using NLsolve.
using NLsolve
M = [0  0 -1 -1 ;
     0  0  1 -2 ;
     1 -1  2 -2 ;
     1  2 -2  4 ]
q = [2; 2; -2; -6]
function f!(x, fvec)
    fvec = M * x + q
end
r = mcpsolve(f!, [0., 0., 0., 0.], [Inf, Inf, Inf, Inf],
[1.25, 0., 0., 0.5], reformulation = :smooth, autodiff = true)
x = r.zero # [1.25, 0.0, 0.0, 0.5]
@show dot( M*x + q, x ) # 0.5
sol = [2.8, 0.0, 0.8, 1.2]
@show dot( M*sol + q, sol ) # 0.0
While the solution is known to be [2.8, 0.0, 0.8, 1.2], I obtained [1.25, 0.0, 0.0, 0.5], the same as the initial point provided.
julia> @show r
r = Results of Nonlinear Solver Algorithm
* Algorithm: Trust-region with dogleg and autoscaling
* Starting Point: [1.25,0.0,0.0,0.5]
* Zero: [1.25,0.0,0.0,0.5]
* Inf-norm of residuals: 0.000000
* Iterations: 0
* Convergence: true
* |x - x'| < 0.0e+00: false
* |f(x)| < 1.0e-08: true
* Function Calls (f): 1
* Jacobian Calls (df/dx): 1
It seems that it did not run properly, reporting convergence after 0 iterations.
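A hedged guess at the cause, not confirmed in the thread: the `fvec = M * x + q` line above rebinds the local name `fvec` rather than mutating the buffer the solver passes in, so the solver never observes a nonzero residual and stops immediately. A minimal in-place version would be:

```julia
# Sketch of the likely fix (my assumption): assign into fvec in place
# instead of rebinding the local variable.
M = [0  0 -1 -1;
     0  0  1 -2;
     1 -1  2 -2;
     1  2 -2  4]
q = [2; 2; -2; -6]

function f_inplace!(x, fvec)
    fvec[:] = M * x + q   # writes into the solver-owned buffer
end
```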
PackageEvaluator.jl is a script that runs nightly. It attempts to load all Julia packages and run their tests (if available) on both the stable version of Julia (0.2) and the nightly build of the unstable version (0.3). The results of this script are used to generate a package listing enhanced with testing results.
Tests pass. → Tests fail, but package loads.
'Tests pass.' means that PackageEvaluator found the tests for your package, executed them, and they all passed.
'Tests fail, but package loads.' means that PackageEvaluator found the tests for your package, executed them, and they didn't pass. However, trying to load your package with using worked.
This issue was filed because your testing status became worse. No additional issues will be filed if your package remains in this state, and no issue will be filed if it improves. If you'd like to opt-out of these status-change messages, reply to this message saying you'd like to and @IainNZ will add an exception. If you'd like to discuss PackageEvaluator.jl please file an issue at the repository. For example, your package may be untestable on the test machine due to a dependency - an exception can be added.
As discussed here:
https://discourse.julialang.org/t/unified-interface-for-linear-solving/699/14
using factorization objects lets the user choose the linear solving technique, which can greatly affect the performance of the method. If the interface allowed the user to pass a factorization function, say lufact!, then they could also use PETSc.jl and other packages' solvers with the problem.
The implementation shouldn't be too difficult. Internally, you'd change:
p = fjac\fvec
to
factorized_fjac = factorization(fjac)
p = factorized_fjac\fvec
and give a default. If you make factorization=factorize
the default, it should work the same as before (there are type-stability issues with doing this as a kwarg though, so some care would be needed).
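A minimal sketch of the idea; the function name and keyword are assumptions for illustration, not NLsolve's actual API:

```julia
# Sketch: let the caller pick the factorization used for the Newton step.
# `factorization` could be `lu`, `cholesky`, a PETSc.jl wrapper, etc.
using LinearAlgebra

function newton_step(fjac, fvec; factorization = factorize)
    factorized_fjac = factorization(fjac)  # user-chosen linear solver
    factorized_fjac \ fvec
end
```

With the default `factorize` this behaves exactly like `fjac \ fvec`; passing a different callable swaps the linear solver without touching the nonlinear iteration.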
I see you have homotopy continuation methods on your TODO list. Would you be interested in implementing an interface to the PHCpack package?
See also:
janverschelde/PHCpack#3
https://groups.google.com/forum/?fromgroups=#!topic/julia-users/HTbw39F0ejc
It would be great to move the tests in this library into a standalone package similar to what Optim has done.
using NLsolve
function f!(x, fvec)
    fvec[1] = (x[1]+3)*(x[2]^3-7)+18
    fvec[2] = sin(x[2]*exp(x[1])-1)
end

function g!(x, fjac)
    fjac[1, 1] = x[2]^3-7
    fjac[1, 2] = 3*x[2]^2*(x[1]+3)
    u = exp(x[1])*cos(x[2]*exp(x[1])-1)
    fjac[2, 1] = x[2]*u
    fjac[2, 2] = u
end
nlsolve(f!, g!, [ 0.1+im; 1.2])
MethodError: no method matching #nlsolve#12(::Symbol, ::Complex{Float64}, ::Complex{Float64}, ::Int64, ::Bool, ::Bool, ::Bool, ::NLsolve.#no_linesearch!, ::Complex{Float64}, ::Bool, ::Int64, ::Float64, ::NLsolve.#nlsolve, ::#f!, ::#g!, ::Array{Complex{Float64},1})
Closest candidates are:
#nlsolve#12(::Symbol, !Matched::Real, !Matched::Real, ::Integer, ::Bool, ::Bool, ::Bool, ::Function, !Matched::Real, ::Bool, ::Integer, ::Real, ::Any, ::Function, ::Function, ::Array{T,1}) where T at C:\Users\Chris\.julia\v0.6\NLsolve\src\nlsolve_func_defs.jl:52
nlsolve(::Function, ::Function, ::Array{Complex{Float64},1}) at nlsolve_func_defs.jl:52
include_string(::String, ::String) at loading.jl:515
include_string(::String, ::String, ::Int64) at eval.jl:30
include_string(::Module, ::String, ::String, ::Int64, ::Vararg{Int64,N} where N) at eval.jl:34
(::Atom.##49#52{String,Int64,String})() at eval.jl:50
withpath(::Atom.##49#52{String,Int64,String}, ::String) at utils.jl:30
withpath(::Function, ::String) at eval.jl:38
macro expansion at eval.jl:49 [inlined]
(::Atom.##48#51{Dict{String,Any}})() at task.jl:80
It looks like there are assumptions baked in that the inputs need to be real numbers:
https://github.com/JuliaNLSolvers/NLsolve.jl/blob/master/src/nlsolve_func_defs.jl#L5
Could those be removed?
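Until then, one possible workaround (my sketch, not part of NLsolve) is to solve over the stacked real and imaginary parts, which is mathematically equivalent for a complex system viewed as a real one of twice the dimension:

```julia
# Sketch: map a complex vector to a real one and back, so a complex
# residual can be fed to a real-only solver. Helper names are hypothetical.
to_real(z) = vcat(real(z), imag(z))
to_complex(v) = complex.(v[1:end÷2], v[end÷2+1:end])

# Wrap a complex in-place residual F! as a real in-place residual:
wrap_complex(F!) = (x, fvec) -> begin
    z = to_complex(x)
    fz = similar(z)
    F!(z, fz)
    fvec[:] = to_real(fz)
end
```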
PackageEvaluator.jl is a script that runs nightly. It attempts to load all Julia packages and run their tests (if available) on both the stable version of Julia (0.2) and the nightly build of the unstable version (0.3). The results of this script are used to generate a package listing enhanced with testing results.
Tests pass. → Package doesn't load.
'Tests pass.' means that PackageEvaluator found the tests for your package, executed them, and they all passed.
'Package doesn't load.' means that PackageEvaluator did not find tests for your package; additionally, trying to load your package with using failed.
This issue was filed because your testing status became worse. No additional issues will be filed if your package remains in this state, and no issue will be filed if it improves. If you'd like to opt-out of these status-change messages, reply to this message saying you'd like to and @IainNZ will add an exception. If you'd like to discuss PackageEvaluator.jl please file an issue at the repository. For example, your package may be untestable on the test machine due to a dependency - an exception can be added.
Test log:
INFO: Installing ArrayViews v0.4.6
INFO: Installing Calculus v0.1.4
INFO: Installing Distributions v0.5.2
INFO: Installing DualNumbers v0.1.0
INFO: Installing NLsolve v0.1.4
INFO: Installing Optim v0.3.0
INFO: Installing Options v0.2.2
INFO: Installing PDMats v0.2.1
INFO: Installing StatsBase v0.5.3
INFO: Package database updated
ERROR: NumericExtensions not found
in require at loading.jl:47
in include at ./boot.jl:245
in include_from_node1 at ./loading.jl:128
in reload_path at loading.jl:152
in _require at loading.jl:67
in require at loading.jl:54
in include at ./boot.jl:245
in include_from_node1 at ./loading.jl:128
in reload_path at loading.jl:152
in _require at loading.jl:67
in require at loading.jl:54
in include at ./boot.jl:245
in include_from_node1 at ./loading.jl:128
in reload_path at loading.jl:152
in _require at loading.jl:67
in require at loading.jl:54
in include at ./boot.jl:245
in include_from_node1 at ./loading.jl:128
in reload_path at loading.jl:152
in _require at loading.jl:67
in require at loading.jl:51
in include at ./boot.jl:245
in include_from_node1 at loading.jl:128
in process_options at ./client.jl:285
in _start at ./client.jl:354
while loading /home/idunning/pkgtest/.julia/v0.3/PDMats/src/PDMats.jl, in expression starting on line 9
while loading /home/idunning/pkgtest/.julia/v0.3/Distributions/src/Distributions.jl, in expression starting on line 4
while loading /home/idunning/pkgtest/.julia/v0.3/Optim/src/Optim.jl, in expression starting on line 5
while loading /home/idunning/pkgtest/.julia/v0.3/NLsolve/src/NLsolve.jl, in expression starting on line 3
while loading /home/idunning/pkgtest/.julia/v0.3/NLsolve/testusing.jl, in expression starting on line 1
INFO: Package database updated
JuliaLang/METADATA.jl#6293 (comment)
To fix when that is done.
Using the README example on master:
julia> using NLsolve

julia> function f!(x, fvec)
           fvec[1] = (x[1]+3)*(x[2]^3-7)+18
           fvec[2] = sin(x[2]*exp(x[1])-1)
       end
f! (generic function with 1 method)

julia> function g!(x, fjac)
           fjac[1, 1] = x[2]^3-7
           fjac[1, 2] = 3*x[2]^2*(x[1]+3)
           u = exp(x[1])*cos(x[2]*exp(x[1])-1)
           fjac[2, 1] = x[2]*u
           fjac[2, 2] = u
       end
g! (generic function with 1 method)

julia> nlsolve(f!, g!, [ 0.1; 1.2])
ERROR: UndefVarError: tracing not defined
in trust_region_(::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Float64, ::Bool) at C:\Users\Chris\.julia\v0.6\NLsolve\src\trust_region.jl:3
in #nlsolve#17(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Function, ::Float64, ::Bool, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at C:\Users\Chris\.julia\v0.6\NLsolve\src\nlsolve_func_defs.jl:24
in (::NLsolve.#kw##nlsolve)(::Array{Any,1}, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at .\<missing>:0
in nlsolve(::Function, ::Function, ::Array{Float64,1}) at C:\Users\Chris\.julia\v0.6\NLsolve\src\nlsolve_func_defs.jl:45
ref: JuliaNLSolvers/Optim.jl#358
Optim is leaving JuliaOpt, as the latter is refocusing to be an org for MathProgBase, JuMP and extensions, and backend solvers. This means that Optim+LineSearches+LsqFit is looking for a new home: a new org (currently the most likely name, according to the issue above, is JuliaNLSolvers).
The org is going to contain a quite tightly knit collection of packages: A base package, Optim, LineSearches, some benchmarking and testing packages, and a Least Squares package + curve fitting (LsqFit currently - not sure if there's some refactoring that makes sense there). There's a lot of common code base in there, so it actually does make a lot of sense for us to move.
I was hoping that NLsolve could follow, such that we could get a stack of solver packages (optimization and nonlinear equations solving) written in Julia, and with a (to the extent possible) similar API. There's a lot of code that is repeated across Optim and NLsolve, so I think we can really improve the development work flow and throughput by joining forces formally.
Below, I've CCed everyone who has contributed to this package. The reason is that I very much prefer that NLsolve (NLSolve?...another discussion) moves into our new home as an MIT-licensed package. If people do not agree, we can discuss it, but let's just say that I prefer it that way, and I know other people in the Optim stack do as well.
So this issue is really about the following two questions:
Thanks!
cc: @KristofferC @sebastien-villemot @timholy @ChrisRackauckas @matthieugomez @tkelman @sglyon @mlubin
It would be nice if NLsolve accepted AbstractArray inputs x, operated internally on vec(x), and just reshaped afterwards (and with each function call) to hide this abstraction from the user. Mathematically (and computationally, since vec and reshape are views) there's no difference here, but it takes a burden off the user.
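The wrapping this would amount to internally can be sketched in a few lines (the helper name is hypothetical); since reshape shares memory with the underlying vector, writes through the reshaped view are visible to the solver:

```julia
# Sketch: present a flat-vector view of an arbitrarily-shaped residual
# to a solver that only understands vectors.
function wrap_shape(f!, shape)
    (xvec, fvec) -> f!(reshape(xvec, shape), reshape(fvec, shape))
end
```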
It would be really nice to add banded Jacobian matrix support in NLsolve.jl.
Would be great to have a tagged version that uses the new ForwardDiff.
Just a question: do we want to do this? It would cost one function evaluation and would rarely happen, but would be nice when it does happen...
The nice part of the DifferentiableFunction object is that, depending on which derivative orders were supplied for the function to solve, it can ensure that no unnecessary calculations are performed.
If f is the function to solve, g its derivative, and fg a function that computes both at the same time, it is natural to define a DifferentiableFunction with any of:
f
f and g
f and fg
fg
f and g and fg
so the solver can optionally specialize on the more efficient call order.
PackageEvaluator.jl is a script that runs nightly. It attempts to load all Julia packages and run their tests (if available) on both the stable version of Julia (0.2) and the nightly build of the unstable version (0.3). The results of this script are used to generate a package listing enhanced with testing results.
Tests pass. → Tests fail, but package loads.
'Tests pass.' means that PackageEvaluator found the tests for your package, executed them, and they all passed.
'Tests fail, but package loads.' means that PackageEvaluator found the tests for your package, executed them, and they didn't pass. However, trying to load your package with using worked.
This issue was filed because your testing status became worse. No additional issues will be filed if your package remains in this state, and no issue will be filed if it improves. If you'd like to opt-out of these status-change messages, reply to this message saying you'd like to and @IainNZ will add an exception. If you'd like to discuss PackageEvaluator.jl please file an issue at the repository. For example, your package may be untestable on the test machine due to a dependency - an exception can be added.
Test log:
INFO: Installing ArrayViews v0.4.4
INFO: Installing Calculus v0.1.3
INFO: Installing Distributions v0.4.7
INFO: Installing DualNumbers v0.1.0
INFO: Installing NLsolve v0.1.3
INFO: Installing Optim v0.2.0
INFO: Installing Options v0.2.2
INFO: Installing PDMats v0.2.0
INFO: Installing StatsBase v0.4.0
INFO: Package database updated
ERROR: assertion failed: lsr.slope[ia] < 0
in bisect! at /home/idunning/pkgtest/.julia/v0.3/Optim/src/linesearch/hz_linesearch.jl:575
in hz_linesearch! at /home/idunning/pkgtest/.julia/v0.3/Optim/src/linesearch/hz_linesearch.jl:273
in hz_linesearch! at /home/idunning/pkgtest/.julia/v0.3/Optim/src/linesearch/hz_linesearch.jl:201
in newton at /home/idunning/pkgtest/.julia/v0.3/NLsolve/src/newton.jl:121
in nlsolve at /home/idunning/pkgtest/.julia/v0.3/NLsolve/src/NLsolve.jl:261
in include at boot.jl:244
in anonymous at no file:15
in include at boot.jl:244
in include_from_node1 at loading.jl:128
while loading /home/idunning/pkgtest/.julia/v0.3/NLsolve/test/2by2.jl, in expression starting on line 31
while loading /home/idunning/pkgtest/.julia/v0.3/NLsolve/test/runtests.jl, in expression starting on line 13
INFO: Package database updated
Currently, a simple function f(x) = sin(x)
can't be solved directly, because the solver expects the function to store its result in its second argument instead of returning it. This is good for memory usage but frustrating for very simple problems like rapidly solving f(x) = sin(x).
Some suggestions to improve the matter:
1/ check the function arity
2/ have another convenience function to call instead of nlsolve
3/ by default, nlsolve could assume that the value is returned, but for the DifferentiableFunction type it would assume that in-place operations are performed.
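Suggestion 2 could look something like the following (the name and semantics are my assumption, not an existing NLsolve API): a tiny adapter from an out-of-place function to the in-place signature the solver expects.

```julia
# Sketch: adapt a function that returns its residual vector to the
# in-place (x, fvec) API. Hypothetical helper name.
not_in_place(f) = (x, fvec) -> copyto!(fvec, f(x))
```

With this, `nlsolve(not_in_place(x -> [sin(x[1])]), [1.0])` would work without the user writing a mutating wrapper by hand.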
As it is right now, the gradient function is called twice per iteration in the Newton scheme.
The first time here: https://github.com/EconForge/NLsolve.jl/blob/ae1e89a5795324a7a35f82a22499932cc1f569f7/src/newton.jl#L96
and the second time in the linesearch, i.e.: https://github.com/JuliaOpt/Optim.jl/blob/master/src/linesearch/backtracking_linesearch.jl#L26
This is a bit unfortunate since the gradient is sometimes quite expensive to compute.
We could fix this by "cheating": the linesearches only compute the gradient at the current point, where we have already computed the derivative.
As an example, consider this test case with an artificial sleep introduced in the gradient:
function f!(x, fvec)
    fvec[1] = (x[1]+3)*(x[2]^3-7)+18
    fvec[2] = sin(x[2]*exp(x[1])-1)
end

function g!(x, fjac)
    fjac[1, 1] = x[2]^3-7
    fjac[1, 2] = 3*x[2]^2*(x[1]+3)
    u = exp(x[1])*cos(x[2]*exp(x[1])-1)
    fjac[2, 1] = x[2]*u
    fjac[2, 2] = u
    sleep(1)
end

df = DifferentiableMultivariateFunction(f!, g!)
I get this on master (7 seconds):
julia> @time r = nlsolve(df, [ -0.5; 1.4], method = :newton, linesearch! = Optim.backtracking_linesearch!, ftol = 1e-6)
7.020818 seconds (237 allocations: 10.500 KB)
Results of Nonlinear Solver Algorithm
* Algorithm: Newton with line-search
* Starting Point: [-0.5,1.4]
* Zero: [-1.779077697780431e-8,1.0000000055513854]
* Inf-norm of residuals: 0.000000
* Iterations: 3
* Convergence: true
* |x - x'| < 0.0e+00: false
* |f(x)| < 1.0e-06: true
* Function Calls (f): 10
* Jacobian Calls (df/dx): 6
Using #33 I get (3 seconds):
julia> @time r = nlsolve(df, [ -0.5; 1.4], method = :newton, linesearch! = Optim.backtracking_linesearch!, ftol = 1e-6)
3.009181 seconds (189 allocations: 8.359 KB)
Results of Nonlinear Solver Algorithm
* Algorithm: Newton with line-search
* Starting Point: [-0.5,1.4]
* Zero: [-1.779077697780431e-8,1.0000000055513854]
* Inf-norm of residuals: 0.000000
* Iterations: 3
* Convergence: true
* |x - x'| < 0.0e+00: false
* |f(x)| < 1.0e-06: true
* Function Calls (f): 10
* Jacobian Calls (df/dx): 3
It is a bit ugly and we have to double-check that it works with the other line searches, but in my opinion we need to do something to fix this, because calling the gradient twice is unacceptable.
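One way to sketch the "cheat" described above (names hypothetical; this is not necessarily how #33 implements it): memoize the Jacobian call at the last evaluated point, so the linesearch's repeated request at the same x is free. It assumes the same fjac buffer is reused between calls, which is the case in the in-place API.

```julia
# Sketch: skip recomputing the Jacobian when g! is asked for the same
# point twice in a row (as the linesearch does at the current iterate).
function memoized_g!(g!)
    last_x = Ref{Vector{Float64}}(Float64[])
    (x, fjac) -> begin
        if x == last_x[]
            return nothing     # fjac already holds the Jacobian at x
        end
        g!(x, fjac)
        last_x[] = copy(x)
        nothing
    end
end
```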
From the README
using NLsolve
function f!(x, fvec)
    fvec[1] = (x[1]+3)*(x[2]^3-7)+18
    fvec[2] = sin(x[2]*exp(x[1])-1)
end
nlsolve(f!, [ 0.1; 1.2],autodiff=true)
gives the error
MethodError: no method matching (::NLsolve.##26#29{#f!})(::Array{ForwardDiff.Dual{2,Float64},1})
Closest candidates are:
#26(::Any, !Matched::Any) at /home/crackauc/.julia/v0.5/NLsolve/src/autodiff.jl:3
in vector_mode_jacobian!(::DiffBase.DiffResult{1,Array{Float64,1},Tuple{Array{Float64,2}}}, ::NLsolve.##26#29{#f!}, ::Array{Float64,1}, ::ForwardDiff.JacobianConfig{2,Float64,Array{ForwardDiff.Dual{2,Float64},1}}) at /home/crackauc/.julia/v0.5/ForwardDiff/src/jacobian.jl:92
in jacobian!(::DiffBase.DiffResult{1,Array{Float64,1},Tuple{Array{Float64,2}}}, ::NLsolve.##26#29{#f!}, ::Array{Float64,1}, ::ForwardDiff.JacobianConfig{2,Float64,Array{ForwardDiff.Dual{2,Float64},1}}) at /home/crackauc/.julia/v0.5/ForwardDiff/src/jacobian.jl:23
in (::NLsolve.##28#31{NLsolve.##26#29{#f!}})(::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,2}) at /home/crackauc/.julia/v0.5/NLsolve/src/autodiff.jl:13
in trust_region_(::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Float64, ::Bool) at /home/crackauc/.julia/v0.5/NLsolve/src/trust_region.jl:87
in #nlsolve#17(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Function, ::Float64, ::Bool, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at /home/crackauc/.julia/v0.5/NLsolve/src/nlsolve_func_defs.jl:24
in (::NLsolve.#kw##nlsolve)(::Array{Any,1}, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at ./<missing>:0
in #nlsolve#19(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Function, ::Float64, ::Bool, ::Bool, ::NLsolve.#nlsolve, ::#f!, ::Array{Float64,1}) at /home/crackauc/.julia/v0.5/NLsolve/src/nlsolve_func_defs.jl:70
in (::NLsolve.#kw##nlsolve)(::Array{Any,1}, ::NLsolve.#nlsolve, ::Function, ::Array{Float64,1}) at ./<missing>:0
in (::Atom.##67#70)() at /home/crackauc/.julia/v0.5/Atom/src/eval.jl:40
in withpath(::Atom.##67#70, ::Void) at /home/crackauc/.julia/v0.5/CodeTools/src/utils.jl:30
in withpath(::Function, ::Void) at /home/crackauc/.julia/v0.5/Atom/src/eval.jl:46
in macro expansion at /home/crackauc/.julia/v0.5/Atom/src/eval.jl:109 [inlined]
in (::Atom.##66#69)() at ./task.jl:60
Could this be due to the newest release of DiffBase?
One of my students last semester did an extensive study of Broyden updates for his final project (compared to finite-differencing etc.). In order to get good performance (e.g. to beat Matlab's fsolve and Sundials' kinsol, both of which use finite differences), it turns out that you need to be somewhat careful about how you do the line searches, and you occasionally may need to "restart" with a finite-difference Jacobian if progress stagnates; these practical robustness issues weren't described well in the literature.
It would be good to touch base with him before banging on that feature (which should be pretty easy to implement once you know the tricks). I think he would be willing to share his paper and his (Matlab) code as a starting point; I don't think he has a github account, but I could check with him if you like.
Hi, thanks for putting this package out there! I've been using it more and more in IncrementalInference.jl. My use case suffers a large performance penalty for a rather trivial reason: I am frequently generating new function handles (lambdas) during general solving. The performance hit comes each time I have to generate a new lambda function, since the first run requires type inference inside Julia itself, which in turn invokes a mass of code and resources each time.
I can avoid this problem by generating the lambdas less frequently, but that would require NLsolve to allow me to pass additional parameters through to my residual functions. Currently I'm forced to repeatedly do
f! = (x, res) -> somefnc(x, res, ...)
Would something like this be possible:
function f2!(x, res, p1, p2)
    @show p1, p2
    # compute with variables
    res[:] = zeros(length(res))
    nothing
end
NLsolve.nlsolve(f2!, init, params1, params2)
I tried the following, and multiple dispatch seems to handle it fine (testing on Julia v0.5):
function nlsolve{T}(f!::Function,
                    g!::Function,
                    initial_x::Vector{T},
                    params...;
                    method::Symbol = :trust_region)
    #
    f!(initial_x, ones(2), params...)
end

nlsolve(f2!, +, ones(2), 2.0, 3.0)
nlsolve(f2!, +, ones(2), 4.0, 5.0, method = :newton)
nlsolve(f2!, +, ones(2), :foo, "bar")

# for reference and compatibility
function f0!(x, res)
    @show x
    res[:] = zeros(length(res))
    nothing
end

nlsolve(f0!, +, ones(2))
nlsolve(f0!, +, -1*ones(2), method = :newton)
This would change the function nlsolve{T}(...) lines of src/nlsolve_func_defs.jl, and all the places where the function is called in the :newton and :trust_region cases.
Is anyone else interested in such an ability and/or not in favor of such a change?
Thanks
This issue is being filed by a script, but if you reply, I will see it.
PackageEvaluator.jl is a script that runs nightly. It attempts to load all Julia packages and run their tests (if available) on both the stable version of Julia (0.2) and the nightly build of the unstable version (0.3).
The results of this script are used to generate a package listing enhanced with testing results.
The status of this package, NLsolve, on...
'No tests, but package loads.' can be due to there being no tests (you should write some if you can!) but can also be due to PackageEvaluator not being able to find your tests. Consider adding a test/runtests.jl file.
'Package doesn't load.' is the worst-case scenario. Sometimes this arises because your package doesn't have BinDeps support, or needs something that can't be installed with BinDeps. If this is the case for your package, please file an issue and an exception can be made so your package will not be tested.
This automatically filed issue is a one-off message. Starting soon, issues will only be filed when the testing status of your package changes in a negative direction (gets worse). If you'd like to opt-out of these status-change messages, reply to this message.
We should update the AD stuff to the new ForwardDiff.
Which prevents the package from being updated or removed by Pkg. Not good - this happens on PkgEval, and I can also get it to happen on Ubuntu 14.04. Here's what that file is looking like for me: https://gist.github.com/4e59d49d58852062664f6970bbb82b11
function sistema_NLS(x, função)
    função[1] = 3*x[1] + cos(x[2]) - 5
    função[2] = -sin(x[1]) + x[2] - 2
end
solNLS = nlsolve(sistema_NLS, [ 1.0, 1.0])
WARNING: slice is deprecated, use view instead.
in depwarn(::String, ::Symbol) at ./deprecated.jl:64
in slice(::Array{Float64,2}, ::Vararg{Any,N}) at ./deprecated.jl:30
in sumabs2j at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/utils.jl:1 [inlined]
in trust_region_(::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Float64, ::Bool) at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/trust_region.jl:105
in #nlsolve#17(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Function, ::Float64, ::Bool, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/nlsolve_func_defs.jl:24
in (::NLsolve.#kw##nlsolve)(::Array{Any,1}, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at ./:0
in #nlsolve#19(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Function, ::Float64, ::Bool, ::Bool, ::NLsolve.#nlsolve, ::#sistema_NLS, ::Array{Float64,1}) at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/nlsolve_func_defs.jl:70
in nlsolve(::Function, ::Array{Float64,1}) at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/nlsolve_func_defs.jl:65
in include_string(::String, ::String) at ./loading.jl:441
in execute_request(::ZMQ.Socket, ::IJulia.Msg) at /home/jmarcellopereira/.julia/v0.5/IJulia/src/execute_request.jl:169
in eventloop(::ZMQ.Socket) at /home/jmarcellopereira/.julia/v0.5/IJulia/src/eventloop.jl:8
in (::IJulia.##9#15)() at ./task.jl:360
while loading In[3], in expression starting on line 1
WARNING: slice is deprecated, use view instead.
in depwarn(::String, ::Symbol) at ./deprecated.jl:64
in slice(::Array{Float64,2}, ::Vararg{Any,N}) at ./deprecated.jl:30
in sumabs2j at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/utils.jl:1 [inlined]
in trust_region_(::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Float64, ::Bool) at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/trust_region.jl:145
in #nlsolve#17(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Function, ::Float64, ::Bool, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/nlsolve_func_defs.jl:24
in (::NLsolve.#kw##nlsolve)(::Array{Any,1}, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at ./:0
in #nlsolve#19(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Function, ::Float64, ::Bool, ::Bool, ::NLsolve.#nlsolve, ::#sistema_NLS, ::Array{Float64,1}) at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/nlsolve_func_defs.jl:70
in nlsolve(::Function, ::Array{Float64,1}) at /home/jmarcellopereira/.julia/v0.5/NLsolve/src/nlsolve_func_defs.jl:65
in include_string(::String, ::String) at ./loading.jl:441
in execute_request(::ZMQ.Socket, ::IJulia.Msg) at /home/jmarcellopereira/.julia/v0.5/IJulia/src/execute_request.jl:169
in eventloop(::ZMQ.Socket) at /home/jmarcellopereira/.julia/v0.5/IJulia/src/eventloop.jl:8
in (::IJulia.##9#15)() at ./task.jl:360
while loading In[3], in expression starting on line 1
PackageEvaluator.jl is a script that runs nightly. It attempts to load all Julia packages and run their tests (if available) on both the stable version of Julia (0.3) and the nightly build of the unstable version (0.4). The results of this script are used to generate a package listing enhanced with testing results.
Tests pass.
Tests fail.
This issue was filed because your testing status became worse. No additional issues will be filed if your package remains in this state, and no issue will be filed if it improves. If you'd like to opt-out of these status-change messages, reply to this message saying you'd like to and @IainNZ will add an exception. If you'd like to discuss PackageEvaluator.jl please file an issue at the repository. For example, your package may be untestable on the test machine due to a dependency - an exception can be added.
Test log:
>>> 'Pkg.add("NLsolve")' log
INFO: Cloning cache of NLsolve from git://github.com/EconForge/NLsolve.jl.git
INFO: Installing Calculus v0.1.8
INFO: Installing DualNumbers v0.1.3
INFO: Installing NLsolve v0.3.3
INFO: Installing NaNMath v0.0.2
INFO: Installing Optim v0.4.1
INFO: Package database updated
>>> 'Pkg.test("NLsolve")' log
INFO: Testing NLsolve
WARNING: [a] concatenation is deprecated; use collect(a) instead
in depwarn at ./deprecated.jl:62
in oldstyle_vcat_warning at ./abstractarray.jl:29
in vect at abstractarray.jl:32
in include at ./boot.jl:254
in include_from_node1 at ./loading.jl:133
in include at ./boot.jl:254
in include_from_node1 at ./loading.jl:133
in reload_path at ./loading.jl:157
in _require at ./loading.jl:69
in require at ./loading.jl:55
in include at ./boot.jl:254
in include_from_node1 at ./loading.jl:133
in reload_path at ./loading.jl:157
in _require at ./loading.jl:69
in require at ./loading.jl:52
in include at ./boot.jl:254
in include_from_node1 at loading.jl:133
in process_options at ./client.jl:304
in _start at ./client.jl:404
while loading /home/vagrant/.julia/v0.4/Optim/src/problems/unconstrained.jl, in expression starting on line 45
WARNING: [a] concatenation is deprecated; use collect(a) instead
in depwarn at ./deprecated.jl:62
in oldstyle_vcat_warning at ./abstractarray.jl:29
in vect at abstractarray.jl:32
in include at ./boot.jl:254
in include_from_node1 at ./loading.jl:133
in include at ./boot.jl:254
in include_from_node1 at ./loading.jl:133
in reload_path at ./loading.jl:157
in _require at ./loading.jl:69
in require at ./loading.jl:55
in include at ./boot.jl:254
in include_from_node1 at ./loading.jl:133
in reload_path at ./loading.jl:157
in _require at ./loading.jl:69
in require at ./loading.jl:52
in include at ./boot.jl:254
in include_from_node1 at loading.jl:133
in process_options at ./client.jl:304
in _start at ./client.jl:404
while loading /home/vagrant/.julia/v0.4/Optim/src/problems/unconstrained.jl, in expression starting on line 86
Running tests:
* 2by2.jl
ERROR: LoadError: LoadError: AssertionError: converged(r)
in include at ./boot.jl:254
in include_from_node1 at ./loading.jl:133
in anonymous at no file:18
in include at ./boot.jl:254
in include_from_node1 at loading.jl:133
in process_options at ./client.jl:304
in _start at ./client.jl:404
while loading /home/vagrant/.julia/v0.4/NLsolve/test/2by2.jl, in expression starting on line 42
while loading /home/vagrant/.julia/v0.4/NLsolve/test/runtests.jl, in expression starting on line 16
===============================[ ERROR: NLsolve ]===============================
failed process: Process(`/home/vagrant/julia/bin/julia --check-bounds=yes --code-coverage=none --color=no /home/vagrant/.julia/v0.4/NLsolve/test/runtests.jl`, ProcessExited(1)) [1]
================================================================================
INFO: No packages to install, update or remove
ERROR: NLsolve had test errors
in error at ./error.jl:21
in test at pkg/entry.jl:746
in anonymous at pkg/dir.jl:31
in cd at file.jl:22
in cd at pkg/dir.jl:31
in test at pkg.jl:71
in process_options at ./client.jl:280
in _start at ./client.jl:404
>>> End of log
In Optim, we've implemented forward-mode automatic differentiation for computing exact numerical gradients. It seems like this would be helpful for computing Jacobians here. If you're interested in incorporating this functionality into NLsolve, I'm happy to help out.
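A sketch of the ForwardDiff-style API such a proposal would build on (this is ForwardDiff's public interface, not anything NLsolve exposes; the function here is the 2-by-2 example from earlier in this thread):

```julia
using ForwardDiff

# Forward-mode AD gives an exact (to floating point) Jacobian without
# a hand-written g!.
f(x) = [1 - x[1], 10 * (x[2] - x[1]^2)]

J = ForwardDiff.jacobian(f, [-1.2, 1.0])
# At x = [-1.2, 1.0] this matches the hand-written Jacobian:
# [-1.0 0.0; 24.0 10.0]
```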
For example,
Iter f(x) inf-norm Step 2-norm
------ -------------- --------------
0 3.324599e+00 NaN
1 1.074657e+00 1.023475e+01
2 1.074657e+00 8.011869e-31
3 1.074657e+00 0.000000e+00
4 1.074657e+00 1.232595e-32
5 1.074657e+00 8.011869e-31
6 1.074657e+00 0.000000e+00
7 7.190746e-01 4.632532e+00
8 7.190746e-01 5.320852e-01
Notice that iterations 7 and 8 have the same inf-norm; the solver then stays there forever.
It would be nice to have methods like the Anderson-accelerated fixed-point iterations of Walker & Ni,
https://users.wpi.edu/~walker/Papers/Walker-Ni,SINUM,V49,1715-1735.pdf
also available through NLsolve.jl.
See #95
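For reference, a minimal sketch of Anderson acceleration (depth m) for a fixed-point map g, following Walker & Ni (2011). All names here are illustrative; this is not NLsolve code:

```julia
using LinearAlgebra

# Anderson acceleration: combine the last m+1 iterates with weights that
# minimize the norm of the combined residual f_i = g(x_i) - x_i.
function anderson(g, x0; m = 2, iters = 50, tol = 1e-10)
    x = copy(x0)
    X = Vector{typeof(x)}()   # history of iterates
    G = Vector{typeof(x)}()   # history of g evaluations
    for k in 1:iters
        gx = g(x)
        norm(gx - x) < tol && return gx
        push!(X, copy(x)); push!(G, copy(gx))
        length(X) > m + 1 && (popfirst!(X); popfirst!(G))
        n = length(X)
        if n == 1
            x = gx                         # plain fixed-point step
        else
            F  = hcat((G[i] - X[i] for i in 1:n)...)   # residual history
            dF = F[:, 2:end] .- F[:, 1]
            γ  = dF \ (-F[:, 1])           # least-squares weights
            α  = [1 - sum(γ); γ]           # weights summing to one
            x  = sum(α[i] * G[i] for i in 1:n)
        end
    end
    return x
end
```

For a contraction like x -> cos(x) this converges in a handful of iterations, versus dozens for plain fixed-point iteration.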
I'm getting a domain error when running the NLsolve code given below: "DomainError in ^ at math.jl:252" I think this is because mcpsolve is straying out of the zero one bounds I have given it, and so tries to evaluate the root of a negative number. Is this something that could/should be fixed? I may well be wrong in my diagnosis though - happy to be corrected.
David
using Distributions
using Devectorize
using Distances
using StatsBase
using NumericExtensions
using NLsolve
beta = 0.95;
lambda0 = .90;
lambda1 = 0.05;
mu = 2;
rho = 5.56;
xmin= 0.73;
xmax = xmin+1;
z0 = 0.0;
nu = 0.64;
sigma = 0.023;
alpha = 2;
TFP = 1;
eta = 0.3;
delta = 0.01;
amaximum=500;
mwruns=15;
gamma=0.5
kappa = 1
psi=0.5
ns = 50;  # ns was not defined in the original post; 50 is an arbitrary grid size
prod=linspace(xmin,xmax,ns);
l1=0.7
l2=0.3
wbar=1
r=((1/beta)-1-1e-6 +delta)
## Test code
function f!(x, fvec)
ps1= wbar + ((kappa*(1-beta*(1-sigma*((1-x[1])/x[1]))))/(beta*((x[1]/(sigma*(1-x[1])))^(gamma/(1- gamma)))*(1/(2-x[1]))))
ps2= wbar + ((kappa*(1-beta*(1-sigma*((1-x[2])/x[1]))))/(beta*((x[2]/(sigma*(1-x[2])))^(gamma/(1-gamma)))*(1/(2-x[2]))))
prod1=prod[1]
prod2=prod[50]
y1=(1-x[1])*l1
y2=(1-x[2])*l2
M=(((prod1*y1)^((psi-1)/psi))+((prod2*y2)^((psi-1)/psi)))^(psi/(psi-1))
K=((r/eta)^(1/(eta-1)))*M
pd1=(1-eta)*(K^eta)*(M^(-eta))*((((prod1*y1)^((psi-1)/psi))+((prod2*y2)^((psi-1)/psi)))^(1/(psi-1)))* ((prod1*y1)^(-1/psi))*prod1
pd2=(1-eta)*(K^eta)*(M^(-eta))*((((prod1*y1)^((psi-1)/psi))+((prod2*y2)^((psi-1)/psi)))^(1/(psi-1)))*((prod2*y2)^(-1/psi))*prod2
fvec[1]=pd1-ps1
fvec[2]=pd2-ps2
end
mcpsolve(f!,[0.0,0.0],[1.0,1.0], [ 0.3; 0.3])
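The diagnosis is plausible: for Float64 arguments, a fractional power of a negative base throws a DomainError, and the solver may evaluate f! at trial points outside the supplied bounds. One common workaround (an assumption about usage, not an NLsolve feature) is to clamp the iterate into the open unit interval before taking powers:

```julia
# (-0.1)^0.5 throws DomainError for Float64 inputs.
# Guarding the iterate keeps trial points the solver visits outside
# (0, 1) from crashing the residual function.
safe_unit(v) = clamp(v, 1e-10, 1 - 1e-10)
```

Inside f!, one would then write e.g. `u = safe_unit(x[1])` and take powers of u instead of x[1].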
I have been using this package extensively for solving collocation problems in economics. In my previous experience (with other code), trust region methods have almost always dominated quasi-Newton methods, and definitely Newton methods. This is in accordance with what the Nocedal-Wright book says.
However, with NLsolve, trust region methods frequently don't converge, while plain vanilla Newton converges very well. An MWE is
using NLsolve
"Freudenstein and Roth function."
function f!(x, fval)
fval[1] = -13 + x[1] + ((5-x[2])*x[2]-2)*x[2]
fval[2] = -29 + x[1] + ((x[2] + 1)*x[2]-14)*x[2]
end
o1 = nlsolve(f!, [0.5, -2]; autodiff = true)
NLsolve.converged(o1) # false
o1.zero # "bad root" 11.4128, -0.896805
o2 = nlsolve(f!, [0.5, -2]; autodiff = true, method = :newton)
NLsolve.converged(o2) # true
o2.zero # root 5, 4
I don't think there is anything wrong with trust region as a method in theory, so I am wondering if there is a bug in the implementation.
I would like to help out. I could contribute a collection of standard test problems (something similar exists in Optim.jl too). I am not aware of an existing library/collection like this for Julia.
Many of my optimization attempts failed with DomainError() exceptions when the optimizer tries any point where the function is not properly defined. Is there a way to avoid that with the existing algorithms?
Otherwise, it would be nice if the backtracking algorithm didn't stop the first time it sees an invalid value.
Here is an example:
import Optim
using NLsolve
# w() has a zero around 0.16, is not defined for x<0.15
w(x) = (x-0.15).^0.2-0.4
# optimizing starting from below the root works:
sol = nlsolve(not_in_place(w), [0.151], method=:newton)
sol.zero
# not from above (raises DomainError())
sol = nlsolve(not_in_place(w), [0.4], method=:newton)
# backtracking doesn't avoid bad points (raises DomainError())
sol = nlsolve(not_in_place(w), [0.4], method=:newton, linesearch! = Optim.backtracking_linesearch!)
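One common workaround (an assumption about usage, not an NLsolve feature) is to make the residual return NaN instead of throwing, so that a linesearch can recognize the trial point as bad and backtrack rather than crash:

```julia
# Same function as above, but returning NaN outside its domain instead
# of letting ^ throw a DomainError.
w_safe(x) = x > 0.15 ? (x - 0.15)^0.2 - 0.4 : NaN
```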
Many of the inner algorithms perform large allocations each time they are called:
https://github.com/JuliaNLSolvers/NLsolve.jl/blob/master/src/newton.jl#L43
This is painful when using NLsolve repeatedly and iteratively in a loop. It would be nice to have a way to keep the cache, so that all of the inner variables can be reused.
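A sketch of the kind of cache reuse being requested (a hypothetical structure for illustration, not an API NLsolve currently exposes): allocate the work arrays once, then reuse them across every solve in the loop.

```julia
# Preallocated work arrays for an n-dimensional Newton solve.
struct NewtonCache{T}
    fvec::Vector{T}   # residual buffer
    fjac::Matrix{T}   # Jacobian buffer
    step::Vector{T}   # Newton step buffer
end
NewtonCache(n::Int) = NewtonCache(zeros(n), zeros(n, n), zeros(n))

# A hypothetical solver entry point would then accept the cache:
#   nlsolve(df, x0, method = :newton, cache = cache)
cache = NewtonCache(2)
```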
I am solving a large number of similar functions of two variables, where the function is defined in terms of several constants that change between each call of nlsolve. Is there any suggested method for passing these parameters to nlsolve without using metaprogramming to constantly redefine the function, or defining them outside of the scope?
using NLsolve
function f!(x, fvec, c)
fvec[1] = (x[1]+c[1])*(x[2]^3-7)+c[2]
fvec[2] = sin(x[2]*exp(x[1])-c[3])
end
#It would be nice to do something like this?
nlsolve(f!, [ 0.1; 1.2], c = [1,2,18])
#This works, but looks really dodgy
function f2!(x, fvec)
fvec[1] = (x[1]+c[1])*(x[2]^3-7)+c[2]
fvec[2] = sin(x[2]*exp(x[1])-c[3])
end
c = [1,2,3]
nlsolve(f2!, [ 0.1; 1.2])
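The usual answer is a closure: capture the parameters in an anonymous function with the signature the solver expects, so no global c and no metaprogramming is needed. A sketch keeping the post's (x, fvec) argument order:

```julia
# Parameterized residual, as in the post.
function f!(x, fvec, c)
    fvec[1] = (x[1] + c[1]) * (x[2]^3 - 7) + c[2]
    fvec[2] = sin(x[2] * exp(x[1]) - c[3])
end

# Closure binding a particular parameter vector; ready to hand to nlsolve:
#   nlsolve(make_f([1, 2, 18]), [0.1; 1.2])
make_f(c) = (x, fvec) -> f!(x, fvec, c)
```

Each call of make_f builds a cheap closure, so the parameters can change on every solve in a loop without redefining anything.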
I do think that we need to adapt this here, just as we did in Optim. Is anyone strongly against?
Not all linesearches in Optim return the computed function value. Instead we unnecessarily have to recompute it, even though we just evaluated the function at the same point: https://github.com/EconForge/NLsolve.jl/blob/ff78c8e1323c0038db8a95a20ce8250bf3b9f86b/src/newton.jl#L130
We should probably look into getting the backtracking functions in Optim to return the function value; we can then skip a function evaluation for each Newton iteration.
https://github.com/JuliaNLSolvers/NLsolve.jl/blob/master/src/newton.jl#L116
The Jacobian is repeatedly used in \, but it is not factorized, making this a lot more expensive than it needs to be.
If you do:
nlsolve(magnet_equation_set!,x0).zero
or
mcpsolve(
magnet_equation_set!, [-Inf, -Inf], [+Inf, +Inf], cur_initial_values, xtol = 5e-8
).zero
you get [0.844365, 0.33534], the value I want. However, if you change the lower bounds [-Inf, -Inf] to [0, 0], you get a very wrong answer:
mcpsolve(
magnet_equation_set!, [0.0, 0.0], [+Inf, +Inf], cur_initial_values, xtol = 5e-8
).zero
yields: [-2.62852e-15,-3.51704e-9]
Why does setting [0, 0] as the bottom bounds give such crazy answers?
In particular, the current release version is unusable (fixed in f4f1785)
I was wondering if you think allowing the chunksize choice in the autodiff setup is a good idea. I have a working implementation that I am using since it's currently not allowed:
https://github.com/JuliaDiffEq/OrdinaryDiffEq.jl/blob/master/src/misc_utils.jl#L33
But the issue is, if I were to make a PR, I can't think of a good way to pass it in. You'd want to make it a kwarg like autodiff, but then it wouldn't be type stable, since you would need to pass it by value-type for type stability. I handle this with a "pre-stage" where I can use a @pure constructor, but I'm not sure that applies here.
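For context, this is how a fixed chunk size is specified in ForwardDiff itself: the chunk is part of the config's *type*, which is exactly why passing it as a plain keyword value would be type-unstable.

```julia
using ForwardDiff

f(x) = sum(abs2, x)
x = rand(8)

# Chunk size 4, encoded in the config's type via Chunk{4}().
cfg = ForwardDiff.GradientConfig(f, x, ForwardDiff.Chunk{4}())
g = ForwardDiff.gradient(f, x, cfg)
```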
The way the autodifferentiation is done isn't compatible with ForwardDiff v0.5.0. With the new tagging system, it seems that the function that is used in the configuration part may not be the same function that is later solved. I think there may be an ordering issue, i.e. currently it's build config --> function setup, but now it's required that function setup --> build config. But maybe it's better just to opt out of the function tagging and use a package-wide tag, so that it's semi-safe but doesn't require building the final function before the config.
minpack: Error During Test
Got an exception of type ForwardDiff.ConfigMismatchError{NLsolve.##28#31{#f!#8},Array{Float64,1},0xb046287d533b082d} outside of a @test
The provided configuration (of type ForwardDiff.JacobianConfig{ForwardDiff.Tag{Array{Float64,1},0xb046287d533b082d},Float64,2,Array{ForwardDiff.Dual{ForwardDiff.Tag{Array{Float64,1},0xb046287d533b082d},Float64,2},1}}) was constructed for a function other than the current target function. ForwardDiff cannot safely perform differentiation in this context; see the following issue for details: https://github.com/JuliaDiff/ForwardDiff.jl/issues/83. You can resolve this problem by constructing and using a configuration with the appropriate target function, e.g. `ForwardDiff.GradientConfig(NLsolve.#28, x)`
Stacktrace:
[1] (::NLsolve.##30#33{NLsolve.##28#31{#f!#8},Array{Float64,1},ForwardDiff.JacobianConfig{ForwardDiff.Tag{Array{Float64,1},0xb046287d533b082d},Float64,2,Array{ForwardDiff.Dual{ForwardDiff.Tag{Array{Float64,1},0xb046287d533b082d},Float64,2},1}}})(::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,2}) at C:\Users\Chris\.julia\v0.6\NLsolve\src\autodiff.jl:11
[2] trust_region_(::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Float64, ::Bool) at C:\Users\Chris\.julia\v0.6\NLsolve\src\trust_region.jl:87
[3] #nlsolve#17(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Function, ::Float64, ::Bool, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at C:\Users\Chris\.julia\v0.6\NLsolve\src\nlsolve_func_defs.jl:24
[4] (::NLsolve.#kw##nlsolve)(::Array{Any,1}, ::NLsolve.#nlsolve, ::NLsolve.DifferentiableMultivariateFunction, ::Array{Float64,1}) at .\<missing>:0
[5] #nlsolve#19(::Symbol, ::Float64, ::Float64, ::Int64, ::Bool, ::Bool, ::Bool, ::Function, ::Float64, ::Bool, ::Bool, ::NLsolve.#nlsolve, ::#f!#8, ::Array{Float64,1}) at C:\Users\Chris\.julia\v0.6\NLsolve\src\nlsolve_func_defs.jl:70
[6] (::NLsolve.#kw##nlsolve)(::Array{Any,1}, ::NLsolve.#nlsolve, ::Function, ::Array{Float64,1}) at .\<missing>:0
[7] macro expansion at .\util.jl:293 [inlined]
[8] macro expansion at C:\Users\Chris\.julia\v0.6\NLsolve\test\minpack.jl:538 [inlined]
[9] macro expansion at .\test.jl:856 [inlined]
[10] anonymous at .\<missing>:?
[11] include_from_node1(::String) at .\loading.jl:569
[12] include(::String) at .\sysimg.jl:14
[13] macro expansion at C:\Users\Chris\.julia\v0.6\NLsolve\test\runtests.jl:27 [inlined]
[14] anonymous at .\<missing>:?
[15] include_from_node1(::String) at .\loading.jl:569
[16] include(::String) at .\sysimg.jl:14
[17] process_options(::Base.JLOptions) at .\client.jl:305
[18] _start() at .\client.jl:371
Test Summary: | Pass Error Total
minpack | 1 1 2
ERROR: LoadError: LoadError: Some tests did not pass: 1 passed, 0 failed, 1 errored, 0 broken.
while loading C:\Users\Chris\.julia\v0.6\NLsolve\test\minpack.jl, in expression starting on line 16
while loading C:\Users\Chris\.julia\v0.6\NLsolve\test\runtests.jl, in expression starting on line 26
===============================[ ERROR: NLsolve ]===============================
failed process: Process(`'C:\Users\Chris\AppData\Local\Julia-0.6.0-rc2\bin\julia.exe' -Cx86-64 '-JC:\Users\Chris\AppData\Local\Julia-0.6.0-rc2\lib\julia\sys.dll' --compile=yes --depwarn=yes --check-bounds=yes --code-coverage=none --color=no --compilecache=yes 'C:\Users\Chris\.julia\v0.6\NLsolve\test\runtests.jl'`, ProcessExited(1)) [1]
================================================================================
It would be nice to change the order of the storage arrays and evaluation-point arrays in f! and g! calls, to match Julia's usual convention and Optim v0.9.
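Concretely, the proposed convention puts the output buffer first, like Base's in-place functions (the function body below is the 2-by-2 example from earlier in this document):

```julia
# Current convention:   f!(x, fvec)
# Proposed convention:  f!(fvec, x)  -- output first, point second.
function f!(fvec, x)
    fvec[1] = 1 - x[1]
    fvec[2] = 10 * (x[2] - x[1]^2)
end
```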
PackageEvaluator.jl is a script that runs nightly. It attempts to load all Julia packages and run their tests (if available) on both the stable version of Julia (0.2) and the nightly build of the unstable version (0.3). The results of this script are used to generate a package listing enhanced with testing results.
Tests pass. means that PackageEvaluator found the tests for your package, executed them, and they all passed.
Tests fail, but package loads. means that PackageEvaluator found the tests for your package, executed them, and they didn't pass. However, trying to load your package with using worked.
This issue was filed because your testing status became worse. No additional issues will be filed if your package remains in this state, and no issue will be filed if it improves. If you'd like to opt-out of these status-change messages, reply to this message saying you'd like to and @IainNZ will add an exception. If you'd like to discuss PackageEvaluator.jl please file an issue at the repository. For example, your package may be untestable on the test machine due to a dependency - an exception can be added.
I cannot use not_in_place and autodiff at the same time:
nlsolve(not_in_place(x-> x[1] + 2), [0.0], autodiff = true)
ERROR: unrecognized keyword argument "autodiff"