Comments (6)
As a further question, are there any line searches that attempt to find the same minimum that would be reached by taking infinitesimally small steps along the negative gradient? This is normally how a basin of attraction is defined. I imagine that a backtracking line search with a maximum step size smaller than the width of a typical basin might work okay.
I typically use optimization algorithms without a sophisticated line search: either the L-BFGS search direction and length directly, or conjugate gradients with a single Newton's-method step along the descent direction, both with a maximum step size.
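Concretely, the capped update I have in mind looks something like the sketch below; the helper and its `alpha_max` argument are just illustration, not anything from Optim:

```julia
# Sketch: move along a descent direction d, but never farther than
# alpha_max in Euclidean norm. Illustrative only, not Optim.jl API.
function capped_step(x, d, alpha, alpha_max)
    step = alpha .* d
    len = sqrt(sum(abs2, step))
    len > alpha_max && (step .*= alpha_max / len)
    return x .+ step
end

x_new = capped_step([1.0, 2.0], [-1.0, 0.0], 0.5, 0.1)  # step length capped at 0.1
```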
I'd like to see whether a fancy line search can improve performance for the problems I am interested in, but as it stands I can't get any of the available line searches in Optim to work. Each line search seems to have options that should be configurable, but I don't see any way to set them. The line searches appear to be undocumented.
I'd certainly like to include max step sizes as an option. This will take some time to do, though, as the line search code is fairly complex.
I'm not sure what "infinitesimally small steps along the gradient" means in floating point. You could use a constant-step gradient descent algorithm with a step size equal to the floating-point epsilon, but that seems like a bad idea for both convergence and numerical stability. I suspect I'm missing something.
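To make the objection concrete, here is a minimal sketch of what taking the limit literally would mean: forward-Euler integration of the gradient flow dx/dt = -∇f(x) with a fixed step h. Shrinking h toward eps() makes the iteration count explode long before it buys any extra fidelity (the test function and tolerances here are just an example):

```julia
# Forward-Euler integration of dx/dt = -grad f(x) on the Rosenbrock
# function. As h shrinks toward eps(), progress per iteration vanishes.
f(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
grad(x) = [-2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2),
           200 * (x[2] - x[1]^2)]

function gradient_flow(x0, h; maxiter = 10^6, gtol = 1e-8)
    x = copy(x0)
    for _ in 1:maxiter
        g = grad(x)
        sqrt(sum(abs2, g)) < gtol && break
        x .-= h .* g   # one Euler step along the negative gradient
    end
    return x
end

gradient_flow([-1.2, 1.0], 1e-3)  # slowly approaches [1, 1]
```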
The line search is definitely undocumented. Some of the options are also not available because we haven't done enough to generalize our line search framework to expose all of the different options in a uniform way.
If you'd like to experiment with alternative line searches, you can copy the code for the backtracking line search and modify it to your taste. As long as you maintain the existing interface, you can pass your new line search function as an argument to optimize.
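For reference, the textbook backtracking (Armijo) scheme is small enough to sketch in full. This is the bare algorithm with a maximum-step cap bolted on, not a copy of Optim's internal line search interface, so it would still need adapting to whatever signature optimize expects:

```julia
# Textbook backtracking (Armijo) line search with a hard cap on the
# initial step. Bare algorithm only; not Optim's internal interface.
function backtracking(f, x, d, g; alpha0 = 1.0, alpha_max = Inf,
                      c = 1e-4, rho = 0.5, maxiter = 50)
    alpha = min(alpha0, alpha_max)   # enforce the maximum step size
    fx = f(x)
    slope = sum(g .* d)              # directional derivative, < 0 for descent
    for _ in 1:maxiter
        # Accept once the sufficient-decrease (Armijo) condition holds.
        f(x .+ alpha .* d) <= fx + c * alpha * slope && return alpha
        alpha *= rho                 # shrink and try again
    end
    return alpha
end

f(x) = sum(abs2, x)
x = [2.0, -1.0]; g = 2 .* x; d = -g
backtracking(f, x, d, g; alpha_max = 0.25)  # returns 0.25, the cap
```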
linesearch_hz already has the ability to specify a maximum step size (it's needed for fminbox).
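For anyone reading this later: in current versions of the packages the line searches live in LineSearches.jl, and the Hager-Zhang maximum step is exposed as a keyword. Something along these lines should work, though keyword names may differ between versions:

```julia
using Optim, LineSearches

# Hager-Zhang line search with a maximum step size; the modern
# counterpart of the alphamax option that hz_linesearch used.
ls = HagerZhang(alphamax = 1.0)
f(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
result = optimize(f, [-1.2, 1.0], LBFGS(linesearch = ls))
```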
I think I see a way forward, at least for my own personal testing.
I'll clarify what I meant by "infinitesimally small steps along the negative gradient". My view is that the goal of a local optimization routine should be to find the same minimum that gradient descent would reach in the limit of the step size going to zero. This is of course only a theoretical definition, not a practical approach to optimization.
Here is a recent paper that maps out which minima are reached from different starting coordinates by different optimization algorithms in atomistic systems. This sort of study is interesting because some algorithms try to associate each point in space with a particular minimum; however, which minimum is reached from a given initial point depends on many different parameters.
Since hz_linesearch is the default line search algorithm, and it includes alphamax as an option, can this be closed?
As mentioned, the functionality is already here, so I am closing this.