
nodal.jl's Introduction


NODAL provides tools for implementing parallel and distributed program autotuners. This Julia package supplies building blocks and optimization algorithms for implementing different Stochastic Local Search methods, such as Simulated Annealing and Tabu Search. NODAL is an ongoing project and will implement further optimization and local search algorithms.

You can use NODAL to optimize user-defined functions with a few basic Stochastic Local Search methods, which are composed from building blocks also provided in the package. The package distributes function evaluations and technique executions among Julia workers, and multiple instances of search techniques can run on the same problem.

Installing

NODAL runs on Julia nightly. From the Julia REPL, run:

Pkg.add("NODAL")

If you want the latest version, which may be unstable, run instead:

Pkg.clone("NODAL")

Documentation

Please refer to the documentation for more information and examples.

Example: The Rosenbrock Function

The following is a very simple example, and you can find the source code for its latest version in the GitHub repository.

We will optimize the Rosenbrock cost function. For this we must define a Configuration that represents the arguments to be tuned. We also have to create and configure a tuning run. First, let's import NODAL and define the cost function:

addprocs()

import NODAL

@everywhere begin
    using NODAL
    function rosenbrock(x::Configuration, parameters::Dict{Symbol, Any})
        return (1.0 - x["i0"].value)^2 + 100.0 * (x["i1"].value - x["i0"].value^2)^2
    end
end

Note:

The Rosenbrock function is by no means a good autotuning objective, although it is a good tool to help you get familiar with the API. NODAL certainly performs worse than most tools for this kind of function. Look at the further examples on this page for more fitting applications.

We use the addprocs() function to add the default number of Julia workers, one per processing core, to our application. The import statement loads NODAL in the current Julia process, and the @everywhere macro loads the module and defines the rosenbrock function on all available Julia workers.

Cost functions must accept a Configuration and a Dict{Symbol, Any} as input. The Configuration is used to define the autotuner's search space, and the parameter dictionary can store data or function configurations.

Our cost function simply ignores the parameter dictionary, and uses the "i0" and "i1" parameters of the received configuration to calculate a value. There is no restriction on the names of Configuration parameters.
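As an illustration of the parameters dictionary, a cost function could read extra data from it. The following is a sketch, not part of the example above: scaled_rosenbrock and the :scale key are hypothetical names invented here, not part of the NODAL API.

```julia
# Hypothetical variant of the cost function above: the :scale key is an
# assumption made for this sketch, not part of the NODAL API.
@everywhere function scaled_rosenbrock(x::Configuration, parameters::Dict{Symbol, Any})
    # Read an optional coefficient from the dictionary, defaulting to the
    # classic Rosenbrock value of 100.0 when the key is absent.
    scale = get(parameters, :scale, 100.0)
    return (1.0 - x["i0"].value)^2 + scale * (x["i1"].value - x["i0"].value^2)^2
end
```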

Our configuration will have two FloatParameters, which will be Float64 values constrained to an interval. The intervals are [-2.0, 2.0] for both parameters, and their values start at 0.0. Since we already used the names "i0" and "i1", we name the parameters the same way:

configuration = Configuration([FloatParameter(-2.0, 2.0, 0.0, "i0"),
                               FloatParameter(-2.0, 2.0, 0.0, "i1")],
                               "rosenbrock_config")

Now we must configure a new tuning run using the Run type. There are many parameters to configure, but they all have default values. Since we won't be using them all, please see Run's source for further details:

tuning_run = Run(cost                = rosenbrock,
                 starting_point      = configuration,
                 stopping_criterion  = elapsed_time_criterion,
                 report_after        = 10,
                 reporting_criterion = elapsed_time_reporting_criterion,
                 duration            = 60,
                 methods             = [[:simulated_annealing 1];
                                        [:iterative_first_improvement 1];
                                        [:iterated_local_search 1];
                                        [:randomized_first_improvement 1];
                                        [:iterative_greedy_construction 1];])

The methods array defines the search methods, and their respective number of instances, that will be used in this tuning run. This example uses one instance of every implemented search technique. The search will start at the point defined by starting_point.

The stopping_criterion parameter is a function. It tells your autotuner when to stop iterating. The two default criteria implemented are elapsed_time_criterion and iterations_criterion. The reporting_criterion parameter is also a function, but it tells your autotuner when to report the current results. The two default implementations are elapsed_time_reporting_criterion and iterations_reporting_criterion. Take a look at the code if you want to dive deeper.
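For instance, a Run that stops on an iteration count rather than elapsed time might look as follows. This sketch only reuses fields shown above; how duration is interpreted under iterations_criterion is an assumption here, so check the Run source before relying on it:

```julia
# Sketch: stop after an iteration budget instead of a time budget.  The exact
# meaning of duration under iterations_criterion is an assumption.
tuning_run = Run(cost                = rosenbrock,
                 starting_point      = configuration,
                 stopping_criterion  = iterations_criterion,
                 reporting_criterion = iterations_reporting_criterion,
                 report_after        = 1000,
                 duration            = 10000,
                 methods             = [[:simulated_annealing 1];])
```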

We are ready to start autotuning, using the @spawn macro. For more information on how parallel and distributed computing works in Julia, please check the Julia Docs. This macro call will run the optimize method, which receives a tuning run configuration and runs the search techniques in the background. The autotuner will write its results to a RemoteChannel stored in the tuning run configuration:

@spawn optimize(tuning_run)
result = take!(tuning_run.channel)

The tuning run will use the default neighboring and perturbation methods implemented by NODAL to find new results. Now we can process the current result. In this example we just print it and loop until optimize is done:

print(result)
while !result.is_final
    result = take!(tuning_run.channel)
    print(result)
end

Running the complete example, we get:

$ julia --color=yes rosenbrock.jl
[Result]
Cost              : 1.0
Found in Iteration: 1
Current Iteration : 1
Technique         : Initialize
Function Calls    : 1
  ***
[Result]
Cost              : 1.0
Found in Iteration: 1
Current Iteration : 3973
Technique         : Initialize
Function Calls    : 1
  ***
[Result]
Current Iteration : 52289
Technique         : Iterative First Improvement
Function Calls    : 455
  ***
[Result]
Cost              : 0.01301071782455056
Found in Iteration: 10
Current Iteration : 70282
Technique         : Randomized First Improvement
Function Calls    : 3940
  ***
[Result]
Cost              : 0.009463518035824526
Found in Iteration: 11
Current Iteration : 87723
Technique         : Randomized First Improvement
Function Calls    : 4594
  ***
[Final Result]
Cost                  : 0.009463518035824526
Found in Iteration    : 11
Current Iteration     : 104261
Technique             : Randomized First Improvement
Function Calls        : 4594
Starting Configuration:
  [Configuration]
  name      : rosenbrock_config
  parameters:
    [NumberParameter]
    name : i0
    min  : -2.000000
    max  : 2.000000
    value: 1.100740
    ***
    [NumberParameter]
    name : i1
    min  : -2.000000
    max  : 2.000000
    value: 1.216979
Minimum Configuration :
  [Configuration]
  name      : rosenbrock_config
  parameters:
    [NumberParameter]
    name : i0
    min  : -2.000000
    max  : 2.000000
    value: 0.954995
    ***
    [NumberParameter]
    name : i1
    min  : -2.000000
    max  : 2.000000
    value: 0.920639

nodal.jl's People

Contributors

phrb, tkelman


nodal.jl's Issues

Search is not an exported method

Hi @phrb. So I've come stuck at the next hurdle. In the README it states

Now we are ready to use the search method, which could be called with just a Configuration and a function that accepts a Configuration as input.

The problem is that search doesn't seem to be an exported method so I end up with the following error:

ERROR: LoadError: MethodError: `search` has no method matching search(::Function, ::StochasticSearch.Configuration{StochasticSearch.NumberParameter{AbstractFloat}})
 in include at boot.jl:261
 in include_from_node1 at loading.jl:304
while loading C:\Users\jonathan\Documents\UniOfOxford\HelpingOthers\DavidBrown\FRAP_filter\filter_FRAP.jl, in expression starting on line 350

What method should I use if I want to run simulated annealing?
I assume something in the README or the code needs to be changed.

Thanks again in advance for helping me out

0.3.0 Roadmap

SLS techniques (moved to v0.3.x #21):

  • Best Improvement
  • Iterative Best Improvement
  • Randomized Best Improvement
  • Iterated Local Search
  • Dynamic Local Search
  • Tabu Search
  • Ant Colony Optimization

Features:

  • PermutationParameter
  • neighbor! for PermutationParameter
  • perturb! for PermutationParameter
  • sequential_measure_mean!: Obtain results sequentially, from a single worker.
  • Techniques run in different workers
  • Support v0.5.

Supporting v0.5 and abandoning v0.4:

  • Rename RemoteRefs ~> RemoteChannels
  • Fix remotecall signature
  • Rewrite constructors

Info about upcoming removal of packages in the General registry

As described in https://discourse.julialang.org/t/ann-plans-for-removing-packages-that-do-not-yet-support-1-0-from-the-general-registry/ we are planning on removing packages that do not support 1.0 from the General registry. This package has been detected to not support 1.0 and is thus slated to be removed. The removal of packages from the registry will happen approximately a month after this issue is open.

To transition to the new Pkg system using Project.toml, see https://github.com/JuliaRegistries/Registrator.jl#transitioning-from-require-to-projecttoml.
To then tag a new version of the package, see https://github.com/JuliaRegistries/Registrator.jl#via-the-github-app.

If you believe this package has erroneously been detected as not supporting 1.0 or have any other questions, don't hesitate to discuss it here or in the thread linked at the top of this post.

Segfault with multiple processes on v0.4.

Obtaining results using multiple processes causes a segfault in Julia v0.4. In v0.3 (or v0.4 with a single process) there is no segfault. Also, check a similar issue: JuliaLang/julia/issues/12558

Output:

% julia0.4 -p 2 examples/sorting.jl
WARNING: replacing module StochasticSearch
WARNING: replacing module StochasticSearch

signal (11): Segmentation fault
unknown function (ip: 0x7f1fb9245e68)
jl_module_export at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
unknown function (ip: 0x7f1fb921ba46)
unknown function (ip: 0x7f1fb921bfc5)
unknown function (ip: 0x7f1fb921bc2d)
unknown function (ip: 0x7f1fb921c4ac)
jl_load_file_string at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
include_string at loading.jl:226
jl_apply_generic at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
include_from_node1 at ./loading.jl:267
jl_apply_generic at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
unknown function (ip: 0x7f1fb9207e13)
unknown function (ip: 0x7f1fb9207199)
unknown function (ip: 0x7f1fb921b89c)
jl_toplevel_eval_in at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
require at ./loading.jl:201
unknown function (ip: 0x7f1fb65e2dbc)
jl_apply_generic at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
unknown function (ip: 0x7f1fb921aa9d)
unknown function (ip: 0x7f1fb921bb4b)
jl_toplevel_eval_in at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
eval at ./sysimg.jl:14
jl_apply_generic at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
anonymous at multi.jl:1349
jl_f_apply at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
anonymous at multi.jl:889
run_work_thunk at multi.jl:642
jlcall_run_work_thunk_21278 at  (unknown line)
jl_apply_generic at /home/phrb/.bin/julia-f2bd731327/bin/../lib/julia/libjulia.so (unknown line)
anonymous at task.jl:889
unknown function (ip: 0x7f1fb920d770)
unknown function (ip: (nil))
Worker 2 terminated.
ERROR (unhandled task failure): EOFError: read end of file
 in yieldto at ./task.jl:75
 in wait at ./task.jl:371
 in wait at ./task.jl:286
 in wait at ./task.jl:112
 in sync_end at ./task.jl:400
 in anonymous at multi.jl:422
 in include at ./boot.jl:254
 in include_from_node1 at ./loading.jl:264
 in process_options at ./client.jl:308
 in _start at ./client.jl:411
ERROR: LoadError: ProcessExitedException()
 in yieldto at ./task.jl:75
 in wait at ./task.jl:371
 in wait at ./task.jl:286
 in wait at ./channels.jl:93
 in take! at ./channels.jl:82
 in take! at ./multi.jl:789
 in remotecall_fetch at multi.jl:726
 in anonymous at task.jl:447
 in sync_end at ./task.jl:413
 in anonymous at multi.jl:422
 in include at ./boot.jl:254
 in include_from_node1 at ./loading.jl:264
 in process_options at ./client.jl:308
 in _start at ./client.jl:411
while loading /home/phrb/.julia/v0.3/StochasticSearch/examples/sorting.jl, in expression starting on line 1

Several thousand arguments of the function as a vector

What would the configuration input look like if you have around 6000 input variables? Is there a simpler way to do it, without the Dict option?

configuration = Configuration([FloatParameter(-2.0, 2.0, 0.0, "i0"),
                               FloatParameter(-2.0, 2.0, 0.0, "i1")],
                               "rosenbrock_config")
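With many parameters, the vector passed to Configuration could be built programmatically instead of written out by hand. A sketch, assuming the constructors behave as in the two-parameter example above; large_config is a name invented here:

```julia
# Build 6000 FloatParameters named "i0" ... "i5999" with a comprehension,
# instead of listing them one by one.
n = 6000
params = [FloatParameter(-2.0, 2.0, 0.0, "i$(i - 1)") for i in 1:n]
configuration = Configuration(params, "large_config")
```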

0.1.0 Roadmap

These are the intended features for v0.1.0:

  • Provide building blocks for the implementation of common Stochastic Local Search algorithms.

    • First Improvement
    • Iterative First Improvement
    • Best Improvement (pushed to v0.1.x)
    • Iterative Best Improvement (pushed to v0.1.x)
    • Randomized Best Improvement (pushed to v0.1.x)
    • Random Walk
    • Randomized First Improvement
    • Probabilistic Iterative Improvement
    • Greedy Construction
    • Iterative Greedy Construction
    • Simulated Annealing (rewrite with building blocks)

    Check: Stochastic Local Search Algorithms: An Overview

  • Independent measurement module.

    The measurement module should provide values of the user-defined function with a given configuration. Computational resources should be distributed between measurements. (see #4)

    • measure_mean!: Uses remotecall_fetch to get results from multiple workers.
    • sequential_measure_mean!: Sequentially obtain results from a single worker. (pushed to v0.1.x)
  • Independent search module.

    Stochastic Local Search techniques should execute independently, and request results from the measurement module.

  • Tests pass in Julia stable (v0.3.x).

  • Tests pass in Julia latest (v0.4).

Parameters should store a normalized value

Modify Parameters to store a value normalized to be within its range. Values should only be converted back when the in-range value is read.

  • EnumParameters
  • PermutationParameters
  • NumberParameters
  • BoolParameters

Cannot install NODAL

Pkg.add() does not work (and Pkg.update() as suggested does not help):

ERROR: NODAL can't be installed because it has no versions that support 0.6.4 of julia. You may need to update METADATA by running
 `Pkg.update()`

Pkg.clone() gives the following warning

WARNING: julia is fixed at 0.6.4 conflicting with requirement for NODAL: [0.7.0-,∞)

Is it possible to fix this, or is NODAL.jl only intended for v0.7 and above?

Implement more SLS techniques

Implement the following SLS techniques:

  • Best Improvement
  • Iterative Best Improvement
  • Randomized Best Improvement
  • Dynamic Local Search
  • Tabu Search
  • Ant Colony Optimization

NODAL is not supported on v0.6.2

I am using Julia 0.6.2 and when trying to add NODAL I get the following error.

ERROR: NODAL can't be installed because it has no versions that support 0.6.2 of julia. You may need
to update METADATA by running Pkg.update()

Fix tests that depend on RNG.

Tests for the neighbor! methods currently break at random. This probably happens because the comparisons are exact.

Rewrite 'neighbor!' methods

Refactor the neighbor! methods to receive keyword arguments properly, and possibly get rid of default values.

Overhaul Parameter Type System

Modify the Parameter and Configuration types to support "conditional parameters", that is, parameters that are added or removed depending on other parameters' values.

This will be useful for Genetic Algorithm configuration, for example, where multiple strategies can be used, each with its own parameters.

Search techniques should receive multiple arguments.

Search techniques instantiated by 'optimize' currently receive an incomplete list of arguments. Default values are used for their parameters. Techniques should receive all their parameters, and only fall back to defaults when parameters are not provided.

Extract Duplicated Code in Search Techniques

There is duplicated code in all search techniques. Extract it by creating a function representing a general technique and make all techniques use it. This function should perform all boilerplate for techniques.

  • Create general technique function
  • Modify all techniques to use new general function

Execution hangs on multiple workers.

Execution hangs when scripts are launched with julia -p, or when workers are added in the script using addprocs().

Possible causes include module imports, serialization/deserialization and undetected errors in workers.

Implement ClusterManagers

Implement ClusterManagers that:

  • Have unidirectional communication and connection between central and spawned processes
  • Use non-blocking channels
  • Are able to run Julia workers in public clouds

Rosenbrock example not working

I was trying to run the Rosenbrock example from the example folder, and I got the following error:

type Run has no field channel

It comes from tuning_run that has no field channel. How can we fix that?

Using Julia v0.5.0 in Jupyter.

Can't construct NumberParameter

I'm trying to use your package and I'm failing at the first hurdle.

I'm trying to create a Number parameter object but it's not working:

julia> NumberParameter(0.0, 2.0, 1.0, "test")
ERROR: MethodError: `convert` has no method matching convert(::Type{StochasticSearch.NumberParameter{T<:Number}}, ::Float64, ::Float64, ::Float64, ::ASCIIString)
This may have arisen from a call to the constructor StochasticSearch.NumberParameter{T<:Number (...),
since type constructors fall back to convert methods.
Closest candidates are:
  call{T}(::Type{T}, ::Any)
  convert{T}(::Type{T}, ::T)
 in call at essentials.jl:57

Can you help me out here? I assume I'm missing something really small. I'm using Julia v0.4.3.

0.4 Roadmap

Maintenance:

  • Extract Duplicated Code (#27,)
  • Move to Julia nightly until 1.0
  • Fix #28
  • Use Base.Test in all tests (#23)

API Features:

  • Cluster managers (#15)
  • A 'tuning run manager' to coordinate multiple runs
  • Results logging and graphing utilities
  • 'Cleanup' and 'startup' callback functions

SLS Features:

  • More SLS techniques (#22)
  • Option to split search space between technique instances
  • Parameters should store a normalized value (#24)
  • Perturbation functions should only modify normalized values (#25)
  • Support for "conditional parameters" (#26)

Fix segfault in 'src/core/search/optimize.jl'

In Julia v0.3.11, when 'simulated_annealing' is called from within 'initialize_search_tasks!' and not in 'optimize', there is a segfault that is not fixed by importing 'simulated_annealing'. Closing this issue will require understanding why this happens and finding a definitive solution.

Workaround:
The problem seems to be related to namespaces, but importing 'simulated_annealing.jl' does not solve it.
Using a dummy line that calls 'simulated_annealing' solves the problem, even if the line is never called.

The segfault:

% julia examples/rosenbrock.jl

signal (11): Segmentation fault
unknown function (ip: 1802521875)
unknown function (ip: 1802527421)
unknown function (ip: 1805747695)
unknown function (ip: 1805747934)
unknown function (ip: 1805748227)
unknown function (ip: 1796914869)
unknown function (ip: 1796915214)
jl_trampoline at /home/phrb/.bin/julia-483dbf5279/bin/../lib/julia/libjulia.so (unknown line)
jl_apply_generic at /home/phrb/.bin/julia-483dbf5279/bin/../lib/julia/libjulia.so (unknown line)
julia_optimize_20275 at  (unknown line)
jlcall_optimize_20275 at  (unknown line)
anonymous at task.jl:6
jl_handle_stack_switch at /home/phrb/.bin/julia-483dbf5279/bin/../lib/julia/libjulia.so (unknown line)
julia_trampoline at /home/phrb/.bin/julia-483dbf5279/bin/../lib/julia/libjulia.so (unknown line)
unknown function (ip: 4199805)
__libc_start_main at /lib/x86_64-linux-gnu/libc.so.6 (unknown line)
unknown function (ip: 4199861)
zsh: segmentation fault  julia examples/rosenbrock.jl
