
Coluna.jl

Project Status: Active – The project has reached a stable, usable state and is being actively developed. License: MPL 2.0

Coluna is a branch-and-price-and-cut framework written in Julia. You write an original MIP that models your problem using the JuMP modeling language, and our extension BlockDecomposition provides the syntax to specify the problem decomposition. Coluna then reformulates the original MIP and optimizes the reformulation using the algorithms you choose. Coluna aims to be modular and tweakable so that you can define the behavior of your own customized branch-and-price-and-cut algorithm.

Installation

Coluna is a Julia Language package.

You can install Coluna through the Julia package manager. Open Julia's interactive session (REPL) and type:

   ] add Coluna

The documentation provides examples of running advanced branch-cut-and-price. Improvements to the documentation are planned. Browse the stable documentation if you work with the latest release, or the dev documentation if you work with the master version of Coluna.
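In short, the workflow looks like the following sketch, adapted from the generalized-assignment example in the documentation (the data, the axis name, and the choice of GLPK as the underlying solver are illustrative):

```julia
using JuMP, BlockDecomposition, Coluna, GLPK

M = 1:3  # machines
J = 1:5  # jobs
c = [1 2 3 4 5; 2 3 4 5 6; 3 4 5 6 7]  # assignment costs (illustrative)
w = [2 3 2 3 2; 3 2 3 2 3; 2 2 3 3 2]  # job weights (illustrative)
Q = [5, 6, 7]                          # machine capacities (illustrative)

coluna = optimizer_with_attributes(
    Coluna.Optimizer,
    "params" => Coluna.Params(solver = Coluna.Algorithm.TreeSearchAlgorithm()),
    "default_optimizer" => GLPK.Optimizer  # solver for the master and the subproblems
)

@axis(M_axis, M)  # decomposition axis: one pricing subproblem per machine
model = BlockModel(coluna)
@variable(model, x[m in M_axis, j in J], Bin)
@constraint(model, cov[j in J], sum(x[m, j] for m in M_axis) >= 1)       # master
@constraint(model, knp[m in M_axis], sum(w[m, j] * x[m, j] for j in J) <= Q[m])  # subproblems
@objective(model, Min, sum(c[m, j] * x[m, j] for m in M_axis, j in J))
@dantzig_wolfe_decomposition(model, decomposition, M_axis)
optimize!(model)
```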

Features

We aim to integrate into Coluna the state-of-the-art techniques used for branch-and-cut-and-price algorithms.

  • Stable Features which are well-tested (but performance may still be improved).
    • Dantzig-Wolfe decomposition
    • Branch-and-bound algorithm (with branching in master)
    • Column generation (MILP pricing solver/pricing callback)
  • Beta Features that work well but need more tests/usage and performance review before being stable:
    • Strong branching (with branching in master)
    • Stabilization for column generation
    • Cut generation (robust and non-robust)
    • Benders decomposition
    • Preprocessing (presolve) of formulations and reformulations
  • Alpha Features that should work. Structural work is done but these features may have bugs:
    • Benders cut generation
  • Dev Features in development.
    • Clean-up of the master formulation (removal of unpromising columns and cuts)
    • Saving/restoring LP basis when changing a node in branch-and-bound

Contributing

If you encounter a bug or something unexpected happens while using Coluna, please open an issue via the GitHub issues tracker.

See the list of contributors who make Coluna possible.

Premium support

Using Coluna for your business? Contact us to get tailored and qualified support.

Acknowledgments

The platform development has received important support from the international scientific society Mathematical Optimization Society (MOS) and from Région Nouvelle-Aquitaine.

Atoptima

University of Bordeaux

Inria

Related packages

Contributors

artalvpes, cristianabentes, dianabarros, eduardovegas, fd-v, github-actions[bot], gitter-badger, guilhemdupuis, guimarqu, hhkramer, issamt, itamarrocha, jolan99, juliatagbot, laradicp, loti45, mbrachner, najaverzat, rrsadykov, tbulhoes, vitornesello, yp-ye


Issues

Add art vars to branching constraints

Needed to make sure that column generation is not interrupted after adding a branching constraint, unless the problem is infeasible (detected in pure phase 1).

UndefVarError: with_optimizer not defined

The following error occurs when the GAP example is executed:

ERROR: LoadError: UndefVarError: with_optimizer not defined
Stacktrace:
 [1] model_sgap(::DataGap) at /home/eduardo/.julia/clones/13756470133505061496_full/examples/GeneralizedAssignment_SimpleColGen/model_sgap.jl:3
 [2] top-level scope at none:0
 [3] include at ./boot.jl:317 [inlined]
 [4] include_relative(::Module, ::String) at ./loading.jl:1038
 [5] include(::Module, ::String) at ./sysimg.jl:29
 [6] include(::String) at ./client.jl:388
 [7] top-level scope at none:0
in expression starting at /home/eduardo/.julia/clones/13756470133505061496_full/examples/GeneralizedAssignment_SimpleColGen/run_sgap.jl:11

I am using Julia 1.0
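For context, `with_optimizer` only exists in a narrow window of JuMP versions, so the error usually means the installed JuMP does not match the example. A hedged summary of the API evolution (GLPK is just an example solver here):

```julia
using JuMP, GLPK

# JuMP 0.18 and earlier:  m = Model(solver = ...)            # no with_optimizer
# JuMP 0.19 - 0.20:       m = Model(with_optimizer(GLPK.Optimizer))
# JuMP 0.21 and later, pass the optimizer constructor directly:
m = Model(GLPK.Optimizer)
```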

Bounds of convexity constraints in master

Hello to all.
While implementing the Capacitated Lot-sizing Problem with Coluna, I noticed that the convexity constraints impose a lower bound of 0 and an upper bound of 1. Since for this problem I should have lb = ub = 1 in these constraints, the application does not work correctly, yielding a solution value equal to zero.
I changed the rhs of the lower-bound convexity constraint from 0 to 1 in decomposition.jl and it worked.
Is having lb = ub = 1 in the convexity constraints generic for column generation, or should Coluna provide a way to set the bounds of the convexity constraint?
Application is available in: https://github.com/hhkramer/ColunaCodes/tree/master/clsp
Best regards,
Hugo
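For reference, in recent versions of BlockDecomposition the bounds of the convexity constraint are the subproblem multiplicities, which can be set per block instead of patching decomposition.jl. A hypothetical sketch (model and axis names are illustrative):

```julia
using JuMP, BlockDecomposition

K = 1:2
@axis(Blocks, K)                       # decomposition axis
model = BlockModel()
@variable(model, x[k in Blocks] >= 0)
@objective(model, Min, sum(x[k] for k in Blocks))
@dantzig_wolfe_decomposition(model, dec, Blocks)

# Request lb = ub = 1 on each block's convexity constraint,
# i.e. exactly one column per subproblem in the solution.
for sp in getsubproblems(dec)
    specify!(sp, lower_multiplicity = 1, upper_multiplicity = 1)
end
```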

Variable added to master constraints with wrong coefficients

In my Capacitated Lot Sizing application (https://github.com/hhkramer/ColunaCodes/tree/master/CapacitatedLotSizing), the coefficients of a new MC variable in the master capacity constraints are not being calculated considering both the x and y variables from the corresponding subproblem. Thus, generated columns are added with wrong coefficients in these constraints.

To fix this I changed the setprimalspsol! function in formulation.jl replacing the following line:

coef_matrix[constr_id, ps_id] = var_val * var_coef

with

coef_matrix[constr_id, ps_id] += var_val * var_coef

After that, the application worked and the root node bound returned by coluna is correct (branch-and-price didn't work...)

Is this fix correct?
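The `+=` makes sense because the coefficient of a new column in a master constraint aggregates the contributions of all subproblem variables appearing in that constraint; with `=`, each variable overwrites the previous one's contribution. An illustrative sketch (all names hypothetical):

```julia
# Coefficient of a column in one master constraint: sum over the subproblem
# variables of (value in the column) * (coefficient in the constraint).
function column_coefficient(column, constr_coeffs)
    coef = 0.0
    for (var, val) in column
        coef += get(constr_coeffs, var, 0.0) * val   # accumulate, don't overwrite
    end
    return coef
end

column = Dict(:x => 2.0, :y => 1.0)   # sp variable => value in the column
constr = Dict(:x => 3.0, :y => 5.0)   # sp variable => coeff in the master constraint
column_coefficient(column, constr)    # 2*3 + 1*5 = 11.0
```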

no setrhs! in formulation

There is no setrhs function in formulation.jl. I think the idea is to communicate all changes in the formulation through the functions defined in this file, right? There is a setrhs function in constraint.jl, but I am not sure whether it is the correct option for changing the current rhs. I need to change it when preprocessing with a partial solution.

OriginalRepresentatives

Should we include DwSpPricingVar in OriginalRepresentatives?

const OriginalRepresentatives = Union{
    Type{MasterPureVar},
    Type{MasterRepPricingVar}
}

Constraint types tree

The current type tree for constraints does not seem very natural.

Currently we have:

struct Constraint <: VarConstr
struct MasterConstr <: Constraint
struct BranchConstr <: Constraint
struct SubProbBranchConstr <: BranchConstr

Should we change to:

struct Constraint <: VarConstr
struct MasterConstr <: Constraint
struct MasterBranchConstr <: MasterConstr
struct SubProbBranchConstr <: Constraint

behaviour of setlb! and setub!

Are the functions setlb! and setub! removing the master columns that do not satisfy the new bounds on subproblem variables? Preprocessing is calling these functions.
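Whatever the answer, the missing piece would look something like the following hypothetical sketch: after tightening a bound on a subproblem variable, any master column whose value for that variable violates the new bound must be deactivated.

```julia
# col id => subproblem solution stored in the column (illustrative data)
columns = Dict(1 => Dict(:x => 0.0), 2 => Dict(:x => 2.0))
new_lb, new_ub = 1.0, 3.0

active = Set(keys(columns))
for (col_id, col) in columns
    v = get(col, :x, 0.0)
    if v < new_lb || v > new_ub
        delete!(active, col_id)   # column no longer representable under the new bounds
    end
end
# active == Set([2]): column 1 (x = 0.0) violates the new lower bound
```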

Improve tests

  • Write unit tests for missing files
  • Fix issue where codecov misses lines that are actually not missed by the tests
  • Add tests with different objective sense + constraint sense
  • silence Logger output

normalizing problem functions naming

There are some polymorphic functions defined for different subtypes (see the list below). The naming needs to be more consistent. In particular, a function with the same name should have the same "role", even when defined for different subtypes.

add_variable
add_constraint
add_full_variable
add_full_constraint
add_variable_in_optimizer
add_constraint_in_optimizer
add_full_variable_in_optimizer
add_full_constraint_in_optimizer
add_membership

Here are the corresponding lines in the code
https://github.com/atoptima/Coluna.jl/blob/master/src/problem.jl#L331-L472

UndefVarError: with_optimizer not defined

Stacktrace:
[1] top-level scope at none:0
[2] include at .\boot.jl:317 [inlined]
[3] include_relative(::Module, ::String) at .\loading.jl:1044
[4] include(::Module, ::String) at .\sysimg.jl:29
[5] exec_options(::Base.JLOptions) at .\client.jl:231
[6] _start() at .\client.jl:425

For the Knapsack problem copied from GitHub JuMP examples folder

François's review of varconstr

In Variable <: VarConstr, do you need

cur_lb::Float
cur_ub::Float

———————

 mutable struct Constraint <: VarConstr
    # ```
    # Index in MOI optimizer
    # ```
    moi_index::MOI.ConstraintIndex{F,S} where {F,S}

Shouldn't it use a JuMP ConstraintRef instead? Then it could be a vector.

———————

function switch_primary_secondary_moi_indices(vc::VarConstr)
    temp_idx = vc.moi_index
    vc.moi_index = vc.secondary_moi_index
    vc.secondary_moi_index = temp_idx
end

function switch_primary_secondary_moi_indices(var::Variable)
    @callsuper switch_primary_secondary_moi_indices(var::VarConstr)
    # This also changes the bounds index
    temp_idx = var.moi_bounds_index
    var.moi_bounds_index = var.secondary_moi_bounds_index
    var.secondary_moi_bounds_index = temp_idx
end

Should this be replaced by setting the current index in a vector?

———————

mutable struct VarConstrCounter
    value::Int
end

function increment_counter(counter::VarConstrCounter)
    counter.value += 1
    return counter.value
end

# mutable struct VarConstrStabInfo
# end

@hl mutable struct VarConstr
    vc_ref::Int

Is this still needed? Isn't it the key in coef_map?

———————

@hl mutable struct Constraint <: VarConstr
    set_type::Type{<:MOI.AbstractSet}
end

Record both the function and the set type, or neither of the two.

———————

@hl mutable struct VarConstr
    # Ref of Problem which this VarConstr is part of
    prob_ref::Int

should be a vector where [0] is the one that has ownership

———————

@hl mutable struct SubprobVar <: Variable
    # ```
    # To represent global lower bound on sp variable primal value
    # Aggregated bound in master
    # ```
    global_lb::Float

    # ```
    # To represent global upper bound on sp variable primal value
    # Aggregated bound in master
    # ```
    global_ub::Float

    # ```
    # Current global bound values (aggregated in master)
    # Used in preprocessing
    # ```
    cur_global_lb::Float
    cur_global_ub::Float

Global and local should simply be bound constraints in different formulations, i.e. a global bound is a master constraint.

Subproblem membership

The var and constr memberships defined in Coluna can only store linear memberships. Subproblems can be nonlinear; only their contribution to the master is required to be linear. I believe only the latter needs to be known by Coluna, while the subproblem's own membership can be passed directly to the MOI solver without being stored in Coluna. Am I wrong?

Update/clean the code regarding the roles of algorithms/strategies

  • methods run!(::AbstractAlgorithm) should only modify the formulations and should return an object of type AbstractAlgorithmRecord
  • methods apply!(::AbstractStrategy) are allowed to modify the node according to the record returned by the algorithms.
  • Should methods prepare!(::AbstractAlgorithm) create the object AbstractAlgorithmData and return it to the function apply!(::AbstractAlgorithm)? apply!(::AbstractAlgorithm) would then call the methods run!(::AbstractAlgorithm) passing only the algorithm type and its data.
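The proposed contract could be sketched as follows (all type and function names are illustrative, not the actual Coluna API):

```julia
# Hypothetical sketch of the algorithm/strategy separation described above.
abstract type AbstractAlgorithm end
abstract type AbstractAlgorithmRecord end

struct ColGenRecord <: AbstractAlgorithmRecord
    bound::Float64
end

# run! only touches the formulations and reports what happened.
run!(alg::AbstractAlgorithm, formulation) = error("not implemented")

# apply! owns the node: it calls run! and updates the node from the record.
function apply!(alg::AbstractAlgorithm, node::Dict, formulation)
    record = run!(alg, formulation)
    node[:bound] = record.bound    # only apply! may modify the node
    return record
end
```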

Wrong termination status

In the preprocessing tests, I am solving only the root node of a GAP instance and getting the termination status. If the problem is not solved in the root, termination status is INFEASIBLE even when a primal solution has been found.

Handle global infeasibility

After finding the best solution, should we run a checker against the original formulation to determine whether the solution is feasible?

Move code from eval to treat

The evaluation function contains a lot of things: setup, preprocessing, column generation, and setdown. I am aware that a node may need to be evaluated twice because of strong branching, but I believe the current structure is not good. For the diving root node, for example, the evaluation does not need to be executed, but then all the aforementioned things are skipped. I have modified the treat function a bit in the branch "diving" as a first attempt to solve this problem.

Make Coluna work with CPLEX

Not working yet, probably due to a bug in the CPLEX wrapper. When a formulation is changed from LP -> IP -> LP, it still thinks the formulation is an IP.

Styleguide

  • Should we allow non-characters?
  • Should we always put the full function signature?

Write a nice readme

We need to show some basic information:

  • What it is all about
  • How it works in VERY short
  • Status of development
  • What to expect in the future
  • Authors/who to contact
  • How to get engaged / how to contribute

bound changes are wrongly propagated

In a small GAP instance, I observed the following behaviour. In the root, no bound is changed by preprocessing. The first child solved contains the branching constraint x[1,7] <= 0. Then, preprocessing sets the ub of x[1,7] to 0. However, this bound change is not reverted before solving the second child (the one associated with x[1,7] >= 1), and thus preprocessing detects infeasibility.
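One standard fix is to record every bound change made at a node so it can be undone before a sibling is solved. A hypothetical sketch (data structures are illustrative, not Coluna's):

```julia
ubs = Dict((1, 7) => 1.0)                   # variable => current upper bound
changes = Tuple{Tuple{Int,Int},Float64}[]   # (variable, previous upper bound)

function tighten_ub!(ubs, changes, var, new_ub)
    push!(changes, (var, ubs[var]))         # remember the old bound first
    ubs[var] = new_ub
end

function revert!(ubs, changes)
    while !isempty(changes)
        var, old = pop!(changes)
        ubs[var] = old
    end
end

tighten_ub!(ubs, changes, (1, 7), 0.0)      # preprocessing in the first child
revert!(ubs, changes)                       # must run before the second child
# ubs[(1, 7)] == 1.0 again
```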

Update docs

Docs will soon be outdated because of the new modelling. Remove pointers to misleading information.

Preprocessing

Update preprocessing code so that it works for the new Coluna.

user-friendly modeling of multiple subproblems

I tried to write down François's suggestion.

Instead of this: https://github.com/atoptima/Coluna.jl/blob/master/examples/CuttingStock_SubprobMultiplicity/model_csp.jl

we could have this:

function model_scsp(d::DataCsp)
    params = Coluna.Params(use_restricted_master_heur = true)

    csp = Model(with_optimizer(Coluna.ColunaModelOptimizer, params = params),
              bridge_constraints = false)

    max_nb_patterns = sum(d.orders[o].width for o in 1:d.nborders) /
                       d.stocksheetswidth

    K = identical_block_indices(csp, max = max_nb_patterns)
    # K is the implicit description of 1:max_nb_patterns
    # value(K) is the explicit description of 1:nb_patterns after the model is solved

    @variable(csp, 0 <= x[k in K, o in 1:d.nborders] <= d.orders[o].demand, Int)

    @variable(csp, 0 <= y[k in K] <= 1, Bin)

    @variable(csp, 0 <= z[k in K] <= max_nb_patterns, Int)

    @constraint(csp, cov[o in 1:d.nborders],
            sum(x[k, o] * z[k] for k in K) >= d.orders[o].demand)

    @constraint(csp, knp[k in K],
            sum(x[k, o] * d.orders[o].width for o in 1:d.nborders)
            <= y[k] * d.stocksheetswidth )

    @objective(csp, Min, sum(z[k] for k in K))

    # setting Dantzig-Wolfe decomposition: one subproblem per machine
    function csp_decomp_func(name, key)
      if name == :cov
          return 0
      else
          return key[1] # which is not an int but an IdenticalBlockIndex
      end
    end
    Coluna.set_dantzig_wolfe_decompostion(csp, csp_decomp_func)

    # setting pricing cardinality bounds (NOT NEEDED ANYMORE)
    # card_bounds_dict = Dict(1 => (0, max_nb_patterns))
    # Coluna.set_dantzig_wolfe_cardinality_bounds(csp, card_bounds_dict)

    return (csp, x,  y)
end

Then we can access the solution using:

for k in value(K)
    @show value(z[k])
    for o in 1:d.nborders
        @show value(x[k, o])
    end
end

Advantages :

  • easier to understand for users.
  • the original formulation could be solved by a solver without any modification/decomposition
  • accessing the solution is very intuitive and does not require new tools other than those of JuMP.

Drawbacks :

  • ~~It requires communication of indices to MOI which is not the case today. MOI only keeps a single int identifier for each variable and that's the only thing communicated to solvers.~~
  • It requires generating variables on JuMP side after the model is solved or an extra number of variables before the model is solved.
  • blocks (master, pricing sp, ...) are not annotated only with ints
  • longer to write for expert users
