Comments (9)
I suggest we take the discussion over to... 🚧 MathOptPresolve.jl 🚧
from tulip.jl.
It seems reasonable to me to keep the internal representation the same, and then add an MOI and/or MatrixOptInterface layer on top as needed. Based on my brief look, I don't see any reason why you could not directly add the MIP information to your internal data structures. Generally, I'm not all that concerned about the indirection of going through MOI, so I'm fine with tying the API to MOI if others are as well.
Ideally, there would also be a programmatic way to configure the `presolve!` function.
I think it makes sense to eventually break things off into smaller packages, especially if several projects use it.
At this point, Tulip's presolve is almost self-contained in `src/Presolve`. It is still tied to some Tulip-level data structures, notably `ProblemData` and, to a lesser extent, `Model` and `Solution` (which handle the interface with the internal IPM optimizers).
My current belief is that a "stand-alone" presolve package should be able to operate as follows:
- Receive the original problem in some format
- Perform presolve (this can be black-box)
- Return presolved model (in a format similar to the original) and necessary ingredients for pre/post crush
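The three steps above could be sketched as follows. This is a toy illustration, not an existing API: the names `presolve`, `postsolve`, and `PostsolveData` are hypothetical, and variable fixing stands in for a real reduction suite.

```julia
# Hypothetical workflow sketch: presolve removes variables fixed by their
# bounds, folds them into the objective constant, and returns the data
# needed to map a reduced solution back to the original space.
struct PostsolveData
    fixed::Vector{Tuple{Int,Float64}}  # (original index, fixed value)
    keep::Vector{Int}                  # original indices of surviving variables
end

# Remove variables with lc[j] == uc[j]; fold them into the constant term c0.
function presolve(c::Vector{Float64}, lc::Vector{Float64}, uc::Vector{Float64})
    fixed = Tuple{Int,Float64}[]
    keep = Int[]
    c0 = 0.0
    for j in eachindex(c)
        if lc[j] == uc[j]
            push!(fixed, (j, lc[j]))
            c0 += c[j] * lc[j]
        else
            push!(keep, j)
        end
    end
    return c[keep], lc[keep], uc[keep], c0, PostsolveData(fixed, keep)
end

# "Post-crush": recover a solution of the original problem from a solution
# of the presolved one.
function postsolve(x_red::Vector{Float64}, data::PostsolveData)
    x = zeros(length(data.keep) + length(data.fixed))
    x[data.keep] .= x_red
    for (j, v) in data.fixed
        x[j] = v
    end
    return x
end
```

For example, presolving `c = [1, 2, 3]` with bounds fixing the second variable to 5 yields a two-variable problem plus the postsolve data needed to reinsert `x[2] = 5` later.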
Pointer for some inspiration: COIN-OR's OSIPresolve.
Some questions to fuel the discussion:
- What classes of problems? (MI)LP? (MI)Conic? (MI)NLP?
- What would it be interfaced to? MOI? Something lower-level?
cc @frapac
We only care about MILP.
For our purposes, we don't necessarily want to be tied to a particular solver, and will eventually pipe the model through MOI anyway (to solve and/or bridge). So, working at the MOI level is probably best for us.
Is there anything about your current approach that will not map nicely to MOI?
We will also potentially want to disable certain presolve routines but not others (e.g. in order to keep problem dimensions the same). So baking that into the API would be very useful.
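One way such per-routine toggles could look, sketched here under assumed names (`PresolveOptions` and the rule names are illustrative, not an existing Tulip API): a keyword-constructed options struct, where the routines that change problem dimensions can be switched off independently of those that only tighten the problem.

```julia
# Hypothetical configuration sketch: each presolve routine gets a toggle.
# Bound tightening preserves problem dimensions; row/variable removal does not.
Base.@kwdef struct PresolveOptions
    tighten_bounds::Bool = true          # dimension-preserving
    remove_empty_rows::Bool = true       # changes dimensions
    remove_fixed_variables::Bool = true  # changes dimensions
end

# Collect the rules a presolve driver would actually run for these options.
function enabled_rules(opts::PresolveOptions)
    rules = Symbol[]
    opts.tighten_bounds && push!(rules, :tighten_bounds)
    opts.remove_empty_rows && push!(rules, :remove_empty_rows)
    opts.remove_fixed_variables && push!(rules, :remove_fixed_variables)
    return rules
end
```

With this shape, keeping problem dimensions intact is a matter of constructing `PresolveOptions(remove_empty_rows=false, remove_fixed_variables=false)`.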
Internally, the presolve code works with an LP representation

```
min  c'x + c₀
s.t. lr ≤ Ax ≤ ur
     lc ≤ x ≤ uc
```

where `A` is stored both row-by-row and column-by-column (to allow fast row-wise and column-wise access). Bounds, right-hand sides, and integrality requirements (if you were going MILP) are stored in vectors.
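A rough sketch of such a representation, using only the standard library's SparseArrays; the field and function names are illustrative, not Tulip's actual ones. Keeping both `A` and its transpose in CSC form gives cheap access along either axis, since the columns of the stored transpose are the rows of `A`:

```julia
using SparseArrays

# Illustrative LP container for:  min c'x + c0  s.t.  lr <= A*x <= ur,  lc <= x <= uc
struct LPData
    c::Vector{Float64}
    c0::Float64
    A_cols::SparseMatrixCSC{Float64,Int}  # column-wise storage: fast column access
    A_rows::SparseMatrixCSC{Float64,Int}  # transpose of A: its columns are A's rows
    lr::Vector{Float64}                   # row (constraint) lower bounds
    ur::Vector{Float64}                   # row (constraint) upper bounds
    lc::Vector{Float64}                   # column (variable) lower bounds
    uc::Vector{Float64}                   # column (variable) upper bounds
    is_integer::BitVector                 # integrality flags (for MILP)
end

# Build both orientations from a single matrix.
LPData(c, c0, A, lr, ur, lc, uc, int) =
    LPData(c, c0, sparse(A), sparse(A'), lr, ur, lc, uc, int)

# Fast row-wise access: nonzeros of row i are column i of the stored transpose.
row_nonzeros(lp::LPData, i) = findnz(lp.A_rows[:, i])
```

The cost of this layout is that every reduction must update both copies of `A` consistently, which is the usual trade-off in presolve implementations.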
To interface with MOI, you need to do the conversion MOI -> LP and then back LP -> MOI.
It's not hard, and should be even easier once the functionalities of MatrixOptInterface get merged into MOI.
I would also be interested in having a presolve package for https://github.com/exanauts/Simplex.jl
I do not know when MatrixOptInterface will be merged into MOI. Maybe it would make sense to build a MOIPresolve.jl on top of MatOI, but I do not know if that's the best solution available.
(By the way, we have some cycles to spend on making this happen, once we converge on a plan).
I'm also very interested in building upon Tulip's presolve but I don't use MOI. It would be great to simply pass an LP or QP in "matrix form" and receive a presolved problem in the same format. It seems there could be a low-level API with an MOI layer on top.
This issue has been stale for 2 years; closing.
For reference:
- Tulip's presolve code has been included in MathOptPresolve
- For other presolve capabilities, see PaPILO and the Julia interface PaPILO.jl