# The Utilities submodule

The Utilities submodule provides a variety of functionality for managing
`MOI.ModelLike` objects.

## Utilities.Model

[`Utilities.Model`](@ref) provides an implementation of a [`ModelLike`](@ref)
that efficiently supports all functions and sets defined within MOI. However,
given the extensibility of MOI, this might not cover all use cases.

Create a model as follows:
```jldoctest
julia> model = MOI.Utilities.Model{Float64}()
MOIU.Model{Float64}
```

## Utilities.UniversalFallback

[`Utilities.UniversalFallback`](@ref) is a layer that sits on top of any
[`ModelLike`](@ref) and provides non-specialized (slower) fallbacks for
constraints and attributes that the underlying [`ModelLike`](@ref) does not
support.

For example, [`Utilities.Model`](@ref) doesn't support some variable attributes
like [`VariablePrimalStart`](@ref), so JuMP uses a combination of
[`Utilities.UniversalFallback`](@ref) and [`Utilities.Model`](@ref) as a
generic problem cache:
```jldoctest
julia> model = MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}())
MOIU.UniversalFallback{MOIU.Model{Float64}}
fallback for MOIU.Model{Float64}
```

!!! warning
    Adding a `UniversalFallback` means that your `model` will now support all
    constraints, even if the inner model does not! This can lead to unexpected
    behavior.
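
For example, the following sketch (not one of this page's doctests) shows the
fallback layer storing an attribute that the inner model alone cannot; it uses
only the API introduced above:

```julia
import MathOptInterface
const MOI = MathOptInterface

# Build a cache: a Utilities.Model wrapped in a UniversalFallback.
inner = MOI.Utilities.Model{Float64}()
model = MOI.Utilities.UniversalFallback(inner)
x = MOI.add_variable(model)

# `inner` does not support VariablePrimalStart, but the fallback stores it:
MOI.set(model, MOI.VariablePrimalStart(), x, 1.0)
MOI.get(model, MOI.VariablePrimalStart(), x)  # returns 1.0
```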

## Utilities.@model

For advanced use cases that need efficient support for functions and sets
defined outside of MOI (but still known at compile time), we provide the
[`Utilities.@model`](@ref) macro.

The `@model` macro takes a name (for a new type, which must not exist yet),
eight tuples specifying the types of constraints that are supported, and then a
`Bool` indicating whether the type should be a subtype of
`MOI.AbstractOptimizer` (if `true`) or `MOI.ModelLike` (if `false`).

The eight tuples are in the following order:
1. Un-typed scalar sets, e.g., [`Integer`](@ref)
2. Typed scalar sets, e.g., [`LessThan`](@ref)
3. Un-typed vector sets, e.g., [`Nonnegatives`](@ref)
4. Typed vector sets, e.g., [`PowerCone`](@ref)
5. Un-typed scalar functions, e.g., [`SingleVariable`](@ref)
6. Typed scalar functions, e.g., [`ScalarAffineFunction`](@ref)
7. Un-typed vector functions, e.g., [`VectorOfVariables`](@ref)
8. Typed vector functions, e.g., [`VectorAffineFunction`](@ref)

The tuples can contain more than one element. Typed sets should be specified
without their type parameter, i.e., `MOI.LessThan`, not `MOI.LessThan{Float64}`.

Here is an example:
```jldoctest
julia> MOI.Utilities.@model(
           MyNewModel,
           (MOI.Integer,),               # Un-typed scalar sets
           (MOI.GreaterThan,),           # Typed scalar sets
           (MOI.Nonnegatives,),          # Un-typed vector sets
           (MOI.PowerCone,),             # Typed vector sets
           (MOI.SingleVariable,),        # Un-typed scalar functions
           (MOI.ScalarAffineFunction,),  # Typed scalar functions
           (MOI.VectorOfVariables,),     # Un-typed vector functions
           (MOI.VectorAffineFunction,),  # Typed vector functions
           true,                         # <:MOI.AbstractOptimizer?
       )

julia> model = MyNewModel{Float64}()
MyNewModel{Float64}
```

!!! warning
    `MyNewModel` supports every `SingleVariable`-in-Set constraint, as well as
    [`SingleVariable`](@ref), [`ScalarAffineFunction`](@ref), and
    [`ScalarQuadraticFunction`](@ref) objective functions. Implement
    `MOI.supports` as needed to forbid constraint and objective function
    combinations.

As another example, [PATHSolver](https://github.com/chkwon/PATHSolver.jl), which
only supports [`VectorAffineFunction`](@ref)-in-[`Complements`](@ref)
constraints, defines its optimizer as:
```jldoctest pathoptimizer
julia> MOI.Utilities.@model(
           PathOptimizer,
           (),                           # Un-typed scalar sets
           (),                           # Typed scalar sets
           (MOI.Complements,),           # Un-typed vector sets
           (),                           # Typed vector sets
           (),                           # Un-typed scalar functions
           (),                           # Typed scalar functions
           (),                           # Un-typed vector functions
           (MOI.VectorAffineFunction,),  # Typed vector functions
           true,                         # <:MOI.AbstractOptimizer?
       )
```
However, `PathOptimizer` does not support some `SingleVariable`-in-Set
constraints, so we must explicitly define:
```jldoctest pathoptimizer
julia> function MOI.supports_constraint(
           ::PathOptimizer,
           ::Type{MOI.SingleVariable},
           ::Type{
               <:Union{MOI.Semiinteger,MOI.Semicontinuous,MOI.ZeroOne,MOI.Integer},
           },
       )
           return false
       end
```

Finally, PATH doesn't support an objective function, so we need to add:
```jldoctest pathoptimizer
julia> MOI.supports(::PathOptimizer, ::MOI.ObjectiveFunction) = false
```

!!! warning
    This macro creates a new type, so it must be called from the top-level of a
    module; for example, it cannot be called from inside a function.

## Utilities.CachingOptimizer

A [`Utilities.CachingOptimizer`](@ref) is an MOI layer that abstracts the
difference between solvers that support incremental modification (e.g., they
support adding variables one-by-one), and solvers that require the entire
problem in a single API call (e.g., they only accept the `A`, `b`, and `c`
matrices of a linear program).

It has two parts:
1. A cache, where the model can be built and modified incrementally
2. An optimizer, which is used to solve the problem

```jldoctest pathoptimizer
julia> model = MOI.Utilities.CachingOptimizer(
           MOI.Utilities.Model{Float64}(),
           PathOptimizer{Float64}(),
       )
MOIU.CachingOptimizer{PathOptimizer{Float64},MOIU.Model{Float64}}
in state ATTACHED_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer PathOptimizer{Float64}
```

A [`Utilities.CachingOptimizer`](@ref) may be in one of three possible states:

* `NO_OPTIMIZER`: The `CachingOptimizer` does not have any optimizer.
* `EMPTY_OPTIMIZER`: The `CachingOptimizer` has an empty optimizer, and it is
  not synchronized with the cached model. Modifications are forwarded to the
  cache, but _not_ to the optimizer.
* `ATTACHED_OPTIMIZER`: The `CachingOptimizer` has an optimizer, and it is
  synchronized with the cached model. Modifications are forwarded to the
  optimizer. If the optimizer does not support modifications, an error will be
  thrown.

Use [`Utilities.reset_optimizer`](@ref) to go from `ATTACHED_OPTIMIZER` to
`EMPTY_OPTIMIZER`:
```jldoctest pathoptimizer
julia> MOI.Utilities.reset_optimizer(model)

julia> model
MOIU.CachingOptimizer{PathOptimizer{Float64},MOIU.Model{Float64}}
in state EMPTY_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer PathOptimizer{Float64}
```

Use [`Utilities.attach_optimizer`](@ref) to go from `EMPTY_OPTIMIZER` to
`ATTACHED_OPTIMIZER`:
```jldoctest pathoptimizer
julia> MOI.Utilities.attach_optimizer(model)

julia> model
MOIU.CachingOptimizer{PathOptimizer{Float64},MOIU.Model{Float64}}
in state ATTACHED_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer PathOptimizer{Float64}
```

!!! info
    You must be in the `ATTACHED_OPTIMIZER` state to call [`optimize!`](@ref).

Use [`Utilities.drop_optimizer`](@ref) to go from any state to `NO_OPTIMIZER`:
```jldoctest pathoptimizer
julia> MOI.Utilities.drop_optimizer(model)

julia> model
MOIU.CachingOptimizer{PathOptimizer{Float64},MOIU.Model{Float64}}
in state NO_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer nothing
```

Pass an empty optimizer to [`Utilities.reset_optimizer`](@ref) to go from
`NO_OPTIMIZER` to `EMPTY_OPTIMIZER`:
```jldoctest pathoptimizer
julia> MOI.Utilities.reset_optimizer(model, PathOptimizer{Float64}())

julia> model
MOIU.CachingOptimizer{PathOptimizer{Float64},MOIU.Model{Float64}}
in state EMPTY_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer PathOptimizer{Float64}
```

Deciding when to attach and reset the optimizer is tedious, and you will often
write code like this:
```julia
try
    # modification
catch
    MOI.Utilities.reset_optimizer(model)
    # Re-try modification
end
```

To make this easier, [`Utilities.CachingOptimizer`](@ref) has two modes of
operation:

* `AUTOMATIC`: The `CachingOptimizer` changes its state when necessary.
  Attempting to add a constraint or perform a modification not supported by the
  optimizer results in a drop to the `EMPTY_OPTIMIZER` state.
* `MANUAL`: The user must change the state of the `CachingOptimizer`. Attempting
  to perform an operation in the incorrect state results in an error.

By default, `AUTOMATIC` mode is chosen. However, you can create a
`CachingOptimizer` in `MANUAL` mode as follows:
```jldoctest pathoptimizer
julia> model = MOI.Utilities.CachingOptimizer(
           MOI.Utilities.Model{Float64}(),
           MOI.Utilities.MANUAL,
       )
MOIU.CachingOptimizer{MOI.AbstractOptimizer,MOIU.Model{Float64}}
in state NO_OPTIMIZER
in mode MANUAL
with model cache MOIU.Model{Float64}
with optimizer nothing

julia> MOI.Utilities.reset_optimizer(model, PathOptimizer{Float64}())

julia> model
MOIU.CachingOptimizer{MOI.AbstractOptimizer,MOIU.Model{Float64}}
in state EMPTY_OPTIMIZER
in mode MANUAL
with model cache MOIU.Model{Float64}
with optimizer PathOptimizer{Float64}
```
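
In `MANUAL` mode, no state transition happens on its own: you must call
[`Utilities.attach_optimizer`](@ref) yourself before solving. As a sketch (not
a doctest), assuming a `CachingOptimizer` in `MANUAL` mode whose optimizer has
just been reset:

```julia
import MathOptInterface
const MOI = MathOptInterface

# The CachingOptimizer is in state EMPTY_OPTIMIZER; attach by hand.
MOI.Utilities.attach_optimizer(model)

# Utilities.state queries the current state of a CachingOptimizer:
MOI.Utilities.state(model)  # returns MOI.Utilities.ATTACHED_OPTIMIZER

# Utilities.mode queries whether it is AUTOMATIC or MANUAL:
MOI.Utilities.mode(model)   # returns MOI.Utilities.MANUAL
```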

## Copy utilities

!!! info
    This section is unfinished.

### Allocate-Load API

The Allocate-Load API allows solvers that do not support loading the problem
incrementally to implement [`copy_to`](@ref).

The functions are called in the following order:

1) [`Utilities.allocate_variables`](@ref)
2) [`Utilities.allocate`](@ref) and [`Utilities.allocate_constraint`](@ref)
3) [`Utilities.load_variables`](@ref)
4) [`Utilities.load`](@ref) and [`Utilities.load_constraint`](@ref)

## Fallbacks

The value of some attributes can be inferred from the value of other
attributes.

For example, the value of [`ObjectiveValue`](@ref) can be computed using
[`ObjectiveFunction`](@ref) and [`VariablePrimal`](@ref).

When a solver gives direct access to an attribute, it is better to return this
value. However, if this is not the case, [`Utilities.get_fallback`](@ref) can be
used instead. For example:
```julia
function MOI.get(model::Optimizer, attr::MOI.ObjectiveValue)
    return MOI.Utilities.get_fallback(model, attr)
end
```
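
To make the idea concrete, here is a small sketch of what such a fallback
computes for [`ObjectiveValue`](@ref): evaluate the objective function at the
primal point. The helper name and the `Dict` standing in for
[`VariablePrimal`](@ref) are illustrative, not part of MOI:

```julia
import MathOptInterface
const MOI = MathOptInterface

# Evaluate a scalar affine objective at a primal point. `primal` maps each
# variable index to its primal value, standing in for VariablePrimal.
function evaluate_objective(f::MOI.ScalarAffineFunction, primal::Dict)
    return MOI.Utilities.eval_variables(x -> primal[x], f)
end

x = MOI.VariableIndex(1)
f = MOI.ScalarAffineFunction([MOI.ScalarAffineTerm(3.0, x)], 1.0)  # 3x + 1
evaluate_objective(f, Dict(x => 2.0))  # returns 7.0
```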