Remove GenericModel function
odow committed Jun 4, 2023
1 parent 0522f5a commit e2595df
Showing 8 changed files with 68 additions and 49 deletions.
14 changes: 7 additions & 7 deletions docs/src/manual/models.md
@@ -23,7 +23,7 @@ well as which solver to use and even solution information.

## Create a model

Create a model by passing an optimizer to [`GenericModel`](@ref):
Create a model by passing an optimizer to [`Model`](@ref):
```jldoctest
julia> model = Model(HiGHS.Optimizer)
A JuMP Model
@@ -56,7 +56,7 @@ julia> set_optimizer(model, HiGHS.Optimizer)
### What is the difference?

For most models, there is no difference between passing the optimizer to
[`GenericModel`](@ref), and calling [`set_optimizer`](@ref).
[`Model`](@ref), and calling [`set_optimizer`](@ref).

However, if an optimizer does not support a constraint in the model, the timing
of when an error will be thrown can differ:
@@ -66,7 +66,7 @@ of when an error will be thrown can differ:
* If you call [`set_optimizer`](@ref), an error will be thrown when you try to
solve the model via [`optimize!`](@ref).

Therefore, most users should pass an optimizer to [`GenericModel`](@ref) because it
Therefore, most users should pass an optimizer to [`Model`](@ref) because it
provides the earliest warning that your solver is not suitable for the model you
are trying to build. However, if you are modifying a problem by adding and
deleting different constraint types, you may need to use
@@ -115,7 +115,7 @@ The list of available solvers, along with the problem types they support, is ava

Some solvers accept (or require) positional arguments such as a license
environment or a path to a binary executable. For these solvers, you can pass
a function to [`GenericModel`](@ref) which takes zero arguments and returns an instance
a function to [`Model`](@ref) which takes zero arguments and returns an instance
of the optimizer.
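A sketch of this zero-argument-function pattern, assuming HiGHS and MultiObjectiveAlgorithms are installed:

```julia
using JuMP
import HiGHS
import MultiObjectiveAlgorithms as MOA

# The anonymous function takes zero arguments and returns an optimizer
# instance, so JuMP can construct the sub-solver on demand:
model = Model(() -> MOA.Optimizer(HiGHS.Optimizer))
```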

A common use-case for this is passing an environment or sub-solver to the
@@ -416,15 +416,15 @@ reduced_cost(x) # Works
want to skip this section. You don't need to know how JuMP manages problems
behind the scenes to create and solve JuMP models.

A JuMP [`GenericModel`](@ref) is a thin layer around a *backend* of type
A JuMP [`Model`](@ref) is a thin layer around a *backend* of type
[`MOI.ModelLike`](@ref) that stores the optimization problem and acts as the
optimization solver.

However, if you construct a model like `Model(HiGHS.Optimizer)`, the backend is
not a `HiGHS.Optimizer`, but a more complicated object.

From JuMP, the MOI backend can be accessed using the [`backend`](@ref) function.
Let's see what the [`backend`](@ref) of a JuMP [`GenericModel`](@ref) is:
Let's see what the [`backend`](@ref) of a JuMP [`Model`](@ref) is:
```jldoctest models_backends
julia> model = Model(HiGHS.Optimizer);
@@ -499,7 +499,7 @@ A `CachingOptimizer` has two modes of operation:
[`MOIU.attach_optimizer(::JuMP.Model)`](@ref). Attempting to perform
an operation in the incorrect state results in an error.

By default [`GenericModel`](@ref) will create a `CachingOptimizer` in `AUTOMATIC` mode.
By default [`Model`](@ref) will create a `CachingOptimizer` in `AUTOMATIC` mode.
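A sketch of driving these states by hand, using the functions referenced above (assuming HiGHS is installed; `JuMP.MOIU` is JuMP's alias for `MOI.Utilities`):

```julia
using JuMP, HiGHS

model = Model(HiGHS.Optimizer)  # a `CachingOptimizer` in `AUTOMATIC` mode
@variable(model, x >= 0)
@objective(model, Min, x)
optimize!(model)  # in AUTOMATIC mode, the optimizer is attached as needed

# Detach the optimizer (`EMPTY_OPTIMIZER` state), then re-attach it
# (`ATTACHED_OPTIMIZER` state):
JuMP.MOIU.reset_optimizer(model)
JuMP.MOIU.attach_optimizer(model)
```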

### LazyBridgeOptimizer

2 changes: 1 addition & 1 deletion docs/src/reference/models.md
@@ -6,7 +6,7 @@ the manual.
## Constructors

```@docs
GenericModel
Model
direct_model
```

@@ -135,7 +135,7 @@ using JuMP
using HiGHS

# JuMP builds problems incrementally in a `Model` object. Create a model by
# passing an optimizer to the [`Model`](@ref GenericModel) function:
# passing an optimizer to the [`Model`](@ref) function:

model = Model(HiGHS.Optimizer)

2 changes: 1 addition & 1 deletion docs/src/tutorials/getting_started/performance_tips.jl
@@ -57,7 +57,7 @@ using HiGHS # hide
# At present, the majority of the latency problems are caused by JuMP's bridging
# mechanism. If you only use constraints that are natively supported by the
# solver, you can disable bridges by passing `add_bridges = false` to
# [`Model`](@ref GenericModel).
# [`Model`](@ref).

model = Model(HiGHS.Optimizer; add_bridges = false)

85 changes: 52 additions & 33 deletions src/JuMP.jl
@@ -81,7 +81,7 @@ defaults to `Float64` if it is not implemented.
"""
value_type(::Type{<:AbstractModel}) = Float64

mutable struct GenericModel{T} <: AbstractModel
mutable struct GenericModel{T<:Real} <: AbstractModel
# In MANUAL and AUTOMATIC modes, CachingOptimizer.
# In DIRECT mode, will hold an AbstractOptimizer.
moi_backend::MOI.AbstractOptimizer
@@ -132,33 +132,38 @@ function Base.getproperty(model::GenericModel, name::Symbol)
return getfield(model, name)
end

const Model = GenericModel{Float64}

"""
GenericModel{T}()
Model(
[optimizer_factory;]
add_bridges::Bool = true,
value_type::Type{T} = Float64,
) where {T<:Real}
Return a new JuMP model of value type `T` without any optimizer; the model is
stored in a cache.
Create a new instance of a JuMP model.
Use [`set_optimizer`](@ref) to set the optimizer before calling
[`optimize!`](@ref).
"""
function GenericModel{T}() where {T}
caching_opt = MOIU.CachingOptimizer(
MOIU.UniversalFallback(MOIU.Model{T}()),
MOIU.AUTOMATIC,
)
return direct_model(caching_opt, T)
end
If `optimizer_factory` is provided, the model is initialized with the optimizer
returned by `MOI.instantiate(optimizer_factory)`.
const Model = GenericModel{Float64}
If `optimizer_factory` is not provided, use [`set_optimizer`](@ref) to set the
optimizer before calling [`optimize!`](@ref).
"""
GenericModel{T}(optimizer_factory; add_bridges::Bool = true) where {T}
If `add_bridges`, JuMP adds a [`MOI.Bridges.LazyBridgeOptimizer`](@ref) to
automatically reformulate the problem into a form supported by the optimizer.
Return a new JuMP model with the provided optimizer, bridge settings and
[`value_type`](@ref) `T`.
## `value_type`
See [`set_optimizer`](@ref) for the description of the `optimizer_factory` and
`add_bridges` arguments.
Passing a value type other than `Float64` is an advanced operation. The value
type must match that expected by the chosen optimizer. Consult the optimizer's
documentation for details.
If not documented, assume that the optimizer supports only `Float64`.
Choosing an unsupported value type will throw an [`MOI.UnsupportedConstraint`](@ref)
or an [`MOI.UnsupportedAttribute`](@ref) error, the timing of which (during
model construction or during a call to [`optimize!`](@ref)) depends on how the
solver is interfaced to JuMP.
## Example
@@ -169,32 +169,43 @@ julia> model = Model(Ipopt.Optimizer);
julia> solver_name(model)
"Ipopt"
```
```jldoctest
julia> import HiGHS
julia> import MultiObjectiveAlgorithms as MOA
julia> model = Model(() -> MOA.Optimizer(HiGHS.Optimizer); add_bridges = false);
julia> model = Model(; value_type = Int);
julia> typeof(model)
GenericModel{Int64}
```
"""
function GenericModel{T}(
(@nospecialize optimizer_factory);
function Model(
(@nospecialize optimizer_factory) = nothing;
add_bridges::Bool = true,
) where {T}
model = GenericModel{T}()
set_optimizer(model, optimizer_factory; add_bridges = add_bridges)
value_type::Type{T} = Float64,
) where {T<:Real}
inner = MOI.Utilities.UniversalFallback(MOI.Utilities.Model{T}())
cache = MOI.Utilities.CachingOptimizer(inner, MOI.Utilities.AUTOMATIC)
model = direct_model(cache; value_type = value_type)
if optimizer_factory !== nothing
set_optimizer(model, optimizer_factory; add_bridges = add_bridges)
end
return model
end
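Under the signature above, the three call styles might look like the following sketch (the `value_type` keyword is the new argument; `Rational{BigInt}` is one illustrative choice of non-`Float64` value type):

```julia
using JuMP
import HiGHS

model1 = Model()                                 # no optimizer; set one later
model2 = Model(HiGHS.Optimizer)                  # optimizer attached up front
model3 = Model(; value_type = Rational{BigInt})  # advanced: exact arithmetic
```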

"""
direct_model(backend::MOI.ModelLike, T::Type = Float64)
direct_model(
backend::MOI.ModelLike;
value_type::Type{T} = Float64,
) where {T<:Real}
Return a new JuMP model using [`backend`](@ref) to store the model and solve it.
As opposed to the [`GenericModel`](@ref) constructor, no cache of the model is stored
outside of [`backend`](@ref) and no bridges are automatically applied to
As opposed to the [`Model`](@ref) constructor, no cache of the model is
stored outside of [`backend`](@ref) and no bridges are automatically applied to
[`backend`](@ref).
## Notes
@@ -204,14 +220,17 @@ in mind the following implications of creating models using this *direct* mode:
* When [`backend`](@ref) does not support an operation, such as modifying
constraints or adding variables/constraints after solving, an error is
thrown. For models created using the [`GenericModel`](@ref) constructor, such
thrown. For models created using the [`Model`](@ref) constructor, such
situations can be dealt with by storing the modifications in a cache and
loading them into the optimizer when `optimize!` is called.
* No constraint bridging is supported by default.
* The optimizer used cannot be changed after the model is constructed.
* The model created cannot be copied.
"""
function direct_model(backend::MOI.ModelLike, T::Type = Float64)
function direct_model(
backend::MOI.ModelLike;
value_type::Type{T} = Float64,
) where {T<:Real}
@assert MOI.is_empty(backend)
return GenericModel{T}(
backend,
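With the keyword form above, direct-mode construction might look like this sketch (assuming HiGHS is installed; the backend is the optimizer itself, with no cache and no bridges):

```julia
using JuMP
import HiGHS

# Pass an optimizer *instance* (an `MOI.ModelLike`), not a factory:
model = direct_model(HiGHS.Optimizer())
@variable(model, x >= 0)
@objective(model, Min, x)
optimize!(model)
```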
8 changes: 4 additions & 4 deletions src/copy.jl
@@ -106,7 +106,7 @@ constraint reference as argument.
## Note
Model copy is not supported in `DIRECT` mode, i.e. when a model is constructed
using the [`direct_model`](@ref) constructor instead of the [`GenericModel`](@ref)
using the [`direct_model`](@ref) constructor instead of the [`Model`](@ref)
constructor. Moreover, independently of whether an optimizer was provided at
model construction, the new model will have no optimizer, i.e., an optimizer
will have to be provided to the new model in the [`optimize!`](@ref) call.
@@ -146,7 +146,7 @@ function copy_model(
"able to copy the constructed model.",
)
end
new_model = GenericModel{T}()
new_model = Model(; value_type = T)

# At JuMP's level, filter_constraints should work with JuMP.ConstraintRef,
# whereas MOI.copy_to's filter_constraints works with MOI.ConstraintIndex.
@@ -210,7 +210,7 @@ and its copy.
## Note
Model copy is not supported in `DIRECT` mode, i.e. when a model is constructed
using the [`direct_model`](@ref) constructor instead of the [`GenericModel`](@ref)
using the [`direct_model`](@ref) constructor instead of the [`Model`](@ref)
constructor. Moreover, independently of whether an optimizer was provided at
model construction, the new model will have no optimizer, i.e., an optimizer
will have to be provided to the new model in the [`optimize!`](@ref) call.
@@ -258,7 +258,7 @@ This is a convenience function that provides a filtering function for
## Note
Model copy is not supported in `DIRECT` mode, i.e. when a model is constructed
using the [`direct_model`](@ref) constructor instead of the [`GenericModel`](@ref)
using the [`direct_model`](@ref) constructor instead of the [`Model`](@ref)
constructor. Moreover, independently of whether an optimizer was provided at
model construction, the new model will have no optimizer, i.e., an optimizer
will have to be provided to the new model in the [`optimize!`](@ref) call.
2 changes: 1 addition & 1 deletion src/file_formats.jl
@@ -140,7 +140,7 @@ function Base.read(
end
src = MOI.FileFormats.Model(; format = format, kwargs...)
read!(io, src)
model = GenericModel{T}()
model = Model(; value_type = T)
MOI.copy_to(model, src)
return model
end
2 changes: 1 addition & 1 deletion src/optimizer_interface.jl
@@ -615,7 +615,7 @@ struct OptimizeNotCalled <: Exception end
"""
struct NoOptimizer <: Exception end
No optimizer is set. The optimizer can be provided to the [`GenericModel`](@ref)
No optimizer is set. The optimizer can be provided to the [`Model`](@ref)
constructor or by calling [`set_optimizer`](@ref).
"""
struct NoOptimizer <: Exception end
