Merge pull request #18 from SciML/RoA_approximation
Added training option for region of attraction approximation
nicholaskl97 authored Mar 6, 2024
2 parents 5be782c + a3d694c commit 2130100
Showing 9 changed files with 717 additions and 469 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -4,3 +4,4 @@
 /docs/Manifest.toml
 /docs/build/
 Manifest.toml
+.vscode/settings.json
15 changes: 9 additions & 6 deletions src/NeuralLyapunov.jl
@@ -7,16 +7,19 @@ using ModelingToolkit
 import SciMLBase

 include("conditions_specification.jl")
+include("structure_specification.jl")
+include("minimization_conditions.jl")
+include("decrease_conditions.jl")
+include("decrease_conditions_RoA_aware.jl")
 include("NeuralLyapunovPDESystem.jl")
 include("local_Lyapunov.jl")

 export NeuralLyapunovPDESystem, NumericalNeuralLyapunovFunctions
 export local_Lyapunov
-export NeuralLyapunovSpecification, NeuralLyapunovStructure,
-    UnstructuredNeuralLyapunov, NonnegativeNeuralLyapunov,
-    PositiveSemiDefiniteStructure,
-    LyapunovMinimizationCondition, StrictlyPositiveDefinite,
-    PositiveSemiDefinite, DontCheckNonnegativity, LyapunovDecreaseCondition,
-    AsymptoticDecrease, ExponentialDecrease, DontCheckDecrease
+export NeuralLyapunovSpecification, NeuralLyapunovStructure, UnstructuredNeuralLyapunov,
+    NonnegativeNeuralLyapunov, PositiveSemiDefiniteStructure,
+    LyapunovMinimizationCondition, StrictlyPositiveDefinite, PositiveSemiDefinite,
+    DontCheckNonnegativity, LyapunovDecreaseCondition, AsymptoticDecrease,
+    ExponentialDecrease, DontCheckDecrease, RoAAwareDecreaseCondition, make_RoA_aware

 end
84 changes: 43 additions & 41 deletions src/NeuralLyapunovPDESystem.jl
@@ -1,28 +1,26 @@
 """
     NeuralLyapunovPDESystem(dynamics, lb, ub, spec; fixed_point, ps)
-Constructs a ModelingToolkit PDESystem to train a neural Lyapunov function
-Returns the PDESystem and a function representing the neural network, which
-operates columnwise.
-The neural Lyapunov function will only be trained for { x : lb .≤ x .≤ ub }.
-The Lyapunov function will be for the dynamical system represented by dynamics
-If dynamics is an ODEProblem or ODEFunction, then the corresponding ODE; if
-dynamics is a function, then the ODE is ẋ = dynamics(x, p, t). This ODE should
-not depend on t (time t=0.0 alone will be used) and should have a fixed point
-at x = fixed_point. The particular Lyapunov conditions to be used and structure
-of the neural Lyapunov function are specified through spec, which is a
-NeuralLyapunovSpecification.
-The returned neural network function takes three inputs: the neural network
-structure phi, the trained network parameters, and a matrix of inputs to
-operate on columnwise.
-If dynamics requires parameters, their values can be supplied through the
-Vector p, or through dynamics.p if dynamics isa ODEProblem (in which case, let
-the other be SciMLBase.NullParameters()). If dynamics is an ODEFunction and
-dynamics.paramsyms is defined, then p should have the same order.
+Constructs a ModelingToolkit `PDESystem` to train a neural Lyapunov function
+Returns the `PDESystem` and a function representing the neural network, which operates
+columnwise.
+The neural Lyapunov function will only be trained for `{ x : lb .≤ x .≤ ub }`. The Lyapunov
+function will be for the dynamical system represented by `dynamics`. If `dynamics` is an
+`ODEProblem` or `ODEFunction`, then the corresponding ODE; if `dynamics` is a function, then
+the ODE is `ẋ = dynamics(x, p, t)`. This ODE should not depend on `t` (time `t=0.0` alone
+will be used) and should have a fixed point at `x = fixed_point`. The particular Lyapunov
+conditions to be used and structure of the neural Lyapunov function are specified through
+`spec`, which is a `NeuralLyapunovSpecification`.
+The returned neural network function takes three inputs: the neural network structure `phi`,
+the trained network parameters, and a matrix of inputs to operate on columnwise.
+If `dynamics` requires parameters, their values can be supplied through the Vector `p`, or
+through the parameters of `dynamics` if `dynamics isa ODEProblem` (in which case, let
+the other be `SciMLBase.NullParameters()`). If `dynamics` is an `ODEFunction` and
+`dynamics.paramsyms` is defined, then `p` should have the same order.
 """
 function NeuralLyapunovPDESystem(
     dynamics::ODEFunction,
@@ -128,7 +126,9 @@ function NeuralLyapunovPDESystem(
     elseif dynamics.p == p
         p
     else
-        throw(ErrorException("Conflicting parameter definitions. Please define parameters only through p or dynamics.p; the other should be SciMLBase.NullParameters()"))
+        throw(ErrorException("Conflicting parameter definitions. Please define parameters" *
+                             " only through p or dynamics.p; the other should be " *
+                             "SciMLBase.NullParameters()"))
     end

     return NeuralLyapunovPDESystem(
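
[Editorial aside, not part of this commit: a minimal hedged sketch of how the documented call might look for a parameter-free dynamics function. Only the call shape `NeuralLyapunovPDESystem(dynamics, lb, ub, spec; fixed_point)` comes from the docstring above; the pendulum dynamics, bounds, and the pre-built `spec` are placeholder assumptions.]

    using NeuralLyapunov

    # ẋ = f(x, p, t); no explicit t-dependence, fixed point at the origin
    f(x, p, t) = [x[2], -x[2] - sin(x[1])]   # damped pendulum, for illustration only

    lb = [-2π, -2.0]   # lower bounds of the training domain { x : lb .≤ x .≤ ub }
    ub = [2π, 2.0]     # upper bounds of the training domain

    # `spec` is assumed to be a NeuralLyapunovSpecification assembled from the exported
    # structure / minimization-condition / decrease-condition types; its constructor is
    # not shown in this diff.
    pde_system, network_func = NeuralLyapunovPDESystem(
        f, lb, ub, spec;
        fixed_point = zeros(2),
    )
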
@@ -269,20 +269,21 @@ function _NeuralLyapunovPDESystem(
 end

 """
-    NumericalNeuralLyapunovFunctions(phi, θ, network_func, structure, dynamics, fixed_point; jac, J_net)
+    NumericalNeuralLyapunovFunctions(phi, θ, network_func, structure, dynamics, fixed_point;
+        jac, J_net)
-Returns the Lyapunov function, its time derivative, and its gradient: V(state),
-V̇(state), and ∇V(state)
+Returns the Lyapunov function, its time derivative, and its gradient: `V(state)`,
+`V̇(state)`, and `∇V(state)`
-These functions can operate on a state vector or columnwise on a matrix of state
-vectors. phi is the neural network with parameters θ. network_func(phi, θ, state)
-is an output of NeuralLyapunovPDESystem, which evaluates the neural network
-represented phi with parameters θ at state.
+These functions can operate on a state vector or columnwise on a matrix of state vectors.
+`phi` is the neural network with parameters `θ`. `network_func(phi, θ, state)` is an output
+of `NeuralLyapunovPDESystem`, which evaluates the neural network represented by `phi` with
+parameters `θ` at `state`.
 The Lyapunov function structure is specified in structure, which is a
-NeuralLyapunovStructure. The Jacobian of the network is either specified via
-J_net(_phi, _θ, state) or calculated using jac, which defaults to
-ForwardDiff.jacobian
+`NeuralLyapunovStructure`. The Jacobian of the network is either specified via
+`J_net(_phi, _θ, state)` or calculated using `jac`, which defaults to
+`ForwardDiff.jacobian`.
 """
 function NumericalNeuralLyapunovFunctions(
     phi,
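
[Editorial aside, not part of this commit: a hedged sketch of calling the method documented above after training. Only the argument order `(phi, θ, network_func, structure, dynamics, fixed_point; jac, J_net)` is taken from the docstring; the `discretization.phi` / `res.u` names follow common NeuralPDE/Optimization usage and, like `spec.structure`, are assumptions here.]

    import ForwardDiff

    V, V̇, ∇V = NumericalNeuralLyapunovFunctions(
        discretization.phi,          # neural network structure `phi` (assumed NeuralPDE discretization)
        res.u,                       # trained parameters `θ` (assumed optimization result)
        network_func,                # second return value of NeuralLyapunovPDESystem
        spec.structure,              # the NeuralLyapunovStructure (field name assumed)
        f,                           # the dynamics used during training
        zeros(2);                    # fixed_point
        jac = ForwardDiff.jacobian,  # the documented default, shown explicitly
    )

    V(zeros(2))            # Lyapunov candidate at the fixed point
    states = rand(2, 100)
    V(states)              # columnwise evaluation on a matrix of states
    V̇(states)              # time derivative along the dynamics, also columnwise
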
@@ -328,18 +329,19 @@ function NumericalNeuralLyapunovFunctions(
 end

 """
-    NumericalNeuralLyapunovFunctions(phi, θ, network_func, V_structure, dynamics, fixed_point, grad)
+    NumericalNeuralLyapunovFunctions(phi, θ, network_func, V_structure, dynamics,
+        fixed_point, grad)
-Returns the Lyapunov function, its time derivative, and its gradient: V(state),
-V̇(state), and ∇V(state)
+Returns the Lyapunov function, its time derivative, and its gradient: `V(state)`,
+`V̇(state)`, and `∇V(state)`.
-These functions can operate on a state vector or columnwise on a matrix of state
-vectors. phi is the neural network with parameters θ. network_func is an output
-of NeuralLyapunovPDESystem.
+These functions can operate on a state vector or columnwise on a matrix of state vectors.
+`phi` is the neural network with parameters `θ`. `network_func` is an output of
+`NeuralLyapunovPDESystem`.
 The Lyapunov function structure is defined by
-V_structure(_network_func, state, fixed_point)
-Its gradient is calculated using grad, which defaults to ForwardDiff.gradient.
+`V_structure(_network_func, state, fixed_point)`
+Its gradient is calculated using `grad`, which defaults to `ForwardDiff.gradient`.
 """
 function NumericalNeuralLyapunovFunctions(
     phi,
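
[Editorial aside, not part of this commit: the second method documented above takes the Lyapunov structure directly as a function `V_structure(_network_func, state, fixed_point)` plus a gradient routine. The exact object passed as `_network_func` is not shown in this diff, so the closure-style definition below is an assumption, as are the trained-network names reused from the previous sketch.]

    import ForwardDiff

    # Assumed convention: `net(state)` returns the network output at `state`.
    V_structure(net, state, fixed_point) = sum(abs2, net(state) .- net(fixed_point))

    V, V̇, ∇V = NumericalNeuralLyapunovFunctions(
        discretization.phi, res.u, network_func,
        V_structure, f, zeros(2), ForwardDiff.gradient,
    )
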
