Commit b218ed6
Merge pull request #881 from ArnoStrouwen/qa
Aqua and typos CI
ChrisRackauckas authored Dec 3, 2023
2 parents 4bc8452 + 00541d4
Showing 8 changed files with 37 additions and 22 deletions.
13 changes: 13 additions & 0 deletions .github/workflows/SpellCheck.yml
@@ -0,0 +1,13 @@
name: Spell Check

on: [pull_request]

jobs:
typos-check:
name: Spell Check with Typos
runs-on: ubuntu-latest
steps:
- name: Checkout Actions Repository
uses: actions/checkout@v3
- name: Check spelling
uses: crate-ci/typos@v1.16.23
2 changes: 2 additions & 0 deletions .typos.toml
@@ -0,0 +1,2 @@
[default.extend-words]
Teh = "Teh"
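
As a note on the entry above: in crate-ci/typos, the `[default.extend-words]` table maps a flagged spelling to its accepted form, so mapping a word to itself whitelists it — here `Teh` is deliberately kept as-is (presumably because it appears intentionally somewhere in the package). A hypothetical extended config, not part of this commit, might look like:

```toml
# .typos.toml — illustrative sketch, not from the commit
[default.extend-words]
# Map a word to itself to accept it verbatim (suppress the false positive):
Teh = "Teh"
# Map a genuine typo to its correction so `typos --write-changes` can fix it:
mispelled = "misspelled"
```

The same file is picked up automatically by the `crate-ci/typos` GitHub Action added in the workflow above.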
2 changes: 1 addition & 1 deletion docs/src/examples/GPUs.md
@@ -79,7 +79,7 @@ using Lux, Optimization, OptimizationOptimisers, Zygote, OrdinaryDiffEq,
Plots, LuxCUDA, SciMLSensitivity, Random, ComponentArrays
import DiffEqFlux: NeuralODE

-CUDA.allowscalar(false) # Makes sure no slow operations are occuring
+CUDA.allowscalar(false) # Makes sure no slow operations are occurring

#rng for Lux.setup
rng = Random.default_rng()
4 changes: 2 additions & 2 deletions docs/src/examples/neural_ode_weather_forecast.md
@@ -5,7 +5,7 @@ This example is adapted from [Forecasting the weather with neural ODEs - Sebatia

## The data

-The data is a four-dimensional dataset of daily temperature, humidity, wind speed and pressure meassured over four years in the city Delhi. Let us download and plot it.
+The data is a four-dimensional dataset of daily temperature, humidity, wind speed and pressure measured over four years in the city Delhi. Let us download and plot it.

```julia
using Random, Dates, Optimization, ComponentArrays, Lux, OptimizationOptimisers, DiffEqFlux,
@@ -29,7 +29,7 @@ df = download_data()

```julia
FEATURES = [:meantemp, :humidity, :wind_speed, :meanpressure]
-UNITS = ["Celcius", "g/m³ of water", "km/h", "hPa"]
+UNITS = ["Celsius", "g/m³ of water", "km/h", "hPa"]
FEATURE_NAMES = ["Mean temperature", "Humidity", "Wind speed", "Mean pressure"]

function plot_data(df)
4 changes: 2 additions & 2 deletions docs/src/index.md
@@ -29,7 +29,7 @@ The following layer functions exist:
- [Collocation-Based Neural ODEs (Neural ODEs without a solver, by far the fastest way!)](https://www.degruyter.com/document/doi/10.1515/sagmb-2020-0025/html)
- [Multiple Shooting Neural Ordinary Differential Equations](https://arxiv.org/abs/2109.06786)
- [Neural Stochastic Differential Equations (Neural SDEs)](https://arxiv.org/abs/1907.07587)
-- [Neural Differential-Algebriac Equations (Neural DAEs)](https://arxiv.org/abs/2001.04385)
+- [Neural Differential-Algebraic Equations (Neural DAEs)](https://arxiv.org/abs/2001.04385)
- [Neural Delay Differential Equations (Neural DDEs)](https://arxiv.org/abs/2001.04385)
- [Augmented Neural ODEs](https://arxiv.org/abs/1904.01681)
- [Hamiltonian Neural Networks (with specialized second order and symplectic integrators)](https://arxiv.org/abs/1906.01563)
@@ -63,7 +63,7 @@ the precision of the arguments are correct, and anything that requires alternati

Lux.jl has none of these issues, is simpler to work with due to the parameters in its function calls being explicit rather than implicit global
references, and achieves higher performance. It is built on the same foundations as Flux.jl, such as Zygote and NNLib, and thus it supports the
-same layers underneith and calls the same kernels. The better performance comes from not having the overhead of `restructure` required.
+same layers underneath and calls the same kernels. The better performance comes from not having the overhead of `restructure` required.
Thus we highly recommend people use Lux instead and only use the Flux fallbacks for legacy code.

## Citation
2 changes: 1 addition & 1 deletion src/hnn.jl
@@ -75,7 +75,7 @@ end
"""
NeuralHamiltonianDE(model, tspan, args...; kwargs...)
-Contructs a Neural Hamiltonian DE Layer for solving Hamiltonian Problems parameterized by a
+Constructs a Neural Hamiltonian DE Layer for solving Hamiltonian Problems parameterized by a
Neural Network [`HamiltonianNN`](@ref).
Arguments:
16 changes: 8 additions & 8 deletions src/neural_de.jl
@@ -19,7 +19,7 @@ Arguments:
- `tspan`: The timespan to be solved on.
- `alg`: The algorithm used to solve the ODE. Defaults to `nothing`, i.e. the
default algorithm from DifferentialEquations.jl.
-- `sensealg`: The choice of differentiation algorthm used in the backpropogation.
+- `sensealg`: The choice of differentiation algorithm used in the backpropogation.
Defaults to an adjoint method. See
the [Local Sensitivity Analysis](https://docs.sciml.ai/DiffEqDocs/stable/analysis/sensitivity/)
documentation for more details.
@@ -69,7 +69,7 @@ Arguments:
- `tspan`: The timespan to be solved on.
- `alg`: The algorithm used to solve the ODE. Defaults to `nothing`, i.e. the
default algorithm from DifferentialEquations.jl.
-- `sensealg`: The choice of differentiation algorthm used in the backpropogation.
+- `sensealg`: The choice of differentiation algorithm used in the backpropogation.
- `kwargs`: Additional arguments splatted to the ODE solver. See the
[Common Solver Arguments](https://docs.sciml.ai/DiffEqDocs/stable/basics/common_solver_opts/)
documentation for more details.
@@ -118,7 +118,7 @@ Arguments:
- `nbrown`: The number of Brownian processes
- `alg`: The algorithm used to solve the ODE. Defaults to `nothing`, i.e. the
default algorithm from DifferentialEquations.jl.
-- `sensealg`: The choice of differentiation algorthm used in the backpropogation.
+- `sensealg`: The choice of differentiation algorithm used in the backpropogation.
- `kwargs`: Additional arguments splatted to the ODE solver. See the
[Common Solver Arguments](https://docs.sciml.ai/DiffEqDocs/stable/basics/common_solver_opts/)
documentation for more details.
@@ -172,7 +172,7 @@ Arguments:
- `lags`: Defines the lagged values that should be utilized in the neural network.
- `alg`: The algorithm used to solve the ODE. Defaults to `nothing`, i.e. the
default algorithm from DifferentialEquations.jl.
-- `sensealg`: The choice of differentiation algorthm used in the backpropogation.
+- `sensealg`: The choice of differentiation algorithm used in the backpropogation.
Defaults to using reverse-mode automatic differentiation via Tracker.jl
- `kwargs`: Additional arguments splatted to the ODE solver. See the
[Common Solver Arguments](https://docs.sciml.ai/DiffEqDocs/stable/basics/common_solver_opts/)
@@ -219,11 +219,11 @@ Arguments:
derivative function. Should take an input of size `x` and produce the residual of
`f(dx,x,t)` for only the differential variables.
- `constraints_model`: A function `constraints_model(u,p,t)` for the fixed
-  constaints to impose on the algebraic equations.
+  constraints to impose on the algebraic equations.
- `tspan`: The timespan to be solved on.
- `alg`: The algorithm used to solve the ODE. Defaults to `nothing`, i.e. the
default algorithm from DifferentialEquations.jl.
-- `sensealg`: The choice of differentiation algorthm used in the backpropogation.
+- `sensealg`: The choice of differentiation algorithm used in the backpropogation.
Defaults to using reverse-mode automatic differentiation via Tracker.jl
- `kwargs`: Additional arguments splatted to the ODE solver. See the
[Common Solver Arguments](https://docs.sciml.ai/DiffEqDocs/stable/basics/common_solver_opts/)
@@ -286,7 +286,7 @@ constraint equations.
Arguments:
- `model`: A Flux.Chain or Lux.AbstractExplicitLayer neural network that defines the ̇`f(u,p,t)`
-- `constraints_model`: A function `constraints_model(u,p,t)` for the fixed constaints to
+- `constraints_model`: A function `constraints_model(u,p,t)` for the fixed constraints to
impose on the algebraic equations.
- `tspan`: The timespan to be solved on.
- `mass_matrix`: The mass matrix associated with the DAE
@@ -295,7 +295,7 @@ Arguments:
compatible with singular mass matrices. Consult the
[DAE solvers](https://docs.sciml.ai/DiffEqDocs/stable/solvers/dae_solve/) documentation
for more details.
-- `sensealg`: The choice of differentiation algorthm used in the backpropogation.
+- `sensealg`: The choice of differentiation algorithm used in the backpropogation.
Defaults to an adjoint method. See
the [Local Sensitivity Analysis](https://docs.sciml.ai/DiffEqDocs/stable/analysis/sensitivity/)
documentation for more details.
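The `NeuralODE` argument lists touched by this diff can be tied together with a short usage sketch. This is a hedged illustration, not code from the commit: it assumes the Lux-based DiffEqFlux API in which `Lux.setup` returns `(ps, st)` and the layer is called as `node(x, ps, st)`; the network `dudt` and the `saveat` value are illustrative choices.

```julia
using DiffEqFlux, Lux, OrdinaryDiffEq, ComponentArrays, Random

rng = Random.default_rng()

# A small network standing in for the right-hand side f(u, p, t)
dudt = Lux.Chain(Lux.Dense(2 => 16, tanh), Lux.Dense(16 => 2))

tspan = (0.0f0, 1.0f0)

# `alg` is given explicitly as Tsit5(); `sensealg` is left at its
# adjoint-method default, as described in the docstrings above.
node = NeuralODE(dudt, tspan, Tsit5(); saveat = 0.1f0)

ps, st = Lux.setup(rng, node)
sol, st = node(Float32[1.0, 0.0], ComponentArray(ps), st)
```

Additional keyword arguments (`saveat` here) are splatted through to the ODE solver, which is what the repeated "Additional arguments splatted to the ODE solver" bullet in each docstring refers to.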
16 changes: 8 additions & 8 deletions test/runtests.jl
Expand Up @@ -72,16 +72,16 @@ const is_CI = haskey(ENV, "CI")
if GROUP == "All" || GROUP == "Aqua"
@safetestset "Aqua Q/A" begin
using Aqua, DiffEqFlux, LinearAlgebra

# TODO: Enable persistent tasks once the downstream PRs are merged
Aqua.test_all(DiffEqFlux; ambiguities = false, piracies = false,
persistent_tasks = false)

Aqua.test_ambiguities(DiffEqFlux; recursive = false)

Aqua.find_persistent_tasks_deps(DiffEqFlux)
Aqua.test_ambiguities(DiffEqFlux, recursive = false)
Aqua.test_deps_compat(DiffEqFlux)
Aqua.test_piracies(DiffEqFlux; treat_as_own = [LinearAlgebra.Tridiagonal])
Aqua.test_project_extras(DiffEqFlux)
Aqua.test_stale_deps(DiffEqFlux)
Aqua.test_unbound_args(DiffEqFlux)
Aqua.test_undefined_exports(DiffEqFlux)
# FIXME: Remove Tridiagonal piracy after
# https://github.com/JuliaDiff/ChainRules.jl/issues/713 is merged!
Aqua.test_piracies(DiffEqFlux; treat_as_own = [LinearAlgebra.Tridiagonal])
end
end
end
