docs: update examples
sathvikbhagavan committed Mar 13, 2024
1 parent 66db48f commit b13f7c0
Showing 2 changed files with 6 additions and 10 deletions.
8 changes: 4 additions & 4 deletions docs/src/examples/linear_parabolic.md
@@ -24,9 +24,9 @@ w(t, 1) = \frac{e^{\lambda_1} cos(\frac{x}{a})-e^{\lambda_2}cos(\frac{x}{a})}{\l
with a physics-informed neural network.

```@example
-using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL, LineSearches
+using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimisers, OptimizationOptimJL, LineSearches
using Plots
-import ModelingToolkit: Interval, infimum, supremum
+using ModelingToolkit: Interval, infimum, supremum
@parameters t, x
@variables u(..), w(..)
@@ -71,7 +71,7 @@ input_ = length(domains)
n = 15
chain = [Lux.Chain(Dense(input_, n, Lux.σ), Dense(n, n, Lux.σ), Dense(n, 1)) for _ in 1:2]
-strategy = GridTraining(0.01)
+strategy = StochasticTraining(500)
discretization = PhysicsInformedNN(chain, strategy)
@named pdesystem = PDESystem(eqs, bcs, domains, [t, x], [u(t, x), w(t, x)])
@@ -92,7 +92,7 @@ callback = function (p, l)
return false
end
-res = Optimization.solve(prob, LBFGS(linesearch = BackTracking()); callback = callback, maxiters = 500)
+res = Optimization.solve(prob, OptimizationOptimisers.Adam(1e-2); callback = callback, maxiters = 10000)
phi = discretization.phi
8 changes: 2 additions & 6 deletions docs/src/examples/nonlinear_hyperbolic.md
@@ -33,7 +33,7 @@ where k is a root of the algebraic (transcendental) equation f(k) = g(k), j0 and
We solve this with a physics-informed neural network:

```@example
-using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL, Roots
+using NeuralPDE, Lux, ModelingToolkit, Optimization, OptimizationOptimJL, Roots, LineSearches
using SpecialFunctions
using Plots
import ModelingToolkit: Interval, infimum, supremum
@@ -99,7 +99,7 @@ callback = function (p, l)
return false
end
-res = Optimization.solve(prob, BFGS(); callback = callback, maxiters = 1000)
+res = Optimization.solve(prob, BFGS(linesearch = BackTracking()); callback = callback, maxiters = 200)
phi = discretization.phi
@@ -117,9 +117,5 @@ for i in 1:2
p2 = plot(ts, xs, u_predict[i], linetype = :contourf, title = "predict")
p3 = plot(ts, xs, diff_u[i], linetype = :contourf, title = "error")
plot(p1, p2, p3)
-savefig("nonlinear_hyperbolic_sol_u$i")
end
```

-![nonlinear_hyperbolic_sol_u1](https://user-images.githubusercontent.com/26853713/126457614-d19e7a4d-f9e3-4e78-b8ae-1e58114a744e.png)
-![nonlinear_hyperbolic_sol_u2](https://user-images.githubusercontent.com/26853713/126457617-ee26c587-a97f-4a2e-b6b7-b326b1f117af.png)
