
docs: update broken links and fail on linkcheck
avik-pal committed Jul 28, 2024
1 parent 2c09692 commit 8eb392a
Showing 9 changed files with 17 additions and 16 deletions.
docs/make.jl — 1 change: 0 additions & 1 deletion

@@ -90,7 +90,6 @@ makedocs(; sitename="Lux.jl Docs",
     repo="github.com/LuxDL/Lux.jl", devbranch="main", devurl="dev",
     deploy_url="https://lux.csail.mit.edu", deploy_decision),
     draft=false,
-    warnonly=:linkcheck, # Lately it has been failing quite a lot but those links are actually fine
     pages)

 deploydocs(; repo="github.com/LuxDL/Lux.jl.git",
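With `warnonly=:linkcheck` removed, any link that fails Documenter.jl's linkcheck now fails the docs build instead of merely warning. A minimal sketch of the relevant Documenter options (the `pages` value here is a placeholder; the repo's real call passes its full page tree):

```julia
using Documenter

makedocs(;
    sitename="Lux.jl Docs",
    linkcheck=true,                  # verify external links during the build
    # no `warnonly=:linkcheck`, so linkcheck failures are now build errors
    pages=["Home" => "index.md"],    # placeholder
)
```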
docs/src/manual/autodiff.md — 2 changes: 1 addition & 1 deletion

@@ -14,7 +14,7 @@ Lux. Additionally, we provide some convenience functions for working with AD.
 | [`ForwardDiff.jl`](https://github.com/JuliaDiff/ForwardDiff.jl) | Forward | ✔️ | ✔️ | ✔️ | Tier I |
 | [`ReverseDiff.jl`](https://github.com/JuliaDiff/ReverseDiff.jl) | Reverse | ✔️ ||| Tier II |
 | [`Tracker.jl`](https://github.com/FluxML/Tracker.jl) | Reverse | ✔️ | ✔️ || Tier II |
-| [`Tapir.jl`](https://github.com/withbayes/Tapir.jl) | Reverse |[^q] ||| Tier III |
+| [`Tapir.jl`](https://github.com/compintell/Tapir.jl) | Reverse |[^q] ||| Tier III |
 | [`Diffractor.jl`](https://github.com/JuliaDiff/Diffractor.jl) | Forward |[^q] |[^q] |[^q] | Tier III |

 [^e]: Currently Enzyme outperforms other AD packages in terms of CPU performance. However,
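To make the table concrete, here is a minimal sketch of a reverse-mode gradient through a Lux layer using Zygote (a Tier I backend in the full table; the model and data below are illustrative):

```julia
using Lux, Random, Zygote

rng = Random.default_rng()
model = Dense(2 => 2)
ps, st = Lux.setup(rng, model)   # parameters and state
x = randn(rng, Float32, 2, 4)

# Gradient of a scalar loss with respect to the nested parameter NamedTuple
gs = only(Zygote.gradient(p -> sum(abs2, first(model(x, p, st))), ps))
```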
docs/src/manual/nested_autodiff.md — 6 changes: 3 additions & 3 deletions

@@ -192,9 +192,9 @@ nothing; # hide

 Hutchinson Trace Estimation often shows up in machine learning literature to provide a fast
 estimate of the trace of a Jacobian Matrix. This is based off of
-[Hutchinson 1990](https://www.researchgate.net/publication/243668757_A_Stochastic_Estimator_of_the_Trace_of_the_Influence_Matrix_for_Laplacian_Smoothing_Splines) which
-computes the estimated trace of a matrix ``A \in \mathbb{R}^{D \times D}`` using random
-vectors ``v \in \mathbb{R}^{D}`` s.t. ``\mathbb{E}\left[v v^T\right] = I``.
+[Hutchinson 1990](https://www.nowozin.net/sebastian/blog/thoughts-on-trace-estimation-in-deep-learning.html)
+which computes the estimated trace of a matrix ``A \in \mathbb{R}^{D \times D}`` using
+random vectors ``v \in \mathbb{R}^{D}`` s.t. ``\mathbb{E}\left[v v^T\right] = I``.

 ```math
 \text{Tr}(A) = \mathbb{E}\left[v^T A v\right] = \frac{1}{V} \sum_{i = 1}^V v_i^T A v_i
 ```
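As a concrete illustration of the estimator above, a plain-Julia sketch (independent of Lux) using Rademacher probe vectors, which satisfy ``\mathbb{E}\left[v v^T\right] = I``:

```julia
using LinearAlgebra, Random

function hutchinson_trace(A::AbstractMatrix, V::Int; rng=Random.default_rng())
    D = size(A, 1)
    est = zero(float(eltype(A)))
    for _ in 1:V
        v = rand(rng, (-1, 1), D)   # Rademacher probe: entries ±1
        est += dot(v, A * v)        # one sample of vᵀ A v
    end
    return est / V                  # Monte Carlo average over V probes
end

A = randn(10, 10)
hutchinson_trace(A, 100_000), tr(A)   # the two values should be close
```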
docs/src/manual/performance_pitfalls.md — 2 changes: 1 addition & 1 deletion

@@ -67,4 +67,4 @@ GPUArraysCore.allowscalar(false)
 `Lux.jl` is integrated with `DispatchDoctor.jl` to catch type instabilities. You can easily
 enable it by setting the `instability_check` preference. This will help you catch type
 instabilities in your code. For more information on how to set preferences, check out
-[`set_dispatch_doctor_preferences`](@ref).
+[`Lux.set_dispatch_doctor_preferences!`](@ref).
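For reference, a sketch of turning the check on; the mode strings ("disable", "warn", "error") are DispatchDoctor.jl's documented modes, but the exact signature of `Lux.set_dispatch_doctor_preferences!` should be confirmed against the linked docstring:

```julia
using Lux

# Assumed usage: pass a DispatchDoctor mode string
Lux.set_dispatch_doctor_preferences!("error")

# Preferences are read at package load time, so restart Julia afterwards.
```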
docs/src/manual/preferences.md — 4 changes: 2 additions & 2 deletions

@@ -50,8 +50,8 @@ By default, both of these preferences are set to `false`.
 ## [Dispatch Doctor](@id dispatch-doctor-preference)

 1. `instability_check` - Preference controlling the dispatch doctor. See the documentation
-   on [`set_dispatch_doctor_preferences!`](@ref) for more details. The preferences need to
-   be set for `LuxCore` and `LuxLib` packages. Both of them default to `disable`.
+   on [`Lux.set_dispatch_doctor_preferences!`](@ref) for more details. The preferences need
+   to be set for `LuxCore` and `LuxLib` packages. Both of them default to `disable`.
    - Setting the `LuxCore` preference sets the check at the level of `LuxCore.apply`. This
      essentially activates the dispatch doctor for all Lux layers.
    - Setting the `LuxLib` preference sets the check at the level of functional layer of
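Because `instability_check` is an ordinary compile-time preference, it can also be set directly with Preferences.jl; a sketch assuming `LuxCore` and `LuxLib` are available in the active environment:

```julia
using Preferences, LuxCore, LuxLib

# Turn the dispatch doctor on for all Lux layers and the functional layers
set_preferences!(LuxCore, "instability_check" => "error")
set_preferences!(LuxLib, "instability_check" => "error")

# Preferences take effect the next time the packages are loaded, so restart Julia.
```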
examples/Basics/main.jl — 2 changes: 1 addition & 1 deletion

@@ -3,7 +3,7 @@
 # This is a quick intro to [Lux](https://github.com/LuxDL/Lux.jl) loosely based on:
 #
 # 1. [PyTorch's tutorial](https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html).
-# 2. [Flux's tutorial](https://fluxml.ai/Flux.jl/stable/tutorials/2020-09-15-deep-learning-flux/).
+# 2. Flux's tutorial (the link for which has now been lost to the abyss).
 # 3. [Jax's tutorial](https://jax.readthedocs.io/en/latest/jax-101/index.html).
 #
 # It introduces basic Julia programming, as well as `Zygote`, a source-to-source automatic
examples/BayesianNN/main.jl — 10 changes: 6 additions & 4 deletions

@@ -1,11 +1,13 @@
 # # Bayesian Neural Network

 # We borrow this tutorial from the
-# [official Turing Docs](https://turinglang.org/stable/tutorials/03-bayesian-neural-network/). We
-# will show how the explicit parameterization of Lux enables first-class composability with
-# packages which expect flattened out parameter vectors.
+# [official Turing Docs](https://turinglang.org/docs/tutorials/03-bayesian-neural-network/index.html).
+# We will show how the explicit parameterization of Lux enables first-class composability
+# with packages which expect flattened out parameter vectors.

-# We will use [Turing.jl](https://turinglang.org/stable/) with [Lux.jl](https://lux.csail.mit.edu/)
+# Note: The tutorial in the official Turing docs is now using Lux instead of Flux.
+
+# We will use [Turing.jl](https://turinglang.org/) with [Lux.jl](https://lux.csail.mit.edu/)
 # to implement a classification algorithm. Let's start by importing the relevant
 # libraries.
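The "flattened out parameter vectors" point can be sketched with ComponentArrays.jl, one common bridge between Lux's structured parameters and the flat vectors samplers work with (illustrative, not the tutorial's exact code):

```julia
using Lux, Random, ComponentArrays

rng = Random.default_rng()
model = Chain(Dense(2 => 3, tanh), Dense(3 => 1))
ps, st = Lux.setup(rng, model)

ps_flat = ComponentArray(ps)   # structured parameters viewed as a flat vector
θ = Vector(ps_flat)            # plain Vector, e.g. for an MCMC sampler

# Rebuild structured parameters from a flat vector and run the model
ps_rec = ComponentArray(θ, getaxes(ps_flat))
x = randn(rng, Float32, 2, 8)
y, _ = model(x, ps_rec, st)
```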
examples/SymbolicOptimalControl/main.jl — 4 changes: 2 additions & 2 deletions

@@ -2,8 +2,8 @@

 # This tutorial is based on the [SciMLSensitivity.jl tutorial](https://docs.sciml.ai/SciMLSensitivity/stable/examples/optimal_control/optimal_control/).
 # Instead of using a classical NN architecture, here we will combine the NN with a symbolic
-# expression from [DynamicExpressions.jl](https://symbolicml.org/DynamicExpressions.jl) (the
-# symbolic engine behind [SymbolicRegression.jl](https://astroautomata.com/SymbolicRegression.jl)
+# expression from [DynamicExpressions.jl](https://symbolicml.org/DynamicExpressions.jl/) (the
+# symbolic engine behind [SymbolicRegression.jl](https://astroautomata.com/SymbolicRegression.jl/)
 # and [PySR](https://github.com/MilesCranmer/PySR/)).

 # Here we will solve a classic optimal control problem with a universal differential
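For a flavor of what DynamicExpressions.jl provides, a small sketch in the style of its README: build an expression tree over a chosen operator set and evaluate it over a batch of samples.

```julia
using DynamicExpressions

operators = OperatorEnum(; binary_operators=[+, -, *], unary_operators=[cos])
x1, x2 = Node(; feature=1), Node(; feature=2)

expression = x1 * cos(x2 - 3.2)   # an expression tree, not a number

X = rand(Float64, 2, 100)         # 2 features × 100 samples
y = expression(X, operators)      # evaluate the tree over the batch
```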
src/helpers/losses.jl — 2 changes: 1 addition & 1 deletion

@@ -595,7 +595,7 @@ true
 ## Special Note
 This function takes any of the
-[`LossFunctions.jl`](https://juliaml.github.io/LossFunctions.jl/stable) public functions
+[`LossFunctions.jl`](https://juliaml.github.io/LossFunctions.jl/stable/) public functions
 into the Lux Losses API with efficient aggregation.
 """
 @concrete struct GenericLossFunction <: AbstractLossFunction
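A hypothetical usage sketch of the behavior described in this docstring; the `agg` keyword for choosing the aggregation is an assumption, so check the full docstring for the actual constructor:

```julia
using Lux, LossFunctions, Statistics

# Wrap a LossFunctions.jl loss into the Lux losses API
loss = GenericLossFunction(LossFunctions.HuberLoss(); agg=mean)  # `agg` assumed

ŷ, y = randn(Float32, 10), randn(Float32, 10)
loss(ŷ, y)   # aggregated Huber loss over the batch
```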

1 comment on commit 8eb392a

@github-actions (Contributor)
Benchmark Results

| Benchmark suite | Current: 8eb392a | Previous: f3159bd | Ratio |
| :--- | ---: | ---: | ---: |
| Dense(2 => 2)/cpu/reverse/ReverseDiff (compiled)/(2, 128) | 3679.375 ns | 4340.625 ns | 0.85 |
| Dense(2 => 2)/cpu/reverse/Zygote/(2, 128) | 7338 ns | 8991.833333333334 ns | 0.82 |
| Dense(2 => 2)/cpu/reverse/Tracker/(2, 128) | 20558 ns | 25788 ns | 0.80 |
| Dense(2 => 2)/cpu/reverse/ReverseDiff/(2, 128) | 9722.2 ns | 12129.333333333334 ns | 0.80 |
| Dense(2 => 2)/cpu/reverse/Flux/(2, 128) | 9037 ns | 10748.1 ns | 0.84 |
| Dense(2 => 2)/cpu/reverse/SimpleChains/(2, 128) | 4463.9375 ns | 4954.25 ns | 0.90 |
| Dense(2 => 2)/cpu/reverse/Enzyme/(2, 128) | 4666.25 ns | 5425.125 ns | 0.86 |
| Dense(2 => 2)/cpu/forward/NamedTuple/(2, 128) | 1114.3974358974358 ns | 1359.1832061068703 ns | 0.82 |
| Dense(2 => 2)/cpu/forward/ComponentArray/(2, 128) | 1184.1407407407407 ns | 1401.2664233576643 ns | 0.85 |
| Dense(2 => 2)/cpu/forward/Flux/(2, 128) | 1810.445652173913 ns | 2176.5833333333335 ns | 0.83 |
| Dense(2 => 2)/cpu/forward/SimpleChains/(2, 128) | 180.16946778711485 ns | 179.46778711484595 ns | 1.00 |
| Dense(20 => 20)/cpu/reverse/ReverseDiff (compiled)/(20, 128) | 17292 ns | 21761 ns | 0.79 |
| Dense(20 => 20)/cpu/reverse/Zygote/(20, 128) | 16852 ns | 20408 ns | 0.83 |
| Dense(20 => 20)/cpu/reverse/Tracker/(20, 128) | 36358 ns | 42780 ns | 0.85 |
| Dense(20 => 20)/cpu/reverse/ReverseDiff/(20, 128) | 28133 ns | 32301 ns | 0.87 |
| Dense(20 => 20)/cpu/reverse/Flux/(20, 128) | 20148 ns | 22652 ns | 0.89 |
| Dense(20 => 20)/cpu/reverse/SimpleChains/(20, 128) | 17052 ns | 18654 ns | 0.91 |
| Dense(20 => 20)/cpu/reverse/Enzyme/(20, 128) | 25548 ns | 28283 ns | 0.90 |
| Dense(20 => 20)/cpu/forward/NamedTuple/(20, 128) | 3795.875 ns | 4363.8125 ns | 0.87 |
| Dense(20 => 20)/cpu/forward/ComponentArray/(20, 128) | 3908.625 ns | 4459.5 ns | 0.88 |
| Dense(20 => 20)/cpu/forward/Flux/(20, 128) | 4793.285714285715 ns | 5581.928571428572 ns | 0.86 |
| Dense(20 => 20)/cpu/forward/SimpleChains/(20, 128) | 1655.1 ns | 1850.5 ns | 0.89 |
| Conv((3, 3), 3 => 3)/cpu/reverse/ReverseDiff (compiled)/(64, 64, 3, 128) | 38607620.5 ns | 45154820.5 ns | 0.86 |
| Conv((3, 3), 3 => 3)/cpu/reverse/Zygote/(64, 64, 3, 128) | 58062464 ns | 62378075.5 ns | 0.93 |
| Conv((3, 3), 3 => 3)/cpu/reverse/Tracker/(64, 64, 3, 128) | 66821661 ns | 82237576.5 ns | 0.81 |
| Conv((3, 3), 3 => 3)/cpu/reverse/ReverseDiff/(64, 64, 3, 128) | 79893843 ns | 92405677.5 ns | 0.86 |
| Conv((3, 3), 3 => 3)/cpu/reverse/Flux/(64, 64, 3, 128) | 72220779 ns | 78467695 ns | 0.92 |
| Conv((3, 3), 3 => 3)/cpu/reverse/SimpleChains/(64, 64, 3, 128) | 11898610 ns | 11963195.5 ns | 0.99 |
| Conv((3, 3), 3 => 3)/cpu/reverse/Enzyme/(64, 64, 3, 128) | 79046441.5 ns | 91650799 ns | 0.86 |
| Conv((3, 3), 3 => 3)/cpu/forward/NamedTuple/(64, 64, 3, 128) | 7662156.5 ns | 7710446.5 ns | 0.99 |
| Conv((3, 3), 3 => 3)/cpu/forward/ComponentArray/(64, 64, 3, 128) | 7540729 ns | 7584631 ns | 0.99 |
| Conv((3, 3), 3 => 3)/cpu/forward/Flux/(64, 64, 3, 128) | 10363028 ns | 12087804.5 ns | 0.86 |
| Conv((3, 3), 3 => 3)/cpu/forward/SimpleChains/(64, 64, 3, 128) | 6374425 ns | 6389570.5 ns | 1.00 |
| vgg16/cpu/reverse/Zygote/(32, 32, 3, 16) | 688891366 ns | 691470864.5 ns | 1.00 |
| vgg16/cpu/reverse/Zygote/(32, 32, 3, 64) | 2580089696 ns | 2612057812 ns | 0.99 |
| vgg16/cpu/reverse/Zygote/(32, 32, 3, 2) | 140409324.5 ns | 151710246 ns | 0.93 |
| vgg16/cpu/reverse/Tracker/(32, 32, 3, 16) | 767316784 ns | 847276341 ns | 0.91 |
| vgg16/cpu/reverse/Tracker/(32, 32, 3, 64) | 2911634396 ns | 2998224779 ns | 0.97 |
| vgg16/cpu/reverse/Tracker/(32, 32, 3, 2) | 220564407 ns | 231162547 ns | 0.95 |
| vgg16/cpu/reverse/Flux/(32, 32, 3, 16) | 660516925.5 ns | 712811787.5 ns | 0.93 |
| vgg16/cpu/reverse/Flux/(32, 32, 3, 64) | 2628665253 ns | 2949177680 ns | 0.89 |
| vgg16/cpu/reverse/Flux/(32, 32, 3, 2) | 134431743 ns | 139771741 ns | 0.96 |
| vgg16/cpu/forward/NamedTuple/(32, 32, 3, 16) | 172684244 ns | 175403229 ns | 0.98 |
| vgg16/cpu/forward/NamedTuple/(32, 32, 3, 64) | 660049856.5 ns | 643986392.5 ns | 1.02 |
| vgg16/cpu/forward/NamedTuple/(32, 32, 3, 2) | 34254472 ns | 45299778 ns | 0.76 |
| vgg16/cpu/forward/ComponentArray/(32, 32, 3, 16) | 163069879 ns | 165572824 ns | 0.98 |
| vgg16/cpu/forward/ComponentArray/(32, 32, 3, 64) | 639471793 ns | 648012066 ns | 0.99 |
| vgg16/cpu/forward/ComponentArray/(32, 32, 3, 2) | 29383379 ns | 35881786 ns | 0.82 |
| vgg16/cpu/forward/Flux/(32, 32, 3, 16) | 201738097 ns | 228284686 ns | 0.88 |
| vgg16/cpu/forward/Flux/(32, 32, 3, 64) | 722144211.5 ns | 824337722 ns | 0.88 |
| vgg16/cpu/forward/Flux/(32, 32, 3, 2) | 35370961 ns | 36049946 ns | 0.98 |
| Conv((3, 3), 64 => 64)/cpu/reverse/ReverseDiff (compiled)/(64, 64, 64, 128) | 1247821765 ns | 1305860557.5 ns | 0.96 |
| Conv((3, 3), 64 => 64)/cpu/reverse/Zygote/(64, 64, 64, 128) | 1872950778 ns | 1872497449 ns | 1.00 |
| Conv((3, 3), 64 => 64)/cpu/reverse/Tracker/(64, 64, 64, 128) | 2218972759.5 ns | 2342556041 ns | 0.95 |
| Conv((3, 3), 64 => 64)/cpu/reverse/ReverseDiff/(64, 64, 64, 128) | 2061757737.5 ns | 2488038610 ns | 0.83 |
| Conv((3, 3), 64 => 64)/cpu/reverse/Flux/(64, 64, 64, 128) | 1829486370 ns | 1951631856 ns | 0.94 |
| Conv((3, 3), 64 => 64)/cpu/reverse/Enzyme/(64, 64, 64, 128) | 2001768856 ns | 2087636119 ns | 0.96 |
| Conv((3, 3), 64 => 64)/cpu/forward/NamedTuple/(64, 64, 64, 128) | 329179369 ns | 338033674 ns | 0.97 |
| Conv((3, 3), 64 => 64)/cpu/forward/ComponentArray/(64, 64, 64, 128) | 328423712 ns | 332696625 ns | 0.99 |
| Conv((3, 3), 64 => 64)/cpu/forward/Flux/(64, 64, 64, 128) | 345125201 ns | 359960422 ns | 0.96 |
| Conv((3, 3), 1 => 1)/cpu/reverse/ReverseDiff (compiled)/(64, 64, 1, 128) | 11790993.5 ns | 11686611 ns | 1.01 |
| Conv((3, 3), 1 => 1)/cpu/reverse/Zygote/(64, 64, 1, 128) | 18100299 ns | 17986098.5 ns | 1.01 |
| Conv((3, 3), 1 => 1)/cpu/reverse/Tracker/(64, 64, 1, 128) | 18996442 ns | 19052140 ns | 1.00 |
| Conv((3, 3), 1 => 1)/cpu/reverse/ReverseDiff/(64, 64, 1, 128) | 23733608 ns | 23778464 ns | 1.00 |
| Conv((3, 3), 1 => 1)/cpu/reverse/Flux/(64, 64, 1, 128) | 17885808.5 ns | 17775034.5 ns | 1.01 |
| Conv((3, 3), 1 => 1)/cpu/reverse/SimpleChains/(64, 64, 1, 128) | 1164034 ns | 1157462 ns | 1.01 |
| Conv((3, 3), 1 => 1)/cpu/reverse/Enzyme/(64, 64, 1, 128) | 22919399 ns | 22924969.5 ns | 1.00 |
| Conv((3, 3), 1 => 1)/cpu/forward/NamedTuple/(64, 64, 1, 128) | 2345181 ns | 2406465 ns | 0.97 |
| Conv((3, 3), 1 => 1)/cpu/forward/ComponentArray/(64, 64, 1, 128) | 2211761 ns | 2242648.5 ns | 0.99 |
| Conv((3, 3), 1 => 1)/cpu/forward/Flux/(64, 64, 1, 128) | 2058975 ns | 2065648 ns | 1.00 |
| Conv((3, 3), 1 => 1)/cpu/forward/SimpleChains/(64, 64, 1, 128) | 195266 ns | 197910 ns | 0.99 |
| Dense(200 => 200)/cpu/reverse/ReverseDiff (compiled)/(200, 128) | 291195 ns | 290874 ns | 1.00 |
| Dense(200 => 200)/cpu/reverse/Zygote/(200, 128) | 263513 ns | 264424 ns | 1.00 |
| Dense(200 => 200)/cpu/reverse/Tracker/(200, 128) | 358852 ns | 364661 ns | 0.98 |
| Dense(200 => 200)/cpu/reverse/ReverseDiff/(200, 128) | 404647 ns | 405888 ns | 1.00 |
| Dense(200 => 200)/cpu/reverse/Flux/(200, 128) | 271508 ns | 272550 ns | 1.00 |
| Dense(200 => 200)/cpu/reverse/SimpleChains/(200, 128) | 405448 ns | 408042 ns | 0.99 |
| Dense(200 => 200)/cpu/reverse/Enzyme/(200, 128) | 394609 ns | 394657 ns | 1.00 |
| Dense(200 => 200)/cpu/forward/NamedTuple/(200, 128) | 80831 ns | 80941 ns | 1.00 |
| Dense(200 => 200)/cpu/forward/ComponentArray/(200, 128) | 81954 ns | 81403 ns | 1.01 |
| Dense(200 => 200)/cpu/forward/Flux/(200, 128) | 86211 ns | 86461 ns | 1.00 |
| Dense(200 => 200)/cpu/forward/SimpleChains/(200, 128) | 104485 ns | 104525 ns | 1.00 |
| Conv((3, 3), 16 => 16)/cpu/reverse/ReverseDiff (compiled)/(64, 64, 16, 128) | 187479900 ns | 206243721 ns | 0.91 |
| Conv((3, 3), 16 => 16)/cpu/reverse/Zygote/(64, 64, 16, 128) | 328956491 ns | 327524051.5 ns | 1.00 |
| Conv((3, 3), 16 => 16)/cpu/reverse/Tracker/(64, 64, 16, 128) | 419428117 ns | 410738774 ns | 1.02 |
| Conv((3, 3), 16 => 16)/cpu/reverse/ReverseDiff/(64, 64, 16, 128) | 479472814.5 ns | 439152524.5 ns | 1.09 |
| Conv((3, 3), 16 => 16)/cpu/reverse/Flux/(64, 64, 16, 128) | 380882756 ns | 382717413 ns | 1.00 |
| Conv((3, 3), 16 => 16)/cpu/reverse/SimpleChains/(64, 64, 16, 128) | 306226546 ns | 325870578.5 ns | 0.94 |
| Conv((3, 3), 16 => 16)/cpu/reverse/Enzyme/(64, 64, 16, 128) | 468205037 ns | 455578258 ns | 1.03 |
| Conv((3, 3), 16 => 16)/cpu/forward/NamedTuple/(64, 64, 16, 128) | 47148555 ns | 47303828.5 ns | 1.00 |
| Conv((3, 3), 16 => 16)/cpu/forward/ComponentArray/(64, 64, 16, 128) | 46601077.5 ns | 46793042.5 ns | 1.00 |
| Conv((3, 3), 16 => 16)/cpu/forward/Flux/(64, 64, 16, 128) | 59226820 ns | 57476949 ns | 1.03 |
| Conv((3, 3), 16 => 16)/cpu/forward/SimpleChains/(64, 64, 16, 128) | 28155899 ns | 27955898 ns | 1.01 |
| Dense(2000 => 2000)/cpu/reverse/ReverseDiff (compiled)/(2000, 128) | 18845148.5 ns | 18906265 ns | 1.00 |
| Dense(2000 => 2000)/cpu/reverse/Zygote/(2000, 128) | 19457185 ns | 19556819 ns | 0.99 |
| Dense(2000 => 2000)/cpu/reverse/Tracker/(2000, 128) | 23071936.5 ns | 23268022 ns | 0.99 |
| Dense(2000 => 2000)/cpu/reverse/ReverseDiff/(2000, 128) | 23991051.5 ns | 24118160 ns | 0.99 |
| Dense(2000 => 2000)/cpu/reverse/Flux/(2000, 128) | 19527991.5 ns | 19557901 ns | 1.00 |
| Dense(2000 => 2000)/cpu/reverse/Enzyme/(2000, 128) | 20815286.5 ns | 20840171 ns | 1.00 |
| Dense(2000 => 2000)/cpu/forward/NamedTuple/(2000, 128) | 6510728 ns | 6539910.5 ns | 1.00 |
| Dense(2000 => 2000)/cpu/forward/ComponentArray/(2000, 128) | 6483627 ns | 6488719 ns | 1.00 |
| Dense(2000 => 2000)/cpu/forward/Flux/(2000, 128) | 6472777 ns | 6511897 ns | 0.99 |

This comment was automatically generated by a workflow using github-action-benchmark.
