diff --git a/.github/workflows/Documentation.yml b/.github/workflows/Documentation.yml
index d2749c4..5ba9cc3 100644
--- a/.github/workflows/Documentation.yml
+++ b/.github/workflows/Documentation.yml
@@ -16,6 +16,7 @@ on:
 
 # Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
 permissions:
+  actions: write
   contents: write
   pages: write
   id-token: write
diff --git a/docs/Project.toml b/docs/Project.toml
index 8567a5c..61d676d 100644
--- a/docs/Project.toml
+++ b/docs/Project.toml
@@ -3,3 +3,4 @@ Boltz = "4544d5e4-abc5-4dea-817f-29e4c205d9c8"
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
 DocumenterVitepress = "4710194d-e776-4893-9690-8d956a29c365"
 Lux = "b2108857-7c20-44ae-9111-449ecde12c47"
+Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
diff --git a/docs/src/index.md b/docs/src/index.md
index 41ff71c..054d803 100644
--- a/docs/src/index.md
+++ b/docs/src/index.md
@@ -44,3 +44,51 @@ features:
     link: https://sciml.ai/
 ---
 ```
+
+## How to Install Boltz.jl?
+
+It's easy to install Boltz.jl. Since Boltz.jl is registered in the Julia General registry,
+you can simply run the following command in the Julia REPL:
+
+```julia
+julia> using Pkg
+julia> Pkg.add("Boltz")
+```
+
+If you want to use the latest unreleased version of Boltz.jl, you can run the following
+command (in most cases the released version will be the same as the version on GitHub):
+
+```julia
+julia> using Pkg
+julia> Pkg.add(url="https://github.com/LuxDL/Boltz.jl")
+```
+
+## Want GPU Support?
+
+Install the following package(s):
+
+:::code-group
+
+```julia [NVIDIA GPUs]
+using Pkg
+Pkg.add("LuxCUDA")
+# or
+Pkg.add(["CUDA", "cuDNN"])
+```
+
+```julia [AMD ROCm GPUs]
+using Pkg
+Pkg.add("AMDGPU")
+```
+
+```julia [Metal M-Series GPUs]
+using Pkg
+Pkg.add("Metal")
+```
+
+```julia [Intel GPUs]
+using Pkg
+Pkg.add("oneAPI")
+```
+
+:::
diff --git a/docs/src/tutorials/getting_started.md b/docs/src/tutorials/getting_started.md
index e69de29..8aa3b05 100644
--- a/docs/src/tutorials/getting_started.md
+++ b/docs/src/tutorials/getting_started.md
@@ -0,0 +1,56 @@
+# Getting Started
+
+!!! tip "Prerequisites"
+
+    Here we assume that you are familiar with [`Lux.jl`](https://lux.csail.mit.edu/stable/).
+    If not, please take a look at the
+    [Lux.jl tutorials](https://lux.csail.mit.edu/stable/tutorials/).
+
+`Boltz.jl` is just like `Lux.jl` but comes with more "batteries included". Let's start by
+defining an MLP model.
+
+```@example getting_started
+using Lux, Boltz, Random
+```
+
+## Multi-Layer Perceptron
+
+If we were to do this in `Lux.jl`, we would write the following:
+
+```@example getting_started
+model = Chain(
+    Dense(784, 256, relu),
+    Dense(256, 10)
+)
+```
+
+But in `Boltz.jl` we can do this:
+
+```@example getting_started
+model = Layers.MLP(784, (256, 10), relu)
+```
+
+`Layers.MLP` is just a convenience wrapper around `Lux.Chain` that constructs a
+multi-layer perceptron with the given layer sizes and activation function.
+
+## How about VGG?
+
+!!! warning "Returned Values"
+
+    The functions in the `Vision` module return a 3-tuple of `(model, ps, st)`, where
+    `ps` and `st` are the parameters and states of the model, respectively.
+
+Let's take a look at the `Vision` module.
+We can construct a VGG model with the following code:
+
+```@example getting_started
+model, _, _ = Vision.VGG(13)
+model
+```
+
+We can also load pretrained ImageNet weights using:
+
+```@example getting_started
+model, _, _ = Vision.VGG(13; pretrained=true)
+model
+```
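+
+The tutorial so far only constructs models. As a minimal sketch of how to actually run
+one (assuming the standard `Lux.jl` call convention: `Lux.setup` to initialize, then
+`model(x, ps, st)` for the forward pass; the input size matches the MLP defined above):
+
+```@example getting_started
+model = Layers.MLP(784, (256, 10), relu)
+ps, st = Lux.setup(Random.default_rng(), model)  # initialize parameters and states
+x = randn(Float32, 784, 1)                       # one dummy 784-dimensional input
+y, _ = model(x, ps, st)                          # forward pass returns (output, updated states)
+size(y)
+```
+
+The output has 10 rows (the final layer width) and one column per input sample.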