Commit

docs: add installation details

avik-pal committed Aug 24, 2024
1 parent 187debf commit c63942e
Showing 4 changed files with 106 additions and 0 deletions.
1 change: 1 addition & 0 deletions .github/workflows/Documentation.yml
@@ -16,6 +16,7 @@ on:

# Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
permissions:
  actions: write
  contents: write
  pages: write
  id-token: write
1 change: 1 addition & 0 deletions docs/Project.toml
@@ -3,3 +3,4 @@ Boltz = "4544d5e4-abc5-4dea-817f-29e4c205d9c8"
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
DocumenterVitepress = "4710194d-e776-4893-9690-8d956a29c365"
Lux = "b2108857-7c20-44ae-9111-449ecde12c47"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
48 changes: 48 additions & 0 deletions docs/src/index.md
@@ -44,3 +44,51 @@ features:
link: https://sciml.ai/
---
```

## How to Install Boltz.jl?

Boltz.jl is registered in the Julia General registry, so installation is easy. Simply
run the following command in the Julia REPL:

```julia
julia> using Pkg
julia> Pkg.add("Boltz")
```

To use the latest unreleased version of Boltz.jl, run the following command (in most
cases the released version will be the same as the version on GitHub):

```julia
julia> using Pkg
julia> Pkg.add(url="https://github.com/LuxDL/Boltz.jl")
```

## Want GPU Support?

Install the following package(s):

:::code-group

```julia [NVIDIA GPUs]
using Pkg
Pkg.add("LuxCUDA")
# or
Pkg.add(["CUDA", "cuDNN"])
```

```julia [AMD ROCm GPUs]
using Pkg
Pkg.add("AMDGPU")
```

```julia [Metal M-Series GPUs]
using Pkg
Pkg.add("Metal")
```

```julia [Intel GPUs]
using Pkg
Pkg.add("oneAPI")
```

:::
56 changes: 56 additions & 0 deletions docs/src/tutorials/getting_started.md
@@ -0,0 +1,56 @@
# Getting Started

!!! tip "Prerequisites"

    Here we assume that you are familiar with [`Lux.jl`](https://lux.csail.mit.edu/stable/).
    If not, please take a look at the
    [Lux.jl tutorials](https://lux.csail.mit.edu/stable/tutorials/).

`Boltz.jl` is just like `Lux.jl` but comes with more "batteries included". Let's start by
defining an MLP model.

```@example getting_started
using Lux, Boltz, Random
```

## Multi-Layer Perceptron

If we were to do this in `Lux.jl` we would write the following:

```@example getting_started
model = Chain(
    Dense(784, 256, relu),
    Dense(256, 10)
)
```

But in `Boltz.jl` we can do this:

```@example getting_started
model = Layers.MLP(784, (256, 10), relu)
```

The `MLP` function is a convenience wrapper around `Lux.Chain` that constructs a
multi-layer perceptron with the given layer widths and activation function.
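
As a quick sanity check, the constructed MLP behaves like any other Lux model: we set up
its parameters and states, then call it on an input. This is a minimal sketch; the
batch-of-one input and the use of `Random.default_rng()` are illustrative assumptions,
not part of the original text.

```julia
using Boltz, Lux, Random

# The same MLP as above: 784 → 256 → 10 with relu on the hidden layer
model = Layers.MLP(784, (256, 10), relu)

# Initialize parameters and states, then run a dummy batch with one sample
rng = Random.default_rng()
ps, st = Lux.setup(rng, model)
x = rand(rng, Float32, 784, 1)
y, _ = model(x, ps, st)
size(y)  # (10, 1)
```

Note that, following the Lux convention, the model itself is stateless: parameters `ps`
and states `st` live outside the model and are passed explicitly on each call.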

## How about VGG?

!!! warning "Returned Values"

    Functions in the `Vision` module return a 3-tuple of `(model, ps, st)`, where
    `ps` and `st` are the parameters and states of the model, respectively.

Let's take a look at the `Vision` module. We can construct a VGG model with the
following code:

```@example getting_started
model, _, _ = Vision.VGG(13)
model
```

We can also load pretrained ImageNet weights:

```@example getting_started
model, _, _ = Vision.VGG(13; pretrained=true)
model
```
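
For illustration, here is how a VGG model can be run on a dummy image batch. This is a
sketch under assumptions: the `224 × 224 × 3` WHCN input shape and the 1000-class output
follow the usual ImageNet convention and are not stated in the original text.

```julia
using Boltz, Lux, Random

# Randomly initialized VGG-13 (no pretrained weights needed for a shape check)
model, ps, st = Vision.VGG(13)

# A single random 224×224 RGB image in WHCN (width, height, channels, batch) order
x = rand(Random.default_rng(), Float32, 224, 224, 3, 1)
y, _ = model(x, ps, st)
size(y)
```

The output has one logit per class and one column per batch element, so classification
reduces to taking an `argmax` over the first dimension.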
