Commit: Update README.md
xyproto authored Nov 14, 2024
1 parent 2794549 commit 5ac97ec
Showing 1 changed file with 2 additions and 2 deletions.
README.md: 2 additions & 2 deletions
@@ -308,7 +308,7 @@ Using AI / LLMs / Ollama

* The `ollama` server must be running locally, or a `host:port` must be set in the `OLLAMA_HOST` environment variable.
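
As a sketch, pointing the editor at a non-local Ollama server could look like this (the `example.com:11434` host and port are placeholder values, not from the repository):

```shell
# Use a remote Ollama server instead of a locally running one.
# example.com:11434 is a placeholder host:port value.
export OLLAMA_HOST=example.com:11434
```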

- Example use, using the default `tinyllama` model (will be downloaded at first use, the size is 637 MiB and it should run anywhere).
+ For example, using the default `tinyllama` model (will be downloaded at first use, the size is 637 MiB and it should run anywhere).

```
lua> ollama()
```

@@ -388,7 +388,7 @@ The experimental `prompt` format is very simple:
* The first line is the `content-type`.
* The second line is the Ollama model, such as `tinyllama:latest` or just `tinyllama`.
* The third line is blank.
- * The rest of the lines is the prompt that will be passed to the large language model.
+ * The rest of the lines are the prompt that will be passed to the large language model.

Note that the Ollama server must be fast enough to reply within 10 seconds for this to work!
`tinyllama` or `gemma` should be more than fast enough with a good GPU or on an M1/M2/M3 processor.
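
A minimal `prompt` file following the format described above might look like this (the specific content-type, model, and prompt text are illustrative assumptions, not taken from the repository):

```
text/plain
tinyllama

Write one short sentence about the sea.
```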
