Please see https://github.com/simonw/llm-claude-3/ instead
Plugin for LLM adding support for Anthropic's Claude models.
Install this plugin in the same environment as LLM:

```bash
llm install llm-claude
```
You will need an API key from Anthropic. Request access at https://www.anthropic.com/earlyaccess, then go to https://console.anthropic.com/account/keys to create a key.
You can set that as an environment variable called `ANTHROPIC_API_KEY`, or add it to the `llm` set of saved keys using:

```
llm keys set claude
Enter key: <paste key here>
```
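If you are scripting around the plugin yourself, the environment-variable route can be sketched like this. The helper name is illustrative only — the `llm` tool does its own key lookup (environment variable or saved keys), so nothing here is part of its API:

```python
import os


def get_anthropic_key(env_var="ANTHROPIC_API_KEY"):
    """Return the Anthropic API key from the environment, or None.

    Illustrative helper: mirrors the environment-variable option
    described above, stripping stray whitespace from a pasted key.
    """
    key = os.environ.get(env_var)
    if not key:
        return None
    return key.strip()
```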
This plugin adds models called `claude` and `claude-instant`.
Anthropic describes them as:

> two families of models, both of which support 100,000 token context windows:
>
> - Claude Instant: low-latency, high throughput
> - Claude: superior performance on tasks that require complex reasoning
You can query them like this:

```bash
llm -m claude-instant "Ten great names for a new space station"
llm -m claude "Compare and contrast the leadership styles of Abraham Lincoln and Boris Johnson."
```
The plugin supports one option:

- `max_tokens_to_sample`, default 10_000: the maximum number of tokens to generate before stopping
Use it like this:

```bash
llm -m claude -o max_tokens_to_sample 20 "Sing me the alphabet"
```
```
Here is the alphabet song:
A B C D E F G
H I J
```
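For scripting, the command above can also be assembled programmatically. This is a small sketch with a hypothetical helper — it only builds the argument list in the `llm -m MODEL -o NAME VALUE PROMPT` shape shown above, and is not part of the `llm` API:

```python
def build_llm_command(model, prompt, **options):
    """Build an argv list for the `llm` CLI (hypothetical helper).

    Each option becomes a `-o name value` pair, mirroring the
    `max_tokens_to_sample` example above.
    """
    argv = ["llm", "-m", model]
    for name, value in options.items():
        argv += ["-o", name, str(value)]
    argv.append(prompt)
    return argv
```

The resulting list could be handed to something like `subprocess.run`, assuming `llm` is on your PATH.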
To set up this plugin locally, first check out the code. Then create a new virtual environment:

```bash
cd llm-claude
python3 -m venv venv
source venv/bin/activate
```

Now install the dependencies and test dependencies:

```bash
pip install -e '.[test]'
```

To run the tests:

```bash
pytest
```