update README.md with default parameter override section, formatting (#…
roryl23 authored Nov 18, 2023
1 parent cfee12d commit 49dd2ba
Showing 2 changed files with 29 additions and 8 deletions.
27 changes: 24 additions & 3 deletions README.md
@@ -24,12 +24,12 @@ using Pkg; Pkg.add("OpenAI")
__⚠️ We strongly suggest setting up your API key as an ENV variable__.
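For a quick local test you can also set the key from inside a Julia session before making any requests (a convenience sketch only; for anything persistent, export `OPENAI_API_KEY` from your shell or secrets manager instead):

```julia
# Sketch only: sets the key for the current Julia session, not permanently
ENV["OPENAI_API_KEY"] = "sk-...your key..."
```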

```julia
secret_key = "PAST_YOUR_SECRETE_KEY_HERE"
secret_key = ENV["OPENAI_API_KEY"]
model = "gpt-3.5-turbo"
prompt = "Say \"this is a test\""

r = create_chat(
    secret_key,
    model,
    [Dict("role" => "user", "content"=> prompt)]
)
@@ -39,7 +39,28 @@ returns
```julia
"This is a test."
```
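The assistant text itself lives inside the parsed JSON body. A minimal sketch of pulling it out of the `r` returned above, assuming the wrapper exposes the parsed body as `r.response` (as elsewhere in the package docs):

```julia
# Hedged sketch: drill into the parsed response body for the message text
msg = r.response[:choices][begin][:message][:content]
println(msg)  # "This is a test."
```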

# Overriding default parameters

If you have a non-standard setup, such as a local LLM or a third-party service that
conforms to the OpenAI API, you can override the default parameters by constructing an
`OpenAIProvider` yourself and passing it as the first argument:

```julia
using OpenAI
provider = OpenAI.OpenAIProvider(
    api_key=ENV["OPENAI_API_KEY"],
    base_url=ENV["OPENAI_BASE_URL_OVERRIDE"]
)
response = create_chat(
    provider,
    "gpt-3.5-turbo",
    [Dict("role" => "user", "content" => "Write some ancient Greek poetry")]
)
```
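As a concrete (hypothetical) variation, the same pattern can point the client at a locally hosted OpenAI-compatible server; the URL and model name below are placeholders, not values defined by the package:

```julia
using OpenAI

# Sketch only: a local OpenAI-compatible endpoint and model are assumed here
local_provider = OpenAI.OpenAIProvider(
    api_key="not-needed-for-a-local-server",
    base_url="http://localhost:11434/v1",
)
response = create_chat(
    local_provider,
    "llama3",
    [Dict("role" => "user", "content" => "Summarize the Iliad in one sentence")],
)
```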


For more use cases [see tests](https://github.com/JuliaML/OpenAI.jl/tree/main/test).

## Feature requests

10 changes: 5 additions & 5 deletions src/OpenAI.jl
@@ -21,7 +21,7 @@ end
Default provider for OpenAI API requests.
"""
const DEFAULT_PROVIDER = let
    api_key = get(ENV, "OPENAI_API_KEY", nothing)
    if api_key === nothing
        OpenAIProvider()
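The rest of the `let` block is cut off in this diff; a hedged sketch of how such an ENV-based fallback typically completes (the `else` branch is an assumption, not quoted from the source):

```julia
# Sketch: use an unauthenticated provider when no key is present,
# otherwise bake the environment's key into the default provider
const DEFAULT_PROVIDER = let
    api_key = get(ENV, "OPENAI_API_KEY", nothing)
    if api_key === nothing
        OpenAIProvider()
    else
        OpenAIProvider(api_key=api_key)
    end
end
```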
@@ -157,7 +157,7 @@ function _request(api::AbstractString, provider::AbstractOpenAIProvider, api_key
    lines = split(body, "\n") # split body into lines

    # throw out empty lines, skip "data: [DONE]" bits
    lines = filter(x -> !isempty(x) && !occursin("[DONE]", x), lines)

    # read each line, which looks like "data: {<json elements>}"
    parsed = map(line -> JSON3.read(line[6:end]), lines)
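In other words, the streaming body is split on newlines, blank lines and the terminal `data: [DONE]` sentinel are dropped, and the JSON after each `data: ` prefix is decoded. A standalone sketch of the same idea on a canned payload (the chunk shape here is illustrative, not the exact API schema):

```julia
using JSON3

# Canned server-sent-events payload, roughly what a streamed chat response looks like
body = """
data: {"choices":[{"delta":{"content":"New"}}]}

data: {"choices":[{"delta":{"content":" York"}}]}

data: [DONE]
"""

lines = split(body, "\n")                                          # split body into lines
lines = filter(x -> !isempty(x) && !occursin("[DONE]", x), lines)  # drop blanks and [DONE]
parsed = map(line -> JSON3.read(line[6:end]), lines)               # strip "data: " prefix, parse JSON
println(join(chunk.choices[1].delta.content for chunk in parsed))  # "New York"
```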
@@ -275,7 +275,7 @@ The response body will reflect the chunked nature of the response, so some reass
message returned by the API.
```julia
julia> CC = create_chat(key, "gpt-3.5-turbo",
           [Dict("role" => "user", "content"=> "What continent is New York in? Two word answer.")],
           streamcallback = x->println(Dates.now()));
2023-03-27T12:34:50.428
@@ -434,7 +434,7 @@ function get_usage_status(provider::OpenAIProvider; numofdays::Int=99)

    # Get total quota from subscription_url
    subscription_url = "$base_url/dashboard/billing/subscription"
    subscrip = HTTP.get(subscription_url, headers=auth_header(provider))
    resp = OpenAIResponse(subscrip.status, JSON3.read(subscrip.body))
    # TODO: catch error
    quota = resp.response.hard_limit_usd
@@ -443,7 +443,7 @@ function get_usage_status(provider::OpenAIProvider; numofdays::Int=99)
    start_date = today()
    end_date = today() + Day(numofdays)
    billing_url = "$base_url/dashboard/billing/usage?start_date=$(start_date)&end_date=$(end_date)"
    billing = HTTP.get(billing_url, headers=auth_header(provider))
    resp = OpenAIResponse(billing.status, JSON3.read(billing.body))
    usage = resp.response.total_usage / 100
    daily_costs = resp.response.daily_costs
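Taken together, the function queries two dashboard endpoints and converts the reported usage from cents to dollars. A hedged usage sketch, assuming it returns the computed `(quota, usage, daily_costs)` values:

```julia
using OpenAI

provider = OpenAI.OpenAIProvider(api_key=ENV["OPENAI_API_KEY"])

# Assumption: get_usage_status returns the quota (USD), usage so far (USD),
# and the per-day cost breakdown over the requested window.
quota, usage, daily_costs = get_usage_status(provider; numofdays=30)
println("Used \$$(usage) of \$$(quota)")
```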
