[Automated] Merge release into main #235

Merged 39 commits into main from release on Oct 19, 2024

Commits (39)
c7603eb
bump the version, test release to PyPi
ProKil May 28, 2024
ea4c128
Update README.md
ProKil May 28, 2024
57beef5
Update README.md
ProKil May 28, 2024
a2265f6
Update README.md
ProKil May 28, 2024
13a7721
Merge remote-tracking branch 'origin/main' into release
ProKil Jun 2, 2024
e798e8c
Merge remote-tracking branch 'origin/main' into release
ProKil Jun 14, 2024
db9ace3
bump version to 0.0.9
ProKil Jun 14, 2024
4d4ed5c
Update Sotopia presentation information in README.md
ProKil Jun 17, 2024
859405a
Merge branch 'main' into release
ProKil Jun 18, 2024
fa1a410
bump version to 0.0.10
ProKil Jun 18, 2024
d34115b
Merge remote-tracking branch 'origin/main' into release
ProKil Jun 27, 2024
ff58645
bump version
ProKil Jun 27, 2024
37ccb7b
add merge release back to main action
ProKil Jul 20, 2024
1c05438
change checkout v4->v3
ProKil Jul 20, 2024
e4f0a26
fix merge-back-to-main and pin mypy to <1.11.0
ProKil Jul 20, 2024
865e8b6
Merge branch 'main' into release
XuhuiZhou Aug 26, 2024
7bb0bce
Merge remote-tracking branch 'origin/main' into release
ProKil Sep 2, 2024
88d043d
merge bug fix
ProKil Sep 2, 2024
c9c411c
upgrade default model to handle bad-foratted outputs to gpt-4o-mini a…
yangalan123 Sep 5, 2024
fbe8410
update pull request -> pull request target
ProKil Sep 5, 2024
e3e5737
Merge branch 'release' of github.com:sotopia-lab/sotopia into release
ProKil Sep 5, 2024
78e8eb8
bump version
ProKil Sep 5, 2024
b8a6dbc
Add `bad_output_process_model` option and `use_fixed_model_version` o…
yangalan123 Sep 24, 2024
feef903
fix gpt-3.5
XuhuiZhou Sep 27, 2024
dc6db94
Merge branch 'main' into release
XuhuiZhou Sep 27, 2024
a73ff48
replace gpt3.5 turbo for tests
XuhuiZhou Sep 27, 2024
1923c18
update gpt-3.5-turbo to gpt-4o-mini
ProKil Oct 1, 2024
db8839e
bug fix for return fixed model version function
ProKil Oct 1, 2024
0fcfcb0
fix sampling error
XuhuiZhou Oct 4, 2024
cd31c72
fix rc.4
XuhuiZhou Oct 4, 2024
12e74a6
new tag
XuhuiZhou Oct 4, 2024
8606b17
Merge remote-tracking branch 'origin/main' into release
ProKil Oct 12, 2024
63eab3e
bump version
ProKil Oct 12, 2024
12060df
update workflow permission
ProKil Oct 12, 2024
0be9ea9
add why sotopia
XuhuiZhou Oct 12, 2024
1836791
improve the why sotopia
XuhuiZhou Oct 14, 2024
6916077
Merge remote-tracking branch 'origin/main' into release
ProKil Oct 14, 2024
3580bcb
bump version
ProKil Oct 14, 2024
2b80515
further add clarification to the custom models
XuhuiZhou Oct 19, 2024
6 changes: 5 additions & 1 deletion docs/pages/concepts/generation.mdx
@@ -47,4 +47,8 @@ In this example, we generate a list of the first `n` prime numbers with the `gpt

Apart from using API endpoints from LLM providers like OpenAI, Together AI, Azure, etc., you can also use a custom model with an OpenAI-compatible endpoint.
You will need to set the model name to `custom/<model_name>@url`, and set CUSTOM_API_KEY to the API key of the custom model.
For an example, check out `examples/generation_api/custom_model.py`.

For example, if you want to use the `llama3.2` model for an agent from [Meta](https://www.meta.com/llama/) and you host it on a [LiteLLM](https://github.com/BerriAI/litellm) proxy server (e.g., a proxy running on `http://0.0.0.0:4000`), you can set the model name to `model_name="custom/llama3.2:1b@http://0.0.0.0:4000"`
to call the model in the [`LLMAgent`](/python_API/agents/llm_agent#llmagent).

For more information, check out `examples/generation_api/custom_model.py`.
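The snippet below is a minimal sketch of how the `custom/<model_name>@url` convention from the doc change above might be wired into an `LLMAgent`. The import path, the `agent_name` keyword, and the placeholder API key are assumptions not taken from this diff; `examples/generation_api/custom_model.py` remains the authoritative example.

```python
# Hedged sketch: calling a LiteLLM-proxied llama3.2 model through the
# custom/<model_name>@url naming convention described in the docs above.
# Constructor arguments here are assumptions; see
# examples/generation_api/custom_model.py for the repository's own example.
import os

from sotopia.agents import LLMAgent  # assumed import path

# The custom endpoint reads its API key from CUSTOM_API_KEY (per the docs above).
os.environ["CUSTOM_API_KEY"] = "sk-placeholder"  # hypothetical key

# Model name format: custom/<model_name>@<url of the OpenAI-compatible endpoint>
agent = LLMAgent(
    agent_name="assistant",  # hypothetical agent name
    model_name="custom/llama3.2:1b@http://0.0.0.0:4000",
)
```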
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "sotopia"
version = "0.1.1"
version = "0.1.2"
description = "A platform for simulating and evaluating social interaction."
authors = [
{ name = "Hao Zhu", email = "prokilchu@gmail.com" },