
Commit

[Automated] Merge release into main (#235)
* bump the version, test release to PyPi

* Update README.md

* Update README.md

* Update README.md

* bump version to 0.0.9

* Update Sotopia presentation information in README.md

* bump version to 0.0.10

* bump version

* add merge release back to main action

* change checkout v4->v3

* fix merge-back-to-main and pin mypy to <1.11.0

* merge bug fix

* upgrade the default model for handling badly-formatted outputs to gpt-4o-mini, as gpt-3.5-turbo is deprecated (#183)

* update pull request -> pull request target

* bump version

* Add a `bad_output_process_model` option and a `use_fixed_model_version` option to all generation methods, to prevent future OpenAI API changes from breaking Sotopia. (#196)

* Two major updates: 1) add a `bad_output_process_model` option to all `agenerate_xxx()` methods so users can decide which model to use for handling bad outputs; by default this is `gpt-4o-mini`. 2) add a `use_fixed_model_version` option to all generation methods, as some fixed model versions may no longer be available in the future; users should be able to bypass the fixed-model-version mapping instead of getting stuck on an error. The documentation (`generation.md`) has been updated accordingly for these two changes.

* [autofix.ci] apply automated fixes

---------

Co-authored-by: Chenghao Yang <yangalan1996@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>

* fix gpt-3.5

* replace gpt3.5 turbo for tests

* update gpt-3.5-turbo to gpt-4o-mini

* bug fix for return fixed model version function

* fix sampling error

* fix rc.4

* new tag

* bump version

* update workflow permission

* add why sotopia

* improve the why sotopia

* bump version

* further add clarification to the custom models

---------

Co-authored-by: XuhuiZhou <zhouxuhui2018@gmail.com>
Co-authored-by: Chenghao (Alan) Yang <chenghao@uchicago.edu>
Co-authored-by: Chenghao Yang <yangalan1996@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
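The `bad_output_process_model` fallback described in the log above can be sketched generically. This is a minimal illustration of the pattern, not Sotopia's actual implementation: `generate_with_fallback`, `call_model`, and the stub below are hypothetical names introduced here for the example.

```python
# Illustrative sketch of a bad-output fallback: try to parse the primary
# model's output, and on failure ask a cheaper "repair" model to reformat it.
# In Sotopia, the equivalent knob is the `bad_output_process_model` option
# (defaulting to gpt-4o-mini); the function and stub here are hypothetical.
import json
from typing import Callable


def generate_with_fallback(
    prompt: str,
    call_model: Callable[[str, str], str],
    model: str = "gpt-4o",
    bad_output_process_model: str = "gpt-4o-mini",
) -> dict:
    raw = call_model(model, prompt)
    try:
        # Expected path: the primary model returns valid structured output.
        return json.loads(raw)
    except json.JSONDecodeError:
        # Fallback path: re-process the malformed output with the cheaper model.
        repair_prompt = f"Reformat the following as valid JSON:\n{raw}"
        return json.loads(call_model(bad_output_process_model, repair_prompt))


# Stub: the primary model returns malformed JSON, the repair model fixes it.
def fake_call(model: str, prompt: str) -> str:
    if model == "gpt-4o":
        return "{'action': 'speak'}"  # single quotes: invalid JSON
    return '{"action": "speak"}'


print(generate_with_fallback("say hi", fake_call))  # {'action': 'speak'}
```

The point of the design is that a parse failure becomes a recoverable event rather than a crash, and the repair step can use a cheaper model than the primary one.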
5 people authored Oct 19, 2024
1 parent 4077048 commit cb6b70d
Showing 2 changed files with 6 additions and 2 deletions.
6 changes: 5 additions & 1 deletion docs/pages/concepts/generation.mdx
@@ -47,4 +47,8 @@ In this example, we generate a list of the first `n` prime numbers with the `gpt

Apart from using API endpoints from LLM providers like OpenAI, Together AI, Azure, etc., you can also use custom models with OpenAI-compatible endpoints.
You will need to set the model name to `custom/<model_name>@url`, and set `CUSTOM_API_KEY` to the API key of the custom model.
For an example, check out `examples/generation_api/custom_model.py`.

For example, suppose you want an agent to use the [`llama3.2`](https://www.meta.com/llama/) model from Meta, and you host it behind a [LiteLLM](https://github.com/BerriAI/litellm) proxy server running on `http://0.0.0.0:4000`. You can then set `model_name="custom/llama3.2:1b@http://0.0.0.0:4000"`
to call the model in the [`LLMAgent`](/python_API/agents/llm_agent#llmagent).

For more information, check out `examples/generation_api/custom_model.py`.
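The `custom/<model_name>@url` convention above can be sketched as follows. `make_custom_model_name` and `split_custom_model_name` are hypothetical helpers written for this example; Sotopia itself expects the already-formatted string.

```python
# Hypothetical helpers illustrating the custom model string convention
# `custom/<model_name>@<url>`; they are not part of Sotopia's API.


def make_custom_model_name(model_name: str, url: str) -> str:
    """Build a custom model string of the form custom/<model_name>@<url>."""
    return f"custom/{model_name}@{url}"


def split_custom_model_name(full_name: str) -> tuple[str, str]:
    """Recover (model_name, url) from a custom model string."""
    prefix, _, rest = full_name.partition("/")
    if prefix != "custom":
        raise ValueError(f"not a custom model string: {full_name!r}")
    # Split at the first '@'; the URL may itself contain ':' characters.
    model_name, _, url = rest.partition("@")
    return model_name, url


name = make_custom_model_name("llama3.2:1b", "http://0.0.0.0:4000")
print(name)  # custom/llama3.2:1b@http://0.0.0.0:4000
```

A string like this would then be passed as the `model_name` of the agent, with `CUSTOM_API_KEY` set in the environment for authentication against the proxy.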
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "sotopia"
version = "0.1.1"
version = "0.1.2"
description = "A platform for simulating and evaluating social interaction."
authors = [
{ name = "Hao Zhu", email = "prokilchu@gmail.com" },
