
Add vllm deploy #45

Merged
merged 12 commits into main
Oct 11, 2023
Conversation

lwaekfjlk (Member)

Closes #43

📑 Description

✅ Checks

  • My pull request adheres to the code style of this project
  • My code requires changes to the documentation
  • I have updated the documentation as required
  • All the tests have passed
  • Branch name follows type/descript (e.g. feature/add-llm-agents)
  • Ready for code review

ℹ️ Additional Information

@lwaekfjlk (Member, Author)

We can use vLLM to support high-throughput inference and deployment.
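As a rough sketch of what this enables, the typical workflow is to install vLLM and launch its OpenAI-compatible API server, then query it over HTTP. The model name and port below are placeholders, not the ones used in this repo:

```shell
# Install vLLM (the default build requires a CUDA-capable GPU)
pip install vllm

# Launch an OpenAI-compatible API server; model name is a placeholder
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Llama-2-7b-hf \
    --port 8000

# From another shell, send a completion request to the server
curl http://localhost:8000/v1/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "meta-llama/Llama-2-7b-hf", "prompt": "Hello", "max_tokens": 16}'
```

Because the server speaks the OpenAI API, existing OpenAI client code can point at it by changing only the base URL.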

@lwaekfjlk lwaekfjlk merged commit 5dcbdb8 into main Oct 11, 2023
1 of 3 checks passed
lwaekfjlk added a commit that referenced this pull request Nov 17, 2023
* support qlora

* upload dummy conversation data

* delete doc and docker

* update pyproject pip install package

* continue cleaning

* delete more files

* delete a format

* add llm_deploy

* add testing scripts

* update deployment readme

* update readme and fix some bug

* finalize the inference and deployment based on vllm
lwaekfjlk added a commit that referenced this pull request Mar 14, 2024
* support qlora

* upload dummy conversation data

* delete doc and docker

* update pyproject pip install package

* continue cleaning

* delete more files

* delete a format

* add llm_deploy

* add testing scripts

* update deployment readme

* update readme and fix some bug

* finalize the inference and deployment based on vllm

(cherry picked from commit 6fc08eb)
Signed-off-by: Haofei Yu <1125027232@qq.com>
Successfully merging this pull request may close these issues.

[FEAT]: Add vLLM-based deployment and usage testing code