
Proposal: A way to skip LLMs #93

Open
jeremyfowers opened this issue Jan 25, 2024 · 0 comments
Labels
bug: Something isn't working
model: Pertaining to the models corpus (//models/)

Comments

@jeremyfowers (Collaborator)

Problem:

  • Running `turnkey transformers/*.py` is a convenient way to sweep over many transformer models.
  • That folder now contains a lot of LLMs.
  • LLMs will melt your RAM and hard drive if they run on insufficient hardware.

Why this is needed: if I run a job on a machine with limited RAM and HDD, I want to skip over the LLMs so that my system doesn't crash.

Possible solutions:

  • Quick/sad: move the LLMs out of transformers into a different folder (e.g., models/llm).
  • Moderate: add a task::llm or similar label to the LLMs, plus a way to exclude models based on their labels (e.g., `turnkey transformers/*.py --labels ~task::llm`).
  • Advanced: filter models based on parameter count (e.g., `turnkey transformers/*.py --max-params 1B`).
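The moderate option could be sketched roughly as below. This is a hypothetical illustration, not turnkey's actual implementation: it assumes each model script declares its labels in a leading `# labels:` comment (the label format, the `read_labels`/`select_models` helper names, and the file layout are all assumptions for the sketch).

```python
import glob


def read_labels(path):
    """Parse a '# labels: ...' comment from a model script, if present.

    Assumed convention (hypothetical): the script's header contains a line like
        # labels: task::llm author::huggingface
    Returns the set of label tokens, or an empty set if none are found.
    """
    with open(path) as f:
        for line in f:
            if line.startswith("# labels:"):
                return set(line[len("# labels:"):].split())
    return set()


def select_models(pattern, exclude_labels=()):
    """Return model scripts matching pattern whose labels avoid exclude_labels."""
    exclude = set(exclude_labels)
    return [
        path
        for path in sorted(glob.glob(pattern))
        if not (read_labels(path) & exclude)
    ]


# Example: sweep the transformers corpus but skip anything labeled task::llm,
# analogous to the proposed `--labels ~task::llm` flag.
# select_models("transformers/*.py", exclude_labels={"task::llm"})
```

A parameter-count filter (`--max-params 1B`) would work the same way, just keyed on a declared or computed parameter count instead of a label set.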

cc @danielholanda @ramkrishna2910

@jeremyfowers added the bug and model labels on Jan 25, 2024