Replies: 6 comments 2 replies
-
Hello @flefevre, I'm here to help you with your problem. Feel free to ask questions or report any other issues.

To pass the authentication Bearer token when creating a Langflow flow with LiteLLM as the LLM model provider, you should set the appropriate API key in the configuration. You need to specify the `api_key` field in the configuration for the `ChatLiteLLMModelComponent`. Here is how you can set it:

```python
from typing import Any, Dict, Optional

from langchain_community.chat_models.litellm import ChatLiteLLM, ChatLiteLLMException

from langflow.base.models.model import LCModelComponent
from langflow.field_typing import Text


class ChatLiteLLMModelComponent(LCModelComponent):
    # Other configurations...

    def build(
        self,
        input_value: Text,
        model: str,
        provider: str,
        api_key: Optional[str] = None,
        stream: bool = False,
        temperature: Optional[float] = 0.7,
        model_kwargs: Optional[Dict[str, Any]] = None,
        top_p: Optional[float] = None,
        top_k: Optional[int] = None,
        n: int = 1,
        max_tokens: int = 256,
        max_retries: int = 6,
        verbose: bool = False,
        system_message: Optional[str] = None,
    ) -> Text:
        try:
            import litellm  # type: ignore

            litellm.drop_params = True
            litellm.set_verbose = verbose
        except ImportError as e:
            raise ChatLiteLLMException(
                "Could not import litellm python package. Please install it with `pip install litellm`"
            ) from e

        # Map each supported provider to the keyword argument ChatLiteLLM expects.
        provider_map = {
            "OpenAI": "openai_api_key",
            "Azure": "azure_api_key",
            "Anthropic": "anthropic_api_key",
            "Replicate": "replicate_api_key",
            "Cohere": "cohere_api_key",
            "OpenRouter": "openrouter_api_key",
        }

        # Set the API key based on the provider; all other keys stay None.
        api_keys: Dict[str, Optional[str]] = {v: None for v in provider_map.values()}
        if variable_name := provider_map.get(provider):
            api_keys[variable_name] = api_key
        else:
            raise ChatLiteLLMException(
                f"Provider {provider} is not supported. Supported providers are: {', '.join(provider_map.keys())}"
            )

        output = ChatLiteLLM(
            model=model,
            client=None,
            streaming=stream,
            temperature=temperature,
            model_kwargs=model_kwargs if model_kwargs is not None else {},
            top_p=top_p,
            top_k=top_k,
            n=n,
            max_tokens=max_tokens,
            max_retries=max_retries,
            openai_api_key=api_keys["openai_api_key"],
            azure_api_key=api_keys["azure_api_key"],
            anthropic_api_key=api_keys["anthropic_api_key"],
            replicate_api_key=api_keys["replicate_api_key"],
            cohere_api_key=api_keys["cohere_api_key"],
            openrouter_api_key=api_keys["openrouter_api_key"],
        )
        return self.get_chat_result(output, stream, input_value, system_message)
```

Replace the `api_key` value with your actual Bearer token and set the field accordingly. This configuration will be picked up by Langflow when it initializes the LLM model provider [1] [2].

[1] https://github.com/logspace-ai/langflow/blob/dev/src/backend/base/langflow/components/models/ChatLiteLLMModel.py
[2] https://github.com/langflow-ai/langflow/blob/dev/src/backend/base/langflow/components/models/ChatLiteLLMModel.py
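If it helps, here is a minimal usage sketch for the component above. This assumes the component is importable as defined in the snippet; the model name and token value are placeholders:

```python
# Minimal usage sketch (placeholder model name and token).
component = ChatLiteLLMModelComponent()
result = component.build(
    input_value="Hello!",
    model="gpt-4o",                  # whatever model your deployment serves
    provider="OpenAI",               # must be one of the keys in provider_map
    api_key="sk-your-bearer-token",  # your Bearer token goes here
)
print(result)
```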
-
Dear all,
I have tried to create a workflow in order to connect to my LiteLLM instance, but I cannot find any LiteLLM component where I can provide the base URL.
What should I use? I didn't find any example in the store.
Thanks
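For anyone hitting the same wall: a LiteLLM proxy exposes an OpenAI-compatible API, so one workaround is to use an OpenAI-style component or client and override its base URL with the proxy address. A minimal sketch with LangChain's `ChatOpenAI`, assuming a proxy at `http://localhost:4000/v1` and a placeholder token `sk-1234`:

```python
# Sketch: point an OpenAI-compatible client at a LiteLLM proxy.
# Base URL, token, and model name are placeholders for your setup.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:4000/v1",  # LiteLLM proxy base URL
    api_key="sk-1234",                    # LiteLLM proxy Bearer token
    model="gpt-4o",                       # any model the proxy routes
)
print(llm.invoke("ping").content)
```

The same idea should apply in the Langflow UI: point the OpenAI model component's base-URL field at the proxy and use the proxy token as the API key.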
-
Thanks for the tips. Does anyone know how to use LiteLLM rather than Ollama? Thanks
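Not a Langflow-native answer, but for reference: the `litellm` SDK can target a LiteLLM proxy (or any OpenAI-compatible server) directly, which is handy for checking that the proxy and token work before wiring anything into a flow. A minimal sketch with placeholder URL and token:

```python
# Sketch: call a LiteLLM proxy directly with the litellm SDK.
# api_base and api_key are placeholders for your proxy URL and Bearer token.
import litellm

response = litellm.completion(
    model="openai/gpt-4o",  # "openai/" prefix = OpenAI-compatible route
    messages=[{"role": "user", "content": "Hello!"}],
    api_base="http://localhost:4000",
    api_key="sk-1234",
)
print(response.choices[0].message.content)
```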
-
Any updates on this? The documentation at https://docs.langflow.org/components/models indicates that LiteLLM should be a standard model component, but on Langflow version 1.0.5 the component is nowhere to be found.
-
Following this too. Looking for a way to interact with LiteLLM in Langflow.
-
Hello, I have created a new ticket here: #4741
-
Dear all,
I would like to create a Langflow flow with LiteLLM as the LLM model provider, but I do not know how to pass the authentication Bearer token. Where should I do it?
Thanks
François
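For context on what that token does: a LiteLLM proxy authenticates plain HTTP requests via the `Authorization` header, so whichever component ends up talking to the proxy just needs to send that header. A minimal sketch at the HTTP level, with placeholder URL, token, and model name:

```python
# Sketch: the Bearer token is just an Authorization header on requests
# to the LiteLLM proxy. URL, token, and model name are placeholders.
import requests

resp = requests.post(
    "http://localhost:4000/v1/chat/completions",
    headers={"Authorization": "Bearer sk-1234"},
    json={"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello!"}]},
    timeout=30,
)
print(resp.json()["choices"][0]["message"]["content"])
```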