-
I am new to Elixir. I want to display the response from the LLM as it streams in. I already have this working with stream: false. After changing it to true, it still produces a response, but it doesn't stream as expected. Here is my code:

@spec openai_form_env_key(binary() | [LangChain.Message.ContentPart.t()]) :: any()
def openai_form_env_key(prompt) do
  api_keys = Application.get_env(:langchain, :fireworksAI_keys)
  IO.inspect(api_keys)
  random_api_key = Enum.random(api_keys)

  {:ok, _updated_chain, response} =
    %{llm:
        ChatOpenAI.new!(%{
          endpoint: "https://api.fireworks.ai/inference/v1/chat/completions",
          model: "accounts/fireworks/models/llama-v3-70b-instruct",
          api_key: random_api_key,
          temperature: 0.6,
          stream: true
        })}
    |> LLMChain.new!()
    |> LLMChain.add_message(Message.new_user!(prompt))
    |> LLMChain.run(callback_fn: &handle_response/1, while_needs_response: true)

  response.content
end

defp handle_response(_response) do
end
@impl true
def handle_event("submit", params, socket) do
  %{"info1" => info1} = params

  prompt = ~s"
  #{info1}
  "

  IO.puts(prompt)
  content = openai_form_env_key(prompt)
  {:noreply, assign(socket, :content, content)}
end

def render(assigns) do
  ~H"""
  <%= @content %>
  """
end
Replies: 2 comments
-
@youfun check out this: https://github.com/brainlid/langchain_demo/blob/main/lib/langchain_demo_web/live/conversation_live/show.ex
-
Thanks for helping answer @cpursley! @youfun: Yes, specifically, check out line 310, where the callback is handling a MessageDelta. It sends it as a message to the LiveView, which is handled on line 183. That handler applies the delta to the LLMChain instance owned by the LiveView and updates the LiveView state, which results in the LLM response being rendered as it streams in.
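A minimal sketch of that callback-to-LiveView pattern, adapted to the code in the question. This assumes the callback_fn-era LangChain API (LangChain.MessageDelta structs and LLMChain.apply_delta/2); the :chat_delta message name and MyAppWeb module are illustrative, and the linked demo is the authoritative version:

```elixir
defmodule MyAppWeb.ChatLive do
  use MyAppWeb, :live_view

  alias LangChain.Chains.LLMChain
  alias LangChain.ChatModels.ChatOpenAI
  alias LangChain.Message
  alias LangChain.MessageDelta

  @impl true
  def handle_event("submit", %{"info1" => info1}, socket) do
    live_view_pid = self()

    # The callback runs inside the streaming request, so instead of
    # rendering anything itself it forwards each chunk to the LiveView.
    callback_fn = fn
      %MessageDelta{} = delta -> send(live_view_pid, {:chat_delta, delta})
      %Message{} -> :ok
    end

    chain =
      %{llm:
          ChatOpenAI.new!(%{
            endpoint: "https://api.fireworks.ai/inference/v1/chat/completions",
            model: "accounts/fireworks/models/llama-v3-70b-instruct",
            api_key: Enum.random(Application.get_env(:langchain, :fireworksAI_keys)),
            stream: true
          })}
      |> LLMChain.new!()
      |> LLMChain.add_message(Message.new_user!(info1))

    # Run the chain in a separate process so the LiveView stays responsive
    # and can receive the delta messages while the request is in flight.
    Task.start(fn ->
      LLMChain.run(chain, callback_fn: callback_fn, while_needs_response: true)
    end)

    {:noreply, assign(socket, chain: chain, content: "")}
  end

  @impl true
  def handle_info({:chat_delta, delta}, socket) do
    # apply_delta/2 merges the streamed chunk into the chain's
    # in-progress message; re-assigning triggers a re-render.
    chain = LLMChain.apply_delta(socket.assigns.chain, delta)
    content = (chain.delta && chain.delta.content) || socket.assigns.content
    {:noreply, assign(socket, chain: chain, content: content)}
  end

  @impl true
  def render(assigns) do
    ~H"""
    <%= @content %>
    """
  end
end
```

The key difference from the original code is that handle_event no longer blocks waiting for response.content; the rendered @content grows with each handle_info as deltas arrive.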