Replies: 13 comments 2 replies
-
Hey @rgbkrk @shouples @kafonek! 👋 First off, amazing work with Noteable. It was actually the first plugin I attempted to get working with this project, and I'm honored to have LibreChat looked at by you and your team!
My LangChainJS method of using the OpenAPI plugins gets around this issue and also makes all of the OpenAPI spec's methods known in the system message, but I lack a way to authenticate. In short, I basically create an OpenAPI chain as a tool for a functions agent.
This would make the OpenAPI method possible, and would be really welcome! As for the second method, I agree it would be easier to work with the Noteable SDK.
For both points, the functions agent sounds like it will work better for you, since it can accept not just LangChain Tools but also OpenAI functions as tools. Since the Noteable SDK is in Python, this project would still benefit from your development on the client, and while I'm not as proficient in Python as you are, I would be excited to help as well. For LibreChat, I can proxy from LangChainJS to a Python micro-service.
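To make the micro-service idea concrete, here's a rough sketch of what the Python side could look like: a tiny FastAPI app that owns the langchain pieces and exposes a single endpoint the LangChainJS agent can call as a tool. The `/run` route, the payload shape, and the `NOTEABLE_TOKEN` env var are placeholders rather than a settled design.

```python
# Hypothetical sketch only: a minimal Python micro-service that the
# LangChainJS side could call as a single "Noteable" tool over HTTP.
import os

from fastapi import FastAPI
from pydantic import BaseModel

from langchain.chains.openai_functions.openapi import get_openapi_chain
from langchain.chat_models import ChatOpenAI

app = FastAPI()

# Build the chain once at startup; the Noteable token comes from the environment.
llm = ChatOpenAI(model_name="gpt-4", temperature=0.0)
noteable_chain = get_openapi_chain(
    spec="https://chat.noteable.io/api/origami/openapi.json",
    llm=llm,
    headers={"Authorization": f"Bearer {os.environ['NOTEABLE_TOKEN']}"},
)


class ToolRequest(BaseModel):
    query: str


@app.post("/run")  # placeholder route name
def run_tool(req: ToolRequest) -> dict:
    # The JS agent sends a natural-language query; the chain picks the
    # OpenAPI operation and performs the actual HTTP call.
    return {"result": noteable_chain.run(req.query)}
```

On the LibreChat side, the existing functions agent would just treat that route as one more tool.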
-
This is my "second" iteration, and I have a better way still in the works as I learn more about how langchain handles this under the hood.
-
Hello @danny-avila! 👋 I'm definitely still learning more about how langchain handles this workflow as well, so you're not alone! As a quick update from the original post @rgbkrk made: we were able to use the OpenAPIToolkit and AIPluginTool, albeit with some hiccups hitting the token limit (likely from OpenAPI spec duplication between the AIPluginTool and the OpenAPIToolkit):

```python
from langchain.agents import AgentType, initialize_agent
from langchain.agents.agent_toolkits import OpenAPIToolkit
# from langchain.agents.agent_toolkits.openapi import planner
# from langchain.agents.agent_toolkits.openapi.spec import reduce_openapi_spec
from langchain.chat_models import ChatOpenAI
from langchain.requests import RequestsWrapper
from langchain.tools import AIPluginTool
from langchain.tools.json.tool import JsonSpec
import httpx

client = httpx.Client()
openapi_spec_resp = client.get("https://chat.noteable.io/api/origami/openapi.json")

llm = ChatOpenAI(model_name="gpt-4", temperature=0.0)

# wrap requests using a token from https://app.noteable.io/api/token
# app_token = "<paste your token here>"
requests_wrapper = RequestsWrapper(headers={"Authorization": f"Bearer {app_token}"})

# old agent using just the request wrapper and the reduced OpenAPI spec
# origamist_agent = planner.create_openapi_agent(
#     api_spec=reduce_openapi_spec(openapi_spec_resp.json()),
#     requests_wrapper=requests_wrapper,
#     llm=llm,
# )

# new agent using the OpenAPIToolkit with the Noteable ChatGPT plugin manifest
openapi_toolkit = OpenAPIToolkit.from_llm(
    llm=llm,
    json_spec=JsonSpec(dict_=openapi_spec_resp.json()),
    requests_wrapper=requests_wrapper,
    verbose=True,
)
tools = [
    AIPluginTool.from_plugin_url("https://chat.noteable.io/.well-known/ai-plugin.json")
] + openapi_toolkit.get_tools()

origamist_agent = initialize_agent(
    tools=tools,
    llm=llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
    requests_wrapper=requests_wrapper,
)

result = origamist_agent.run( ... )
```
-
Happy to learn from you! I'd love to make this much easier to work with. We have a lot of lessons learned from the plugin that we'd love to bring into the LangChain universe.
-
@shouples the official ChatGPT plugins implementation seems to run OpenAPI routes/methods as OpenAI functions, which get interpreted by some backend service that does the heavy lifting of the actual API call. I think going for this direct format could be more token-efficient than the AIPlugin + OpenAPIToolkit approach. I'm not sure whether langchain Python has the openApiChain that LangChainJS has, but the latter does a lot of algorithmic work to the end I outlined, converting the spec endpoints to OpenAI functions. I benefit from this plus using a dynamicTool to create an "AIPlugin" dynamically. I'm working on improving it by utilizing a lot of the algorithmic logic from createOpenAPIChain without actually using the chain itself, since it's double work for the LLM to decide on a method and then rely on the chain to call it.
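To sketch what I mean by the direct format (illustrative only, not how either library implements it internally): each OpenAPI operation flattens down to one OpenAI function definition, so the model only sees function names, descriptions, and parameters rather than the whole spec plus toolkit descriptions.

```python
# Sketch of mapping one OpenAPI operation to an OpenAI function definition.
# Illustrative only; real specs need $ref resolution, request bodies, etc.
from typing import Any, Dict


def operation_to_openai_function(path: str, method: str, op: Dict[str, Any]) -> Dict[str, Any]:
    """Flatten a single OpenAPI operation into an OpenAI function schema."""
    properties: Dict[str, Any] = {}
    required = []
    for param in op.get("parameters", []):
        properties[param["name"]] = {
            "type": param.get("schema", {}).get("type", "string"),
            "description": param.get("description", ""),
        }
        if param.get("required"):
            required.append(param["name"])
    return {
        # operationId doubles as the function name the model calls
        "name": op.get("operationId", f"{method}_{path.strip('/').replace('/', '_')}"),
        "description": op.get("summary", op.get("description", "")),
        "parameters": {"type": "object", "properties": properties, "required": required},
    }


# hypothetical Noteable-style operation, just to show the shape
print(operation_to_openai_function(
    "/api/projects/default",
    "get",
    {"operationId": "get_default_project", "summary": "Get the user's default project", "parameters": []},
))
```

The model only has to pick a function and fill in arguments; the backend maps that back to the route and performs the HTTP request itself.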
-
Is there anything I or anyone can do to help you with this?
-
Y'all are amazing. I'm terribly excited for this.
…On Thu, Oct 12, 2023 at 11:46 AM Danny Avila wrote:
Amazing! Will try to set this up for LibreChat
-
@danny-avila it took longer than I'd like to get back into this thread, but I'm ready to jump in and help get this hammered out! Looking at the langchain Python docs, there's a `get_openapi_chain` that looks like a good fit:

```python
import requests

from langchain.chains.openai_functions.openapi import get_openapi_chain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.utilities.openapi import OpenAPISpec

MANIFEST_URL = "https://chat.noteable.io/.well-known/ai-plugin.json"
OPENAPI_SPEC_URL = "https://chat.noteable.io/api/origami/openapi.json"

manifest = requests.get(MANIFEST_URL).json()
openapi_spec = requests.get(OPENAPI_SPEC_URL).json()

# should we try adding the `description_for_model` from https://chat.noteable.io/.well-known/ai-plugin.json in here?
prompt_template = ChatPromptTemplate.from_template(
    "Use the provided APIs to respond to this user query:\n\n{query}"
)

llm = ChatOpenAI(
    model_name="gpt-4",
    temperature=0.0,
)

# load Noteable API key from env/file/etc
# noteable_token = os.environ['NOTEABLE_TOKEN']
custom_headers = {"Authorization": f"Bearer {noteable_token}"}

noteable_chain = get_openapi_chain(
    spec=OpenAPISpec.parse_obj(openapi_spec),
    llm=llm,
    verbose=True,
    prompt=prompt_template,
    headers=custom_headers,
)

noteable_chain.run("what's my default project in Noteable?")
```
This is where it gets a little fuzzy for me based on my limited exposure to langchain chains and tools -- what are your thoughts on how this could/should be implemented as a tool (or toolkit) for an agent? One thing I like about the `get_openapi_chain` approach is how direct it is, but it doesn't seem to take advantage of the `description_for_model` from the plugin manifest.
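One possibility, strictly as a sketch on top of the snippet above (not tested end-to-end): wrap the whole chain as a single `Tool` for an OpenAI functions agent, so the agent decides when to talk to Noteable and the chain decides which endpoint to hit.

```python
# Sketch: expose the get_openapi_chain from the snippet above as a single Tool
# for an OpenAI functions agent. `noteable_chain` and `llm` are defined above.
from langchain.agents import AgentType, Tool, initialize_agent

noteable_tool = Tool(
    name="noteable",
    description=(
        "Interact with Noteable projects, notebooks, and kernels. "
        "Input should be a natural-language request."
    ),
    func=noteable_chain.run,
)

agent = initialize_agent(
    tools=[noteable_tool],
    llm=llm,
    agent=AgentType.OPENAI_FUNCTIONS,
    verbose=True,
)

agent.run("what's my default project in Noteable?")
```

The trade-off is the double LLM hop @danny-avila mentioned above: the agent picks the tool, then the chain's own LLM call picks the endpoint.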
-
@rgbkrk @shouples for what it's worth, I came up with a much better way of handling OpenAPI specs for the upcoming Assistants update. What I implemented for Assistants will work with regular "plugins", just not at first. I wanted to try Noteable next but noticed I couldn't access app.noteable.io.
-
Sadly Noteable ceased operating. Disappointing; there's nothing quite as good to replace it.
--dan
-
If memory serves, JupyterHub was built by one of the Noteable cofounders before they founded the company, so it's probably the next-best thing!
It would be really valuable to have something to replace Code Interpreter.
…On Mon, Jan 29, 2024 at 10:50 AM Danny Avila wrote:
Ah I see, that is sad. One of these days I will get around to figuring out how to set up JupyterHub for LibreChat. A friend has figured out an open source, scalable "code interpreter" through this. It's easy enough to get a local instance going, but I need to go into the weeds of documentation to set this up right.
-
I'm happy to comment on this as a fully independent person who worked at Noteable. I didn't found Noteable, and just to clear things up, none of the founders created JupyterHub. In the early days of JupyterHub I created tmpnb, which became the DockerSpawner that folded into JupyterHub. That work helped form much of the original JupyterHub and Binder setup, and many more maintainers took it much further after. I went on to Netflix, where we had our own notebook orchestration system built directly on that original tmpnb code and plugged into our container orchestration system.

Coming up to the present: I am currently evaluating how we can enable LLM-assisted compute on existing JupyterHub deployments. I've been talking to Jupyter folks about it recently while building prototypes, and I'm currently seeking grant funding to make it happen. I'd love to talk to folks to make connections and help re-enable the amazing experience of notebooks + LLMs.
-
Hey! 👋🏻 I'm Kyle from Noteable, along with @shouples and @kafonek.
Recently @glowforgedan told me about LibreChat's ability to work with ChatGPT Plugins and the desire to support OAuth, particularly for Noteable. I'm coming to you with a few options that I'll also post over to LangChain's repo.
Note: I realize that LibreChat is written in JavaScript, so these are mostly here to get discussions going.
ChatGPT Plugin API using `RequestsWrapper` and LangChain OpenAPI schema

Since we're not seeing a way to inject headers with `AIPluginTool.from_plugin_url` (pointed at `chat.noteable.io`), @shouples went with the following approach:

1. Pull the OpenAPI spec
2. Set up a `RequestsWrapper` to incorporate a Noteable API token
3. Create an `openapi_agent` using the spec and wrapper

Caveat: our system currently only accepts JWTs. We're working on releasing a version that will accept your API token as well as the OAuth token.

A major issue with this approach is that it doesn't include the `description_for_model` from Noteable's AI plugin JSON. This leads the model down some "eh" paths instead of diving in like your personal data science buddy.
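One possible band-aid (untested, and reusing the `tools` / `llm` objects from the OpenAPIToolkit snippet earlier in the thread) would be to pull `description_for_model` out of the plugin manifest ourselves and hand it to the agent as its prompt prefix:

```python
# Sketch only: fetch description_for_model from the AI plugin manifest and
# use it as the agent's prompt prefix (replacing the default ReAct prefix).
import httpx

from langchain.agents import AgentType, initialize_agent

manifest = httpx.get("https://chat.noteable.io/.well-known/ai-plugin.json").json()
description_for_model = manifest["description_for_model"]

origamist_agent = initialize_agent(
    tools=tools,  # the OpenAPI tools assembled earlier in the thread
    llm=llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    agent_kwargs={"prefix": description_for_model},
    verbose=True,
)
```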
Directly using `origami`, Noteable's API

Any approach using `origami` directly will be faster to work with (no round trip to the plugin, full notebook document synced in real time).

Example Client

I've created a simplified client for LLMs around `origami`'s API and RTU client (real-time updates): https://github.com/rgbkrk/chatlab/blob/main/chatlab/builtins/noteable.py#L24. Since it was built purposefully for use with OpenAI function calling, I'm using the related langchain modules.

This works, though there are several ways I wish we could improve it. For one, `create_openai_fn_chain` right now doesn't like methods, which is odd to me because I set up something similar in chatlab to accept bound functions. In chatlab I would just run: