Been working on an extension that allows you to host a LangChain Agent as an API endpoint, using a serving layer similar to the OpenedAI API extension. It currently works as a standalone API endpoint, and I'm now working on making it a connectible extension. This is a bit different from the SuperBooga implementation, as it's intended to support existing chains written using LCEL (LangChain Expression Language).
Basic Framework:
Within the extension folder, there is an Agents folder. All you need to do is drop a Python file into it that returns a chain in LCEL format. It will be highly dev dependent; by that I mean you will need to be familiar with creating your own LangChain agent (including any databases you want to set up, which by default are expected to work on their own). This is another difference from SuperBooga: it's less an Agent-as-a-Service extension and more an API-endpoint-as-a-Service one. Basically, the only requirement is that the chain supports the "stream" and "invoke" methods to generate output.
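To make the contract concrete, here is a minimal sketch of what a file in the Agents folder could look like. All names here (the class, the `get_chain` factory) are hypothetical; the only part the extension actually relies on is that the returned object exposes `invoke()` and `stream()`. A real file would typically build the chain with LCEL primitives instead, e.g. `chain = prompt | llm | output_parser`.

```python
# Hypothetical Agents/my_agent.py -- names are illustrative, not part of
# any released extension.

class EchoChain:
    """Stand-in for an LCEL chain; a real agent would be a LangChain Runnable."""

    def invoke(self, user_input):
        # Return the full response in a single call.
        return f"Agent reply to: {user_input}"

    def stream(self, user_input):
        # Yield the response piece by piece for streaming endpoints.
        for token in self.invoke(user_input).split(" "):
            yield token + " "


def get_chain():
    # The extension would import this factory and call it at startup
    # to obtain the chain it serves.
    return EchoChain()
```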
Once the agent file is placed in the folder, the extension uses a reworked version of the OpenedAI extension to serve the LangChain agent as an endpoint.
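Roughly, serving the chain behind an OpenAI-compatible endpoint means wrapping `invoke()` output in an OpenAI-style chat completion payload. This is a hedged sketch of that translation step only (the actual extension reworks the OpenedAI serving code); `to_openai_response` and `handle_chat_request` are hypothetical helper names.

```python
# Hypothetical sketch: wrap chain output in an OpenAI-style response.
import json
import time
import uuid


def to_openai_response(text, model="langchain-agent"):
    """Shape chain output like an OpenAI chat completion response."""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:8]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "finish_reason": "stop",
            }
        ],
    }


def handle_chat_request(chain, body):
    """Take an OpenAI-style request body, run the last user message
    through the chain's invoke(), and return the shaped response."""
    user_input = body["messages"][-1]["content"]
    return to_openai_response(chain.invoke(user_input))
```

Streaming would follow the same pattern, yielding `chat.completion.chunk` objects from the chain's `stream()` instead.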
Benefits:
Opens up possibility to host/serve LangChain agents to the Chat interface
Opens up possibility for text-generation-webui to serve LangChain agents to other existing chat interfaces that already work with the OpenAI API.
Not sure if this is a capability that is of interest or not. I'm aware of LangServe, which is basically LangChain's own implementation of an API endpoint for hosting agents; however, matching the OpenAI API hasn't been done there just yet.
Is it worth releasing an extension like this?