Add LLM inference support for https://lmstudio.ai/

Implementation notes:
- src/adapters/lmstudio, with files llm.ts and chat.ts

Implementation tips:
- see src/adapters/ollama
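A rough sketch of the surface the two files might provide, assuming the adapter mirrors the Ollama one's split between a plain-completion module (llm.ts) and a chat module (chat.ts). Every name below is an illustrative placeholder, not the framework's actual base classes or API:

```typescript
// Placeholder sketch; none of these names come from the framework itself.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// llm.ts side: plain text completion against a loaded model.
interface LMStudioLLM {
  generate(prompt: string): Promise<string>;
}

// chat.ts side: multi-turn chat against the same local server.
interface LMStudioChatLLM {
  generate(messages: ChatMessage[]): Promise<ChatMessage>;
}

// A chat history can be flattened into a single prompt string when only
// the completion interface is available (a common fallback).
function messagesToPrompt(messages: ChatMessage[]): string {
  return messages.map((m) => `${m.role}: ${m.content}`).join("\n");
}
```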
Note that it's currently possible to use LM Studio with Bee through the OpenAI-compatible API (https://lmstudio.ai/docs/basics/server#openai-like-api-endpoints), in a similar way to Ollama. I haven't checked whether it has limitations compared to the SDK, however.
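For reference, the OpenAI-compatible route needs no extra SDK at all. A minimal sketch using Node's built-in fetch, assuming LM Studio's local server is running on its default port (1234) and that a model is already loaded; the model identifier is a placeholder:

```typescript
// Sketch: calling LM Studio's OpenAI-compatible /chat/completions endpoint.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

const LMSTUDIO_BASE_URL = "http://localhost:1234/v1"; // LM Studio server default

// Build the JSON body expected by the OpenAI-style chat endpoint.
function buildChatRequest(model: string, messages: ChatMessage[], stream = false) {
  return { model, messages, stream };
}

async function chat(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${LMSTUDIO_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  const data = await res.json();
  // OpenAI-style response shape: choices[0].message.content
  return data.choices[0].message.content;
}
```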
Then I wonder why their SDK (https://www.npmjs.com/package/@lmstudio/sdk) differs from the Ollama / OpenAI SDKs.