Improved 'o1' model detection in OpenAI client and updated documentation for error handling and client setup (#1290)

- **Behavior**:
  - Improved detection of 'o1' models in `is_o1_model()` in `openai.rs` to include exact 'o1' matches.
- **Documentation**:
  - Added `BamlClientFinishReasonError` handling in `error-handling.mdx`.
  - Updated various provider documentation files (e.g., `anthropic.mdx`, `aws-bedrock.mdx`) to include the `finish-reason.mdx` snippet.
  - Added `enablePlaygroundProxy.mdx` for VSCode extension settings.
  - Fixed `ClientRegistry` docs.
Showing 16 changed files with 178 additions and 91 deletions.
From the new `litellm` provider doc page:
---
title: litellm
---

[LiteLLM](https://www.litellm.ai/) supports the OpenAI client, allowing you to use the
[`openai-generic`](/ref/llm-client-providers/openai-generic) provider with an
overridden `base_url`.

See [OpenAI Generic](/ref/llm-client-providers/openai-generic) for more details about parameters.

## Set up

1. Set up a [LiteLLM Proxy server](https://docs.litellm.ai/docs/proxy/docker_quick_start#21-start-proxy)

2. Set up the LiteLLM client in your BAML files

3. Use it in a BAML function! (See the sketch after the client definition below.)

```baml BAML
client<llm> MyClient {
  provider "openai-generic"
  options {
    base_url "http://0.0.0.0:4000"
    api_key env.LITELLM_API_KEY
    model "gpt-4o"
  }
}
```
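For step 3, here is a minimal sketch of a BAML function that uses this client. The function name, signature, and prompt are illustrative assumptions, not part of this commit:

```baml BAML
// Hypothetical example: any BAML function can reference
// MyClient by name in its `client` field.
function SummarizeText(input: string) -> string {
  client MyClient
  prompt #"
    Summarize the following text in one sentence:
    {{ input }}
  "#
}
```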
From `enablePlaygroundProxy.mdx` (VSCode extension settings):
| Type | Default |
| --- | --- |
| `boolean \| null` | `true` |

<Tip>
When running VSCode from a remote machine, you likely need to set this to `false`.
</Tip>

Many LLM providers don't accept requests from the browser. This setting enables a proxy that runs in the background and forwards requests to the LLM provider.

## Usage

```json settings.json
{
  "baml.enablePlaygroundProxy": false
}
```