What happened?
Some users may have run into problems with the CrewAI project and, as a result, discovered LiteLLM indirectly. They then hit issues when using various models with it.
I've noticed that several Chinese model providers do not appear in the Providers list, and multiple issues have been raised about this. In fact, if the model you are using supports OpenAI-compatible endpoints, the author has already paved the way for you. You can use your model as described here:
https://docs.litellm.ai/docs/providers/openai_compatible
Relevant log output
Using Qwen as an example:
As you can see, it works.
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.53.1
Twitter / LinkedIn details
No response