Currently Cria isn't really meant for online models, so this is not supported.
If people are interested, though, it could be implemented.
The closest thing available now would be running an Ollama instance on a server somewhere, and that is supported. To use a host other than localhost, create an environment variable called OLLAMA_HOST and set it to your server's URL.
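As a minimal sketch of the environment-variable approach (the server URL below is a placeholder, not an address from this thread), setting OLLAMA_HOST from Python before any model is opened could look like this:

```python
import os

# OLLAMA_HOST tells the underlying ollama client which server to talk to
# instead of the default localhost:11434. The address here is a
# placeholder -- replace it with your own server's URL.
os.environ["OLLAMA_HOST"] = "http://203.0.113.10:11434"

print(os.environ["OLLAMA_HOST"])
```

Equivalently, you could `export OLLAMA_HOST=http://203.0.113.10:11434` in the shell before launching your script; the key point is that the variable must be set before the client connects.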
I should note that I haven't tested this, but it should work. I will work on adding native support for this later, so that it will be possible to pass the URL to a Cria instance directly instead of using environment variables.
Until then, I would need more information about your setup in order to help with your use case.
If my `llama3:8b` model is hosted on a remote server somewhere in the AWS cloud, how do I pass it in to this?