Any plans to add Ollama to the Resources?
That seems a bit dangerous unless it's client-side only or self-hosted only. I just don't see it going well with a cloud host, but I may be overlooking something. It seems like it'd open a vulnerability on the client side, the server side, or both. For example, what if a client hosted a service that looked like an Ollama resource but actually sent malicious responses? Best case, there's no security problem, but you run into a ton of IT issues from clients using the wrong settings during the Ollama setup process (something you don't have control over).
We're tracking requests for this in our backlog, so I'll post here if our team decides to pick it up.
@Markus_von_der_Heiden I wanted to circle back here to say that our team shipped an option for custom AI models: Connect a custom AI provider | Retool Docs. Would that help your use case?
I am using self-hosted Retool, and I also have Ollama running locally. Has anyone used this combination? I see conflicting information about whether Ollama fully supports the OpenAI schema. I do see it supports chat/completions... will that be enough?
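For context, here's the kind of call I'm hoping will just work, a minimal sketch assuming Ollama's default local port (11434), its OpenAI-compatible /v1/chat/completions route, and a placeholder model name ("llama3"), so adjust for your setup:

```typescript
// Quick check that Ollama's OpenAI-compatible endpoint answers a chat request.
// Assumes Ollama is running locally on its default port (11434) and that the
// model named below has already been pulled; swap in your own model.
async function testOllamaChat(): Promise<void> {
  const response = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Ollama ignores the API key, but OpenAI-style clients often require one.
      "Authorization": "Bearer ollama",
    },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: "Say hello in one sentence." }],
    }),
  });

  if (!response.ok) {
    throw new Error(`Ollama returned ${response.status}: ${await response.text()}`);
  }

  // The reply should follow the OpenAI schema: choices[0].message.content.
  const data = await response.json();
  console.log(data.choices[0].message.content);
}

testOllamaChat().catch(console.error);
```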
Thanks for checking on this @jmjava. Apologies for the confusion!
We do allow connecting Retool AI with custom AI models now. However, we have seen users run into cert errors when trying to connect to Ollama locally, as we don't have a feature for uploading CA certs in the Retool AI resource config yet.
If you need to upload CA certs to connect to Ollama, you could technically do this in the REST API resource integration (separate from Retool AI), but it wouldn't be connected to the Retool AI features. If you go with the REST API approach and want to use it for chat, you won't be able to use our AI chat component. You'd have to build your own chat functionality with our regular component library, as sketched below.
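If anyone goes that manual route, here's a rough sketch of the core bookkeeping involved, keeping the running message history yourself, which is what the AI chat component would otherwise handle. The endpoint URL and model name are assumptions based on Ollama's OpenAI-compatible API, not a Retool feature:

```typescript
// Minimal manual chat loop: the app tracks the conversation history itself
// and resends it on every call so the model sees the full context.
// Endpoint and model name are assumptions; point them at your own setup.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const OLLAMA_URL = "http://localhost:11434/v1/chat/completions"; // assumed default
const MODEL = "llama3"; // placeholder, use whatever model you've pulled

const history: ChatMessage[] = [
  { role: "system", content: "You are a helpful assistant." },
];

async function sendMessage(userText: string): Promise<string> {
  // Append the user's turn, send the whole history, then record the reply
  // so the next call continues the same conversation.
  history.push({ role: "user", content: userText });

  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: MODEL, messages: history }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);

  const data = await res.json();
  const reply: string = data.choices[0].message.content;
  history.push({ role: "assistant", content: reply });
  return reply;
}

// Example usage: each call continues the same conversation.
sendMessage("What is Retool?").then(console.log).catch(console.error);
```

In Retool you'd wire something like this into a JS query plus a state variable holding the history, and render the transcript with regular components.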