Thanks for all the work on RetoolAI. It would be great if you could add Ollama support, which would allow testing of open-source LLMs.
Hey @nsteblay, thanks for the request. I've logged this internally, and can update this post as we hear any additional information!
Since Ollama now reportedly exposes an OpenAI-compatible API, could you not just expose the OpenAI base-URL configuration so we can point it at an Ollama instance instead?
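For context, Ollama's OpenAI-compatible endpoint is served under `/v1` on its local server (port 11434 by default), so any OpenAI client that allows overriding the base URL can target it. A minimal sketch of what such a request would look like, using only the Python standard library (the base URL and the model name `llama3` are assumptions; use whatever model you've pulled locally):

```python
import json
import urllib.request

# Assumed defaults: Ollama listening on localhost:11434 with its
# OpenAI-compatible API under /v1.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

payload = {
    "model": "llama3",  # example: any model pulled via `ollama pull`
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    f"{OLLAMA_BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req) would send the request;
# it requires a running Ollama server to succeed.
```

This is the same request shape the OpenAI chat-completions endpoint accepts, which is why simply making the base URL configurable would be enough to point an OpenAI integration at Ollama.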