Local LLM as resource

Any plans to add Ollama to the Resources?

That seems a bit risky unless it's client-side only or self-hosted only. I don't see it working well with a cloud host, though I may be overlooking something. It feels like it could open a vulnerability on the client side, the server side, or both. For example, what if a client hosted a service that presented itself as an Ollama resource but actually returned malicious responses? Best case, there's no security problem, but you still run into a ton of IT issues from clients misconfiguring Ollama during setup (something you have no control over).
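One way to enforce the "client-side only" constraint mentioned above would be to accept only loopback endpoints for the Ollama base URL (Ollama's default is `http://localhost:11434`). A minimal sketch, assuming a hypothetical validation helper on the client; this is not part of any existing product:

```python
from urllib.parse import urlparse
import ipaddress

def is_local_ollama_url(url: str) -> bool:
    """Accept the configured Ollama endpoint only if it points at the
    local machine, so the client never talks to a remote service
    masquerading as an Ollama resource."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        return False
    host = parsed.hostname
    if host is None:
        return False
    if host == "localhost":
        return True
    try:
        # Covers 127.0.0.0/8 and ::1.
        return ipaddress.ip_address(host).is_loopback
    except ValueError:
        # Non-loopback hostnames are rejected outright.
        return False
```

This would not solve misconfiguration issues, but it at least keeps the attack surface on the user's own machine.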


We're tracking requests for this in our backlog, so I'll post here if our team decides to pick it up.