Integrating a pre-built OpenAI Assistant into the AI Chat component

Hello. I have a chatbot built using OpenAI Assistants. I want to integrate it into the Retool Chat component. Any ideas on how to go about this?

Hello @OlaOye1,

Thank you for your question.

I might suggest rebuilding your chatbot within Retool using our pre-built AI components.

Here is a quick Loom video walkthrough.

Additionally, you can leverage Workflows or other features to add your own data sources that are more specific to your use case than a general-purpose model like GPT-4.

Hope this helps.

-Brett

Just to give some added context to this question: the term "pre-built AI assistant" is vague and should first be clarified.

A "chatbot" has two components: the LLM, which receives inputs and responds to users, and the component/medium through which users query the LLM.

As shown in the video, Retool offers a Chat component that takes in text and outputs a response from an LLM. You specify the LLM in the Retool AI query that accompanies the creation of a Chat component.

If you are using your own model rather than a major provider's LLM, check out our docs here on how to set that up.

This is how you would integrate an LLM into Retool that is not already a pre-selectable option in the AI query's shortlist of common LLMs.

If you built a chatbot that is analogous to our Chat component, you could use a Retool Custom Component to recreate that intermediary between the user and the LLM. Check out our docs here for the options and for setting up a custom component.