Thanks to everyone who joined us live for Day 1 of AI Build Week — our Prompt Arena Showdown!
We explored how to compare LLM outputs and write smarter prompts, and y’all asked some great questions. Here’s a quick recap of the top questions (and answers):
Prompting & LLM Access
Q: Do I need to bring my own API keys to access different LLMs in Retool?
A: You’ll have access to default models in Retool AI, but if you want to use your own OpenAI, Anthropic, or other API keys, you can configure them directly.
Q: Are there any extra costs when using AI in Retool self-hosted?
A: If you're self-hosting and using your own model keys, charges will come directly from your LLM provider — Retool doesn’t add extra fees for that.
Q: Will we be discussing specific prompts in the demo?
A: Yep — we showcased prompt construction live, and you can rewatch the session anytime above!
External APIs & Integration
Q: Can I integrate external APIs with these workflows?
A: Absolutely. Whether you’re calling third-party services, webhooks, or internal APIs, you can integrate them directly into your workflows or app logic.
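To make that concrete, here’s a minimal sketch of wrapping a third-party REST API call so it can be reused from app or workflow logic. The endpoint URL, path, and API key below are hypothetical placeholders, not a specific Retool API — swap in your own service’s details.

```javascript
// Hypothetical sketch of calling a third-party API from workflow logic.
// buildRequest is a pure helper (easy to inspect and test);
// callApi performs the actual HTTP request with fetch.
function buildRequest(baseUrl, path, apiKey) {
  return {
    url: new URL(path, baseUrl).toString(),
    options: {
      headers: { Authorization: `Bearer ${apiKey}` },
    },
  };
}

async function callApi(baseUrl, path, apiKey) {
  const { url, options } = buildRequest(baseUrl, path, apiKey);
  const resp = await fetch(url, options);
  if (!resp.ok) {
    throw new Error(`API request failed with status ${resp.status}`);
  }
  return resp.json();
}
```

Keeping the request construction separate from the network call makes it easier to log, debug, and reuse the same integration across multiple apps or workflows.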
Q: Can I make a chatbot built in Retool public?
A: Yes — Retool apps can be shared publicly, but be sure to configure appropriate permissions and security based on the data and features you expose.
Learning Retool & Sharing Takeaways
Q: Is there a certification or course I can take to learn more?
A: Check out Retool University for guided tutorials and educational content to help you get hands-on.
Q: Will these sessions be recorded?
A: Yep — all session recordings are available in the AI Build Week Community Hub. You can also rewind and rewatch anytime.
Let us know your top takeaway from Day 1 — or share how you’re thinking about building with LLMs!
Up next: Day 2 – RAG + Vectors