Using OpenAI resource with Function calls

Looks like the latest version of Retool offers support for the latest OpenAI models and, with that, function calls. (Will Retool Update the Models Available from OpenAI)

However, with function calls, you pretty much have to be able to modify the message list (append a new message containing the function's result, using the "function" role) and re-run the query with that additional context.

Currently, I'm not sure this is possible. The OpenAI Query Builder in Retool exposes the ability to add an arbitrary number of specific messages (which can be referenced with variables), but when using the query in my Retool code, there doesn't seem to be a way for me to arbitrarily add a message and re-trigger the request.

For reference, this is how function calling works in OpenAI:

  1. Set up your OpenAI request with a set of function definitions.
  2. Make a prompt request that implicitly calls for one of your functions.
  3. OpenAI returns that it thinks you need to call a specific function.
  4. We call that function in our own code and append its result to the messages as a new message with the "function" role.
  5. OpenAI uses that context to formulate another answer.
  6. We might need to make another request depending on the response.
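The loop above can be sketched roughly as follows. This is a hedged illustration, not Retool or OpenAI SDK code: `fake_openai` and `get_weather` are hypothetical stand-ins so the control flow (call, append result with the "function" role, re-run) is visible end to end.

```python
import json

def get_weather(location):
    # Hypothetical stand-in for a real function the model can request.
    return {"location": location, "temperature_c": 21}

AVAILABLE_FUNCTIONS = {"get_weather": get_weather}

def run_conversation(call_openai, messages, functions):
    """Loop until the model returns a plain answer instead of a function call."""
    while True:
        reply = call_openai(messages=messages, functions=functions)
        if "function_call" not in reply:
            # Steps 5-6: the model answered using the appended context.
            return reply["content"], messages
        # Step 3: the model says it wants a specific function called.
        name = reply["function_call"]["name"]
        args = json.loads(reply["function_call"]["arguments"])
        # Step 4: call the function ourselves and append its result
        # as a new message with the "function" role.
        result = AVAILABLE_FUNCTIONS[name](**args)
        messages.append(reply)
        messages.append({"role": "function", "name": name,
                         "content": json.dumps(result)})

def fake_openai(messages, functions):
    # Toy stand-in for the API so the sketch is runnable offline:
    # first turn requests a function, second turn uses the result.
    if messages[-1].get("role") == "function":
        data = json.loads(messages[-1]["content"])
        return {"role": "assistant",
                "content": f"It is {data['temperature_c']} C in {data['location']}."}
    return {"role": "assistant", "content": None,
            "function_call": {"name": "get_weather",
                              "arguments": '{"location": "Oslo"}'}}

answer, history = run_conversation(
    fake_openai,
    [{"role": "user", "content": "Weather in Oslo?"}],
    functions=[{"name": "get_weather"}])
```

The key point for Retool is the `messages.append(...)` step followed by re-running the query, which is exactly what the Query Builder doesn't seem to expose.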

The obvious workaround right now would be to not use the OpenAI resource and instead go with the REST API resource.
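With a generic REST resource you control the request body directly, so you can append function-result messages before re-triggering. A sketch of what that body might look like, under the assumption of the Chat Completions endpoint (the model name and `get_weather` schema are just examples):

```python
import json

# Illustrative Chat Completions request body, as you might build it
# for a generic REST API resource. The function schema is an example.
payload = {
    "model": "gpt-3.5-turbo-0613",
    "messages": [
        {"role": "user", "content": "What's the weather in Oslo?"},
        # After a function_call response, you would append here:
        # {"role": "function", "name": "get_weather", "content": "..."}
    ],
    "functions": [{
        "name": "get_weather",
        "description": "Get the current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    }],
}
body = json.dumps(payload)
```

Since the body is just JSON you assemble yourself, appending a "function" message and re-running the REST query is straightforward.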


Hey @Chris_Gat! Thanks for posting your thoughts here, along with your potential workaround, for anyone dealing with the same issue who comes across this thread. It looks like functions aren't really supported at the moment, though the AI team does have them on their radar.

If they do end up being supported, I can report back here!
