I am starting a new thread, since this one was closed, as I am having the same issue.
It is easily re-producible too. All I did was drop the default chat component and let it create the default query and wiring. Type hi
and get two replies
i have the same issue
I don't think this is something we can fix; it seems more like a backend problem. As a band-aid, though, you can add a transformer to the query with `return (chat1.data[chat1.data.length - 1].sender === chat1.data[chat1.data.length - 2].sender ? chat1.data.slice(0, -1) : data)`
The ternary check is in case this gets fixed without you knowing, so you don't have to remember to remove the transformer later on.
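To make the logic of that transformer concrete, here's a standalone sketch in plain JS. The `{ sender, message }` shape and the `dropDuplicateReply` name are illustrative assumptions, not Retool APIs; the idea is just: if the last two entries came from the same sender, drop the final one.

```javascript
// Illustrative sketch of the transformer's dedupe logic (not a Retool API).
// Assumes messages is an array of { sender, message } objects.
function dropDuplicateReply(messages) {
  const len = messages.length;
  // If the last two entries share a sender, assume the second is the
  // duplicated reply and drop it; otherwise return the list unchanged.
  if (len >= 2 && messages[len - 1].sender === messages[len - 2].sender) {
    return messages.slice(0, -1);
  }
  return messages;
}

// Example: a doubled assistant reply gets trimmed.
const doubled = [
  { sender: "user", message: "hi" },
  { sender: "assistant", message: "Hello!" },
  { sender: "assistant", message: "Hello!" },
];
console.log(dropDuplicateReply(doubled).length); // 2
```

Once the backend stops double-sending, the `sender` comparison fails and the history passes through untouched, which is why the ternary is safe to leave in place.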
Won't the message history start to inflate with double replies? I get that it would fix the display of the chat, but like you said, it's still a band-aid on the backend.
Ya, unfortunately that's not as easy of a fix. The AI Chat component gives you access to the message history, but in workflows it has to be supplied. It is still possible; here's how I solved it:
`user_id int UNIQUE {{ startTrigger.data.user_id }}`
Or, if you're using that component from the pic in your original post, you set Message History to `{{ chat1.messageHistory.length % 2 ? chat1.messageHistory.slice(0, -1) : chat1.messageHistory }}`
It's not as versatile as the transformer suggestion I made, but it will remove the last message from the history whenever there is an odd number of messages in the message list. I can't remember if the newest message sent by the user is added to the `.data` before or after sending the message, so you might need to change the check to `chat1.messageHistory.length % 2 === 0`
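As a standalone sketch of what that expression does (plain JS; the `trimHistory` name is mine, and the string history is a simplification of the real message objects): user/assistant turns normally come in pairs, so an odd-length history suggests a duplicated reply at the end.

```javascript
// Illustrative sketch of the Message History expression: if the history
// has an odd number of entries, assume the last reply was duplicated and
// drop the final message; otherwise pass the history through unchanged.
function trimHistory(history) {
  return history.length % 2 ? history.slice(0, -1) : history;
}

// Odd length (a doubled reply) gets trimmed back to a clean pair.
const history = ["hi", "Hello!", "Hello!"];
console.log(trimHistory(history)); // ["hi", "Hello!"]
```

Note the caveat above: if the component appends the user's newest message to the history before the query runs, the parity flips and the check would need to be `length % 2 === 0` instead.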
I think I am going to hold off for now on "chatting" and stick to just text generation, as I want to summarize something. Interactive would be better, but for now, a single shot summary will work for me.
Hi @khill-fbmc, thanks for resurfacing this! I still haven't been able to reproduce this issue, but we'll flag it to the team internally for a fix.
Curious to hear more about your summary project if you're able to participate in the AI Challenge!
Hi, I'm new here, so sorry if I missed it... any update on this issue?
In my case the issue exists when chat response streaming (Advanced -> "Stream response data and update app model incrementally") is checked.
It would be great to get this resolved, given it is key for chat UX, especially when responses are longer.
Thanks, cheers.
Any update on this issue Tess? It is happening for me as well now.
Hi,
I too am facing the same issue. Getting multiple replies to my one chat message.
Can anyone help resolve?
Thanks,
Hi everyone! Thanks for flagging. Is anyone able to share an app export, or screenshots of their setup & an example chat?
Is this happening consistently for every chat message?
I am experiencing the double replies in User mode, but not in Preview or Edit mode (there I only get single replies). I made a new app and put in the standard chat component (no changes from defaults).
That's super helpful, @Aaron_Berdanier
I'm currently able to reproduce it & will flag to our team for a fix
FYI I was able to find a workaround by turning off streaming on the chat. In the Advanced tab uncheck "Stream response ..."