I'm testing out the new Kafka connections and have one attached to a sample/default topic from Confluent's hosted environment that streams stock transactions. I'm setting up a consumer query to read from it.
I keep getting this error as soon as messages arrive:
"Error deserializing key/value for partition demo-0 at offset 0. If needed, please seek past the record to continue consumption."
There are a few challenges here:
- It's difficult to get at more detailed logs to understand what else might be happening under the hood, what's actually arriving from the Confluent server, and so on.
- It seems to choke before it ever reaches the transform option. I tried returning an empty array and simply logging the data, so the failure is happening earlier, in the deserialization phase (see the raw-bytes sketch after this list).
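To see what's actually on the wire, I've been thinking about pointing a bare consumer at the topic with `ByteArrayDeserializer`, which can't fail, and inspecting the first bytes. My hunch is a serializer mismatch: as far as I know, Confluent's datagen samples are Avro-encoded, and Schema Registry framing starts with a 0x00 magic byte plus a 4-byte schema ID, which a plain string/JSON deserializer would choke on in exactly this way. A rough sketch with the plain Java client (broker, credentials, and topic name are placeholders, not my real config):

```java
import java.nio.ByteBuffer;
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class RawBytesProbe {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "<bootstrap-server>:9092"); // placeholder
        props.put("group.id", "raw-bytes-probe");
        props.put("auto.offset.reset", "earliest");
        // ByteArrayDeserializer never throws, so we always get to see the payload.
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        // Confluent Cloud auth (placeholders):
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"<api-key>\" password=\"<api-secret>\";");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("<topic>")); // placeholder topic name
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(10));
            for (ConsumerRecord<byte[], byte[]> rec : records) {
                byte[] v = rec.value();
                if (v != null && v.length >= 5 && v[0] == 0x00) {
                    // Confluent Schema Registry wire format: magic byte 0x00,
                    // then a big-endian 4-byte schema ID, then the encoded payload.
                    int schemaId = ByteBuffer.wrap(v, 1, 4).getInt();
                    System.out.printf("offset %d: Schema Registry framed, schema id %d%n",
                            rec.offset(), schemaId);
                } else {
                    System.out.printf("offset %d: %s%n", rec.offset(),
                            v == null ? "null" : new String(v));
                }
            }
        }
    }
}
```

If the first byte comes back as 0x00, that would confirm the payload is Schema Registry-framed and what's needed is an Avro-aware deserializer on the consuming side, not a transform.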
There's a fair amount of nuance here between two systems in motion, but maybe someone here has an idea of what's happening, how to debug this better, etc. The next option I'd imagine is a transform on Kafka's side, but that has a non-zero cost: it means standing up additional resources to reshape the data before it's sent, and in a real-world scenario I don't know that a client would give me permission to set up a new transformer.
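For completeness, the error message itself suggests seeking past the bad record. If this were a one-off poison record rather than a systematic mismatch, a hand-rolled consumer could catch the deserialization error and skip it; a minimal sketch with the Java client (assuming an already-configured, subscribed consumer, which the tool may not expose at all):

```java
import java.time.Duration;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.errors.RecordDeserializationException;

public class SkipPoisonRecords {
    // Poll loop that seeks past records the configured deserializer can't decode,
    // instead of failing on every poll at the same offset.
    static void consume(KafkaConsumer<String, String> consumer) {
        while (true) {
            try {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> rec : records) {
                    System.out.printf("offset %d: %s%n", rec.offset(), rec.value());
                }
            } catch (RecordDeserializationException e) {
                // The exception carries the exact partition/offset that failed;
                // move one past it, as the error message advises.
                System.err.printf("Skipping undecodable record at %s offset %d%n",
                        e.topicPartition(), e.offset());
                consumer.seek(e.topicPartition(), e.offset() + 1);
            }
        }
    }
}
```

Of course, if every record fails the same way (which seems to be the case here, since it dies at offset 0), skipping just discards the whole stream, which points back to fixing the deserializer rather than transforming on the producer side.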