Yep, exactly - Gabe has also been thinking about providing per-user signed URLs for task executions so clients can subscribe more easily without a long-lived token. So basically, you'd start the workflow from your API and pass the signed URL back to the client, and we'd provide a React hook that picks up task updates automatically. We'll need this ourselves once we open our cloud instance up to self-serve, since we want to provision separate queues per user (with a Hatchet workflow, of course).
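Something like this on the client side, assuming the signed URL ends up serving server-sent events - the transport and the hook name are my guesses, since the feature is still being designed:

```tsx
import { useEffect, useState } from "react";

// Illustrative hook, not a real Hatchet export: subscribes to a run's
// updates over a per-user signed URL (assumed here to serve SSE).
function useTaskUpdates(streamUrl: string): string[] {
  const [events, setEvents] = useState<string[]>([]);

  useEffect(() => {
    // No auth header needed: the capability is embedded in the signed URL.
    const source = new EventSource(streamUrl);
    source.onmessage = (e) => setEvents((prev) => [...prev, e.data]);
    return () => source.close();
  }, [streamUrl]);

  return events;
}
```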
It's dead simple: the existence of the URI means the topic/channel/what-have-you exists; to access it, you only need to know the URI; data is streamed, with no access to old data; and multiple consumers are no problem.
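A minimal sketch of those semantics in TypeScript (Deno), just to make them concrete: knowing the URI is the whole capability, messages fan out to whoever is connected at that moment, and nothing is replayed to late joiners. Illustrative only, not how Hatchet implements it:

```ts
const channels = new Map<string, Set<ReadableStreamDefaultController<Uint8Array>>>();
const encoder = new TextEncoder();

Deno.serve(async (req) => {
  const { pathname } = new URL(req.url);
  // The URI *is* the channel: touching it lazily brings it into existence.
  const subs = channels.get(pathname) ?? new Set();
  channels.set(pathname, subs);

  if (req.method === "POST") {
    // Publish: fan out to current subscribers only; no history is kept.
    const data = await req.text();
    for (const sub of subs) {
      try {
        sub.enqueue(encoder.encode(`data: ${data}\n\n`));
      } catch {
        subs.delete(sub); // consumer went away
      }
    }
    return new Response("ok");
  }

  // Subscribe: any number of consumers can hold the same URI open.
  let ctrl: ReadableStreamDefaultController<Uint8Array>;
  const body = new ReadableStream<Uint8Array>({
    start(c) {
      ctrl = c;
      subs.add(c);
    },
    cancel() {
      subs.delete(ctrl);
    },
  });
  return new Response(body, {
    headers: { "content-type": "text/event-stream" },
  });
});
```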
I love the simplicity & approachability of Deno Queues, for example, but I'd need to roll my own way to subscribe to task status from the client.
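For what it's worth, rolling it yourself on top of Deno KV queues looks roughly like this: the queue consumer records status under a KV key, and an SSE endpoint streams changes to the browser via kv.watch(). The key layout and task shape are made up for illustration:

```ts
const kv = await Deno.openKv();

// Worker side: consume queue messages and record task status in KV.
kv.listenQueue(async (msg) => {
  const { taskId } = msg as { taskId: string };
  await kv.set(["tasks", taskId], { status: "running" });
  // ... do the actual work here ...
  await kv.set(["tasks", taskId], { status: "done" });
});

// Client-facing side: stream status changes for one task as SSE.
Deno.serve((req) => {
  const taskId = new URL(req.url).searchParams.get("taskId") ?? "";
  const body = kv
    .watch([["tasks", taskId]])
    .pipeThrough(
      new TransformStream<Deno.KvEntryMaybe<unknown>[], string>({
        transform([entry], controller) {
          controller.enqueue(`data: ${JSON.stringify(entry.value)}\n\n`);
        },
      }),
    )
    .pipeThrough(new TextEncoderStream());
  return new Response(body, {
    headers: { "content-type": "text/event-stream" },
  });
});
```

The browser side is then just `new EventSource("/status?taskId=...")`.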
Wondering if the Postgres underpinnings here would make that possible.
EDIT: seems so! https://docs.hatchet.run/home/features/streaming