ephem.dev
Freelance developer - React Query maintainer - Occasional OSS contributor - Been putting React on servers for a decade - Father of two - Homebrewer - Stockholm ephem.dev
comment in response to post
Oh wow, I never thought I’d hear you say that, I’m happy you’ve made friends with them! I’m also happy I haven’t had to wrestle (much) with them myself. 😀
comment in response to post
Oh yeah, for sure, and I agree, I just saw an opportunity to expand some of my thinking. 😀
comment in response to post
If I say "this library has streaming support", which one am I talking about? Just `await fetch()` and passing the data down technically streams in Next. I feel our language and terms haven't quite caught up, I'd love better concise names/concepts to talk about these different kinds/things. 🤔
comment in response to post
So in everyday communication I still think it can be hard to know exactly which kind of streaming a person is talking about. All of these need different support in libs. So I try to ask "What are you trying to achieve?" to get more context, it's so easy to be talking about slightly different things. 😀
comment in response to post
The model also differs between `await prefetchQuery()` and `prefetchQuery()` and passing the promise down. Both can be streamy with RSCs, but in different ways. Passing down an AsyncIterable from RSCs is different than passing promises, and also different from the streamedQuery RQ recently released.
comment in response to post
Totally agree on the holistic part. Streaming is also things like websockets or SSE though, where the markup/RSC part might not be relevant. While it has caveats, streaming SSR (just markup) without RSCs is another kind.
comment in response to post
Maybe this is a good topic for a talk? 🤔
comment in response to post
Hehe, yeah, it gets even trickier when you introduce the kind of streaming @tkdodo.eu just introduced with support for AsyncIterable, when it's just data that streams server->client over an open socket, which has nothing to do with markup streaming.
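The data-only kind of streaming mentioned above (data flowing server → client over an open connection, no markup involved) can be modeled with a plain async generator. This is just an illustrative sketch of the shape of an AsyncIterable-based flow, not React Query's actual streamedQuery implementation; all names here are made up:

```javascript
// Toy model of data-only streaming: the "server" yields chunks over
// time, the "client" folds them into a growing array, re-rendering
// with the partial data after each chunk.

async function* serverChunks() {
  // Pretend each yield arrives over an open socket (SSE/websocket).
  yield { id: 1, text: "first" };
  yield { id: 2, text: "second" };
  yield { id: 3, text: "third" };
}

async function collectStream(iterable, onChunk) {
  const data = [];
  for await (const chunk of iterable) {
    data.push(chunk);
    onChunk(data); // a UI would re-render with the partial data here
  }
  return data;
}

const renderSizes = [];
const finalData = await collectStream(serverChunks(), (partial) =>
  renderSizes.push(partial.length)
);
```

The key property is that consumers see intermediate states (`renderSizes` grows 1, 2, 3) rather than one final payload, which is exactly what makes this a different kind of "streaming" than markup streaming.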
comment in response to post
A lot of this depends on the interplay with the framework, so it's hard to say "support streaming SSR outside of Next", because so much rests on the details. 😀
comment in response to post
You'd have to dehydrate and pass the fully fetched query data at the end of the SSR and hydrate that before the client render. I don't think any frameworks support this today, but might be possible in custom setups. A lot of the low level APIs are available in RQ to build on top of too.
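The dehydrate-at-the-end-of-SSR flow described above can be sketched with a toy cache. React Query's real equivalents are `dehydrate()`/`hydrate()` on a `QueryClient`; this minimal stand-in only illustrates the data flow, not the actual API:

```javascript
// Toy sketch: serialize every fully fetched query at the end of the
// SSR pass, ship the payload to the client, hydrate it before the
// first client render so both renders see the same data.

class ToyQueryCache {
  constructor() { this.store = new Map(); }
  set(key, data) { this.store.set(JSON.stringify(key), data); }
  get(key) { return this.store.get(JSON.stringify(key)); }
  dehydrate() {
    return JSON.stringify([...this.store]); // end of SSR pass
  }
  static hydrate(payload) {
    // Runs on the client, before the client render starts.
    const cache = new ToyQueryCache();
    for (const [k, v] of JSON.parse(payload)) cache.store.set(k, v);
    return cache;
  }
}

// "Server": fetch, render, then dehydrate.
const serverCache = new ToyQueryCache();
serverCache.set(["todos"], [{ id: 1, title: "write docs" }]);
const payload = serverCache.dehydrate();

// "Client": hydrate before rendering.
const clientCache = ToyQueryCache.hydrate(payload);
```

The constraint from the comment above falls out of this shape: the payload can only be produced once the queries are fully fetched, which is why this variant can't stream partial data down mid-render.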
comment in response to post
There is one version where you start the prefetch in the same process as the SSR pass, but before starting the render. So prefetch and render on the server would share the same queryClient. This would stream markup, but does not work with incremental hydration on the client, no way to pass promise.
comment in response to post
If it's the prefetch but don't await kind, it kind of does require Server Components as it relies on passing a promise down to both the SSR and Client pass.
comment in response to post
This is a large topic, could you describe a bit more what you are looking for Oliver? Is it the experimental streaming without prefetching, or is it the prefetch but don't await kind?
comment in response to post
I’ve promised myself to finish it during summer vacations, I can really only read that book when I have no distractions, it’s a tough one..
comment in response to post
Very strong Gödel, Escher, Bach vibes too, both the recursion themes and in style. In a good way, I loved the start of act 2. Unlike that book I actually got through this though, thank you for writing and sharing this!
comment in response to post
I enjoyed this as much as a philosophical treatise on time and space, and a reading journey, as I enjoyed the (always brilliant) technical explanations. 🫶
comment in response to post
That’s amazing, congratulations! 🫶
comment in response to post
For the uninitiated, AsyncLocalStorage is a Node API (nodejs.org/api/async_co...) that's been adopted by other server runtimes. But we really need it in the browser too, and that's where github.com/tc39/proposa... comes in. Really hoping it moves forward, it will be huge
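For anyone who hasn't used it, here's a minimal runnable demo of the API in question. `AsyncLocalStorage` lets you carry per-request context across `await` points without threading an argument through every call; the `requestId` shape here is just an example:

```javascript
// AsyncLocalStorage propagates context through async call chains.
// Whatever is passed to run() is visible via getStore() anywhere
// inside that call chain, even after awaits, and concurrent run()
// calls don't leak into each other.
import { AsyncLocalStorage } from "node:async_hooks";

const requestContext = new AsyncLocalStorage();

function currentRequestId() {
  // Reads the store of whichever run() call we're inside of.
  return requestContext.getStore()?.requestId;
}

async function handle(requestId) {
  return requestContext.run({ requestId }, async () => {
    await Promise.resolve(); // context survives the await
    return currentRequestId();
  });
}

// Two concurrent "requests" each see their own context.
const [a, b] = await Promise.all([handle("req-1"), handle("req-2")]);
```

That isolation between concurrent chains is the part that's hard to replicate in userland, and why having an equivalent in the browser would matter so much.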
comment in response to post
I agree, and there's also the practical side of it. Changing an undocumented behavior enough people rely on should likely be a major for practical reasons. Changing a documented behavior that you know no one or very few rely on might be fine to break in a minor in niche circumstances.
comment in response to post
Thanks, I will! 🫶
comment in response to post
As you say, definitely okay that it imposes constraints too, the ”how far” part is mostly a theoretical itch. 😀 Very cool that you are playing with this!
comment in response to post
For sure, hence the ”how far”. 😀 Unlike many distributed systems, this has very low latency though, and parallel actions are less likely, so other solutions than traditional might be possible. I could see how many of the async APIs could be made to wait and resolve in the shared cache instead e.g.
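One concrete version of "async APIs waiting and resolving in the shared cache" is request deduplication: concurrent reads for the same key share one in-flight promise instead of racing. This is a generic illustrative sketch, not any particular library's implementation:

```javascript
// Toy shared-cache dedupe: the first caller kicks off the fetch,
// later callers for the same key just await the same promise.

const inflight = new Map();
let fetchCount = 0;

function sharedFetch(key, fetcher) {
  if (!inflight.has(key)) {
    // Store the promise itself; clean up once it settles.
    inflight.set(key, fetcher().finally(() => inflight.delete(key)));
  }
  return inflight.get(key);
}

async function fakeNetwork() {
  fetchCount += 1; // count how often we actually hit the "network"
  await new Promise((resolve) => setTimeout(resolve, 10));
  return "payload";
}

// Two consumers (think: two tabs sharing a cache) ask at once;
// only one fetch happens.
const [x, y] = await Promise.all([
  sharedFetch("todos", fakeNetwork),
  sharedFetch("todos", fakeNetwork),
]);
```

The low-latency point above is what makes this viable: with a local shared cache, "wait for the in-flight result" is cheap, unlike in a traditionally distributed system.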
comment in response to post
Nice! That’s how I envisioned it too. I’m very curious how far it’s possible to get in userland towards just (mostly) using the normal APIs and the experience being that the shared cache ”just works” like the normal one. 🤔
comment in response to post
Very cool! Would love to see this. Have you given any thought to invalidations or updating the shared cache based on mutation responses etc?
comment in response to post
With all that context, it’s hard to misread the tweet, but without it people seem to.
comment in response to post
In the context of this discussion, let's not forget the amount of time the team invested (years?) into making that upgrade easy for anyone not wanting to opt into the new features, without compromising on the final vision. Early Concurrent Mode was so much harder to incrementally upgrade to.
comment in response to post
That is, the larger gains had not been available to Next users before, now they were. Custom frameworks might have been able to adopt at least some of the features even earlier, yay, now Next users could too.
comment in response to post
I remember it as there being a discussion/narrative at the time that the React 18 upgrade kind of had two steps. Just upgrading (easy, small gains) and starting to use the concurrent features (harder, large gains). I always read that tweet simply as excitement Next had gotten to the second part.
comment in response to post
Happy birthday! 🥳🎉
comment in response to post
So beautiful! 😍
comment in response to post
Currently reading it, nearing the end. I love to hate to love it, will take some time to sink in. The tree of dead, well, you know, is easily top 3 most disturbing things I’ve ever read. 😱 I could never have listened to this, need to look up every tenth word. 😅
comment in response to post
I'm also thinking about primary/secondary content, where both queries might live in the same component, but it might make sense just for the primary one to blow up to the boundary. Secondary could just hide that part of the content. Very possible to build well, very easy not to.
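The primary/secondary policy described above can be sketched without any framework: the primary query's failure propagates (to what would be the Suspense/error boundary), while a secondary failure just hides its widget. Names here are illustrative:

```javascript
// Framework-free sketch of a primary/secondary error policy.
// "primary" rejection is rethrown (a boundary would catch it);
// "secondary" rejection just nulls out that part of the content.

async function renderSection({ primary, secondary }) {
  const [p, s] = await Promise.allSettled([primary, secondary]);
  if (p.status === "rejected") {
    throw p.reason; // blow up to the (hypothetical) boundary
  }
  return {
    main: p.value,
    widget: s.status === "fulfilled" ? s.value : null, // hidden on error
  };
}

// Secondary fails: the page still renders, minus the widget.
const ok = await renderSection({
  primary: Promise.resolve("article"),
  secondary: Promise.reject(new Error("ads service down")),
});
```

The "very possible to build well, very easy not to" part is exactly this branching: the default (everything throws, or nothing does) is rarely the right policy per query.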
comment in response to post
In many cases having at least that would help and make sense and UX-wise it often makes sense to have a loading and an error state for some widget on the screen. In my experience, many have just a single Suspense boundary for a page too though which is kind of fine for loading, but not for errors.
comment in response to post
This is very much related to React Query (and other data fetching libraries) too and is something I've been thinking about a lot recently.
comment in response to post
Infinite spinner when the data fetch errored was and still is super common and isn't great either, but at least you could use the rest of the app. A lot of data fetching is not critical for the thing you are trying to do and having to add super granular error boundaries to preserve that is a lot.
comment in response to post
I do think this is further exacerbated by Suspense, where data fetching blows to an error boundary, where the old model would often error locally. It is safest to throw by default, but for data fetching I'm wondering if maybe that has swung the default too far away from resilience?
comment in response to post
I'm not sure if the same has turned out as well for errors. Even with a top level ErrorBoundary, it is a bit cumbersome to add more granular ones and from my experience, most apps don't add granular ones enough. I'm not saying this is a Next issue, it's easy to add error.tsx. In app code though..
comment in response to post
I think this is a great idea, many people start by tweaking what's there rather than going to the full docs. I do think there's a deeper point to reflect on too though. For loading states, it's made total sense to hoist them to the top and later add more granular. Better default than spinner hell.
comment in response to post
SUCH a miss! 😭
comment in response to post
Not saying that it's feasible to implement in RQ, or the right solution. Just trying to think about what a promise means to us and what it means to React since what the promise represents seems to be at the heart of this?
comment in response to post
Or maybe that's not the tricky part? Because we could emit as usual if the promise was stable? Just thinking out loud, but right now the promise represents "waiting for a specific queryFn to resolve", but by wrapping it, it could mean "waiting for data, however it's resolved", that is, stable.
comment in response to post
Yeah, I guess the tricky part is we can't know if that promise will be used, so we can't skip the normal emit.
comment in response to post
I guess we could technically wrap the promise with one we control the resolve of, and if setQueryData happens we immediately resolve with that data? Cumbersome though..
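The "wrap the promise with one we control the resolve of" idea looks roughly like a deferred: the wrapper normally settles with the underlying fetch, but a `setQueryData`-style call can short-circuit it. This is a thinking-out-loud sketch, not React Query internals; `setDataNow` is a made-up name:

```javascript
// Deferred wrapper: whichever settles first wins, later settles
// are ignored. The wrapped promise then represents "waiting for
// data, however it's resolved" rather than "waiting for a specific
// queryFn to resolve".

function controllablePromise(underlying) {
  let settle;
  let settled = false;
  const wrapped = new Promise((resolve) => {
    settle = (data) => {
      if (!settled) {
        settled = true;
        resolve(data);
      }
    };
  });
  underlying.then((data) => settle(data)); // normal path
  return { wrapped, setDataNow: settle };  // short-circuit path
}

const slowQueryFn = new Promise((resolve) =>
  setTimeout(() => resolve("from queryFn"), 50)
);
const { wrapped, setDataNow } = controllablePromise(slowQueryFn);
setDataNow("from setQueryData"); // resolves immediately with this data
const result = await wrapped;
```

The cumbersome part is visible even in the sketch: every handed-out promise now needs this extra bookkeeping, and the late queryFn result is silently dropped.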
comment in response to post
GRIS is a great chill platformer/puzzle game/piece of art/audiovisual experience. I guess it lacks the "keeps surprising" bit, not that kind of storytelling, but is really immersive and has its own unique vibe I think you'll like. Quite short so not a huge commitment.
comment in response to post
Thank YOU for your hard work! ❤️