devagr.bsky.social
He/Him
Content Creator, Software Architect
Core team @solidjs.com
Organizer @momentumdevcon.com
youtube.com/@devagr
twitch.tv/devagrawal09
302 posts
4,454 followers
906 following
Regular Contributor
Active Commenter
comment in response to
post
congrats on finally shipping this!
comment in response to
post
if Meta's usage of React has led to certain heuristics being developed to prevent over-fetching, it makes sense that you'd want to offer those heuristics as a first class citizen within React, and you might also think that the broader ecosystem benefits from the same things.
comment in response to
post
i might be totally off-base here, but it's possible that the opinions around data fetching with React are a consequence of how React gets used within Meta (and possibly Vercel), rather than what the general ecosystem is doing.
comment in response to
post
it's not a question of capability. both solid and svelte could easily build a temporary cache of sorts into their derived primitives that holds previous values for a short time in case the dependencies revert.
it just sounds like a terrible api to offer developers.
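to make concrete what kind of api is being rejected here: a derived value that keeps its stale result around for a grace window, in case the dependencies flip back. this is a hypothetical sketch — every name is made up, and neither solid nor svelte ships anything like it:

```typescript
// Hypothetical sketch of a derived-value cache that keeps stale
// entries around briefly in case dependencies revert. All names are
// invented for illustration; no framework ships this API.
type Clock = () => number;

class RevertCache<K, V> {
  private entries = new Map<K, { value: V; expiresAt: number }>();
  constructor(private ttlMs: number, private now: Clock = Date.now) {}

  // Return the cached value if its grace window hasn't lapsed,
  // otherwise recompute and restart the window.
  get(key: K, compute: () => V): V {
    const hit = this.entries.get(key);
    if (hit && hit.expiresAt > this.now()) return hit.value;
    const value = compute();
    this.entries.set(key, { value, expiresAt: this.now() + this.ttlMs });
    return value;
  }
}
```

the awkward part is visible right in the sketch: within the window the ui can show a value whose dependencies have already changed, which is exactly the consistency problem described above.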
comment in response to
post
you can't decide what requests are unnecessary as someone whose responsibility is to show data in the UI that is consistent with its dependencies
you can offer heuristics, but they have to be wrapped in an opt-in api, not the default that has to be opted out of
comment in response to
post
out of the box caching solutions make it difficult for the ecosystem to build solutions that suit specific use cases. there's no cache strategy that suits everyone.
even something as simple as "keep the last piece of data around for 30 seconds" leads to super annoying experiences
comment in response to
post
i don't see why caching data that is no longer needed by the UI is something rendering frameworks should concern themselves with
or have you forgotten the biggest complaints people had with nextjs app router for the longest time, until dynamicIO?
comment in response to
post
besides - we've already shown that you can achieve this without a compiler, purely at runtime
stackblitz.com/edit/github-...
comment in response to
post
i would like hoisting/prefetching data to be an optimization, not a requirement
it's much easier to simply prefetch the bundles so that data fetching can be kicked off for most of the page instantly
when combined with offscreen rendering, waterfalls become a much smaller problem
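the "prefetch the bundles so data fetching can kick off instantly" idea can be sketched in a few lines: fire the code import and the data fetch at the same time instead of waiting for the code first. `loadComponent`/`loadData` are hypothetical stand-ins for a dynamic `import()` and a route's data loader:

```typescript
// Sketch: start the bundle import and the data fetch together so the
// fetch doesn't have to wait on the code arriving first.
// loadComponent/loadData are hypothetical stand-ins.
async function loadRoute<C, D>(
  loadComponent: () => Promise<C>,
  loadData: () => Promise<D>,
): Promise<{ component: C; data: D }> {
  // Both calls fire immediately; neither waits on the other.
  const [component, data] = await Promise.all([loadComponent(), loadData()]);
  return { component, data };
}
```

the point is only the shape: because both thunks are invoked before the `await`, the network waterfall between code and data disappears.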
comment in response to
post
without any hoisting/prefetching this provides the best possible scenario
there's a decent number of apps that don't bother hoisting manually because it's too brittle, and RSC/GraphQL solutions don't work for their use case
if we can offer them the best case scenario i don't see why we shouldn't
comment in response to
post
nice! excited to see what this ends up looking like
comment in response to
post
the wisemonkeys are not nested components, they are sibling components
even react can render siblings in parallel
nested but parallel is the real deal
comment in response to
post
only if you put the await in the template
if you put it in the code then it blocks everything under it
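the blocking behavior described here is plain async-function semantics, so a framework-free sketch shows it. `fetchA`/`fetchB` are hypothetical async sources:

```typescript
// An await in straight-line code blocks everything after it; starting
// both promises before awaiting lets them run in parallel.
// fetchA/fetchB are hypothetical async sources.
async function waterfall(fetchA: () => Promise<string>, fetchB: () => Promise<string>) {
  const a = await fetchA(); // everything below waits here
  const b = await fetchB(); // only starts after a resolves
  return [a, b];
}

async function parallel(fetchA: () => Promise<string>, fetchB: () => Promise<string>) {
  return Promise.all([fetchA(), fetchB()]); // both kick off immediately
}
```

a template-level await can compile to the second form per expression, which is why where you put the await matters so much.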
comment in response to
post
(given that the fetches aren't directly dependent on each other for data and they can happen in parallel)
comment in response to
post
can the compiler ensure nested components also fetch in parallel to their parents?
comment in response to
post
for example - nested components fetching data in parallel without unnecessary waterfalls by default
stackblitz.com/edit/github-...
comment in response to
post
> so far from representative real-world use cases
Rendering an array into a list/table is far from real world use cases?
comment in response to
post
Funnily enough wasm frameworks generally score worse than js frameworks in the benchmarks
comment in response to
post
The best part about the benchmarks is finding frameworks that push you towards good patterns AND are also super performant so I can spend more time building features and less time looking for regressions
comment in response to
post
It’s literally testing the performance of rendering an array into a table/list
Do you not have that in your app? I have like 20 of them in a normal app
It’s a big and general enough task to not qualify as a microbenchmark
comment in response to
post
if you think this is a microbenchmark you're ngmi
comment in response to
post
i have to admit that i was skeptical about @devagr.bsky.social's talk about fine-grained rendering and sync engines, but he tied everything together really well. if you really care about the fastest possible performance, local-first data and fine-grained rendering are probably the way to go.
comment in response to
post
Yes you can check out seroval by @lxsmnsyc.bsky.social that does this and more, and is used within solidstart and soon tanstack start
github.com/lxsmnsyc/ser...
comment in response to
post
1. Effects run when at least one of the dependencies changes. Are you noticing unexpected behavior in any cases? Sounds like it could be a bug.
2. As in cases where dependencies are dynamic? And how does that affect the non-dynamic case exactly?
3. We have onCleanup, not sure what you mean exactly
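for point 3, the onCleanup contract can be illustrated without solid itself: callbacks registered during a run are flushed before the effect re-executes and again on disposal. this is a minimal sketch of that contract, not solid's actual implementation:

```typescript
// Minimal sketch of the onCleanup contract: cleanups registered during
// a run fire before the next run and on dispose. Illustration only,
// not Solid's implementation.
type Cleanup = () => void;

function createRerunnableEffect(fn: (onCleanup: (cb: Cleanup) => void) => void) {
  let cleanups: Cleanup[] = [];
  const run = () => {
    cleanups.forEach((c) => c()); // flush previous run's cleanups first
    cleanups = [];
    fn((cb) => cleanups.push(cb));
  };
  run();
  return {
    rerun: run,
    dispose: () => { cleanups.forEach((c) => c()); cleanups = []; },
  };
}
```

in solid the re-run is triggered by dependency changes rather than a manual `rerun`, but the ordering guarantee is the same.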
comment in response to
post
What issues are you running into right now?
createEffect is supposed to be the best place for this stuff, would love to hear why that hasn’t been working for you
comment in response to
post
I appreciate the @
comment in response to
post
When you say “mutations”, do you mean mutations to signals in the local state? Or server mutations?
Are you writing to signals inside createEffect?
If the user interaction is not a DOM event, what is it? Got some examples?
comment in response to
post
Correct, createEffect isn’t needed there, the props object doesn’t get reassigned.
The yellow squiggly is because you’re not reading this.props anywhere, only assigning
comment in response to
post
Madlad
comment in response to
post
so it becomes a matter of familiarity. are you more familiar with solving waterfalls or server caching? and which one is a bigger problem in your context? is it possible to provision server infrastructure so that we can continue writing servers that fetch everything all the time?
comment in response to
post
with server functions, you can reduce waterfalls
1. use client loaders to fetch in parallel with the code
2. find and combine dependent queries into a single server function
with server components, you can reduce refetching by caching on the server
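point 2 above — combining dependent queries into a single server function — can be sketched generically. `getUser`/`getPostsByAuthor` are hypothetical server-side accessors, not any real API:

```typescript
// Sketch: two dependent reads combined into one server function, so
// the client pays one round trip instead of two sequential ones.
// getUser/getPostsByAuthor are hypothetical server-side accessors.
interface User { id: string; name: string }
interface Post { id: string; authorId: string }

async function getUserWithPosts(
  userId: string,
  getUser: (id: string) => Promise<User>,
  getPostsByAuthor: (id: string) => Promise<Post[]>,
) {
  const user = await getUser(userId);            // first query
  const posts = await getPostsByAuthor(user.id); // depends on the first
  return { user, posts };                        // one payload over the wire
}
```

the waterfall between the two reads still exists, but it now happens server-side next to the data, where each hop is cheap, instead of across the client-server boundary.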
comment in response to
post
the base case with server components is the opposite - no waterfalls, but everything is refetched.
neither of these are really attractive default options, so the question is
- what's a better default for a given situation
- how much work goes into addressing the weaknesses of the chosen model
comment in response to
post
this means the base case with server functions (without client loaders) is that while there might be some client-server waterfalls, revalidation is always granular by default, and there's potential for things like single flight mutations based on cache keys
comment in response to
post
empowering devs to colocate server logic with components removes the ergonomic barrier to hand-writing efficient data queries that return exactly the data the UI needs at any point in time
comment in response to
post
you're right, they don't influence the fundamental capabilities, but one of the problems that led to overfetching/underfetching is that when we change the UI, we don't have an easy way to also change the data fetching, because it's locked behind 3 layers of abstraction
comment in response to
post
x.com/mjackson/sta...
comment in response to
post
What the microtruck
comment in response to
post
for solid the main reason was throwing async; we'd probably have stuck with the single-function effect otherwise
that said i am slowly getting to like the split api, and in simple cases i actually like the ergonomics better
comment in response to
post
this seems simpler
comment in response to
post
you could still have conditional dependencies in the first function🙂