kojordan.com
Web Development, Web Performance, Web Sustainability 🌐 kojordan.com
25 posts 49 followers 269 following
comment in response to post
Also in German:
comment in response to post
@guaca.bsky.social is co-organizing with me, and we'd love to hear your talk ideas! Speakers of all levels are welcome. We'd really like a diverse set of voices too. Info and form here --> performanceobserver.dev/call-for-pap...
comment in response to post
- I wasn't aware that for `Vary`ing requests to the same URL, Chrome doesn't seem to store and serve a separate cache entry for each varying value.
- Firefox only sends an `Origin` header when the request is actually cross-origin?
- Good to know that the issue for preloaded fonts is pretty rare.
comment in response to post
Same here :) What I figured out though: The `Vary: Origin` response header seems to prevent using the cache when alternating cors/no-cors requests occur, since only the cors requests send the `Origin` header. Some additional thoughts:
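The cache behaviour described above follows from how `Vary` works: a stored response may only be reused if every request header named in `Vary` has the same value as on the request that originally fetched it. A toy model of that matching rule (my own naming; header maps are assumed to use lowercase keys):

```typescript
type HeaderMap = Record<string, string>;

// Returns true if the stored response may be reused for the incoming
// request under the given Vary header names: every named header must
// have the same value on both requests (absent counts as a value too).
function varyMatches(vary: string[], stored: HeaderMap, incoming: HeaderMap): boolean {
  return vary.every(h => stored[h.toLowerCase()] === incoming[h.toLowerCase()]);
}

// A CORS font request sends `Origin`; a no-cors request for the same URL
// does not, so with `Vary: Origin` the cached entry never matches when
// the two request modes alternate, forcing a re-download each time.
const corsRequest: HeaderMap = { origin: 'https://example.com' }; // hypothetical origin
const noCorsRequest: HeaderMap = {};
```

Usage: `varyMatches(['Origin'], corsRequest, noCorsRequest)` is false (cache miss), while `varyMatches(['Origin'], corsRequest, corsRequest)` is true (cache hit).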
comment in response to post
Thanks for the hint. 🙏 I updated the query now to use the crawl.pages and it's 5 times cheaper! Also, the query looks a bit nicer. 😍
comment in response to post
But running a query on a small sample of the all.pages table indicates that fewer than 2% of pages with link[rel=preload][as=font] tags have at least one of those tags with a missing crossorigin attribute. gist.github.com/ko-jordan/57...
comment in response to post
Thanks for taking the time to answer. Leftovers seem to amount to 200,000 pages as well, so let's meet in the middle? :) I originally thought that missing `crossorigin` on font preloads could be a big contributing factor to unused preload hints. ...
comment in response to post
Great read, very insightful, thanks! Am I reading the numbers correctly, that on ~30% of pages that use preload hints, one of those hints goes unused? Anyway, I like that you gave it a positive spin. 😀 Great to see the less error-prone fetchPriority on the rise!
comment in response to post
Actual cache lifetime is also further reduced by the `Vary` header. For static assets, Accept and Accept-Encoding are frequently used. They are pretty stable, but every now and then these request headers change. Sometimes User-Agent and Cookie are (mis-)used as well, which has an even stronger effect.
comment in response to post
Awesome! General question: What do you think about Chrome's caching behaviour in such scenarios? Why is it not serving the files from cache on reloads? bsky.app/profile/kojo...
comment in response to post
Thanks for checking.
comment in response to post
I would have expected that each CO/non-CO request gets its own cache entry, but this doesn't seem to be the case. Can you confirm the behaviour?
comment in response to post
@screenspan.net What I just realised: On EVERY refresh the fonts are downloaded twice and not served from cache! The alternating CO/non-CO requests seem to be messing with the files' caches, at least in all Chromium browsers I tested. In fact, this behaviour is the only reason I HEARD it.
comment in response to post
Ah, just saw that you already mentioned the crossorigin issue in your skeet.
comment in response to post
Great! I noticed a few more issues with the fonts. 🙈
- there's no subsetting
- they are preloaded without a crossorigin attribute, which causes double-downloading (see github.com/w3c/preload/...)
- the Italic font is probably not used often enough to justify a preload
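The double-download issue comes from fonts always being fetched in CORS mode: a preload without `crossorigin` creates a non-matching credentials mode, so the preloaded copy can't be reused. A minimal sketch of a check for this, over parsed link attributes (the attribute-map shape and function name are my own assumptions, not the query from the linked gist):

```typescript
type LinkAttrs = Record<string, string>;

// Flags <link rel="preload" as="font"> descriptors that lack a
// crossorigin attribute; these fonts will be downloaded twice, once by
// the preload and once by the actual CORS-mode font request.
function missingCrossoriginFontPreloads(links: LinkAttrs[]): LinkAttrs[] {
  return links.filter(l =>
    l.rel === 'preload' && l.as === 'font' && !('crossorigin' in l));
}
```

Usage: a link like `{ rel: 'preload', as: 'font', href: 'a.woff2' }` is flagged, while the same link with `crossorigin: 'anonymous'` passes.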
comment in response to post
btw SUX is also used for Sustainable UX, see sustainableuxnetwork.com :)
comment in response to post
Great read, thanks! I fully approve of the idea of having an all-stakeholder-friendly metric. Here's an explicit suggestion for a "Vital Score": bsky.app/profile/alek...
comment in response to post
Great idea, I love it! I immediately heard that something was off with bsky.app when playing my latest reload's .har. The font files are massive. :-D
comment in response to post
Great read, thank you! Another example similar to Case 1 was/is delaying the download of JS files until user interaction in order to improve the former Core Web Vitals metric FID and boost Lighthouse scores. While this technique can yield good UX in some cases, I've seen it misused intentionally.
comment in response to post
I like the idea, it's nice and simple. I use a calculation with the established 75th percentiles. For each metric:
- 100% if the 75th percentile is Good
- 0% if it's Poor
- linearly declining if in between
and then (lcp+cls+inp)/3. Yours rewards pages where more than 75% of users experience good vitals, which is great.
comment in response to post
I opened an issue regarding a slight discrepancy in the calculation of page weight and carbon emissions in the Sustainability chapter: github.com/HTTPArchive/...
comment in response to post
Also available in German: www.kojordan.com/de/blog/erke...
comment in response to post
One more lane will fix it