Barry Pollard
@tunetheweb.com
2.6K followers 370 following 890 posts
Web Performance Developer Advocate at Google Chrome helping to make the web go faster! All opinions my own.
I think that would be interesting for Google Maps since that’s clearly an outlier here. Beyond that it gets tricky: how do you decide which technologies to exclude? Every site uses multiple technologies we detect, so I’m willing to bet the number of sites that use JUST Lit is zero.
How would you do that for a Lit site for example?
That’s not a very satisfying answer I know. But it does surface interesting data and discussions (like this!).

And also useful to show that Lit is used a whole lot more than we might have even considered before (for whatever reason). Which is also interesting.
Ultimately this is a “sites that use” report, not a “sites that are primarily built with” report, because there’s no easy way to differentiate the two for most techs (but if you have a way then I’m all ears!).
Agreed. But that also impacts WordPress (unfairly) too. And the same could be said of other technologies used to build third-party components, like React. And at what point does usage of a tech (whether 1P or 3P) become enough to say the site uses that technology?
Yeah that’s where it gets tricky. Other tech is also the same - like if an embed is built in React.

How to detect if it’s just used in those places? And at what point does enough usage cross over to say that’s now a factor in how that site’s built?
Not sure what you mean?

The percentage is:

The number of sites using that technology* AND that pass CWV / the number of sites that use that technology*

* where we detect it in any of the 2-4 pages we scan for that site, as we look at up to 2 pages each for desktop and mobile
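The calculation above can be sketched like this (the data shapes and field names are illustrative, not the real BigQuery schema):

```javascript
// Sketch of the pass-rate calculation described above: the share of
// sites detected as using a technology that also pass Core Web Vitals.
// `sites` is a made-up array; the real data comes from BigQuery.
function cwvPassRate(sites, tech) {
  const usingTech = sites.filter((s) => s.technologies.includes(tech));
  const passing = usingTech.filter((s) => s.passesCWV);
  return passing.length / usingTech.length;
}

const sites = [
  { technologies: ["Lit", "WordPress"], passesCWV: true },
  { technologies: ["Lit"], passesCWV: false },
  { technologies: ["React"], passesCWV: true },
];

console.log(cwvPassRate(sites, "Lit")); // 0.5
```

Note that a site counts toward "Lit" here even when Lit is only one of several detected technologies, which is exactly the "uses" vs "primarily built with" distinction discussed above.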
Not sure what you mean? This report is built by joining the CrUX BigQuery data (for Core Web Vitals stats) and HTTP Archive BigQuery data (for the technology detection that those sites use).
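A minimal sketch of that join (table and field names are invented for illustration; the real report joins the two datasets in BigQuery):

```javascript
// Illustrative join: CrUX supplies the Core Web Vitals assessment per
// origin, HTTP Archive supplies the technology detections, and the
// report keeps only origins present in both datasets.
const crux = new Map([
  ["https://a.example", { passesCWV: true }],
  ["https://b.example", { passesCWV: false }],
]);

const httpArchive = [
  { origin: "https://a.example", tech: "Lit" },
  { origin: "https://b.example", tech: "Lit" },
  { origin: "https://c.example", tech: "Lit" }, // no CrUX data: dropped
];

const joined = httpArchive
  .filter((row) => crux.has(row.origin))
  .map((row) => ({ ...row, passesCWV: crux.get(row.origin).passesCWV }));

console.log(joined.length); // 2
```

This inner-join behaviour is also why, later in the thread, over a million sites have the detection but only about half of them appear in the report with Core Web Vitals.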
But in the meantime the data is the data and we don't "correct" the data just because it seems "unfair". By our methodology and definition (as flawed or limited as it is), these sites ARE using Lit.

But if we can improve detection of when Lit is actually being used then I'm all ears!
But that doesn't mean a site built primarily on Lit is poorly performing.

It would be interesting to look at the sites EXCLUDING those using Lit through Google Maps. Sadly we don't have that functionality in the Tech report (yet?), but we can get it from BigQuery.
Cause I think Lit is being treated a little unfairly here. Most of this seems to be due to Google Maps, which is a heavy 3rd-party (and yes I know I work for Google).

So that does show the sites are willing to load that SDK (often needlessly, as I say), which IS a perf issue and probably one of many.
Always keen to understand if our data or methodology is flawed! Usually there's a reason for numbers that feel "off" — like in this case.

In this case it's "correct" even if it's not ideal.
Even when it is primarily built with a tech there can be other explanations (e.g. complex sites are more likely to use complex tools to build the site!).

Correlation is not causation. Which I admit many assume. But the data is provided as an interesting dataset to dig into to find things like this.
And of course, "uses a technology" doesn't mean "the site is primarily built with this technology" — especially for components like this which are frequently used by third-parties like in this case.
I'm willing to bet a lot of those don't actually show maps on most pages, but load the SDK on all.

So if you can think of a better way of detecting whether the site actually uses Lit (i.e. a map is displayed with Lit, in this example) then I'd be happy to know, to reduce this down.
Oh I may have found something!

I see Google Maps uses Lit, so sites that execute that JavaScript (even if they don't display a map) will create the global object, and so that will count as the site "using Lit".

And looking that up we have 910,868 of those sites:

httparchive.org/reports/tech...
Interestingly an alternative source (builtwith) has MUCH less usage:

trends.builtwith.com/javascript/Lit

But I can't explain that when all sites I spot check show it being used?
I also extracted the version out of curiosity (in the same sheet)
3.2.1 is far and away the most popular with 786,512 origins, way above the next version (2.8.0 with 40,819)
We actually have over a million sites detected, but only 500,000 of them also have Core Web Vitals.

Here's the top 10,000 sites:
docs.google.com/spreadsheets...
And spot checking a few I do see that global object for most (not all, but most)
Would love to dig into this more!

We use Wappalyzer (before they went closed source) for the technology detection and here's the definition:

github.com/HTTPArchive/...

Which looks like a simple check on the `litHtmlVersions` global JavaScript object, so it's hard to see how that could be misdetected?
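A check along those lines would look something like this (an illustrative sketch, not the exact Wappalyzer rule):

```javascript
// lit-html records the versions loaded on a page in a global array
// (`litHtmlVersions`), so the mere presence of that global counts as
// a detection — even when Lit only arrived via a third-party script
// such as the Google Maps JS SDK. Sketch only; field shapes assumed.
function detectLit(globalScope) {
  const versions = globalScope.litHtmlVersions;
  if (!Array.isArray(versions) || versions.length === 0) {
    return { detected: false, versions: [] };
  }
  return { detected: true, versions };
}

// A page that ran the Maps SDK but never displayed a map still has the global:
const fakeWindow = { litHtmlVersions: ["3.2.1"] };
console.log(detectLit(fakeWindow).detected); // true
console.log(detectLit({}).detected); // false
```

Which is why "executed some Lit code" and "the site is built with Lit" are indistinguishable to this kind of check.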
Reposted by Barry Pollard
Bramus @bram.us · 13h
With View Transitions now being Baseline Newly available, and the View Transition API consisting of various features, it can be confusing to know what is supported in which browser versions.

To help with that, I created this VT Feature Explorer (powered by View Transitions)

web.dev/blog/same-do...
I get some pre-LLM issues from reading this article…
Sure that addresses one problem, but can still give the feeling it's slower.

Same with skeleton screens. If it's going to render the full thing shortly (for some definition of "shortly") then I'd rather not see that skeleton screen. It's not what I wanted, and it _feels_ like it's slowing things down.
When I went digging into this a few years back, the UX story was... mixed IIRC.

Some didn't like it (and even thought it seemed slower), and were not certain when it was loaded.

Some said they did like it, but proof of that seemed more difficult to come by.

Some did like it.

Overall: 🤷‍♂️