Google has said that Core Web Vitals (CWV) are going to be an SEO ranking factor, and the date is nigh: May 2021. So, I’m seeing some scrambling to make sure those metrics are good. Ya know, the acronym soup: CLS (Cumulative Layout Shift), LCP (Largest Contentful Paint), and FID (First Input Delay). More and more tooling is cropping up to measure and diagnose problems. Hopefully, once you’ve diagnosed them, you have some idea how to fix them. Like if you have crappy CLS, it’s because you load in stuff (probably ads) that shifts the layout around, and you should either stop doing that or reserve space for it ahead of time so there is less shifting.
But what about LCP? What if you have this big hero image that is taking a while to paint and it’s giving you a crappy LCP number? Chris Castillo’s trick is to just not load the hero background image at all until a user interacts in some way. Strikes me as weird, but Chris did some light testing and found some users didn’t really notice:
Although this accomplishes the goal, it’s not without a cost. The background image will not load until the user interacts with the screen, so something needs to be used as a fallback until the image can be loaded. I asked a few friends to load the page on their phones and tell me if they found anything strange about the page, and none of them noticed anything “off”. What I observed is that the few friends I asked to test this all had their fingers on the screen or quickly touched the screen when the page was loading, so it happened so quickly they didn’t notice.
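A minimal sketch of the idea (the helper, element, and class names here are my assumptions, not Chris’s actual code): wait for the first interaction event of any kind, then flip a class that applies the real background image.

```javascript
// Run `callback` once, on the first of several interaction events on `target`.
function onFirstInteraction(
  target,
  callback,
  events = ['touchstart', 'mousemove', 'scroll', 'keydown']
) {
  let fired = false;
  function handler() {
    if (fired) return;
    fired = true;
    // Clean up every listener so the callback can't run twice.
    events.forEach((e) => target.removeEventListener(e, handler));
    callback();
  }
  events.forEach((e) => target.addEventListener(e, handler, { passive: true }));
}

// In the page, the hero starts with a cheap solid-color fallback, and
// adding the (assumed) "bg-loaded" class applies the real background-image
// in CSS:
//
//   onFirstInteraction(window, () => {
//     document.querySelector('.hero').classList.add('bg-loaded');
//   });
```

The reason this moves the LCP number at all: the big image never paints before the first interaction, so it never becomes the largest contentful paint candidate during the measured window.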
It’s a fine trick that Chris documents, but the point of it is fooling a machine into giving you better test scores. This feels like the start of a weird new era of web performance: the metrics have shifted to user-centric measurements, but people are gaming those numbers with tricky strategies that, if anything, slightly harm the user experience.
Ugh… Feels like the old days, when you’d stuff invisible keywords into the footer of your page to try to game the search engines. I hate SEO so much. More often than not it hurts the user experience, and it just feels dirty. I hate that it forces us into hacks.
SEO doesn’t force anything. Search engines are trying to recommend sites and articles that their users will find value in. If you want the engine to rank you for usability and relevance, build usable and relevant content. Anything that games the system will be found out pretty quickly, because Google will monitor rank vs. engagement metrics and adjust the algorithms accordingly. There are plenty of things to criticize the search engines for, like scraping your content straight into the search results, but better ranking algorithms isn’t one of them.
+1 to that.
s/WCV/CWV/
It’s Core Web Vitals. That’s the comment :-)
Thank you
There’s an article on the Gatsby blog about not loading analytics until the user interacts. You lose some data, but you gain better measured (and real) performance.
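One way to defer a third-party script like that (the function name and URL below are illustrative, not taken from the Gatsby article): inject the script tag only after the first interaction.

```javascript
// Append an analytics <script> tag only after the first user interaction.
// `src` is a placeholder; swap in your real analytics script URL.
function loadAnalyticsOnInteraction(
  src,
  events = ['scroll', 'touchstart', 'keydown', 'mousemove']
) {
  const load = () => {
    // Only ever inject once, regardless of which event fired first.
    events.forEach((e) => window.removeEventListener(e, load));
    const script = document.createElement('script');
    script.src = src;
    script.async = true;
    document.head.appendChild(script);
  };
  events.forEach((e) => window.addEventListener(e, load, { passive: true }));
}

// loadAnalyticsOnInteraction('https://example.com/analytics.js');
```

The trade-off is exactly the one mentioned above: visitors who bounce without ever touching the page are invisible to your analytics.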
I think we’ll start to see a lot of this.
It’s definitely a sort of “progressive enhancement”, and not something that’s necessarily ideal.
Interestingly enough, Gijo Varghese just released a Lazy Rendering feature for his FlyingPress plugin yesterday.
That Lazy Rendering does something similar, but instead utilizes the new content-visibility CSS property:
https://web.dev/content-visibility/
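For reference, using it looks roughly like this (the selector and size are made up for illustration):

```css
/* Let the browser skip layout and paint work for this section
   until it scrolls near the viewport. */
.below-fold-section {
  content-visibility: auto;
  /* Reserve an approximate height so the scrollbar and layout
     don't jump when the section actually renders. */
  contain-intrinsic-size: auto 500px;
}
```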
The only problem with this approach is that browser support right now is pretty lacking, at only around 60% globally.
The sort of strategy I outlined in my article is almost like a polyfill for what I think should be a standard browser feature that developers can utilize, like content-visibility.