Google collects performance information from millions of opted-in Chrome browsers around the world and uses this information as a performance ranking factor for its search engine. But it also makes this information freely available so that anyone can use it to check the real-world performance of individual websites. Even more significantly, it’s possible to segment this data according to the technologies used in the websites. In this article, Dan Shappir leverages this information to analyze and compare the performance of leading JavaScript frameworks. Along the way, he uncovers unexpected behaviors and solves a web performance mystery.
Read more…
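That same field data is also queryable directly through the Chrome UX Report (CrUX) API. A minimal sketch, assuming you have your own CrUX API key and swapping in the origin you want to check (both values below are placeholders):

```js
// Query the CrUX API for an origin's real-user performance data.
// CRUX_API_KEY and the origin are placeholders — use your own values.
const CRUX_API_KEY = 'YOUR_API_KEY';

async function getFieldData(origin) {
  const response = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CRUX_API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ origin }),
    }
  );
  const { record } = await response.json();
  // The p75 values are what the Core Web Vitals thresholds are judged against.
  console.log('LCP p75 (ms):', record.metrics.largest_contentful_paint.percentiles.p75);
  console.log('CLS p75:', record.metrics.cumulative_layout_shift.percentiles.p75);
}

getFieldData('https://example.com');
```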
At the end of 2021, the Chrome team shipped some functionality that can make or break whether sites meet the Core Web Vitals. So, let’s learn a little bit more about the Back/Forward Cache (aka bfcache) and what you can do to test whether your website is compatible with it. Barry Pollard strongly encourages sites to run the Back/Forward Cache test, understand any blockers behind an unsuccessful result, and work to remove them.
Read more…
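A quick way to check whether your own pages are actually being restored from the bfcache is the standard pageshow and pagehide events; a small sketch:

```js
// event.persisted is true when the page is restored from the back/forward cache.
window.addEventListener('pageshow', (event) => {
  if (event.persisted) {
    console.log('This page was restored from the bfcache.');
  } else {
    console.log('This page was loaded normally.');
  }
});

// Prefer pagehide/visibilitychange over unload: an unload handler is one of
// the most common reasons a page becomes ineligible for the bfcache.
window.addEventListener('pagehide', (event) => {
  if (event.persisted) {
    console.log('This page may be entering the bfcache.');
  }
});
```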
Performance needs to be built in starting at the code level, and user-centric metrics like time to interactive (TTI), total blocking time (TBT), and first input delay (FID) help you gauge how fast a website is. But modern web pages are heavy and ever-growing in size. Enter Partytown, a lightweight open-source solution that reduces execution delays caused by third-party JavaScript by offloading third-party scripts to web workers, which run in background threads.
Read more…
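The browser primitive Partytown builds on is the plain Web Worker. The sketch below is not Partytown’s API — just the underlying idea of moving heavy work off the main thread so it cannot block rendering:

```js
// Run an expensive task in a background thread instead of the main thread.
const workerCode = `
  self.onmessage = (event) => {
    // Simulate the kind of heavy work a third-party script might do.
    let total = 0;
    for (let i = 0; i < event.data; i++) total += Math.sqrt(i);
    self.postMessage(total);
  };
`;

const worker = new Worker(
  URL.createObjectURL(new Blob([workerCode], { type: 'text/javascript' }))
);
worker.onmessage = (event) => console.log('Result from worker:', event.data);
worker.postMessage(50_000_000);
```

Partytown itself automates this pattern for existing third-party scripts: you opt a script tag in by giving it the type="text/partytown" attribute, and the library proxies its DOM access back to the main thread.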
Loading experience is crucial to the user’s first impression and overall usability, so Google defined the Largest Contentful Paint (LCP) metric to measure how quickly the main content loads and is displayed to the user. The new fetchpriority attribute lets us fine-tune relative resource priority, improve LCP performance, deprioritize JavaScript fetch calls, and much more. Let’s check out fetchpriority and explore some potential use cases.
Read more…
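A sketch of how the hint can be applied from JavaScript in browsers that support Priority Hints; the element ID and API URL are placeholders:

```js
// In HTML the attribute is written directly on the element, e.g.
// <img src="hero.jpg" fetchpriority="high">
// From JavaScript the same hint is exposed as a property:
const heroImage = document.querySelector('#hero');
if (heroImage && 'fetchPriority' in heroImage) {
  heroImage.fetchPriority = 'high'; // likely the LCP element, fetch it early
}

// Priority Hints also work for fetch(): deprioritize a non-critical call.
fetch('/api/recommendations', { priority: 'low' })
  .then((response) => response.json())
  .then((data) => console.log(data))
  .catch(console.error);
```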
Using signals to deliver less, or different, content is a form of progressive enhancement (or graceful degradation depending on how you look at it), whereby extraneous content is only loaded when necessary, but the core functionality of the website still works. In this article, we’ll look at some of the signals that can be used for this.
Read more…
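Two such signals are the user’s Save-Data preference and the network’s effective connection type. A sketch of reading them, with feature checks since both are only exposed in some browsers (the prefers-reduced-data media query is still experimental):

```js
// Decide whether to load "extra" content based on user and network signals.
function shouldLoadExtras() {
  const connection = navigator.connection;

  // The user has explicitly asked for reduced data usage.
  if (connection && connection.saveData) return false;

  // The network itself looks slow (2g / slow-2g).
  if (connection && /2g/.test(connection.effectiveType)) return false;

  // Experimental media query for a reduced-data preference.
  if (window.matchMedia('(prefers-reduced-data: reduce)').matches) return false;

  return true;
}

if (shouldLoadExtras()) {
  // e.g. swap a static poster image for an autoplaying video,
  // or load web fonts instead of falling back to system fonts.
  console.log('Loading enhanced content.');
}
```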
How would you measure performance? Unfortunately, there is no silver bullet for measuring it. Different products will have different benchmarks, and two apps may perform differently against the same metrics yet still earn similar subjective “good” or “bad” verdicts. Web Vitals are the new gold standard in performance thanks to their direct correlation with the user’s experience. In this article, Atila Fassina will show you what monitoring can do and how RayGun can help you sustain performance while scaling your app.
Read more…
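One common way to collect these metrics from real users is Google’s web-vitals library, shown here instead of RayGun’s own snippet. A sketch assuming the v3-style onX exports (earlier versions exported getCLS and friends) and a placeholder /analytics endpoint:

```js
import { onCLS, onFID, onLCP } from 'web-vitals';

// Send each metric to your collection endpoint as it becomes available.
function sendToAnalytics(metric) {
  const body = JSON.stringify({ name: metric.name, value: metric.value, id: metric.id });
  // sendBeacon survives page unloads; fall back to fetch with keepalive.
  if (!navigator.sendBeacon || !navigator.sendBeacon('/analytics', body)) {
    fetch('/analytics', { method: 'POST', body, keepalive: true });
  }
}

onCLS(sendToAnalytics);
onFID(sendToAnalytics);
onLCP(sendToAnalytics);
```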
Statoscope is a tool that analyzes your webpack bundles. Created by Sergey Melukov, it started out as an experimental version in late 2016 and has since grown into a full-fledged toolkit for viewing, analyzing, and validating webpack bundles.
Read more…
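A minimal sketch of wiring Statoscope into a build via its webpack plugin; this assumes the @statoscope/webpack-plugin package’s default export and shows no options (see the Statoscope docs for report and validation settings):

```js
// webpack.config.js
const StatoscopeWebpackPlugin = require('@statoscope/webpack-plugin').default;

module.exports = {
  // ...the rest of your webpack configuration...
  plugins: [
    // Generates a Statoscope report about the bundle after each build.
    new StatoscopeWebpackPlugin(),
  ],
};
```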
In this article, we’ll look specifically at what we can do to reduce the impact of social media embeds and social sharing widgets — or even some strategies to avoid them altogether. While the spotlight is on reducing the environmental impact, many of these tips will be great for performance too.
Read more…
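One of the simplest of those strategies is to not load an embed at all until it is about to be seen. A sketch using IntersectionObserver; the data-embed-src attribute and placeholder markup are illustrative, not from the article:

```js
// Replace lightweight placeholders with the real (heavy) embed iframes
// only when they scroll near the viewport.
const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const placeholder = entry.target;
      const iframe = document.createElement('iframe');
      iframe.src = placeholder.dataset.embedSrc; // e.g. the tweet/video embed URL
      iframe.loading = 'lazy';
      placeholder.replaceWith(iframe);
      obs.unobserve(placeholder);
    }
  },
  { rootMargin: '200px' } // start loading a little before it becomes visible
);

document.querySelectorAll('[data-embed-src]').forEach((el) => observer.observe(el));
```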
In this article, Adrian Bece shares more about the benefits and caveats of code-splitting and how page performance and load times can be improved by dynamically loading expensive, non-critical JavaScript bundles.
Read more…
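The core mechanism behind that kind of splitting is the dynamic import() call, which bundlers turn into a separate, lazily loaded chunk. A sketch where the module path, export name, and element IDs are placeholders:

```js
// Only download and parse the charting code when the user actually asks for it.
const button = document.querySelector('#show-chart');

button?.addEventListener('click', async () => {
  try {
    // Bundlers (webpack, Rollup, Vite, …) emit this module as its own chunk.
    const { renderChart } = await import('./chart.js');
    renderChart(document.querySelector('#chart-container'));
  } catch (error) {
    console.error('Failed to load the chart module:', error);
  }
});
```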
How to improve Core Web Vitals: a Smashing Magazine case study on how we detected and fixed the bottlenecks, and how we ended up with green scores all the way.
Read more…