It is no longer an industry secret that Google uses various factors for ranking websites in its search, and one of them is the speed of the respective website. But what exactly does it mean when talking about the speed of a website, or its "PageSpeed"? In this field guide, we want to give you an insight into Google's Web Vitals, how they contribute to ranking in search and to what extent, and also provide initial tips for optimizing your website's Web Vitals.
To optimize the Core Web Vitals, you first need to understand what exactly they measure. First things first: "Web Vitals" and "Core Web Vitals" are often used synonymously, even in professional articles. That is not entirely correct – the Core Web Vitals are a subset of the Web Vitals. They consist of just three metrics which, according to Google, should be measured and optimized for every website. Later in this article, we will look at these metrics, but also at other metrics from the broader Web Vitals set which we consider at least as important.
But back to the origin: Google had long been suspected of using the loading speed of a website as a ranking factor. In 2021, Google rolled out an update to its ranking algorithm in which such a factor was officially named for the first time. Google would now rank pages by their "Web Vitals": a set of predefined metrics that measure and evaluate the speed and, in a narrower sense, the usability of a website.
The principle is easy to understand: Google has a certain idea of how a website should be structured. It cannot influence the content and structure of the websites that appear in its search – but it can certainly influence the order in which they are displayed. So, if you plan to reach one of the top spots in Google search for a competitive keyword, you will have to optimize your Web Vitals.
In this article, we will primarily deal with the following Web Vitals:
- Largest Contentful Paint (LCP). This core metric evaluates how long your website takes to display the largest element in its initial viewport. What counts as the "largest element" is determined by various factors, but Google provides tools that show you exactly which element it considers the largest and where optimization is needed.
- Cumulative Layout Shift (CLS). This metric is also included in the Core Web Vitals and can be easily explained by an example you have probably encountered: You load a website you might have visited before. You know there is a button in the middle that you need to press. You try to press it during the loading process – but just can’t hit it because the website’s layout shifts during loading. Very annoying for the user, which is why Google includes it in the calculation of the performance score.
- First Contentful Paint (FCP). Although this metric is not part of the Core Web Vitals (only of the broader Web Vitals category), we consider it one of the most common bottlenecks in a website's performance, which is why we address it as well. It measures how long it takes for anything at all – meaning something useful for the user – to be displayed on the website. This metric naturally depends strongly on the Time To First Byte (TTFB) (also not part of the Core Web Vitals), which we will touch on briefly as well.
- First Input Delay (FID). This metric is again included in the Core Web Vitals. It measures the time from the first interaction of the user with the website (e.g., a click on a link) until a response to this interaction occurs in the browser. FID is therefore the core metric that measures how responsive the website is to user interactions.
Within the broader category Web Vitals, there are several more metrics, but they have a smaller impact on the score or rarely lead to problems, which is why we will only focus on the top four (or five) metrics in this article.
The above metrics and their weight within the performance score calculation are not set in stone. Google regularly updates their algorithms, including (Core) Web Vitals. These updates can lead to new necessary optimizations. Therefore, unless you work in SEO as a profession, keeping up can be challenging. A digital agency well-versed in Core Web Vitals can sustainably optimize your website. At Kuatsu, we have already invested a lot of time in the optimization of customer projects as well as our own website (more on that in a moment!). So, if you prefer to invest your time in advancing your company and leave the optimization of Core Web Vitals to a professional provider, feel free to reach out to us.
Additionally, you should keep in mind that Google calculates the Web Vitals twice: once for mobile devices and once for desktops. Desktop scores are generally higher than mobile scores, as Google artificially throttles the network to slow-4G or even 3G speeds while calculating the latter. However, mobile scores are actually the more important of the two: according to a StatCounter survey, more than 57% of all page views worldwide come from mobile devices – and the share is growing. The optimization of your website should therefore always focus primarily on mobile devices; don't rest on a good desktop score.
Before you start optimizing, you should of course first analyze the status quo to see where optimizations are most needed. There are several ways to do this.
The simplest way to get your website's performance score is through Google PageSpeed Insights. You only need to enter the URL of the page to be tested, and Google's servers take over the measurement. Note, however, that when measuring via PageSpeed Insights, not only the current measurement in a controlled environment ("lab data") feeds into the score, but also "field data": Google incorporates the experiences of real users who have accessed your website (e.g., via Google Chrome). The Web Vitals therefore cannot be tricked by simply removing all images and scripts during a controlled measurement – Google also uses real session data for the scores.
You should also note that during a measurement via PageSpeed Insights, your server location may play a crucial role. Since the measurement is done on Google's servers, it may be that a very long connection to your server has to be established first, which can of course dramatically pull some scores down. PageSpeed Insights measures from one of four different locations based on your current location:
- Northwest USA (Oregon)
- Southeast USA (South Carolina)
- Europe (Netherlands)
- Asia (Taiwan)
The coverage is therefore not bad, but if your server is in Australia, for example, the connection may take considerable time. Even from the Netherlands to a data center in Frankfurt, some time elapses.
As you can see, you should not rely solely on PageSpeed Insights, as the measurement may not accurately reflect real user experiences.
Lighthouse is an open-source tool provided directly by Google, which captures numerous important metrics about your website in addition to Web Vitals, including accessibility or on-page SEO. Lighthouse runs locally on your computer, but simulates a controlled environment where the metrics can be accurately measured. The best part: You often don't even need to download additional software, especially if you use Google Chrome as your browser. Lighthouse is directly integrated into the "Chrome Dev Tools", which you can access via right-click, a click on "Inspect" in the context menu, and then selecting "Lighthouse" from the top toolbar.
As an example, here is a mobile performance measurement of our website using Lighthouse via the Dev Tools:
There are several other ways to measure the Web Vitals, including via Google Search Console. However, these are more aimed at experienced SEOs. For beginners who want to assess their website's performance, the above-mentioned PageSpeed Insights and Lighthouse are most suitable.
The Largest Contentful Paint (LCP) measures how long it takes for the largest content element within the initial viewport (i.e., the area the user sees upon page load) to be fully displayed. This could be large banner images, large text blocks, videos, buttons, but also more subtle elements. According to Google, for example, the largest content element on our new website is the logo in the navigation area. But don’t worry: No treasure hunt is required to find the respective element. Both PageSpeed Insights and Lighthouse show you the element directly via a click on "LCP".
As with every metric in the Web Vitals, Google has very specific expectations about how quickly this element should load. A good LCP is therefore a load time of under 2.5 seconds, while anything over 4 seconds is very bad and requires immediate action. Anything in between is acceptable but still in need of improvement. Considering the 25% weighting of this metric in the performance score, we personally see an urgent need for optimization for all values above 2.5 seconds.
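Google's LCP thresholds are easy to express in code. A minimal sketch – the function name and structure are our own, but the 2.5- and 4-second cut-offs are Google's documented thresholds:

```javascript
// Classify an LCP value (in milliseconds) against Google's documented
// thresholds: up to 2.5 s is "good", over 4 s is "poor", anything in
// between "needs improvement". The function name is our own choice.
function classifyLCP(lcpMs) {
  if (lcpMs <= 2500) return "good";
  if (lcpMs <= 4000) return "needs improvement";
  return "poor";
}

console.log(classifyLCP(1900)); // → "good"
console.log(classifyLCP(3000)); // → "needs improvement"
console.log(classifyLCP(4500)); // → "poor"
```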
Basic LCP optimization is fortunately relatively easy to carry out. Sometimes, however, more in-depth optimizations are necessary to achieve a really good LCP score. You can find the most commonly used optimization methods here in our checklist:
- Use a Content Delivery Network. If the element in question is an image, video, or similar embedded resource, the most obvious option is to reduce the loading time of that resource itself. A simple way to achieve this is to serve the resource via a Content Delivery Network (CDN). A CDN is specifically optimized to provide resources like images as quickly as possible for any user worldwide. Various load balancing techniques are used for this purpose. Combining all these techniques results in much faster load times than if the resource is served locally from your own server. It also takes the load off your own server, which is needed elsewhere. A popular CDN solution that we use on our website is Cloudinary.
- Compress resources and serve them in a modern format. You will often come across this recommendation in PageSpeed Insights or Lighthouse. In principle, resources – including the LCP element – should be compressed as much as possible to generate as little data traffic as possible. There are numerous tools that compress images losslessly, and compression should also be enabled on the web server itself, for example via Gzip. Images should preferably be served in a modern format like WebP or JPEG 2000, as these offer much smaller file sizes. However, since not every browser supports these formats, you should always keep a fallback in an older format, compressed as much as possible. Also make sure raster graphics are properly dimensioned: don't send a large 1000x1000px JPEG to the user for a small logo in the navigation area.
- Minimize client-side JavaScript and CSS rendering. It makes little sense to load a large Google Maps library into the browser at page load when it's only needed at the end of the page. Try to minimize the JavaScript and CSS you use as much as possible, and defer scripts that aren't needed early in the loading cycle using the `async` and `defer` attributes. While JavaScript or CSS is being loaded and processed, the Document Object Model (DOM) – the actual HTML structure of your website, and with it the LCP element – cannot be built further.
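The image-format fallback and script deferral from the checklist above can be sketched in HTML like this (file names and dimensions are placeholders):

```html
<!-- Serve WebP where supported, with a compressed JPEG fallback
     for older browsers. File names and sizes are placeholders. -->
<picture>
  <source srcset="hero.webp" type="image/webp">
  <img src="hero.jpg" width="1200" height="600" alt="Hero banner">
</picture>

<!-- Defer non-critical scripts so they don't block DOM construction -->
<script src="maps-widget.js" defer></script>
```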
The Cumulative Layout Shift (CLS) measures, simply put, the visual stability of your website during loading. To revisit the example from above: Imagine a button you want to press during the loading process, but it keeps changing position. Frustrating, right? This is exactly what an optimization of CLS aims to prevent. The CLS metric primarily reflects good usability of the website. No user wants to accidentally click on a button that leads to a subscription when they only wanted to purchase a single issue of a magazine.
But how can you package this layout shift into a comparable value? For Google, this value is the product of the so-called Impact Fraction and Distance Fraction. The Impact Fraction expresses how much of the viewport an unstable element affects between two frames. Sounds abstract? Let's take an example: Imagine an element that occupies 50% of the viewport height upon page load. Due to an unstable layout, it suddenly shifts 25% downwards. The element thus affects 75% of the viewport in total – its original position plus the area it moved into – giving an Impact Fraction of 0.75. The Distance Fraction expresses how far the element has moved relative to the viewport: in our example 25%, i.e., 0.25. If this is the only unstable element, we multiply the Impact Fraction of 0.75 by the Distance Fraction of 0.25 and get a CLS of 0.1875. Google would consider this score in need of improvement: only a score of at most 0.1 is considered good, while anything above 0.25 is considered poor.
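The calculation from the example above can be reproduced in a few lines (a sketch; the function and variable names are our own):

```javascript
// CLS contribution of a single layout shift:
// impact fraction × distance fraction (both relative to the viewport).
function layoutShiftScore(impactFraction, distanceFraction) {
  return impactFraction * distanceFraction;
}

// The example from the text: an element covering 50% of the viewport
// shifts down by 25%, so it affects 75% of the viewport in total.
const score = layoutShiftScore(0.75, 0.25);
console.log(score); // → 0.1875, above the 0.1 that Google considers good
```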
Now that we have clarified the technical details, the question remains: how can we best prevent these layout shifts?
- Use placeholders. If you insert a button above a text block via JavaScript during the loading process, the text block is subject to a layout shift. You should therefore use a placeholder that ideally has the same size as the button to be inserted, and then replace this placeholder with the button. This way, the text block knows from the start where it "belongs" and is no longer shifted.
- Define widths and heights for images. The browser automatically reserves space for images and videos during loading – but only if it knows how much space to keep free. It can only do this if the respective elements carry width and height specifications.
- Replace web fonts with system fonts during loading. When working with web fonts, always make sure a similar system font is specified as a fallback. Not only can you thereby support older browsers that may not display web fonts, but the text will also be displayed before the respective font is loaded, avoiding layout shifts.
- Avoid layout shifts in animations. When working with animations or CSS transitions, ensure that the entire layout is not shifted when the size of an element is animated. Again, you should create a wrapper element with the maximum size of the animated element so that external elements are not shifted.
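Two of the points above, sketched as markup (the font and file names are placeholders):

```html
<!-- Width/height let the browser reserve space before the image loads -->
<img src="team-photo.jpg" width="800" height="450" alt="Our team">

<style>
  /* System-font fallback: text renders immediately in Arial and is
     swapped in once the web font has loaded (font-display: swap). */
  @font-face {
    font-family: "OurWebFont";
    src: url("ourwebfont.woff2") format("woff2");
    font-display: swap;
  }
  body {
    font-family: "OurWebFont", Arial, sans-serif;
  }
</style>
```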
The First Contentful Paint (FCP) is closely related to the LCP and measures when the first content element on the website is displayed. Until the FCP, the website is therefore blank and unusable for the user. The FCP should naturally be far earlier than the LCP. Google indicates a value of under 1.8 seconds as good, while anything above 3 seconds is bad. Optimizing the FCP involves many factors, some of which also have direct (positive) effects on the other metrics.
The FCP naturally depends heavily on how long the server takes to send the first byte to the user. This period is called the Time To First Byte, or TTFB. It includes things like the DNS request, which resolves the hostname (e.g., kuatsu.dev) to the server's IP address, as well as the SSL handshake. All of these have one thing in common: they are server-related and can only be improved by optimizing the server. A large, bloated database or a poor web server configuration can cause a long TTFB. And even the best configuration is useless if the hosting provider is simply not good enough, or if the website is served from a single server on the other side of the world from the user. In the checklist for FCP optimization, you will also find some points that directly relate to the TTFB.
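In the browser, you can read your own TTFB from the Navigation Timing API. A small sketch – the helper doing the arithmetic is our own and is kept separate so it works on any timing entry:

```javascript
// Compute the Time To First Byte from a navigation timing entry:
// the time from the start of the navigation until the first byte
// of the response arrives.
function ttfbFrom(entry) {
  return entry.responseStart - entry.startTime;
}

// In a browser you would feed it the real navigation entry:
// const [nav] = performance.getEntriesByType("navigation");
// console.log("TTFB:", ttfbFrom(nav), "ms");

// Here we use a mock entry so the sketch is self-contained:
console.log(ttfbFrom({ startTime: 0, responseStart: 320 })); // → 320
```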
We have seen that the FCP is one of the most important influences on user satisfaction. But how do we optimize this metric? There are countless ways to do so, many of which Google presents in its own blog entry. We will cover a few of them in our checklist.
- Minimize CSS and JavaScript. We have already learned earlier that the Document Object Model cannot be built up during the loading process of JavaScript or CSS. Therefore, it is self-evident that they must be minimized as much as possible. There are so-called "minifiers" or "uglifiers" that take your CSS and JavaScript and shrink them as small as possible using sophisticated methods. Use this option.
- Remove unused CSS and JavaScript. Closely related to the last optimization option: remove CSS and JavaScript that are not needed. Here, often the disadvantage of a large WordPress theme or similar comes to light, carrying several megabytes of CSS and JavaScript that are probably not needed for your personal website. For WordPress, there are some plugins like Asset CleanUp that allow you to remove unnecessary assets as far as possible. However, with many themes, this is not always perfectly possible, which is why the best solution is still to forgo pre-made themes and instead develop your own theme or use a performance-optimized page builder like Oxygen.
- Use caching. Most web servers offer many ways to enable caching. This ensures that certain resources are not regenerated with every page load, but cached for a time. Even for WordPress, there are several plugins that, in combination with the corresponding web server adjustments, can result in massive improvements in the FCP value. Every WordPress site we create includes a WP Rocket license and preconfiguration.
- Use Static Site Generation (SSG). Admittedly, at this point we are no longer talking about optimization but about a completely new website. A framework with Static Site Generation like Next.js pre-renders pages as static HTML and handles much of the JavaScript on the server. As a result, no dozens of API requests need to be made in the user's browser to display a blog; instead, the server makes these requests at build time and serves a finished HTML page. By the way: we also use Next.js and Static Site Generation.
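With Next.js, such pre-rendering can look roughly like this – a sketch, not a complete project; the post data is a placeholder where a real site would query its CMS or API at build time:

```jsx
// pages/blog.js — a Next.js page pre-rendered at build time.
// In a real project, getStaticProps would fetch posts from a CMS;
// here we use placeholder data to keep the sketch self-contained.
export async function getStaticProps() {
  const posts = [
    { slug: "core-web-vitals", title: "Core Web Vitals" },
    { slug: "lcp-optimization", title: "LCP optimization" },
  ];
  return { props: { posts } };
}

// The component receives the pre-fetched props and is rendered to
// static HTML on the server, not in the user's browser.
export default function Blog({ posts }) {
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.slug}>{post.title}</li>
      ))}
    </ul>
  );
}
```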
Please note that some of the previously mentioned optimization possibilities, especially in the area of Largest Contentful Paint (LCP), overlap with those of the FCP, and therefore are not listed again here.
The First Input Delay (FID) measures the time from a user's first interaction with your website until the browser can actually respond to it. If a user clicks a button during the loading process, it takes this amount of time before a response occurs. Here, too, Google provides reference values: a time of less than 100ms is considered good, while anything over 300ms is considered poor.
The First Input Delay can prove particularly tricky or technically challenging to optimize. Often, a detailed assessment of the current website and in-depth, technical optimizations are needed to fix a poor FID. However, here are some measures that can be taken:
- Relieve the main thread. JavaScript execution and rendering share the browser's main thread. If API requests and complex calculations are performed there, rendering – and thus the response to a user interaction – has to wait. This can be remedied by using web workers or asynchronous JavaScript.
- Minimize the impact of third-party code. If you use many JavaScript libraries such as Google Analytics, Google Maps, Facebook Pixel, etc., this is reflected in your website's interactivity. Such libraries should be loaded only after the Document Object Model (DOM) is fully loaded. You can use the `async` or `defer` attributes on the `script` tag for this.
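Relieving the main thread can be as simple as breaking long-running work into small chunks and yielding to the event loop between them, so clicks and scrolls stay responsive. A minimal sketch – function names and the default chunk size are our own choices; for truly heavy computation, a web worker is the better tool:

```javascript
// Split a range of `length` items into [start, end) chunks of at most
// `chunkSize` items each, so work can be paused between chunks.
function chunkRanges(length, chunkSize) {
  const ranges = [];
  for (let start = 0; start < length; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, length)]);
  }
  return ranges;
}

// Process items chunk by chunk, yielding to the event loop in between
// so the main thread can respond to user input.
function processInChunks(items, handleItem, chunkSize = 500) {
  const ranges = chunkRanges(items.length, chunkSize);
  let i = 0;
  (function next() {
    if (i >= ranges.length) return;
    const [start, end] = ranges[i++];
    for (let j = start; j < end; j++) handleItem(items[j]);
    setTimeout(next, 0); // give the main thread a chance to respond
  })();
}
```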
Since July 2021, Core Web Vitals have been an official part of Google's ranking and have become a homework task for every serious SEO. But you should not only pay attention to these metrics for a good position in Google search: ultimately, they are good metrics to measure and compare the real usability and user-friendliness of your website. A digital agency specializing in optimizing Web Vitals can therefore not only improve the Google ranking but also provide real added value for users.