What are your Core Web Vital results and why do they matter? If you want to attract clients or customers, it’s a very good idea to build a website. These days people regularly go to Google, or other search engines, when they want to learn anything. And having a website is an excellent way to help meet the needs of your audience. But how do you build a website that is going to be seen and provide an outstanding user experience? One answer is to make sure you’ve taken care of Google’s Core Web Vitals for your website.
To help websites make the most out of Core Web Vitals, Google suggests website owners use the “Measure” tool, which can be found at web.dev. This tool makes it extremely easy to run a test on your website, see how it performs, and see where it could improve. But depending on your website setup, if you run the report ten times you could get ten different results. There are a number of factors that can impact the outcome of your test and make website scores inconsistent. This blog post explains the most common reasons for that inconsistency.
What is the Measure Tool and What does it Measure?
Before we dive into the details, let’s get some basics out of the way.
What is the Measure tool and what are the parameters it uses to measure a website? The Measure tool essentially takes a close look at some key features of your website in order to gauge its potential for success. There are four parameters it judges: performance, accessibility, best practices, and search engine optimization (SEO).
Parameters For Core Web Vitals Results
The first one, performance, tests how well your website functions and responds to a user’s visit. Examples of milestones used to measure this include first paint, loading time, and input lag.
Accessibility is the second parameter and it gauges how easily users can access content on your website. This is usually measured by looking out for issues that might plague a website’s display. Examples could include too many advertisements, poor font usage and size, etc.
The third parameter, best practices, measures how well a website adheres to some dos and don’ts that are essential in today’s online landscape. Examples of this include the usage of HTTPS in hyperlinks and the presence of proper image ratios and quality.
Finally, search engine optimization is a metric that measures how well the website is configured in accordance with best practices for ranking well on search engines. SEO requires a website’s display and content to match a particular set of requirements, and efficient, accessible layouts are especially important to a good SEO score. To learn more about SEO, check out our extensive blog post titled FAQs About Search Engine Optimization (SEO).
Let’s review some ways to ensure the web.dev website gives you the most accurate and consistent reports possible. How do we do that? We first need to identify the reasons for an inconsistent report.
Where You Are Can Affect The Core Web Vitals Results
Where you run your Web.dev report can have an impact on the results, due to where you are in relation to the nearest connection point. With your normal day-to-day browsing, you may notice pages take a few extra seconds to load at home compared to your office. This can happen simply because one of them is closer to a connection box. Suppose both are running fibre optic, but your office has only 100 meters of cable from your modem to the connection point, while your house has 300 meters between your modem and the connection box that handles all the homes in your area.
This difference only amounts to milliseconds, but those milliseconds apply to every round trip of information going back and forth. We see a similar effect when running Web.dev reports, depending on where the request originates and where it is going. Extra distance adds latency to every request, and a page can make dozens of them.
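As a rough sketch of why this matters, you can estimate how much round-trip time distance alone adds. This assumes signals travel through fibre at roughly two-thirds the speed of light, which ignores real-world routing and switching delays, so treat it as a lower bound:

```python
# Rough estimate of latency added by physical distance alone.
# Assumption: signals move through fibre at ~2/3 the speed of light.

SPEED_OF_LIGHT_KM_S = 300_000
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3  # ~200,000 km/s

def round_trip_ms(distance_km: float) -> float:
    """Milliseconds added per round trip by distance alone."""
    return 2 * distance_km / FIBRE_SPEED_KM_S * 1000

# The 200 m of extra cable between house and office is negligible...
print(f"{round_trip_ms(0.2):.4f} ms")
# ...but testing against a server 2,000 km away is not, once a page
# makes dozens of round trips for its HTML, scripts, and images.
print(f"{round_trip_ms(2000):.1f} ms per round trip")
print(f"{round_trip_ms(2000) * 50:.0f} ms over 50 round trips")
```

The per-trip numbers look small, but they multiply across every request a page makes, which is why the same site can score differently when tested from different regions.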
The Content Can Change Based on Where You Are
Now is the time to discuss a particular feature called geotargeting. It’s a practice many web developers use that essentially changes a website’s layout from place to place. Certain developers want to market heavily in, for instance, the US instead of Canada. Accordingly, the website may display a more fully stocked online market, better deals, and so forth. Even Netflix, as an example, displays different content depending on the area you’re streaming from. How does this affect the Measure tool? Well, metrics such as SEO and Best Practices heavily rely on a website’s content and display to determine score outcomes.
If in a certain area a website’s content is simply blacked out, or plastered with advertisements, scores can change. If text changes aren’t translated properly across both versions of the website, Accessibility is affected. Therefore, if you’re designing a site with differing web pages, make sure that all content is properly incorporated before going live. If you’re unsure about the final product, running the Measure tool over a VPN is a great way to check. Here are the results for Netflix taken from 3 different locations.
Those poor people living in the United States.
Server Response Times
No matter what optimizations you make to your website, or how well you compress your images, it may not matter if your server response is slow. There are countless hosting companies to choose from, and there are continual improvements in speed and technology that hosting companies must keep up with as well. Every few years, it can be worthwhile to look into upgrading to a new hosting plan or moving to a new company.
The same goes for other products you own: a 5-year-old laptop won’t perform at the same level as a brand new one. You don’t necessarily need to upgrade from a shared host to a dedicated host, though this can help with some of the speed issues inherent in shared hosting. Even simply upgrading from a slower package to a faster one can be a good use of your budget, and it can provide better and more consistent performance results.
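If you want to check server response on its own, the number to watch is time to first byte (TTFB). Here is a minimal sketch using only Python’s standard library; it points at a throwaway local server so it runs anywhere, but in practice you would swap in your own site’s host:

```python
import http.client
import http.server
import threading
import time

def time_to_first_byte(host: str, port: int, path: str = "/") -> float:
    """Seconds from sending a GET request to the first response byte."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()
    resp.read(1)  # wait for the first byte of the body to arrive
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

# Demo against a throwaway local server (swap in your own host in practice).
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb = time_to_first_byte("127.0.0.1", server.server_address[1])
print(f"TTFB: {ttfb * 1000:.1f} ms")
server.shutdown()
```

If that number stays high no matter how lean your pages are, the bottleneck is the host, not your code, and a plan or provider change is worth considering.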
This is an example of a site we worked on for a client who felt their site was taking too long to load. We changed who their webpage was hosted with and saw immediate results.
Performance Can Change Depending On When It Is Measured
Timing is important, as server load rises and responses lag depending on the amount of traffic a website is handling. Accordingly, Core Web Vitals results and scores may change depending on what time you check. Your website may be completely functional and still receive lower performance scores simply because too many people were browsing it when you checked.
Higher traffic leads to a much slower response time and Speed Index. Again, through no fault of your own development, web.dev can show low scores that don’t accurately reflect your website’s actual performance (from a coding perspective). How do you, then, get better results? Monitor your website’s traffic, note when it hits its average level, and run the report then. While checking at 4 in the morning will likely lead to a more positive result, it won’t be a more accurate one.
You want the web.dev report to reflect your website’s overall performance, which requires at least some amount of traffic to judge.
To put it in even simpler terms, take the example of a new road that needs to be tested. Run no cars across it and the road looks fully stable, but the result is skewed, since roads obviously carry traffic. Run the heaviest machinery imaginable across it and you ruin the road, which is still an inaccurate test. The best result comes from measuring the average traffic flow of roads in your area and running that number of vehicles across the road.
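Because scores bounce around from run to run, a practical habit is to never trust a single report: take several runs and look at the median and the spread. A small sketch, using hypothetical scores:

```python
import statistics

def summarize_runs(scores: list[float]) -> dict:
    """Summarize repeated performance scores from the same page."""
    return {
        "median": statistics.median(scores),
        "spread": max(scores) - min(scores),
        "runs": len(scores),
    }

# Five hypothetical performance scores for the same page, same day:
runs = [62, 71, 58, 69, 66]
summary = summarize_runs(runs)
print(summary)
```

A median of 66 with a spread of 13 points tells you far more than any single run: the median is your realistic score, and a wide spread is itself a sign that traffic or server load is interfering with the measurement.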
Provided are the web.dev results for the same webpage, with no changes made, less than four hours apart, showing the difference server response time can make.
Lighthouse vs Web.dev
We have talked about Web.dev, but we also need to talk about Lighthouse: the actual program Web.dev uses to test your webpage. Google Lighthouse is a development tool that can run instant, live reports on your web page. These reports test your page against the Core Web Vitals and provide a more detailed breakdown of each metric, but they are more reliant on your system and how it runs the webpage. Lighthouse is built into Chrome and is available to all users to check their websites.
One thing you do not need to worry about when running a report with web.dev is your own hardware. That is because when web.dev measures the Core Web Vitals for your site, it does so on an emulated phone. All tests are run using a simulated mobile device, throttled to a fast 3G network and a 4x CPU slowdown. This simulated environment gives all reports a standard baseline, regardless of your own network and computer specs. Web.dev uses the simulated device to run a Lighthouse report and provide your Core Web Vitals results.
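To get a feel for what that throttling does, here is a back-of-the-envelope model. The numbers below are Lighthouse’s documented mobile defaults at the time of writing (roughly 1.6 Mbps throughput, 150 ms round-trip time, 4x CPU slowdown), but they can change between versions, so check the Lighthouse docs for your own:

```python
# Back-of-the-envelope model of Lighthouse's simulated throttling.
# Assumed defaults: ~1.6 Mbps throughput, 150 ms RTT, 4x CPU slowdown.

THROUGHPUT_KBPS = 1_638.4   # ~1.6 Mbps downlink
RTT_MS = 150                # simulated round-trip time
CPU_SLOWDOWN = 4            # main-thread work multiplier

def simulated_download_ms(size_kb: float) -> float:
    """Rough time to fetch one resource under the simulated network."""
    transfer_ms = size_kb * 8 / THROUGHPUT_KBPS * 1000
    return RTT_MS + transfer_ms

def simulated_cpu_ms(observed_ms: float) -> float:
    """Observed main-thread time scaled by the CPU slowdown multiplier."""
    return observed_ms * CPU_SLOWDOWN

# A 300 KB hero image takes over a second and a half on the simulated
# network, even if it loads instantly on your desktop connection:
print(f"{simulated_download_ms(300):.0f} ms to download")
print(f"{simulated_cpu_ms(200):.0f} ms of simulated script time")
```

This is why a page that feels instant on your machine can still score poorly: the emulated phone pays the full throttled cost for every kilobyte and every millisecond of script.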
If you are going to be using Lighthouse to work on your website page by page, there are a few things you should be aware of.
With Lighthouse Incognito Mode Isn’t Just For Private Browsing
Why are Performance ratings in Lighthouse different when checked in Incognito Mode as opposed to normal browsing? Incognito Mode does give better performance. The metrics that improve are almost exclusively timing-based, such as Speed Index or Time to Interactive. Incognito Mode has the advantage of pulling websites quicker because it starts without the caches stored from previous visits.
Many users also speculate that the accumulation of browser history slows down regular browsing, especially with all the cookies collected as you browse the internet. Browser history is the one thing above all else that Incognito Mode does not store. You can see the difference in the results of the same Lighthouse report, run at the same time on the same website, with Incognito mode as the only difference.
Depending on what extensions your browser has installed, they can also impact the performance scores of your website test. Extensions add extra load to a browser’s performance, so websites load slower and the performance ratings drop, through no fault of the website in question. Most individuals have their extensions on for a reason, so it is best to run your Lighthouse reports on another device, in a different browser, or in Incognito mode. Some extensions also affect the display of a website, and changing factors such as image resolution and aspect ratio can affect the Best Practices score. For an unbiased opinion on your website’s functionality, it’s better to forego any and all extensions.
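One way to quantify the difference is to save a Lighthouse report from each environment as JSON and compare the category scores. The snippet below reads the `categories` section of Lighthouse’s JSON output; the two inline reports are hypothetical stand-ins so the sketch runs as-is, where normally you would load files you saved from real runs:

```python
import json

def category_scores(report: dict) -> dict:
    """Pull 0-100 category scores out of a Lighthouse JSON report."""
    return {name: round(category["score"] * 100)
            for name, category in report["categories"].items()}

# Hypothetical stand-ins for two saved reports
# (normally: report = json.load(open("report.json"))).
normal_run = json.loads('{"categories": {"performance": {"score": 0.61}}}')
incognito_run = json.loads('{"categories": {"performance": {"score": 0.78}}}')

normal = category_scores(normal_run)
incognito = category_scores(incognito_run)
for name in normal:
    print(f"{name}: {incognito[name] - normal[name]:+d} points in Incognito")
```

If the gap between the two runs is large, your extensions and cache are coloring your results, and the Incognito number is the one closer to what a first-time visitor experiences.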
Many of these external factors that can affect the Measure tool’s overall scores are linked to the Performance metric. This article isn’t covering what deficiencies your website itself possesses; web.dev is a perfect site for doing that, you don’t need us. What this article is here to do instead is help identify and remove interference with your results and provide a better understanding of why they differ. But do keep in mind, the point isn’t always to shoot for the best score. The point is to best gauge how your website works and how to provide the best experience for your customers.