Google PageSpeed Insights has been radically updated: what will change? PageSpeed Insights and SEO promotion

Page speed is the total time it takes for a page to fully load. When analyzing a website, loading speed can be critical for SEO: users quickly abandon a page that takes too long to load, and search engines rank poorly performing pages lower.

Besides SEO benefits, there are other reasons to improve page speed. Fast websites have higher conversion rates and are much more user-friendly. This can attract more people to your site and keep them coming back for more.

Google Research

Let's look at some recent research: the average mobile landing page takes 15 seconds to fully load. Considering that visitors decide within about three seconds whether to stay on a site, that is a huge discrepancy.

Based on the data presented above, we can see that the likelihood of leaving a page depends heavily on its loading time:

  •   as load time grows from 1 to 3 seconds, the probability of a bounce increases by 32%
  •   from 1 to 5 seconds, by 90%
  •   from 1 to 10 seconds, by 123%

It is not hard to guess how many visitors are lost as a result.

Page speed will be an important ranking factor

Back in 2010, Google stated that page speed would be a ranking factor, but at the time this applied only to desktop search.

More recently, a new mobile-search ranking factor called the “Speed Update” was announced. It takes effect in July 2018, giving site owners time to prepare.

According to the company, the penalty will apply only to pages that load slowly and provide a negative user experience.

Also, remember that site speed is only one of many ranking factors. If your page is well optimized according to Google PageSpeed but offers little useful or unique content, it is unlikely to reach decent positions in search results.

Here are the basic ways to improve page speed:

  1. Enable browser caching and gzip/deflate compression. On Apache this is configured in .htaccess (a minimal sketch follows this list).
  2. Optimize your images. Stripping metadata and compressing images can significantly reduce their size.
  3. Minify your JavaScript, CSS, and HTML. If the site is still under active development, it is better to leave this step for later so the code stays easy to edit.
  4. Use the newer HTTP/2 protocol. It is more efficient, more reliable, and less error-prone.
  5. Switch to PHP 7. It runs more than twice as fast as its predecessor.
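A minimal .htaccess sketch for item 1, assuming Apache with mod_deflate and mod_expires enabled; adjust the cache lifetimes to your release cycle:

```
# Compress text-based responses on the fly (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript image/svg+xml
</IfModule>

# Let browsers cache static assets (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```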

If you need the help of a qualified specialist, you can always get a free analysis and estimate of the cost of the work.

In an upcoming article we will talk about what render-blocking scripts are and why it is not always possible to get rid of them.

Page loading speed is now a very powerful signal for search engines. For users it is a significant factor too, one that is hard to ignore when something is wrong. By improving site speed you gain not only ranking benefits but also more trust and higher conversion rates. Below is a list of the most useful tools for analyzing a site and finding its weakest points in terms of speed.

1. Google PageSpeed Insights

Google's own page-speed tool. It scores a page from 0 to 100 for both desktop and mobile devices, immediately points out the site's weak spots, and gives recommendations for optimizing speed.

2. Pingdom Tools

Gives a speed score, and shows the number of requests to the server and the average loading time. A summary table details every request to the server (styles, scripts, images, etc.), making it easy to see exactly what is slowing the site down.

3. WhichLoadFaster

Loads two sites side by side (yours and a competitor's) so you can visually watch which one loads faster (handy for demonstrating to clients). When loading finishes, it reports which site won and how many times faster it loaded.

4. Web Page Performance Test

Loads the page twice and compares the number of requests, which reveals how well caching is organized, and shows detailed statistics for each test. It saves screenshots of how the site looks at every second of loading, and also shows, in a convenient form, which group of requests took the most time. The test server is located in Dallas (USA).

5. GTmetrix

Another useful tool for testing site speed. It displays a lot of summary information and also keeps a history, so you can compare how your loading speed has improved or worsened. It lists Yahoo and Google recommendations for speed optimization, sorted by priority. The test server is located in Vancouver (Canada).

6. Load Impact

The service tests how much load the site can withstand (a light DDoS): several dozen users and more than a hundred active connections are emulated. Since the test lasts several minutes, you can run the other tools during this window to evaluate page loading speed at rush hour. At the end of the test you get a graph of how loading speed changes with the number of active users.

7. Monitis Tools

Analyzes website loading from different parts of the world, with servers in the USA, Europe, and Asia. Displays summary statistics for each test.

8. SiteSpeed.me

Sends requests to the analyzed page from different data centers (about 30 servers) and measures the speed from each of them. Highlights the best, worst, and average results by time and speed.

9. PR-CY

Bulk website speed checking. You can specify up to 10 addresses and compare the loading time and document size for each resource.

10. WebPage Analyzer

Report on page loading and all additional scripts/styles/images. A simple and often necessary tool.

If you use any other free online tools to check website page loading speed, please share them in the comments.

On November 12, Google quietly updated PageSpeed Insights, changing almost everything about it. This is a big change for the entire website-building industry, and a wave of panic and hype around the event now seems likely. This article analyzes the changes and what they will bring us.

What is PageSpeed Insights

Just a few words for those who don't know. For eight years now, PageSpeed Insights has been the main site-speed meter: you enter a page address and get its score on a scale from 0 to 100, along with recommendations for improvement.

Of course, there are many other good speed-testing tools. But since this one is from Google, and Google has stated that site speed affects rankings, most people treat this score as the most important one, especially clients and bosses. As a result, almost everyone tries to raise the PageSpeed score of their projects, and the metric has become just about the most important in the industry.

What changed?

In short: everything. The old PageSpeed has been set aside and replaced with scores and analytics from Lighthouse, an open-source website auditing tool that is, among other things, built into Google Chrome.

The fundamental difference in approach: points are now awarded not for following rules, but for speed. Page loading is assessed on several time characteristics: how soon after loading starts something is visible, when you can already click, how sluggish everything is while loading, and when everything has finished loading. These characteristics are compared with those of the best sites and converted into points. We will analyze this in more detail below; for now it is the principle itself that matters.

There are still recommendations, as before, but they now carry a completely different weight. Recommendations are no longer directly tied to the score, and it is by no means guaranteed that implementing them will improve the situation (they can easily make it worse if implemented thoughtlessly).

Panic is inevitable

It's the night of the 13th and everything is relatively quiet. Only a couple of specialized resources have posted short notes about the update, and only a couple of clients have written worried letters about the strange behavior of PageSpeed Insights. This looks like the calm before the storm.

Right now the tool is clearly unstable: scores for the same page fluctuate within 20 points, and there are occasional complaints that it cannot fetch the page being evaluated. Some sites it considers unreachable altogether, even though they are actually doing fine.

It is obvious that a lot of people will soon rush to check the scores of their projects, hitting the service with an international Habr effect. Everything will creak, glitch, and frighten users with jumping scores.

It's not easy, but try to relax and stay calm. The first thing to remember is that the PageSpeed Insights update does not in any way change how search results are ranked. Second, it will take at least two weeks for the update to be tested, corrected, and working stably. Don't make any sudden moves; they may have to be rolled back later.

Reflections and forecasts

There are many positives in these changes. The dominance of the old PageSpeed Insights, with its forced recommendations, caused a lot of trouble. First, any given recommendation may be close to useless in your specific situation. Second, it may be implemented at the expense of more important things, such as page generation time. But there was no choice: all of it had to be done to get a good score.

Take, for example, the recommendation to minify the page's HTML. On average, minifying a page on the fly takes about 100 ms, and this delay is tens of times greater than any possible benefit from the reduced page size. The only time it pays off is when you serve ready-made, pre-minified pages from a cache.

In recent years, every project has put a lot of effort into optimizing images, minifying and bundling resources, and deferring JavaScript in a way that breaks nothing. More often than not, this pulled focus away from the essence: how fast the site is for visitors. The Internet was full of examples of both slow sites with excellent scores and fast sites with poor ones.

Now this tinsel will gradually fall away. In the first tests, scores with and without minification and bundling of resources are practically identical. The genuinely important things become significant: how quickly the server responds and how much heavy material sits on the page. All the bells and whistles (social network widgets, interactive maps, chats, luxurious pop-ups) will inexorably drag the score down, no matter how you wrap them.

It is likely that this will all lead to really fast websites and an understanding of how to make them. At least, I really want to believe it.

New metrics

And for the most persistent, here is a detailed analysis of the new metrics that affect the score. There are six in total, and they carry different weights in the final grade. Let's go through them in order of decreasing importance.

1. Loading time for interaction (Time to Interactive)

This is the most important characteristic, and the hardest one. It is the timestamp at which the page is fully ready for user interaction. That moment comes when:
  • the page has been displayed
  • event handlers have been registered for most visible elements
Essentially, the page must be drawn, must not lag, and must be ready to respond to actions.

2. Loading speed index (Speed Index)

Shows how quickly page content becomes visible. The Speedline module is used for the evaluation.

This is the moment when the page in the browser stops changing visually; it is determined by a frame-by-frame comparison of the page's appearance.

3. First content loading time (First Contentful Paint)

A metric that measures the interval between the start of page load and the appearance of the first image or block of text.

4. CPU end time (First CPU Idle)

This parameter marks the time at which the page's main thread becomes free enough to handle user input. That moment comes when:
  • most elements on the page are already interactive
  • the page responds to user actions in a reasonable time
  • the response to a user action takes less than 50 ms
The Russian translation of this metric misses the point a little. The original name, First CPU Idle, literally means the first time the processor goes idle, but even that is not quite right: it refers to the moment during loading when the page can mostly respond to actions, although it is still loading.

5. Time to load enough content (First Meaningful Paint)

This parameter shows when the main content of the page becomes visible. That moment comes when:
  • the biggest change in the page's appearance has occurred
  • fonts have loaded

6. Approximate input delay time (Estimated Input Latency)

This is the least significant characteristic. It shows the time in milliseconds that the page takes to respond to user input during the busiest 5-second window of page load. If this time exceeds 50 ms, users may feel that the site lags.

Each metric is compared against all evaluated sites. If your result is better than 98% of sites, the metric scores 100 points; if it is better than 75% of sites, it scores 50 points.
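As an illustration of the principle, here is a minimal JavaScript sketch that combines per-metric scores into a single grade. The weights are assumptions for the example only (the article says merely that the six metrics are weighted differently), and the real percentile-to-score mapping inside Lighthouse is more involved:

```javascript
// Hypothetical weights, for illustration only; Lighthouse assigns its own
// (and has changed them between versions).
const weights = {
  timeToInteractive: 5,
  speedIndex: 4,
  firstContentfulPaint: 3,
  firstCpuIdle: 2,
  firstMeaningfulPaint: 1,
  estimatedInputLatency: 0, // reported, but assumed here to carry no weight
};

// metricScores holds each metric already mapped to 0..100
// (100 = better than ~98% of sites, 50 = better than ~75%).
function overallScore(metricScores) {
  let weighted = 0;
  let totalWeight = 0;
  for (const [name, weight] of Object.entries(weights)) {
    weighted += (metricScores[name] || 0) * weight;
    totalWeight += weight;
  }
  return Math.round(weighted / totalWeight);
}

console.log(overallScore({
  timeToInteractive: 80,
  speedIndex: 90,
  firstContentfulPaint: 95,
  firstCpuIdle: 85,
  firstMeaningfulPaint: 90,
  estimatedInputLatency: 100,
})); // 87: dominated by the heavily weighted interactivity metrics
```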

At first glance these metrics are very true to life, and it should be almost impossible to game them with dirty manipulations that do not actually speed up the site.

For now, the scoring of the mobile version remains a mystery. Or rather, the principle is the same, but the scores are often noticeably lower, and it is unclear what virtual mobile-device configuration the tests are run on.

PageSpeed Insights (PSI) reports on the performance of a page on both mobile and desktop devices, and provides suggestions on how that page may be improved.

PSI provides both lab and field data about a page. Lab data is useful for debugging performance issues, as it is collected in a controlled environment; however, it may not capture real-world bottlenecks. Field data is useful for capturing true, real-world user experience, but has a more limited set of metrics. See the documentation for more information on the two types of data.

Performance score

At the top of the report, PSI provides a score which summarizes the page’s performance.

This score is determined by running Lighthouse to collect and analyze lab data about the page. A score of 90 or above is considered fast, and 50 to 90 is considered average. Below 50 is considered to be slow.

When PSI is given a URL, it will look it up in the Chrome User Experience Report (CrUX) dataset. If available, PSI reports the First Contentful Paint (FCP) and the First Input Delay (FID) metric data for the origin and potentially the specific page URL.

Classifying Fast, Average, Slow

PSI also classifies field data into 3 buckets, describing experiences considered fast, average, or slow. PSI sets the following thresholds for fast / average / slow, based on our analysis of the CrUX dataset:

       Fast           Average             Slow
FCP    [0, 1000ms]    (1000ms, 2500ms]    over 2500ms
FID    [0, 50ms]      (50ms, 250ms]       over 250ms

Generally speaking, fast pages are roughly in the top ~10%, average pages are in the next 40%, and slow pages are in the bottom 50%. The numbers have been rounded for readability.

These thresholds apply to both mobile and desktop and have been set based on human perceptual abilities.

Distribution and selected value of FCP and FID

PSI presents a distribution of these metrics so that developers can understand the range of FCP and FID values for that page or origin. This distribution is also split into three categories: Fast, Average, and Slow, denoted with green, orange, and red bars.

For example, seeing 14% within FCP's orange bar indicates that 14% of all observed FCP values fall between 1,000ms and 2,500ms. This data is an aggregate view of all page loads over the previous 30 days.

Above the distribution bars, PSI reports the 90th percentile First Contentful Paint and the 95th percentile First Input Delay, presented in seconds and milliseconds respectively.

These percentiles are chosen so that developers can understand the most frustrating user experiences on their site. These field metric values are classified as fast/average/slow by applying the same thresholds shown above.

Field data summary label

An overall label is calculated from the field metric values:

Fast: If both FCP and FID are Fast.

Slow: If either FCP or FID is Slow.

Average: All other cases.
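A small JavaScript sketch of this classification logic, using the thresholds from the table above (values in milliseconds):

```javascript
// Classify a single metric value against the fast/average/slow buckets above.
function bucket(valueMs, fastMax, averageMax) {
  if (valueMs <= fastMax) return 'fast';
  if (valueMs <= averageMax) return 'average';
  return 'slow';
}

// Overall label: Fast if both are fast, Slow if either is slow, else Average.
function overallLabel(fcpMs, fidMs) {
  const fcp = bucket(fcpMs, 1000, 2500);
  const fid = bucket(fidMs, 50, 250);
  if (fcp === 'fast' && fid === 'fast') return 'Fast';
  if (fcp === 'slow' || fid === 'slow') return 'Slow';
  return 'Average';
}

console.log(overallLabel(900, 40));  // Fast
console.log(overallLabel(1800, 40)); // Average
console.log(overallLabel(3000, 40)); // Slow
```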

Differences between Field Data in PSI and CrUX

The difference between the field data in PSI and the Chrome User Experience Report on BigQuery is that PSI's data is updated daily and covers the trailing 30-day period, whereas the dataset on BigQuery is only updated monthly.

The field data is a historical report about how a particular URL has performed, and represents anonymized performance data from real-world users on a variety of devices and network conditions. The lab data is based on a simulated load of the page on a single device with a fixed set of network conditions. As a result, the values may differ.

Why is the 90th percentile chosen for FCP and the 95th percentile for FID?

Our goal is to make sure that pages work well for the majority of users. By focusing on the 90th and 95th percentile values for our metrics, we ensure that pages meet a minimum standard of performance under the most difficult device and network conditions.

Why does the FCP in v4 and v5 have different values?

v5 FCP reports the 90th percentile, while v4 FCP reported the median (50th percentile).

What is a good score for the lab data?

Any green score (90+) is considered good.

Why does the performance score change from run to run? I didn't change anything on my page!

Variability in performance measurement is introduced via a number of channels with different levels of impact. Several common sources of metric variability are local network availability, client hardware availability, and client resource contention.

More questions?

If you"ve got a question about using PageSpeed ​​Insights that is specific and answerable, ask your question in English on Stack Overflow.

If you have general feedback or questions about PageSpeed Insights, or you want to start a general discussion, start a thread in the mailing list.


More than 50% of mobile Internet users expect a site to load almost instantly. With that statistic in mind, this article describes how to achieve 100/100 in Google PageSpeed Insights for both desktop and mobile devices, using the Monitor Backlinks site as an example.

Motivation

The sample site already loaded quite quickly, so in this case the goal was to push the results to the maximum possible.

One day, while working with the PageSpeed tool, we noticed that Google's own website had a surprisingly low score for mobile devices: 59/100. The desktop version fared better, at 92/100.

It would seem that they should have used their own tool to optimize their website, right? So is a 100/100 result unattainable?

This is what motivated us to achieve the fastest possible loading time for the site: to prove that 100/100 is achievable, and that if you want to, you can do it too. It's not an obsession, just a desire for perfection.

The experimental site's starting score was 87/100.

As a result, after applying certain manipulations, the following results were obtained:

Read on to learn how we managed to achieve these results.

How to speed up page loading?

Before proceeding to the step-by-step instructions, note that the PageSpeed tool is just a guide for webmasters on the path to optimizing a resource. The tool offers recommendations for speeding up page loads, but achieving good results also depends heavily on server configuration.

Please note that some of the steps in the instructions may require technical knowledge, regardless of the content management system (CMS) used.

So let's get started:

Step #1: Optimizing Images

To make images load faster, the PageSpeed Insights tool suggested optimizing them by reducing file sizes. Solving this requires two important things:

  •   Compress all images using tools like Compressor.io and TinyPNG. These free tools can reduce the size of a graphics file by more than 80%, in some cases without any visible loss of image quality.
  •   Reduce image dimensions to the minimum required without sacrificing quality. For example, if the site needs a 150x150px image, the file on the server must be exactly that size. Image dimensions should not be adjusted with CSS or HTML attributes.

Following these rules, every image on the site was manually compressed and resized before being uploaded. To avoid wrestling with optimization after images are already on the site, it is better to develop the habit of optimizing every new image before it goes to the server: each new image must be compressed and brought to the required dimensions.

PageSpeed Insights offers the option to download already-optimized images, so they can be uploaded to the server straight from the service. The same can be done with JavaScript and CSS.
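The article did this by hand with Compressor.io and TinyPNG; as a sketch of automating the same two rules, here is a minimal Node.js example using the sharp library (an assumption, not the article's toolchain; file names are placeholders):

```javascript
// Minimal sketch: resize an image to the exact dimensions the layout needs
// and compress it. Assumes: npm install sharp; paths are hypothetical.
const sharp = require('sharp');

sharp('uploads/avatar-original.jpg')
  .resize(150, 150)            // serve the exact size used on the page
  .jpeg({ quality: 80 })       // lossy compression; tune to taste
  .toFile('public/img/avatar-150x150.jpg')
  .then(info => console.log(`Written ${info.size} bytes`))
  .catch(console.error);
```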

Step #2: Minify JavaScript, CSS and HTML

In the example, Google suggests reducing the size of the JavaScript and CSS files.

Minification reduces file sizes by eliminating unnecessary spaces, line breaks, characters, and comments in JavaScript and CSS files. Programmers often leave a lot of whitespace and comments while coding, which can double the size of JavaScript and CSS files.

To fix this problem, Gulp was installed on the server. It is a tool that automatically creates a new CSS file with all unnecessary whitespace removed, and regenerates it every time changes are made. In the example above, this helped reduce the size of the main CSS file from approximately 300 KB to 150 KB; the entire difference was unnecessary characters.
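The article does not show its Gulp configuration; below is a minimal gulpfile sketch of such a setup, assuming the gulp-clean-css plugin and placeholder paths:

```javascript
// gulpfile.js - minify CSS and re-run on every change.
// Assumes: npm install gulp gulp-clean-css; adjust paths to your project.
const { src, dest, watch } = require('gulp');
const cleanCSS = require('gulp-clean-css');

function minifyCss() {
  return src('src/css/*.css')
    .pipe(cleanCSS())          // strips whitespace, comments, redundant rules
    .pipe(dest('dist/css'));   // writes the minified copies
}

exports.css = minifyCss;
exports.watch = () => watch('src/css/*.css', minifyCss);
```

Run `npx gulp css` for a one-off build, or `npx gulp watch` during development to regenerate the file on every change.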

You can further reduce the size of JavaScript and CSS by renaming variables, provided the selectors still work correctly and the HTML is updated accordingly.

You can optimize JavaScript using Closure Compiler, JSMin, or YUI Compressor. You can also create a small program that uses these tools to rename the files and save them to the working directory.

You can reduce CSS using the YUI Compressor and cssmin.js tools.

You can minify HTML code using PageSpeed Insights itself: analyze the page, select "Minify HTML", and click "View optimized content" to get the optimized code.

You can learn more about optimizing JavaScript and CSS files at the following link:

You can also download the optimized files directly from the PageSpeed tool.

Here are the results obtained after minifying JavaScript and CSS:

Step #3: Using Browser Cache

For many webmasters, the step of using browser caching is the most difficult.

To resolve this issue, all of the site's static files had to be moved to a CDN (content delivery network).

A CDN is a network of servers located around the world. They cache static site assets, such as images, JavaScript, and CSS files. The CDN servers store copies of the site's content, and when someone visits the site, the static content is served from the nearest server.

For example, if the site's main server is in Texas, then without a CDN a visitor from Amsterdam has to wait for the site's content to travel all the way from the USA. With a CDN, the site loads much faster from the server closest to the user, in this case in Amsterdam. Thus the distance to the data is reduced, and the site loads almost instantly.

Here's a visualization of how the CDN works:

On the test site, all images, JavaScript, and CSS files were moved to the CDN; only the HTML files remained on the main server. Hosting images on a CDN plays a big role in how quickly the site's pages load for visitors.

After these manipulations, the PageSpeed tool annoyingly continued to suggest enabling browser caching for certain third-party resources. Here's a screenshot:

To resolve this, the social-network scripts had to be reworked: the counters were replaced with static images hosted on the CDN. Instead of third-party scripts that fetched subscriber counts from Twitter, Facebook, or Google Plus, a self-hosted counter was installed, which settled the issue.

What was more frustrating is that, in addition to the social-media scripts, the site was also being slowed down by the Google Analytics code.

Solving the problem with the Google Analytics script is quite difficult. Since Analytics was needed and could not be removed from the site, other solutions had to be found.

Google changes the Analytics code quite rarely, once or twice a year. So Razvan created a script that checks for Analytics code updates every eight hours and downloads them when they appear. This makes it possible to host the Analytics JavaScript on your own server, eliminating the need to fetch it from Google's servers on every visit.

If there are no updates, the Analytics code will be loaded from the cached version on the CDN.

And when Google updates the JavaScript code, the server automatically downloads the new version and updates it on the CDN. The same approach was used for all external third-party scripts.
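Razvan's script is not published; here is a simplified Node.js sketch of the idea (the CDN upload and error handling are omitted, and the file paths are placeholders):

```javascript
// Sketch: keep a local copy of Google's analytics.js so it can be served
// first-party / from a CDN instead of Google's servers on every visit.
const https = require('https');
const fs = require('fs');

const SOURCE = 'https://www.google-analytics.com/analytics.js';
const LOCAL = 'public/js/analytics.js'; // assumes this directory exists

function refreshAnalytics() {
  https.get(SOURCE, res => {
    let body = '';
    res.on('data', chunk => (body += chunk));
    res.on('end', () => {
      // Only rewrite the file (and re-publish to the CDN) when it changed.
      const current = fs.existsSync(LOCAL) ? fs.readFileSync(LOCAL, 'utf8') : '';
      if (body !== current) fs.writeFileSync(LOCAL, body);
    });
  });
}

refreshAnalytics();
setInterval(refreshAnalytics, 8 * 60 * 60 * 1000); // every eight hours
```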

Here is a screenshot from Pingdom Tools showing all downloads coming from the CDN, including the Google Analytics code. The only file downloaded from the main server is the home page itself, just 15.5 KB. Eliminating all third-party scripts greatly improved the overall loading speed.

Step #4: Removing blocking codes

Removing render-blocking code is also a rather difficult step in improving page loading speed, and it requires solid technical knowledge. The main task was to review all the JavaScript from the top of the page (the header and body) and move it down to the footer on every page of the site.

If your site runs on WordPress, the Autoptimize plugin will most likely solve this problem for you. In its settings, uncheck "Force JavaScript" and enable "Inline all CSS".
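As an illustration of what removing blocking code means in markup, here is a generic pattern (not the article's exact code): either load scripts with the defer attribute, or place them just before the closing body tag:

```html
<!-- Illustrative pattern, not the article's exact markup -->
<html>
<head>
  <link rel="stylesheet" href="/css/main.min.css">
  <!-- defer: download in parallel, execute only after the document is parsed -->
  <script defer src="/js/app.min.js"></script>
</head>
<body>
  ...
  <!-- or: keep scripts at the very bottom so they never block rendering -->
  <script src="/js/widgets.min.js"></script>
</body>
</html>
```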

Step #5: Enable Compression

Enabling gzip/deflate compression on the server shrinks the HTML, CSS, and JavaScript transferred to the browser; on Apache this is the same .htaccess configuration shown near the beginning of this article.

Step #6: Mobile Format Optimization

The mobile-format analysis checks how well the mobile version of the site adapts to different resolutions, whether suitable fonts are used, and whether the navigation is good.

Using Google Chrome, you can see how your site looks on different mobile devices. Click the menu icon (hamburger) for settings and browser controls in the upper right corner, then select "More tools → Developer tools". On the toolbar that opens, click the mobile-devices icon. That's all; take a look:

In the case of the example, no radical changes were required.

Conclusion

In the end, the six most important steps were completed, which helped achieve a perfect score of 100/100 in Google PageSpeed Tools for the Monitor Backlinks site. Not only the main page was optimized, but all the internal pages as well.



Among all the actions taken to optimize the site, the three most important can be identified:

  1. Using a CDN.
  2. Removing blocking code (avoid JavaScript high up in the markup; move it to the bottom of the page instead).
  3. Image size optimization and compression.

I would like to remind you once again that Google PageSpeed Tools is just an auxiliary tool for optimizing a resource. It is designed to reduce the time between the request (clicking a link) and the page's response (the first elements being displayed), so that visitors do not leave the site before it loads. The tool's recommendations must also be applied with caution, so that users are never shown a broken layout or an unstyled jumble of blocks.

Note: fast page loading indirectly affects a resource's ranking in search engines, that is: higher loading speed → more and longer visits → higher ranking.

If you've used Google PageSpeed Insights to optimize your site, please share your results in the comments.

Also, remember that UAWEB specialists are always ready to help you create, optimize, and promote your web resource, so that every second users spend on your site brings you benefit!