
A redirect is a way of sending users and search robots to a URL different from the one originally requested. There are several types of redirects, some of which are described below.

301 Moved Permanently

A 301 is a permanent redirect, indicating that the requested page now lives at a new address and the old one should be considered obsolete. This type of redirect transfers 90-99% of the link mass to the new URL.
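
For illustration, here is roughly what the HTTP exchange looks like when a 301 is in place (hypothetical URLs):

GET /oldpage.html HTTP/1.1
Host: www.site.com

HTTP/1.1 301 Moved Permanently
Location: http://www.site.com/newpage.html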

Canonicalization, or "gluing", of a domain

To glue the domain from the www version to the version without www:

RewriteCond %{HTTP_HOST} ^www\.site\.com$
RewriteRule ^(.*)$ http://site.com/$1 [R=301,L]

To glue the domain from the version without www to the version with www:

RewriteCond %{HTTP_HOST} ^site\.com$
RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]

To choose the right gluing method, you need to consider the following factors:

  • Which version has more pages indexed;
  • Which version ranks higher in search results;
  • Canonicalization of the trailing slash at the end of the address.

When creating a website project, you need to decide whether to use a trailing slash at the end of addresses. For search engines, addresses like:

  • http://www.site.com/category1
  • http://www.site.com/category1/

are different URLs. Therefore, once you have decided which type of address your site will use, write the following code to remove the trailing slash:

RewriteCond %{REQUEST_URI} ^(.+)/$
RewriteRule ^(.+)/$ /$1 [R=301,L]

or this one to add it:

RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]

To 301-redirect one page to another:

Redirect 301 /oldpage.html http://www.site.com/newpage.html

To ensure that a request for any version of the main page, for example default.htm or index.html, is redirected to the canonical page http://www.site.com, add the following redirect code:

RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+/)*(default|index|main)\.(html|php|htm)\ HTTP/
RewriteRule ^(([^/]+/)*)(default|main|index)\.(html|php|htm)$ http://www.site.com/$1 [R=301,L]

Directory redirect

If your URLs reflect the directory structure, then any change to that structure will change the address as well. In that case, you need to write the following redirect:

RewriteRule ^(.*)/old-catalog/(.*)$ $1/new-catalog/$2 [R=301,L]

But sometimes the old catalog's address appears immediately after the domain name, for example www.site.com/old-catalog/. In this case, the following code is used:

RewriteRule ^old-catalog/(.*)$ /new-catalog/$1 [R=301,L]

Redirect when changing file extensions

When changing CMS, usually only the file extensions change. To canonicalize pages in this case, you need to use code like:

RedirectMatch 301 (.*)\.php$ http://www.site.com$1.html

Redirect when multiple slashes or dashes appear

For various reasons, extra slashes or dashes can appear in an address, for example www.site.com/catalog////page-1.html. Such pages need to be redirected to addresses with a single slash.

RewriteCond %{REQUEST_URI} ^(.*)//(.*)$
RewriteRule . %1/%2 [R=301,L]

Extra dashes in an address are removed in the same way, for example changing www.site.com/catalog/page--1.html to www.site.com/catalog/page-1.html.

RewriteCond %{REQUEST_URI} ^(.*)--(.*)$
RewriteRule . %1-%2 [R=301,L]

.htaccess - extra slashes after the domain name

  • http://site.com//////catalog

To remove all these slashes and redirect to the page without them, i.e.

  • http://site.com/catalog

You need to write:

RewriteCond %{REQUEST_URI} ^(.*)//(.*)$
RewriteRule . %1/%2 [R=301,L]

Generating 301 redirects

If you do not have enough technical knowledge to write the code yourself, there are special services that generate all the main redirects:

There you can enter your data and instantly receive the required code. Redirects for domains, URLs and directories are supported.

How to check a 301 redirect?

After making any changes to the redirect logic, you need to check that it works. For manual testing you need:

  • To check whether the site works at all - go to its main page;
  • Browse around the site, its sections and individual pages.

There are also services for checking redirects automatically:

  • http://bertal.ru – very detailed data about all server responses
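
If you prefer the command line, you can also check the response with curl; a minimal sketch with hypothetical URLs (curl -I sends a HEAD request and prints only the response headers):

curl -I http://www.site.com/oldpage.html
HTTP/1.1 301 Moved Permanently
Location: http://www.site.com/newpage.html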

Rules for using 301 redirects vs Canonical

Google sets clear rules, and only if you follow them will it interpret your actions correctly. Here is how search engines literally understand a 301 and a Canonical:

  • 301 – this page is outdated; the new page is located at such-and-such an address. Please remove the old page from the index, index the new one, and transfer all of the old page's weight to it.
  • Canonical – in addition to this version of the page, I also have others. Please index only the version marked with Canonical. The other versions will remain available for people to view, but they do not need to be included in the index; all their weight should be transferred to the canonical page.
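
For reference, here is a minimal sketch of the canonical tag (hypothetical URL); it goes in the <head> of every non-canonical variant of the page:

<link rel="canonical" href="http://www.site.com/category1/">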

When to prefer a 301 redirect

Typically, this is the most preferred method:

  • For individual pages - if their address has changed permanently;
  • For domains - if the site will be permanently located on a new domain;
  • For 404 pages and pages with content that is no longer relevant. For example, when a product is removed from the catalog, you can redirect to a product with similar functions or to the catalog page for that type of product.

When is it better not to use a 301 redirect?

  • If implementing them is impossible, or it would take an unreasonably long time.
  • If the content is duplicated on two pages but both must remain accessible to the user because of some difference (for example, clothing size).
  • If one page has several URLs (sorting the catalog by different criteria).
  • For cross-domains, when content may be duplicated at two addresses but must be present on each of the domains.


A 301 redirect, or 301 Permanent Redirect, is a rule that automatically sends the user to another page address. With its help you can glue old pages to new ones, move the site to another domain, remove many duplicates from the site, and much more. It is an extremely useful thing, so you need to learn how to do it correctly!

A 301 redirect also merges TCI and PR, i.e. you can preserve the old values at the new address and lose nothing at all in the eyes of search engines. Let's move from theory to practice.

How to set up a 301 redirect in .htaccess

The .htaccess file is located in the root directory of your site. It is a service file in which we will specify the gluing rules. Open the file with a text editor (I recommend Notepad++, so that there are no problems with encoding).

The general template for .htaccess that we will use:

Options +FollowSymLinks
RewriteEngine On
# Here we specify the rules

All rules are written in the form:

RewriteCond [Comparison] [Condition] [Flags]
RewriteCond [Comparison] [Condition] [Flags]
RewriteRule [Pattern] [Substitution] [Flags]
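
To make the template concrete, here is a minimal sketch (hypothetical domains) showing which part of a real rule corresponds to each placeholder:

# [Comparison] = %{HTTP_HOST}, [Condition] = ^old-domain\.ru$, [Flags] = [NC]
RewriteCond %{HTTP_HOST} ^old-domain\.ru$ [NC]
# [Pattern] = ^(.*)$, [Substitution] = http://new-domain.ru/$1, [Flags] = [R=301,L]
RewriteRule ^(.*)$ http://new-domain.ru/$1 [R=301,L]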

Now let’s directly deal with redirection and look at specific examples.

I have already written a separate article about this, but just in case I will summarize it here so that this page is more convenient to use.

Examples of using 301 redirects

Redirect from index.php to home page

To set it up, add the following code to your file; it will redirect visitors from site.ru/index.php to site.ru:

RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.php\ HTTP
RewriteRule ^index\.php$ http://site.ru/ [R=301,L]

If you also need a redirect from index.html, simply replace .php with .html in the code above.

Gluing site aliases

If you have several domains and want to redirect them all to the main site, use:

RewriteCond %{HTTP_HOST} ^vash-sait\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.vash-sait\.com$
RewriteCond %{REQUEST_URI} !^/robots.*
RewriteRule ^(.*)$ http://vash-sait.ru/$1 [R=301,L]

Pay attention to the first two lines: they match the mirrors in the .com zone. If you have a different zone, or several, add the corresponding conditions.

Redirect from www to non-www

I have already described this method earlier, but I will repeat it. To glue the mirrors and make the domain without www the main one, we write:

RewriteCond %{HTTP_HOST} ^www\.site\.ru$
RewriteRule ^(.*)$ http://site.ru/$1 [R=301,L]

Redirect from a domain without www to a domain with www

This is the opposite of the previous action; here the main mirror is www.site.ru:

RewriteCond %{HTTP_HOST} ^site\.ru$
RewriteRule ^(.*)$ http://www.site.ru/$1 [R=301,L]

301 redirect of pages with and without slash

This is another type of duplicate. Here we will glue the pages site.ru/category/ and site.ru/category; as you can see, the second URL has no trailing slash:

RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} ^(.+)/$
RewriteRule ^(.+)/$ /$1 [R=301,L]

If, on the contrary, you need to leave a slash at the end of the URL, then you need this option:

RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*[^/])$ $1/ [R=301,L]

If for some reason you have URLs like site.ru/category//article.html, then use the code:

RewriteCond %{REQUEST_URI} ^(.*)//(.*)$
RewriteRule . %1/%2 [R=301,L]

where "//" can be replaced with "--" or any other doubled characters in the URL.

Mass category replacement

It happens that you renamed a category but thousands of URLs are tied to it. To avoid growing old while writing thousands of identical redirects, use:

RewriteRule ^(.*)/old-category/(.*)$ $1/new-category/$2 [R=301,L]

or, if the old category comes immediately after the domain name:

RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]

Redirect to new page

The simplest case, a redirect from one page to another, is written like this:

Redirect 301 /old-post.html http://new-site.ru/new-post.html

Here old-post.html is your old page, and new-site.ru/new-post.html is the new page; it can be on any domain (including your current one).

Redirect for URLs with parameters

Pages with parameters are harder to redirect; take http://site.ru/page.php?sort=articles as an example. The parameter here is "sort=articles". The code will be as follows:

RewriteCond %{QUERY_STRING} sort=articles
# the trailing "?" in the substitution drops the old query string
RewriteRule .* http://site.ru/page.php? [R=301,L]

Working with extensions

We remove .html from visible URLs; the rule below internally maps an extensionless URL to the corresponding .html file (to do the same for .php, replace $1.html with $1.php):

RewriteRule ^(([^/]+/)*[^.]+)$ /$1.html [L]

We change .php to .html in the URLs and vice versa (don’t forget to swap places in the code):

RedirectMatch 301 (.*)\.php$ http://www.site.com$1.html

Correct 301 redirect to a new domain

I have highlighted this point separately because it relates specifically to moving to another domain. There is an important point here: do not miss the exception for robots.txt, which is handled by the first two lines of the code.

RewriteCond %{REQUEST_FILENAME} robots\.txt$
RewriteRule ^([^/]+)$ $1 [L]
RewriteCond %{HTTP_HOST} ^site\.ru
RewriteRule ^(.*)$ http://new-site.ru/$1 [R=301,L]
RewriteCond %{HTTP_HOST} ^www\.site\.ru
RewriteRule ^(.*)$ http://new-site.ru/$1 [R=301,L]

With the help of such simple manipulations, we get rid of duplicate pages and thereby improve internal optimization and, with it, the quality of the site.

If you have any questions, write in the comments - we’ll sort it out :)

Although changes to Google's algorithms are one of the hottest topics in SEO, many marketers cannot say with certainty exactly how the Panda, Penguin and Hummingbird algorithms have affected the ranking of their sites.

Moz specialists have summarized the most significant changes to Google's algorithms and literally broken down the information about what each update is responsible for.

Google Panda – Quality Inspector

The Panda algorithm, whose main goal is to improve the quality of search results, was launched on February 23, 2011. With its appearance thousands of sites lost their positions, which stirred up the entire Internet. At first, SEOs thought Panda penalized sites caught participating in link schemes. But, as later became known, fighting unnatural links is not within this algorithm's mandate. All it does is assess the quality of a site.

To find out if you are at risk of falling under Panda's filter, answer these questions:

  • Would you trust the information posted on your website?
  • Are there pages on the site with duplicate or very similar content?
  • Would you trust your credit card information to a site like this?
  • Do the articles contain spelling or stylistic errors or typos?
  • Are articles for the site written taking into account the interests of readers or only with the goal of getting into search results for certain queries?
  • Does the site have original content (research, reports, analytics)?
  • Does your site stand out from the competitors that appear alongside it on the search results page?
  • Is your site an authority in its field?
  • Do you pay due attention to editing articles?
  • Do the articles on the site provide complete and informative answers to users' questions?
  • Would you bookmark the site/page or recommend it to your friends?
  • Could you see an article like this in a printed magazine, book, or encyclopedia?
  • Does advertising on the site distract readers' attention?
  • Do you pay attention to detail when creating pages?

Nobody knows for certain which factors Panda takes into account when ranking sites. Therefore, it is best to focus on making your site as interesting and useful as possible. In particular, pay attention to the following points:

  • "Insufficient" content. In this context, the term “weak” implies that the content on your site is not new or valuable to the reader because it does not adequately cover the topic. And the point is not at all in the number of characters, since sometimes even a couple of sentences can carry a considerable semantic load. Of course, if most of your site's pages contain only a few sentences of text, Google will consider it low quality.
  • Duplicate content. Panda will consider your site to be of low quality if most of its content is copied from other sources or if the site has pages with duplicate or similar content. This is a common problem with online stores that sell hundreds of products that differ in only one parameter (for example, color). To avoid this problem, use the canonical tag.
  • Low-quality content. Google loves sites that are constantly updated, so many SEOs recommend publishing new content daily. However, if you publish low-quality content that provides no value to users, such tactics will do more harm than good.

How to get out from under the Panda filter?

Google updates the Panda algorithm monthly. After each update, search robots review all sites and check them against the established criteria. If you fell under the Panda filter and have since made changes to the site (fixed insufficient, low-quality and non-unique content), its positions will improve after the next update. Note that you will most likely have to wait several months for your positions to recover.

Google Penguin – Link Hunter

The Google Penguin algorithm was launched on April 24, 2012. Unlike Panda, this algorithm aims to combat unnatural backlinks.

The authority and significance of a site in the eyes of search engines largely depends on which sites link to it. Moreover, one link from an authoritative source can have the same weight as dozens of links from little-known sites. Therefore, in the past, optimizers tried to get the maximum number of external links in every possible way.

Google has learned to recognize various manipulations with links. Exactly how Penguin works is known only to its developers. All that SEOs know is that this algorithm hunts for low-quality links created manually by webmasters in order to influence a site's rankings. These include:

  • purchased links;
  • exchange of links;
  • links from irrelevant sites;
  • links from satellite sites;
  • participation in link schemes;
  • other manipulations.

How to get out from under the Penguin filter?

Penguin is a filter just like Panda. This means it is regularly updated and re-reviews sites. To get out from under the Penguin filter, you need to get rid of all unnatural links and wait for an update.

If you conscientiously follow Google's guidelines and don't try to gain links through unfair means, you can regain favor with the search engines. However, to regain the top positions, it will not be enough for you to simply remove low-quality links. Instead, you need to earn natural editorial links from trusted sites.

Google Hummingbird is the most “understanding” algorithm

The Hummingbird algorithm is a completely different beast. Google announced the launch of this update on September 26, 2013, but also mentioned that the algorithm had already been in effect for a month. This is why website owners whose rankings fell in early October 2013 mistakenly believe that they fell under the Hummingbird filter. If that were really the case, they would have felt the effect of this algorithm a month earlier. What, then, caused the drop in traffic? Most likely it was another Penguin update, which came into force on October 4 of the same year.

The Hummingbird algorithm was developed to better understand user queries. Now, when a user enters the query "what places in Yekaterinburg serve good food", the search engine understands that by "places" the user means restaurants and cafes.

How to raise yourself in the eyes of Google Hummingbird?

Since Google strives to understand users as best as possible, you should do the same. Create content that will provide the most detailed and useful answers to user queries, instead of focusing on keyword promotion.

Finally

As you can see, all search algorithms have one goal - to force webmasters to create high-quality, interesting and useful content. Remember: Google is constantly working to improve the quality of search results. Create content that will help users find solutions to their problems, and you will be guaranteed to rank first in search results.



The Google Penguin filter is one of the newest algorithms that the company uses when ranking sites in search results.

Today, Google takes into account more than two hundred factors when ranking sites. One algorithm is not enough to account for them all; several are needed, each solving its own tasks.

The main task of the Penguin filter is to identify and block sites that use dishonest promotion methods, chief among them the purchase of link mass. The algorithm is constantly being improved, and the Google Penguin filter is now updated almost continuously.

History of the development of the Penguin algorithm

Google Penguin was released to the world in April 2012. Over the next two months it was updated twice, with the developers adjusting the first version of the filters. The second version of the algorithm appeared almost a year later; the updated Penguin acted more subtly and took into account not only the level of link spam but also the overall quality of the page.

In the fall of 2014, the algorithm was updated again. At that time it worked in such a way that sites that fell under its filters had to wait a long time, after fixing the problems, for the next update in order to be re-checked. The situation changed in 2016 with the release of Penguin 4.0, which operated in real time and was updated continuously. The latest versions of the algorithm act very gently: the level of the site and the quality of its pages are taken into account, and low-quality links are devalued without the whole site being banned.

What does Google Penguin punish for?

Experts believe that the Penguin algorithm complements the Google Panda algorithm, which is responsible for checking site content. To keep your resource from falling under the Google Penguin filter, you need to work carefully with external links to the site and avoid what experts call link manipulation. The main methods of such manipulation are:

  • “Trading” links, when the site owner publishes links to other people’s sites on his resource for money or other payment.
  • Obviously artificial link exchange, when sites link to each other due to collusion of owners, and not because of the quality of the content.
  • Using a large number of texts on the site stuffed with contrived anchors and keywords.
  • Using services that automatically generate links to the site.
  • The presence on the site of links that have direct keywords in the anchor.
  • Using cross-links with anchor keywords in the sidebar and footer of the site.
  • Comments on site materials with links to spam resources.
  • An excessive amount of contextual advertising on the site's home page.

For using such dishonest link schemes, the Google Penguin filter will quickly and reliably drop your site many pages down in the search results. Moreover, it will be very difficult to regain your position, since Google Penguin re-checks a site only twice a year.

How to find out if Google Penguin has applied sanctions

Unlike Panda, which works only in automatic mode, Penguin is also used for manual moderation. If you notice a sharp drop in traffic, go to Google Webmaster Tools, open the "Manual actions" section, and check whether there are any messages from moderators there.

If there is a notice, all you have to do is correct the shortcomings listed in it and submit a request for re-review.

However, most often the algorithm works automatically. In this case, it is worth going to Moz.com and checking whether there have been any recent Penguin updates. If there have been, the diagnosis is correct and it is time to start "treating" the site. You can also establish this correlation using the PenguinTool service on the Barracuda website. True, for this you will have to give the service access to your Google Analytics account so that it can compare the period of the traffic drop with the release time of the new update. The result of the comparison will help you understand whether you have been caught by Penguin's filters or not.

What to do if Google Penguin caught you

If you fall under this algorithm's filters, the worst thing you can do is panic and start deleting all your links. This can destroy the resource completely.

A site that the search engine has judged to be of poor quality needs calm and thoughtful reorganization. Google itself suggests building up link mass again slowly and naturally, mainly through the creation of unique content.

The first thing to do to get out from under the filter is to analyze the resource's link profile. You will need to work out which links come from quality sites, that is, from useful, interesting and well-visited ones, and which come from spam sites. You can do this using the Majestic SEO service. Links to spam sites (your own internal links) must be neutralized with noindex and nofollow bans, which will block "bad" links from indexing and block transitions to them. To deal with external links you will need Google's link disavowal service, the Disavow Links tool; Google Penguin simply does not take into account the links submitted to it.
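
For reference, the disavow file is a plain text list uploaded to Google's Disavow Links tool; here is a minimal sketch with hypothetical addresses:

# lines starting with "#" are comments
# disavow a single page
http://link-farm.example/page-with-link.html
# disavow an entire domain
domain:spammy-directory.example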

The second step is changing link anchors. This is done in two ways. The first is to change a link to an anchorless one, although only an experienced webmaster can do this. The second is to dilute the link profile by building new anchorless links.

The third step is to expand your base of link donors, that is, make sure links come from different kinds of sources: forums, social networks, directories, blogs, and media such as online magazines and news portals. Complete recovery of the site and removal from the filters usually takes 3-4 months.

To avoid falling under Google Penguin's filters, attract only high-quality links, maintain steady growth of your link profile, and do not use direct keyword anchors in your links. Quality content and natural link building from a variety of sources will protect you from search engine sanctions better than any specialist.