
Good afternoon, friends! For a very long time now, blog readers have been asking me to write a post about how to get rid of replytocom. The topic is relevant, because this parameter pushes a huge number of near-identical pages with comment parameters into the index. The thing is that many blogs are built on the WordPress CMS, which suffers from comment duplication. That is why I decided to give all the information, not only about replytocom. In this article I explain the concept of duplicate content, give a method for finding duplicate pages on a site, and explain why you need to get rid of them. In the next post I will show how to remove them.

Duplicate content on the site

Duplicates come in different kinds - quiet and contagious. 🙂 Seriously, a duplicate page is a site document that fully or partially repeats the content of another one. Of course, each duplicate has its own address (page URL). A clear (full) duplicate page may appear for the following reasons:

  • they are created artificially by the site owner for special needs. For example, printable pages that let a visitor of a commercial site copy the information on a selected product/service.
  • they are generated by the site engine, since such behavior is built into it. Some modern CMSs output the same page under different URLs placed in different directories.
  • they appear because of mistakes by the webmaster who manages the site. For example, a resource may have two identical main pages with different addresses (say, site.ru and site.ru/index.php - see the example below).
  • they arise from changes in the site structure. For example, when a new template with a different URL scheme is created, new pages with the old content receive completely different addresses.
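As a small illustration of the index.php case above (site.ru is just a placeholder domain), the same document answers at two addresses, and for a search engine these are two separate pages:

    http://site.ru/            <- main page
    http://site.ru/index.php   <- clear duplicate of the main page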

Fuzzy duplicates come from pages that duplicate content only partially. Such pages share the common part of the site template but differ slightly in content. For example, such partial duplicates can be pages with site search results or with individual elements of an article (say, pictures - this happens very often in many blog templates).

In the following figure, I have collected the most common variants of duplicate pages that are inherent in a blog with the WordPress engine (without taking into account various plugins and template features, which also often create duplicate documents):

The presence of duplicate pages on a site may indicate that either the webmaster is not aware of their presence, or he simply does not know how to deal with them. But you need to fight them, as they lead to various errors and problems in terms of search engine promotion. Let's talk about this now.

Why do you need to remove duplicate pages?

However harmless duplicate pages may seem, they can play a cruel trick on the owner of any website or blog. So, why is duplicate content so dangerous? Here are the main problems that arise when duplicates are present:

Deterioration of site indexing. I call this problem #1. Depending on its source and on the webmaster's mistakes, a site page can have anywhere from one to dozens of duplicates. For example, the main page may have two or three duplicates, and blog posts, thanks to the ubiquitous replytocom parameter, get copied once per comment. If the blog is popular, the number of junk pages becomes huge. Search engines (especially Google) do not like duplicates and therefore often lower the positions of the site itself.
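To make the replytocom problem concrete (the post address below is a made-up example): WordPress appends ?replytocom=ID to the post URL for every comment "reply" link, so a single post can sit in the index under addresses like:

    http://site.ru/recept-salata.html
    http://site.ru/recept-salata.html?replytocom=15
    http://site.ru/recept-salata.html?replytocom=27

All of them show the same article.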

Replacement of the relevant page. The search engine algorithm may consider a duplicate more relevant than the original page being promoted. In that case the search results will contain not the target page but its clone. But the clone has completely different, far weaker parameters (link juice and so on), which over time pessimizes the site in the search results.

So, let's summarize the above. Duplicate pages interfere with normal site indexing, cause the search engine to pick the wrong page as the relevant one, and dilute the effect of natural external links. Duplicates also distribute internal link juice incorrectly, taking weight away from promoted pages and distorting behavioral metrics:

In general, duplicates are a big evil in search engine promotion and an optimizer’s nightmare. 🙂

How to identify and check duplicate pages

There are different ways to find and check duplicate pages. They require varying levels of CMS knowledge and understanding of how the search index works. I'll show you the easiest way. This does not mean that just because it is simple, it is not accurate. With its help, duplicate site documents can be easily found. And most importantly, this method does not require special knowledge and will not take much time.

In order to find and check your website for duplicate pages, you need to either use the advanced Yandex (Google) search or immediately enter a special query into the search bar. Let's look at the first option first.

Checking the site for duplicates using advanced search

Yandex advanced search allows you to get more accurate results thanks to clarifying query parameters. In our case we need only two of them - the site address and a fragment of text from the page we are checking for duplicates. First, we select a piece of text from the page of our website (for the example, a third-party resource is used) that we want to check for duplicates. Then we go to the Yandex advanced search and enter the content fragment and the site address in the appropriate fields:

Next, we click the cherished “Find” button and the Russian search engine starts building the results. They will not look like the ones we usually see - they will consist only of titles and snippets from our resource. If there is just one result, everything is fine - this page has no duplicate content. If there are several results in the output, you will have to roll up your sleeves:

In my example, the search engine found several fuzzy duplicates - a number of pagination pages from some categories. It is immediately obvious that on this site the page with the highlighted text about a salad recipe was placed in several sections at once. And since this resource had no ban on indexing pagination pages, all sorts of duplicates ended up in the index.

Now let's look at the same steps for a foreign search engine. We go to the Google advanced search page and perform the same actions:

Having filled in all the necessary fields of the advanced search, we get the indexed pages of the site under study that contain the specified piece of text:

As you can see, Google has also indexed fuzzy duplicates of the page under study - the same category pages appear in the results.

In principle, you can get the same results without the advanced search. To do this, you just need to type a special query into the regular Google or Yandex search bar. This is the second way to find duplicates.

Finding duplicate pages using a special query

Using the advanced search, you can easily find all duplicates of a given piece of text. Of course, this method will not catch duplicate pages that contain no text - for example, when a duplicate is created by a “crooked” template that for some reason shows on another page a picture from the original page. Such a duplicate cannot be found by the method described above, so another method is needed.

Its essence is simple - using a special operator, we request the list of indexed pages of our entire site (or of a single page) and then look through the results manually in search of duplicates. Here are the syntax rules for this query:
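The syntax itself was shown in a screenshot that has not survived here; as an assumption, the typical form of such a query in Google (Yandex understands a similar site: operator) looks like this, where yourblog.ru and the post address are placeholders:

    site:yourblog.ru                      <- all indexed pages of the site
    site:yourblog.ru/recept-salata.html   <- indexed copies (duplicates) of one specific page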

When we specify the address of the main page in a request, we simply receive a list of pages indexed by a search robot. If we indicate the address of a specific page, we get a list of indexed duplicates of this page. In Yandex they are immediately visible. But in Google everything is a little more complicated - first they will show us those pages that are in the main search results:

As you can see in the picture, in the main search results we have one page of the site and it is also the original. But there are other pages in the index that are duplicates. To see them, you need to click on the “Show hidden results” link:

As a result, we are given a list of duplicates that Google has indexed and linked to the original page (number 1 in the picture). In my example, these duplicates were pages with positions 2 and 3.

In the second position there is a duplicate - the trackback of this document (an automatic notification sent to other sites about this publication). Trackbacks are certainly useful, but their presence in the index is not desirable. The owner of this site understands this very well and has therefore set a ban on indexing trackbacks. This is indicated by the note “Web page description is not available due to restrictions in the robots.txt file.” If you look at the instructions for search engines (robots.txt), you will see the following picture:
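The robots.txt screenshot is not available here; as an assumption, a typical WordPress robots.txt that forbids trackbacks (and, while we are at it, replytocom links) contains directives like:

    User-agent: *
    Disallow: /*trackback
    Disallow: /*?replytocom=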

With these directives the site author banned the indexing of trackbacks. But, unfortunately, Google puts into its index everything it can get its hands on, so here the duplicates simply have to be removed from its database by hand. We'll talk about this in the second article on duplicate content.

The third position shows replytocom, so “beloved” by many bloggers. It appears when commenting is used on blogs and websites, and it produces a huge pile of duplicates - usually roughly equal to the number of comments on the resource. In our example this parameter, like the trackback, is closed from indexing. But Google put it into its index anyway, and it also has to be cleaned out by hand.

By the way, if we slightly change our query, we can get the same results that the advanced search gives when looking for duplicates by a piece of text:
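The modified query was shown only in a screenshot; a sketch of its usual form (the quoted phrase and the domain are placeholders) is:

    "piece of text from the checked page" site:yourblog.ru

The quotes ask for the exact phrase, and site: limits the results to your resource, so the output matches what the advanced search form produces.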

So, friends, in this article I lifted the curtain on the concept of duplicate pages and on how to find and check them. To consolidate the material, I suggest watching my video on this topic. In it I not only showed today's material in two parts, but also added other examples of duplicate content:


In the next article you will find out how to remove these duplicates. See you!

Sincerely, Your Maxim Dovzhenko

Duplicate pages occur on almost every site. This is because the site developers did not take into account many SEO-related nuances, so issues with duplicates are usually resolved by the optimizer together with the webmasters.

1. What are duplicate pages on a website?

Duplicate pages are pages with different URLs (addresses) but the same content.

For example, the same page is available at the following addresses (response code 200):

    /category/razdel.html
    /category/razdel/

There are many similar examples of duplicate pages. Moreover, even if the content of the pages differs slightly, the page name written in the <title> and <h1> tags may still match. That alone is enough to create problems for yourself in the search engines.

Most often, problems with multiple duplicates occur in online stores. Their catalogs are usually displayed through numbered pages whose addresses end with page=N, where N is the page number. Naturally, nobody changes the titles and headings of every such page, so the site may end up with dozens of pages carrying the same title. For example:

    /category/kosmetika?page=1
    /category/kosmetika?page=2
    /category/kosmetika?page=3

A sorting parameter such as sort=alf is often added as well, and then the number of duplicates grows even faster:

    /category/kosmetika?page=1&sort=alf
    /category/kosmetika?sort=alf&page=1
    /category/kosmetika?sort=alf
    /category/kosmetika?page=1

Other sorting options are also possible. As a result, one and the same title is shown on tens or even hundreds of pages with different URLs.

2. Why is it important to deal with duplicate pages?

The search engine ranks documents according to its algorithm. Suppose a user enters a query and your site has many pages with the same title. Which of these pages should the search engine return? It is unclear. Duplicates also dilute the internal weight of the other pages, and the site's trust decreases.

Duplicate pages have a negative impact on the whole site, but the problem can be solved in fairly simple ways. Let's first take a quick look at the options for finding duplicate pages.

3. How to find duplicate pages on a website

3.1. Scanner programs

Scanner programs usually do a good job of finding duplicate pages within a site by following links. This has one big disadvantage: if there is no link to some accessible page, the scanner simply will not find it.

Free site scanners include:

  • Netpeak Spider

3.2. Online services

There are several online services that crawl a site. However, they are unlikely to suit large sites, since they have limitations (for example, free analysis of no more than 500 pages).

The Yandex Webmaster and Google Webmaster services have special HTML optimization sections where you can find duplicate titles. This is probably one of the most accessible and easiest ways to find duplicates.

3.3. Through search queries

You can also try to find duplicates by querying Yandex and Google for repeated titles:

    For Yandex: site:urlsite.ru title:(query)
    For Google: site:urlsite.ru intitle:query

Here urlsite.ru is your website address. This method, however, mainly helps to reveal global engine-level problems, which is exactly what we wanted.

3.4. Potential duplicates

Not every page that could become a duplicate is already in the index, but it is better to protect yourself in advance against the possibility of duplicate documents being indexed.
For example, many engines like to return a correct server response at addresses such as:

    /category/razdel
    /category/razdel/category/
    /category/razdel/category/category/category/category/

If you have a competitor in the search results (and there usually is one), he can easily annoy you simply by placing a couple of links to such pages. A catastrophic number of new site pages then appears, because, as a rule, all links on the site are relative:

    /category/razdel/tovar1.html
    /category/razdel/category/tovar1.html
    /category/razdel/category/category/category/category/tovar1.html

4. How to remove duplicate pages from the index

There are different types of duplicate pages, and each of them has to be dealt with in its own way. Let's consider the possible cases.

4.1. Duplicates due to the site being accessible with and without www

Let's start with the most common situation, when the site is accessible both with www and without www. For example:

    www.site.ru/cat/
    site.ru/cat/

This situation is easily corrected by adding the appropriate directives to .htaccess (see 301 redirect from www to non-www).

Redirect from the version without www to the page with www (site.ru -> www.site.ru):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^site\.ru$
    RewriteRule (.*) http://www.site.ru/$1 [R=301,L]

For the reverse redirect from www to the version without www (www.site.ru -> site.ru):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www\.site\.ru$
    RewriteRule (.*) http://site.ru/$1 [R=301,L]

Note

In addition to these duplicates, I advise you to cover other possible duplicate variants in .htaccess as well. I mean the index.html duplicates of directories:

    /category/index.html
    /category/

And also trailing slashes. Read how to deal with this in separate articles.

4.2. Duplicates due to sorting and extra catalog pages

Each such case needs to be considered individually, but some general recipes can be given. Let's consider two options.

4.2.1. Through the robots meta tag

If it is possible to output a meta directive on the duplicate pages, this is the best option:

    <meta name="robots" content="noindex,nofollow">

This tag should appear only on pages such as:

    /category/kosmetika?page=2
    /category/kosmetika?page=3
    /category/kosmetika?page=4
    /category/kosmetika?page=4&sort=alf

But not on the entire site! This is not difficult to do. For example, in PHP you can write:

    <?php
    // Close pagination and sorting pages from indexing, leave everything else open
    if (!empty($_GET['page']) || !empty($_GET['sort'])) {
        echo '<meta name="robots" content="noindex,nofollow">';
    } else {
        echo '<meta name="robots" content="all">';
    }
    ?>

4.2.2. Via robots.txt

There is a robots.txt file at the root of the site, in which you can specify indexing rules. This is even easier than writing meta tags. But while the first method works 100%, a ban set through robots.txt is only a recommendation to search engines not to index the unnecessary documents.
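The article does not show the actual robots.txt rules for this case; a minimal sketch for the pagination and sorting parameters used in the examples above (Google and Yandex understand the * wildcard in Disallow) might look like this:

    User-agent: *
    Disallow: /*page=
    Disallow: /*sort=

Any URL containing page= or sort= is then closed from crawling.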
One of the main reasons why a site can lose positions and traffic is a growing number of duplicate pages on the site. They can arise because of the peculiarities of the CMS (engine), because of attempts to squeeze maximum search traffic out of a template-driven increase in the number of pages, and also because third parties consciously or unconsciously place links from other resources to your duplicates.

The problem of duplicates is closely related to the problem of the search engine determining the canonical address of a page. In some cases the robot can work out the canonical address itself, for example if only the order of parameters in a dynamic URL has changed:

    ?cat=10&product=25

is essentially the same page as

    ?product=25&cat=10

But in most cases it is difficult to determine the canonical page, so full and partial duplicates end up in the index.

Interestingly, Yandex is not so strict about duplicates and can even bring good traffic to site search results pages (which are partial duplicates of each other), while Google is more critical of duplicates (because of its fight against MFA and template sites).

Basic methods for finding duplicates on a website

Below are the main methods for quickly finding duplicate pages on your website. Use them periodically.

1. Google Webmaster

Go to the Google Webmaster panel and find the menu section “Optimization” – “HTML Optimization”. On this page you can see the number of duplicate meta descriptions and duplicate TITLE tags.

This way you can find complete copies of pages, but unfortunately you cannot identify partial duplicates, which have unique, though template-generated, titles.

2. The Xenu program

Xenu Link Sleuth is one of the popular optimizer tools: it helps carry out a technical audit of a site and, among other things, find duplicate titles (if, for example, you do not have access to Google Webmaster).

More details about this program are given in the review article. Simply crawl the site, sort the results by title and look for visual title matches. For all its convenience, this method has the same drawback: there is no way to find partial duplicate pages.

3. Search results

Search results reflect not only the site itself but also the search engine's attitude towards it. To look for duplicates on Google, you can use a special query:

    site:mysite.ru -site:mysite.ru/&

Its components are:

    site:mysite.ru - shows the pages of mysite.ru that are in the Google index (the general index);
    site:mysite.ru/& - shows the pages of mysite.ru that take part in the search (the main index).

This way you can identify pages with little information and partial duplicates that do not take part in the search and may prevent pages from the main index from ranking higher.
When searching, be sure to click the “repeat the search, including missing results” link if there were few results, in order to get a more objective picture (see the example query site:drezex.com.ua -site:drezex.com.ua/&).

Now that you have found all the duplicate pages, you can safely remove them by adjusting the site engine or by adding the appropriate tag to the page headers.

A duplicate page is another copy of a site page with similar content. There are two types of duplicates:

  1. A complete duplicate - the content is completely identical;
  2. A partial duplicate - the content is largely the same, but there are some differing elements.

Why do duplicate pages have a bad effect on website ranking?

Search engines perceive these pages as separate pages of the site, so their content stops being unique because the information is duplicated. In addition, the link weight of a page is diluted if it has a duplicate. A small number of duplicate pages may not be a big problem, but if they make up more than 50% of the site, the situation urgently needs to be corrected.

Where do duplicates come from?

The most common reason is that the content management system generates duplicate pages because of incorrect settings. The best-known example is CMS Joomla: almost every site on it has to deal with the duplicate problem.

Partial duplicates are often found on online store sites:

  • They can appear on pagination pages if those contain the same text and only the products change;
  • Incorrect catalog filter settings can generate partial and complete duplicates;
  • Product card pages can become duplicates if a product, for example, differs only in color or size (for such products you should make one card listing all the characteristics).

How to find duplicate pages?

There are several ways to find duplicate pages, and each may produce somewhat different results.

1. Some common variants of duplicates can be checked manually:

  • Is the main mirror of the site configured (is it accessible both with www and without www);
  • Are there fuzzy duplicates with and without a trailing slash at the end of the URL;
  • Are there duplicates with index.html, index.asp or index.php at the end of the URL;
  • Pages whose URLs differ only in lower/upper case letters also create duplicates.

2. Analyze the pages indexed by the search engines.

To do this, just enter the query site:mysite.com into Google - it will show the pages of the general index, that is, everything the search engine managed to index on the site.

3. Search by a text fragment.

By typing long fragments of text into the search, you can find the places where it is repeated (and, at the same time, the sites that copied your text). But there are two drawbacks here: the method only suits sites with few pages, and the search engine can analyze a query only up to a certain length.

4. Look into the Google webmaster panel.

In the “Search View” section, find the “HTML Optimization” tab and look at the values of the “Duplicate Meta Descriptions” and “Duplicate Headings” fields. By clicking on them, you can see a list of all pages with repeating title and description tags, as well as the titles and descriptions themselves.

5. Use the Xenu's Link Sleuth program.

The program is distributed free of charge and can determine the URLs of all pages of the site, including scripts and images, as well as external links. Besides duplicates, it is convenient for finding broken links - pages that return a 404 code.

How to eliminate duplicate pages?

There are four effective ways to do this; in our opinion, the most effective are the first two.

1. Manual removal

This works on small sites, provided you thoroughly understand your content management system and make the right settings so that duplicate pages do not appear in the future.

2. Setting up a 301 redirect

A 301 redirect is a permanent redirection from one page to another, which causes the pages to be glued together in the index. It allows you to transfer up to 99% of the link juice, both internal and external, to the target page.

Entire manuals have been written about using 301 redirects, so here we will give only what is needed to eliminate duplicates. It is configured either through the .htaccess file in the root directory of the site or through program code.

To configure the main mirror (redirect from www to without www) and to merge the fuzzy duplicates with and without a trailing slash, redirect rules are added to .htaccess, as sketched below.
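The article's own code for these two cases was lost in the original layout; what follows is only a minimal sketch under the assumption of an Apache server with mod_rewrite enabled and the placeholder domain site.com:

    RewriteEngine On

    # Main mirror: permanently redirect www.site.com to site.com
    RewriteCond %{HTTP_HOST} ^www\.site\.com$ [NC]
    RewriteRule ^(.*)$ http://site.com/$1 [R=301,L]

    # Merge duplicates with and without a trailing slash (remove the slash),
    # but leave real directories untouched
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.+)/$ /$1 [R=301,L]

Each rule issues a permanent (301) redirect, so the duplicate address is glued to the main one.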
A redirect of an individual page looks like this:

    Redirect 301 /oldpage.html http://www.site.com/newpage.html

To build more complex redirects you will need additional rules; there are special services where you can generate the code for a redirect according to a specific template.

3. Use rel="canonical"

This option is best used for partial duplicates, since the non-canonical page is not physically removed from the site and remains available to users.

To set up canonical URLs, a link is written in the head block of the page code:

    <link rel="canonical" href="http://site.com/kopiya" />

4. Setting up robots.txt

This is also an effective method, although it is difficult to remove duplicates that have already been indexed.

Using the Disallow directive, you list all the addresses and address patterns that search engine robots should not access for indexing.
For example:

    User-agent: Yandex
    Disallow: /index*

This says that the Yandex search robot should not visit pages whose URL contains index.

Finding and eliminating all duplicates is a key task in the first stages of website promotion; otherwise you may end up promoting the wrong pages and then spend a long time looking for the problem.
sizes="(max-width: 95px) 100vw, 95px" / loading=lazy loading=lazy></a> <div class="element"> <div class="title"> <a href="https://viws.ru/en/ne-pokazyvaet-demonstraciyu-ekrana-v-skaipe-kakie-vozmozhnosti-dostupny-v.html">What features are available in Skype calls?</a> </div> </div> </div> <div class="news_box"> <a href="https://viws.ru/en/ne-podklyuchaetsya-k-dota-2.html" class="thumb"><img width="95" height="95" src="/uploads/8f9ebdd4a2b1d0efeb6df2c2ec3e4cb4.jpg" class="attachment-mini size-mini wp-post-image" alt="Doesn't connect to Dota 2" sizes="(max-width: 95px) 100vw, 95px" / loading=lazy loading=lazy></a> <div class="element"> <div class="title"> <a href="https://viws.ru/en/ne-podklyuchaetsya-k-dota-2.html">Doesn't connect to Dota 2</a> </div> </div> </div> <div class="news_box"> <a href="https://viws.ru/en/lichnyi-kabinet-oao-tatenergosbyt-dobro-pozhalovat-v-lichnyi-kabinet-vhod-v.html" class="thumb"><img width="95" height="95" src="/uploads/0b3d80daaf734d8e094e6c7fb7b058a3.jpg" class="attachment-mini size-mini wp-post-image" alt="Welcome to "Personal Account"" sizes="(max-width: 95px) 100vw, 95px" / loading=lazy loading=lazy></a> <div class="element"> <div class="title"> <a href="https://viws.ru/en/lichnyi-kabinet-oao-tatenergosbyt-dobro-pozhalovat-v-lichnyi-kabinet-vhod-v.html">Welcome to "Personal Account"</a> </div> </div> </div> <div class="news_box"> <a href="https://viws.ru/en/ne-zapuskaetsya-konsol-servera-1s-8-3-administrirovanie-serverov1s.html" class="thumb"><img width="95" height="95" src="/uploads/d88eaf0a9090b9c3532508a1dcfd24bd.jpg" class="attachment-mini size-mini wp-post-image" alt="The 1s 8 server console does not start" sizes="(max-width: 95px) 100vw, 95px" / loading=lazy loading=lazy></a> <div class="element"> <div class="title"> <a href="https://viws.ru/en/ne-zapuskaetsya-konsol-servera-1s-8-3-administrirovanie-serverov1s.html">The 1s 8 server console does not start</a> </div> </div> </div> <div class="news_box"> <a href="https://viws.ru/en/vidy-oshibok-v-szv-m-kak-rasshifrovat-kody-i-ispravit-oshibki-chastye-oshibki-v.html" class="thumb"><img width="95" height="95" src="/uploads/013dd44c9713fafad38249958390334d.jpg" class="attachment-mini size-mini wp-post-image" alt="Common errors in sv-m and how to correct them Gross error 50 sv-m" sizes="(max-width: 95px) 100vw, 95px" / loading=lazy loading=lazy></a> <div class="element"> <div class="title"> <a href="https://viws.ru/en/vidy-oshibok-v-szv-m-kak-rasshifrovat-kody-i-ispravit-oshibki-chastye-oshibki-v.html">Common errors in sv-m and how to correct them Gross error 50 sv-m</a> </div> </div> </div> </div> </div> <div class="widget"> <div class="heading star">Popular</div> <div class="popular_posts"> <div class="news_box"> <a href="https://viws.ru/en/excel-snyat-zashchitu-ot-zapisi-kak-snyat-parol-v-excel-tri-rabochih-sposoba.html" class="thumb"><img width="95" height="95" src="/uploads/70d62fb686e51dc6790138ab9ef23652.jpg" class="attachment-mini size-mini wp-post-image" alt="How to remove password in Excel?" 
sizes="(max-width: 95px) 100vw, 95px" / loading=lazy loading=lazy></a> <div class="element"> <div class="title"> <a href="https://viws.ru/en/excel-snyat-zashchitu-ot-zapisi-kak-snyat-parol-v-excel-tri-rabochih-sposoba.html">How to remove password in Excel?</a> </div> </div> </div> <div class="news_box"> <a href="https://viws.ru/en/kak-udalit-programmu-v-windows-kak-udalit-programmu-v-windows-kak-udalit.html" class="thumb"><img width="95" height="95" src="/uploads/64641350d192cfd772c9583776d445ec.jpg" class="attachment-mini size-mini wp-post-image" alt="How to remove a program in Windows How to remove a program from a Windows 8 computer" sizes="(max-width: 95px) 100vw, 95px" / loading=lazy loading=lazy></a> <div class="element"> <div class="title"> <a href="https://viws.ru/en/kak-udalit-programmu-v-windows-kak-udalit-programmu-v-windows-kak-udalit.html">How to remove a program in Windows How to remove a program from a Windows 8 computer</a> </div> </div> </div> <div class="news_box"> <a href="https://viws.ru/en/adres-elektronnoi-pochty-icloud-chto-takoe-icloud-i-kak-im-polzovatsya-na-iphone-ipad-i-mac-sozdan.html" class="thumb"><img width="95" height="95" src="/uploads/3886c5c91b6199b7f8f6107199eaf627.jpg" class="attachment-mini size-mini wp-post-image" alt="What is iCloud and how to use it on iPhone, iPad and Mac" sizes="(max-width: 95px) 100vw, 95px" / loading=lazy loading=lazy></a> <div class="element"> <div class="title"> <a href="https://viws.ru/en/adres-elektronnoi-pochty-icloud-chto-takoe-icloud-i-kak-im-polzovatsya-na-iphone-ipad-i-mac-sozdan.html">What is iCloud and how to use it on iPhone, iPad and Mac</a> </div> </div> </div> </div> </div> <div class="widget"> <div class="heading">News</div> <div class="business_news"> <div class="news"> <div class="date">2024-04-17 01:43:25</div> <a href="https://viws.ru/en/kak-rasprostranit-informaciyu-metody-rasprostraneniya-informacii-v.html" class="title">Methods of disseminating information on the Internet Methods of disseminating information on the Internet</a> </div> <div class="news"> <div class="date">2024-04-17 01:43:25</div> <a href="https://viws.ru/en/ne-mogu-ustanovit-licenzionnyi-klyuch-na-noutbuk-kak-uznat.html" class="title">How to find out the Windows operating system activation key</a> </div> <div class="news"> <div class="date">2024-04-17 01:43:25</div> <a href="https://viws.ru/en/bezvozvratnyi-success-php-prostoi-primer-ispolzovaniya-php-i-ajax-chem-dlya-vas.html" class="title">A simple example of using PHP and AJAX</a> </div> <div class="news"> <div class="date">2024-04-16 01:40:17</div> <a href="https://viws.ru/en/kak-sozdat-wmz-koshelek-chto-takoe-wmr-i-kak-s-nim-obrashchatsya-wmr-koshelek.html" class="title">What is WMR and how to use it Wmr wallet example</a> </div> <div class="news"> <div class="date">2024-04-16 01:40:17</div> <a href="https://viws.ru/en/skachat-brauzer-dlya-uskoreniya-interneta-programma-dlya-uskoreniya.html" class="title">The best Internet speedup program</a> </div> </div> </div> <div class="widget ai_widget" id="ai_widget-5"> <div class='dynamic dynamic-13' style='margin: 8px 0; clear: both;'> </div> </div> </div> </div> </div> </div> <div id="footer"> <div class="fixed"> <div class="inner"> <div class="footer_l"> <a href="https://viws.ru/en/" class="logo" style="background:none;">viws.ru</a> <div class="copyright"> <p>viws.ru - All about modern technology. 
Breakdowns, social networks, internet, viruses</p> <p><span>2024 - All rights reserved</span></p> </div> </div> <div class="footer_c"> <ul id="menu-topmenu-1" class="nav"> <li><a href="https://viws.ru/en/feedback.html">Contacts</a></li> <li><a href="">About the site</a></li> <li><a href="">Advertising on the website</a></li> </ul> <div class="footer_menu"> <ul id="menu-nizhnee-1" class=""> <li id="menu-item-"><a href="https://viws.ru/en/category/internet/">Internet</a></li> <li id="menu-item-"><a href="https://viws.ru/en/category/programs/">Programs</a></li> <li id="menu-item-"><a href="https://viws.ru/en/category/instructions/">Instructions</a></li> <li id="menu-item-"><a href="https://viws.ru/en/category/browsers/">Browsers</a></li> </ul> <ul id="menu-nizhnee-2" class=""> <li id="menu-item-"><a href="https://viws.ru/en/category/internet/">Internet</a></li> <li id="menu-item-"><a href="https://viws.ru/en/category/programs/">Programs</a></li> <li id="menu-item-"><a href="https://viws.ru/en/category/instructions/">Instructions</a></li> <li id="menu-item-"><a href="https://viws.ru/en/category/browsers/">Browsers</a></li> </ul> </div> </div> </div> </div> </div> </div> <script type="text/javascript">jQuery(function($) { $(document).on("click", ".pseudo-link", function(){ window.open($(this).data("uri")); } );} );</script> <script type='text/javascript' src='https://viws.ru/wp-content/plugins/contact-form-7/includes/js/scripts.js?ver=4.9.2'></script> <script type='text/javascript' src='https://viws.ru/wp-content/plugins/table-of-contents-plus/front.min.js?ver=1509'></script> <script type='text/javascript' src='https://viws.ru/wp-content/themes/delo/assets/scripts/theme.js'></script> <script type='text/javascript'> var q2w3_sidebar_options = new Array(); q2w3_sidebar_options[0] = { "sidebar" : "sidebar", "margin_top" : 60, "margin_bottom" : 200, "stop_id" : "", "screen_max_width" : 0, "screen_max_height" : 0, "width_inherit" : false, "refresh_interval" : 1500, "window_load_hook" : false, "disable_mo_api" : false, "widgets" : ['text-8','ai_widget-5'] } ; </script> <script type='text/javascript' src='https://viws.ru/wp-content/plugins/q2w3-fixed-widget/js/q2w3-fixed-widget.min.js?ver=5.0.4'></script> <script async="async" type='text/javascript' src='https://viws.ru/wp-content/plugins/akismet/_inc/form.js?ver=4.0.1'></script> </body> </html>