Basics of SEO optimization: what SEO is and how it works

Basics of internal SEO optimization

4. Distribution of weight across website pages

Internal linking is the tool for distributing weight across the pages of a website.

Promoting only the main page of a website is fundamentally wrong. The main page often does not cover the specific topics that could be used for promotion: a subsection page may match a particular query much better, so it is the better page to promote for that query. Likewise, a well-developed home page will not save the situation if all the other pages perform poorly. Moreover, PageRank is calculated for each page separately, so promotion must be comprehensive.

With internal linking, the pages of a site exchange links, so pages with higher scores pass part of their weight to other pages of the site. Linking techniques vary and there is no single unified system. Experts describe “star” and “ring” schemes, but I have not yet had to design such schemes for specific sites or use them in practice.

5. Types of internal links

An internal link connects one page of a site to another; in other words, internal links are what internal linking is built from. They are convenient when, while writing an article or any other content, it makes sense to link to another article that serves as a source or may simply interest the reader. Internal links perform several functions:

  • assistance in navigating the site;
  • determining the architecture and hierarchy of the site;
  • ranking of pages on the site.

As for the types of internal links themselves, the following are usually distinguished (a minimal HTML sketch of these link types follows the list):

  • main menu links (including submenu, drop-down tabs, etc.);
  • reference navigation bar (path-address on the site);
  • anchor links within the page (move to a specified area of the page);
  • links from the right, left, top, bottom blocks of the site (advertising, logos, news, slideshows, etc.);
  • links within the main text of the page (article, material, description, etc.).
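
For illustration, here is a minimal HTML sketch of these link types (the page names and addresses are invented):

    <!-- main menu link -->
    <nav><a href="/articles/">Articles</a></nav>

    <!-- breadcrumb navigation bar (path on the site) -->
    <div class="breadcrumbs"><a href="/">Home</a> » <a href="/articles/">Articles</a> » Internal linking</div>

    <!-- anchor link that jumps to a specified area of the same page -->
    <a href="#conclusions">Go to conclusions</a>
    ...
    <h2 id="conclusions">Conclusions</h2>

    <!-- link from a side block (news, advertising, etc.) -->
    <div class="sidebar"><a href="/news/">Latest news</a></div>

    <!-- link inside the main text of the page -->
    <p>See also the article on <a href="/articles/page-weight/">distributing weight across pages</a>.</p>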

6. Human-readable URLs (friendly page addresses)

    A human-readable URL is a page address converted from a dynamically generated one into an address a person can understand. Such addresses have their advantages. First, they are easier to remember: if a visitor is interested in some specific content on the site, he can easily associate the content of a page with its name and address. Well, at least that is how I see it.

    The way addresses are composed also helps the user understand how navigation on the site is organized.

    The way the page address is formed, and the presence of keywords in it, also affects the site's position in search results. If the address contains keywords, the link stands out as relevant and attracts more clicks.
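
    For example (both addresses are invented), a dynamically generated address and its human-readable equivalent might look like this:

        dynamic address:        http://example.com/index.php?section=7&page_id=142
        human-readable address: http://example.com/seo/internal-linking/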

    Breadcrumbs show the position of the current page in the site hierarchy, that is, they help the user navigate the site. More than one chain of breadcrumbs can lead to the same article or page: the same material may topically belong to several sections of the site.

    Breadcrumbs are created to make it easier for the user to work with the site, to let him move between articles and sections, and to make navigation convenient and understandable. The ultimate goal, apparently, is to create such conditions for working with the resource that the visitor will want to come back to the site in the future.
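
    A minimal sketch of breadcrumb markup (the class name and page titles are invented for illustration):

        <nav class="breadcrumbs">
          <a href="/">Home</a> »
          <a href="/blog/">Blog</a> »
          <span>Basics of internal SEO optimization</span>
        </nav>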

    7. Harmony of usability and behavioral factors

    Behavioral factors are metrics that characterize user behavior in search results and directly on the site.

    Behavioral factors are also ranking factors. They characterize a resource by how users behave on it: for example, how often people click on a link to the site in the search results and how long they stay on it after clicking.

    The task of behavioral factors is to improve the quality of search results. To achieve this goal, search engines analyze user behavior on sites, their queries, etc.

    Usability is the ease of using a website. It depends on many factors - on the quality of the content on the site (uniqueness of articles, quality of graphics and videos), as well as on the organization of navigation on the site.

    In general, there is a "three-click rule": if a visitor who lands on an unfamiliar site does not find the information he is looking for within three clicks, he leaves. Accordingly, the site structure should be designed so that a new visitor finds what he wants without wasting extra time.

    As I understand it, the harmony of usability and behavioral factors comes down to building a site convenient enough that all the conditions described above are met. A well-designed website with high-quality content attracts users, and this shows in their behavior: the time spent on the site grows, visits become more frequent, and therefore the site's position in the search results rises as well. Did I manage to answer this question? :)

    8. Maintain clean code and CSS styles

    Good, clean code is above all hand-written code, not code generated by a visual editor. When writing HTML markup it is worth following a few rules: the content must be neatly formatted, contain comments, and be arranged in logical blocks according to its function.
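
    As a rough sketch of what neatly formatted, commented, logically grouped markup can look like (the structure and names here are one possible illustration, not a rule):

        <!-- header: logo and main menu -->
        <header>
          <a href="/"><img src="/images/logo.png" alt="Site logo"></a>
          <nav>
            <a href="/">Home</a>
            <a href="/articles/">Articles</a>
          </nav>
        </header>

        <!-- main content of the page -->
        <main>
          <article>
            <h1>Basics of internal SEO optimization</h1>
            <p>Article text...</p>
          </article>
        </main>

        <!-- footer: contacts and copyright -->
        <footer>
          <p>Contact: info@example.com</p>
        </footer>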

    Competent use of CSS is also important. First, a design implemented with CSS comes out much more compact than one built with presentational HTML that the server has to send. Second, CSS was designed precisely for describing appearance, so making changes and adjustments in a CSS file is easier, more comfortable, and less labor-intensive. Following these points also simplifies indexing of the site by search robots.
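
    A small sketch of the same idea: the appearance is described once in a stylesheet instead of being repeated in the markup (the selector name and values are invented for illustration):

        <!-- in the HTML: no inline styles, only a class name -->
        <p class="article-text">Article text...</p>

        /* in style.css: the appearance is defined in one place */
        .article-text {
          color: #333;
          font-size: 14px;
          margin: 0 0 10px;
        }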

    There are ways to check HTML and CSS code for validity; the main standard against which such audits are carried out is the recommendations of the W3C consortium. There is a set of common errors and shortcomings because of which code is considered invalid. The main remaining problem, though it is gradually being solved, is cross-browser compatibility: because different browsers render elements differently, and some do not render certain elements at all, developers sometimes have to write deliberately non-standard code that will not pass validation but will display correctly in all browsers.

    9. 301 redirect, 404 page and server responses

    A 301 redirect forwards the visitor to a new address at the server level.

    Such redirection is used when a page's address (its URL) has changed and the flow of users needs to be sent from the old address to the new one; to transfer TIC and PageRank indicators to a new domain; to provide access to the same site through several addresses; when moving a page of the site to another location; and to eliminate a mirror domain, i.e. to merge the site address with www and the address without it. A 301 redirect passes roughly 90-99% of the old resource's traffic (weight) to the new one.

    A 301 redirect is a way to keep a site's ranking in search engines when moving it to a new domain or changing the content management system. The redirect can be set up in several ways, depending on the software installed on the server.
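
    For example, on an Apache server redirects are often configured in the .htaccess file; a minimal sketch, assuming mod_alias and mod_rewrite are available (the domain and paths are hypothetical):

        # redirect a single moved page
        Redirect 301 /old-page.html http://example.com/new-page/

        # merge the www mirror with the main domain
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]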

    A 404 error occurs when there is no content at the requested address. When a user wants to open a specific page of the site, he uses its address; the browser requests that address from the server and downloads the page's contents. If the site is up but there is nothing at the given address, the server responds with a 404 error: page not found.

    Such an error can occur for several reasons. On the client side these are a poor connection or a mistyped address.

    On the site side, the page may have been deleted, the link to it may be outdated, or its address may have been changed without setting up the 301 redirect described above.

    As the smart people on Habrahabr teach, the 404 error page can also be put to good use. First of all, such a page must exist: users need to be told about the problem they have run into and, if possible, offered alternatives. Besides a link to the home page, you can add an explanation of why the content is missing. It is also recommended to give the 404 page the same look as the rest of the site. In fact, there are many ways to design a 404 page, and all of them aim not just to avoid losing the visitor, but to create an atmosphere in which he wants to keep using the site.
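
    On an Apache server, for example, a custom error page can be connected through .htaccess, and the page itself is ordinary HTML styled like the rest of the site (the file name and wording are illustrative):

        # .htaccess: show our own page when content is missing
        ErrorDocument 404 /404.html

        <!-- 404.html -->
        <h1>Page not found</h1>
        <p>The address may have changed, or the page has been removed.</p>
        <p>Try the <a href="/">home page</a> or use the site search.</p>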

    10. Robots.txt and sitemap.xml files

    The robots.txt file is a service file that lets the site developer restrict search engines' access to the content of web documents. It is needed so that search robots index the pages that should be indexed, while pages without unique content, duplicates, service folders, and documents of no use to the visitor stay out of the index. robots.txt is a regular text file. In addition to it there is the sitemap, which performs the opposite function: it makes it easier for robots to reach the site's content.
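
    A minimal robots.txt sketch (the blocked folders are hypothetical examples of service sections; the Sitemap line points robots to the file discussed below):

        User-agent: *
        Disallow: /admin/
        Disallow: /search/
        Disallow: /tmp/

        Sitemap: http://example.com/sitemap.xml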

    A sitemap is an XML file with information for search engines (such as Google, Yahoo, Ask.com, MSN, Yandex) about the pages of a website that should be indexed. A sitemap helps search engines determine the location of the site's pages, the time of their last update, the update frequency, and their importance relative to other pages, so that the search engine can index the site more intelligently.
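
    A minimal sitemap.xml sketch following the sitemaps.org protocol (the address, date and values are illustrative):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://example.com/seo/internal-linking/</loc>
            <lastmod>2013-05-10</lastmod>
            <changefreq>monthly</changefreq>
            <priority>0.8</priority>
          </url>
        </urlset>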

    What follows is a translation of a wonderful article, SEO: A Comprehensive Guide for Beginners, which I found on the blog of kissmetrics.com, one of the best-known developers of SEO tools. I think (no, I'm sure!) that this article is a real find for beginners.

    The article is intended primarily for people who have decided to promote their site in search engines on their own. Whether you built the site yourself or turned to professionals, the promotion technology is practically the same. By the way, a girl recently asked me how much it costs to have a website built by a web studio. I answer everyone honestly: I don't know. If you are interested, type "cost of creating a website" into Google or Yandex and find out everything yourself from the source.

    Let's return to the article. It is long, but the topic it covers is one of the most important, so be patient. You will be rewarded for your efforts...

    So let's get started.

    You have heard about SEO many times, but what are your real successes in search engine promotion? When I first heard about it, it sounded like voodoo to me, like magic, and few people back then understood how to use it.

    The reality is this: SEO is not rocket science; anyone can master it. Some "gurus" claim that it will take years to learn SEO, but I don't think that's true. Of course, it will take a lot of time to learn all the nuances, but the truth is that you can learn the basics of SEO in just a few minutes.

    So, I thought, “Shouldn’t I put it all in one article?”

    The article turned out to be very long, because there was a lot to cover. However, after several years of studying SEO and working with various companies to help them break into the top results, I am sure it contains everything you need to know. If you are looking for ways to increase traffic to your site (which in turn will lead to more sales), simply follow these principles.

    Traffic Trap (or How SEO Really Works)

    Many webmasters make the mistake of considering SEO only as a source of free traffic. Certainly, free traffic is a result of SEO, but it does not provide insight into how SEO actually works.

    The real goal of SEO is to help people find your site, or more precisely, the information on it. To achieve this, you must bring your content in line with the requirements of search engines; only then will people be able to find your site.

    Let me give you an example.

    Marie sells exclusive sweaters from her website. On her blog, she shows the process of knitting a sweater, demonstrates how her hands work, and talks about the different types of yarn she uses. Since competition for yarn-related keywords is low, Marie's posts - and they often talked about yarn - soon began to occupy spots on the first page of search results for various types of yarn.

    Do you see the potential problem for Marie?

    People are interested in yarn so they can knit something themselves, and they are not at all interested in purchasing Marie's sweaters. Sure, Marie gets a lot of traffic, but those visitors won't pay attention to her products because they have completely different goals and aspirations.

    Conclusion: if you want SEO to truly work for you, you must first make sure that your goals and the goals of your visitors match. Traffic by itself is not what matters; what matters is determining, at least approximately, what your visitors want, and then optimizing your site for the keywords that will bring you the right visitors.

    How do you know which keywords will bring you a lot of the visitors you need?

    Just do your research.

    How to do research (search) for the right keywords?

    Of course, such research is a somewhat tedious process, but it is absolutely necessary. You need to find keywords that meet the following requirements:

    • are searched for quite often,
    • have a low level of competition, i.e. there are relatively few competing results for these queries,
    • correspond to the content of your site and are relevant to it.

    There are many keyword research tools available, the most popular being the Keyword Tool.

    It shows real search queries and their number, and if you register with AdWords, it will also give you a whole list of keywords suitable for your site.

    However, before you dive into choosing keywords in AdWords, you need to clarify how broad or narrow your keywords should be. This is where the concept of the “long tail” comes into play.

    The long tail

    Formulated by Chris Anderson, the "long tail" concept describes the phenomenon where many low-frequency queries (i.e. queries that are used quite rarely) together send more visitors to your site than a small number of high-frequency queries (keywords).

    For example, Amazon.com might get thousands of visitors for the high-frequency query "DVD", but instead it gets millions of visitors from queries for specific DVD titles. Clearly, no single title brings anywhere near as much traffic as the high-frequency query "DVD" on its own.

    What does the long tail have to do with your website?

    If you add up the traffic from every low-frequency query - and that is the long tail - you will find that such traffic makes up about 80% of the total.
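
    A rough illustration with made-up numbers: suppose the high-frequency query "DVD" brings 1,000 visitors a month, while each of 400 narrow queries for specific titles brings only 10. No single narrow query is worth much on its own, yet together they bring 4,000 visitors - 4,000 out of 5,000 in total, i.e. 80% of the traffic comes from the long tail.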

    So when selecting keywords for your site, you should not focus only on popular queries that carry large volumes of traffic. The overall keyword list should include mid-frequency and low-frequency keywords as well.

    Creating content on your website

    Once you have compiled a list of keywords, you can begin creating content that matches those keywords.

    Search engine robots examine each page of your site, determine its main content, and then decide which keywords describe the content of the page and how to rank them. You can influence their “decision” by optimizing your content in a certain way and highlighting the keywords you need.

    This is especially important for pages whose content bots have not yet learned to analyze. Search bots can easily analyze text, but they are not yet developed enough to evaluate videos, pictures, or audio files. Therefore, you should describe such content with keywords so that the search robot can understand what the page is about and evaluate it.
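
    In practice this usually means adding descriptive text to such elements, for example in the alt attribute of an image (the file name and description are invented):

        <img src="/images/hand-knitted-wool-sweater.jpg"
             alt="Hand-knitted wool sweater made of natural grey yarn">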

    However, I want to warn you right away.

    If you write exclusively for search engines, this will lead to the content of the articles being very boring, and this in no way contributes to turning your visitors into customers who make purchases. It will be much better if you focus your attention on people, create content specifically for them, and only then optimize this content for the requirements of search engines. Then you will retain the persuasiveness of your articles.

    You should pay attention to:

    • Headlines – Create catchy headlines that will pique the reader's interest. Remember that you only have one chance to make a good first impression on a visitor.
    • Keywords – You need to choose keywords that will attract visitors to your site.
    • Links – Link to quality sites; by doing so you signal that your own site is also of good quality. This will help your site gain recognition in your chosen niche, which in turn will lead to more visitors.
    • Quality – Publish only unique, high-quality content. This will encourage users to return to your site, since it will be hard for them to find the information they need anywhere else.
    • Newness – If you publish content that does not become outdated over time or is regularly updated, that is good, but you should also publish fresh content regularly. If you have no time at all to publish new content, you can add a question-and-answer section to the site or start a blog on it (meaning a blog as an aid to the main site - S.V.).

    And most importantly: do not publish other people's content on the pages of your site (meaning stolen content - S.V.), because search engines may penalize you for it.

    Optimizing your website code

    Search bots don’t just read texts on your website. They also study your site's code.

    Therefore, you should optimize the code in 8 different places on the site. To illustrate, I will use the sites zeldman.com and stuffandnonsense.co.uk as examples; they belong to two popular web designers, each of whom uses his own techniques in site markup.

    Title tags

    The title tag contains the name of the site. Here's how it's done on zeldman.com:

    <title>Jeffrey Zeldman Presents the Daily Report</title>

    (in original: Jeffrey Zeldman Presents The Daily Report, - S.V.)

    As you can see, Zeldman focuses on his name and the name of the site. In search engines, his site is likely to be found by searching for “Jeffrey Zeldman” or “Daily Report.”

    Let's see how this is done on another site:

    <title>Unique website design in Flintshire, North Wales from Stuff and Nonsense</title>

    (in original: Fantastic web site design in Flintshire, North Wales from Stuff and Nonsense - S.V.)

    We see that stuffandnonsense.co.uk takes a slightly different approach: by putting the site name at the end, they emphasize what the site is about. You will most likely find this site by searching for "web design in Flintshire, North Wales" or something similar.

    In conclusion: in the title tag you should use the keywords by which your site should be found. Going forward, for maximum optimization you should give every page its own unique title (i.e., a unique title tag).
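
    For example, a unique, keyword-bearing title for the sweater shop discussed earlier might look like this (the wording and shop name are invented):

        <title>Hand-knitted wool sweaters to order | Marie's Knitwear</title>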

    Meta tags

    The main meta tag you should pay attention to is the meta description tag. It does not have much impact on how the site ranks in search results, but it does tell users what your site is about, and that can strongly influence whether they decide to visit your site or not.
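
    In general form the tag looks like this (the description text is a hypothetical example); below we will see how our two example sites fill it in:

        <meta name="description" content="Exclusive hand-knitted sweaters made to order. Photos, prices and ordering details.">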

    Let's take a look at our sites: