The semantic core of a site. Compiling a semantic core. An automatic way to analyze query competition

In 2014, a new trend emerged in search engine optimization: carefully planning and assembling a semantic core. The semantic core (SC) is an extensive list of key queries for a site, and assembling it pursues the following goals:

    Find a significant number of low-frequency and micro-low-frequency (LF/micro-LF) queries

    Detect minimally competitive queries

    Reach your target audience as much as possible

    Develop a website structure or adjust an existing one

    Increase website traffic

Compiling a semantic core is a complex process, which is divided into the following stages:

    Preparatory stage

    Defining basic website segments

    Increasing the list

    Optimizing the semantic core

    Dividing queries into clusters

    Compiling a list of landing pages

Based on the results, the queries are built into the resource structure. A well-chosen semantic core yields a well-developed website framework with a large number of landing pages.

Key queries

When selecting “keywords” for a specific site, you need to prepare an initial list of site segments according to the planned structure; for each segment, starting entries are formed that characterize the site. Then detailed information about the product or service, according to its characteristics, is added to this primary list.

The list of requests increases significantly, while it contains only relevant data from the website. Combinations of words from the list form phrases from which an extended list of primary queries is compiled.

Expanding the list of requests

There are not enough keywords generated by this method, because user phrases from search engines are not taken into account. You can expand the list of queries using services like wordstat.yandex.ru and search engine suggestions.

Programs such as KeyCollector also help to increase the list.

When collecting query data, you need to specify the search area, i.e. the region. In addition, you can identify and analyze the keywords your competitors are promoting. To do this, follow this algorithm:

    Using an expanded list of primary queries, identify competing sites. Special services will help with this, including www.engine.seointellect.ru/requests_analyses

    Check the results for their visibility by search engines and take the data into account in the future. Examples of such services: www.megaindex.ru/?tab=siteAnalyze, www.spywords.ru.

    Consolidate results.

Optimizing the semantic core

The summary file requires cleaning of entries that are unnecessary for the site: mentions of goods and services that are not on the website, queries with the names of competitors and regions where the product is not distributed. All inappropriate requests must be deleted.

You can clear the semantic core using two methods.

"Group Analysis" in Key Collector

When you launch Key Collector, it displays a list of words and the frequency of their mentions in the selected queries. Mark all inappropriate words; the phrases mentioning them will be highlighted in the list, and you can then delete all the unnecessary entries.

This way you can clean the semantic core of most inappropriate entries, but the preparation does not end there. It often becomes necessary to remove queries by a chosen criterion, such as the mention of a particular city or region. For this, you need to create a list of stop words.

List of stop words

Stop-word lists are divided into general and special. General stop words include geographic ones: city, district, region. That is, if you distribute a product or service in one region, you must exclude from your semantic core all queries mentioning other regions. Special stop words depend on the site: if your site sells goods from 10 brands, queries mentioning any others must be deleted. These stop words are found by reviewing the query list.

Creating a list of stop words is a painstaking task. There are two ways to make your life easier:

    Filter tool in MS Excel

    Stopwords button in Key Collector
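The filtering logic behind both methods can be sketched in a few lines. This is a minimal illustration with invented phrases; it matches whole words only, whereas Key Collector also catches morphological word forms.

```python
def apply_stop_words(queries, stop_words):
    """Split queries into (kept, removed) by whole-word stop-word match."""
    stops = {w.lower() for w in stop_words}
    kept, removed = [], []
    for query in queries:
        if set(query.lower().split()) & stops:
            removed.append(query)
        else:
            kept.append(query)
    return kept, removed

queries = [
    "buy semantic core",            # invented example phrases
    "semantic core free download",
    "semantic core moscow",
]
kept, removed = apply_stop_words(queries, ["moscow", "free"])
```

Reviewing the `removed` list before deleting anything mirrors the manual check the article recommends.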

After cleaning the core, the list is supplemented with the necessary entries. It is then necessary to establish how effective specific queries are at attracting user traffic and how promising they are.

Preparing the semantic core

For further work on the semantic core, it is necessary to select additional information on queries.

You will need to determine the base and exact frequency for each query. First, let's define the terms. Base frequency is the number of times per month that queries including the target phrase, in any morphological form and with any added words, are entered into the search engine.

The exact frequency, by contrast, counts only requests in the exact word form. The query “semantic core” has a base frequency of 8381 and an exact frequency of 1202. That is, users entered the verbatim query, without extra words or changed word forms, 1202 times per month, and with additions 8381 times. The conclusion: the effectiveness of promotion can only be judged by the exact frequency.

Geo-dependency

What is geo-dependency of a request? Geo-dependent queries are those whose search results vary depending on the region. SEO specialists often turn to geo-dependent queries, assigning the necessary regions to a specific site.

Query efficiency coefficient

All the figures needed for the calculations can be obtained with the Key Collector program. The next step is to calculate the query efficiency coefficient (EC) for attracting users to the site. It is calculated using the following formula:

EC (%) = WS"!" × 100 / WS

Here WS is the base frequency according to wordstat.yandex.ru, and WS"!" is the exact frequency.

Let's calculate the EC for the queries “semantic core” and “create a semantic core”:

    Query “semantic core”: WS – 8381, WS“!” – 1202, EC = 14.34%

    Query “create a semantic core”: WS – 242, WS“!” – 14, EC = 5.78%
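The EC formula is easy to script. A small sketch reproducing the figures above (note that standard rounding gives 5.79 for the second query; the article truncates it to 5.78):

```python
def efficiency_coefficient(base_freq, exact_freq):
    """EC (%) = exact frequency * 100 / base frequency."""
    return round(exact_freq * 100 / base_freq, 2)

ec_core = efficiency_coefficient(8381, 1202)   # query "semantic core"
ec_create = efficiency_coefficient(242, 14)    # query "create a semantic core"
```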

A low EC does not mean that a query should not be promoted, but it is worth concentrating on the most effective queries first. Low-frequency queries generally have a high EC; that is, the conversion of unpopular queries is higher.

When building a semantic core, first concentrate on geo-dependent queries, and then move on to the rest.

Conclusion

The processed data forms a semantic core: a list of queries with their characteristics. The core is distributed across site segments according to the query parameters. A well-developed semantic core is the foundation for website promotion on the Internet.

Often, novice webmasters, faced with the need to create a semantic core, do not know where to start. Although there is nothing complicated in this process. Simply put, you need to collect a list of key phrases that Internet users use to search for information on your website.

The more complete and accurate it is, the easier it is for a copywriter to write a good text, and for you to get high positions in searches for the right queries. How to correctly compose large and high-quality semantic cores and what to do with them next so that the site reaches the top and collects a lot of traffic will be discussed in this material.

The semantic core is a set of key phrases grouped by meaning, where each group reflects one need or desire of the user (intent). That is, what a person thinks about when typing a query into the search bar.

The entire process of creating a kernel can be represented in 4 steps:

  1. We are faced with a task or problem;
  2. We formulate in our heads how we can find its solution through a search;
  3. We enter a request into Yandex or Google. Besides us, other people do the same;
  4. The most frequent variants of requests end up in analytics services and become key phrases that we collect and group according to needs. As a result of all these manipulations, a semantic core is obtained.

Is it necessary to select key phrases or can you do without it?

Previously, semantics was compiled in order to find the most frequent keywords on a topic, fit them into the text and get good visibility for them in the search. Over the past 5 years, search engines have been striving to move to a model where the relevance of a document to a query will be assessed not by the number of words and the variety of their variations in the text, but by assessing the disclosure of intent.

For Google, this began in 2013 with the Hummingbird algorithm; for Yandex, in 2016 and 2017 with the Palekh and Korolev technologies, respectively.

Texts written without a semantic core cannot fully cover a topic, which means they cannot compete with the TOP for high-frequency and mid-frequency queries. Relying on low-frequency queries alone makes no sense either: they bring too little traffic.

If you want to successfully promote yourself or your product on the Internet in the future, you need to learn how to create the right semantics that fully reveal the needs of users.

Classification of search queries

Let's look at 3 types of parameters by which keywords are evaluated.

By frequency:

  • High-frequency (HF) – phrases that define the topic. They consist of 1-2 words. On average, the number of searches starts from 1000-3000 per month and can reach hundreds of thousands of impressions, depending on the topic. Most often, site main pages are designed for them.
  • Mid-frequency (MF) – separate directions within the topic. They mostly contain 2-3 words, with an exact frequency of 500 to 1000. Usually these are categories for a commercial site or topics for large informational articles.
  • Low-frequency (LF) – queries related to the search for a specific answer to a question. As a rule, 3-4 words or more. This could be a product card or the topic of an article. On average, 50 to 500 searches per month.
  • When analyzing metrics or statistics-counter data, you may come across another type: micro-low-frequency keys. These are phrases that may be searched only once. There is no point in tailoring a page to them; it is enough to be in the top for the low-frequency queries that include them.
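As a rough illustration, the cutoffs above can be turned into a classifier. The thresholds are the article's averages and should be adjusted per niche:

```python
def classify_by_frequency(exact_freq):
    """Classify a query by exact monthly frequency (article's rough cutoffs)."""
    if exact_freq >= 1000:
        return "HF"        # high-frequency: defines the topic
    if exact_freq >= 500:
        return "MF"        # mid-frequency: a direction within the topic
    if exact_freq >= 50:
        return "LF"        # low-frequency: a specific question
    return "micro-LF"      # asked only a handful of times
```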


By competitiveness:

  • Highly competitive (HC);
  • Medium-competitive (MC);
  • Low-competitive (LC).

According to need:

  • Navigational. Express the user’s desire to find a specific Internet resource or information on it;
  • Informational. Characterized by the need to obtain information as a response to a request;
  • Transactional. Directly related to the desire to make a purchase;
  • Vague or general. Those for which it is difficult to accurately determine the intent.
  • Geo-dependent and geo-independent. Reflect the need to search for information or complete a transaction in your city or without regional reference.


Depending on the type of site, you can give the following recommendations when selecting key phrases for the semantic core.

  1. Information resource. The main emphasis should be on finding topics for articles in the form of mid- and low-frequency queries with low competition. It is recommended to cover the topic broadly and deeply, tailoring pages to a large number of low-frequency keys.
  2. Online store or commercial site. We collect HF, MF and LF queries, segmenting as precisely as possible so that all phrases are transactional and belong to the same cluster. We focus on finding well-converting low-frequency, low-competition keywords.

How to correctly compose a large semantic core - step-by-step instructions

We moved on to the main part of the article, where I will sequentially analyze the main stages that need to be completed to build the core of the future site.
To make the process clearer, all steps are given with examples.

Search for basic phrases

Work on the semantic core begins with selecting a primary list of basic words and phrases that best characterize the topic and are used in a broad sense. They are also called markers.

These can be names of directions, types of products, popular queries from the topic. As a rule, they consist of 1-2 words and have tens and sometimes hundreds of thousands of impressions per month. It’s better not to use very wide keywords, so as not to drown in negative keywords at the expansion stage.

The most convenient way to select marker phrases is with Yandex Wordstat. When you enter a query, the left column shows phrases that contain it, and the right column shows similar queries, among which you can often find suitable ones for expanding the topic. The service also shows the base frequency of a phrase, that is, how many times per month it was searched in all word forms and with any words added to it.

By itself, this frequency is of little interest, so to get more accurate values you need to use operators. Let's figure out what they are and why they are needed.

Yandex Wordstat operators:

1) “…” – quotation marks. A query in quotation marks allows you to track how many times a phrase was searched in Yandex with all its word forms, but without adding other words (tails).

2) ! - Exclamation point. Using it before each word in the query, we record its form and get the number of impressions in the search for a key phrase only in the specified word form, but with a tail.

3) “!... !... !...” - quotation marks and an exclamation mark before each word. The most important operator for the optimizer. It allows you to understand how many times a keyword is requested per month strictly for a given phrase, as it is written, without adding any words.

4) +. Yandex Wordstat ignores prepositions and pronouns in a query. If you need it to take them into account, put a plus sign in front of them.

5) -. The second most important operator. With its help, words that do not fit are quickly eliminated. To use it, after the analyzed phrase we put a minus sign and a stop word. If there are several of them, repeat the procedure.

6) (…|…). If you need data from Yandex Wordstat for several phrases at once, enclose them in brackets and separate them with a vertical bar. In practice, this method is rarely used.
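For batch work it is handy to wrap phrases in these operators programmatically. A small sketch for the two most useful cases, the quotes operator and the quotes-plus-exclamation-mark combination:

```python
def quoted(phrase):
    """The "..." operator: all word forms allowed, but no added words (tails)."""
    return f'"{phrase}"'

def exact_form(phrase):
    """The "!... !..." operator: fixed word forms and no added words."""
    return '"' + " ".join("!" + word for word in phrase.split()) + '"'

query = exact_form("semantic core")   # '"!semantic !core"'
```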

For convenient work with the service, I recommend installing the special browser extension “Wordstat Assistant”. It works in Mozilla Firefox, Google Chrome and Yandex Browser and lets you copy phrases and their frequencies with one click on the “+” or “Add all” icon.


Let's say we have decided to run a blog about SEO. We select the following basic phrases for it:

  • semantic core;
  • optimization;
  • copywriting;
  • promotion;
  • monetization;
  • Direct

Search for synonyms

When formulating a query to search engines, users can use words that are close in meaning, but different in spelling.

For example, "car" and "machine".

It is important to find as many synonyms for the main words as possible in order to increase the coverage of the future semantic core. If this is not done, then during parsing we will miss a whole layer of key phrases that reveal the needs of users.

What we use:

  • Brainstorm;
  • Right column of Yandex Wordstat;
  • Queries typed in Cyrillic;
  • Special terms, abbreviations, slang expressions from the topic;
  • Yandex and Google blocks - search together with the “query name”;
  • Snippets of competitors.

As a result of all the actions for the selected topic, we get the following list of phrases:


Basic Query Expansion

Let's parse these keywords to identify the basic needs of people in this area.
The most convenient way to do this is in the Key Collector program; if you don't want to pay 1,800 rubles for a license, use its free analogue, Slovoeb.

Its functionality is, of course, weaker, but it is enough for small projects.
If you don't want to delve into how these programs work, you can use the Just-Magic and Rush Analytics services. Still, it is better to spend a little time and master the software.

I will show the working principle in Key Collector, but if you work with Slovoeb, everything will also be clear: the program interfaces are similar.

Procedure:

1) Add the list of basic phrases to the program and collect the base and exact frequencies for them. If we are planning promotion in a specific region, we specify the region. For informational sites, this is most often unnecessary.


2) Let's parse the left column of Yandex Wordstat using the added words to get all the queries from our topic.


3) In the end we got 3374 phrases. Let's collect the exact frequency for them, as in point 1.


4) Let’s check if there are any keys with zero base frequency in the list.


If there are, delete them and move on to the next step.

Negative words

Many people neglect collecting negative keywords, replacing it with simply deleting unsuitable phrases. But you will soon realize that it is convenient and really saves time.

Open the Data -> Analysis tab in Key Collector. Select grouping by individual words and scroll through the list of keys. If we see a phrase that does not fit, click the blue icon and add the word, with all its word forms, to the stop words.


In Slovoeb, working with stop words is implemented in a more simplified version, but you can also create your own list of phrases that are not suitable and apply them to the list.

Don’t forget to use sorting by Base Frequency and number of phrases. This option helps you quickly reduce the list of initial phrases or weed out rarely occurring ones.


After we have compiled a list of stop words, we apply them to our project and move on to collecting search tips.

Parsing hints

When you enter a query into Yandex or Google, search engines offer their own options for continuing it from the most popular phrases that Internet users type in. These keywords are called search suggestions.

Many of them do not make it into Wordstat, so when building a semantic core, such queries must be collected as well.

By default, Key Collector parses them by iterating over endings in Cyrillic and Latin, and with a space after each phrase. If you are ready to sacrifice quantity to significantly speed up the process, check the box “Collect only the TOP hints without brute force and a space after the phrase.”


Often among search suggestions you can find phrases with good frequency and competition tens of times lower than in Wordstat, so in narrow niches I recommend collecting as many words as possible.

The time for parsing hints directly depends on the number of simultaneous calls to the search engines' servers. Key Collector supports at most 50 threads,
but to parse in this mode you will need the same number of proxies and Yandex accounts.

For our project, collecting hints produced 29,595 unique phrases. The entire process took a little over 2 hours on 10 threads; with 50 threads it would take about 25 minutes.


Determination of base and exact frequencies for all phrases

For further work, it is important to determine the base and exact frequencies and eliminate all zeros. We keep queries with a small number of impressions if they are targeted.
This will help you better understand the intent and create a more complete article structure than the ones in the top.

In order to collect the frequencies, we first filter out everything unnecessary:

  • repetitions of words;
  • keys with extraneous symbols;
  • duplicate phrases (via the “Implicit Duplicates Analysis” tool).
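Implicit duplicates are phrases that differ only in word order. A rough stand-in for the “Implicit Duplicates Analysis” tool (the real tool also accounts for word forms, which this sketch ignores):

```python
from collections import defaultdict

def implicit_duplicates(phrases):
    """Group phrases that consist of the same words in a different order."""
    groups = defaultdict(list)
    for phrase in phrases:
        # A sorted word tuple is identical for all reorderings of a phrase
        groups[tuple(sorted(phrase.lower().split()))].append(phrase)
    return [g for g in groups.values() if len(g) > 1]

dups = implicit_duplicates(
    ["buy semantic core", "semantic core buy", "create semantic core"]
)
```

Within each duplicate group, you would keep the variant with the higher exact frequency and delete the rest.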


For the remaining phrases, we will determine the exact and base frequency.

a) for phrases of up to 7 words:

  • Select through the filter “Phrase consists of no more than 7 words”
  • Open the “Collect from Yandex.Direct” window by clicking on the “D” icon;
  • If necessary, indicate the region;
  • Select the guaranteed impressions mode;
  • Set the collection period to 1 month and check the boxes for the required frequency types;
  • Click “Get data”.


b) for phrases of 8 words or more:

  • Set the filter for the “Phrase” column – “consists of at least 8 words”;
  • If you need to promote in a specific city, indicate the region below;
  • Click on the magnifying glass and select “Collect all types of frequencies.”


Cleaning keywords from garbage

After we have received information about the number of impressions for our keys, we can begin to filter out those that are not suitable.

Let's look at the procedure step by step:

1. Go to the “Group Analysis” tab of Key Collector and sort the keys by the number of words they contain. The task is to find frequent non-target words and add them to the stop-word list.
We do everything the same way as in the “Negative words” section.


2. We apply all the found stop words to the list of our phrases and go through it so as not to lose target queries. After checking, click “Delete Marked Phrases”.


3. We filter out dummy phrases that are rarely used in exact occurrences, but have a high base frequency. To do this, in the Key Collector program settings, in the “KEY&SERP” item, insert the calculation formula: KEY 1 = (YandexWordstatBaseFreq) / (YandexWordstatQuotePointFreq) and save the changes.


4. We calculate KEY 1 and delete those phrases for which this parameter is 100 or more.
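Steps 3 and 4 can be sketched as follows; the second phrase and its figures are invented for illustration:

```python
def key1(base_freq, exact_freq):
    """KEY 1 = base frequency / exact frequency; large values flag dummy phrases."""
    return base_freq / exact_freq if exact_freq else float("inf")

phrases = {
    "semantic core": (8381, 1202),      # article's real figures
    "site core wallpaper": (5000, 12),  # invented dummy phrase
}
# Keep only phrases whose KEY 1 is below the article's threshold of 100
kept = [p for p, (base, exact) in phrases.items() if key1(base, exact) < 100]
```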


The remaining keys need to be grouped by landing pages.

Clustering

The distribution of queries into groups begins with clustering phrases by the top results, using the free program “Majento Clusterer”. I recommend a paid analogue with wider functionality and faster operation, KeyAssort, but the free one is quite enough for a small core. The only caveat is that working in either of them requires buying XML limits, at an average price of about 5 rubles per 1000 requests. That is, processing an average core of 20-30 thousand keys will cost 100-150 rubles. See the screenshot below for the address of the service I use.


The essence of this clustering method is to combine into one group phrases whose Yandex top-10 results contain:

  • URLs shared with every other phrase in the group (Hard);
  • URLs shared with the most frequent query in the group (Soft).

Depending on the number of such matches for different sites, clustering thresholds are distinguished: 2, 3, 4 ... 10.

The advantage of this method is the grouping of phrases according to people’s needs, and not just by synonymous connections. This allows you to immediately understand which keywords can be used on one landing page.
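A greedy sketch of the Hard method: a phrase joins a group only when its top 10 shares at least the threshold number of URLs with every phrase already in the group. The phrases and URL sets are invented for illustration; real tools work with actual Yandex top-10 data:

```python
def hard_cluster(serps, threshold=3):
    """Greedy Hard grouping over {phrase: set-of-top-10-URLs}."""
    groups = []
    for phrase, urls in serps.items():
        for group in groups:
            # Hard: must share >= threshold URLs with EVERY member
            if all(len(urls & serps[other]) >= threshold for other in group):
                group.append(phrase)
                break
        else:
            groups.append([phrase])
    return groups

serps = {  # invented top-10 URL sets
    "buy semantic core": {"u1", "u2", "u3", "u4"},
    "order semantic core": {"u2", "u3", "u4", "u5"},
    "what is a semantic core": {"u7", "u8", "u9"},
}
clusters = hard_cluster(serps)
```

Soft mode would compare each phrase only against the most frequent member instead of all of them, which is why it yields larger, looser groups.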

Suitable options for informational sites:

  • Soft with a threshold of 3-4 and then cleaning by hand;
  • Hard on 3, and then combining clusters according to the meaning.

Online stores and commercial sites, as a rule, are promoted according to Hard with a clustering threshold of 3. The topic is voluminous, so I will discuss it later in a separate article.

For our project, after grouping using the Hard method on 3, we got 317 groups.


Competition Check

There is no point in promoting for highly competitive queries. It’s difficult to get to the top, and without it there will be no traffic to the article. To understand which topics are profitable to write about, we use the following method:

We focus on the exact frequency of the group of phrases under which the article is written and the competition for Mutagen. For informational sites, I recommend taking on topics that have a total exact frequency of 300 or more, and a competitiveness coefficient of 1 to 12 inclusive.

In commercial topics, focus on the marginality of a product or service and how competitors in the top 10 are doing. Even 5-10 targeted requests per month may be a reason to make a separate page for it.

How to check competition on a request:

a) manually, by entering the appropriate phrase in the service itself or through mass tasks;


b) in batch mode through the Key Collector program.


Topic selection and grouping

Let's consider each of the resulting groups for our project after clustering and select topics for the site.
Majento, unlike KeyAssort, does not let you export the number of impressions for each phrase, so you will have to obtain them additionally through Key Collector.

Instructions:

1) Export all groups from Majento in CSV format;
2) Concatenate the phrases in Excel using the “group:key” mask;
3) Load the resulting list into Key Collector. In the settings, be sure to select the “Group:Key” import mode and disable checking for the presence of phrases in other groups;


4) We collect the base and exact frequencies for the keywords in the newly created groups. (If you use KeyAssort, this is unnecessary: the program can work with additional columns.)
5) We look for clusters with a unique intent that contain at least 3 phrases and more than 300 total impressions across all queries. Then we check the 3-4 most frequent phrases in each for competitiveness according to Mutagen. If among them there are keys with competition below 12, we take the topic into work;

6) We look through the remaining groups. If there are phrases that are close in meaning and worth considering on one page, we combine them. For groups containing new meanings, we look at the prospects for the total frequency of phrases; if it is less than 150 per month, then we postpone it until we go through the entire core. It may be possible to combine them with another cluster and get 300 exact impressions - this is the minimum from which it is worth taking the article into work. To speed up manual grouping, use auxiliary tools: quick filter and frequency dictionary. They will help you quickly find suitable phrases from other clusters;


Attention! How do you know whether clusters can be merged? Take 2 frequent keys selected in step 5 for the landing page and 1 query from the new group.
Add them to Arsenkin's “Upload Top 10” tool, specifying the desired region if necessary. Then look at the number of color-marked intersections of the 3rd phrase's results with the other two. We merge the groups if there are 3 or more. If there are no matches, or only one, do not merge (different intents); with 2 intersections, review the search results by hand and use logic.

7) After grouping the keys, we get a list of promising topics for articles and semantics for them.
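The merge rule from the note above can be expressed as a small decision function. Whether the intersections are counted against each anchor phrase separately (as here) or pooled together is my reading of the text, so treat it as an assumption:

```python
def merge_decision(anchor_serps, candidate_serp):
    """anchor_serps: top-10 URL sets of the 2 anchor phrases;
    candidate_serp: top-10 URL set of the phrase from the new group."""
    overlaps = max(len(candidate_serp & s) for s in anchor_serps)
    if overlaps >= 3:
        return "merge"               # 3+ shared URLs: same intent
    if overlaps == 2:
        return "check by hand"       # borderline: inspect the SERP
    return "different intent"        # 0-1 shared URLs: do not merge
```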


Removing queries for other content types

When compiling a semantic core, it is important to understand that commercial queries are not needed for blogs and information sites. Just like online stores do not need information.

We go through each group and clean out everything unnecessary. If we cannot accurately determine the intent of a query, we examine the search results or use the following tools:

  • Commercialization check from Pixel Tools (free, but with a daily check limit);
  • Just-Magic service, clustering with a checkmark to check the commerciality of the request (paid, cost depends on the tariff)

After this we move on to the last stage.

Optimizing the phrases

We optimize the semantic core so that it is convenient for SEO specialists and copywriters to work with it in the future. To do this, we will leave in each group key phrases that reflect the needs of people as fully as possible and contain as many synonyms for the main phrases as possible.

Algorithm of actions:

  • Sort the keywords in Excel or Key Collector alphabetically from A to Z;
  • Choose those that reveal the topic from different angles and in different words. All other things being equal, we keep phrases with a higher exact frequency or a lower KEY 1 value (the ratio of the base frequency to the exact frequency);
  • Delete keywords with fewer than 7 impressions per month that carry no new meanings and contain no unique synonyms.
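The algorithm above, minus the manual judgment in the second step, can be sketched like this (phrases and figures invented):

```python
def optimize_group(keys, min_exact=7):
    """keys: (phrase, base_freq, exact_freq) tuples.
    Drop keys below the exact-frequency floor, then sort alphabetically."""
    kept = [k for k in keys if k[2] >= min_exact]
    return sorted(kept, key=lambda k: k[0].lower())

group = [
    ("core of the site", 100, 5),           # too few exact impressions
    ("assemble semantic core", 250, 40),
    ("semantic core example", 90, 12),
]
result = optimize_group(group)
```

The second step of the algorithm, choosing between near-synonyms, still needs a human eye; the KEY 1 ratio only helps break ties.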

An example of what a well-composed semantic core looks like:

I marked in red the phrases that do not match the intent. If you neglect my recommendations on manual grouping and do not check compatibility, the page will end up optimized for incompatible key phrases, and you will not see high positions for the promoted queries.

Final checklist

  1. We select the main high-frequency queries that set the topic;
  2. We look for synonyms for them using the left and right columns of Wordstat, competitor sites and their snippets;
  3. We expand the received queries by parsing the left column of Wordstat;
  4. We prepare a list of stop words and apply them to the resulting phrases;
  5. Parsing Yandex and Google tips;
  6. We collect the base and exact frequencies;
  7. We expand the list of negative keywords and clean out garbage and dummy queries;
  8. We do clustering using Majento or KeyAssort. For informational sites in Soft mode, the threshold is 3-4. For commercial Internet resources using the Hard method with a threshold of 3.
  9. We import the data into Key Collector and determine the competition of 3-4 phrases for each cluster with a unique intent;
  10. We select topics and decide on landing pages for queries based on an assessment of the total number of accurate impressions for all phrases from one cluster (from 300 for information specialists) and competition for the most frequent of them according to Mutagen (up to 12).
  11. For each suitable page, we look for other clusters with similar user needs. If we can consider them on one page, we combine them. When the need is not clear or there is a suspicion that there should be a different type of content or page as an answer to it, we check the search results or through the Pixel Tools or Just-Magic tools. For content sites, the core should consist of information requests; for commercial sites, transactional ones. We remove the excess.
  12. We sort the keys in each group alphabetically and leave those that describe the topic from different angles and in different words. All other things being equal, priority is given to those queries that have a lower ratio of base frequency to exact frequency and a higher number of precise impressions per month.

What to do with the SEO core after its creation

We compiled a list of keys, gave them to the author, and he wrote an excellent article in full, revealing all the meanings. Eh, I’m daydreaming... A sensible text will only work if the copywriter clearly understands what you want from him and how to test himself.

Let's look at 4 components; work them out well and you are guaranteed to get a lot of targeted traffic to the article:

Good structure. We analyze the queries selected for the landing page and identify what needs people have in this topic. Next, we write an outline for the article that fully answers them. The task is to make sure that when people visit the site, they receive a voluminous and comprehensive answer regarding the semantics that you have compiled. This will give good behavioral and high relevance to the intent. After you have made a plan, look at your competitors' websites by typing the main promoted query into the search. You need to do it exactly in this order. That is, first we do it ourselves, then we look at what others have and, if necessary, we modify it.

Optimization for keys. We tailor the article itself to the 1-2 most frequent keys with Mutagen competition up to 12. Another 2-3 mid-frequency phrases can be used as headings, but in diluted form, that is, inserting additional words into them, using synonyms and word forms. We rely on low-frequency phrases, from which the unique part, the tail, is pulled out and evenly introduced into the text. The search engines themselves will find and glue everything together.

Synonyms for basic queries. We write them out separately from our semantic core and set the task for the copywriter to use them evenly throughout the text. This will help reduce the density of our main words and at the same time the text will be optimized enough to get to the top.

Thematic-setting phrases. LSIs themselves do not promote the page, but their presence indicates that the written text most likely belongs to the “pen” of an expert, and this is already a plus for the quality of the content. To search for thematic phrases, we use the “Technical Specifications for a Copywriter” tool from Pixel Tools.


An alternative method for selecting key phrases using competitor analysis services

There is a quick approach to creating a semantic core that is suitable for both beginners and experienced users. The essence of the method is that we initially select keywords not for the entire site or category, but specifically for an article or landing page.

It can be implemented in 2 ways, which differ in how we choose topics for the page and how deeply we expand the key phrases:

  • by parsing the main keys;
  • based on competitor analysis.

Each of them can be implemented at a simple or more complex level. Let's look at all the options.

Without using programs

A copywriter or webmaster often doesn’t want to wrestle with the interfaces of a large number of programs, but still needs good topics and key phrases for them.
This method is just for beginners and those who don’t want to bother: all actions are performed without additional software, using simple and understandable services.

What you will need:

  • Keys.so service for competitor analysis – 1500 rub. With promo code “altblog” – 15% discount;
  • Mutagen – checking the competitiveness of queries at 30 kopecks each, collecting base and exact frequencies at 2 kopecks per check;
  • Bukvarix – free version or business account for 995 rub. (currently discounted to 695 rub.)

Option 1. Selecting a topic by parsing basic phrases:

  1. We select the main keys from the topic in a broad sense, using brainstorming and the left and right columns of Yandex Wordstat;
  2. Next, we look for synonyms for them, using the methods discussed earlier;
  3. We enter all received marker queries into Bukvarix (a paid plan is required) in the advanced mode “Search using a list of keywords”;
  4. We indicate in the filter: “!Exact!frequency” from 50, Number of words from 3;
  5. We upload the entire list to Excel;
  6. We select all the keywords and send them for grouping to the Kulakov Clusterer service. If the site is regional, select the desired city. We leave the clustering threshold for informational sites at 2, for commercial sites we set it to 3;
  7. After grouping, we select topics for articles by looking through the resulting clusters. We take those with 3 or more phrases and a unique intent. Analyzing the URLs of top-ranking sites in the “Competitors” column (on the right in the table of Kulakov’s service) helps to better understand people’s needs. Also, don’t forget to check competitiveness with Mutagen: run 2-3 queries from the cluster, and if all of them come out above 12, the topic is not worth taking;
  8. The name of the future landing page has been decided, all that remains is to select key phrases for it;
  9. From the “Competitors” field, copy 3 URLs with the appropriate type of pages (if the site is informational, we take links to articles; if it is a commercial site, then to stores);
  10. We insert them sequentially into keys.so and upload all the key phrases for them;
  11. We combine them in Excel and remove duplicates;
  12. The service data alone is not enough, so we need to expand it. Let's use Bukvarix again;
  13. The resulting list is sent for clustering to the “Kulakov Clusterer”;
  14. We select groups of requests that are suitable for the landing page, focusing on intent;
  15. We remove the base and exact frequency through Mutagen in the “Mass Tasks” mode;
  16. We upload the list with updated impression data into Excel and remove zeros for both types of frequency;
  17. Also in Excel, we add a column with the ratio of base frequency to exact frequency and keep only the keys for which this ratio is less than 100;
  18. We delete requests for other types of content;
  19. We leave phrases that reveal the main intention as fully as possible and in different words;
  20. We repeat all the same steps in steps 8-19 for the remaining topics.
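Steps 15-17 above boil down to straightforward filtering. A minimal Python sketch of the idea (the sample phrases and numbers are invented, not real Mutagen export data):

```python
# Filter parsed keys the way steps 15-17 describe:
# drop zero frequencies, then keep only phrases whose
# base-to-exact frequency ratio is below 100.

keys = [
    # (phrase, base_frequency, exact_frequency) - made-up sample data
    ("buy a washing machine", 5400, 320),
    ("washing machine reviews", 900, 0),      # zero exact -> drop
    ("washing machine", 110000, 800),         # ratio 137.5 -> drop
    ("buy a washing machine in moscow", 450, 90),
]

filtered = [
    (phrase, base, exact)
    for phrase, base, exact in keys
    if base > 0 and exact > 0 and base / exact < 100
]

for phrase, base, exact in filtered:
    print(phrase, base, exact, round(base / exact, 1))
```

The ratio threshold weeds out "inflated" phrases whose broad frequency is mostly noise, exactly the dummies discussed later in the article.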

Option 2. Select a topic through competitor analysis:

1. We look for top sites in our field by entering high-frequency queries and viewing the results through Arsenkin’s “Top 10 Analysis” tool. It is enough to find 1-2 suitable resources.
If we are promoting a site in a specific city, we indicate the region;
2. Go to the keys.so service, enter the URLs of the sites we found, and see which competitor pages bring the most traffic.
3. We check 3-5 of the most accurate frequency queries for competitiveness. If for all phrases it is above 12, then it is better to look for another topic that is less competitive.
4. If you need to find more sites for analysis, open the “Competitors” tab and set the parameters: similarity - 3, thematic - 10. Sort the data in descending order of traffic.
5. After we have chosen a topic, enter its name into the search results and copy 3 URLs from the top.
6. Next we repeat points 10-19 from the 1st option.

Using Key Collector or SlovoEB

This method will differ from the previous one only in the use of the Key Collector program for some operations and in a deeper expansion of the keys.

What you will need:

  • Key Collector program – 1800 rubles;
  • all the same services as in the previous method.

"Advanced - 1"

  1. We parse the left and right columns of Yandex for the entire list of phrases;
  2. We remove the exact and basic frequency through Key Collector;
  3. We calculate the Key 1 indicator (the ratio of base frequency to exact frequency);
  4. We delete queries with zero frequency and with Key 1 > 100;
  5. Next, we do everything the same as in paragraphs 18-19 of option 1.

"Advanced - 2"

  1. We do steps 1-5, as in option 2;
  2. We collect keys for each URL in keys.so;
  3. Removing duplicates in Key Collector;
  4. We repeat points 1-4, as in the “Advanced-1” method.

Now let’s compare the number of keys received and their total exact frequency when collecting the semantic core using the different methods:

As we can see from the table, the best result was shown by the alternative, per-page methods “Advanced 1” and “Advanced 2”: 34% more target keys, while total traffic across the cluster was 51% higher than with the classic method.

Below in the screenshots you can see what the finished kernel looks like in each case. I took phrases with an exact number of impressions from 7 per month so that I could evaluate the quality of the keywords. For full semantics, see the table at the “View” link.

A)


B)


C)

Now you know that the most common method - the one everyone uses - is not always the most correct, but you shouldn’t dismiss the other methods either. Much depends on the topic itself. For commercial sites, where there are not many keys, the classic option is quite sufficient. You can also get excellent results on informational sites if you draw up the copywriter’s specifications correctly and do a good job on structure and SEO optimization. We will cover all of this in detail in the following articles.

3 common mistakes when creating a semantic core

1. Collecting phrases only from the top of Wordstat. Parsing Wordstat alone is not enough for a good result!
More than 70% of queries that people enter rarely or only occasionally never make it into Wordstat at all. Yet among them there are often key phrases with good conversion and genuinely low competition. How do you avoid missing them? Be sure to collect search suggestions and combine them with data from different sources (site counters, statistics services, and keyword databases).

2. Mixing information and commercial requests on one page. We have already discussed that key phrases differ according to the type of needs. If a visitor comes to your site who wants to make a purchase, and sees a page with an article as an answer to his request, do you think he will be satisfied? No! Search engines also think the same way when they rank a page, which means you can immediately forget about the top for mid-range and high-frequency phrases. Therefore, if you are in doubt about determining the type of request, look at the search results or use the Pixel Tools and Just-Magic tools to determine commerciality.

3. Choosing very competitive queries to promote. Positions for high-frequency, highly competitive phrases depend 60-70% on behavioral factors, and to earn those you need to get to the top. The more applicants there are, the longer the queue and the higher the requirements for sites. Everything is the same as in life or sports: becoming a world champion is much harder than earning the same title in your city.
Therefore, it is better to enter a quiet niche than an overheated one.

Previously, getting to the top was even harder. Sites held their places on a first-come-first-served basis: the leaders took the top spots and could only be displaced by someone accumulating better behavioral factors. But how could you accumulate them while sitting on the second or third page? Yandex broke this vicious circle in the summer of 2015 by introducing the “multi-armed bandit” algorithm. Its essence is precisely to randomly raise and lower site positions in order to see whether worthier candidates for the top have appeared.

How much money do you need to start?

To answer this question, let’s calculate the cost of the arsenal of programs and services needed to prepare and group key phrases for 100 articles.

The bare minimum (suitable for the classic version):

1. SlovoEB - free
2. Majento clusterer - free
3. For captcha recognition - 30 rubles.
4. Xml limits - 70 rub.
5. Checking the competition of a request for Mutagen - 10 checks per day for free
6. If you are not in a hurry and are willing to spend 20-30 hours on parsing, you can do without a proxy.
—————————
The result is 100 rubles. If you enter captchas yourself and obtain xml limits in exchange for those shared from your own website, you can actually prepare the core for free. You will just need to spend an extra day setting up and mastering the programs, and another 3-4 days waiting for the parsing results.

Standard set of semanticist (for advanced and classical methods):

1. Key Collector - 1900 rubles
2. KeyAssort - 1700 rubles
3. Bukvarix (business account) - 650 rubles.
4. Competitor analysis service keys.so - 1,500 rubles.
5. 5 proxies - 350 rubles per month
6. Anti-captcha - approximately 30 rubles.
7. Xml limits - about 80 rubles.
8. Checking competition with Mutagen (1 check = 30 kopecks) - we’ll keep it to 200 rubles.
———————-
The result is 6410 rubles. You can, of course, do without KeyAssort by replacing it with the Majento clusterer, and use SlovoEB instead of Key Collector. Then 2810 rubles will be enough.
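The totals above are easy to verify; a tiny sketch tallying the "standard set" and the budget variant:

```python
# Tally the "standard set" of the semanticist from the list above.
standard = {
    "Key Collector": 1900,
    "KeyAssort": 1700,
    "Bukvarix business account": 650,
    "keys.so": 1500,
    "5 proxies (per month)": 350,
    "anti-captcha": 30,
    "xml limits": 80,
    "Mutagen checks": 200,
}
total = sum(standard.values())
print(total)  # 6410

# Budget variant: Majento clusterer instead of KeyAssort and
# SlovoEB instead of Key Collector (both replacements are free).
budget = total - standard["KeyAssort"] - standard["Key Collector"]
print(budget)  # 2810
```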

Should you trust the development of the kernel to a “pro” or is it better to figure it out and do it yourself?

If a person regularly does what he loves and keeps getting better at it, then, logically, his results should be better than a beginner’s. With keyword selection, however, it turns out to be exactly the opposite.

Why does a beginner do better than a professional in 90% of cases?

It's all about the approach. The semanticist's task is not to assemble the best possible core for you, but to complete the job in the shortest time and at a quality you will accept.

If you do everything yourself using the algorithms discussed earlier, the result will be an order of magnitude higher for two reasons:

  • You understand the topic. This means that you know the needs of your clients or site users and will be able to maximally expand marker queries for parsing at the initial stage, using a large number of synonyms and specific words.
  • You are interested in doing everything well. The owner of a business, or an employee of the company, will naturally approach the issue more responsibly and try to do everything to the maximum. The more complete the core and the more low-competition queries it contains, the more targeted traffic you can collect, which means higher profit for the same investment in content.

How to find the remaining 10% who will build the core better than you?

Look for companies where the selection of key phrases is a core competency, and discuss up front what result you want: the same as everyone else’s, or the maximum. In the second case it will cost 2-3 times more, but in the long run it will pay off many times over. For those who want to order the service from me, all the necessary information and conditions are available. I guarantee quality!

Why is it so important to fully develop semantics?

Here, as in any area, the principle of “good and bad choices” works. What is its essence?
Every day we are faced with what we choose:

  • meet a person who seems fine but doesn’t really attract you, or figure yourself out and build a harmonious relationship with the one you need;
  • do a job you don’t like, or find what you love and make it your profession;
  • rent space for a store in a low-traffic area, or wait until a suitable spot becomes available;
  • hire not the best sales manager, but the one who performed best at today’s interview.

Everything seems clear. But look at it from the other side, treating each choice as an investment in the future - this is where the fun begins!

You saved 3-5 thousand on the semantic core. Happy as can be! But here is what it leads to next:

a) for information sites:

  • Traffic losses of at least 1.5 times for the same investment in content. Comparing the different methods of obtaining key phrases, we already established experimentally that the alternative method collects 51% more;
  • The project drops faster in search results. It’s easy for competitors to get ahead of us by giving a more complete answer in terms of intent.

b) for commercial projects:

  • Fewer leads or higher value. If we have semantics like everyone else, then we are promoting according to the same queries as our competitors. A large number of offers with constant demand reduces the share of each of them in the market;
  • Low conversion. Specific queries convert into sales better. By saving on the semantic core, we lose the most conversion-friendly keys;
  • It's harder to advance. There are many people who want to be at the top - the requirements for each of the candidates are higher.

I wish you to always make a good choice and invest only in the positive!

P.S. Bonus “How to write a good article with bad semantics”, as well as other life hacks for promoting and making money on the Internet, read in my group

The semantic core is a rather hackneyed topic, isn’t it? Today we will fix this together by collecting semantics in this lesson!

Don't believe me? - see for yourself - just enter the phrase semantic core of the site into Yandex or Google. I think that today I will correct this annoying mistake.

But really, what does ideal semantics look like to you? You might think this is a stupid question, but it is not stupid at all; it’s just that most webmasters and site owners firmly believe they know how to compose semantic cores, that any schoolchild could cope with it, and they even try to teach others... But in reality everything is much more complicated. I was once asked: what should come first - the site itself and its content, or the semantic core? And the question came from someone who does not consider himself a novice in SEO. It made me realize how complex and ambiguous this problem is.

The semantic core is the basis of the foundations - the very first step that stands before the launch of any advertising campaign on the Internet. Along with this, site semantics is the most tedious process that will require a lot of time, but will more than pay off in any case.

Well... Let’s create it together!

A short preface

To create the semantic field of a website, we need one and only program - Key Collector. Using it as an example, I will walk through collecting a small semantic core. Besides the paid program, there are free analogues like SlovoEB and others.

Semantics is assembled in several basic stages, among which the following should be highlighted:

  • brainstorming - analysis of basic phrases and preparation of parsing
  • parsing - expansion of basic semantics based on Wordstat and other sources
  • screening - screening after parsing
  • analysis - analysis of frequency, seasonality, competition and other important indicators
  • refinement - grouping, separation of commercial and informational phrases of the core

The most important stages of collection will be discussed below!

VIDEO - compiling a semantic core for competitors

Brainstorming when creating a semantic core - flexing our brains

At this stage, you need to mentally outline the site’s semantic core and come up with as many phrases as possible for our topic. So, launch Key Collector and select Wordstat parsing, as shown in the screenshot:

A small window opens in front of us, where we need to enter as many phrases as possible on our topic. As I already said, in this article we will create an example set of phrases for this blog, so the phrases could be as follows:

  • seo blog
  • seo blog
  • blog about SEO
  • blog about SEO
  • promotion
  • promotion project
  • promotion
  • promotion
  • blog promotion
  • blog promotion
  • blog promotion
  • blog promotion
  • promotion with articles
  • article promotion
  • miralinks
  • work at sape
  • buying links
  • purchasing links
  • optimization
  • page optimization
  • internal optimization
  • self-promotion
  • how to promote a resource
  • how to promote your website
  • how to promote a website yourself
  • how to promote a website yourself
  • self-promotion
  • free promotion
  • free promotion
  • search engine optimization
  • how to promote a website in Yandex
  • how to promote a website in Yandex
  • promotion under Yandex
  • Google promotion
  • promotion on Google
  • indexing
  • indexing acceleration
  • donor selection site
  • donor screening
  • promotion by guards
  • use of guards
  • blog promotion
  • Yandex algorithm
  • Tits update
  • search database update
  • Yandex update
  • links forever
  • eternal links
  • rent links
  • rented link
  • links with monthly payment
  • compiling a semantic core
  • promotion secrets
  • promotion secrets
  • SEO secrets
  • optimization secrets

I think that’s enough - the list already runs half a page ;) The general idea is that at the first stage you need to analyze your industry as fully as possible and select the maximum number of phrases reflecting the site’s theme. Though if you miss something at this stage, don’t despair - the missed phrases will surface at the next stages; you’ll just have to do some extra work, but that’s okay. We take our list and copy it into Key Collector. Next, click the button - Parse from Yandex.Wordstat:

Parsing can take quite a long time, so you should be patient. The semantic core usually takes 3-5 days to assemble, and the first day will be spent preparing the basic semantic core and parsing.

I wrote detailed instructions on how to work with the resource and how to select keywords. You can also find out about website promotion based on low-frequency queries.

Additionally, I will say that instead of brainstorming, we can use ready-made semantics of competitors using one of the specialized services, for example, SpyWords. In the interface of this service, we simply enter the keyword we need and see the main competitors who are present in the TOP for this phrase. Moreover, the semantics of any competitor’s website can be completely downloaded using this service.

Next, we can select any of them and pull out his queries, which will remain to be sifted out from the garbage and used as basic semantics for further parsing. Or we can do it even simpler and use .

Cleaning up semantics

As soon as Wordstat parsing stops completely - it's time to weed out the semantic core. This stage is very important, so treat it with due attention.

So, my parsing is over, but it produced a great many phrases, and sifting through the words could eat up extra time. Therefore, before moving on to determining frequency, you should perform an initial cleaning of the words. We will do this in several stages:

1. Let's filter out queries with very low frequencies

To do this, click on the symbol for sorting by frequency, and begin clearing out all queries with frequencies below 30:

I think that you can easily cope with this point.

2. We will remove queries that do not make sense

There are queries that have sufficient frequency and low competition but don’t fit our theme at all. Such keys must be removed before checking the exact occurrences, because that check can be very time-consuming. We will delete such keys manually. For my blog, the following turned out to be superfluous:

  • search engine optimization courses
  • selling a promoted website

Semantic core analysis

At this stage, we need to determine the exact frequencies of our keys, for which you need to click on the magnifying glass symbol, as shown in the image:

The process is quite long, so you can go and make yourself some tea)

When the check has finished, we need to continue cleaning our core.

I suggest you delete all keys with a frequency of less than 10 requests. Also, for my blog, I will delete all queries with values ​​above 1,000, since I do not plan to move forward with such queries.
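This cleaning pass is just a range filter on exact frequency; a minimal sketch with invented numbers (the 10 and 1,000 cut-offs mirror the thresholds above, chosen for this particular blog):

```python
# Drop queries outside the workable frequency band:
# below 10 there is almost no traffic, above 1,000
# the competition is beyond this blog's plans.
queries = {
    "seo blog": 740,
    "how to promote a website yourself": 260,
    "promotion": 48000,    # too broad and competitive -> drop
    "donor screening": 4,  # too rare -> drop
}

MIN_FREQ, MAX_FREQ = 10, 1000
core = {q: f for q, f in queries.items() if MIN_FREQ <= f <= MAX_FREQ}
print(core)
```

Your own thresholds may differ depending on the niche; the point is that both bounds are applied in one pass.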

Export and grouping of the semantic core

Do not think that this stage will be the last - not at all! Now we need to transfer the resulting group to Excel for maximum clarity. Then we will sort by page, spot the many shortcomings, and correct them.

Exporting site semantics to Excel is not at all difficult. To do this, you just need to click on the corresponding symbol, as shown in the image:

After inserting into Excel, we will see the following picture:

Columns marked in red must be deleted. Then we create another table in Excel, which will contain the final semantic core.

The new table will have 3 columns: the page URL, the key phrase, and its frequency. For the URL, select either an existing page or one that will be created in the future. First, let’s select the keys for the main page of my blog:

After all the manipulations, we see the following picture. And several conclusions immediately arise:

  1. such high-frequency queries should have a much larger tail of less frequent phrases than we see
  2. a new key has surfaced that we did not take into account earlier - “seo news”
  3. another overlooked key - “SEO articles”. These keys need to be parsed

As I already said, not a single key can be hidden from us. The next step for us is to brainstorm these three phrases. After brainstorming, we repeat all the steps starting from the very first point for these keys. All this may seem too long and tedious to you, but that’s how it is - compiling a semantic core is a very responsible and painstaking job. But a well-designed field will greatly help in website promotion and can greatly save your budget.

After all the operations done, we were able to get new keys for the main page of this blog:

  • best seo blog
  • seo news
  • SEO articles

And some others. I think that the technique is clear to you.

After all these manipulations, we will see which pages of our project need to be changed and which new pages need to be added. Most of the keys we found (with frequencies up to 100, and sometimes much higher) can easily be promoted on their own.

Final elimination

In principle, the semantic core is almost ready, but there is one more rather important point that will help us significantly improve our semantic group. For this we need Seopult.

*In fact, here you can use any of the similar services that allow you to find out the competition by keywords, for example, Mutagen!

So, we create another table in Excel and copy only the names of the keys there (middle column). In order not to waste a lot of time, I will copy only the keys for the main page of my blog:

Then we check the cost of receiving one click using our keywords:

The cost of transition for some phrases exceeded 5 rubles. Such phrases need to be eliminated from our core.

Perhaps your preferences will be slightly different, then you can exclude less expensive phrases or vice versa. In my case, I deleted 7 phrases.
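The final elimination step is again just a filter, this time on click price; a sketch with invented prices (the 5-ruble ceiling mirrors the threshold used above):

```python
# Keep only phrases whose estimated click price fits the budget.
# Prices here are made up for illustration.
click_price = {
    "seo blog": 2.4,
    "search engine optimization": 7.8,  # too expensive -> drop
    "blog promotion": 3.1,
    "buying links": 6.5,                # too expensive -> drop
}

MAX_PRICE = 5.0  # rubles per click, the cut-off chosen above
kept = sorted(p for p, cost in click_price.items() if cost <= MAX_PRICE)
print(kept)
```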

Helpful information!

See also material on compiling a semantic core with an emphasis on selecting the least competitive keywords.

If you have your own online store, read the article describing how a semantic core can be put to use.

Clustering of the semantic core

I’m sure you’ve heard this word before in relation to search engine optimization. Let's figure out what kind of animal this is and why it is needed when promoting a website.
The classic search engine promotion model looks like this:

  • Selection and analysis of search queries
  • Grouping requests by site pages (creating landing pages)
  • Preparation of SEO texts for landing pages based on a group of queries for these pages

Clustering is used to facilitate and improve the second stage in the list above. At its core, clustering is a software method that serves to simplify this stage when working with large semantics, but not everything is as simple as it might seem at first glance.

To better understand the theory of clustering, you should take a short excursion into the history of SEO:

Just a few years ago, when the term clustering wasn’t peeking around every corner, SEO specialists in the vast majority of cases grouped semantics by hand. But when grouping huge cores of 1,000, 10,000, or even 100,000 queries, this procedure turned into real hard labor for an ordinary person. So the method of grouping by semantic similarity came into wide use (and many people still use this approach today): queries with semantic relatedness are combined into one group. For example, the queries “buy a washing machine” and “buy a washing machine up to 10,000” would be combined into one group. And everything would be fine, but this method contains a number of critical problems, and to understand them we need to introduce a new term into our narrative, namely “query intent”.

The easiest way to describe this term is as a user need, his desire. Intent is nothing more than the desire of the user entering a search query.
The basis of grouping semantics is to collect into one group queries that have the same intent, or the closest possible intents, and here two interesting features emerge at once, namely:

  • The same intent can have several queries that do not have any semantic similarity, for example, “car maintenance” and “sign up for maintenance”
  • Queries that have absolute semantic similarity can contain radically different intentions, for example, the textbook situation - “mobile phone” and “mobile phones”. In one case, the user wants to buy a phone, and in the other, watch a movie

So, grouping semantics by semantic similarity does not take query intent into account, and groups compiled this way will not let you write a text that reaches the TOP. During manual grouping, SEO assistants used to eliminate this problem by analyzing the search results by hand.

The essence of clustering is to compare the generated search engine results in search of patterns. From this definition you should immediately note that clustering itself is not the ultimate truth: the generated results may not fully reveal the intent (the Yandex database may simply not yet contain a site that correctly combined the queries into a group).

The mechanics of clustering are simple and look like this:

  • The system one by one enters all queries submitted to it into the search results and remembers the results from the TOP
  • After entering queries one by one and saving the results, the system looks for intersections in the results. If the same site with the same document (site page) is in the TOP for several requests at once, then these requests can theoretically be combined into one group
  • A parameter such as grouping strength becomes relevant, which tells the system exactly how many intersections there should be so that requests can be added to one group. For example, a grouping strength of 2 means that the results for 2 different queries must contain at least two intersections. To put it even more simply, at least two pages of two different sites must be simultaneously in the TOP for one and another request. Example below.
  • When grouping large semantics, the logic of connections between queries becomes relevant, on the basis of which 3 basic types of clustering are distinguished: soft, middle and hard. We will talk more about types of clustering in the next entries in this diary.
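The mechanics described above fit in a few lines of code. In this sketch the SERP snapshots are hard-coded stand-ins (a real tool would fetch them via a search API); two queries are merged when their top results share at least `GROUPING_STRENGTH` URLs:

```python
from itertools import combinations

# Fake top-result snapshots (shortened to 4 URLs each for brevity).
serps = {
    "car maintenance":         {"a.com/to", "b.com/service", "c.com", "d.com"},
    "sign up for maintenance": {"a.com/to", "b.com/service", "e.com", "f.com"},
    "mobile phone":            {"shop1.com", "shop2.com", "g.com", "h.com"},
}

GROUPING_STRENGTH = 2  # minimum shared URLs to merge two queries

# Compare every pair of queries and keep those whose SERPs intersect enough.
pairs = [
    (q1, q2)
    for q1, q2 in combinations(serps, 2)
    if len(serps[q1] & serps[q2]) >= GROUPING_STRENGTH
]
print(pairs)
```

Here the two maintenance queries land in one group despite having little wording in common, which is exactly the advantage of SERP-based clustering over grouping by semantic similarity.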

The semantic core of a site consists of keywords (queries) that users enter on the Internet to find the services, products, and other information that the site offers. For webmasters, it is an action plan for promoting the resource. Ideally, the semantic core of the site is created once, before optimization and promotion begin.


The semantic core of a website is usually compiled in several stages:

  1. All sorts of words (phrases) that are appropriate to the topic of the site are selected. At first, you can limit yourself to 100–200 search queries. In order to know which queries are suitable for you, answer the question “What do I want to dedicate my site to?”
  2. Expansion of the semantic core through associative queries
  3. Inappropriate words should be eliminated. Here you filter out those phrases that you will not use to promote your site. There are usually more than half of such words.
  4. Highly competitive queries for which there is no point in promoting the site are eliminated. Typically, three words out of five or more are removed.
  5. And lastly, this is the correct distribution of the list of search queries on the resource pages. It is recommended to leave highly competitive queries on the main page of the resource; less competitive ones should be grouped according to their meaning and placed on other pages. To do this, you need to create a document in Excel and break down the keywords into pages.
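The distribution rule in step 5 can be expressed as a simple threshold function. The frequency cut-offs below are illustrative assumptions, not fixed rules:

```python
def landing_page(frequency: int) -> str:
    """Route a query to a page type by its frequency band."""
    if frequency >= 5000:   # highly competitive -> main page
        return "main page"
    if frequency >= 500:    # moderately competitive -> section page
        return "section page"
    return "article page"   # long tail -> individual articles

queries = {
    "plastic windows": 90000,
    "plastic windows price": 1200,
    "how to adjust a plastic window hinge": 140,
}
plan = {q: landing_page(f) for q, f in queries.items()}
print(plan)
```

In practice you would refine this with competitiveness data rather than frequency alone, but the routing logic stays the same.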

Selection of search queries and checking frequency

The first thing you need to do is collect as many different queries as possible on your topic that interest users on the Internet. There are two methods for this:

  • Free ones, which include: Yandex Wordstat, SlovoEB, the good old-fashioned way, Google suggestions (plus the External Keyword Tool), analysis of competitors’ semantics, and search suggestions.
  • Paid ones that include Key Collector, Semrush, Pastukhov databases and some other services.

These tools suit different purposes (for example, Semrush is best used for Western markets). Of course, all of this can be entrusted to optimizers, but then there is a risk of receiving an incomplete semantic core.

Many people use Pastukhov’s database to collect key phrases, but with Key Collector it is much more convenient to collect queries from Yandex and Google statistics services.

At the initial stage, it is better to collect queries in Excel; it looks like this:


If Google is more important for your resource, then focus on it, but also take into account and analyze keywords from Yandex. It is also very important to collect a long tail of low-frequency queries; they will get you traffic much faster.

Another option you can use is to find out key phrases (words) from your competitors and use them. At this stage, you simply collect as many key phrases (words) that are relevant to the topic of your resource as possible, and then move on to the next stage - filtering.

Analysis of requests, removal of dummies

This stage is already simpler; here you need to filter out dummy words and those that are not related to the theme of the site. For example, you have lunch delivery in Kyiv, but there are other cities on the list.

How do you identify dummy queries? Go to Yandex Wordstat and enter the keyword:


You see 881 impressions per month, but to get a more precise figure, check the exact frequency (in Wordstat, wrap the phrase in quotation marks and put an exclamation mark before each word):


Now a completely different picture emerges. This may not be the best example, but the main thing is that you get the idea. There are a large number of key phrases for which sufficient traffic is visible, although in reality there is nothing there. That's why you need to weed out such phrases.
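One common way to weed out such dummies, assuming you have collected both the broad figure and the exact-match figure for each phrase, is to drop phrases whose exact frequency is a tiny share of the broad one. A sketch with illustrative numbers (not real Wordstat data):

```python
# Sketch: weed out "dummy" phrases whose exact-match frequency is a tiny
# fraction of the broad-match figure Wordstat shows.
phrases = {
    # phrase: (broad impressions, exact-match impressions)
    "lunch delivery": (881, 412),
    "cheap tasty lunch fast": (700, 3),
}

MIN_RATIO = 0.05   # keep phrases where at least 5% of impressions are exact

kept = [p for p, (broad, exact) in phrases.items()
        if broad and exact / broad >= MIN_RATIO]

print(kept)  # ['lunch delivery']
```

The 5% threshold here is an assumption for illustration; the right cutoff depends on the niche.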

For example, if a person entered another phrase in the search bar before (or after) typing the query “lunch delivery” (within one search session), then Yandex assumes that these search phrases are somehow interconnected. If such a relationship is observed among several people, these associative queries are shown in the right column of Wordstat.


Such search queries are sorted in the Wordstat window in descending order of how often they were entered together with the main query over the past month (the figure shown is their frequency of use in Yandex search). Use this information to expand the semantic core of your resource.

Distribution of requests across pages

After this, you need to distribute the keywords (phrases) you collected on the pages of your site. Distribution is much easier when you don’t yet have the pages themselves.

Focus primarily on keywords in search queries and their frequency. To deal with competition, you should do this: dedicate the main page to one or two highly competitive queries.

For moderately competitive or low competitive queries, optimize section and article pages accordingly.

If there is semantic similarity in the search queries, simply collect the same phrases and define them in one group. When creating keywords to promote a resource, always use not only standard tools, but also a creative approach.
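A very naive version of such grouping can be sketched by keying phrases on their set of significant words. This treats word order as irrelevant; true synonyms (such as "buy" vs "order") would still need manual merging:

```python
# Sketch: group phrases that share the same set of significant words,
# so "buy profiled timber" and "profiled timber buy" land in one group.
from collections import defaultdict

STOP = {"a", "the", "to", "in"}   # illustrative stop-word list
phrases = ["buy profiled timber", "profiled timber buy",
           "order profiled timber", "timber price"]

groups = defaultdict(list)
for p in phrases:
    key = frozenset(w for w in p.lower().split() if w not in STOP)
    groups[key].append(p)

for members in groups.values():
    print(members)
```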

By combining non-standard and classical methods, you can simply and quickly create the semantic core of a site, choose the most optimal promotion strategy and achieve success much faster!

03.08.2018 Reading time: 5 minutes

What is the semantic core?

The semantic core is a set of search phrases and words used to promote the site. These search words and phrases help robots determine the topic of a page or an entire service, that is, find out what the company does.

In the Russian language, semantics is a branch of the science of language that studies the semantic content of lexical units of a language. In relation to search engine optimization, this means that the semantic core is the semantic content of the resource. It helps to decide what information to convey to users and in what manner. Therefore, semantics is the foundation, the basis of all SEO.

Why do you need a semantic core of a website and how to use it?

  • The correct semantic core is necessary to accurately calculate the cost of promotion.
  • Semantics is a vector for building internal SEO optimization: the most relevant queries are selected for each service or product so that users and search robots can find them better.
  • Based on it, the site structure and texts for thematic pages are created.
  • Keys from semantics are used to write snippets (short descriptions of the page).

Here is an example of a semantic core compiled for a construction company's website:

The optimizer collects semantics, parses it into logical blocks, finds out the number of impressions and, based on the cost of queries in the top Yandex and Google, calculates the total cost of promotion.

Of course, when selecting a semantic core, the specifics of the company’s work are taken into account: for example, if the company did not design and build houses from laminated veneer lumber, we would delete the corresponding queries and not use them in the future. Therefore, an obligatory stage of working with semantics is its coordination with the customer: no one knows the specifics of the company’s work better than him.

Types of Keywords

There are several parameters by which key queries are classified.

  1. By frequency:
    • high-frequency – words and phrases with 1000 or more impressions per month;
    • mid-frequency – from 100 to 1000 impressions per month;
    • low-frequency – up to 100 impressions per month.

  Collecting keyword frequency helps you find out what users search for most often. But a high-frequency query is not necessarily a highly competitive one, and building semantics from queries with high frequency and low competition is one of the main goals of working with the semantic core.

  2. By type:
    • geo-dependent and non-geo-dependent – queries tied to a region and queries that are not;
    • informational – the user expects some information from them. Keys of this type are usually used in articles – for example, reviews or useful tips;
    • branded – contain the name of the promoted brand;
    • transactional – implying an action from the user (buy, download, order), and so on.

  There are also queries that are hard to assign to any of these types: for example, the key “profiled beam”. Typing such a query into a search engine, the user could mean anything: buying timber, its properties, comparison with other materials, etc.

    From our company's experience, we can say that it is very difficult to promote a website with such queries: as a rule, they are high-frequency and highly competitive, which makes them not only hard to optimize for but also expensive for the client.
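The frequency buckets described above translate directly into a tiny classifier (thresholds taken from the text):

```python
# Sketch of the frequency classification: HF from 1000 impressions/month,
# MF from 100 to 1000, LF under 100.
def frequency_bucket(impressions_per_month: int) -> str:
    if impressions_per_month >= 1000:
        return "high-frequency"
    if impressions_per_month >= 100:
        return "mid-frequency"
    return "low-frequency"

print(frequency_bucket(2500))  # high-frequency
print(frequency_bucket(450))   # mid-frequency
print(frequency_bucket(40))    # low-frequency
```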

How to collect a semantic core for a website?

  • By analyzing competitor sites (in SEMrush, SerpStat you can see the semantic core of competitors):

The process of compiling a semantic core

The collected queries are not yet a semantic core; here we still need to separate the wheat from the chaff so that all queries are relevant to the client’s services.

To create a semantic core, queries need to be clustered (divided into blocks according to the logic of service provision). This can be done using programs (for example, KeyAssort or TopSite) - especially if the semantics are voluminous. Or manually evaluate and iterate through the entire list, removing inappropriate queries.

Then send the core to the client so they can check it for errors.

A ready-made semantic core is a yellow brick road to the content plan, blog articles, texts for product cards, company news, and so on. It is a table of audience needs that you can satisfy using your website.

  • Distribute the keys across pages.
  • Use keywords in the <title> and <description> meta tags and in <h> headings (especially the first-level H1 heading).
  • Insert keys into page texts. This is one of the white-hat optimization methods, but it is important not to overdo it: overspam can put the site under search engine filters.
  • Save the remaining search queries – those that do not fit into any section – under the title “What else to write about”. You can use them for informational articles in the future.
  • And remember: focus on users' requests and interests, so trying to cram all the keys into one text is pointless.

Collecting a semantic core for a website: main mistakes

  • Refusing highly competitive keys. Yes, you may never reach the top for the query “buy profiled timber” (and that will not stop you from successfully selling your services), but you still need to include it in your texts.
  • Refusing low-frequency queries. This is wrong for the same reason as rejecting highly competitive ones.
  • Creating pages for queries and for the sake of queries. “Buy profiled timber” and “order profiled timber” are essentially the same thing; there is no point in splitting them into separate pages.
  • Absolute and unconditional trust in software. You cannot do without SEO programs, but manual analysis and data verification are still necessary: no program can yet assess the industry and the level of competition, or distribute keys, without errors.
  • “Keys are our everything.” No, our everything is a convenient, understandable website and useful content. Any text needs keys, but if the text is bad, keys will not save it.