• Creating the semantic core of a web resource: what you need to know at the very beginning. Brainstorming when creating a semantic core - flexing our brains

The semantic core is a scary name that SEO people came up with for a rather simple thing: we just need to pick the key queries for which we will promote our site.

And in this article I will show you how to compose a semantic core correctly, so that your site quickly reaches the TOP instead of stagnating for months. There are a few "secrets" here too.

But before we move on to compiling the semantic core, let's figure out what it is and what we should end up with.

    What is the semantic core in simple words

Oddly enough, the semantic core is an ordinary Excel file that lists the key queries for which you (or your copywriter) will write articles for the site.

    For example, this is what my semantic core looks like:

I have marked in green the key queries for which I have already written articles, in yellow the ones I plan to write about in the near future. Colorless cells mean those queries will come a little later.

For each key query I have determined the frequency and competitiveness, and come up with a "catchy" title. You should end up with roughly the same kind of file. Right now my core consists of 150 keywords. That means I am supplied with "material" for at least 5 months in advance (even if I write one article a day).

Below we will talk about what to expect if you decide to order the collection of the semantic core from specialists. Here I will say briefly: they will give you the same kind of list, only for thousands of "keys". However, in a semantic core it is not quantity that matters but quality, and that is what we will focus on.

    Why do we need a semantic core at all?

But really, why do we need this torment? You can just write quality articles and attract an audience, right? You can write, but you won't be able to attract anyone.

The main mistake of 90% of bloggers is simply writing high-quality articles. I'm not kidding, they have genuinely interesting and useful material. But search engines don't know about it. They are not psychics, just robots. Accordingly, they do not rank such articles in the TOP.

There is another subtle point with the title. For example, you have a very high-quality article on the topic "How to properly conduct business on the face-book", where you describe everything about Facebook in great detail and professionally, including how to promote communities there. Your article is the highest-quality, most useful and interesting one on the Internet on this topic. No competitor even comes close. But it still won't help you.

    Why high-quality articles fall out of the TOP

Imagine that your site was visited not by a robot but by a live inspector (assessor) from Yandex. He realized that you have the coolest article and manually put you in first place in the search results for the query "Promoting a community on Facebook."

Do you know what will happen next? You will fly out of there very soon anyway, because no one will click on your article, even in first place. People enter the query "Promoting a community on Facebook," and your headline is "How to properly conduct business on the face-book." Original, fresh, funny, but... not on target. People want to see exactly what they were looking for, not your creativity.

Accordingly, your article will vacate its place in the TOP of the search results. And the living assessor, an ardent admirer of your work, can beg his superiors as much as he wants to leave you at least in the TOP 10; it won't help. All the first places will be taken by empty articles, like sunflower-seed husks, that yesterday's schoolchildren copied from each other.

But those articles will have the correct "relevant" title: "Promoting a community on Facebook from scratch" (step by step, in 5 steps, from A to Z, free, etc.). Offensive? Of course. Well then, let's fight the injustice and create a competent semantic core so that your articles take the well-deserved first places.

Another reason to start compiling a semantic core right now

There is one more thing that for some reason people don't think much about. You need to write articles often - at least every week, and preferably 2-3 times a week - to gain more traffic, faster.

Everyone knows this, but almost no one does it. And all because they have "creative stagnation", "they just can't force themselves", "they're just lazy". In fact, the whole problem lies in the absence of a specific semantic core.

Step #1 — Selecting basic keys

I entered one of my basic keys, "smm", into the search field, and Yandex immediately gave me a dozen hints about what else might interest people who are interested in "smm". All I have to do is copy these keys into a notepad. Then I will check each of them in the same way and collect hints on them as well.

After the first stage of collecting the semantic core, you should end up with a text document containing 10-30 broad basic keys, with which we will work further.
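If you want to speed this stage up, hint collection is easy to script. Below is a minimal sketch in Python; the suggest endpoint, its parameters and the response shape are assumptions for illustration, not a documented API, so verify them before relying on this.

```python
import json
import time
import urllib.parse
import urllib.request

# Assumed endpoint and parameters -- check before use.
SUGGEST_URL = "https://suggest.yandex.ru/suggest-ya.cgi"

def get_hints(query: str) -> list[str]:
    """Return search hints for one base key (assumed response shape)."""
    params = urllib.parse.urlencode({"part": query, "v": 4})
    with urllib.request.urlopen(f"{SUGGEST_URL}?{params}") as resp:
        data = json.loads(resp.read().decode("utf-8"))
    # Assumed shape: ["query", ["hint 1", "hint 2", ...]]
    return [h for h in data[1] if isinstance(h, str)]

base_keys = ["smm", "webinar"]
collected = set(base_keys)
for key in base_keys:
    collected.update(get_hints(key))
    time.sleep(1)  # be polite between requests

with open("base_keys.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(sorted(collected)))
```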

    Step #2 — Parsing basic keys in SlovoEB

Of course, if you write an article for the query "webinar" or "smm", a miracle will not happen. You will never reach the TOP for such a broad query. We need to break the basic key down into many small queries on the topic, and we will do this with a special program.

I use KeyCollector, but it is paid. You can use its free analogue, the SlovoEB program, which you can download from the official website.

The most difficult thing about working with this program is setting it up correctly. I show how to set up and use SlovoEB in a separate article, but there I focus on selecting keys for Yandex Direct.

    And here let’s look step by step at the features of using this program for creating a semantic core for SEO.

First, we create a new project and name it after the broad key we want to parse.

I usually give the project the same name as my base key to avoid confusion later. And I will warn you against one more mistake: don't try to parse all the base keys at once, or it will be very difficult to sift the "empty" key queries from the golden grains. Parse one key at a time.

After creating the project, we carry out the basic operation: we parse the key through Yandex Wordstat. To do this, click the "Wordstat" button in the program interface, enter your base key, and click "Start collection".

    For example, let's parse the base key for my blog “contextual advertising”.

    After this, the process will start, and after some time the program will give us the result - up to 2000 key queries that contain “contextual advertising”.

Also, next to each query there will be a "dirty" frequency: how many times this key (plus its word forms and tails) was searched per month on Yandex. But I do not advise drawing any conclusions from these figures.

    Step #3 - Collecting the exact frequency for the keys

The dirty frequency by itself tells us nothing. If you rely on it, don't be surprised when a key with 1,000 monthly requests doesn't bring a single visitor per month.

We need to determine the exact frequency. To do this, first select all the found keys with checkboxes, then click the "Yandex Direct" button and start the process again. Now SlovoEB will look up the exact monthly frequency for each key.

Now we have an objective picture of how many times each query was entered by Internet users over the past month. Now I propose grouping all key queries by frequency to make them easier to work with.

To do this, click the filter icon in the exact-frequency column and set the condition "less than or equal to 10".

Now the program will show only those queries whose frequency is less than or equal to 10. You can delete these queries or copy them to another group of key queries for future use. Less than 10 is very little; writing articles for such queries is a waste of time.
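If you export the collected keys to a CSV file, this filtering step is easy to reproduce outside the program. A minimal sketch, assuming an export with "phrase" and "exact_freq" columns (the column names are assumptions; rename them to match your actual export):

```python
import csv

MIN_FREQ = 10  # keys at or below this go to the reserve list

kept, reserve = [], []
with open("keys.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        freq = int(row["exact_freq"] or 0)
        (kept if freq > MIN_FREQ else reserve).append(row)

for name, rows in (("keys_kept.csv", kept), ("keys_reserve.csv", reserve)):
    with open(name, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["phrase", "exact_freq"],
                                extrasaction="ignore")
        writer.writeheader()
        writer.writerows(rows)
```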

    Now we need to select those key queries that will bring us more or less good traffic. And to do this, we need to find out one more parameter - the level of competitiveness of the request.

    Step #4 — Checking the competitiveness of requests

All "keys" in this world are divided into 3 types: high-frequency (HF), mid-frequency (MF), and low-frequency (LF). They can also be highly competitive (HC), moderately competitive (MC), and low-competitive (LC).

As a rule, HF queries are also HC. That is, if a query is often searched on the Internet, there are a lot of sites that want to rank for it. But this is not always the case; there are happy exceptions.

    The art of compiling a semantic core lies precisely in finding queries that have a high frequency and a low level of competition. It is very difficult to manually determine the level of competition.

You can look at indicators such as the number of main pages in the TOP 10, the length and quality of the texts, and the trust and TIC (thematic citation index) of the sites in the TOP of the search results for the query. All of this will give you some idea of how tough the competition for rankings is for this particular query.

But I recommend using the Mutagen service. It takes into account all the parameters I mentioned above, plus a dozen more that neither you nor I have probably even heard of. After the analysis, the service gives an exact value for the level of competition of the query.

Here I checked the query "setting up contextual advertising in google adwords". Mutagen showed that this key has a competitiveness of "more than 25", which is the maximum value it shows. And this query gets only 11 searches per month, so it definitely doesn't suit us.

We can copy all the keys we found in SlovoEB and do a mass check in Mutagen. After that, all we have to do is look through the list and take the queries that have a high frequency and a low level of competition.

Mutagen is a paid service, but you can do 10 checks per day for free. Besides, the cost of checks is very low. In all the time I have worked with it, I have not yet spent even 300 rubles.

By the way, about the level of competition: if you have a young site, it is better to choose queries with a competition level of 3-5. And if you have been promoting for more than a year, you can take 10-15.
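Once you have a table of keys with their exact frequencies and competition scores, the final selection can be scripted. A sketch under the assumptions that the CSV columns are named "phrase", "exact_freq" and "competition", and that "competition" holds a Mutagen-style score:

```python
import csv

SITE_IS_YOUNG = True            # per the advice above
MAX_COMPETITION = 5 if SITE_IS_YOUNG else 15
MIN_FREQ = 10

with open("keys_checked.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

good = [r for r in rows
        if int(r["exact_freq"]) >= MIN_FREQ
        and int(r["competition"]) <= MAX_COMPETITION]

# Lowest competition first, then highest frequency.
good.sort(key=lambda r: (int(r["competition"]), -int(r["exact_freq"])))
for r in good:
    print(r["phrase"], r["exact_freq"], r["competition"])
```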

And regarding the frequency of queries: we now need to take the final step, which will allow you to attract a lot of traffic even from low-frequency queries.

    Step #5 — Collecting “tails” for the selected keys

As has been proven and tested many times, your site will receive the bulk of its traffic not from the main keywords but from the so-called "tails". These are the odd key queries people type into the search bar with a frequency of 1-2 per month, but there are a great many of such queries.

    To see the “tail”, simply go to Yandex and enter the key query of your choice into the search bar. Here's roughly what you'll see.

Now you just need to write down these additional words in a separate document and use them in your article. Moreover, you don't need to always place them next to the main key; otherwise, search engines will see "over-optimization" and your articles will fall in the rankings.

Just use them in different places in your article, and you will receive additional traffic from them as well. I would also recommend trying to use as many word forms and synonyms of your main key query as possible.

For example, we have the query "Setting up contextual advertising". Here is how it can be reformulated:

• Setting up = set up, make, create, launch, start, enable, place…
• Contextual advertising = context, Direct, teaser ads, YAN, AdWords, KMS (display network)…

    You never know exactly how people will search for information. Add all these additional words to your semantic core and use them when writing texts.
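Enumerating such reformulations by hand is error-prone, so here is a tiny sketch that builds all combinations from synonym lists; the word lists themselves are illustrative:

```python
from itertools import product

# Illustrative synonym lists for the example above.
actions = ["setting up", "launching", "creating", "enabling"]
subjects = ["contextual advertising", "yandex direct", "google adwords"]

variants = [f"{a} {s}" for a, s in product(actions, subjects)]
print(len(variants), "variants")   # 12
for v in variants:
    print(v)
```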

    So, we collect a list of 100 - 150 key queries. If you are creating a semantic core for the first time, it may take you several weeks.

Or maybe spare your eyes the strain? Perhaps you can delegate compiling the semantic core to specialists who will do it better and faster? Yes, such specialists exist, but you don't always need their services.

Is it worth ordering a semantic core from specialists?

By and large, semantic core compilers will only do steps 1-3 from our scheme. Sometimes, for a hefty additional fee, they will also do steps 4-5 (collecting tails and checking the competitiveness of queries).

    After that, they will give you several thousand key queries that you will need to work with further.

And the question here is whether you are going to write the articles yourself or hire copywriters. If you want to focus on quality rather than quantity, you need to write them yourself. But then just getting a list of keys won't be enough for you. You will need to choose topics you understand well enough to write a quality article.

And here the question arises: why do we actually need semantic core specialists then? Agree, parsing a base key and collecting the exact frequencies (steps #1-3) is not difficult at all. It will literally take you half an hour.

The most difficult thing is choosing HF queries that have low competition, and, on top of that, HF-LC queries on which you can write a good article. This is exactly what will take 99% of your time working on the semantic core, and no specialist will do it for you. So is it worth spending money on such services?

When are the services of semantic core specialists useful?

It's another matter if you plan from the start to hire copywriters. Then you don't have to understand the subject of the query. Your copywriters won't understand it either: they will simply take several articles on the topic and compile "their own" text from them.

Such articles will be empty, miserable, almost useless. But there will be many of them. On your own you can write at most 2-3 quality articles per week, while an army of copywriters will supply 2-3 shoddy texts a day. At the same time, those texts will be optimized for the queries, which means they will attract some traffic.

In that case, yes, calmly hire semantic core specialists. Have them also draw up technical specifications for the copywriters while they're at it. But you understand, this will cost money too.

Summary

    Let's go over the main ideas in the article again to reinforce the information.

    • The semantic core is simply a list of key queries for which you will write articles on the site for promotion.
    • It is necessary to optimize texts for precise key queries, otherwise even your highest-quality articles will never reach the TOP.
• The semantic core is like a content plan for social networks: it helps you avoid a "creative crisis" and always know exactly what you will write about tomorrow, the day after tomorrow and in a month.
• To compile a semantic core, it is convenient to use the free SlovoEB program; you just need to set it up correctly.
• Here are the five steps of compiling the semantic core: 1 - selection of basic keys; 2 - parsing the basic keys; 3 - collecting the exact frequency for the queries; 4 - checking the competitiveness of the keys; 5 - collecting the "tails".
• If you want to write articles yourself, it is better to create the semantic core yourself, for yourself. Semantic core specialists will not be able to help you here.
• If you want to work on quantity and use copywriters to write articles, it is quite possible to delegate compiling the semantic core as well. As long as there is enough money for everything.

I hope this instruction was useful to you. Save it to your favorites so you don't lose it, and share it with your friends. Don't forget to download my book: there I show the fastest way from zero to the first million on the Internet (distilled from 10 years of personal experience =)

    See you soon!

    Yours Dmitry Novoselov

What is the semantic core of a site? The semantic core of a site (hereinafter simply "the core") is the collection of keywords and phrases for which the resource is promoted in search engines and which indicate that the site belongs to a certain topic.

For successful promotion in search engines, keywords must be correctly grouped and distributed across the pages of the site, and contained in a certain form in the meta tags (title, description, keywords) as well as in the H1-H6 headings. At the same time, overspam must not be allowed, so as not to "fly off" into a search-engine filter.

    In this article we will try to look at the issue not only from a technical point of view, but also to look at the problem through the eyes of business owners and marketers.

    How to create a semantic core of a website

    So, let's look at each point in more detail with various examples.

    At the first step, it is important to determine which products and services present on the site will be promoted in the search results of Yandex and Google.

Example No. 1. Let's say a site offers two kinds of services: computer repair at home and training in Word/Excel at home. In this case it was decided that the training was no longer in demand, so there was no point in promoting it, and therefore no point in collecting semantics for it. Another important point: you need to collect not only queries containing "computer repair at home" but also "laptop repair", "PC repair" and others.

Example No. 2. A company is engaged in low-rise construction, but it builds only wooden houses. Accordingly, queries and semantics for the directions "construction of houses from aerated concrete" or "construction of brick houses" need not be collected.

    Collection of semantics

We will look at the two main sources of keywords: Yandex and Google. We'll show how to collect semantics for free and briefly review paid services that allow you to speed up and automate this process.

In Yandex, key phrases are collected from the Yandex.Wordstat service; in Google, through query statistics in Google AdWords. If available, you can use data from Yandex Webmaster and Yandex.Metrica, Google Webmaster and Google Analytics as additional sources of semantics.

    Collecting keywords from Yandex.Wordstat

    Collecting queries from Wordstat can be considered free. To view the data of this service, you only need a Yandex account. So let's go to wordstat.yandex.ru and enter the keyword. Let's consider an example of collecting semantics for a car rental company website.

    What do we see in this screenshot?

1. Left column. Here is the basic query and its various variations with "tails". Opposite each query is a number showing how many times, across all its variations, the query was used by different users.
2. Right column. Queries similar to the main one, with their overall frequency figures. Here we see that a person who wants to rent a car can, in addition to the query "car rental", use synonymous formulations such as "rent a car" or "car hire" and others. This is very important data to watch so as not to miss a single query.
3. Regionality and history. By choosing one of the available options, you can check the distribution of queries by region, the number of queries in a particular region or city, and the trend of changes over time or with the change of season.
4. Devices from which the query was made. By switching tabs, you can find out which devices people search from most often.

Check the different variants of key phrases and record the resulting data in Excel tables or Google Sheets. For convenience, install the Yandex Wordstat Helper plugin. Once it is installed, plus icons appear next to the search phrases; clicking them copies the words along with their frequency figures, so you won't need to select and paste anything manually.

    Collecting keywords from Google AdWords

Unfortunately, Google does not have an open source of search queries with frequency indicators, so here you have to take a roundabout route. For this we need a working Google AdWords account.

We register an account in Google AdWords and top up the balance with the minimum possible amount, 300 rubles (on an account with no active budget, only approximate data is displayed). After that, go to "Tools" - "Keyword Planner".

A new page will open; in the "Search for new keywords by phrase, site or category" tab, enter the keyword.

    Scroll down, click “Get options” and see something like this.

1. Top query and the average number of searches per month. If the account is unpaid, you will see approximate data, that is, the average number of searches. When there are funds in the account, exact data will be shown, as well as the dynamics of changes in the frequency of the entered keyword.
    2. Keywords by relevance. This is the same as similar queries in Yandex Wordstat.
    3. Downloading data. This tool is convenient because the data obtained in it can be downloaded.

    We looked at working with two main sources of statistics on search queries. Now let's move on to automating this process, because collecting semantics manually takes too much time.

    Programs and services for collecting keywords

    Key Collector

The program is installed on your computer. Working accounts, from which statistics will be collected, are connected to it. Then a new project and a folder for keywords are created.

    Select “Batch collection of words from the left column of Yandex.Wordstat”, enter the queries for which we collect data.

The screenshot shows only an example; in fact, for a more complete semantic core you would additionally collect all query variants with car makes and classes, for example "bmw for rent", "toyota rental with an option to buy", "rent an SUV" and so on.

SlovoEB

A free analogue of the previous program. This can be considered both a plus (no need to pay) and a minus (the program's functionality is significantly reduced).

    To collect keywords, the steps are the same.

    Rush-analytics.ru

An online service. Its main advantage is that you don't need to download or install anything: register and use it. The service is paid, but on registration you get 200 coins on your account, which is enough to collect a small semantic core (up to 5,000 queries) and parse frequencies.

    The downside is that semantics are collected only from Wordstat.

    Checking the frequency of keywords and queries

Checking the query in a more precise form, we again notice a decrease in the number of searches. Let's go further and try another word form of the same query.

We note that in the singular this query is searched by a much smaller number of users, which means the original query is a higher priority for us.

Such manipulations must be carried out with every word and phrase. Queries whose final frequency is zero when checked with quotation marks and the exclamation mark (in Wordstat, quotation marks fix the set of words and "!" fixes the exact word form) are eliminated: "0" means no one actually enters such queries, and they exist only as parts of other queries. The point of compiling a semantic core is to select the queries people actually use to search. All queries are then placed in an Excel table, grouped by meaning and distributed across the pages of the site.
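Both halves of this step are easy to script: wrapping phrases in the exact-match operators before checking, and dropping the zero-frequency "dummies" afterwards. A minimal sketch with illustrative data:

```python
def exact_form(phrase: str) -> str:
    """Wrap a phrase in Wordstat exact-match operators: "!word !word"."""
    return '"' + " ".join("!" + w for w in phrase.split()) + '"'

print(exact_form("car rental moscow"))   # "!car !rental !moscow" (quoted)

# Illustrative results of an exact-frequency check.
checked = {"car rental": 3214, "rental of a car cheap": 0}
core = {k: v for k, v in checked.items() if v > 0}
print(core)   # the zero-frequency "dummy" is gone
```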

Doing this manually is simply not feasible, so there are many services on the Internet, paid and free, that do it automatically. Here are a few:

    • megaindex.com;
    • rush-analytics.ru;
    • tools.pixelplus.ru;
    • key-collector.ru.

    Removing non-target requests

After sifting the keywords by frequency, you should remove the unnecessary ones. Which search queries can be removed from the list? (A small filter sketch follows the list.)

• queries with the names of competitors' companies (these can sometimes be left in);
    • requests for goods or services that you do not sell;
    • requests that indicate a district or region in which you do not work.
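As mentioned above, here is a small stop-word filter sketch; the stop-word lists are purely illustrative and must reflect your own business:

```python
# Illustrative stop-word lists; fill them in for your own business.
STOP_WORDS = {
    "free", "download",        # services you do not sell
    "competitorbrand",         # competitors' names
    "spb", "kazan",            # regions where you do not work
}

def is_target(phrase: str) -> bool:
    return not (set(phrase.lower().split()) & STOP_WORDS)

queries = ["car rental moscow", "car rental free", "car rental kazan"]
print([q for q in queries if is_target(q)])   # ['car rental moscow']
```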

    Clustering (grouping) of requests for site pages

The essence of this stage is to combine queries of similar meaning into clusters and then determine which pages they will be promoted on. How can you tell which queries to promote on one page and which on another?

    1. By request type.

We already know that all queries are divided into several types, depending on the purpose of the search:

    • commercial (buy, sell, order) - promoted on landing pages, pages of product categories, product cards, pages with services, price lists;
• informational (where, how, why, what for) - articles, forum topics, Q&A sections;
    • navigation (telephone, address, brand name) - page with contacts.

If you are in doubt about what type a query is, enter it into the search bar and analyze the results. For a commercial query there will be more pages offering services; for an informational one, more articles.

There is also geo-dependence. Most commercial queries are geo-dependent, since people tend to trust companies located in their own city more. A small intent-classifier sketch is given below.
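Here is the promised sketch: a rough first-pass classifier that sorts queries by type using marker words. The marker lists are illustrative, and ambiguous queries still need a manual SERP check:

```python
# Illustrative marker lists per the query types above.
COMMERCIAL = {"buy", "sell", "order", "price", "rent"}
INFORMATIONAL = {"how", "where", "why", "what", "choose"}

def query_type(phrase: str) -> str:
    words = set(phrase.lower().split())
    if words & COMMERCIAL:
        return "commercial"      # landing/category/product pages
    if words & INFORMATIONAL:
        return "informational"   # articles, forums, Q&A
    return "unclear"             # check the live SERP by hand

for q in ["buy iphone x", "how to choose a good smartphone", "iphone x"]:
    print(q, "->", query_type(q))
```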

    2. Request logic.

• "buy iphone x" and "iphone x price" need to be promoted on one page, since in both cases the same product is being sought, along with more detailed information about it;
• "buy iphone" and "buy iphone x" need to be promoted on different pages, since the first is a general query (suitable for the product category where the iPhones live), while in the second the user is looking for a specific product, so that query should be promoted on the product card;
    • "how to choose good smartphone“—it is more logical to promote this request to a blog article with the appropriate title.

3. Search results analysis. Look at the search results for the queries. If you check which pages on different sites rank for the queries "construction of houses made of timber" and "construction of houses made of bricks", in 99% of cases they are different pages.

    4. Automatic grouping using software and manual refinement.

The 1st and 2nd methods are fine for compiling the semantic core of small sites, where at most 2-3 thousand keywords are collected. For a large core (from 10,000 queries to infinity), the help of machines is needed. Here are several programs and services that can perform clustering:

    • KeyAssistant - assistant.contentmonster.ru;
    • semparser.ru;
    • just-magic.org;
    • rush-analytics.ru;
    • tools.pixelplus.ru;
    • key-collector.ru.

    After automatic clustering is completed, it is necessary to check the results of the program manually and, if errors are made, correct them.

    Example: the program can send the following requests to one cluster: “vacation in Sochi 2018 hotel” and “vacation in Sochi 2018 hotel breeze” - in the first case, the user is looking for various hotel options for accommodation, and in the second, a specific hotel.

To avoid such inaccuracies, you need to check everything manually and edit where errors are found.
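For intuition, here is a deliberately naive grouping sketch: queries whose "meaningful" words coincide land in one cluster, which correctly separates the "hotel breeze" query from the general ones. Real tools cluster by TOP-10 overlap; treat this only as a first pass before manual refinement:

```python
from collections import defaultdict

STOP = {"in", "a", "the", "of", "for", "2018"}

def signature(phrase: str) -> frozenset:
    """Set of 'meaningful' words used as the cluster key."""
    return frozenset(w for w in phrase.lower().split() if w not in STOP)

clusters = defaultdict(list)
for q in ["vacation in sochi 2018 hotel",
          "sochi hotel vacation",
          "vacation in sochi 2018 hotel breeze"]:  # a specific hotel
    clusters[signature(q)].append(q)

for sig, qs in clusters.items():
    print(sorted(sig), "->", qs)
```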

    What to do next after compiling the semantic core?

    Based on the collected semantic core, we then:

1. we create the ideal structure (hierarchy) of the site from the point of view of search engines, or, in agreement with the customer, change the structure of the old site;
2. we write technical specifications for copywriters, so the text is written with the cluster of queries that will be promoted on the given page in mind, or we update the old articles and texts on the site.

    It looks something like this.

For each formed query cluster, we create a page on the site and determine its place in the site structure. The most popular queries are promoted on pages at the top of the resource hierarchy; less popular ones sit on pages below them.

For each of these pages we have already collected the queries we will promote on them. Next, we write technical specifications for copywriters to create texts for these pages.

    Technical specifications for a copywriter

As with the site structure, we will describe this stage in general terms. So, the technical specification for a text includes (a small generator sketch follows the list):

    • number of characters without spaces;
    • page title;
    • subheadings (if any);
• a list of words (based on our core) that must appear in the text;
    • uniqueness requirement (always require 100% uniqueness);
    • desired text style;
• other requirements and wishes for the text.
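Here is the small generator sketch promised above: it turns one query cluster into a draft technical specification. All field values are placeholders:

```python
def make_ts(cluster: dict) -> str:
    """Render one query cluster as a draft TS for a copywriter."""
    lines = [
        f"Page: {cluster['url']}",
        f"Title: {cluster['title']}",
        f"Length: {cluster['chars']} characters without spaces",
        "Uniqueness: 100%",
        "Required words (from the semantic core):",
    ]
    lines += [f"  - {kw}" for kw in cluster["keywords"]]
    return "\n".join(lines)

print(make_ts({
    "url": "/car-rental/",
    "title": "Car rental in Moscow without a driver",
    "chars": 3000,
    "keywords": ["car rental", "rent a car moscow", "car rental price"],
}))
```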

Remember: don't try to promote 100500 queries on one page; limit yourself to 5-10 plus the tail, otherwise you will get penalized for over-optimization and will be out of the running for TOP places for a long time.

    Conclusion

Compiling the semantic core of a site is painstaking, hard work that deserves especially close attention, because all further promotion of the site is based on it. Follow the simple instructions given in this article and take action.

    1. Choose the direction of promotion.
    2. Collect all possible queries from Yandex and Google (use special programs and services).
    3. Check the frequency of queries and get rid of dummies (those with a frequency of 0).
    4. Remove non-target requests - services and goods that you do not sell, requests mentioning competitors.
    5. Form query clusters and distribute them across pages.
    6. Create an ideal site structure and draw up technical specifications for the content of the site.

    ". Today we’ll talk about compiling a semantic core, as well as the selection and analysis of keywords. As always, we will be glad to have any questions and will be happy to answer them.

    Website promotion should begin with compiling a semantic core, that is, selecting key phrases for which you need to get to the top. After selecting the keys, a decision is made on which additional sections and pages need to be added to the site. That is why the selection of words is carried out first, and only then work on the internal optimization of the site.

Promotion tactics vary for different keys. The most competitive phrases are promoted on the home page, and sometimes on several internal ones. At the same time, special effort goes into making those pages user-friendly and high-converting. To collect traffic for queries with low competition, you need many simple information pages; promoting such pages is much easier.

It is recommended to select as many key phrases as possible: the more keys, the higher the traffic. Incidentally, SEO companies begin work on client sites by significantly expanding the semantic core (and rightly so). It is important for them to achieve quick, tangible results in the form of traffic growth. The simplest and most correct way, often ignored by webmasters, is precisely to expand the list of promoted key phrases.

Why is it advisable to target tens and hundreds of keys rather than stopping at the 5-10 most important queries? Firstly, promotion for low-frequency queries is much easier and cheaper, and in total they usually generate far more traffic than the main commercial queries. Moreover, it is not at all certain that you will be able to squeeze out your competitors on the big commercial queries right away; most likely, a fierce struggle between optimizers is already underway for them. So, in order not to waste money and effort, you need to promote a large number of different keys at once.

    And the most popular general queries, such as “furniture”, “lawns”, “stationery” and others, are of little use for commercial purposes. The cost of promoting such requests is high, and the commercial effect is negligible. Therefore, it is much better and cheaper to promote, for example, individual TV models than the query “TVs.”

Advice: if you decide to use the services of an SEO company, ask the manager how many keywords they usually promote clients on. A narrow semantic core (up to a couple of dozen keywords) should put you on guard: it means the promotion potential is not being used in full.

    Well, now let’s move on to selecting keywords using different tools.

    Brainstorm

It so happened that at the time of writing this website promotion guide we decided to promote one of our projects, the Copilancer text exchange, so it is best to consider it as a living example. So, let's start brainstorming and determine which keywords of the semantic core can be useful in promoting this site, i.e., what exactly our target audience types into Yandex and Google. We managed to come up with quite a few keys right away, namely:

    • Text exchange
    • Copywriting Exchange
    • Copyright Exchange
    • Order text
    • Content Exchange
    • Exchange of articles

This list would be much longer if we were promoting an online store, since each product item there is a separate key. Once imagination is exhausted at the brainstorming stage, we turn to Yandex and Google for help. First we will look at how a semantic core is developed manually, and then at how this work can be automated with paid programs.

    Yandex.Direct and Google Adwords

3. When selecting keywords in Yandex and Google, use quotation marks to search by exact match rather than broad match.

4. Try to collect as many keywords as possible for the semantic core, excluding keys with the lowest traffic (below 30 in Yandex or 10 in Google).

5. Exclude from the semantic core any keys that have nothing to do with your site and for which you are not prepared to create separate information pages.



Like almost all other webmasters, I create a semantic core using the KeyCollector program - it is certainly the best program for compiling a semantic core. How to use it is a topic for a separate article, though the Internet is full of information on the subject; I recommend, for example, the manual from Dmitry Sidash (sidash.ru).

    Since the question was asked about an example of compiling a core, I will give an example.

    List of keys

Let's say our site is dedicated to British cats. I enter the phrase "British cat" into the "List of phrases" field and click the "Parse" button.

I get a long list of phrases that begins with the following (phrase and frequency are given):

British cats 75553
British cats photo 12421
British fold cat 7273
British cat nursery 5545
British breed cats 4763
British shorthair cat 3571
colors of British cats 3474
British cats price 2461
blue British cat 2302
British fold cat photo 2224
mating of British cats 1888
British cats character 1394
I will buy a British cat 1179
British cats buy 1179
long-haired British cat 1083
pregnancy of a British cat 974
British chinchilla cat 969
cats of the British breed photo 953
nursery of British cats Moscow 886
color of British cats photo 882
British cats care 855
British shorthair cat photo 840
Scottish and British cats 763
names of British cats 762
British blue cat photo 723
British black cat 699
what to feed British cats 678

    The list itself is much longer; I have only given the beginning.

    Key grouping

Based on this list, the site will have articles about types of cats (fold-eared, blue, shorthair, longhair), an article about the pregnancy of these animals, about what to feed them, about names, and so on down the list.

For each article, one main query is taken (= the topic of the article). However, the article is not limited to a single query: other relevant queries are added to it, as well as different variations and word forms of the main query, which can be found in Key Collector below the list.

    For example, with the word “fold-eared” there are the following keys:

British fold cat 7273
British fold cat photo 2224
British fold cat price 513
cat breed British fold 418
British blue fold cat 224
Scottish fold and British cats 190
British fold cats photo 169
British fold cat photo price 160
British fold cat buy 156
British fold blue cat photo 129
British fold cats character 112
British fold cat care 112
mating of British fold cats 98
British shorthair fold cat 83
color of British fold cats 79

To avoid overspam (and overspam can also arise from the combination of using too many keys in the text, in the title, in meta tags, etc.), I would not use all of them verbatim together with the main query; rather, it makes sense to use individual words from them in the article (photo, buy, character, care, etc.) so that the article ranks for a larger number of low-frequency queries.

    Thus, under the article about fold-eared cats, we will form a group of keywords that we will use in the article. Groups of keywords for other articles will be formed in the same way - this is the answer to the question of how to create the semantic core of the site.
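The grouping described above is mechanical enough to sketch in a few lines of Python; the marker words and keys are taken from the example lists:

```python
# Marker word per planned article; both lists come from the example.
markers = {
    "fold-eared cats article": "fold",
    "feeding article": "feed",
    "names article": "names",
}

keys = [
    "british fold cat", "british fold cat photo", "british fold cat price",
    "what to feed british cats", "names of british cats",
]

groups = {article: [k for k in keys if marker in k]
          for article, marker in markers.items()}
for article, ks in groups.items():
    print(article, "->", ks)
```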

    Frequency and competition

There is also an important point concerning exact frequency and competition: they must be collected in Key Collector. To do this, tick all the queries and, on the "Yandex.Wordstat frequencies" tab, click "Collect "!" frequencies". The exact frequency of each phrase will be shown (i.e., with exactly this word order and these exact word forms), which is a much more accurate indicator than the overall frequency.

To check competition in the same Key Collector, click "Get data for Yandex" (or for Google), then "Calculate KEI using available data". As a result, for each query the program will collect how many main pages are in the TOP 10 (the more there are, the harder it is to get in) and how many TOP-10 pages contain the query in their title (likewise, the more, the harder it is to reach the top).
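For intuition, here is a toy difficulty estimate built from those two signals. The formula is illustrative only; Key Collector's actual KEI is computed differently:

```python
def difficulty(main_pages_top10: int, title_matches_top10: int) -> float:
    """Toy score from the two Key Collector signals (0..10 each)."""
    return main_pages_top10 + 1.5 * title_matches_top10

# Illustrative numbers, not real measurements.
candidates = {
    "british fold cat": (2, 4),
    "british fold cat care": (0, 1),
}
for phrase, (mains, titles) in candidates.items():
    print(phrase, "difficulty:", difficulty(mains, titles))
```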

Next, we act based on our strategy. If we want to create a comprehensive site about cats, exact frequency and competition are not so important to us. If we only need to publish a few articles, we take the queries that have the highest frequency and, at the same time, the lowest competition, and write articles based on them.