Semantic core: forming the semantic core for a blog

The semantic core is a scary name that SEOs have come up with for a fairly simple thing: selecting the key queries for which we will promote our site.

And in this article, I will show you how to properly compose a semantic core so that your site quickly reaches the TOP instead of stagnating for months. There are "secrets" here, too.

And before we move on to compiling the semantic core, let's look at what it is and what we should end up with.

What is the semantic core in simple words

Oddly enough, the semantic core is an ordinary Excel file listing the key queries for which you (or your copywriter) will write articles for the site.

For example, here is what my semantic core looks like:

I have marked in green the key queries for which I have already written articles. Yellow marks those I am going to write about in the near future. And colorless cells mean those queries' turn will come a little later.

For each key query I determined the frequency and the competitiveness, and came up with a "catchy" title. You should end up with roughly the same kind of file. Right now my semantic core consists of 150 keywords, which means I am supplied with "material" for at least 5 months ahead (even if I write one article a day).
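To make the structure concrete, here is a minimal sketch of such a file built in Python. Every keyword, number, and title below is invented for illustration; only the columns (query, frequency, competition, title, status) follow the description above.

```python
import csv
import io

# A minimal sketch of a semantic-core file, written as CSV.
# All keywords, numbers, and titles are invented.
core = [
    {"keyword": "smm promotion from scratch", "frequency": 240,
     "competition": 4, "title": "SMM Promotion from Scratch in 5 Steps",
     "status": "written"},       # "green" rows: article already published
    {"keyword": "how to set up contextual advertising", "frequency": 90,
     "competition": 7, "title": "Setting Up Contextual Advertising at Home",
     "status": "planned"},       # "yellow" rows: next in the queue
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["keyword", "frequency",
                                         "competition", "title", "status"])
writer.writeheader()
writer.writerows(core)
print(buf.getvalue())
```

Opening the resulting CSV in Excel gives exactly the kind of table described above, which you can then color by status.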

A little further down we will talk about what to expect if you decide to order a semantic core from specialists. For now I will say briefly: they will give you the same kind of list, only with thousands of "keys". However, in a semantic core it is not quantity that matters but quality, and that is what we will focus on.

Why do we need a semantic core at all?

But really, why go through this torment? You could, after all, just write high-quality articles and attract an audience that way, right? You can write them, but you won't attract anyone.

The main mistake of 90% of bloggers is simply writing high-quality articles and nothing more. I'm not kidding: they have genuinely interesting and useful material. But search engines don't know that. They are not psychics, just robots. Accordingly, they do not put such articles in the TOP.

There is another subtle point here, concerning the title. Suppose you have a very high-quality article on the topic "How to do business on the 'mug-book'" (a playful nickname for Facebook). In it you describe everything about Facebook in great detail and professionally, including how to promote communities there. Your article is the most high-quality, useful, and interesting one on the Internet on this topic. Nobody else even comes close. But it still won't help you.

Why quality articles fly out of the TOP

Imagine that your site was visited not by a robot but by a live checker (assessor) from Yandex. He realized that you have the coolest article, and manually put you in first place in the results for the query "Community promotion on Facebook".

Do you know what happens next? You drop out of there very soon, because no one clicks on your article, even in first place. People enter the query "Community promotion on Facebook", and your headline reads "How to do business on the 'mug-book'". Original, fresh, funny, but... not what they asked for. People want to see exactly what they searched for, not your creative flourish.

Accordingly, your article will occupy its place in the TOP for nothing. And the living assessor, an ardent admirer of your work, can beg his superiors as long as he likes to keep you at least in the TOP-10. It won't help. All the first places will be taken by articles as empty as sunflower-seed husks, copied from one another by yesterday's schoolchildren.

But those articles will have the correct, "relevant" title: "Community promotion on Facebook from scratch" (step by step, in 5 steps, from A to Z, for free, etc.). Offensive? You bet. Well then, let's fight the injustice. Let's build a competent semantic core so that your articles take the first places they deserve.

Another reason to start compiling your semantic core right now

There is one more thing people for some reason don't think much about. You need to write articles often (at least every week, preferably 2-3 times a week) to get more traffic, faster.

Everyone knows this, but almost no one does it. And all because they have “creative stagnation”, “they can’t force themselves”, “just laziness”. But in fact, the whole problem is precisely in the absence of a specific semantic core.

I entered one of my basic keys, "smm", into the search field, and Yandex immediately gave me a dozen hints about what else interests people who search for "smm". All I have to do is copy these keys into a notepad. Then I will check each of them the same way and collect hints for them too.

After the first stage of collecting the semantic core, you should end up with a text document containing 10-30 broad base keys, which we will work with further.

Step #2 - Parsing Basic Keys in SlovoEB

Of course, if you write an article for the query "webinar" or "smm", no miracle will happen. You will never reach the TOP for such a broad query. We need to break the base key into many small queries on the topic, and we will do this with the help of a special program.

I use Key Collector, but it is paid. You can use a free analogue, the SlovoEB program, which you can download from the official site.

The most difficult thing about this program is setting it up correctly. I have already shown elsewhere how to set up and use SlovoEB properly, but that article focuses on selecting keys for Yandex Direct.

And here let's take a look at the features of using this program for compiling a semantic core for SEO step by step.

First we create a new project and name it according to the broad key you want to parse.

I usually give the project the same name as my base key so I don't get confused later. And I will warn you against another mistake: don't try to parse all your base keys at once, or it will be very hard later to sift the "empty" key queries from the golden grains. Parse one key at a time.

After creating the project, we perform the basic operation: we actually parse the key through Yandex Wordstat. To do this, click the "Wordstat" button in the program interface, enter your base key, and click "Start collecting".

For example, let's parse the base key for my blog "contextual advertising".

After that, the process will start, and after a while the program will give us the result - up to 2000 key queries that contain "contextual advertising".

Next to each query there will also be its "dirty" frequency: how many times this key (plus its word forms and tails) was searched per month on Yandex. But I do not advise drawing any conclusions from these figures.

Step #3 - Gathering the exact frequency for the keys

The dirty frequency tells us nothing. If you rely on it, don't be surprised later when a key with 1000 queries a month doesn't bring you a single click.

We need to find the exact (net) frequency. To do this, first select all the found keys with checkmarks, then click the "Yandex Direct" button and start the process again. Now SlovoEB will collect the exact monthly frequency for each key.

Now we have an objective picture of how many times each query was actually entered by users over the past month. Next, I propose grouping all key queries by frequency to make them more convenient to work with.

To do this, click the filter icon in the "Frequency '!'" column and set it to filter out keys with a value "less than or equal to 10".

Now the program will show only the queries whose frequency is less than or equal to 10. You can delete these queries or copy them into another keyword group for the future. Less than 10 is very low; writing articles for these queries is a waste of time.
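The same filtering can be sketched in a few lines of Python, assuming you have exported the keys with their exact frequencies. The keyword list below is made up for illustration.

```python
# After exporting (keyword, exact frequency) pairs from SlovoEB, drop
# everything at 10 or fewer searches per month. Keywords are invented.
keys = [
    ("contextual advertising setup", 320),
    ("contextual advertising for a plumbing blog", 4),
    ("what is contextual advertising", 150),
    ("contextual advertising at night", 7),
]

CUTOFF = 10  # "less than or equal to 10" goes to the discard pile

workable = [(kw, freq) for kw, freq in keys if freq > CUTOFF]
discarded = [(kw, freq) for kw, freq in keys if freq <= CUTOFF]

print(workable)   # keys worth writing articles for
print(discarded)  # keep aside or delete
```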

Now we need to choose those keywords that will bring us more or less good traffic. And for this we need to find out one more parameter - the level of competition of the request.

Step #4 - Checking Query Competition

All "keys" in this world are divided into 3 types: high-frequency (HF), mid-frequency (MF), low-frequency (LF). And they can also be highly competitive (VC), medium competitive (SC) and low competitive (NC).

As a rule, HF queries are also HC. That is, if a query is searched often, there are a lot of sites that want to rank for it. But this is not always the case; there are happy exceptions.

The art of compiling a semantic core lies precisely in finding queries that have a high frequency and a low level of competition. Determining the competition level manually is very difficult.

You can look at indicators such as the number of home pages in the TOP-10, the length and quality of the texts, and the trust level of the sites ranking for the query. All of this will give you some idea of how tough the competition for positions is for this particular query.

But I recommend using the Mutagen service. It takes into account all the parameters I mentioned above, plus a dozen more that neither you nor I have probably even heard of. After the analysis, the service gives an exact value: the level of competition for the query.

Here I checked the query "setting up contextual advertising in Google AdWords". Mutagen showed that this key has a competitiveness of "more than 25", which is the maximum value it reports. And the query has only 11 searches per month, so it does not suit us.

We can copy all the keys we picked up in SlovoEB and run a bulk check in Mutagen. After that, we only have to look through the list and take the queries that are searched a lot but have a low level of competition.
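As a rough sketch of this selection step, assuming you have each key's exact frequency and a Mutagen-style competition score in hand (all values below are invented), the filter looks like this:

```python
# Keep only "high frequency, low competition" keys. The frequencies and
# Mutagen-style competition scores below are invented.
candidates = [
    ("setting up contextual advertising in google adwords", 11, 25),
    ("contextual advertising for beginners", 180, 4),
    ("contextual advertising cost", 95, 12),
    ("launch contextual advertising yourself", 60, 3),
]

MIN_FREQUENCY = 30    # enough monthly searches to be worth an article
MAX_COMPETITION = 5   # a young site should aim at roughly 3-5

selected = [kw for kw, freq, comp in candidates
            if freq >= MIN_FREQUENCY and comp <= MAX_COMPETITION]
print(selected)
```

The thresholds are assumptions you would tune to your own site's age and niche, as discussed below.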

Mutagen is a paid service, but you can do 10 checks per day for free. Besides, the cost per check is very low: in all the time I have worked with it, I have not yet spent even 300 rubles.

By the way, about the level of competition: if you have a young site, it is better to choose queries with a competition level of 3-5. If you have been promoting for more than a year, you can take 10-15.

And about query frequency: we now need to take the final step, which will let you attract a lot of traffic even with low-frequency queries.

Step #5 - Collecting "tails" for the selected keys

As has been proven and verified many times, your site will receive the bulk of its traffic not from the main keys but from the so-called "tails". These are the odd key queries people type into the search box with a frequency of 1-2 per month, but there are a great many of them.
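A quick back-of-the-envelope illustration of why the tail matters, with invented numbers: a couple of hundred rare variations at 1-2 searches a month can outweigh the head query.

```python
# Back-of-the-envelope: one "head" key vs. a long tail of rare queries.
# All numbers are invented for illustration.
head_query_visits = 150                  # exact monthly frequency of the main key
tail_frequencies = [2, 1, 1, 2, 1] * 40  # 200 odd variations, 1-2 searches each

tail_visits = sum(tail_frequencies)
print(head_query_visits, tail_visits)
```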

To see the "tail" - just go to Yandex and enter your chosen key query in the search bar. Here's what you'll see.

Now you just need to write these additional words out into a separate document and use them in your article. You don't have to put them right next to the main key every time; otherwise search engines will see "over-optimization" and your articles will drop in the results.

Just use them in different places in your article, and then you will receive additional traffic from them as well. I would also recommend that you try to use as many word forms and synonyms as possible for your main key query.

For example, we have the query "Setting up contextual advertising". Here's how you can reformulate it:

  • Setting up = set up, make, create, run, launch, enable, host…
  • Contextual advertising = context, Direct, teaser ads, YAN, AdWords, KMS…

You never know exactly how people will look for information. Add all these additional words to your semantic core and use when writing texts.
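The combinations above can be generated mechanically. Here is a small sketch using Python's itertools; the synonym lists are shortened, illustrative versions of those in the article.

```python
from itertools import product

# Generate reformulations of "setting up contextual advertising" by
# pairing each verb synonym with each subject synonym.
setup_words = ["setting up", "creating", "launching", "enabling"]
subject_words = ["contextual advertising", "context",
                 "Yandex Direct", "Google AdWords"]

variants = [f"{verb} {subject}"
            for verb, subject in product(setup_words, subject_words)]
print(len(variants))  # 4 x 4 = 16 phrasings to sprinkle through the text
```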

So, we collect a list of 100-150 keywords. If you are compiling a semantic core for the first time, it may take you several weeks.

Or maybe you shouldn't strain your eyes over it? Maybe you can delegate compiling the semantic core to specialists who will do it better and faster? There are such specialists, but it is not always worth using their services.

Is it worth ordering a semantic core from specialists?

By and large, semantic-core specialists will only do steps 1-3 of our scheme for you. Sometimes, for a large additional fee, they will also do steps 4-5 (collecting tails and checking query competition).

After that, they will give you several thousand key queries with which you will need to work further.

The question here is whether you are going to write the articles yourself or hire copywriters. If you want to focus on quality rather than quantity, you need to write yourself. But then a bare list of keys won't be enough: you will need to choose the topics you understand well enough to write a quality article.

And here the question arises: why do we actually need semantic-core specialists then? Admit it, parsing the base key and collecting the exact frequencies (steps #1-3) is not difficult at all. It will take you literally half an hour.

The hardest part is choosing high-frequency queries with low competition. And then, as it turns out, you also need HF-LC queries on which you can write a good article. This is exactly what will take 99% of your time working on the semantic core, and no specialist will do it for you. So is it worth spending money on such services?

When the services of semantic-core specialists are useful

It is another matter if you plan to bring in copywriters from the start. Then you do not need to understand the subject of the queries. Your copywriters won't understand it either; they will simply take a few articles on the topic and compile "their own" text from them.

Such articles will be empty, wretched, almost useless. But there will be many of them. On your own you can write at most 2-3 quality articles a week, while an army of copywriters will supply 2-3 shoddy texts a day. They will be optimized for the queries, though, which means they will attract some sort of traffic.

In this case, yes, go ahead and hire semantic-core specialists. Let them also draw up the briefs for the copywriters at the same time. But, as you understand, that will cost money too.

Summary

Let's go over the main ideas in the article again to consolidate the information.

  • The semantic core is just a list of keywords for which you will write articles to promote your site.
  • Texts must be optimized for exact key queries; otherwise even your highest-quality articles will never reach the TOP.
  • A semantic core is like a content plan for social networks. It keeps you out of "creative block" and means you always know exactly what you will write about tomorrow, the day after, and in a month.
  • To compile a semantic core, the free SlovoEB program is convenient, and it is all you need.
  • The five steps of compiling a semantic core are: 1 - selecting base keys; 2 - parsing the base keys; 3 - collecting exact frequencies for the queries; 4 - checking the competitiveness of the keys; 5 - collecting the "tails".
  • If you want to write the articles yourself, it is better to build the semantic core yourself, for yourself. Semantic-core specialists will not be able to help you here.
  • If you want to go for quantity and use copywriters, then it is entirely possible to delegate compiling the semantic core as well, as long as you have enough money for everything.

I hope this guide was helpful to you. Save it to your bookmarks so you don't lose it, and share it with your friends. Don't forget to download my book: there I show the fastest way from zero to your first million on the Internet (a squeeze of 10 years of personal experience =)

See you later!

Your Dmitry Novoselov

Besides compiling the semantic core itself, it is important to know how to apply it correctly, with maximum benefit for internal and external site optimization.

A lot of articles have already been written on how to make a semantic core, so within this article I want to draw your attention to some features and details that will help you use the semantic core correctly and thereby promote an online store or website. But first, I will briefly give my definition of a site's semantic core.

What is the semantic core of the site?

The semantic core of a site is a list (a set, an array) of keywords and phrases that users, your potential visitors, enter into search engines to find the information they are interested in.

Why does a webmaster need to create a semantic core?

The definition of the semantic core itself suggests many obvious answers to this question.

It is important for online store owners to know how potential buyers try to find the product or service they want to sell or provide. The store's position in the search results depends directly on this understanding: the more the store's content matches consumer search queries, the closer the store is to the TOP of the results, and the higher and better the conversion of visitors into buyers.

For bloggers who actively monetize their blogs (moneymaking), it is also important to be in the TOP of the search results for topics relevant to the blog's content. Increased search traffic brings more profit from impressions and clicks on contextual advertising, on affiliate-program ad units, and from other types of earnings.

The more original and useful content a site has, the closer it is to the TOP. Life exists predominantly on the first page of search results. Therefore, knowing how to make a semantic core is necessary for the SEO of any site or online store.

Sometimes webmasters and online store owners wonder where to get high-quality and relevant content. The answer follows from the question: create content according to users' key queries. The more relevant search engines consider your content to users' keywords, the better for you. Incidentally, this also answers the question of where to get a variety of content topics: by analyzing users' search queries, you can find out what interests them and in what form. Thus, having built the site's semantic core, you can write a series of articles and/or product descriptions for an online store, optimizing each page for a specific keyword (search query).

For example, I decided to optimize this article for the key query "how to make a semantic core", because the competition for this query is lower than for "how to create a semantic core" or "how to compose a semantic core". Thus, it is much easier for me to reach the TOP of the search results for this query using entirely free promotion methods.

How to make a semantic core, where to start?

There are a number of popular online services for compiling a semantic core.

The most popular service, in my opinion, is Yandex keyword statistics - http://wordstat.yandex.ru/

With the help of this service you can collect the vast majority of search queries in various word forms and combinations on any subject. For example, in the left column we see statistics on the number of queries not only for the keyword "semantic core" but also for various combinations of this keyword in different inflections and with diluting and additional words. In the right column, we see statistics for the search phrases that people entered along with the key phrase "semantic core". This information can be valuable, if only as a source of topics for new content relevant to your site. I also want to mention one feature of this service: you can specify a region. Thanks to this option, you can determine more precisely the number and nature of the search queries you need for the desired region.


Another service for compiling a semantic core is Rambler's search query statistics - http://adstat.rambler.ru/


In my subjective opinion, this service is for when there is a battle to attract every single user to your site. Here you can look up low-frequency and long-tail queries used by roughly 1 to 5-10 people per month, i.e. very few. I'll note right away that we will later cover keyword classification and the features of each group in terms of application. I personally use these statistics rarely, as a rule only when working on a highly specialized site.

To form a site's semantic core, you can also use the hints that appear as you type a query into the search engine.



And another option for residents of Ukraine to replenish the list of keywords for the semantic core of the site is to view site statistics on - http://top.bigmir.net/


Having chosen the desired topic section, we look for the open statistics of the most visited site relevant to the topic.


As you can see, the statistics of interest are not always open; as a rule, webmasters hide them. Still, it can come in handy as an additional source of keywords.

By the way, there is a wonderful article by Globator (Mikhail Shakin) - http://shakin.ru/seo/keyword-suggestion.html - where you can also read about how to make a semantic core for English-language projects.

What to do next with the list of keywords?

First of all, to make a semantic core, I recommend structuring the list of keywords: divide it into conditional groups of high-frequency (HF), mid-frequency (MF), and low-frequency (LF) keywords. It is important that each group contains keywords that are close in morphology and subject matter. It is most convenient to do this as a table. I do it like this:


The top row of the table holds the high-frequency (HF) search queries (written in red). I put them at the head of the thematic columns, and in each cell of a column I sorted the mid-frequency (MF) and low-frequency (LF) queries as evenly as possible by topic. That is, to each high-frequency query I attached the groups of MF and LF queries that suit it best. Each cell of MF and LF queries is a future article that I write and optimize strictly for the set of keywords in that cell. Ideally, one article should correspond to one keyword (search query), but that is very routine and time-consuming work, because there can be thousands of such keywords! So if there are too many, select the most significant ones for yourself and weed out the rest. Alternatively, you can optimize a future article for 2-4 mid- and low-frequency keys.
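The same table can be sketched as a simple data structure, with each HF query heading a column and each cell holding the 2-4 keys for one future article. The keywords below are invented for illustration.

```python
# Sketch of the table described above. Each HF query heads a thematic
# column; each cell groups the MF/LF queries for one future article.
# All keywords are invented.
table = {
    "semantic core": [                      # HF head of the column (red row)
        ["how to make a semantic core", "semantic core example"],
        ["semantic core for an online store", "store keyword list"],
    ],
    "keyword statistics": [
        ["yandex wordstat statistics", "check query frequency"],
    ],
}

for hf_key, cells in table.items():
    for cell in cells:
        # one article per cell, optimized for 2-4 keys as suggested above
        print(hf_key, "->", cell)
```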

For online stores, mid- and low-frequency search queries are usually product names, so there is no particular difficulty here. Internal optimization of each page of an online store is simply a long process; the more goods in the store, the longer it takes.

I highlighted in green the cells for which I already have articles ready, so that I won't get confused later about which articles are finished.

I will talk about how to optimize and write articles for the site in one of the future articles.

So, having made such a table, you get a very clear idea of how to build the site's semantic core.

To conclude this article: here we have to some extent touched on the details of promoting an online store. Surely, after reading some of my articles about online-store promotion, you have gathered that compiling the site's semantic core and internal optimization are interrelated and interdependent activities. I hope I have managed to argue the importance and primacy of the question of how to make a site's semantic core.

Hello everyone!

What should you do with the semantic core? Probably every beginner in SEO promotion asks this question (judging by myself), and for good reason. At the initial stages it really is unclear why you sat for so long collecting keywords for the site with one tool or another. Since I myself struggled with this question for a long time, I will devote a lesson to this topic.

What is the purpose of the semantic core?

First, let's figure out why we collected the semantic core in the first place. All SEO promotion is based on the keywords that users enter into the search box. From them, things like the site's structure and its content are created, which are in fact the main ranking factors.

Also, do not forget about external optimization, in which the semantic core plays an important role. But more on that in the next lesson.

To summarize: the semantic core is necessary for:

  • Creating a site structure that will be understandable to both search engines and ordinary users;
  • Content creation. Content today is the main way to promote a site in the search results: the better and more plentiful your quality content, the higher the site ranks. More on creating quality content later;

What to do with the semantic core after compilation?

So, after you have compiled the semantic core, that is, collected, cleaned, and grouped the keywords, you can begin to form the site's structure. In fact, when you grouped the queries as we did in lesson #145, you already created the structure of your web resource:

You just need to implement it on the site, and that's it. This way you form the structure not on the basis of what you happen to have in stock, but on the basis of consumer demand. You will not only benefit the web resource in terms of SEO, but also do the right thing from the point of view of the business as a whole. No wonder they say: where there is demand, there must be supply.

We seem to have figured out the structure, so now let's move on to content. Again, by grouping queries in Key Collector you have found topics for the future content you will fill your pages with. For example, let's take the group "Mountain bikes" and break it into small subgroups:


Thus, we have created two subgroups of keywords for individual pages. Your task at this stage is to form groups (clusters) so that each cluster contains semantically identical keywords, that is, keywords with the same meaning.

Remember one rule: each cluster has a separate page.
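As a toy illustration of this rule, here is a naive word-based clustering sketch in Python. Real clustering tools usually compare search results rather than just words; the stop-word list and queries below are invented.

```python
from collections import defaultdict

# Naive clustering: queries that share the same set of "topic" words
# (ignoring stop words) land in one cluster, i.e. one future page.
STOP_WORDS = {"a", "the", "buy", "price", "how", "to", "for"}

def topic_key(query: str) -> frozenset:
    """Reduce a query to its set of topic-bearing words."""
    return frozenset(w for w in query.lower().split() if w not in STOP_WORDS)

queries = [
    "buy a mountain bike",
    "mountain bike price",
    "how to choose a mountain bike",
    "choose mountain bike",
]

clusters = defaultdict(list)
for q in queries:
    clusters[topic_key(q)].append(q)

for key, group in clusters.items():
    print(sorted(key), "->", group)
```

Each resulting cluster would get its own page, per the rule above.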

Grouping is, of course, not very convenient for beginners, since it requires a certain skill, so I will show another way to form article topics. This time we'll use Excel:


Based on the resulting data, you can then create separate pages.

This is how I carry out clustering (grouping), and I am quite satisfied with it. I think you now understand what to do with the semantic core after compiling it.

Perhaps the example in this lesson is too general and does not paint a specific picture. I just want to convey the essence of the process; after that you will use your own head. So I apologize in advance.

If this lesson was useful to you and helped solve your problem, please share the link on social networks. And, of course, subscribe to blog updates if you haven't already.

Good luck to you, friends!

See you soon!

In the last article I covered the theory; today it is time to talk about how to compose a site's semantic core using special programs and services.

Summer is coming, it is getting harder and harder to blog, and as a result articles are published rarely, which you are probably not happy about. Isn't that so?

To be honest, it is not about summer at all, but about my own "stupidity" when creating the blog at the initial stage. When creating it, I started from my own thoughts rather than from user queries. It turned out that I was answering my own questions, which is funny, really.

In one of my previous posts I already wrote about why a site needs a semantic core and what it is, covering the very basics of its creation; now it is probably time to move on to the practice of building one.

Summary:

How to create a semantic core for a website

In the last article I mainly covered theory, but as practice shows, not everyone grasps such material, so here is a step-by-step instruction that you can take as a basis and use to make your own semantic core without much effort. Take it only as a basis: progress does not stand still, and perhaps more advanced ways to build a core already exist.

We are looking for competitors to compose the semantic core of the site

So, first of all, we must decide what our future site will be about. I will take the topic "environmental ecology" as a basis and select keywords for it. To determine the main keys, I will take competitor sites occupying the first places in Yandex and Google.

As you understand, I will take the first three sites in Yandex as a basis.

seasons-years.rf
zeleneet.com
biofile.ru

We select competitors' queries - automatically

At this step, any service that can automatically determine which of a site's queries are suitable for promoting it to the top will do. I prefer SeoPult; which service you prefer is purely your own choice, and you can write to me about it in the comments and we will discuss it.

We add our donor site, not our own site!

As you may have guessed, I took one of the three sites; you should take several. Then we click "Next"...

As you can see, I selected only the first 10 queries for a visual example. You should collect as many keywords as possible, and don't be too lazy to think about how you yourself would ask Google to find the answer to the question you are going to cover on your blog or site. Then we proceed to the next stage of forming the semantic core with our own hands.

As you already understood, all that remains is to export the keyword database and move on to the next step toward the site's semantic core, which you will build yourself. Don't forget to tick the keywords you want to save to the Excel file.

We select low-frequency and high-frequency requests

At this stage the rookee.ru service will help us (careful: affiliate link). Go there, register, and create a new advertising campaign using our competitor's site.

Add the site and follow the resource's step-by-step instructions.

Now wait a bit for the robot to crawl the site and give you the most interesting queries, which you can also use later. Here is my result.

As you can see, the system found 299 queries, but they do not interest me; I will simply delete them and add the 10 that SeoPult gave me at the first stage. By the way, one query must remain, otherwise you will not be able to delete everything.
This is what I got after the deletion. Once I add the main query base, I will delete this key as well, since it is not relevant for me. Now we add the queries we collected earlier.

Now go to the "queries" folder and wait a bit for the robot to check the queries. In the end, all my words turned out to be LC (low competition), which suits me fine for launching the site and attracting the first readers to the blog.

If some of your words come out as HC (high competition) or MC (medium competition), discard them at this first stage of forming the core and choose only low-competition ones.

Export the words and move on to the final step. Since all my words are low-competition, I did not delete anything; your situation may differ. But I urge you to delete medium- and high-competition queries, as they mean extra costs for you.

We compose the semantic core for the site

For the final step we need the Key Collector program; I will use its free prototype, which you can easily find on the Internet. I'll leave the name of that program to you as homework. So, we open the program and enter the keys we found; I have 9 of them so far.

As we can see, out of the 9 queries it is best to write about a device for measuring air humidity. This is an example with one keyword that fits a low-frequency query on the chosen topic.

Do not forget that the program can also collect up to a thousand words for the main query along with its word forms; to do this, simply enter the words in the red "batch collection of words from Yandex Wordstat" tab.

After you have collected the maximum number of relevant words, do not forget to sort them by importance, starting with the least important. Also try to set up good internal linking to your article: this will improve the behavioral factors of your resource.

Well, that's all; work has delayed me a little. And how do you build the semantic core for your online store? Write me about it in the comments.

P.S. You can also receive my new articles by email by subscribing to the RSS feed and be the first to learn my secrets and methods. Write me about your own method of composing a semantic core in the comments, and we will discuss it.