How quickly Yandex indexes new pages. What does "indexing" mean. Yandex Webmaster pre-configuration

Have you created a website but can't find it in search engines? No problem! In this article you will learn how to get a site indexed in Yandex and Google as quickly as possible. There is probably no need to explain the advantages of getting into a search engine's index quickly: the sooner your site appears in the search results, the sooner new customers will find it. And for that to happen, the site first has to get into the search engine's database.

By the way, thanks to the right approach, new materials on our site are not only of good quality but, most importantly, are always indexed quickly by search engines. Perhaps you also landed on this page after typing the corresponding query into the search bar. Now let's move from words to practice.

How to find out if a site is indexed?

The first thing to do is to find out whether the site has been indexed by search engines at all. It may simply not be on the first page of results for the query you entered. These may be high-frequency queries, and to rank for them you need to keep working on the site, not just build and launch it.

So, to check, go to every search engine that is worth checking (Yandex, Google, Mail.ru, Rambler) and enter the site address in the search box.
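For example, you can type operator queries like these directly into the search box (example.com is a placeholder for your own domain):

site:example.com
host:example.com | host:www.example.com

The site: operator works in both Google and Yandex; host: is the older Yandex form. If such a query returns your pages, the site is in the index.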

If your resource has not yet been indexed, nothing will be shown in the search results, or other sites will appear.

How to index a website in Yandex?

First, we'll explain how to get a website indexed in Yandex. But before adding your resource, check that it works correctly, opens properly on all devices and contains only unique content. For example, if you add a site while it is still under development, you can simply fall under a filter - this happened to us once, and we had to wait a whole month for Yandex to recognize that the site was high-quality and lift the sanctions.

To inform Yandex about a new site, you need Yandex.Webmaster and Yandex.Metrica. The first tool is responsible for additional information about the resource (region, structure, quick links) and for how the site looks in the organic search results; the second collects site data (traffic, user behavior, etc.), which, in our experience, also affects the indexing of the site and its positions in the search. Also, be sure to create a sitemap and submit it in the webmaster panel.
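Besides submitting the sitemap in the webmaster panel, it is common practice to point robots to it from robots.txt; a minimal sketch (the domain and file name are placeholders) might look like this:

User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml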

How to get your site indexed on Google?

Most often, Google finds new sites quickly on its own and pulls them into the search results, but waiting for Google to come and do all the work for us is too optimistic, so let's figure out how to get a site indexed in Google.

The first step is the Google webmaster panel, Google Search Console, and - by analogy with Yandex.Metrica - Google Analytics. The goals are the same: to give the search engine as much information about the site as possible.

After the site has been added, it can take from 3 to 7 days before the search engines update their data and index the site.

You always want new pages of your site to appear in the search results as quickly as possible, and for this there are several secret (and very simple) ways to speed up the indexing of site pages in search engines.

3. To perform the action described in point 2 for the Google search engine only, go to Search Console. Select "Crawl" and then "Fetch as Google", add the address of the new page and click "Fetch", and after that request indexing.

Site indexing analysis

In conclusion, it should be noted that even after a site has been successfully indexed by the search engines, the work does not end there. You need to periodically analyze the site's indexing, as well as track its positions for popular queries. This will let you keep your finger on the pulse and avoid a situation where a significant share of traffic from organic search simply disappears.

This happened to many old sites that kept using outdated promotion methods after the release of a new Yandex algorithm. Yandex announced in advance that it was launching this algorithm and that over-optimized pages would be excluded from the search, while Google never announces the release of new algorithms at all. Therefore, only constant monitoring will allow you to remain the leader in your topic, or become one!

By and large, if your resource is good and well made, there should be no problems with indexing it. If the site, even if not 100%, meets the search engines' requirement of being made "for people", they will gladly drop by and index everything new that is added.

But be that as it may, the first step in promoting a site is adding it to the search engines' index. Until the resource is indexed, there is essentially nothing to promote, because the search engines simply do not know about it. Therefore, in this article I will look at what site indexing in Yandex is and how to submit a resource for indexing. I will also tell you how to check whether a site or an individual page is in the Yandex index and what to do to speed up indexing in Yandex.

How Yandex indexes a website

Indexing a site in Yandex means that the robots of the Yandex search engine crawl your site and enter all open pages into their database. The spider of the Russian search engine adds information about the site to the database: its pages, pictures, videos and documents that are available for search. The search bot also indexes links and other elements that are not hidden by special tags and files.

The main methods of indexing a resource:

    Forced - you should send the site for indexing to Yandex through a special form.

    Natural - the search spider finds your site on its own, following links from external resources that link to it.

The time for indexing a site in Yandex is different for everyone and can range from a couple of hours to several weeks.

It depends on many factors: what values are specified in Sitemap.xml, how often the resource is updated, and how often the site is mentioned on other resources. The indexing process is cyclical, so the robot will come back to you at (almost) regular intervals. How often depends on the factors mentioned above and on the specific robot.

The spider can index the entire website (if it is small) or a separate section (this applies to online stores and media). On frequently updated resources, such as mass media and information portals, so-called fast robots live to provide quick site indexing in Yandex.

Sometimes a project has technical problems (or problems with the server); in that case Yandex indexing of the site will not take place, and the search engine may resort to the following scenario:

  • immediately throw out non-indexed pages from the database;
  • re-index the resource after a certain time;
  • mark pages that could not be indexed for exclusion from the database, and if it does not find them during re-indexing, throw them out of the index.

How to speed up website indexing in Yandex

How to speed up indexing in Yandex is a frequent question on various webmaster forums. In fact, the life of the entire site depends on indexing: the resource's positions in the search engines, the number of customers they bring, the popularity of the project and, ultimately, the profit.

I have prepared 10 methods that I hope will be useful to you. The first five are standard for constant indexing of a resource, and the next five will help you speed up the indexing of a site in Yandex:

    bookmarking services;

    RSS feed - will broadcast new materials from your resource to subscribers by e-mail and to RSS catalogs;

    link exchanges - will provide a stable increase in dofollow links from high-quality donors, provided they are selected correctly;

    site directories - if you have not registered your site in directories yet, I advise you to do so. Many people say that directories are long dead or that registering in them will kill a site - this is not true. More precisely, it is not the whole truth: if you register in every directory indiscriminately, your resource will indeed only suffer from it. But with the right selection of trusted, good directories, there will undoubtedly be an effect.

Checking site indexing in Yandex

Visits to a site from search engines are one of the primary sources of visitors - in other words, of potential users of the product or service presented on the resource. Timely indexing of the site by search engines keeps you from losing those customers. Therefore, the actions related to being represented in the search should come first, especially for new sites.

Why, when it comes to indexing, you should focus first of all on Google and Yandex

This is because these "search engines" surpass all other systems available today in the level of development of their main characteristics:

  • Accuracy - how well the documents found by the system match the query. For example, when a user enters "buy a fur coat" into the search bar, the "search engine" displays 90-100% of documents containing this exact combination of words. The higher the percentage of matching, the better.
  • Completeness - the share of documents the system returns to the user out of all the documents available on the web on the topic. If there are, say, 100 documents on the web for the query "Food for a 1 year old child" and the "search engine" returns only 70 of them, the completeness is 0.7. The search engine with the highest value "wins".
  • Search speed is related to technical characteristics and the capabilities of each "search engine". The higher it is, the more users will be satisfied with the work of the system.
  • Clarity of the search - the quality of the presentation of information for a query and the system's hints regarding the documents found, i.e. the presence of simplifying elements on the results page.
  • Freshness - a characteristic that indicates the time interval between the appearance of information and its entry into the index database. Large search engines have a so-called "quick base" that allows them to index new information in a short time.

Step-by-step instructions for setting up indexing

Before you send a site for indexing by search engines, you need to do some preliminary preparation. There are several reasons for this:

  • Competent preliminary work will keep the search robot from indexing unnecessary or unfinished information.
  • If the robot detects flaws - missing metadata, grammatical errors, uninformative links that have not been closed from indexing - the search engine will respond with a lower rating for the site, incorrect presentation of the material in the search results, and so on.
  • While the preparatory work is being carried out, the information should be hidden from robots and from indexing with a corresponding entry in the robots.txt file.
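As a rough sketch, for the period of development the entire site can be temporarily closed with two lines in robots.txt (remove them before launch):

User-agent: *
Disallow: /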

Proper preparation for indexing will include:

1. Development of meta tags, description and title of pages:

  • Title must be no more than 60 characters. This is the main title of the page and the most important of the tags.
  • Description consists of readable phrases that position the page, that is, the main theses of what exactly will be discussed in this material.
  • The keywords tag assumes listing all possible words related to the topic of the page. Recently, the weight of this tag in the eyes of search engines has diminished.
  • The meta tag revisit (or revisit-after) indicates the period after which site updates are planned; it is a kind of request-recommendation from the optimizer to the robot, suggesting the optimal interval before the next check of the resource. This tag should only be used when you are fully confident in the result; otherwise, it may have the opposite effect.
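Taken together, the tags from point 1 might look roughly like this in the page's <head> (all titles, texts and values here are placeholders, not recommendations):

<head>
  <title>Women's fur coats - buy in Moscow with delivery</title>
  <meta name="description" content="Catalog of natural fur coats: prices, photos, sizes, delivery across Moscow.">
  <meta name="keywords" content="fur coats, buy a fur coat, fur coats Moscow">
  <meta name="revisit-after" content="7 days">
</head>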

2. Hiding internal and uninformative sections of the site. This work is also done in the robots.txt file. The "search engine" considers this kind of information junk, so leaving it open will be a minus when the resource is assessed.
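For instance, service sections might be closed like this (the paths /admin/ and /search/ are placeholders for your own technical sections):

User-agent: *
Disallow: /admin/
Disallow: /search/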

5. Highlighting keywords and main points in bold must be done carefully, because the search engine treats these words as the most important, which is not always actually the case.

6. All existing images must have the alt attribute filled in.
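For example, an image tag with the alt attribute filled in might look like this (the file name and text are placeholders):

<img src="/images/red-fur-coat.jpg" alt="Red fur coat, front view">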

7. The texts must be checked for the number of keyword repetitions and set phrases, so that the robot does not ignore the information because of excessive keyword density ("text nausea").

8. A mandatory step before submitting a resource to the search engines for indexing is checking for spelling, grammatical and stylistic errors. If there are any in the description, the system will show the information in a form that can scare off a large share of potential visitors already at the stage of the search results.

In order for the resource to appear in the search results for a user's query, you need to set up indexing in the main search engines:

1. Google Search Console:

  • The form for adding a resource is available here https://www.google.com/webmasters/tools/submit-url.
  • To use the service, you need to log in with your Google account.
  • A window will appear in which you must enter the address of the resource that requires indexing.
  • Website ownership is verified by uploading an HTML file to the root of the resource (see the example below).
  • The Google system will display a message confirming the ownership of the site, which will indicate the inclusion of the resource in the index by this "search engine".
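The verification file mentioned above is a tiny text file; assuming a placeholder token such as google1234abcd, it is named google1234abcd.html and contains a single line:

google-site-verification: google1234abcd.html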
2. Yandex.Webmaster:

  • The form for adding a site is located at: http://webmaster.yandex.ru/addurl.xml.
  • A form will open where you need to enter the address of the home page of the promoted resource. The system usually asks you to enter a captcha, after which you click the "Add" button.
  • The Yandex search engine checks the resource and then gives its answer - a decision on indexing. If Yandex writes that the site has been added, the resource has been queued for indexing. Problems with the server will cause the system to respond: "Your hosting is not responding."
If the "search engine" displays the message "The specified URL is prohibited from indexing", this indicates the sanctions imposed on the site. In this case, you will need to urgently contact Yandex technical support specialists.

In addition to indexing in the main systems, do not forget about the slightly less well-known "search engines":

  • Rambler relies on Yandex indexing data, so to get into its base it is enough to be indexed by the main search engine.
  • Indexing in Mail.ru is performed here: http://go.mail.ru/addurl.
  • The traffic of the Russian search engine Nigma.ru is about 3,000,000 per day. You can apply for indexing in this system here: http://www.nigma.ru/index_menu.php?menu_element=add_site.

Other ways to customize indexing

Search engines decide whether to index a site regardless of the owner's wishes.

Therefore, the term "customization" in relation to the indexing process does not sound entirely correct.

It would be more correct to speak of creating conditions under which the search engine will make a positive decision about indexing the resource.

These conditions include:

  • Creating pages or groups in social networks that tell about the resource, and directing a flow of visitors by publishing posts with interesting information and an offer to follow the link to learn more, place an order or get additional details on the topic.
  • To increase the likelihood of site approval and indexing in Google, it is useful to register an account in this system and start being active.
It is necessary to understand that without indexing the site by search engines, all subsequent promotion actions will be useless.

Therefore, this should be done first of all (for new sites) and checked periodically whenever you add fresh information or whole new pages (for existing resources).

Site indexing in search engines - how it happens and how to speed it up

After creating their own website, many webmasters relax and think that the hardest part is over. In fact, this is not the case. First of all, the site is created for visitors.

After all, it is visitors who will read the pages with articles, buy goods and services posted on the site. The more visitors, the more profit. And traffic from search engines is the basis of everything, so it is so important that the indexing of the site is fast and the pages are kept stable in the index.

If there is no traffic, few people will even know about the site; this is especially true for young Internet resources. Good indexing helps a page get to the top of the search engines as soon as possible and, as a result, attract a large number of targeted visitors.

What is indexing and how it works

First you need to understand what it is. Site indexing is the process of collecting information from the pages of a site and entering it into the search engine's database. The collected data is then processed, and after a while the page appears in the search results, where people can find it through that search engine.

Programs that collect and analyze information are called search robots or bots. Each search engine has its own robots. Each of them has its own name and purpose.

As an example, there are 4 main types of Yandex search robots:

1. A robot that indexes site pages. Its task is to detect and enter the found content pages into the database.

2. A robot that indexes pictures. Its task is to detect and enter into the search engine database all graphic files from site pages. Then these pictures can be found by users in a Google image search or in the Yandex.Pictures service.

3. A robot that indexes site mirrors. Sometimes sites have multiple mirrors. The task of this robot is to identify these mirrors using information from robots.txt, and then give users only the main mirror in the search.

4. A robot that checks the availability of the site. Its task is to periodically check the site added via Yandex.Webmaster for its availability.

In addition to the above, there are other types of robots. For example, robots that index video files and favicons on site pages, robots that index "fast" content, as well as robots that check the performance of an Internet resource hosted in Yandex.Catalog.

Indexing of site pages by search engines has its own peculiarities. If the robot discovers a new page on the site, it is entered into the database. If the robot detects changes on old pages, their previously stored versions are deleted and replaced with new ones. And all this happens over a period of time, usually 1-2 weeks. Such long terms are explained by the fact that search robots have to work with a huge amount of information (a large number of new sites appear every day, and old ones are constantly updated).

Now about the files that search engine bots can index.

In addition to web pages, search engines also index some files of closed formats, though with certain restrictions. In PDF, for example, robots read only the text content. Flash files quite often are not indexed at all (or only text placed in special blocks is indexed). Robots also do not index files larger than 10 megabytes. Search engines index text best of all: it is indexed with the minimum number of errors, and the content is entered into the database in full.

To summarize, many search engines at the moment can index formats such as TXT, PDF, DOC and DOCX, Flash, XLS and XLSX, PPT and PPTX, ODP, ODT, RTF.

How to speed up the process of site indexing in search engines

Many webmasters are thinking about how to speed up indexing. First, you need to understand what the indexing time frame is. This is the time between visits to the site by the search robot. And this time can vary from a few minutes (on large information portals) to several weeks or even months (on forgotten and abandoned small or new sites).

Content theft is a frequent problem. Someone can simply copy your article and post it on their website. If the search engine indexes that copy before your own page, it will consider that site, not yours, to be the author. And although some tools have appeared that allow you to indicate the authorship of content, the speed of indexing of site pages has not lost its relevance.

Therefore, below we will give tips on how all this can be avoided and speed up the indexing of your resource.

1. Use the "Add URL" function - these are the so-called addurilki, which are forms in which you can enter and add the address of any page on the site. In this case, the page will be added to the indexing queue.

Such a form is available in most major search engines. So that you do not have to search for all the addresses of these forms yourself, we have collected them in a separate article: "". This method cannot be called 100% protection against plagiarism, but it is a good way to inform the search engine about new pages.

2. Register the site in Google Webmaster Tools and Yandex.Webmaster service. There you can see how many pages of the site have already been indexed, and how many have not been indexed. You can add pages to the indexing queue and do much more with the tools available there.

3. Make a sitemap in two formats - HTML and XML. The first is needed for placement on the site and for ease of navigation. The second map is needed for search engines: it contains links to all pages of your site, so when indexing, the robot will not miss any of them. A sitemap can be created using CMS plugins or numerous online services.
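Whichever tool you use to generate it, the XML sitemap itself is a simple file; a minimal sketch with placeholder URLs and values looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/articles/how-to-index/</loc>
    <lastmod>2017-01-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>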

The following are excellent solutions for creating it:

4. Announce articles on social networks - Google+, Twitter, Facebook, Vkontakte. Immediately after adding a new article to the site, announce it on your Google+ page, your Twitter feed and your Facebook and Vkontakte pages. It is best to put social network buttons on the site and post the announcements simply by clicking them. You can also set up automatic announcements on Twitter and Facebook.

5. Cross-post to various blog platforms. You can create blogs on services such as Li.ru, Livejournal.com, wordpress.ru or blogspot.com and publish short announcements of your articles there with links to the full versions on your website.

6. Make an RSS feed of the site and register it in various RSS directories. You can find their addresses in the article: "".

7. Frequency of site updates. The more often new materials appear on your site, the more often search robots will visit it. For a new site, it is best to publish every day, or at least every other day.

9. Post only unique content on your site. This is a universal rule that improves more than just indexing. The more unique the material, the better search engines will treat your site and the more often search robots will visit you.

These methods of speeding up indexing will be quite enough for a young or middle-aged site. They will not take much of your time and they work well.

Preventing pages from being indexed

In some cases, a webmaster needs to close the entire site, or individual pages and sections, from indexing. What is this for? For example, some pages of your site do not contain useful information - all sorts of technical pages. Or you need to hide unnecessary external links, banners and so on from indexing.

1. Robots.txt.

You can close individual pages and sections of a resource from indexing using the robots.txt file. It is placed in the root directory of the site and contains rules for search robots: which pages and sections not to index, including rules for individual search engines.

With the help of special directives of this file, you can very flexibly control the indexing.

Here are some examples:

You can prohibit indexing of the entire site by all search engines using the following directive:

User-agent: *
Disallow: /

Disable indexing of a specific directory:

User-agent: *
Disallow: /files/

Disable indexing of URLs that contain "?":

User-agent: *
Disallow: /*?
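Rules can also be addressed to a specific search engine by naming its robot in the User-agent line; a sketch with placeholder paths:

User-agent: Yandex
Disallow: /private/

User-agent: *
Disallow: /search/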

And so on. The robots.txt file has many directives and capabilities, and this is a topic for another article.

2. There are also the noindex and nofollow tags and meta tags.

To prohibit indexing of certain content on a page, just place it between the <noindex> and </noindex> tags, but these tags are only understood by the Yandex search engine.

If you need to close an individual page or pages of the site from indexing, you can use meta tags. To do this, between the <head> and </head> tags of the page you need to add the following:

<meta name="robots" content="noindex"/>

If you add:

<meta name="robots" content="noindex, nofollow"/>

then the document will not be indexed either.

If you add:

<meta name="robots" content="nofollow"/>

then the search engine robot will not follow the links on this page, but will index the page itself.

Whatever is specified in the meta tags takes precedence over the directives of the robots.txt file. So if you prohibit indexing of a certain directory of your site in robots.txt, but the pages belonging to that directory carry the following meta tag:

<meta name="robots" content="index, follow"/>

then those pages will still be indexed.

If the site is built on a CMS, some of them allow you to close pages from indexing using special options. In other cases, these meta tags have to be inserted into the site pages manually.

In the next articles, we will take a closer look at the procedure for prohibiting indexing and everything connected with it (using the robots.txt file, as well as the noindex and nofollow tags).

Indexing and page dropping issues

There are many reasons why an Internet resource may not be indexed. Below we list the most common ones.

1. The robots.txt file is configured or written incorrectly.

2. Your site's domain has previously been used for another site and has a bad history; most likely, some kind of filter was applied to it earlier. Problems of this kind most often concern indexing in Yandex. The pages of the site may get into the index during the first indexing, then drop out completely and no longer be indexed. When you contact Yandex support, you will most likely be told to keep developing the site and everything will be fine.

But as practice shows, even after 6 months of publishing high-quality unique content on the site, there may be no positive movement. If you have a similar situation and the site has not been indexed for 1-2 months, then it is better. As a rule, after that everything falls into place and the pages of the site begin to be indexed.

3. Non-unique content. Add only unique material to the site. If the pages of your site contain a lot of copy-paste, do not be surprised if over time they drop out of the index.

4. Link spam. On some sites, pages are literally overflowing with external links. The webmaster usually places all of them in order to earn more. However, the end result can be very sad: individual pages or even the entire site may be excluded from the index, or other sanctions may be imposed.

5. The size of the article. If you look at the source code of any page of your site, you will see that the text of the article itself takes up little space compared to the code of the other elements (header, footer, sidebar, menu, etc.). If the article is too small, it can simply get lost in the code, and there may also be problems with the uniqueness of such a page. Therefore, try to publish posts of at least 2,000 characters of text; such content is unlikely to cause problems.

How to check site indexing

Now let's talk about how to check the indexing of your Internet resource and find out exactly how many pages are indexed.

1. First of all, try entering the address of the page you are interested in (or a fragment of its text) into an ordinary Google or Yandex search. The results should include this page. If the page is not there, it is not indexed.

2. To check the indexing of all pages of a site in Yandex, it is enough to enter host:your-site.ru | host:www.your-site.ru into the search and run the query. For Google, it is enough to enter site:your-site.ru into the search form.

3. You can also check your site using a service such as pr-cy.ru. Everything is simple and clear there: enter the address of your resource into the field in the center of the page and click the "Analyze" button. After the analysis you will get the results of the check and find out how many pages are indexed in each search engine (in the section called "Key site indicators").

4. If your site has been added to the Yandex Webmaster service, then there you can also track the indexing of the website pages by this search engine.

Greetings, friends. Today's article is an important one: I will share information on how to speed up the indexing of new pages on a site and show what tools I use. Beginner bloggers should pay particular attention.

Those who have just started blogging have already faced this problem. You write new articles and publish them, but there is still no traffic. You check whether Yandex and Google have indexed the new posts, and find out that they have not. A week passes, two weeks, even a month, and still no indexing.

What problems does this bring? The most dangerous one: your content can be copied by stronger, better-promoted blogs, and the authorship of the excellent article you worked on will go to them. The search engine may even mistake your site for the plagiarist.

The second problem is not as critical, but still unpleasant - a long wait for traffic. There are already 50 articles, it seems, but no visitors.

But okay if you have a young blog (2-4 months) - but what about those who have been blogging for a year or more and still have difficulties with fast indexing? First of all, do not despair. And secondly, read my indexing acceleration algorithm below.

Yandex "Original Texts"

Point number 0. There are various rumors among bloggers and SEO specialists: some say this tool is useless, others argue the opposite. Since no one has given a definite answer yet, I advise you not to neglect this trick and to add the text of a new article here.

Attention! You need to add the text before publishing it on your blog. If the text has already been indexed by Yandex, you should not add it, otherwise the tool will take it for plagiarism.

After that you can publish your material on the blog.

Yandex and Google "Add URL" forms

This point has the greatest effect. In Google, indexing of a new page happens almost instantly!

The set of social networks depends on the content you share. If you have good-quality photos, you can add Instagram, Pinterest and Tumblr. For videos - Youtube and Vimeo. For a portfolio - Behance and Coroflot.

It is even better if you have your own blog group. Post an announcement in the group and repost it on your wall. If the group consists of real people rather than bots, you will also get click-throughs, which will only increase the speed of indexing of the new page.

How can you further strengthen the influence of social networks on indexing? You can buy social signals:

  • Reposts, retweets, likes of your post;
  • Separate posts from other people.

For these tasks I use Forumok and Webartex. Most often I order tweets (7-9 pieces) and VK likes (5-10). The price on Forumok starts from 2.5 rubles; on Webartex it is more expensive. On Google+ I get likes by exchange: I give them to other bloggers and they give them back to me. I have already formed a list of those who reciprocate. Add me to your circles, colleagues: my social network links are in the sidebar on the right.

Actions that affect the speed of indexing

In addition to the active methods described above, which are applied immediately and once, there are passive ones. Their effect depends on long-term, regular work.

  1. Post new content more often;
  2. Keep it regular - the search robot likes it when new articles appear at equal intervals and visits such blogs more actively;
  3. Create a sitemap for robots;
  4. Interlinking - add internal links to other articles;

You have probably noticed that since the new year I have been publishing articles regularly (at least one per week, sometimes more often). As a result, Google indexes new material within 2 minutes.

Algorithm to speed up indexing

I will summarize the final sequence of actions:

  1. Before publishing, add the article to Yandex "Original Texts";
  2. Submit it through the search engines' "Add URL" forms;
  3. Repost it to your social network profiles;
  4. Strengthen the social signals.

I am sure this information will help you speed up the indexing of new pages. I would be glad if you repost this article, because I did my best, guys :) Good luck!