Accelerated site indexing: mission possible

Any site owner wants to see their content at the top of the search results. For a page with published material to be easy to find in search, it must be registered in the index – a special database of the search engines. Pages that search engines do not know about do not participate in the search results: they need to be “introduced”, in other words, indexed.

1. What indexing affects

Indexing is performed by special robots that crawl the entire resource, collect information from its pages, and send requests to the server. If the resource passes verification, its data is entered into the index.

In the process of indexing, robots evaluate a set of factors:

  • quality and frequency of content updates – texts and graphic elements;
  • topical relevance;
  • engagement and traffic;
  • accessibility from various devices;
  • internal and external links;
  • page nesting depth;
  • loading speed, etc.

The more useful the site, the higher its pages are ranked, and growth in the SERPs leads to an influx of traffic. Indexing thus contributes to the rapid promotion of a web resource and, through the increase in traffic, to more conversions. It is definitely worth knowing how to speed it up.

2. Indexing algorithms

When a site is updated only occasionally, search robots visit it no more than once a week; with daily updates, indexing is carried out several times a day. A search engine indexes no more than about 30 pages per visit.

There is also an indicator called indexing depth: the number of levels the robot passes through sequentially, following the site's links starting from the main page. Often it studies only the upper levels, so the lower ones are temporarily left unindexed until the next visit. This is why site owners want to speed up the indexing of links, so that the target audience reaches the site sooner and takes the desired actions.

After analyzing the pages, search robots rank the site. The frequency of indexing is mainly influenced by two factors: site traffic and how often the resource is updated with new content. Robots track any flaws and reduce the credibility of sites that contain inaccurate and/or poorly presented information. The content of the pages must be relevant to user queries.

Smaller auxiliary factors also matter: a clear and convenient site structure and working links all have a positive effect on indexing. Search engine robots independently find and check a newly created site or new pages of an already crawled web resource.

If the necessary conditions are met, indexing is faster and takes from 24 hours to a week. If something is wrong, search engines “forget” about the site and move on to research others. For example, Google will not index a web resource that has come under sanctions – a notification will say the site address cannot be added for indexing – and if there are problems with the server, a message will appear stating that the hosting does not respond to system requests.

3. How to check that a site is indexed

There are three main ways to check a web resource:

  1. Manually, using the site: operator.

Enter the command site:[site_url] in the search box to get data about all indexed pages. If the counts in Google and Yandex differ noticeably, there is a chance that the site is under a filter.
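As an illustration, such manual check URLs can be assembled programmatically; this is only a sketch, and the query parameter names (q for Google, text for Yandex) are the engines' standard search parameters, not part of any official API.

```python
# Sketch: build search URLs with the "site:" operator to compare
# indexed-page counts manually. The domain here is a placeholder.
from urllib.parse import urlencode

def site_check_url(base: str, param: str, domain: str) -> str:
    """Return a search URL listing pages indexed for the given domain."""
    return base + "?" + urlencode({param: f"site:{domain}"})

print(site_check_url("https://www.google.com/search", "q", "example.com"))
# https://www.google.com/search?q=site%3Aexample.com
print(site_check_url("https://yandex.com/search/", "text", "example.com"))
```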

Fig. 1 – Checking indexing manually


2. Use free webmaster tools.

For example, look at the data in Google Search Console: open the “Google Index” section and go to the “Indexing Status” block. Indexing and Sitemap reports are also available there. Google checks pages quickly by first crawling the entire site and then admitting only the best pages into the index.

The same can be done in Yandex: enter the site’s URL in the form, confirm your owner status, go to “Site Indexing” and then to “Pages in Search”.

  3. Use browser plugins – they exist for Chrome, Opera, and Firefox; RDS Bar, for example – or dedicated tools, among which SEOGadget is popular.

Fig. 2 – RDS Bar Plugin for Chrome


Fig. 3 – interface


Fig. 4 – SEOGadget interface


4. Ways to speed up indexing

Waiting for the robot to crawl new site pages often drags on. The way out is accelerated indexing. Several methods of climbing the search results have proven effective:

  1. Use a Sitemap.xml file, which helps determine how quickly new pages are indexed. If you add links to new pages right away, the robots that track content updates will find them faster. The file must be saved at the root of the site. You can update the file manually after changing the site’s content and submit it via the corresponding report, but it is much more convenient to use a dynamic Sitemap.xml: as new pages are created, the file updates itself. For example, for sites built on Tilda this map is dynamic by default, and for other CMSs you can install plugins: for WordPress, All in One SEO Pack may be suitable.
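If no plugin fits, a sitemap can also be generated from a page list with a short script. A minimal sketch, using only the standard library; the URLs and lastmod dates below are illustrative:

```python
# Sketch: generate a minimal sitemap.xml from a list of (url, lastmod)
# pairs, following the sitemaps.org protocol namespace.
from xml.etree import ElementTree as ET

def build_sitemap(pages):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

xml = build_sitemap([
    ("https://example.com/", "2022-06-01"),
    ("https://example.com/blog/new-post", "2022-06-02"),
])
print(xml)
```

The resulting string can be written to a file at the site root (e.g. /sitemap.xml) whenever content changes.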

2. Use the “Fetch as Google” option in the Google webmaster panel; it is located in the “Crawl” section. The check is manual: paste the path of the desired page, removing the site name itself from the URL, press the “Fetch” button, and then “Submit to index”. Within about half an hour, the fetched page will appear in the index.

3. Optimize your robots.txt file. Prohibit robots from visiting technical and service pages and other content that carries no useful information.

4. Create an RSS feed and use social media. RSS is no longer relevant for most users, but search engines still see the links to newly added pages. As for social networks, it is clear by now that building link mass with links to worthwhile content affects indexing. In general, having links on sites that generate targeted traffic is a great idea.

5. Reduce the DFI – Click Distance From Index (the number of clicks from the site’s main page to the page you want indexed). The fewer clicks between them, the higher the page’s priority from the point of view of the search engine’s algorithm. Site usability is hard to overestimate today: the optimal path to any indexed page should be no more than three clicks from the main menu.
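Click distance is straightforward to audit: it is a breadth-first search over the internal-link graph starting from the homepage. A sketch with a toy, made-up graph:

```python
# Sketch: compute each page's click distance from the homepage via BFS
# over an internal-link graph (the graph below is illustrative).
from collections import deque

def click_depths(links, start="/"):
    """Map every reachable page to its minimum click count from start."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {"/": ["/blog", "/about"], "/blog": ["/blog/post-1"], "/about": []}
print(click_depths(site))
# {'/': 0, '/blog': 1, '/about': 1, '/blog/post-1': 2}
```

Pages whose depth exceeds three are candidates for extra links from the main menu or hub pages.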

  6. For indexing new pages, the most relevant method in 2022 is the Google Indexing API. It allows a site owner to directly notify Google about the addition or deletion of pages and to send batch indexing requests; the search engine can then schedule their processing. This helps improve the quality of traffic.

It’s worth noting that the Indexing API can only be used for pages with JobPosting structured data or BroadcastEvent data embedded in a VideoObject.

To use the Indexing API, you will need a service account and confirmed ownership of the web resource in Search Console. An access token is required to authenticate the API calls.

The default Google Indexing API quota is 200 requests per day. Running a program that sends URLs across different domains will require activating PRO mode.
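A minimal sketch of such a call, using only the standard library: the endpoint and request-body shape follow Google's Indexing API documentation, but token acquisition (normally done with a service-account library) is omitted, and `access_token` is a placeholder you must supply.

```python
# Sketch: notify the Google Indexing API that a URL was updated.
# Obtaining the OAuth access token from the service account is omitted.
import json
from urllib import request

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def notification_body(url: str, deleted: bool = False) -> dict:
    """Build the JSON body: type is URL_UPDATED or URL_DELETED."""
    return {"url": url, "type": "URL_DELETED" if deleted else "URL_UPDATED"}

def publish(url: str, access_token: str) -> None:
    req = request.Request(
        ENDPOINT,
        data=json.dumps(notification_body(url)).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {access_token}"},
    )
    with request.urlopen(req) as resp:  # real network call; needs a valid token
        print(resp.status)

print(notification_body("https://example.com/jobs/123"))
# {'url': 'https://example.com/jobs/123', 'type': 'URL_UPDATED'}
```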

In addition, to speed up indexing, it is worth checking:

  • the structure of the sections and their interlinking – if the search robot cannot reach a page through links, the page has relevance problems and may not be indexed;
  • the content on the site: how useful it is, how often it is updated, and the ratio of keywords to the total volume of each text;
  • navigation and page code – whether there are duplicates or errors;
  • the weight of images, since heavy images slow down page loading and hurt indexing and SERP positions.
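The keyword-to-volume ratio mentioned above can be estimated with a few lines of code; this is an illustrative sketch, and any density thresholds you apply to the result are your own editorial choice, not an official search-engine value:

```python
# Sketch: estimate keyword density, i.e. occurrences of a keyword
# relative to the total word count of a text.
import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

sample = ("Indexing matters. Fast indexing brings traffic, "
          "and traffic brings conversions.")
print(keyword_density(sample, "indexing"))  # 0.2  (2 of 10 words)
```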

5. What problems can you face

  1. If pages are not indexed at all, or only by one of the search engines, first check the hosting settings and the robots.txt file: it is important that indexing is not prohibited there. If only one search engine sees the site, the problem may lie in filters and sanctions; determine which ones apply to the site and make corrections.

2. If indexing happens but takes a long time, there are many more possible reasons, and you need to check them one by one: lack of optimization, missing internal linking, rare content updates, and so on.

3. If specific URLs are problematic, you can submit those pages for crawling in Google Search Console (the URL Inspection tool is useful here) and Yandex.Webmaster (see “Indexing” → “Page crawling”).

Site indexing is a complex and rather unpredictable process, but since its influence on a site’s ranking in the SERP is so important, it is worth making the task as easy as possible for the search robots.

If you promptly apply most of the general recommendations described in this article, they will most likely have a positive effect on indexing speed. After that, it is worth moving on to promoting the web resource.
