WHAT IS GOOGLEBOT?

Googlebot is the web crawler used by Google. It discovers and collects pages on the internet, and the information it gathers is used to update the Google index.

Googlebot visits trillions of web pages and revisits them continuously.

What is WebCrawler?

Web crawlers, also called bots, robots, or spiders, are simply software designed to follow links and collect data from each page they reach.
Googlebot retrieves the content of web pages: the words, code, and resources that make up each page.

If the retrieved content contains links to other pages, those links are recorded.

Googlebot and Your Website

The information Googlebot sends back updates the Google index. The index is where all web pages are compared and ranked according to keywords.

So the first step toward getting your pages into this index is to make your website visible and accessible to Googlebot.

Can Googlebot See My Web Pages?

To find out what Googlebot can see of your site, run the following search on Google:
site:yourdomain.com
This command displays all the pages of that domain that Googlebot has collected into the Google index.

From here you can see how many of your pages are in the Google index. Is the number about right, too low, or far too high?

If it's too low, there may be pages you expect to be in the Google index that you have accidentally blocked with robots.txt.

If it's too high, there may be pages you never wanted indexed that ended up in the index anyway, or your website may be generating a lot of duplicate content because of how its theme works.

Either problem damages your website's SEO. If too few pages are indexed, you earn fewer ranking points than you should; if too many are indexed, each page carries less weight, because the same value is divided among more pages.

So if you are learning SEO for the first time, monitoring the Google index is one of the things you must understand.

Can Google access all content and links?

Not everything on a page is accessible to Google, so you need to know whether the important elements you are counting on for SEO can actually be seen by Google.

There are many kinds of problems that can prevent Google from seeing all the content on a website. Some of them are:
  1. Pages blocked by robots.txt
  2. Broken links
  3. Content built with Flash, which Googlebot cannot see
  4. Error codes
  5. Dynamic URLs that are too complicated


If you want to know whether every component of your page is visible to Google, try FETCH AND RENDER in Search Console and see whether Google can crawl everything or only part of the page.

Alternatively, search for your page on Google with the site: operator described above, and once the results appear, open the Google cache via the small arrow next to each search result.

Can Googlebot be controlled?

Yes. Google is a company that respects the rules in robots.txt, so its robots will not collect data from pages you have disallowed.

Overall, there are several ways you can manage Google's robots:
  1. Using robots.txt
  2. Adding meta robots tags to pages
  3. Using robot directives in the HTTP header (X-Robots-Tag)
  4. Using a sitemap
  5. Using Google Search Console
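As an illustrative sketch of methods 2 and 3, a meta robots tag (and the equivalent X-Robots-Tag header) might look like this; the directives shown are standard, but whether you want them on a given page depends on your site:

```html
<!-- Placed in the <head> of a page: tells compliant crawlers
     not to index this page and not to follow its links. -->
<meta name="robots" content="noindex, nofollow">

<!-- The same instruction can be sent as an HTTP response header,
     which is useful for non-HTML files such as PDFs:
     X-Robots-Tag: noindex, nofollow -->
```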


But the most commonly used is robots.txt.

What is Robots.txt?

This is a file containing rules for how crawling robots should interact with your web pages. One thing you need to know: just because your website has a robots.txt does not mean all robots will follow the rules in it.
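To give a feel for the format, here is a minimal, illustrative robots.txt; the domain and paths are placeholders, not recommendations for any real site:

```txt
# Served at https://yourdomain.com/robots.txt (placeholder domain)
User-agent: Googlebot
Disallow: /admin/        # keep Googlebot out of the admin area
Allow: /

User-agent: *
Disallow: /private/      # other well-behaved bots skip this folder

Sitemap: https://yourdomain.com/sitemap.xml
```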

Robots operated by bad actors will certainly not care about the rules in robots.txt.

Googlebot, however, does follow them: you can control which pages Googlebot may and may not access when it visits your website. Robots.txt is a fairly complex topic, though, so it will be covered in a separate article.
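You can check how a rule-following crawler would interpret a robots.txt yourself. As a sketch, Python's standard urllib.robotparser applies the same allow/disallow logic; the rules and URLs below are made-up placeholders:

```python
# Sketch: evaluating robots.txt rules the way a compliant crawler would,
# using Python's standard-library urllib.robotparser.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents (not from any real site).
rules = """\
User-agent: Googlebot
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant Googlebot would skip the disallowed path...
print(parser.can_fetch("Googlebot", "https://example.com/admin/settings"))  # False
# ...but may crawl everything else.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post-1"))     # True
```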

Googlebot and Sitemap

A sitemap is a way to help Google understand the structure of your website.

Google says there are several situations where you really need a sitemap:
  1. Your website is very large
  2. Your website has a lot of content, but the pages are isolated from one another and lack a good link structure
  3. Your site is new and has only a few backlinks, so Google relies more on the sitemap to crawl all the new pages
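For illustration, a minimal XML sitemap following the sitemaps.org protocol looks like this; the URLs and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal illustrative sitemap; list one <url> entry per page. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2019-01-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/about/</loc>
  </url>
</urlset>
```

You can then point Google at it via the Sitemap line in robots.txt or by submitting it in Search Console.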


Googlebot Types

Googlebot comes in nine different types with different functions. The nine are:
  1. Googlebot (Google Web search)
  2. Google Smartphone
  3. Google Mobile (feature phone)
  4. Googlebot Images
  5. Googlebot Video
  6. Googlebot News
  7. Google AdSense
  8. Google Mobile AdSense
  9. Google AdsBot (landing page quality check)

If you want more detailed information, visit Google's Crawlers documentation page.
