
How to allow Google bots to crawl your website

Posted on: October 31, 2014

This lesson will show you how to allow Google's bots to crawl your website.

What is robots.txt?

robots.txt is a text file that tells web crawlers which pages on your site they may crawl and index. It contains a list of “Allow” and “Disallow” directives along with the URLs you want to be found and those you want kept private.
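For example, a minimal robots.txt might look like the sketch below; the /private/ path is only a placeholder, so substitute whatever directory you actually want to keep out of search results:

User-agent: *
Disallow: /private/
Allow: /

Here every crawler may visit the whole site except the /private/ directory, since the more specific rule takes precedence.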

Steps:

1) Log in to your cPanel.

2) Open the ‘File Manager’ option in cPanel.

3) Go to the document root of your website (/home/accountname/public_html).

4) Create a new file called ‘robots.txt’ and add the following lines to it:

User-agent: Googlebot
Allow: /

The directives above allow Google's bots to crawl your entire website.

If you want to block all bots from crawling your website, add the following lines instead:

User-agent: *
Disallow: /

Now all compliant bots will be blocked from your site. (Note that robots.txt is advisory: crawlers that ignore the standard may still fetch your pages.)
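You can also combine the two approaches. The sketch below allows Google's bots while blocking every other crawler; a bot obeys the most specific group that matches its user agent, so Googlebot follows its own rules and ignores the catch-all block:

User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /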

If you want to block a particular crawler from your site, say Google's image crawler ‘Googlebot-Image’, add it as:

User-agent: Googlebot-Image
Disallow: /
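Many sites also add a Sitemap line to robots.txt so that crawlers can discover all of your pages; the URL below is just a placeholder for your own sitemap:

Sitemap: http://example.com/sitemap.xml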


If you experience any problems or require any assistance with this matter, please contact the support team, who will be happy to advise you further.
