So you've built your website, and maybe you've heard about the robots.txt file and wondered what it's for. In this tutorial, we'll explain how to create a robots.txt file.
What is a robots.txt file?
A robots.txt file tells search engines what your website's rules of engagement are. A big part of SEO is sending the right signals to search engines, and the robots.txt file is one of the ways to communicate your crawling preferences to them.
Why should you care about robots.txt?
The robots.txt file plays an essential role from an SEO point of view. It tells search engines how they can best crawl your website.
Using the robots.txt file, you can prevent search engines from accessing certain parts of your website, avoid duplicate-content issues, and give search engines helpful hints on how to crawl your site more efficiently.
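To see how a crawler actually interprets these preferences, here is a small sketch using Python's standard-library robots.txt parser. The rules and the example.com URLs are hypothetical, chosen just to illustrate that a well-behaved bot skips disallowed paths and fetches everything else:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the /images/ directory for all crawlers.
rules = """\
User-agent: *
Disallow: /images/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler checks each URL against the rules before fetching it.
print(parser.can_fetch("*", "https://example.com/images/logo.png"))  # False
print(parser.can_fetch("*", "https://example.com/about/"))           # True
```

Note that robots.txt is a request, not an access control: polite crawlers like Googlebot honor it, but nothing technically stops a bot from ignoring it.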
How to create a robots.txt file?
1) Log in to your hosting control panel.
2) Go to the Files section and click on File Manager.
3) In File Manager, navigate to your website's public_html directory, click New File, type robots.txt, and click Create New File.
Note: You can create only one robots.txt file per domain name, and it must live in the root of your site (so it is reachable at yourdomain.com/robots.txt).
Examples of usage
When you install WordPress on your domain name, a robots.txt file is generated automatically. But if you need to add your own rules, here are a few examples:
Block one file (in other words, one particular webpage)
User-agent: *
Disallow: /store/products/what-is-a-bot/
Block one directory
User-agent: *
Disallow: /images/
Allow full access
User-agent: *
Disallow:
Hide the entire website from bots
User-agent: *
Disallow: /
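You can sanity-check rules like the ones above before uploading them. Here is a minimal sketch using Python's standard-library parser (the helper name `allowed` and the example.com URLs are our own, purely for illustration). Each rule set is prefixed with `User-agent: *` so it applies to all crawlers:

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_txt, url):
    """Return True if the given rules let any crawler (*) fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("*", url)

site = "https://example.com"  # placeholder domain

# "Allow full access": an empty Disallow blocks nothing.
print(allowed("User-agent: *\nDisallow:", f"{site}/any/page/"))            # True

# "Hide the entire website": Disallow: / blocks every path.
print(allowed("User-agent: *\nDisallow: /", f"{site}/any/page/"))          # False

# "Block one directory": only paths under /images/ are blocked.
print(allowed("User-agent: *\nDisallow: /images/", f"{site}/images/a.png"))  # False
print(allowed("User-agent: *\nDisallow: /images/", f"{site}/blog/"))         # True
```

This is a quick way to catch mistakes such as a stray leading slash or a rule that accidentally blocks the whole site.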