How to control robots.txt in addon domains?

  • Answered
In my public_html folder I have a robots.txt file for my main hosted domain. I also have two addon domains, which appear in the file structure as subdirectories of public_html (e.g. public_html/ ). Each of those addon domain directories has its own robots.txt file specific to that domain.

My question is whether it is sufficient to just place robots.txt files in the addon domain directories to control crawling under those domains, or whether I must *also* specify those restrictions in the main domain's robots.txt.

For example, if I want to disallow crawling of the cgi-bin directory in both my main domain and an addon domain, do I need to specify something like this in public_html/robots.txt:
Disallow: /cgi-bin/
Disallow: /

Or will I be covered for anything under "" simply by placing the relevant restrictions in public_html/ ?

Hello antoinmo,

Thank you for your question on robots.txt and addon domains. You will want to have a separate robots.txt file for each addon domain; that way you can control each one specifically.

As for the cgi-bin: anything visiting the addon domain cannot see above that domain's folder to the main domain's cgi-bin. So unless the addon folder contains its own cgi-bin directory, you will not need to add that rule to the addon domain's robots.txt file.

Treat each domain as if it were the only one, and write each robots.txt file accordingly. That way you should be set.
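To make that concrete, here is a sketch of how the files might be laid out. The addon folder name addon-example.com below is purely hypothetical, standing in for whatever your actual addon domain's directory is called:

```
public_html/robots.txt                     <- served for the main domain only
public_html/addon-example.com/robots.txt   <- served for that addon domain only

# public_html/robots.txt (main domain)
User-agent: *
Disallow: /cgi-bin/

# public_html/addon-example.com/robots.txt (addon domain)
# Only needed if this addon folder has its own cgi-bin directory
User-agent: *
Disallow: /cgi-bin/
```

When a crawler requests http://addon-example.com/robots.txt, the server answers with the file inside the addon folder, so the main domain's robots.txt never applies to the addon domain, and vice versa.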

Kindest Regards,
Scott M