Google search on subdomain

  • Answered
I have a website, sohs.org, built in Drupal. Google does index it, but that isn't a priority, so I haven't tried to optimize it. Now I have created a subdomain that contains only static HTML files and photos, transferred from a previous site. It is important that Google index the subdomain (truwe.sohs.org). When I tried to set this up, I got this error message from Google:
"Network unreachable: robots.txt unreachable. We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely."
I tried moving the robots.txt file, using the Robots module in Drupal, and copying the robots.txt file into the subdomain's root directory and editing it there. I even tried deleting the robots.txt file completely.
I don't currently have a sitemap on the main Drupal site. I do have one in the root directory of the subdomain.
My actual domain name is sohistory.org, which is redirected to sohs.org.
Any help would be appreciated!
Tim S.
Good morning,

Thanks for your question about Google crawling your website. I found a robots.txt file at http://sohistory.org/robots.txt, which is likely your problem. I'd suggest editing that file and seeing if it helps. If you're still having problems after that, I'd suggest contacting Google for assistance.

Make sure you are using Google Search Console as well. It will show you how Google is crawling and indexing your website, and it also lists the location of the robots.txt file it found.

I hope this helps! Thanks!

Tim S.
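Since the goal is simply to let Google crawl everything on the static subdomain, a minimal robots.txt served from the subdomain's own root (i.e. reachable at http://truwe.sohs.org/robots.txt) might look like the sketch below. The sitemap filename is an assumption based on the sitemap the questioner says is in the subdomain root; substitute the actual file name:

```
# Hypothetical robots.txt for truwe.sohs.org
# Allow all crawlers to fetch everything on this subdomain.
User-agent: *
Disallow:

# Point crawlers at the sitemap in the subdomain root.
# (Assumed filename; use whatever the real sitemap file is called.)
Sitemap: http://truwe.sohs.org/sitemap.xml
```

Note that robots.txt is per-host: when crawling the subdomain, Google requests http://truwe.sohs.org/robots.txt, so the files on sohs.org or sohistory.org do not apply to it. The "robots.txt unreachable" error typically means that URL returned a server error or timed out, rather than a 200 response or a clean 404, so it is worth fetching that exact URL in a browser (or with curl) to confirm it loads.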