Bots getting a 403 error when they try to crawl my site
I have installed a Drupal 7 theme and am using webceo.com for SEO, etc. WebCEO is not able to crawl my site; they are getting a 403 error. The hosting company suggests it may be something in an .htaccess file. There are several: the domain root [publichtml/mysite/], /sites/default/file, and /site/default/files/tmp.
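For what it's worth, one common cause of crawler-only 403s is a rule in one of those .htaccess files that denies requests based on the User-Agent header. A hypothetical example of the kind of mod_rewrite block to look for (the agent names here are just placeholders, not necessarily what is in your files):

```
# Hypothetical .htaccess fragment -- rules like this return 403 to
# any client whose User-Agent matches the listed patterns.
<IfModule mod_rewrite.c>
  RewriteEngine on
  RewriteCond %{HTTP_USER_AGENT} (crawler|spider|bot) [NC]
  RewriteRule .* - [F,L]
</IfModule>
```

If a rule like this exists, removing it (or adding an exception for WebCEO's user agent) would be the fix, assuming the 403 is really coming from .htaccess.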
Sorry to hear about your site returning a 403 when bots crawl it. We took a look and had a test bot hit the site with both robots.txt and .htaccess disabled; it still returned the 403. When we tested other specific pages on the site, both Drupal and non-Drupal, they did not 403.
Something in the Drupal code that runs on the index page appears to be making a GET request somewhere that triggers the 403. Did this work before you installed the new theme?