InMotion Hosting Support Center


In this article we'll discuss how you can block unwanted users or bots from accessing your website via .htaccess rules. The .htaccess file is a hidden file on the server that can be used, among other things, to control access to your website.

In the sections below we'll walk through several different ways you can block unwanted users from accessing your website.

Edit your .htaccess file

To use any of these methods for blocking an unwanted user from your website, you'll first need to edit your .htaccess file.

  1. Log in to your cPanel.
  2. Under Files, click on File Manager.
  3. Select the Document Root for: option, and choose your domain from the drop-down.
  4. Ensure that Show Hidden Files is selected, then click Go.
  5. Right-click on the .htaccess file and select Edit.
  6. If the .htaccess file didn't already exist in the previous step, click on New File at the top-left, name the file .htaccess, and set the directory for the file to /public_html/ or the document root of your site.
  7. If a text editor encoding dialog box pops up, simply click Edit.

Block by IP address

You might have one particular IP address, or multiple IP addresses, causing a problem on your website. In that case, you can simply block these problematic IP addresses from accessing your site.

Block a single IP address

If you just need to block a single IP address, or multiple IPs not in the same range, you can do so with a rule like this (123.123.123.123 is a placeholder; substitute the address you want to block):

deny from 123.123.123.123
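If several unrelated addresses are misbehaving, you can simply stack deny lines. A minimal sketch, using placeholder addresses (substitute the real offenders):

```
# Block several individual IP addresses
# (placeholder addresses -- substitute the real offenders)
deny from 123.123.123.123
deny from 123.123.123.124
deny from 198.51.100.7
```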

Block a range of IP addresses

To block an IP range, such as 123.123.123.0 - 123.123.123.255, you can leave off the last octet:

deny from 123.123.123

You can also use CIDR (Classless Inter-Domain Routing) notation for blocking IPs:

To block the range 123.123.123.0 - 123.123.123.255, use 123.123.123.0/24.

To block the range 123.123.64.0 - 123.123.127.255, use 123.123.64.0/18.

deny from 123.123.123.0/24
deny from 123.123.64.0/18
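Note that the deny from directive comes from Apache 2.2 (mod_access_compat). If your server runs Apache 2.4, the preferred equivalent uses the Require directive instead. A sketch with the same placeholder ranges:

```
# Apache 2.4 syntax (mod_authz_core) -- placeholder ranges, substitute your own
<RequireAll>
    Require all granted
    Require not ip 123.123.123.0/24
    Require not ip 123.123.64.0/18
</RequireAll>
```

If your host still runs Apache 2.2, stick with the deny from form shown above.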

Block bad users based on their User-Agent string

Some malicious users will send requests from different IP addresses, but still use the same User-Agent string for every request. In these cases you can block users by their User-Agent string instead.

Block a single bad User-Agent

If you just wanted to block one particular User-Agent string, you could use this RewriteRule:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC]
RewriteRule .* - [F,L]

Alternatively, you can also use the BrowserMatchNoCase Apache directive like this:

BrowserMatchNoCase "Baiduspider" bots

Order Allow,Deny
Allow from ALL
Deny from env=bots

Block multiple bad User-Agents

If you wanted to block multiple User-Agent strings at once, you could do it like this:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^.*(Baiduspider|HTTrack|Yandex).*$ [NC]
RewriteRule .* - [F,L]

Or you can also use the BrowserMatchNoCase directive like this:

BrowserMatchNoCase "Baiduspider" bots
BrowserMatchNoCase "HTTrack" bots
BrowserMatchNoCase "Yandex" bots

Order Allow,Deny
Allow from ALL
Deny from env=bots
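Since BrowserMatchNoCase accepts a regular expression, the three directives above can also be collapsed into a single line. A sketch:

```
# Match any of the three User-Agents with one regex
BrowserMatchNoCase "(Baiduspider|HTTrack|Yandex)" bots

Order Allow,Deny
Allow from ALL
Deny from env=bots
```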

Block by referer

Block a single bad referer

If you just wanted to block a single bad referer like example.com, you could use this RewriteRule:

RewriteEngine On
RewriteCond %{HTTP_REFERER} example\.com [NC]
RewriteRule .* - [F]

Alternatively, you could also use the SetEnvIfNoCase Apache directive like this:

SetEnvIfNoCase Referer "example\.com" bad_referer

Order Allow,Deny
Allow from ALL
Deny from env=bad_referer

Block multiple bad referers

If you wanted to block multiple referers like example.com and example.net, you could use:

RewriteEngine On
RewriteCond %{HTTP_REFERER} example\.com [NC,OR]
RewriteCond %{HTTP_REFERER} example\.net
RewriteRule .* - [F]

Or you can also use the SetEnvIfNoCase Apache directive like this:

SetEnvIfNoCase Referer "example\.com" bad_referer
SetEnvIfNoCase Referer "example\.net" bad_referer  

Order Allow,Deny
Allow from ALL
Deny from env=bad_referer
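The same idea works with a single directive when you match both domains in one regular expression. A sketch using the same placeholder domains:

```
# Match either placeholder domain in one directive
SetEnvIfNoCase Referer "(example\.com|example\.net)" bad_referer

Order Allow,Deny
Allow from ALL
Deny from env=bad_referer
```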

Temporarily block bad bots

In some cases you might not want to send a visitor a 403 response, which is just an access denied message. A good example: let's say your site is getting a large spike in traffic for the day from a promotion you're running, and you don't want good search engine bots like Google or Yahoo to come along and start indexing your site during that same time, when the extra traffic might already be stressing the server.

The following code sets up a basic error document page for a 503 response. This is the standard way to tell a search engine that its request is temporarily blocked and it should try back at a later time. This is different from denying access temporarily via a 403 response: Google has confirmed that with a 503 response it will come back and try to index the page again, instead of dropping it from its index.

The following code catches any requests from user-agents that contain the words bot, crawl, or spider, which most of the major search engines match. The second RewriteCond line still allows these bots to request the robots.txt file to check for new rules, but any other request simply gets a 503 response with the message "Site temporarily disabled for crawling".

Typically you don't want to leave a 503 block in place for longer than 2 days. Otherwise Google might start to interpret this as an extended server outage and could begin to remove your URLs from their index.

ErrorDocument 503 "Site temporarily disabled for crawling"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^.*(bot|crawl|spider).*$ [NC]
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteRule .* - [R=503,L]

This method is good to use if you notice new bots crawling your site and causing excessive requests, and you want to block them or slow them down via your robots.txt file, since it lets you 503 their requests until they read your new robots.txt rules and start obeying them. You can read about how to stop search engines from crawling your website for more information on this.
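For well-behaved bots, the slowdown itself goes in robots.txt. A sketch (note that Crawl-delay is honored by bots such as Bing and Yandex but ignored by Google, whose crawl rate is managed through Search Console instead; HTTrack here is just an example of a bot you might shut out entirely):

```
User-agent: *
Crawl-delay: 10

User-agent: HTTrack
Disallow: /
```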

You should now understand how to use a .htaccess file to help block access to your website in multiple ways.

Questions & Comments
n/a Points
2014-05-06 12:34 pm

How can I identify bad user IP addresses?

9,968 Points
2014-05-06 2:25 pm
Hello Jeremy, and thank you for your comment.

I see that your account has been having some higher than normal CPU usage which you can see by looking at your CPU graphs in cPanel.

One of the best ways to get a good idea of bad IP addresses or other malicious users to block is to parse archived raw access logs. These archived raw access logs in cPanel allow you to see requests coming from the same malicious users over an extended period of time.
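One quick way to parse those archived raw access logs from the command line is to count requests per client IP, busiest first. A sketch; the log path and filename pattern are assumptions, so adjust them to match your account:

```shell
# Count requests per client IP in archived raw access logs, busiest first
# (the ~/logs path and example.com-*.gz pattern are assumptions -- adjust to your account)
zcat ~/logs/example.com-*.gz 2>/dev/null | awk '{print $1}' | sort | uniq -c | sort -rn | head -20
```

The first field of a combined-format access log line is the client IP, so the awk/sort/uniq pipeline yields a hit count per address.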

By default your raw access logs are processed into stat reports, and then rotated, so the raw information is no longer available. I went ahead and enabled raw access log archiving for you, so that going forward you'll have a better understanding of what is happening on your site.

Currently your raw access logs go back to 05/May/2014:09:02:05 and it looks like since then, these are the issues causing your higher CPU usage.

You should really optimize WordPress for your sites, making sure to use a WordPress cache plugin to speed up your site. You should also disable the default wp-cron.php behavior in WordPress to cut down on unnecessary requests.

You can also review WordPress login attempts, as it looks like there have been 120 different IP addresses hitting your wp-login.php script over 218 times in the available logs. This is common, as WordPress brute force attacks have been a very big nuisance this year. In these cases it's generally recommended to set up a secondary WordPress password instead of continually going back and blocking IPs that have already attacked you.
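A common way to add that secondary password layer is HTTP basic authentication on wp-login.php in your .htaccess file. A sketch; the AuthUserFile path is an assumption, so point it at a real .htpasswd file, ideally outside your document root:

```
# Require an extra username/password before wp-login.php will even load
# (the AuthUserFile path is a placeholder -- use your own .htpasswd location)
<Files wp-login.php>
    AuthType Basic
    AuthName "Restricted"
    AuthUserFile /home/username/.htpasswd
    Require valid-user
</Files>
```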

I'd recommend checking back in a few days after implementing some of these WordPress solutions to see if your CPU usage has dropped back down to normal usage levels.

- Jacob
n/a Points
2014-11-09 4:19 pm

Please, I am having issues with resource overages, and I have done everything I was asked to do: putting a good captcha in place, deleting the spam messages in my commentmeta table, and setting my webmaster tools to a 30-second delay, and still it's giving me the same issue. I'm losing clients over this because they are scared of my site being suspended. I was told that if I block certain bots I won't be listed on Google/Bing. How can you help resolve this without me moving to another plan (VPS)? I'm based in Nigeria, and I really need your help.

16,896 Points
2014-11-10 1:47 pm
Hello albert,

Thank you for contacting us. We are happy to help, but it is difficult since we will need to review the specific nature of the traffic. If you received an email notification or suspension from our System Admin team, just reply to the email and request more information, or a review. This will reopen the ticket with our System Admin team.

The guide above explains how to block bots and IPs, but I do not recommend blocking anything that you need. For example, if your users are in America, I would not block Google.

If you do not have any visitors from China, then I would block the Baidu bot from crawling your website, since it is for a Chinese-based search engine.

If you have any further questions, feel free to post them below.

Thank you,
n/a Points
2014-06-05 9:57 am

I have the same question, but I can't parse my raw access logs per your instructions because I'm on a shared server. How can I do this on a shared server?


BTW, I did follow your instructions for adding a second password and optimizing my WordPress. Thanks.

11,186 Points
2014-06-05 10:13 am
On shared hosting, the access logs can still be obtained via cPanel, under the Logs section.
n/a Points
2014-06-19 4:27 pm

When I try to use .htaccess it keeps turning on my hot-linking protection,

with the result that when I go to Google Images and click on an image to visit the site, I get the page with no graphics on it.

You have told me this cannot happen - BUT IT IS HAPPENING.

If I go to cPanel and remove the .htaccess file and then go back to the page and refresh it, the normal page is then shown.

I have tried adding the "Ph whatsit" fix that you have up, but that makes absolutely no difference.

My domain is

25,274 Points
2014-06-19 6:26 pm
Hello Vicky,

I'm sorry that you appear to be having problems with hotlinking. If you had previously activated hotlink protection and Google indexed your site, then it's possible that the missing images are due to the previous images being blocked. Google would need to re-index your page without the hotlink protection. By default, the .htaccess file is always there, and it will only have the hotlinking settings if the option is set. When I access your page with hotlinking off, I don't see any problems - and there is an .htaccess file active.

I also looked up your site in Google Images, and it looks normal to me there too. Make sure that you're clearing your browser cache. If you had accessed Google images earlier and it got cached in your browser without the images, then you may be seeing it that way.

If you continue to have problems after clearing the browser cache, then please explain the URL that you're viewing and exactly what you're seeing and what you expect to see. We can then investigate the issue further for you.


Arnel C.
n/a Points
2014-12-22 6:17 am
Thanks, working for my site.
n/a Points
2015-03-10 3:23 pm

Hi, I notice that using the File Manager in my cPanel, I can see two .htaccess files: one in the public_html folder and one in the level above it. I see that the IPs I have blocked are added to the one in public_html, but what is the other file in the level above it used for?

3,669 Points
2015-03-10 11:41 pm
Hello John,

That .htaccess file is generally more for setting the PHP version that you would like your account to use. You can also block IP addresses at that level, as it would serve the same purpose.

Best Regards,
TJ Edens
n/a Points
2015-04-09 5:47 pm

Thanks for this. .htaccess is all new to me, and this code thing. I was still unclear which code to best choose for my scenario, and likely I need to paste a few.

Please kindly give me the exact code to paste in to stop this daily occurrence of page views across my entire site, which is driving me mad with false impressions of so-called real-person visitors. The biggest offender's

current IP being used is


(though likely to change, as we well know!)

**And another prolific abuser is:

Provider: Hetzner Online AG


Can I permanently block these two companies above? And for future problems: sometimes these spam bots have an actual URL, other times only their so-called user name/provider and IP can be seen (not a URL), hence I'm not sure of the exact code to cover multiple spam bots.

My concern is they will simply change their IP, so please give me code that lets me add further IPs for them, or block them under their user name, and perhaps code for multiple spam bots where you know their URL and name.

In short:

I would like the best typical code scenarios to incorporate, so as to likely cover multiple bot names, URLs, or IPs, where I can simply add new offenders to the appropriate code, as this is one among about five to seven regulars I get in any given day or week:


Provider: Petersburg Internet Network Ltd.

Organisation: Petersburg Internet Network LLC

Many thanks indeed! Jenny


n/a Points
2015-06-03 4:55 am

PrestaShop has a big problem with their robots.txt file. They have functionality through the back office to generate a robots.txt file; however, it would be better if the generated file excluded products added to carts and site visitors. Currently I have 6-10 carts being generated weekly, which is annoying as I have to delete them manually.

With visitor stats it's totally incorrect, as my site does not have actual visitors at 2am in the morning. I hope this can be resolved without me physically amending the table.



25,274 Points
2015-06-03 3:24 pm
Hello Jay,

The robots.txt file is primarily used to stop search engine robots from crawling sites or certain portions of sites. It's not made to stop someone from spamming your site. You may want to check their plugins for other ways to manage your carts (there are several that make managing carts much easier), or you may want to use the article above to stop certain IP addresses from accessing your site, if you can isolate the culprits.

I hope that helps to answer your question! If you require further assistance, please let us know!

Arnel C.
n/a Points
2015-07-24 12:45 pm

Your suggestion regarding how to block the Baiduspider doesn't work. They still hit daily:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC]
RewriteRule .* - [F,L]
n/a Points
2015-07-24 12:48 pm

Disregard that last post - it DOES work. I initially only checked the IP addresses that were accessing my site through AWStats alone. After checking the error logs, I see that even though the Baidu spider was accessing my site, it received the error message "client denied by server configuration". This is indeed what I wanted to see.
