View level of traffic with Apache access log

In this article I’ll be teaching you how to use the Apache access logs on your VPS (Virtual Private Server) or dedicated server to inspect the levels of traffic your websites are receiving.

If you’d like to know what’s been going on with your server and the visitors coming to your websites, the Apache access logs provide a wealth of information. Apache is the open source web server software that runs on most Linux servers, and every request made to one of your websites is recorded in an access log, detailing visitor activity.
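For reference, a single request in the access log typically looks something like the line below. This is a hypothetical example in Apache’s combined log format; the IP address, URL, and user agent are made up, and the exact layout depends on the LogFormat directive in your Apache configuration. The 4th whitespace-separated field, [23/Jan/2013:06:54:12, is the time stamp that the commands later in this article work with.

    192.0.2.15 - - [23/Jan/2013:06:54:12 -0500] "GET /index.php HTTP/1.1" 200 5043 "http://example.com/" "Mozilla/5.0 (compatible; ExampleBrowser/1.0)"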

If you’ve noticed the load average on your server spiking, whether by following the steps in my guide on creating a server load monitoring script or by using some advanced server load monitoring tactics, then reviewing your Apache access logs is a good next step.

I already covered, in my article on how to determine the cause of a server usage spike, how to track down a specific time range in your Apache access log and correlate it with a recent server load spike. In this article I’m going to focus more broadly on reviewing the entire Apache access log for your website rather than just a small portion of it.

To follow along with the steps in this article, you’ll need either a VPS or dedicated server so that you have SSH access to the server to run the commands we’ll be covering.
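For example, if your SSH username were userna5 and your server’s hostname were example.com (both placeholders used throughout this article), the login step would look something like the command below; add -p followed by your port number if your server does not listen on the default SSH port 22.

    ssh userna5@example.com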

View number of requests by time from Apache access log

One of the best ways to get a quick overview of your traffic is to split the requests up by when they happened. Using the steps below, I’ll show you how to view the number of requests per day, per hour, and per minute from your access logs.

  1. Log in to your server via SSH.
  2. Navigate to your /access-logs directory with the following command; in this example, our username is userna5:
    cd ~userna5/access-logs
  3. Run the following command to view what Apache access logs are currently in the directory:
    ls -lahtr
    You should get back a listing similar to this:
    drwxr-xr-x 3 root at0m 4.0K Dec 31 16:47 .
    drwx--x--x 9 root wheel 4.0K Jan 4 06:01 ..
    -rw-r----- 2 root at0m 15K Jan 9 05:09 ftp.example.com-ftp_log
    -rw-r----- 2 root at0m 3M Jan 23 13:10 example.com
  4. View Apache requests per day

  5. Run the following command to see requests per day:
    awk '{print $4}' example.com | cut -d: -f1 | uniq -c
    Code breakdown:

    awk '{print $4}' example.com - Uses awk to print the 4th column of data from the Apache access log, which is the time stamp.
    cut -d: -f1 | uniq -c - Uses cut with the delimiter (-d) set to a colon : and keeps the 1st field (-f1), which is just the [DD/Mon/YYYY date portion of the time stamp. uniq -c then counts up the hits for each day; because the log is written in chronological order, identical dates are already adjacent, so no separate sort is needed.

    You should get back something like this:
    6095 [20/Jan/2013
    7281 [21/Jan/2013
    6517 [22/Jan/2013
    5278 [23/Jan/2013

  6. View Apache requests per hour

  7. Run the following command to see requests per hour:
    grep "23/Jan" progolfdeal.com | cut -d[ -f2 | cut -d] -f1 | awk -F: '{print $2":00"}' | sort -n | uniq -c
    Code breakdown:

    grep "23/Jan" example.com - Uses grep to show only hits from the 23rd of January (today, in this example) in the Apache access log.
    cut -d[ -f2 | cut -d] -f1 - Uses cut with the delimiter (-d) set to an opening bracket [ to keep the 2nd field, then cut again with the delimiter set to a closing bracket ] to keep the 1st field, which leaves just the time stamp.
    awk -F: '{print $2":00"}' - Uses awk with the field delimiter (-F) set to a colon :, then prints the 2nd column (the hour) with ":00" appended to the end of it.
    sort -n | uniq -c - Finally sorts the hours numerically, then counts up the hits for each hour.

    You should get back something like this:
    200 00:00
    417 01:00
    244 02:00
    242 03:00
    344 04:00
    402 05:00
    522 06:00
    456 07:00
    490 08:00
    438 09:00
    430 10:00
    357 11:00
    284 12:00
    391 13:00
    163 14:00

  8. View Apache requests per minute

  9. Run the following command to see requests per minute (a script that combines all three of these reports is sketched after this list):
    grep "23/Jan/2013:06" example.com | cut -d[ -f2 | cut -d] -f1 | awk -F: '{print $2":"$3}' | sort -nk1 -nk2 | uniq -c | awk '{ if ($1 > 10) print $0}'

    Code breakdown:

    grep "23/Jan/2013:06" example.com - Uses grep to show only hits from the 6 AM hour on the 23rd of January in our Apache access log.
    cut -d[ -f2 | cut -d] -f1 - Uses cut with the delimiter (-d) set to an opening bracket [ to keep the 2nd field, then cut again with the delimiter set to a closing bracket ] to keep the 1st field, which leaves just the time stamp.
    awk -F: '{print $2":"$3}' - Uses awk with the field delimiter (-F) set to a colon :, then prints the 2nd column (the hour) followed by the 3rd column (the minute).
    sort -nk1 -nk2 | uniq -c - Sorts the hits numerically by hour and then minute, then counts up the hits for each minute.
    awk '{ if ($1 > 10) print $0}' - Finally, uses an awk if statement to print only the lines where the 1st column (the number of hits in that minute) is greater than 10.

    You should get back something similar to this:

    12 06:10
    11 06:11
    16 06:12
    13 06:20
    11 06:21
    12 06:28
    12 06:30
    16 06:31
    14 06:39
    11 06:40
    15 06:52
    32 06:53
    43 06:54
    14 06:55
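If you plan to run these reports regularly, the three pipelines above can be wrapped into a small shell script. The following is just a rough sketch under a few assumptions: the script name traffic-report.sh is hypothetical, the log path, day, and hour are passed as arguments, and the day and hour must match the [DD/Mon/YYYY:HH format used in the log.

    #!/bin/bash
    # traffic-report.sh - hypothetical sketch combining the three reports above
    # Usage: ./traffic-report.sh /path/to/access_log 23/Jan/2013 06

    LOG="$1"     # path to the Apache access log
    DAY="$2"     # day in DD/Mon/YYYY format, e.g. 23/Jan/2013
    HOUR="$3"    # two-digit hour, e.g. 06

    echo "== Requests per day =="
    awk '{print $4}' "$LOG" | cut -d: -f1 | uniq -c

    echo "== Requests per hour on $DAY =="
    grep "$DAY" "$LOG" | cut -d[ -f2 | cut -d] -f1 | awk -F: '{print $2":00"}' | sort -n | uniq -c

    echo "== Minutes with more than 10 requests during hour $HOUR on $DAY =="
    grep "$DAY:$HOUR" "$LOG" | cut -d[ -f2 | cut -d] -f1 | awk -F: '{print $2":"$3}' | sort -nk1 -nk2 | uniq -c | awk '{ if ($1 > 10) print $0}'

For example, using the log from earlier in this article, you might run it as ./traffic-report.sh ~userna5/access-logs/example.com 23/Jan/2013 06.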

You should now know how to gauge the level of traffic on your website, whether daily, hourly, or by the minute, using the Apache access log.

2 thoughts on “View level of traffic with Apache access log”

    1. You could possibly expand on the awk command and pipe its output to a file, for example apachelogs.csv.
      Then you can import the .csv file into a spreadsheet. Once there, you can break down and manipulate the data to make pivot tables, summaries, counts, etc.
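Building on that reader’s suggestion, here is a rough sketch of how the per-hour report from earlier in the article could be written out to a spreadsheet-friendly CSV file; the hits-per-hour.csv filename is just a placeholder:

    # hypothetical example: write hour,requests pairs to a CSV file
    echo "hour,requests" > hits-per-hour.csv
    grep "23/Jan" example.com | cut -d[ -f2 | cut -d] -f1 | awk -F: '{print $2":00"}' | sort -n | uniq -c | awk '{print $2","$1}' >> hits-per-hour.csv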
