Too Many Connections Error on Quiet Site
A user today reported receiving this error: `PDOException: SQLSTATE Too many connections in lock_may_be_available()`.
According to Google Analytics our site is fairly quiet. How do I fix this, or is it a shared hosting issue?
I’m sorry, but I still do not understand the idea that reducing the number of connections will stop me from receiving a max connections exceeded error.
The current max_connections value of 750 is a default applied during setup, either by Drupal itself or by the InMotion techs (I have not touched it).
You say that reducing this will fix the max connections error, but by that logic it would not solve *my* problem; it would solve the problem of other users on my system getting the same error (in the event that my high setting hogged resources and caused problems for *them*).
You imply that my issue is potentially caused not by my own site, but by the resource usage of other sites sharing my hosting.
That is actually what I am trying to determine. Is this error a one-off caused by someone else’s unthrottled usage affecting me, or is it an error in my own site config? I don’t see how throttling my own usage below 750 helps me, though I can see how it could help others.
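One way I could try to answer that question myself, assuming the host gives me a SQL console or `mysql` CLI access (and noting that on shared hosting, without the PROCESS privilege, the process list will only show my own account’s connections):

```sql
-- Shows how many connections are open right now vs. the cap:
SHOW STATUS LIKE 'Threads_connected';
SHOW VARIABLES LIKE 'max_connections';

-- Lists the open connections visible to my account; if Threads_connected
-- is near max_connections while few of the rows are mine, the contention
-- is likely coming from neighbouring sites:
SHOW PROCESSLIST;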
I am beginning to suspect that the relatively high value of 750 has been set by the InMotion techs for some reason, and if it is present across all sites sharing the resource, then this could be what is causing the issue. Either that, or someone else has an unlimited setting and that is causing them to hog resources.
That would indicate that the fix is for **everyone** to reduce their setting. Me alone reducing my setting isn’t going to help my situation (though it might help others sharing my resources).
From what I have googled, by the way, the usual response is to *increase* (not decrease) max_connections, and the usual place to do it is /etc/my.cnf, using this syntax:
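Something like this, I gather (the value is illustrative, and editing /etc/my.cnf requires root access to the server, which a shared hosting account typically does not have):

```ini
# /etc/my.cnf
[mysqld]
max_connections = 250
```

After changing it, MySQL has to be restarted (or the variable set at runtime with `SET GLOBAL max_connections = 250;`) for the new value to take effect.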
That said, 750 is already a high setting (but I reiterate: I don’t see how reducing it can help me, though it may help others).
Another smoking gun is apparently search-engine crawlers visiting the site, and the fix for that is to add a 'Crawl-delay' parameter to robots.txt, or to raise it to a higher number of seconds if it is already there.
Mine was 10 seconds and I have increased it to 15.
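For reference, the relevant lines in my robots.txt now look like this (worth noting that Crawl-delay is a non-standard directive: some crawlers such as Bingbot honour it, while Googlebot ignores it entirely, so it only throttles part of the crawler traffic):

```
User-agent: *
Crawl-delay: 15
```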
With all of the above in mind, I am beginning to suspect the issue is not with my site, but with one or more of the other sites on the shared resource chewing up the available connections.
Thanks again for your response.