
A couple of weeks ago SiteGround notified me that I had reached my daily limit of allowed executions. Since I have almost two dozen sites on my account, half of which are dev sites locked behind coming-soon pages, I asked for some help in pinpointing the problem.
George M. was extremely helpful in showing me what was going on and suggesting what to do about it. Overall, I was suffering a bot infestation (more than six million visits from unknown bots), and my login page for the Author-izer site was particularly hard hit.
He also provided me with a handy trick for deterring bots. At first, when he said the code would “make bots crawl themselves,” I thought he was joking, but that’s exactly what the provided code does. It redirects every bot, except the handful specifically allowed, to 127.0.0.1, which is localhost. This is wonderfully sneaky and elegant, and I have added it to the .htaccess files for all of my sites.
# BEGIN Tell bots to go crawl themselves
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^$ [OR]
RewriteCond %{HTTP_USER_AGENT} (bot|crawl|robot)
RewriteCond %{HTTP_USER_AGENT} !(bing|Google|msn|MSR|Twitter) [NC]
RewriteRule ^/?.*$ "http\:\/\/127\.0\.0\.1" [R,L]
</IfModule>
# END Tell bots to go crawl themselves
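To see which visitors the rule actually catches, here is a small Python sketch of the same matching logic. The function name and structure are mine, not part of the .htaccess file; the regexes are copied from the conditions above (note that only the allow-list condition carries the case-insensitive [NC] flag).

```python
import re

# Allow-list condition (has [NC], so case-insensitive)
ALLOWED = re.compile(r"bing|Google|msn|MSR|Twitter", re.IGNORECASE)
# Bot-detection condition (no [NC], so case-sensitive, as written)
BOT_LIKE = re.compile(r"bot|crawl|robot")

def is_redirected(user_agent: str) -> bool:
    """Return True if this User-Agent would be sent to 127.0.0.1.

    Mirrors the rule: an empty UA, or one containing bot/crawl/robot,
    is redirected unless it also names an allowed crawler.
    """
    looks_like_bot = user_agent == "" or BOT_LIKE.search(user_agent) is not None
    return looks_like_bot and ALLOWED.search(user_agent) is None

print(is_redirected(""))                       # empty UA: redirected
print(is_redirected("spambot/2.1"))            # generic bot: redirected
print(is_redirected("Googlebot/2.1"))          # allow-listed: passes through
print(is_redirected("Mozilla/5.0 (Windows)"))  # normal browser: passes through
```

One consequence of the case-sensitive bot condition is that a User-Agent like “BadBot” (capital B) would slip past the pattern, so the rule catches most, but not all, self-identified bots.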
I also took a closer look at all of my sites and discovered that I had deactivated iThemes Security Pro on the Author-izer site and not actually installed it on some of the other sites, so I spent some time fixing that problem and then added Let’s Encrypt SSL certificates to all the sites for good measure.
I’m looking forward to inspecting my statistics again in a few weeks to see how it’s working.