This plugin hasn’t been tested with the latest 3 major releases of WordPress. It may no longer be maintained or supported and may have compatibility issues when used with more recent versions of WordPress.

Booter – Bots & Crawlers Manager


Booter – Bots & Crawlers Manager is both a preventative measure (treatment in advance) and a treatment for damage already caused by crawlers and bots.
The plugin builds on a number of existing technologies that crawlers and bots already recognize, and takes them one step further – smartly and almost completely automatically.
For the plugin to work correctly, you must follow the instructions and enter some data manually (this must be done by a human being to avoid errors).

At the prevention level

  • Booter allows you to manage and create an advanced, dynamic robots.txt file.
  • View a 404 error log to identify the most common bad links.
  • Block bad bots that cause high server load through very frequent page crawls, or that probe for security vulnerabilities.
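To give a sense of the kind of rules a dynamic robots.txt file can serve, here is a minimal sketch; the bot name and paths are hypothetical examples, not output generated by the plugin:

```
# Block a hypothetical bad bot entirely
User-agent: BadBot
Disallow: /

# Allow well-behaved crawlers, but keep them out of admin paths
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Note that robots.txt is advisory: well-behaved crawlers honor it, while abusive bots typically ignore it – which is why the plugin also offers blocking and rate limiting.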

At the treatment level

  • Booter allows you to limit the number of requests from crawlers and bots; if or when a client exceeds the specified number of requests per minute, it will be rejected for a specified period of time.
  • Reject unwanted links in the fastest way – not by simply blocking them, but by sending the appropriate HTTP status code so that search engines forget them.
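The rate-limiting idea above can be sketched roughly as follows. This is a minimal fixed-window illustration in Python, not the plugin's actual code; the per-minute limit, the block duration, and the use of HTTP 429 here are assumptions for the example:

```python
import time
from collections import defaultdict

# Hypothetical limits, not the plugin's defaults.
MAX_REQUESTS_PER_MINUTE = 60
BLOCK_SECONDS = 300  # how long an offender stays rejected

window_start = defaultdict(float)   # client -> start of its current 60 s window
request_count = defaultdict(int)    # client -> requests seen in that window
blocked_until = defaultdict(float)  # client -> timestamp until which it is rejected

def http_status(client: str, now: float) -> int:
    """Return the HTTP status to send: 200, or 429 while the client is blocked."""
    if now < blocked_until[client]:
        return 429  # Too Many Requests: still inside the block period
    if now - window_start[client] >= 60:
        # Start a fresh one-minute window for this client.
        window_start[client] = now
        request_count[client] = 0
    request_count[client] += 1
    if request_count[client] > MAX_REQUESTS_PER_MINUTE:
        blocked_until[client] = now + BLOCK_SECONDS
        return 429
    return 200
```

A real implementation would key clients by IP and/or user agent and persist the counters between requests, but the window-plus-block structure is the core of the technique.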

Instructions for use when treating existing damage

  1. Activate the plugin.
  2. Enable the 404 error log option.
  3. Set the access rate limit.
  4. Watch the 404 log and look for common parts in the URLs that repeat most often.
  5. Enter the common parts on the “Reject Links” page, and make sure the rejection code is 410 (Gone).
  6. Clear the 404 error log.
  7. Repeat the process once every few hours until the 404 error log remains blank.
  8. Check the status of your website’s index coverage every few days.
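Step 4 – spotting the common parts – can be approximated with a small script like the one below. This is a hypothetical helper, not part of the plugin; it simply counts the leading path segments of logged 404 URLs so the most frequent prefixes stand out:

```python
from collections import Counter
from urllib.parse import urlparse

def common_404_prefixes(urls, depth=1):
    """Count the most frequent leading path segments in a list of 404 URLs."""
    counts = Counter()
    for url in urls:
        path = urlparse(url).path.strip("/")
        if not path:
            continue
        segments = path.split("/")[:depth]
        counts["/" + "/".join(segments)] += 1
    return counts.most_common()

# Example log: three 404s share the /old-shop/ prefix, one is unrelated.
log = [
    "https://example.com/old-shop/item-1",
    "https://example.com/old-shop/item-2",
    "https://example.com/old-shop/cart",
    "https://example.com/blog/missing-post",
]
```

Here `common_404_prefixes(log)` would rank `/old-shop` first, making it a natural candidate for the “Reject Links” page.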


Screenshots

  • Plugin General Settings
  • Robots.txt Management
  • Reject Links Settings


Installation

  1. Upload the booter-crawlers-manager folder to the /wp-content/plugins/ directory.
  2. Activate the plugin through the ‘Plugins’ menu in WordPress.
  3. The plugin starts rate limiting as soon as it is activated; however, it is recommended to adjust the settings to suit your needs under the ‘Settings’ -> ‘Booter – Crawlers Manager’ menu.


Reviews

September 12, 2023
I got this for rate limiting and I must say it works. I had so many bots scraping my site that it crashed every day. However, when a legitimate user accidentally got booted, there was no way to clear them. Also, in the forum there are some great questions (like “how to whitelist an IP”) that are left unanswered, so I have to give it 4 stars.
December 7, 2022
Kudos to this development team for creating a very SIMPLE, highly COMPREHENSIVE, and phenomenally effective plugin. We tested a dozen and this is the best.
October 30, 2021
For anyone who just hates bad bots, or some bots in particular, this plugin is a must-have.
May 1, 2020
It finally stopped some bots that put a lot of extra load on my website. Visitors stayed the same and the number of visits hasn’t changed, but the load dropped by half. Thank you. I use only two options in your plugin: Block Bad Robots and Rate Limiting; the other options are disabled. But I still get a message: “Booter – Bots & Crawlers Manager has found the following 404 redirect plugin active: bla bla bla. Such redirects prevent Booter from detecting 404 errors, as well as being the wrong way to handle broken links. A 404 error is the correct response to invalid URLs, while a redirect (30X) tells search engines that the URL exists (in another location). We therefore recommend that you disable these plugins.” Is it possible to disable this message if I don’t use the 404 logging function? Please add such a possibility. This message is too annoying and I need my installed 404 redirect plugin.

Contributors & Developers

“Booter – Bots & Crawlers Manager” is open source software. The following people have contributed to this plugin.




Changelog

  • Move the additional bots list to a remote list


  • Fix rare crash of the UI


  • Fix rate limiter not properly detecting excluded user agents


  • Fix scheduled task not being set properly


  • Fix bots list not updating


  • Fix regression introduced in version 1.5


  • Added options for weekly and monthly 404 log reports
  • Added option to exclude user agents from rate limiting
  • Updated UI components
  • Updated bad bots list
  • Server IP will be excluded from rate limiting by default