Thank you, that was helpful.
However, using .htaccess means I would have to configure it for each website individually.
I'm looking for a way to block bad bots at the server level, which is why I'm trying ModSecurity.
At the website level, I can already block them using Cloudflare.
It seems like if we can't limit the bots, the only choice is to block them.
I'm currently using this SecRule to block bots based on the user agent, but it only blocks static files. Can you point out how to adjust it so the entire request from that bot is blocked on the very first request?
SecRule...
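For reference, a minimal server-level sketch of that idea looks like the rule below (the rule ID and the "badbot" token are placeholders, not your actual rule). The key points are `phase:1`, so the rule runs on the request headers before anything is served, and `deny`, so the whole request is rejected rather than just certain file types:

```apache
# Deny any request whose User-Agent contains "badbot" (placeholder token).
# phase:1 evaluates the rule on request headers, before any response is built,
# so the very first request from the bot is blocked entirely.
SecRule REQUEST_HEADERS:User-Agent "@contains badbot" \
    "id:2000001,phase:1,deny,status:403,log,msg:'Blocked bad bot by User-Agent'"
```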
Thank you for your reply.
You can refer to these threads, where many people have reported similar DDoS-like issues:
https://developers.facebook.com/community/threads/992798532416685/
https://developers.facebook.com/community/threads/473239115254449/...
Sorry, I don't understand your answer. I need a SecRule that limits user agents containing "facebook" to 3 requests per 10 seconds. How does your answer help with that?
I found this in /usr/local/lsws/logs/error.log:
2024-06-10 12:33:01.730873 [NOTICE] Loading LiteSpeed/1.7.19 Open (lsquic 3.3.2, modgzip 1.1, cache 1.66, mod_security 1.4 (with libmodsecurity v3.0.12)) BUILD (built: Tue Apr 16 15:14:26 UTC 2024) ...
Do you think I'm using mod_security version 1.4 or...
Thank you for your guidance. I tried modifying the security rule to return a 403 status instead of 429. Here is the updated rule:
# Increment the global rate limit variable for any user agent containing 'facebook'
SecRule REQUEST_HEADERS:User-Agent "@contains facebook" \...
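For completeness, a full rule chain along those lines might look like the sketch below (rule IDs, the variable name, and the threshold are placeholders). One caveat worth checking first: persistent collections (`initcol`) and `expirevar` have historically been incompletely supported in libmodsecurity v3, which can make rules like this silently do nothing:

```apache
# Initialize a per-client-IP collection (placeholder rule IDs throughout).
SecAction "id:2000010,phase:1,nolog,pass,initcol:ip=%{REMOTE_ADDR}"

# Count requests whose User-Agent contains "facebook";
# the counter expires 10 seconds after it was last set.
SecRule REQUEST_HEADERS:User-Agent "@contains facebook" \
    "id:2000011,phase:1,nolog,pass,setvar:ip.fb_hits=+1,expirevar:ip.fb_hits=10"

# Deny with 403 once more than 3 such requests arrive inside the window.
# The chain ensures only facebook user agents are denied, not every
# request from an IP whose counter happens to be high.
SecRule REQUEST_HEADERS:User-Agent "@contains facebook" \
    "id:2000012,phase:1,deny,status:403,log,msg:'facebook UA rate limit exceeded',chain"
SecRule IP:FB_HITS "@gt 3"
```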
Thank you for your response. I tested the requests from Facebook, and they are all returning status 200. There doesn't seem to be any limit in place, and there is nothing in the log.
You can test some requests by entering your URL into the Facebook Sharing Debugger: https://developers.facebook.com/tools/debug
I recently came across the following ModSecurity rule intended to limit client hits by user agent:
# Limit client hits by user agent
SecRule REQUEST_HEADERS:User-Agent "@pm facebookexternalhit" \...
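As a side note, `@pm` does fast parallel matching against a space-separated list of tokens, so one rule can cover several crawler user agents at once. A minimal standalone sketch (rule ID and the extra tokens are illustrative, not from the original rule):

```apache
# @pm matches if the User-Agent contains any of the listed tokens;
# the token list here is an example, extend it with the bots you see in logs.
SecRule REQUEST_HEADERS:User-Agent "@pm facebookexternalhit facebookcatalog" \
    "id:2000020,phase:1,deny,status:403,log,msg:'Facebook crawler blocked'"
```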
Thank you for your reply, but unfortunately it does not fully address my concern.
Since bots can easily spoof the user agent and pretend to be Google, the best solution would be to allow only requests from Google's verified IP addresses listed in their published JSON metadata.
This would ensure responses...
I'm asking whether it is possible to use CSF to block all bots while making an exception for Google bots. Google publishes a list of its crawler IP ranges at the following links:
Googlebot IPs:
https://developers.google.com/static/search/apis/ipranges/googlebot.json...
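If it helps, here is a small helper sketch that turns that JSON into one CIDR per line, ready to append to CSF's allow list (typically /etc/csf/csf.allow). It assumes the published file keeps its current shape, a top-level `prefixes` array of objects with `ipv4Prefix` or `ipv6Prefix` keys; the sample document below is illustrative, while in practice you would fetch the live URL (e.g. with curl) and feed it in:

```python
import json

def googlebot_cidrs(doc: dict) -> list[str]:
    """Extract IPv4/IPv6 CIDR strings from Google's crawler IP-range JSON."""
    cidrs = []
    for prefix in doc.get("prefixes", []):
        # Each entry carries either an ipv4Prefix or an ipv6Prefix key.
        cidr = prefix.get("ipv4Prefix") or prefix.get("ipv6Prefix")
        if cidr:
            cidrs.append(cidr)
    return cidrs

# Illustrative sample matching the published document's shape; fetch the real
# file from the googlebot.json URL above and parse it the same way.
sample = json.loads("""
{"creationTime": "2024-06-10T00:00:00.000000",
 "prefixes": [{"ipv4Prefix": "66.249.64.0/27"},
              {"ipv6Prefix": "2001:4860:4801:10::/64"}]}
""")

for cidr in googlebot_cidrs(sample):
    print(cidr)  # one entry per line, ready to append to csf.allow
```

CSF accepts plain CIDR entries one per line in csf.allow, so the output can be appended directly and reloaded with `csf -r`.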
Hi there,
I'm hoping to get some advice on setting up a subdomain on my WordPress site.
I have created a subdomain in CyberPanel called mysub.mydomain.com. However, I would prefer if people could access my main site at...