Description
I'm not sure if this is a bug, but I recently discovered that a whitelisted domain in `custom.d/whitelist-domains.conf` is bypassing a blacklisted user agent in `custom.d/blacklist-user-agents.conf`.
Steps to reproduce
- Add a domain in `custom.d/whitelist-domains.conf`:

  ```apache
  SetEnvIfNoCase Referer ~*yourdomain\.com good_ref
  ```

- Add a bad user agent in `custom.d/blacklist-user-agents.conf`:

  ```apache
  BrowserMatchNoCase "^(.*?)(\bMyVeryBadUserAgentName\b)(.*)$" bad_bot
  ```

- Restart Apache

- Test with curl:

  ```shell
  curl -I -A "MyVeryBadUserAgentName" http://yourdomain.com -e "http://yourdomain.com"
  HTTP/1.1 200
  ```
Question
I would expect a 403 response because I do not want any requests from bad bots on my server.
Am I right, or am I missing something here?
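If I read the Apache 2.4 authorization docs right, `<RequireAny>` grants access as soon as any one of its member `Require` directives succeeds, so `good_ref` alone lets the request through and `bad_bot` is never consulted. A minimal boolean sketch of that evaluation (a hypothetical simulation of the semantics, not Apache's actual code):

```python
# Hypothetical sketch of Apache 2.4 <RequireAny> semantics:
# access is granted if ANY member Require directive succeeds.
def require_any(*results):
    return any(results)

# Request from the reproduction: bad UA, but whitelisted referer.
good_ref = True   # Referer matched whitelist-domains.conf
good_bot = False  # UA is not a whitelisted bot
bad_bot = True    # UA matched blacklist-user-agents.conf

# Current globalblacklist.conf block: <RequireAny> good_ref, good_bot
granted = require_any(good_ref, good_bot)
print("200" if granted else "403")  # -> 200: bad_bot never factors in
```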
Workaround
I do not know if there is a workaround for this issue without modifying `globalblacklist.conf`, but if we replace (around line 8441):

```apache
<RequireAny>

	Require env good_ref
	Require env good_bot
```

with:

```apache
<RequireAny>
	<RequireAll>
		Require env good_ref
		Require env !bad_bot
	</RequireAll>

	Require env good_bot
```
it seems to do the trick:

```shell
curl -I -A "MyVeryBadUserAgentName" http://yourdomain.com -e "http://yourdomain.com"
HTTP/1.1 403
```
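For what it's worth, the same boolean sketch with the patched structure — the whitelisted referer only counts when `bad_bot` is *not* set — now denies the bad bot even with a whitelisted referer (again a hypothetical simulation, not Apache's code):

```python
# Hypothetical sketch of the patched authorization block:
# <RequireAny> containing a <RequireAll> (good_ref AND NOT bad_bot)
# alongside good_bot.
def require_any(*results):
    return any(results)

def require_all(*results):
    return all(results)

# Same request as before: bad UA, whitelisted referer.
good_ref, good_bot, bad_bot = True, False, True

granted = require_any(require_all(good_ref, not bad_bot), good_bot)
print("200" if granted else "403")  # -> 403: the referer whitelist no
                                    #    longer overrides the bot blacklist
```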