I run a website where visitors can submit information about themselves. Recently, a lot of the URLs being submitted are listed in SpamHaus (spam and phishing) and in the Google Safe Browsing list (fantastic for malware; many URIs never appear in SpamHaus but are in Google's list).
I am on CentOS with cPanel/WHM and mod_security 2.5.13 (the latest stable release as of this writing).
I want to create a rule that does this:
- Check all GET and POST requests to *specific files* on my server, not every request globally
- If a URI is found in the GET or POST arguments (via "ARGS"?), check it against:
(a) FIRST, a local list if possible -- a blacklist match blocks the request, a whitelist match skips all further checks
(b) IF not found in either list, query an RBL (SpamHaus etc.)
(c) If not found there either, check it against the Google Safe Browsing list
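In case it helps anyone answering, here is a rough sketch of what I have in mind as ModSecurity rules. The file paths, rule IDs, and the `<LocationMatch>` pattern are placeholders for my setup, and I am not sure about step (c): as far as I can tell the `@gsbLookup` operator only appeared in ModSecurity 2.6, so it may not be possible on 2.5.13 without an upgrade or an external script:

```apache
# Restrict the checks to specific scripts, not the whole site
# (the filenames here are placeholders).
<LocationMatch "^/(submit\.php|profile\.php)$">

    # (a) Local whitelist: on a match, disable the remaining checks
    #     for this transaction (file path is a placeholder).
    SecRule ARGS "@pmFromFile /etc/modsecurity/url_whitelist.txt" \
        "phase:2,id:100001,t:lowercase,nolog,pass,\
         ctl:ruleRemoveById=100002,ctl:ruleRemoveById=100003"

    # (a) Local blacklist: block outright on a match.
    SecRule ARGS "@pmFromFile /etc/modsecurity/url_blacklist.txt" \
        "phase:2,id:100002,t:lowercase,deny,status:403,log,\
         msg:'Submitted URL on local blacklist'"

    # (b) RBL check. @rbl expects a hostname, so first capture the
    #     host part of any URL found in the arguments, then chain
    #     a lookup of that host against the Spamhaus DBL.
    SecRule ARGS "https?://([^/\"'\s]+)" \
        "phase:2,id:100003,capture,t:lowercase,deny,status:403,log,\
         msg:'Submitted URL host listed in Spamhaus DBL',chain"
        SecRule TX:1 "@rbl dbl.spamhaus.org"

    # (c) Google Safe Browsing would go here -- @gsbLookup seems to
    #     require ModSecurity 2.6+, so this step is the open question.
</LocationMatch>
```

Is this roughly the right structure, or is there a better way to order the local-list / RBL / Safe Browsing checks?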
Thanks!