So, ReverseDOS 3.1 is almost done. I'm starting some regression testing as I finish up a few more pieces of functionality.
Big changes:
- Again, thanks to Phil Haack for the great idea, I'll be adding:
if (Request.IsAuthenticated) return;
to the filtering logic. If you're an authenticated user, it's safe to assume that you're not a spammer, and ReverseDOS will therefore skip the process of scanning filters.
- Along those same lines, if the request is just an HTTP GET with no referrer and no querystring, then there's nothing to scan for either. (There's a quick sketch of both of these early exits just after this list.)
- <trustedAddresses> will now only track <address> entries (i.e., ipAddresses) and not <directory> entries. My hunch is that most people won't need this element much anymore in their config - but I'm leaving it just in case people need/want to trust specific addresses.
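For the curious, both of those early exits boil down to a couple of quick checks at the top of the request pipeline. Here's a rough sketch - the module skeleton and the commented-out ScanFilters() call are just my shorthand, not the actual ReverseDOS source:

using System;
using System.Web;

public class ReverseDOSSketchModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += OnBeginRequest;
    }

    public void Dispose() { }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpRequest request = ((HttpApplication)sender).Context.Request;

        // Authenticated users are assumed not to be spammers - skip filtering entirely.
        if (request.IsAuthenticated)
            return;

        // A plain GET with no referrer and no querystring gives the filters nothing to scan.
        if (request.HttpMethod == "GET"
            && request.UrlReferrer == null
            && request.QueryString.Count == 0)
            return;

        // ScanFilters(request); // stand-in for the real filter-scanning logic
    }
}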
Medium changes:
- I'm providing the ability to build filters that JUST scan specific 'attributes' of an incoming request. The default will be to check all aspects of an incoming request (as it is now), but on the off chance people want to be a bit more selective, they'll now have the ability to specify an 'attributes' attribute for each filter. Available attributes include: IpAddress, Referrer, QueryString, UserAgent, FormData, or All. So, if you've got a spammer coming at you from bwdow.com, but they're only sending referral traffic, you could create:
<filter attributes="referrer">bwdow.com</filter>
and the rule will only work against referring traffic. (Such that I could say, in my blog: bwdow.com is teh sux0r - they kept linking to my site as being one of the best designed sites on the web, but it was all lies. And then somebody could reply: "yeah, bwdow.com sux!!" and not get ReverseDOS-ed. Obviously not the best example... but you get the idea.)
- I'm also noticing that spammers are LAZY, and that they like to find one or two locations on a site and beat the HELL out of them. So I'm adding some additional new fire-power that will let ReverseDOS users 'lock down' entire parts of their site against certain types of traffic. For example, my old blog (in .Text) is still up on my site 'cuz I haven't had time to do all of my redirection properly - but I've disabled comments, and there's REALLY no need for anyone to be posting to that directory. So:
<rules>
  <deny verbs="post">/blog/</deny>
</rules>
will stop all POST traffic to that directory. As with filters, rules can be Regex or simple text, to make it easy to match specific directories/paths or patterns as needed.
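And since a simple string is itself a valid regular expression, a pattern-based rule looks just like a plain one - something along these lines (the path pattern here is purely illustrative):

<deny verbs="post">/blog/archive/\d{4}/</deny>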
You might also have noticed that the entry for that rule was <deny>. Yup, you guessed it: with rules you've also got the ability to specify <allow> rules that trump any of your filters. So, if you want to ALWAYS allow any posts to your /stuffCanNotBeLostHere/ directory, you could create a rule like the following:
<allow verbs="post">/stuffCanNotBeLostHere/</allow>
and you wouldn't have to worry about ReverseDOS killing any posted data to that directory.
Of course, while the <allow> rules will be beneficial in a handful of instances, the <deny> rules should make overall traffic control quite a bit easier. At present, the verbs I'm working with are: Head (as in HTTP HEAD), Get, Post, Query (i.e., is there a querystring present?), Refer (which doesn't count if the referrer is the local/current site), and Proxy. So, if you find that spammers keep proxying referral spam to a specific directory, you can shut that down too:
<deny verbs="refer,proxy">/favSpammerDirectory/</deny>
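Pulling the examples above together, a complete <rules> block would look something like this:

<rules>
  <allow verbs="post">/stuffCanNotBeLostHere/</allow>
  <deny verbs="post">/blog/</deny>
  <deny verbs="refer,proxy">/favSpammerDirectory/</deny>
</rules>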
My hunch is that people won't use rules all that often - but in some of the REALLY nasty spam attacks I've seen on my sites (and heard about from a few people who've had problems), rules will provide some MAJOR relief in specialized scenarios.
- I'll also be adding (shortly - it's the only remaining functionality to be added; then it's allll testing from there on out) some goodness to allow for loading config files using the System.Web.Hosting VirtualPath goodness - to make ReverseDOS a bit easier to use with SubText.
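In other words, something along these lines - the config path and the OpenReverseDOSConfig() wrapper are placeholders, not the final ReverseDOS API:

using System.IO;
using System.Web;
using System.Web.Hosting;

static Stream OpenReverseDOSConfig()
{
    // Resolve an app-relative path through the hosting environment's
    // VirtualPathProvider, so hosts that virtualize their file systems
    // (like SubText) can still serve up the config file.
    string virtualPath = VirtualPathUtility.ToAbsolute("~/ReverseDOS.config");
    return HostingEnvironment.VirtualPathProvider.GetFile(virtualPath).Open();
}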
Small changes:
- Each filter now has an additional attribute (for which I'm still working on a good name). I'm currently calling the attribute 'stall'... and it's a boolean, which indicates whether traffic matching the pattern should be 'stalled' (i.e., ReverseDOS-ed) or just immediately 403-ed. The default is to 'stall' the request - remember: denying spammers access to your site is a big win, but if you can trap their bots for 20-40 seconds, you're doing the rest of the world a favor. My thought, though, is that this attribute might come in handy when you're creating filters that you think might occasionally snag legitimate users.
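So a broad filter that might occasionally catch real people could skip the stall and just fail fast (the pattern here is purely illustrative):

<filter stall="false">someBroadPattern</filter>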
- I'll likely be creating a new blog, updates.reversedos.com, that people can subscribe to - it'll just notify them of updates to the product. (I might also allow people to sign up and give me their email addys, but I think most people would prefer an RSS feed that rarely updates to handing out their email addy - I know I would.)
Might I suggest a behavior attribute?
<filter behavior="stall">
<filter behavior="immediate">
Posted by: Adam | August 28, 2006 at 10:58 PM
Adam, that's PERFECT.
Posted by: Michael K. Campbell | August 29, 2006 at 08:22 AM