The anti-DoS mechanism.
@Rejetto: After doing some research on how a typical 'Denial-of-Service' (DoS) attack is done, which basically consists of overloading a server, I want to contribute my overall opinion about the anti-DoS feature.
IMHO, the current implementation is overkill (I mean, it's nice that you have implemented some anti-DoS, but for me it's far too overprotective). My ideas are:
A) Have different limits for uploads than for downloads.
B) Give downloads a more relaxed limit than the current one.
C) Count how many requests were done every 5 seconds (read below).
D) Limit uploads to one per second, or one per 5 seconds.
E) Limit repeated downloads of the same file: 1 file per 10 seconds.
F) Limit/slow down requests for serving pages (not their internal elements).
G) Have a 'maintenance mode' for extreme cases, limiting everyone but the admin.
Some points are self-explanatory.
About point "C" (counting how many requests were done every 5 seconds, for the same SessionID): if you give the server admin an option, the admin could configure his server according to how many pages are typically requested during normal usage. For example, taking Danny's example, if he has a photo gallery and his page needs to serve 20 thumbnails at once (20 is an invented number for this example), then it would be normal to have 20 requests in a 5-second time-frame.
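To illustrate the idea, here is a minimal sketch of such a per-SessionID counter. The names (`allow_request`, `MAX_REQUESTS`) and the sliding-window approach are my own assumptions for the example, not anything HFS actually implements:

```python
import time
from collections import defaultdict, deque

# Hypothetical sketch of point "C": count requests per SessionID in a
# rolling 5-second window, with an admin-configurable threshold
# (20 here, matching the thumbnail-gallery example above).
WINDOW_SECONDS = 5
MAX_REQUESTS = 20  # assumed to be set by the server admin

_recent = defaultdict(deque)  # session_id -> timestamps of recent requests

def allow_request(session_id, now=None):
    """Return True if this session is under the limit, False to throttle it."""
    now = time.monotonic() if now is None else now
    window = _recent[session_id]
    # Drop timestamps that fell out of the 5-second window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS:
        return False
    window.append(now)
    return True
```

With a limit of 20, the 21st request inside the same 5 seconds is rejected, but once the window slides past the burst, the session is allowed again.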
About point "F": if a page needs to serve only 10 elements (1 html + 1 css + 2 js + 6 images), then it's normal to have 10 requests in 1 or 2 seconds. So an even smarter (automatic) way would be for HFS to count how many elements the page has (when parsing the template) and apply the limits on the fly (if the requests exceed the number of elements found for the requested page).
This would be a behavioral limit: since HFS knows how many elements the page needs, it could distinguish a legitimate user from an attacker. This, along with limiting how many pages can be served per second, would let us have a more relaxed download rate for elements (like images, css, js) but a stricter limit on requests for new pages. For example, when exploring folders, we could serve only 1 page every 2 seconds, avoiding (or delaying) the case where the user opens several tabs at the same time.
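The element-counting step could look something like the sketch below, which simply scans the generated HTML for sub-resource references. The regex and the function name are invented for this example; a real implementation inside HFS's template parser would of course work on its own parsed representation:

```python
import re

# Hypothetical sketch of point "F": count how many sub-resources
# (images, scripts, stylesheets) a page references, so that count can
# be used as the per-page request budget for the elements that follow.
RESOURCE_RE = re.compile(
    r'<(?:img|script)[^>]*\bsrc="([^"]+)"'   # <img src=...> / <script src=...>
    r'|<link[^>]*\bhref="([^"]+)"',          # <link href=...> (css)
    re.IGNORECASE)

def count_page_elements(html):
    """Count how many elements the page will make the browser request."""
    return sum(1 for _ in RESOURCE_RE.finditer(html))
```

For the 10-element page from the text (1 css + 2 js + 6 images, plus the html itself), this would report 9 sub-resources, and HFS could then allow roughly that many element requests in the following seconds.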
About point "G": finally, if HFS detects it is being attacked (in a very extreme/hard way), then the server could automatically go into 'maintenance mode' for 1 hour (applying a stricter request rate for everyone during that hour), and in that time-frame it would only allow the admin to log in (so he can take care of his server and review the configuration).
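A minimal sketch of that maintenance-mode state, assuming a simple time-based switch (class and method names are invented for the example):

```python
import time

# Hypothetical sketch of point "G": on a severe attack, enter
# 'maintenance mode' for one hour; during that window only the
# admin account may log in.
MAINTENANCE_DURATION = 3600  # seconds: the 1-hour window from the text

class MaintenanceMode:
    def __init__(self):
        self.until = 0.0  # timestamp when maintenance mode ends

    def trigger(self, now=None):
        """Called when HFS decides it is under a severe attack."""
        now = time.monotonic() if now is None else now
        self.until = now + MAINTENANCE_DURATION

    def active(self, now=None):
        now = time.monotonic() if now is None else now
        return now < self.until

    def may_login(self, is_admin, now=None):
        """During maintenance, only the admin is accepted."""
        return is_admin or not self.active(now)
```

The same `active()` check could also be the hook for applying the stricter request rate to non-admin users while the hour lasts.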
This is a good read I recommend about DoS. Also, please read THIS (it explains why 'rate limitation is not the way to go').
Well, I leave you these ideas.
Do what you think is best...
Cheers,
Leo.-