Hi! What are the real limitations of HFS on simultaneous file downloads? When 15 users simultaneously download one file over the Internet, my server loses its connection. I changed the router, then changed the provider. The situation has improved, but the problem is not completely solved. How much load can HFS handle? Thanks.
40+ simultaneous requests (RQs), which could come from anywhere between 1 and 40+ users.
Try the simultaneous downloads limit (set it as low as 2 if your internet is faster than 150 megabit), or NetLimiter, or set your ethernet card's properties to 100 megabit to reduce flooding. (HFS's own speed limit and connections limit are worse options; see the limits list below.)
See my WatchCat script to regain connectivity automatically.
See my Stripes-Oneshot template for lower RQ cost.
And this. If your router supports iptables:
iptables -I INPUT -d 192.168.1.200 -m connlimit --connlimit-above 40 --connlimit-mask 0 -j REJECT
whereby 192.168.1.200 would be changed to your server's LAN IP,
whereby 40 would be set much lower if the incoming rate is flooding at more than 150 megabits,
and whereby --connlimit-mask 0 makes the rule count connections from all sources together (without it, connlimit counts per client IP, which would never trigger for 15 separate users).
Weird exception: REJECT is used here on the WAN side even though we normally DROP packets from strangers. That's because, for an HTTP server, we do want the seamless retry that REJECT makes possible, not the black hole of DROP. If the INPUT chain won't catch the traffic, try the FORWARD chain.
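For instance, a sketch of that FORWARD-chain variant, assuming the router routes the traffic to your server rather than terminating it locally (same placeholders as above):

iptables -I FORWARD -d 192.168.1.200 -m connlimit --connlimit-above 40 --connlimit-mask 0 -j REJECT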
An alternative for iptables-crippled ('diet' iptables) routers is to use hashlimit instead (dd-wrt, etc.).
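A minimal sketch, assuming a modern iptables build that supports --hashlimit-above; note that hashlimit caps the rate of new connections rather than the concurrent count, and the 5/sec rate and burst of 10 here are only illustrative placeholders:

iptables -I FORWARD -d 192.168.1.200 -p tcp --syn -m hashlimit --hashlimit-name hfs --hashlimit-mode srcip --hashlimit-above 5/sec --hashlimit-burst 10 -j REJECT

Older 'diet' builds may only ship the inverted --hashlimit-upto (or plain --hashlimit) match, in which case you ACCEPT up to the rate and DROP the rest.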
HFS limits menu (if fast internet):
simultaneous downloads limit = good
inbuilt speed limit = worse* (use NetLimiter or the ethernet port speed setting instead)
inbuilt connections limit = worse* (use iptables connlimit or hashlimit instead)
*On fast internet with a single-threaded server like Apache/HFS/Nginx for Windows, by the time a flood has reached the HTTP server it is already too late, so the flood has to be throttled before that point. At 100 megabit or lower, a single-threaded server can handle many simultaneous downloads (e.g. 15 downloads still leave roughly 6 megabit each); on gigabit, or when being flooded, limit it to 1 or 2.