rejetto forum

Software => HFS ~ HTTP File Server => router & port problems => Topic started by: vladimirov70 on March 19, 2021, 03:07:01 PM

Title: What are the real limitations of HFS on simultaneous file downloads?
Post by: vladimirov70 on March 19, 2021, 03:07:01 PM
Hi! What are the real limitations of HFS on simultaneous file downloads? When 15 users simultaneously download one file over the Internet, my server loses its connection. I changed the router, then changed the provider. The situation has improved, but the problem is not completely solved. How much load can HFS handle? Thanks.
Title: Re: What are the real limitations of HFS on simultaneous file downloads?
Post by: danny on March 19, 2021, 08:38:55 PM
Quote from: vladimirov70 on March 19, 2021, 03:07:01 PM
Hi! What are the real limitations of HFS on simultaneous file downloads? When 15 users simultaneously download one file over the Internet, my server loses its connection. I changed the router, then changed the provider. The situation has improved, but the problem is not completely solved. How much load can HFS handle? Thanks.
40+ simultaneous requests, which could come from anywhere between 1 and 40+ users.
Try the simultaneous-downloads limit (set it to 2 if your internet is faster than 150 megabit), or NetLimiter, or set your ethernet card's properties to 100 megabit to reduce flooding. (HFS's built-in speed limit is worse; HFS's built-in connections limit is worse.)

See my WatchCat script to regain connectivity automatically.

See my Stripes-Oneshot template for lower RQ cost.

And this. (http://rejetto.com/forum/index.php?topic=13429.msg1066801#msg1066801)

If your router supports iptables, then:
iptables -I INPUT -d 192.168.1.200 -m connlimit --connlimit-above 40 -j REJECT
where 192.168.1.200 should be changed to your server's LAN IP,
and 40 should be lowered considerably if the incoming rate is flooding at more than 150 megabits.
REJECT is a weird exception here: for WAN traffic we would normally use DROP against strangers, but for an HTTP server we want the seamless retry that REJECT makes possible (not the black hole of DROP). If the INPUT chain won't do it, try the FORWARD chain.
An alternative for iptables-crippled ('diet' iptables) routers is to use hashlimit instead (dd-wrt, etc.).
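For the hashlimit route, a minimal sketch; the 10/sec rate, the burst of 20, and the table name "hfs" are my own illustrative choices, and --hashlimit-above needs a reasonably recent iptables build:

```shell
# Build the per-source-IP rate-limit rule as a string first, so it can be
# reviewed before being run as root on the router.
SERVER_IP="192.168.1.200"   # change to your server's LAN IP
RATE="10/sec"               # new connections allowed per source IP before REJECT

RULE="iptables -I INPUT -d $SERVER_IP -p tcp --syn -m hashlimit --hashlimit-above $RATE --hashlimit-burst 20 --hashlimit-mode srcip --hashlimit-name hfs -j REJECT"
echo "$RULE"
# then: run the printed command as root on the router
```

As with connlimit, REJECT (not DROP) keeps the client's retry seamless.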

HFS limits menu (if fast internet):
simultaneous downloads limit = good
built-in speed limit = worse* (use NetLimiter or the ethernet port speed setting)
built-in connections limit = worse* (use iptables connlimit or hashlimit)

*On fast internet with a single-threaded Apache/HFS/Nginx for Windows, by the time a flood has reached the HTTP server it is already too late, so the flood has to be throttled before that point. At 100 megabit or lower, a single-threaded server can handle many simultaneous downloads; on a flooded gigabit link, allow only 1 or 2.
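To pick sensible limits, it helps to measure how many simultaneous connections the server actually sees. A sketch of counting them from netstat-style output (the sample lines here are made up; on the real machine you would pipe in `netstat -an` on Windows or `ss -tn` on Linux, and port 80 is only the assumed HFS port):

```shell
# Illustrative netstat-style sample (field 2 = local address, field 4 = state):
SAMPLE='TCP 192.168.1.200:80 10.0.0.5:51234 ESTABLISHED
TCP 192.168.1.200:80 10.0.0.6:51235 ESTABLISHED
TCP 192.168.1.200:443 10.0.0.7:51236 ESTABLISHED'

# Count connections established to port 80 (the assumed HFS port):
COUNT=$(printf '%s\n' "$SAMPLE" | awk '$2 ~ /:80$/ && $4 == "ESTABLISHED" {n++} END {print n+0}')
echo "$COUNT"   # 2 for this sample
```

If the count regularly approaches your connlimit threshold during normal use, the threshold is too low.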
Title: Re: What are the real limitations of HFS on simultaneous file downloads?
Post by: vladimirov70 on March 20, 2021, 04:14:05 PM
Thank you, friend! This is valuable information for me. I will apply it.
Title: Re: What are the real limitations of HFS on simultaneous file downloads?
Post by: danny on March 21, 2021, 08:54:10 AM
Tmeter free version (http://www.tmeter.ru/en/) can apply up to 4 filters, for example a speed limit on HFS (to reduce flooding).
I see bandwidth shaping, a.k.a. speed control, but I don't see brute-force protection.

------------------
also
NetBalancer (https://netbalancer.com/download) versions older than 9.3 allowed 3 free filters (including a speed limit); the April 2016 v9.2.7.839 version looks like it is meant for Windows 7.
Likewise, I see a speed limit but don't see brute-force protection.

------------------
I wonder if there are other free ways to do bandwidth-shaping with Windows? 


P.S. There is another: a Linux virtual machine running HFS on Wine (so that iptables can run in front of HFS).
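A minimal sketch of that setup; the port (80) and the connlimit threshold mirror the rule earlier in the thread, and the hfs.exe filename/location is assumed:

```shell
# Inside the Linux VM, before starting HFS under wine:
# cap simultaneous connections to the HTTP port, then launch HFS.
iptables -I INPUT -p tcp --dport 80 -m connlimit --connlimit-above 40 -j REJECT
wine hfs.exe    # assumes wine is installed and HFS listens on port 80
```

This is a config-style fragment (it needs root inside the VM), so treat it as a template rather than something to run verbatim.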
Title: Re: What are the real limitations of HFS on simultaneous file downloads?
Post by: vladimirov70 on March 21, 2021, 02:18:13 PM
Thanks! (http://images.vfl.ru/ii/1616336212/5e039a9a/33760467_m.jpg) (http://vfl.ru/fotos/5e039a9a33760467.html)