Thanks for the directions to
https://github.com/rejetto/hfs2/discussions/ I agree that a multi-group discussion area is more sustainable: with many more topics in one place, a larger share of the traffic comes from actual people.
I guess there are about 2000 HFS2X servers. Updated HFS2X is downloaded daily by real people, though not in large numbers. So far as I know, HFS2X is the only Windows server that uses its own code as the distribution server, without a CDN buffer. The uses for HFS2X are niche:
The main specialty is cataloging a lot of files any way you want to. The streaming list beats the performance of list-before-draw and pagination schemes. The HFS2X update is router-cooperative, so it doesn't need a speed limit, yet it still finds and lists your files really fast.
With a web file server, bot traffic doesn't have much effect (mainly noise). Forum software is a much different case: the more topics a site hosts, the larger the proportion of its traffic that comes from real people. I have noticed several companies hosting bots in private while advertising defense against bots in public. It reminds me of:
https://cybernews.com/security/scam-bots-hitting-website-can-lead-to-financial-loss/

Anti-bot setup with HFS2X:
Currently, the zip with updated HFS2X includes a little txt note with anti-bot filter examples you can use in Events (menu). The templates have also been updated with decreased verbosity, for faster recovery, less data, and less CPU time. For files, a recommendable organization is an unbrowsable root folder (left panel, right-click /, Flags, uncheck Browsable), so that access is forwarded by DNS to a browsable subfolder. Currently, I have 5 websites (1 HFS server, 1 dynamic DNS, and 5 forwarding addresses that help by specifying folder and port number); the method is also helpful if your ISP blocks port 80, because the forwarder answers on 80 and sends the visitor to the real folder and port.
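To give the general shape of such a filter (the macro names and parameter order here are from memory of the 2.3 macro set, so treat this as a sketch and check the bundled txt note and the macros wiki before relying on it), an hfs.events entry that refuses requests whose User-Agent matches a mask might look like:

    [request]
    {.if|{.match|*bot*|{.header|user-agent.}.}|{:
      {.add to log|refused bot from %ip%.}
      {.disconnection reason|bots not allowed.}
    :}.}

The {: ... :} quoting delays evaluation, so the body only runs when the condition is true.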
The template used by http://software.run.place is actually an edited stripes.tpl, using the 'diff template' function to show it just for that folder/site. Also workable is making a copy of either throwback.tpl or stripes.tpl, naming it hfs.diff.tpl, and putting it into a high-volume (or public) folder, where the fast little template helps by saving CPU work and data.
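As a small illustration of the diff mechanism (the section name below is from the stock 2.3 template, so verify it against your own tpl), a hfs.diff.tpl containing only a [style] section would restyle one folder while leaving the rest of the parent template untouched:

    [style]
    body { font: 14px monospace; background: #fff; color: #000; }

throwback.tpl and stripes.tpl go much further, replacing the heavier sections as well, which is where the CPU and data savings come from.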
Except for a banIP-compatible router (or similar) with curated filter lists installed, there really isn't a 'one fell swoop' approach to dropping bot traffic. Behavior filters, such as requiring a real browser, banning hacky requests, forwarding to a different port, and an unbrowsable root, each cut roughly 12% of bot traffic. No one thing will have a big effect, but the combination does.
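To put rough numbers on that (assuming, for the sake of the estimate, that each filter independently drops about 12% of the remaining bot traffic): four filters together drop about 1 - 0.88^4, which is roughly 40%. No single step looks impressive, yet the stack adds up.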