rejetto forum
Software => HFS ~ HTTP File Server => Topic started by: user_hfs on October 09, 2011, 01:58:32 PM
-
Hello! I have a question: why does HFS hang when a user accesses a folder containing about three thousand files, 28 GB in total?
When accessing other folders, where the objects are organized into sub-folders, HFS works perfectly fine.
Can this be solved somehow?
I would appreciate an answer.
-
Distribute these three thousand files into sub-folders to limit the number of files in the list; otherwise HFS behaves as if it ran into an "out of memory" condition.
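The advice above (splitting one huge folder into smaller sub-folders) can be automated with a small script. This is only an illustrative sketch, not anything HFS itself provides; the `part_NNN` naming scheme and the 300-file batch size are arbitrary assumptions:

```python
import os
import shutil

def split_into_subfolders(src_dir, batch_size=300):
    """Move the files in src_dir into numbered sub-folders
    (part_000, part_001, ...) of at most batch_size files each."""
    # Sort for a deterministic, predictable grouping of files.
    files = sorted(
        f for f in os.listdir(src_dir)
        if os.path.isfile(os.path.join(src_dir, f))
    )
    for i, name in enumerate(files):
        sub = os.path.join(src_dir, "part_%03d" % (i // batch_size))
        os.makedirs(sub, exist_ok=True)
        shutil.move(os.path.join(src_dir, name), os.path.join(sub, name))
```

After running it on the problem folder, each sub-folder stays small enough that HFS only has to index a few hundred entries per page.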
-
Thanks for your reply, Mars.
Do I understand correctly that the bigger the list, the harder it is for HFS to cope? Should the list be no longer than 300 files? Or does it depend on the characteristics of the server?
-
HFS creates an index entry in memory for every file it finds: the longer the list, the more memory it consumes and the longer the processing takes. While it is scanning the files, HFS can appear to be blocked.
-
Thank you very much for your reply, Mars.
I think there must be some way to optimize handling of a large list of files. On the other hand, a list of three thousand files on one page is not very convenient anyway. But some people need it that way, so optimization is worth considering, if of course it is possible.