Author Topic: [b260]Download archive file from folder bigger than 4GB then HFS crash!  (Read 4357 times)


Offline Novox

  • Occasional poster
  • *
  • Posts: 84
    • View Profile
Hello dev,

I have found a bug when downloading an archive of a folder bigger than 4GB. After I clicked "download as zip", the HFS process ran at 20% CPU usage (it looks like it tries to zip the folder) and became unresponsive. I had to restart it manually.

I have tried downloading a folder smaller than 4GB as zip, and the CPU usage problem does not occur. When I downloaded a 1GB folder, the download dialog appeared instantly after I clicked the "download as zip" button, without any noticeable CPU usage.

Thank you for a great file server :)

This problem relates to a topic I created earlier, where I tried to work around it by filtering archive downloads when the folder is bigger than a specific size: http://www.rejetto.com/forum/index.php/topic,8609.0.html
« Last Edit: July 15, 2010, 06:54:31 AM by Novox »

Offline rejetto

  • Administrator
  • Tireless poster
  • *
  • Posts: 12949
    • View Profile
HFS activity on archive folders depends on the number of files, not on their size.
From Windows Explorer, right-click the folder, choose Properties, and tell us how many files are inside.

Offline Novox

  • Occasional poster
  • *
  • Posts: 84
    • View Profile
Here is a screenshot showing the number of files.

First, I tried to download the Adobe folder as tar (4.45GB, 4 files), and HFS froze.

Second, I tried to download the Greenzone movie folder as tar (46 files, 4.19GB), and HFS didn't freeze.

Maybe the problem happens when a single file is bigger than 4GB, like the .7z file in the first screenshot?

Any idea?
« Last Edit: July 17, 2010, 02:36:01 AM by Novox »

Offline Novox

  • Occasional poster
  • *
  • Posts: 84
    • View Profile
A new screenshot: I now have proof that downloading a single 4.5GB file as tar makes HFS freeze.


Offline Mars

  • Operator
  • Tireless poster
  • *****
  • Posts: 1890
    • View Profile
Why don't you share the file in several parts with auto-extract?

Offline Novox

  • Occasional poster
  • *
  • Posts: 84
    • View Profile
Why don't you share the file in several parts with auto-extract?

What does auto-extract mean?

All the files on my server come from torrent downloads. I can't reorganize or change the file structure while seeding; otherwise, the torrent client will re-hash the files and consider them corrupted.

Offline crazyboris

  • Tireless poster
  • ****
  • Posts: 140
    • View Profile
Do you still use FAT as your filesystem?
FAT doesn't support files bigger than about 4GB.

Maybe that's it.

Offline Novox

  • Occasional poster
  • *
  • Posts: 84
    • View Profile
I'm using NTFS as my filesystem.  :)

Offline rejetto

  • Administrator
  • Tireless poster
  • *
  • Posts: 12949
    • View Profile
I had some success reproducing the problem at first, but now I can't anymore.
I'll keep trying.
I guess I can confirm there's a bug in HFS.

Offline Novox

  • Occasional poster
  • *
  • Posts: 84
    • View Profile
Yes, it is. Would it be possible to make HFS split an archive larger than 4GB into 2GB parts instead of serving it as a single download?
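Splitting a large file into fixed-size parts is simple in principle. Here is a minimal, hypothetical C sketch of such a splitter (the function name and behavior are my own illustration, not an HFS feature):

```c
#include <stdio.h>
#include <stdint.h>

/* Copy `src` into numbered part files (prefix.000, prefix.001, ...)
   of at most `part_size` bytes each, using 64-bit counters so sizes
   over 4GB are handled correctly.
   Returns the number of parts written, or -1 on error.
   Hypothetical sketch only -- not part of HFS. */
int split_file(const char *src, const char *prefix, uint64_t part_size) {
    FILE *in = fopen(src, "rb");
    if (!in) return -1;

    char buf[64 * 1024];
    FILE *out = NULL;
    int part = 0;
    uint64_t written = 0;   /* bytes written to the current part */
    size_t n;

    while ((n = fread(buf, 1, sizeof buf, in)) > 0) {
        size_t off = 0;
        while (off < n) {
            if (!out) {                      /* open the next part file */
                char name[512];
                snprintf(name, sizeof name, "%s.%03d", prefix, part++);
                out = fopen(name, "wb");
                if (!out) { fclose(in); return -1; }
                written = 0;
            }
            uint64_t room = part_size - written;
            size_t chunk = (n - off < room) ? (n - off) : (size_t)room;
            fwrite(buf + off, 1, chunk, out);
            written += chunk;
            off += chunk;
            if (written == part_size) {      /* part is full, close it */
                fclose(out);
                out = NULL;
            }
        }
    }
    if (out) fclose(out);
    fclose(in);
    return part;
}
```

For example, a 4.5GB file split with `part_size` of 2GB would yield three parts: two of 2GB and one of 0.5GB.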

Offline rejetto

  • Administrator
  • Tireless poster
  • *
  • Posts: 12949
    • View Profile
OK, found the bug; it's fixed in the next release (263).

Offline Novox

  • Occasional poster
  • *
  • Posts: 84
    • View Profile
OK, found the bug; it's fixed in the next release (263).

Yeah, cool! Thanks a lot!