rejetto forum

Testing build #100



Offline rejetto


Offline Rafi

yes...

edit: I really don't see why you should use this, unless you have some strange problems like those described before, or in the case of large file(s) (over 4G?).
« Last Edit: May 25, 2007, 06:00:57 PM by Rafi »


Kalle B.

Quote
how hard is the freezing? does HFS become unusable?
i may try to find a multithreading solution.

Yes, the whole HFS freezes: ongoing downloads, GUI, everything. It lasts for the whole time it takes HFS to get the list of files from the network drive. The time depends on the connection speed and the number of files and folders on the drive. With a 10M LAN connection and a small number of files it's like a minute or two, but when the count goes into the thousands it's over half an hour... I haven't actually waited any longer than that, just killed the hfs process from the task manager.

Multithreading would be great. I think that would fix this kind of freezing perfectly..
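HFS itself is written in Delphi, but the idea being asked for here can be sketched language-neutrally. Below is a minimal Python illustration (the function name `list_folder_async` is mine, not anything in HFS): the slow directory walk runs on a worker thread and hands its result back through a queue, so the main loop stays responsive while a network drive is being listed.

```python
import os
import queue
import tempfile
import threading

def list_folder_async(path, out_queue):
    """Walk 'path' on a worker thread and deliver the file list via a queue,
    so the caller's GUI/network loop never blocks on a slow network drive."""
    def worker():
        entries = []
        for root, _dirs, files in os.walk(path):
            for name in files:
                entries.append(os.path.join(root, name))
        out_queue.put(entries)
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t

# usage: start the walk, keep serving requests, pick up the result when ready
q = queue.Queue()
tmp = tempfile.mkdtemp()
open(os.path.join(tmp, "a.txt"), "w").close()
list_folder_async(tmp, q).join()   # a real server would poll q instead of joining
files = q.get()
```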


Offline rejetto

ok, i will work on it.

Quote
yes...

sadly that wouldn't solve the freezing problem :(


Offline Rafi

will it help in the case of a very large (>4G) resulting archive file?


Offline rejetto

nope.
"archiving" takes no time, it's real-time.
it's the listing that takes time.
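To illustrate what "real-time" archiving means here, a minimal Python sketch (not HFS's actual Delphi code; `stream_tar` is a hypothetical name): the tar bytes are produced and handed out chunk by chunk while the files are read, so no finished archive ever sits on disk and the archiving step itself adds essentially no wait.

```python
import io
import os
import tarfile
import tempfile

def stream_tar(paths):
    """Yield the archive as byte chunks while it is being built,
    instead of writing a finished .tar to disk first."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for p in paths:
            tar.add(p, arcname=os.path.basename(p))
            yield buf.getvalue()      # flush what was produced so far
            buf.seek(0)
            buf.truncate(0)
    yield buf.getvalue()              # end-of-archive padding blocks
```

Concatenating the yielded chunks gives a valid tar stream, which is exactly what a server can write to the socket as it goes.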


Offline Rafi

rejetto, I think I am not making myself clear... when the file reaches the "other" side, it is being created as ONE file, and it may be larger than 4G. Wouldn't this be a problem if the remote OS does not support it?


Offline MarkV

Quote
rejetto, I think I am not making myself clear... when the file reaches the "other" side, it is being created as ONE file, and it may be larger than 4G. Wouldn't this be a problem if the remote OS does not support it?


Indeed, that's a point. Speaking of FAT16, it's even down to 2GiB. And these limits apply to both download and upload.

Rhetorical question: What if I download ~folder.tar >4GiB and have FAT32? Baaad crash.

Solution #1: HFS detects the filesystem and disables the feature in the case of FAT. Is there a way to detect the client's filesystem?
Solution #2: ~folder.tar generally breaks into 2GiB pieces to cover all eventualities.

Code: [Select]
FAT16   - max. file size 2GiB
FAT32   - max. file size 4GiB-2B
NTFS4/5 - max. file size 2TiB (currently limited to volume size 2TiB)

see also http://www.ntfs.com/ntfs_vs_fat.htm


MarkV
http://worldipv6launch.org - The world is different now.
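As a sketch only (the names are mine, not HFS's), the limits in the table above can be turned into a pre-download check on the server side:

```python
# Hypothetical per-filesystem file-size limits in bytes, from the table above
FS_MAX_FILE_SIZE = {
    "FAT16": 2 * 1024**3,        # 2 GiB
    "FAT32": 4 * 1024**3 - 2,    # 4 GiB - 2 B
    "NTFS":  2 * 1024**4,        # 2 TiB
}

def tar_download_ok(total_bytes, client_fs):
    """Return True if an archive of total_bytes fits on the client's
    filesystem; unknown filesystems are assumed to be fine."""
    limit = FS_MAX_FILE_SIZE.get(client_fs)
    return limit is None or total_bytes <= limit
```

The open question from Solution #1 remains, of course: the server has no reliable way to detect the client's filesystem, so `client_fs` would have to come from user configuration.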


Offline Giant Eagle

A simple solution would be to split the whole archive into 1GiB files: "-.folder.part1.tar", "-.folder.part2.tar"...

But anyway.. >_< are you really going to download an entire file server that is over 4GiB? I don't think so. Or can someone give me an example situation?
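For what it's worth, the byte-level splitting described above is trivial to sketch, though as rejetto notes later in the thread plain tar has no multi-volume format, so the parts would only be usable after re-joining them. A hypothetical Python illustration:

```python
def split_parts(chunks, part_size):
    """Regroup a stream of byte chunks into parts of at most part_size bytes,
    as in -.folder.part1.tar, -.folder.part2.tar, ...
    Note: plain tar is not multi-volume, so the parts are only usable after
    concatenating them back together; this shows the byte split only."""
    buf = b""
    for chunk in chunks:
        buf += chunk
        while len(buf) >= part_size:
            yield buf[:part_size]
            buf = buf[part_size:]
    if buf:
        yield buf
```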


Offline Rafi

back to the beginning... that's what I suggested in the first place: have an option to define if and how to split the archives...


Offline rejetto

first, TAR doesn't support multi-volume
second, do you really think i should spend time supporting mega-archives for FAT32 ?


Offline Rafi

Quote
second, do you really think i should spend time supporting mega-archives for FAT32 ?
edit: I really don't see why you should use this, unless you have some strange problems like those described before, or in the case of large file(s) (over 4G?).
as I said - no...
... you could/might like to spend it on "tuning" the current implementation, like: archiving "permissions" per user and/or per folder, support for selecting multiple/single folder targets for archiving in the template (remote side, like the extra column I mentioned before), etc. :)

Again - a VERY nice feature !!!



Offline maverick

Quote
second, do you really think i should spend time supporting mega-archives for FAT32 ?

No, rejetto. That time can be spent doing other things. I haven't used a FAT32 system for years.
maverick


Offline MarkV

Quote
first, TAR doesn't support multi-volume
second, do you really think i should spend time supporting mega-archives for FAT32 ?

No, but HFS should at least have an option where you select the filesystem in use. If you select FAT or FAT32, HFS disables ~folder.tar for folders larger than 2GiB or 4GiB-2B. At least prevent crashes or corrupt files in these cases. Maybe even a warning for the client:
Code: [Select]
Please note that if your filesystem is FAT32, folder archives larger than approximately 4GiB cannot be downloaded correctly.

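A sketch of the warning proposed above, in Python with hypothetical names (HFS would do this in Delphi): compare the folder's total size against the FAT32 limit before offering ~folder.tar.

```python
FAT32_MAX = 4 * 1024**3 - 2  # 4 GiB minus 2 bytes

def fat32_archive_warning(folder_bytes):
    """Return the client-facing warning string when the archive would exceed
    the FAT32 file-size limit, or None when the download is safe."""
    if folder_bytes > FAT32_MAX:
        return ("Please note that if your filesystem is FAT32, folder "
                "archives larger than approximately 4GiB cannot be "
                "downloaded correctly.")
    return None
```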


Offline TSG

Yeah, good idea MarkV. I wouldn't go coding away to make FAT32 support; NTFS has been the way for Windows PCs for at least the last 7 years... even my sister's old Windows 2000 machine is on NTFS lol. Not many users would use anything older than Windows 2000 nowadays... if they do, they are either on a system that has only like 6GB *wild guess of minimal hdd space* and would be completely useless for 4GB files, or they REALLY should think about buying a new computer lol. I will admit, the external hard drive I use is formatted FAT32, but I wouldn't go putting files bigger than 4GB on it anyway haha.