rejetto forum

folder archive



Offline MarkV

Just a little question concerning the folder archive.

As I understand it, the tar archive is created and served on the fly. But what happens if files are changed, created, or deleted while a folder's .tar is being served? How will HFS react? Can the archive become corrupted?
http://worldipv6launch.org - The world is different now.
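
For anyone unsure what "created and served on the fly" means here: the archive never exists on disk; the tar stream is produced while it is being sent, so its contents reflect the folder at the moment of sending. A rough sketch of the idea in Python (HFS itself is written in Delphi, so this is only an illustration, and the folder path is made up):

Code:
import os
import sys
import tarfile

def stream_folder_as_tar(folder, out=sys.stdout.buffer):
    # mode "w|" writes to a non-seekable stream: each file is read and pushed
    # out as it is added, which is why changes made mid-download matter
    with tarfile.open(fileobj=out, mode="w|") as tar:
        for root, _dirs, files in os.walk(folder):
            for name in files:
                path = os.path.join(root, name)
                tar.add(path, arcname=os.path.relpath(path, folder))

if __name__ == "__main__":
    stream_folder_as_tar(sys.argv[1])  # e.g. python stream_tar.py ./shared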


Offline rejetto

While the tar is being served, the file currently being sent is kept open, so writes to it are not permitted by Windows.
But if you stop the download and resume in the middle of a file, the resulting file will be corrupted.

What I could do is: if the resume request includes a date, check whether the timestamp of the file that spans the resume point is newer than that date...
in that case I might decide not to allow the resume.
This also means that if you had already downloaded a gig of other files, it would be lost.
Opinions?
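
To make that check concrete, here is a rough sketch (Python, not HFS code; the If-Range header and the file_at_offset mapping are assumptions, since nothing guarantees the client provides either):

Code:
import os
from email.utils import parsedate_to_datetime

def may_resume(headers, file_at_offset):
    """Return True only if resuming is provably safe for the file at the resume point."""
    date_header = headers.get("If-Range")  # hypothetical: a date sent with the resume request
    if not date_header:
        return False                       # no date -> cannot prove it is safe
    original_time = parsedate_to_datetime(date_header).timestamp()
    # file_at_offset: server-side path of the file that spans the resume offset
    return os.path.getmtime(file_at_offset) <= original_time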


Offline maverick

Quote
What I could do is: if the resume request includes a date, check whether the timestamp of the file that spans the resume point is newer than that date...
in that case I might decide not to allow the resume.
This also means that if you had already downloaded a gig of other files, it would be lost.
Opinions?

Please don't even think about disallowing resume when downloading. That is an important feature, IMHO.
maverick


Offline rejetto

In that case, the file will just end up corrupted inside the archive.


Offline maverick

Quote
In that case, the file will just end up corrupted inside the archive.

I see your point. Maybe disallowing resume for tar archives would be the best solution. To be honest, I haven't seen this type of problem with a tar archive, and I don't know whether it is common or not.
maverick


Offline rejetto

The point is: if you never run into the problem, you will also always be allowed to resume.
Refusing the resume would only happen in cases where a file would otherwise get corrupted.
The problem is: only that file gets corrupted, not the whole archive, and that's why the question is debatable.
It is hard, or impossible, to know whether the downloader would prefer to restart the download from the beginning.
I don't know if asking is even possible, but asking would be very confusing for 90% of people, I guess.
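
The "only that file, not the whole archive" point follows from how tar is laid out: every member is an independent header-plus-data record, so as long as sizes stay the same, garbage in one member's data leaves the other members readable. A small self-contained demonstration (file names and contents made up):

Code:
import io
import tarfile

def build_tar(entries):
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name, data in entries:
            info = tarfile.TarInfo(name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()

good = build_tar([("a.txt", b"AAAA"), ("b.txt", b"BBBB"), ("c.txt", b"CCCC")])
# simulate the resume glitch: b.txt's data bytes get replaced by same-length garbage
damaged = good.replace(b"BBBB", b"????")

with tarfile.open(fileobj=io.BytesIO(damaged)) as tar:
    for member in tar.getmembers():
        print(member.name, tar.extractfile(member).read())
# a.txt and c.txt come out intact; only b.txt's content is wrong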


Offline maverick

Quote
The problem is: only that file gets corrupted, not the whole archive, and that's why the question is debatable.

Well then, why not let the user (the downloader) decide what he wants to do: accept the resumed download as is, containing the corrupt file (which could be downloaded separately at a different time), or re-download the whole archive from the beginning.

Would HFS be smart enough to:
1. Notify the downloader, with a popup or something, that there is a corrupt file in the archive before resuming, and also let him know the name of the corrupt file.
2. Give the downloader the option to either cancel the resumed download containing the corrupt file, continue the resumed download as is with the corrupt file, or start the download again from the beginning.


maverick


Offline MarkV

Thank heavens tar is not a solid archive type.

Another option would be to duplicate the corrupt file inside the archive (with a new name); I mean, download it a second time, this time uncorrupted.
http://worldipv6launch.org - The world is different now.
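
Since tar allows more records to be appended at any point, the "download it a second time under a new name" idea boils down to something like this rough sketch (hypothetical, not an existing HFS feature; the .resent suffix is made up):

Code:
import tarfile

def append_repaired_copy(tar_stream, path, original_name):
    # tar_stream: a tarfile.TarFile opened in stream-write mode ("w|"),
    # i.e. the same object producing the archive being downloaded.
    # After the resume, the server could add the affected file once more
    # at the end of the stream under a new name.
    tar_stream.add(path, arcname=original_name + ".resent")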


Offline rejetto

OK guys, I tested with wget, Firefox, DownThemAll, and FDM.
None of them reported any timestamp when asking to resume.
So there is no way for me to know whether there would be corruption or not.

Quote
Another option would be to duplicate the corrupt file inside the archive (with a new name); I mean, download it a second time, this time uncorrupted.

This is a very nice idea, although it's unusable for the reason stated above.
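
For reference, this is roughly all a resuming client sends, which is why the timestamp check has nothing to work with (the header names are real HTTP, the URL is made up; whether any client sends the conditional headers here is exactly what the tests above were about):

Code:
# Typical resume request -- only a byte offset, no date:
#
#   GET /shared/folder.tar HTTP/1.1
#   Host: example.com
#   Range: bytes=1048576-
#
def has_usable_date(headers):
    # the server could look for a conditional header carrying a date...
    # ...but per the tests above, the common download tools do not send one
    return any(h in headers for h in ("If-Range", "If-Unmodified-Since"))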


Offline Wasserfloh

Quote
OK guys, I tested with wget, Firefox, DownThemAll, and FDM.
None of them reported any timestamp when asking to resume.
So there is no way for me to know whether there would be corruption or not.

What speaks against putting a timestamp in the file name?
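
One way that suggestion could work in principle (purely hypothetical, not something HFS does; the parameter name and URL are made up) is to put the generation time into the download link itself, so that a resumed request to the same URL carries the timestamp back:

Code:
import time
from urllib.parse import urlencode, urlparse, parse_qs

def archive_link(base_url):
    # link offered on the folder page, e.g. /shared/folder.tar?ts=1227690000
    return base_url + "?" + urlencode({"ts": int(time.time())})

def timestamp_from_resume_request(url):
    qs = parse_qs(urlparse(url).query)
    return int(qs["ts"][0]) if "ts" in qs else None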


Offline rejetto


Offline MarkV

Either the timestamp is requested after the d/l is finished, or it is not requested at all. :(

Couldn't you at least check the filesize?
http://worldipv6launch.org - The world is different now.


Offline rejetto


Offline MarkV

With the filesize from the original listing? Possible?

Damn that's so easy with FTP...
http://worldipv6launch.org - The world is different now.


Offline rejetto

I only know the filesize of the server-side file.
I know nothing about the archive you are trying to resume,
so I see nothing to compare it with.