rejetto forum
Software => HFS ~ HTTP File Server => Topic started by: zigomatic on January 22, 2009, 09:43:43 PM
-
I see some trouble with big files, like HD movies in MKV, due to an IE limitation.
It would be great to be able to download such a file by splitting it into rar files on the fly.
On the page, if HFS is serving a big file, you would see for example 4 links representing the split file.
It's up to the downloader to rebuild it using a rar tool.
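The idea above — cut a big file into fixed-size parts that the downloader later rejoins — is plain byte slicing. A minimal sketch in Python (not HFS code, which is Delphi; the function name and the `.001`-style part names are just illustrative conventions):

```python
def split_file(path, part_size=1024 ** 3):
    """Split `path` into `path.001`, `path.002`, ... of at most part_size bytes each."""
    part_paths = []
    with open(path, "rb") as src:
        index = 1
        while True:
            chunk = src.read(part_size)
            if not chunk:  # end of file reached
                break
            part_path = "%s.%03d" % (path, index)
            with open(part_path, "wb") as dst:
                dst.write(chunk)
            part_paths.append(part_path)
            index += 1
    return part_paths
```

Note that with 1 GiB parts this reads a whole part into memory at once; a real implementation would stream in smaller buffers.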
-
Isn't it more realistic to distribute files smaller than the limit in the first place?
(~2 GB, it seems ???, moreover)
... Easier to throw IE into the nettles ... and use a real browser.
FF, Opera, ... etc ... ;)
-
But it would be, nonetheless, a very usable feature. Maybe it could be implemented using the folder.tar code that's already inside HFS. I'd recommend splitting it into 1GiB parts to be on the safe side.
-
I agree.
But after the request to implement the .zip format, now this request ...
I think rejetto will need aspirin. :D
-
i just googled and, yes, tar supports file splitting.
i will put this in the to-do.
doesn't IE7 support 4GB+ files?
-
Isn't it possible for a server to send several files one after another, as if they were multiple successive requests?
-
sadly not.
HTTP only knows 1 request - 1 reply.
Even gmail gives you a zip if you ask to download ALL attachments. (and google knows how to get the best from the web)
-
If I remember my last tests correctly, IE7 still has the restriction. But as it's not always exactly 2GiB, but a few bytes less, I recommend the 1GiB block size (maybe even configurable). A further advance of this is: You don't have to download large files in one piece (slow connection anyone?).
Disadvantage: The built-in ZIP tool doesn't support .tar, so an external tool is always required.
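A configurable block size, as suggested above, is just arithmetic over byte offsets. A small sketch (the function name is made up) computing the inclusive (start, end) byte ranges the parts would cover, in the same form HTTP Range headers use:

```python
def part_ranges(total_size, part_size=1024 ** 3):
    """Yield inclusive (start, end) byte ranges covering a file of total_size bytes."""
    start = 0
    while start < total_size:
        end = min(start + part_size, total_size) - 1
        yield start, end
        start = end + 1
```

For example, `part_ranges(2500, 1000)` yields (0, 999), (1000, 1999), (2000, 2499) — the last part is simply whatever remains.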
-
perhaps this is one way; I don't understand it all, so I'll give you the link:
http://1997.webhistory.org/www.lists/www-talk.1994q2/0158.html
-
perhaps this is one way; I don't understand it all, so I'll give you the link:
http://1997.webhistory.org/www.lists/www-talk.1994q2/0158.html
i think he's just proposing.
-
doesn't IE7 support 4GB+ files?
Yes, it does.
BTW, as far as I know, ZIP doesn't support 2GB+ files (without the ZIP64 extension, at least).
-
If you ask me, downloading a file that large without a proper download manager is about as dodgy as driving a car with bald tyres. I guess in places without internet limits and low speeds it's not a problem, but here in Australia we have internet quotas; mine is 40GB a month, with a maximum speed of 1500k on my plan. Downloading a large file using the browser alone is... suicide... so I use a download manager to solve these issues.
I can see the advantages of this though. Splitting is handy for more than just old Windows computers running FAT32. I have a 320GB external HDD formatted in FAT32 for portability between operating systems, and I bumped into the 4GB limit trying to move a 6GB file the other day. :D
-
in a similar case: you have FAT32 and download a split file, but you can't rejoin it. very useful :)
-
in a similar case: you have FAT32 and download a split file, but you can't rejoin it. very useful
You should maybe add a 'tube of glue' option to hfs.
;D ;D
-
in a similar case: you have FAT32 and download a split file, but you can't rejoin it. very useful
You should maybe add a 'tube of glue' option to hfs.
;D ;D
lol, well I can't say I have ever unzipped a multi-part tar or zip, but I frequently do it for rars. But then I guess .rar is even more impossible with hfs.
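The 'tube of glue' joked about above is really just byte concatenation, which is why any tool (or a few lines of script) can rejoin the parts, FAT32 or not. A sketch (the function name is made up), the inverse of the splitting idea:

```python
def join_parts(part_paths, out_path, buffer_size=64 * 1024):
    """Concatenate the part files, in the given order, back into one file."""
    with open(out_path, "wb") as dst:
        for part_path in part_paths:
            with open(part_path, "rb") as src:
                while True:
                    chunk = src.read(buffer_size)
                    if not chunk:  # this part is exhausted, move to the next
                        break
                    dst.write(chunk)
```

On the command line the same thing is `copy /b part.001+part.002 whole` on Windows or `cat part.* > whole` on UNIX.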
-
From my side, as I often use HFS on UNIX, I prefer the tar format ;D
-
If you ask me, downloading a file that large without a proper download manager is about as dodgy as driving a car with bald tyres.
ROFL `n` lol @mars
-
Dear,
I think it's a good idea to implement this; rapidshare and other hosting sites use this method.
It's not a problem with IE or Firefox specifically; it's more a general problem of managing files bigger than X GB.
There is a limitation at the proxy level too: during all my tests, one Linux-based proxy cut the transfer at 2.00GB.
Even with Firefox and the DownThemAll plugin, I need to stop and restart the download manually, which is not user friendly.
Another comment about folder TAR: it's a very good idea, but if I have 25GB (such big folders) in a folder, it would be nice to be able to disable it.
Have a nice day
-
Another comment about folder TAR: it's a very good idea, but if I have 25GB (such big folders) in a folder, it would be nice to be able to disable it.
Select the folder, click the right mouse button and select Properties ... click on the "Flags" tab and deselect "Archivable" ;D
-
You are so fuzzking awesome luca69, where have you been!? :-*
-
there is a limitation at the proxy level too: during all my tests, one Linux-based proxy cut the transfer at 2.00GB.
Even with Firefox and the DownThemAll plugin, I need to stop and restart the download manually, which is not user friendly.
i wonder how many people are behind such a proxy.
i consider it an interesting feature, just less urgent than many others.
but HFS is open source, and anyone (skilled) can decide to spend time on this.
Another comment about folder TAR: it's a very good idea, but if I have 25GB (such big folders) in a folder, it would be nice to be able to disable it.
as luca says, it is enabled by default on the root, but you can decide to set it on specific folders only.