rejetto forum
Software => HFS ~ HTTP File Server => Topic started by: Johnway on October 24, 2003, 01:52:35 PM
-
I found that the Linux utility 'wget' cannot 100% download
a file from HFS. It always hangs at 98% or 99%.
I need to Ctrl-C to stop wget, but the downloaded file is correct
and missing no bytes!
My Linux version of Opera can complete downloads from HFS.
-
yes, i noticed it too
maybe wget waits until the server disconnects, but HFS does not force disconnection.
i just downloaded a Windows version of wget, so i can test it
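One way to check this theory is to look at the response headers HFS sends. If there is no Content-Length header and the connection stays open, wget has no way to know the body has ended. A sketch using wget's own header dump (the URL is a placeholder for your HFS address):

```shell
# --server-response (-S) prints the HTTP response headers;
# --spider fetches headers without saving the file.
# If Content-Length is present, wget knows the expected size
# even when the server keeps the connection open.
wget --server-response --spider http://192.168.0.10:8080/file.zip
```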
-
I have the Windows and Linux versions; both have the same problem.
-
wget ver 1.9 for Windows
seems to work OK if you use the switch
--no-http-keep-alive
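For example, the full command would look like this (the server address and filename are placeholders, not a real HFS instance):

```shell
# Disable HTTP keep-alive so wget sends "Connection: close" and
# treats the server closing the socket as the end of the transfer.
wget --no-http-keep-alive http://192.168.0.10:8080/file.zip
```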
-
You are correct: because HFS doesn't tell wget that the connection is closed, wget thinks that there is still data to retrieve.
http://manpages.ubuntu.com/manpages/lucid/man1/wget.1.html
Wget has been designed for robustness over slow or unstable network
connections; if a download fails due to a network problem, it will keep
retrying until the whole file has been retrieved. If the server
supports regetting, it will instruct the server to continue the
download from where it left off.
Solution:
run wget with the tries option and the no-clobber option to fix this...
-t number
--tries=number
Set number of retries to number. Specify 0 or inf for infinite
retrying. The default is to retry 20 times, with the exception of
fatal errors like "connection refused" or "not found" (404), which
are not retried.
-nc
--no-clobber
If a file is downloaded more than once in the same directory,
Wget’s behavior depends on a few options, including -nc. In
certain cases, the local file will be clobbered, or overwritten,
upon repeated download. In other cases it will be preserved.
When running Wget without -N, -nc, -r, or -p, downloading the same
file in the same directory will result in the original copy of file
being preserved and the second copy being named file.1. If that
file is downloaded yet again, the third copy will be named file.2,
and so on. (This is also the behavior with -nd, even if -r or -p
are in effect.) When -nc is specified, this behavior is
suppressed, and Wget will refuse to download newer copies of file.
Therefore, "no-clobber" is actually a misnomer in this
mode: it's not clobbering that's prevented (as the numeric
suffixes were already preventing clobbering), but rather the
multiple version saving that's prevented.
When running Wget with -r or -p, but without -N, -nd, or -nc, re-
downloading a file will result in the new copy simply overwriting
the old. Adding -nc will prevent this behavior, instead causing
the original version to be preserved and any newer copies on the
server to be ignored.
When running Wget with -N, with or without -r or -p, the decision
as to whether or not to download a newer copy of a file depends on
the local and remote timestamp and size of the file. -nc may not
be specified at the same time as -N.
Note that when -nc is specified, files with the suffixes .html or
.htm will be loaded from the local disk and parsed as if they had
been retrieved from the Web.
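Putting the two suggested options together, a command along these lines should do it (the URL is a placeholder; the retry count is just an example):

```shell
# Retry up to 5 times if the download fails; -nc stops wget from
# saving repeated attempts as file.zip.1, file.zip.2, and so on.
wget --tries=5 --no-clobber http://192.168.0.10:8080/file.zip
```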