rejetto forum

Software => HFS ~ HTTP File Server => Topic started by: everettt on September 07, 2017, 01:26:52 PM

Title: Unlimited Speed Hang
Post by: everettt on September 07, 2017, 01:26:52 PM
Setting HFS to the default unlimited speed results in either a disconnect or a hang. The server uses a wget command, and if I rate-limit wget there is no issue (wget ... --limit-rate=8000k ...). However, if I do not rate-limit the wget command or rate-limit HFS, the transfer quits without any indication of how or why. The point at which it stops is also not consistent.
When unlimited, the HFS graph reports speeds like 2873980 kbps and
the log reports speeds anywhere from 3 MB/s to 68 MB/s.

Can anyone help suggest what the problem might be and/or how to debug what the failure mechanism is?

Windows 7 Pro 64bit, SP1
Intel Core i5-2400 3.10 GHz
Ethernet Server Adapter X520-2 #2
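For concreteness, the two behaviors described above look like this (a sketch: the server address is the one used later in this thread, the file name is hypothetical, and `8000k` is roughly 8 MB/s):

```shell
# Rate-limited: completes reliably, but caps throughput at ~8 MB/s:
wget --limit-rate=8000k http://192.168.0.14/HP/deliverables/AddtlRpms/example.rpm

# Unlimited: the transfer quits or hangs at an inconsistent point:
wget http://192.168.0.14/HP/deliverables/AddtlRpms/example.rpm
```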
Title: Re: Unlimited Speed Hang
Post by: bmartino1 on September 15, 2017, 07:42:47 PM
To diagnose a bit further, especially with wget... (Are you using Windows GNU wget? Or a Linux box wget?)...

Either way, run Wireshark during the test again and watch the packets. They will tell you where and why it disconnected...

Wireshark would help diagnose that further...

http://filehippo.com/download_wireshark/32/
Title: Re: Unlimited Speed Hang
Post by: everettt on September 15, 2017, 08:07:14 PM
RHEL 6.5, 6.9, 7.3 versions of "wget".
I will give Wireshark a try to see if I can capture the failure.
Thanks
Title: Re: Unlimited Speed Hang
Post by: bmartino1 on September 17, 2017, 03:02:46 AM
I remember experimenting with that command: wget (http://manpages.ubuntu.com/manpages/wily/man1/wget.1.html)

https://tinyurl.com/ya45fhq2

and to make it run on HFS, I had to add a null speed option. Something I posted on the forum a long time ago.

it was an issue with wget, not HFS, and how it talked to the web site.

I think I had to force HTML with the option -F; there are also some wget options that can help diagnose, such as a log file...


----------------
now I remember, it was the no-clobber option, as HFS keeps sending the file...

-nc
       --no-clobber
           If a file is downloaded more than once in the same directory,
           Wget's behavior depends on a few options, including -nc.  In
           certain cases, the local file will be clobbered, or overwritten,
           upon repeated download.  In other cases it will be preserved.

           When running Wget without -N, -nc, -r, or -p, downloading the same
           file in the same directory will result in the original copy of file
           being preserved and the second copy being named file.1.  If that
           file is downloaded yet again, the third copy will be named file.2,
           and so on.  (This is also the behavior with -nd, even if -r or -p
           are in effect.)  When -nc is specified, this behavior is
           suppressed, and Wget will refuse to download newer copies of file.
           Therefore, "no-clobber" is actually a misnomer in this
           mode---it's not clobbering that's prevented (as the numeric
           suffixes were already preventing clobbering), but rather the
           multiple version saving that's prevented.

           When running Wget with -r or -p, but without -N, -nd, or -nc, re-
           downloading a file will result in the new copy simply overwriting
           the old.  Adding -nc will prevent this behavior, instead causing
           the original version to be preserved and any newer copies on the
           server to be ignored.

           When running Wget with -N, with or without -r or -p, the decision
           as to whether or not to download a newer copy of a file depends on
           the local and remote timestamp and size of the file.  -nc may not
           be specified at the same time as -N.

           Note that when -nc is specified, files with the suffixes .html or
           .htm will be loaded from the local disk and parsed as if they had
           been retrieved from the Web.
Title: Re: Unlimited Speed Hang
Post by: everettt on September 20, 2017, 03:23:04 PM
Wireshark results:
Wireshark hangs with "not responding".
HFS hangs with no message; connections remain open.
The Linux server hangs in the middle of a file; issuing a CRLF results in a new blank line, as the job has not completed.
The state appears to be indefinite, or at least as long as I was willing to wait.
If I switch off HFS, the server terminates the running job, but Wireshark remains stuck and had to be terminated with Task Manager. There was some erratic and intermittent behavior where it appeared to be trying to display more packets, but I could not put up with it any longer.
wget options example:
-r -nH -l1 --no-verbose --no-parent --directory-prefix=/abc --reject "index.html*" --accept rpm http://192.168.0.14/HP/deliverables/AddtlRpms/

The issue is still open and unexplained.

Title: Re: Unlimited Speed Hang
Post by: everettt on September 20, 2017, 04:27:40 PM
This appears to be where it all starts in the log file:
Read error (Connection reset by peer) in headers.

After which I get a long list of "connection refused" messages.
"-F" made no difference.

Still looking for answers.
Title: Re: Unlimited Speed Hang
Post by: everettt on September 20, 2017, 06:45:15 PM
Getting closer but still no solution.
What I find is that the server sends a request, but HFS never answers. If I use F4 to stop and start HFS, the transfer continues. However it is not long before I get another failure. I suspect that the default connection timeout is much longer than I ever wanted to wait so I never realized what was happening. Now that I have had a chance to experiment I can make it worse but I cannot seem to make it much better.
I also suspect that, because the unlimited rate is very high, the Windows OS contributes to the fact that HFS may not be getting every single request.
So the question is:
How do I set the wget command parameters such that I am resending requests to HFS in a way that is not confused or locked up due to connections not being terminated and restarted in a way that both wget and HFS understand?

I have followed a few examples I found using Google but as I said, I can make it worse but I can't seem to make it better.
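One direction for that experimentation, sketched with standard GNU wget options (whether they cure this particular hang is untested here): a short `--timeout` makes wget abandon a stalled connection quickly instead of waiting out the long default, `--tries`/`--waitretry` govern the retry loop, `--retry-connrefused` keeps retrying a server that briefly refuses connections, and `-c` resumes partial files.

```shell
wget -r -nH -l1 --no-verbose --no-parent \
     --timeout=15 --tries=20 --waitretry=2 --retry-connrefused -c \
     --directory-prefix=/abc --reject "index.html*" --accept rpm \
     http://192.168.0.14/HP/deliverables/AddtlRpms/
```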

Title: Re: Unlimited Speed Hang
Post by: bmartino1 on September 23, 2017, 02:28:05 AM
Hmm, many things are going on here; usually this is where Mars / others hop in.

Need a few more details for testing...

What I'd recommend next is this wget option:
--retry-connrefused
           Consider "connection refused" a transient error and try again.
           Normally Wget gives up on a URL when it is unable to connect to the
           site because failure to connect is taken as a sign that the server
           is not running at all and that retries would not help.  This option
           is for mirroring unreliable sites whose servers tend to disappear
           for short periods of time.
Title: Re: Unlimited Speed Hang
Post by: bmartino1 on September 23, 2017, 02:33:10 AM
I need your full wget line that you're trying to download with.
Specifically the option and file name/type

Are you using an edited template in HFS? Or using the default config?

Finally, on Linux, is your download path write-capable? Just a download, not running over a stream (as this would require more/other setup)...

The error you started to post vs the error you have now (including the Wireshark issue) leads me to believe this is a Windows PC problem.

I'm surprised that Wireshark is giving you issues...
Title: Re: Unlimited Speed Hang
Post by: everettt on September 23, 2017, 11:38:40 AM
Already tried "--retry-connrefused", did not work.
Already posted complete command "-r -nH -l1 --no-verbose --no-parent --directory-prefix=/abc --reject "index.html*" --accept rpm http://192.168.0.14/HP/deliverables/AddtlRpms/".
Wireshark was a waste of time because it too had trouble keeping up with the high data rate.
My best troubleshooting tool was to set verbose and watch the activity on the Linux side.
The closest I have been to a working setup was when I figured out that using "F4" to stop then restart HFS allowed HFS to restart and eventually finish (many "F4"s).
If I insert the rate limit option in the command line "--limit-rate=8000k", I have no issues at all except the entire transfer takes ~45 minutes.
Title: Re: Unlimited Speed Hang
Post by: everettt on September 23, 2017, 12:26:03 PM
Adding:
"The error you started to post vs the error you have now"
The issue has not changed "using unlimited rate results in hang".
I have tried many things, including changes to "wget" and changes to the HFS configuration.
I have been able to make it worse, but I have not yet been able to make it any better.

Here is my current theory:
Linux sends the "wget" request.
Transfers begin, but at some point the PC/HFS is so busy it is not able to respond to a request.
The Linux "wget" timeout after a request is 900 seconds. This is the "hang".
Recently I found that if I reset HFS, Linux realizes the connection dropped, reissues the request, and transfers begin again.

I realize that this is something that can happen. I also realize that this is something that can be managed like any other data transport method "start - fail - retry - loop until done".
In this case I just think that the failure is statistically common instead of statistically rare. In a statistically rare case the defaults work because a failure should not occur; this is why the rate-limit case works flawlessly. In a statistically common case the defaults no longer work (waiting 900 seconds is completely unrealistic).

If I knew about all of the controls and how they work, for both "wget" and HFS, I might be able to find a combination that works. This is what I am looking for: some assistance in finding a combination of controls that will retry immediately, making the assumption that HFS missed the request and will never respond, so the request needs to terminate and retry. I would be willing to share a remote session with anyone willing to provide this level of assistance.
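The start-fail-retry-loop-until-done pattern described above can also be forced from the client side with a small wrapper script. A minimal sketch (the function name, retry budget, and example file are illustrative, not from this thread; with wget you would pass `-c` so each retry resumes the partial file instead of starting over):

```shell
#!/bin/sh
# Run a download command repeatedly until it succeeds or the
# retry budget is exhausted. Assumes the command reports failure
# via a non-zero exit status, as wget does.
fetch_with_retries() {
    tries=0
    max_tries=10
    until "$@"; do
        tries=$((tries + 1))
        if [ "$tries" -ge "$max_tries" ]; then
            echo "giving up after $tries failed attempts" >&2
            return 1
        fi
        sleep 1  # brief pause before reissuing the request
    done
}

# Hypothetical usage against the server from this thread:
# fetch_with_retries wget -c --timeout=15 http://192.168.0.14/HP/deliverables/AddtlRpms/example.rpm
```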

Some history: I ran into this issue more than a year ago using an older version of HFS. I made many similar attempts to get it working, and after every attempt failed I was OK with the rate-limit option because the total transfer size was small enough. This issue has been experienced in many locations using different PCs (Windows 7 Pro) running different versions of HFS and using different Linux servers running different versions of RHEL (6.5, 6.9, 7.2, 7.3).
Title: Re: Unlimited Speed Hang
Post by: bmartino1 on September 26, 2017, 11:47:23 AM
I will be looking into it; I'm currently away from a PC. From your wget line, it looks like you were targeting the whole folder, not just a specific file.
Title: Re: Unlimited Speed Hang
Post by: everettt on September 26, 2017, 12:08:15 PM
Yes, targeting a directory with about 33GB of data. This is why I am interested in solving the unlimited speed issues. Ultimately there will be as many as 20 servers running in parallel that will all want to get files from HFS.
Title: Re: Unlimited Speed Hang
Post by: bmartino1 on September 26, 2017, 04:23:20 PM
Ok, I think Mars / rejetto would be better at explaining why HFS targeting a directory causes this issue. I would have you test with *.* at the end so it targets all the files, not the folder. Still going to test when I can. Apache / a Windows folder share (the special tech term used in mapping network drives over the internet) would be better...


HFS is a file server, not a web server. HFS uses its file protocols through Pascal HTTP coding. I don't know if there's an easy fix.
Title: Re: Unlimited Speed Hang
Post by: everettt on September 26, 2017, 04:38:56 PM
Using *.* at the end.
-r -nH -l1 --no-verbose --no-parent --directory-prefix=/abc --reject "index.html*" --accept rpm http://192.168.0.14/HP/deliverables/AddtlRpms/*.*

Connecting to 192.168.0.14:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2017-09-26 12:32:43 ERROR 404: Not Found.

Warning: wildcards not supported in HTTP.

I'll see if I can find the HTTP verses other mode options to see if there are any alternatives that might work.
Title: Re: Unlimited Speed Hang
Post by: Fysack on September 30, 2017, 10:30:35 PM
where are u from everettt?
Title: Re: Unlimited Speed Hang
Post by: everettt on September 30, 2017, 11:10:42 PM
Boston Area
I tried the wget FTP option with FileZilla. After some tweaking of FileZilla I was able to get some good performance, but not as good as with HTTP in HFS, assuming the hang can be resolved.
Title: Re: Unlimited Speed Hang
Post by: Fysack on October 06, 2017, 09:31:23 PM
Nice :-* ...I was more thinking of the cyberspace aspect though ;D ;D ;D
Cool brother (brofist emoji)
You are into it, I like it, respect my man 8)
Title: Re: Unlimited Speed Hang
Post by: rejetto on November 05, 2017, 01:18:25 PM
hi everettt, sorry for the late reply.
I made a few experiments, using wget for Windows 1.8.2 that I had, and I also experienced problems going over 8000k; 9000k is like having no limit.
Then i updated to 1.19.2 and the problem was gone.
So, i invite you, for a start, to check your version and possibly update.

I've experienced some problems going with very high speeds, anyway.
If the speed is very high, the GUI stops responding while the download is going, like the CPU is totally dedicated to the transfer. That's bad, but not fatal.
Sadly HFS stops responding to requests after some transfers like that. I've found that quitting is not necessary, just switch off/on.

I suspect the exact procedure for the problem is to make a request while HFS is totally hang for another transfer. The problem doesn't occur if the new request comes after the "heavy" one is done.
Title: Re: Unlimited Speed Hang
Post by: Fysack on July 21, 2018, 12:15:06 AM
rejetto   :-*
Title: Re: Unlimited Speed Hang
Post by: everettt on July 31, 2018, 06:11:09 PM
Any progress on this topic? I am currently running with a rate limited wget command "wget -r -nH -l1 --no-verbose --no-parent --limit-rate=8000k" to avoid the hang. Using FileZilla I get much better performance without the hang but using HFS would be my preference.
Title: Re: Unlimited Speed Hang
Post by: bmartino1 on August 01, 2018, 12:52:10 PM
Any progress on this topic? I am currently running with a rate limited wget command "wget -r -nH -l1 --no-verbose --no-parent --limit-rate=8000k" to avoid the hang. Using FileZilla I get much better performance without the hang but using HFS would be my preference.


I will probably use a fresh-download VM of theirs:
https://developers.redhat.com/blog/2016/03/31/no-cost-rhel-developer-subscription-now-available/

And do the tests via the virtual machine to my personal PC.

Will be using :
Red Hat Enterprise Linux 7.5.0

And will share the wget version and test results at a later time/date.

--- I have the VM ready, and hit my own issues last time when testing (I don't have that much content to download, and the tests I have done are working currently). I'd still suspect an update issue.

maybe update your wget:
https://www.ibm.com/support/knowledgecenter/en/SSUFR9_2.1.1/com.ibm.swg.ba.cognos.zcap_sol.2.1.1.doc/t_zcap_sol_redhat_utilities.html

Code:
sudo yum whatprovides /usr/bin/wget
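Following rejetto's earlier finding that old wget versions (1.8.2 vs. 1.19.2) show this symptom, a quick check is to read the first line of `wget --version`. A sketch; a sample banner string stands in here for the live command, and the exact banner wording can vary by build:

```shell
#!/bin/sh
# On a live system:  wget --version | head -n1
# The banner's third field is the version number.
banner="GNU Wget 1.19.2 built on linux-gnu."
version=$(printf '%s\n' "$banner" | awk '{print $3}')
echo "$version"
```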
Title: Re: Unlimited Speed Hang
Post by: rejetto on August 15, 2018, 04:34:45 PM
Any progress on this topic? I am currently running with a rate limited wget command "wget -r -nH -l1 --no-verbose --no-parent --limit-rate=8000k" to avoid the hang. Using FileZilla I get much better performance without the hang but using HFS would be my preference.

did you update wget as suggested in my last post?
Title: Re: Unlimited Speed Hang
Post by: Fysack on October 26, 2018, 08:57:02 PM
yeah, as long as we don't do https, I think we contribute to the environment. gung-ho. Money. Fake news. Armageddon