rejetto forum

Show Posts



Messages - bmartino1

886
According to the IP address you posted, it sounds like you are behind some form of switch that is using NAT to hand out your IP address. I would recommend using a host name, such as a dyndns.org host name, and adding it to your HFS, so the site is forwarded to HFS by the dynamic DNS updater...

If you're still having problems, then it is either an internal network error in how you are connecting to the internet, or an ISP issue with the ports where your connection comes in.

887
Go to this website:

http://www.ipchicken.com/

The address it shows is your external IP address; it can also be found in your router...

888
Letting you know that my account listed here will no longer be accessible...

889
HFS ~ HTTP File Server / POLL: Curiosity to the world?
« on: November 19, 2013, 02:45:24 AM »
Haven't seen a poll, so I thought I'd post one...

890
HFS ~ HTTP File Server / Re: Realm
« on: November 19, 2013, 02:30:13 AM »
I read this post and your link. I too wondered what this was, as I never tested it or attempted to use it.

The post is not clear to me; is there documentation on it?

891
HFS ~ HTTP File Server / Re: download will not stop when using wget
« on: November 19, 2013, 02:26:54 AM »
You are correct: because HFS doesn't tell wget that the connection is closed, wget thinks there is still data to retrieve.

http://manpages.ubuntu.com/manpages/lucid/man1/wget.1.html

       Wget has been designed for robustness over slow or unstable network
       connections; if a download fails due to a network problem, it will keep
       retrying until the whole file has been retrieved.  If the server
       supports regetting, it will instruct the server to continue the
       download from where it left off.

Solution:
Run wget with the --tries option and the --no-clobber option to fix this (example after the excerpts below)...

   -t number
       --tries=number
           Set number of retries to number.  Specify 0 or inf for infinite
           retrying.  The default is to retry 20 times, with the exception of
           fatal errors like "connection refused" or "not found" (404), which
           are not retried.
   -nc
       --no-clobber
           If a file is downloaded more than once in the same directory,
           Wget’s behavior depends on a few options, including -nc.  In
           certain cases, the local file will be clobbered, or overwritten,
           upon repeated download.  In other cases it will be preserved.

           When running Wget without -N, -nc, -r, or -p, downloading the same
           file in the same directory will result in the original copy of file
           being preserved and the second copy being named file.1.  If that
           file is downloaded yet again, the third copy will be named file.2,
           and so on.  (This is also the behavior with -nd, even if -r or -p
           are in effect.)  When -nc is specified, this behavior is
           suppressed, and Wget will refuse to download newer copies of file.
           Therefore, "no-clobber" is actually a misnomer in this
           mode---it’s not clobbering that’s prevented (as the numeric
           suffixes were already preventing clobbering), but rather the
           multiple version saving that’s prevented.

           When running Wget with -r or -p, but without -N, -nd, or -nc, re-
           downloading a file will result in the new copy simply overwriting
           the old.  Adding -nc will prevent this behavior, instead causing
           the original version to be preserved and any newer copies on the
           server to be ignored.

           When running Wget with -N, with or without -r or -p, the decision
           as to whether or not to download a newer copy of a file depends on
           the local and remote timestamp and size of the file.  -nc may not
           be specified at the same time as -N.

           Note that when -nc is specified, files with the suffixes .html or
           .htm will be loaded from the local disk and parsed as if they had
           been retrieved from the Web.
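
Putting the two options together against an HFS link, the call would look something like this (the URL and retry count are just placeholders):

   wget -t 3 -nc http://yourserver.example.com/somefile.zip

With --tries capped, wget gives up after a few attempts instead of hammering the closed connection, and -nc keeps it from re-saving the same file as file.1, file.2, and so on.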
-------------------

892
HFS ~ HTTP File Server / Re: Need some help please
« on: November 19, 2013, 02:21:53 AM »
I would recommend using a free DynDNS account to put an address on the computer (or vice versa),

or using the Windows hosts file to fix this issue: mapping your HFS machine to a custom name.
(This would have to be done on each machine, and you would need an IP address dedicated to the machine running HFS; a sample entry follows the links below.)

http://en.wikipedia.org/wiki/Hosts_(file)

http://tinyurl.com/2f6w57f
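
As a rough illustration (the IP and name below are made up; substitute whatever address you dedicate to the HFS machine), the line you add to C:\Windows\System32\drivers\etc\hosts would look like:

   192.168.1.50    myhfsbox

After that, typing http://myhfsbox/ in a browser on that machine resolves straight to the HFS server.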

893
As for the JavaScript: I believe you are talking about a RAWR template, or looking at the default HFS template, which uses macros, and mistaking it for JavaScript...

Yes, you can run HFS under Ubuntu, but it can be hard to set up traffic through the network interfaces.
Also, since HFS was designed as a Windows executable, some features will not work properly, or at all.

I have used HFS with Ubuntu, but since Ubuntu has its own native tools and man pages, I would recommend using a "LAMP" stack instead (a one-line install example follows the link):
http://www.howtoforge.com/ubuntu_lamp_for_newbies
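
For example, on Ubuntu the whole LAMP stack can usually be pulled in with a single command (the trailing caret tells apt to install the tasksel "lamp-server" task; check the guide above for your release):

   sudo apt-get install lamp-server^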

894
Windows Media Player doesn't support an "auth" (authenticated) stream without some major fiddling; use VLC for that.

Also, you can do it with HTML coding and a playlist file. Have you ever heard of an "*.M3U"?
An M3U is a pre-made list of songs that you build with a program or Notepad...
Using an M3U file helps stream multiple files, and you can put the authentication in the file itself: you create a list of music, put it in an M3U file, and the M3U becomes the single file that you play in a media player (a small sample playlist is sketched after the Wikipedia link below)...

(Yes, these examples cover music, but video formats carry over with them as well...)

http://en.wikipedia.org/wiki/M3U
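
As a rough sketch (host name, credentials, durations, and file names are all made up), an extended M3U that embeds the login in each URL could look like this:

   #EXTM3U
   #EXTINF:210,Artist - First Song
   http://test:test@yourserver.example.com/music/first.mp3
   #EXTINF:185,Artist - Second Song
   http://test:test@yourserver.example.com/music/second.mp3

Some players strip or ignore the user:pass part of the URL, so test the list in VLC first.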

Here's a site that may work for you as well:

http://howto.wired.com/wiki/Stream_Your_Music_Online#Extended_M3U

please note:
--------------------------
Don't Feed the Lawyers:

You could be fined for broadcasting music/movies you don't own the copyright to, and in most cases the penalties are pretty severe. There are also plenty of rabid music industry lawyers with a history of going to great lengths to prosecute copyright violators. That said, be careful how you distribute your streams.
Some of the software solutions listed here have built-in password protection. Otherwise, you'll have to know some SSL and/or password protection web hosting configurations to lock down your stream.
Your other option -- and probably the safest -- is to just keep your station under wraps. You can listen to it freely, but don't go spreading the word. Remember, the internet has ears!
-------------------------------

895
Looks new and inventive; I appreciate the work you have done on it. I'm testing it on Windows 7 x86 Professional...

Upon first launching it (as I have stunnel already installed and working), it auto-loaded the stunnel and HFS config.

This HFS is nowhere near the same install path as my other HFS builds...
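
For anyone following along, stunnel usually fronts HFS with a service section along these lines in stunnel.conf (the ports and certificate name are assumptions; this build may generate its own):

   cert = stunnel.pem
   [https]
   accept  = 443
   connect = 127.0.0.1:80

The connect line should point at whatever port HFS is actually listening on.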

Looking forward to your SSL support embedded in HFS.

Thank you for your time and patience...

896
HFS ~ HTTP File Server / Re: no download
« on: November 18, 2013, 04:06:19 PM »
I have removed this setup and the links from my HFS server.

897
Bug reports / Re: wild cards and banned ip address
« on: November 18, 2013, 04:05:31 PM »

898
HFS ~ HTTP File Server / Re: no download
« on: November 18, 2013, 03:46:15 PM »
Okay, I have a start in the right direction. From the setup.png, I added the folder again and it is called view2:

http://bmartino1.dyndns.org/testing/view2/

This folder is doing what I want. My issue with this is resolved, as I can use this in my HTML coding and rearrange what I'm trying to do.

Thank you rejetto for your help and support.

899
HFS ~ HTTP File Server / Re: no download
« on: November 18, 2013, 03:42:55 PM »
Upon trying my idea, the folder's access permission takes over, since the "no download" flag will not allow the traffic; the iframe source shows "access denied".

900
HFS ~ HTTP File Server / Re: no download
« on: November 18, 2013, 03:35:17 PM »
I've set up a scenario on my version of HFS:
http://bmartino1.dyndns.org/testing/

(For the account that will need access:
username: test
password: test)

Okay, I might not be understanding what you are trying to tell me...

I (believe) I have done what you posted, but the HFS login still takes over the folder...
Upon adding the same folder, HFS renames it; I rename it back to the original name (see setup.png).

Issue: when browsing to the folder, the text document should be visible but not downloadable or readable until login.
Instead, browsing to the path brings up the login... and eventually the HFS restricted-access page... (see issue.png)

--------------------------------

I will try a new idea next, because I have them both under the same name:
using HTML and iframes with two different paths (roughly sketched below)...
-------------
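
Something along these lines (the folder paths are placeholders; the second one would be the folder HFS protects with the test/test login):

   <html>
     <body>
       <!-- open folder: browsable by anyone -->
       <iframe src="/testing/view2/" width="100%" height="300"></iframe>
       <!-- restricted folder: HFS asks for the login -->
       <iframe src="/testing/view/" width="100%" height="300"></iframe>
     </body>
   </html>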

Thank you for your help and support
