Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - bmartino1

Pages: 1 ... 53 54 [55] 56
811
HFS ~ HTTP File Server / Re: Folder Names not being shown
« on: December 04, 2013, 04:27:42 PM »
I would suspect your template file is causing this issue, or that you are adding "empty" folders and/or adding folders from all over the computer instead of keeping them all in a structured folder tree...

I would ask that you post the template you are using:
Menu > HTML template > Edit (I'm using the beta build, so some menu options may not be available...)
or press F6.

Notepad or HFS will open with the template you are using. Posting it will let me and other forum members see whether the macros are causing the issue, due to editing or a bad template download...

and/or try using another template:
Here is one that works for me:
Live template: https://drive.google.com/file/d/0B9u5dgydfOEuV3Q4WnZDbkxUTlk/edit?usp=sharing

812
router & port problems / Re: Advanced connectivity issue - Please help
« on: November 28, 2013, 12:52:09 AM »
I have this issue all the time...

The issue occurs when HFS runs its self-test to check whether the ports are open: the test traffic goes out, and the router where HFS sits is able to pass it through...,
but not all the packets sent are coming back, because the DMZ-enabled router sends them up and back between the routers instead of returning them to the HFS machine.

HFS is still working, but running the test may change your port number...

Packet path: HFS machine (TTL: time-to-live) > your network > ISP network

Your setup, from your post:
HFS machine > (router 1) the router with ports open > (router 2) Comcast equipment router, with DMZ pointed at router 1

Issue: HFS sends a packet to check whether the port is open on router 1, while the reply packets are being forwarded to router 2...
Solution: unknown; HFS is still working thanks to the DMZ/port-forwarding rules...
http://en.wikipedia.org/wiki/DMZ_(computing)

Some networking ingenuity can fix this... but it's not really worth the effort...
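To illustrate why the self-test can mislead here: a connect test run from inside the network only proves the service is listening, not that it is reachable from the internet through both routers. Below is a minimal sketch (my own illustration, not HFS's actual self-test code) of such a local TCP connect check:

```python
import socket

def port_locally_reachable(port, host="127.0.0.1", timeout=2.0):
    """Try a plain TCP connect to host:port.

    A success only proves something is listening at that address --
    it says nothing about whether the port is reachable from outside
    a double-NAT / DMZ setup, which is exactly why a self-test like
    this can disagree with what external clients see.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

A real external-reachability check would have to originate from a host outside both routers.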

813
According to the IP address you posted, it sounds like you are behind some form of switch that is using NAT to hand you an IP address. I would recommend adding a web name to your HFS, such as a dyndns.org host name, as the site will then be forwarded to HFS by the dynamic DNS updater...

If you're still having problems, then it is either an internal network error in how you are connecting to the internet, or an ISP error on the ports where your connection comes in.

814
go to this website:

http://www.ipchicken.com/

The address shown is your external IP address; it can also be found in your router...
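If you want to look up the external address programmatically rather than by visiting the page, a "what is my IP" service can be queried and the address parsed out of the response. A small sketch (the service URL is an assumption on my part; api.ipify.org is one such plain-text service, not something ipchicken offers as an API):

```python
import re
import urllib.request

# matches the first IPv4-looking dotted quad in a chunk of text
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def extract_ip(text):
    """Pull the first IPv4-style address out of a page of text."""
    m = IP_RE.search(text)
    return m.group(0) if m else None

def external_ip(url="https://api.ipify.org"):
    """Fetch the external address from a what-is-my-IP service.

    api.ipify.org returns the address as plain text; for an HTML
    page like ipchicken.com the regex does the digging.
    """
    with urllib.request.urlopen(url, timeout=5) as resp:
        return extract_ip(resp.read().decode("utf-8", "replace"))
```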

815
Letting you know that my account listed here will no longer be accessible...

816
HFS ~ HTTP File Server / POLL: Curiosity to the world?
« on: November 19, 2013, 02:45:24 AM »
Haven't seen a poll, so I thought I'd post one...

817
HFS ~ HTTP File Server / Re: Realm
« on: November 19, 2013, 02:30:13 AM »
I read this post and your link; I too wondered what this was, as I never tested or attempted it.

The post is not clear to me. Is there documentation on it?

818
HFS ~ HTTP File Server / Re: download will not stop when using wget
« on: November 19, 2013, 02:26:54 AM »
You are correct: because HFS doesn't tell wget that the connection is closed, wget thinks there is still data to retrieve.

http://manpages.ubuntu.com/manpages/lucid/man1/wget.1.html

Wget has been designed for robustness over slow or unstable network
       connections; if a download fails due to a network problem, it will keep
       retrying until the whole file has been retrieved.  If the server
       supports regetting, it will instruct the server to continue the
       download from where it left off.

Solution:
run wget with the --tries and --no-clobber options to fix this...

   -t number
       --tries=number
           Set number of retries to number.  Specify 0 or inf for infinite
           retrying.  The default is to retry 20 times, with the exception of
           fatal errors like "connection refused" or "not found" (404), which
           are not retried.
   -nc
       --no-clobber
           If a file is downloaded more than once in the same directory,
           Wget’s behavior depends on a few options, including -nc.  In
           certain cases, the local file will be clobbered, or overwritten,
           upon repeated download.  In other cases it will be preserved.

           When running Wget without -N, -nc, -r, or -p, downloading the same
           file in the same directory will result in the original copy of file
           being preserved and the second copy being named file.1.  If that
           file is downloaded yet again, the third copy will be named file.2,
           and so on.  (This is also the behavior with -nd, even if -r or -p
           are in effect.)  When -nc is specified, this behavior is
           suppressed, and Wget will refuse to download newer copies of file.
           Therefore, "no-clobber" is actually a misnomer in this
           mode---it’s not clobbering that’s prevented (as the numeric
           suffixes were already preventing clobbering), but rather the
           multiple version saving that’s prevented.

           When running Wget with -r or -p, but without -N, -nd, or -nc, re-
           downloading a file will result in the new copy simply overwriting
           the old.  Adding -nc will prevent this behavior, instead causing
           the original version to be preserved and any newer copies on the
           server to be ignored.

           When running Wget with -N, with or without -r or -p, the decision
           as to whether or not to download a newer copy of a file depends on
           the local and remote timestamp and size of the file.  -nc may not
           be specified at the same time as -N.

           Note that when -nc is specified, files with the suffixes .html or
           .htm will be loaded from the local disk and parsed as if they had
           been retrieved from the Web.
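The `--tries` behavior quoted above boils down to a bounded retry loop: attempt the download, and give up after the budget is spent instead of retrying forever. A minimal sketch of that idea (my own illustration, not wget's actual code):

```python
import time

def download_with_retries(fetch, tries=20, delay=1.0):
    """Mimic wget's --tries: call fetch() until it succeeds or
    the retry budget is exhausted, instead of retrying forever.

    fetch is any callable that returns the downloaded data or
    raises OSError on a network failure.
    """
    last_err = None
    for attempt in range(1, tries + 1):
        try:
            return fetch()
        except OSError as err:
            last_err = err
            time.sleep(delay)  # back off before the next attempt
    raise last_err
```

With a cap like this, a server that never closes the connection cleanly can only cost you a fixed number of attempts rather than an endless loop.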
-------------------

819
HFS ~ HTTP File Server / Re: Need some help please
« on: November 19, 2013, 02:21:53 AM »
I would recommend using a free DynDNS account to put an address to the computer, and/or vice versa,

or fixing this via the Windows hosts file: mapping your HFS machine to a custom name.
(This would have to be done on every machine, and you would need to dedicate an IP address to the machine running HFS.)

http://en.wikipedia.org/wiki/Hosts_(file)

http://tinyurl.com/2f6w57f
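For reference, a hosts-file entry for this would look like the following (the address and name here are made-up examples; use the fixed LAN IP you dedicate to the HFS machine):

```
# C:\Windows\System32\drivers\etc\hosts  (or /etc/hosts on Linux)
# map a fixed LAN address to a friendly name for the HFS machine
192.168.1.50    hfsbox
```

Every client machine needs the same line, which is why a DynDNS name is usually less hassle.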

820
As for the JavaScript: I believe you are talking about a RAWR template, or you are looking at the default HFS template that uses macros and mistaking those for JavaScript...

Yes, you can run HFS under Ubuntu, but it can be hard to set up traffic routing with the interfaces.
Also, since HFS was designed as a Windows executable, some features will not work properly, or at all.

I have used HFS with Ubuntu, but since Ubuntu has man pages, I would recommend using LAMP instead:
http://www.howtoforge.com/ubuntu_lamp_for_newbies

821
Windows Media Player doesn't support an authenticated stream (without some major fiddling); use VLC for that.

Also, you can do it with HTML coding and a playlist file. Have you ever heard of an "*.M3U"?
An M3U is a pre-made list of songs that you build with a program or Notepad...
Using an M3U file helps stream multiple files, and you can put the authentication in the file itself. You create a list of music and put it in an M3U file; the M3U becomes the single file that you play in a media player...

(Yes, these examples cover music, but video formats carry over with them as well...)

http://en.wikipedia.org/wiki/M3U

Here's a site that may work for you as well:

http://howto.wired.com/wiki/Stream_Your_Music_Online#Extended_M3U
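A minimal Extended M3U playlist looks like this (the server name, port, and track paths below are hypothetical placeholders; point them at files your HFS is serving):

```
#EXTM3U
#EXTINF:215,Artist - First Track
http://yourserver.example:8080/music/track1.mp3
#EXTINF:187,Artist - Second Track
http://yourserver.example:8080/music/track2.mp3
```

Each `#EXTINF` line carries the track length in seconds and a display title; the media player then fetches each URL in turn, so one small playlist file drives the whole stream.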

please note:
--------------------------
Don't Feed the Lawyers:

You could be fined for broadcasting music/movies you don't own the copyright to, and in most cases the penalties are pretty severe. There are also plenty of rabid music industry lawyers with a history of going to great lengths to prosecute copyright violators. That said, be careful how you distribute your streams.
Some of the software solutions listed here have built-in password protection. Otherwise, you'll have to know some SSL and/or password protection web hosting configurations to lock down your stream.
Your other option -- and probably the safest -- is to just keep your station under wraps. You can listen to it freely, but don't go spreading the word. Remember, the internet has ears!
-------------------------------

822
Looks new and inventive; I appreciate the work you have done on it. I'm testing it on Windows 7 x86 Professional...

Upon first launching it (as I already have stunnel installed and working), it auto-loaded the stunnel and HFS config.

This HFS is nowhere near the same install path as my other HFS builds...

Looking forward to your SSL embedded with HFS.

Thank you for your time and patience...

823
HFS ~ HTTP File Server / Re: no download
« on: November 18, 2013, 04:06:19 PM »
I have removed this setup and the links from my HFS server.

824
Bug reports / Re: wild cards and band ip address
« on: November 18, 2013, 04:05:31 PM »

825
HFS ~ HTTP File Server / Re: no download
« on: November 18, 2013, 03:46:15 PM »
Okay, I have a start in the right direction. From the setup.png, I added it again and the folder is called view2:

http://bmartino1.dyndns.org/testing/view2/

This folder is doing what I want with it. My issue with this is resolved, as I can use it in my HTML coding and rearrange what I'm trying to do.

Thank you rejetto for your help and support.
