rejetto forum

Show Posts


Messages - bmartino1

Pages: 1 ... 57 58 59 60
HFS ~ HTTP File Server / Re: Automatic download
« on: January 31, 2014, 04:43:09 PM »
Yes, code similar to this.

The line placed in the header is what you need:
<meta http-equiv="refresh" content="0; URL=">

Break down:
meta http-equiv="refresh" is a command to hit the refresh button on the webpage, but pointed at a specific page...
content="0; URL=..." — the number is how many seconds will pass before the page refreshes; 0 means as soon as possible. Note the URL goes inside the content attribute, after the delay...
URL is the path to the file you want to send for download...

HTML code (for example, to transfer from http to https... (LOLZ)):
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 3.2//EN">
<meta http-equiv="refresh" content="0; URL=">
<p>redirecting you to <a href=""></a></p>

LOLZ... hope this helps

HFS ~ HTTP File Server / Re: Webcam on HFS
« on: January 31, 2014, 04:34:19 PM »
You're wanting advanced code; here's a start in the right direction...

When you save to the registry, what is the path in the registry it saves to?

I ask because I run multiple HFS instances at one time, and I clicked "save to registry" instead of to the ini file. Now when my other HFS instances load, they load the settings that were saved to the registry. I want to delete that registry entry so that, on machine startup, they load their own ini files instead of shutting down the server because they loaded the registry settings.

Just looking for the path in regedit... any help would be appreciated.
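While waiting on the exact path, a hedged sketch of how you might hunt for it yourself from a command prompt: search the current-user hive for keys mentioning HFS (the exact key name is the unknown here), and once found, export a backup before deleting anything. The key name below is a placeholder, not confirmed:

```
rem search the per-user registry hive for anything named like "hfs"
reg query HKCU\Software /f "hfs" /s

rem once you know the key: export a backup, then delete it
reg export "HKCU\Software\<the key you found>" hfs-backup.reg
reg delete "HKCU\Software\<the key you found>" /f
```

With the key gone, each instance should fall back to its own ini file on the next startup.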

FHFS / Re: How to hide delete account option
« on: December 31, 2013, 05:07:13 PM »
That's more of a question for a programmer, as it is "hard wired" in the GUI (some template macro coding can fix this...). The question is: what are you trying to do that you need to disable account deletion?

HFS ~ HTTP File Server / Re: Folder Names not being shown
« on: December 04, 2013, 04:27:42 PM »
I would suspect your template file is causing this issue, or that you are adding "empty" folders and/or adding folders from all over the computer instead of keeping them all in a structured folder tree...

I would ask that you post the template you are using:
Menu > HTML template > edit (I'm using the beta build, so some menu options may not be available...)
or press the F6 key.

Notepad or HFS will open with the template you are using; posting it will let me/other forum members see if the macros are causing the issue, due to editing and/or a bad template download...

And/or try using another template. Here is one that works for me:
Live template:

router & port problems / Re: Advanced connectivity issue - Please help
« on: November 28, 2013, 12:52:09 AM »
I have this issue all the time...

The issue occurs when HFS runs its self-test to check whether the ports are open and whether traffic can get through the router where HFS sits, but not all the packets being sent come back: the DMZ-enabled router sends them up and back between the routers instead of back to the HFS machine.

HFS is still working; running the test may change your port number...

HFS machine: Time-to-Live (TTL) > your network > ISP network

Your setup, from your post:
HFS machine > (router 1) the router with ports open > (router 2) Comcast equipment router, DMZ'd to router 1

Issue: HFS sends a packet to check whether the port is open on router 1, but while it waits for a response the packets are being forwarded to router 2...
Solution: unknown. HFS is still working due to the DMZ port forwarding rules...

Some networking ingenuity can fix this... but it's not really worth the effort...

According to the IP address you posted, it sounds like you are behind some form of switch that uses networking/NAT protocols to give you an IP address. I would recommend getting a web name and adding it to your HFS as a host name, so the site will be forwarded to HFS using the dynamic DNS updater...

If you're still having problems, then it is either an internal network error in how you are connecting to the internet, or an ISP error on the ports where your connection comes in.

Go to this website:

The address is your external IP address; it can also be found in your router...

Letting you know: my account listed here will no longer be accessible...

HFS ~ HTTP File Server / POLL: Curiosity to the world?
« on: November 19, 2013, 02:45:24 AM »
Haven't seen a poll, thought I'd post one...

HFS ~ HTTP File Server / Re: Realm
« on: November 19, 2013, 02:30:13 AM »
I read this post and your link. I too wondered what this was, as I never tested it or attempted to.

The post is not clear to me; is there documentation on it?

HFS ~ HTTP File Server / Re: download will not stop when using wget
« on: November 19, 2013, 02:26:54 AM »
You are correct: because HFS doesn't tell wget that the connection is closed, wget thinks that there is still data to retrieve.

Wget has been designed for robustness over slow or unstable network
       connections; if a download fails due to a network problem, it will keep
       retrying until the whole file has been retrieved.  If the server
       supports regetting, it will instruct the server to continue the
       download from where it left off.

Run wget with the tries option and the no-clobber option to fix this...

   -t number
           Set number of retries to number.  Specify 0 or inf for infinite
           retrying.  The default is to retry 20 times, with the exception of
           fatal errors like "connection refused" or "not found" (404), which
           are not retried.
           If a file is downloaded more than once in the same directory,
           Wget’s behavior depends on a few options, including -nc.  In
           certain cases, the local file will be clobbered, or overwritten,
           upon repeated download.  In other cases it will be preserved.

           When running Wget without -N, -nc, -r, or -p, downloading the same
           file in the same directory will result in the original copy of file
           being preserved and the second copy being named file.1.  If that
           file is downloaded yet again, the third copy will be named file.2,
           and so on.  (This is also the behavior with -nd, even if -r or -p
           are in effect.)  When -nc is specified, this behavior is
           suppressed, and Wget will refuse to download newer copies of file.
           Therefore, ""no-clobber"" is actually a misnomer in this
           mode---it’s not clobbering that’s prevented (as the numeric
           suffixes were already preventing clobbering), but rather the
           multiple version saving that’s prevented.

           When running Wget with -r or -p, but without -N, -nd, or -nc, re-
           downloading a file will result in the new copy simply overwriting
           the old.  Adding -nc will prevent this behavior, instead causing
           the original version to be preserved and any newer copies on the
           server to be ignored.

           When running Wget with -N, with or without -r or -p, the decision
           as to whether or not to download a newer copy of a file depends on
           the local and remote timestamp and size of the file.  -nc may not
           be specified at the same time as -N.

           Note that when -nc is specified, files with the suffixes .html or
           .htm will be loaded from the local disk and parsed as if they had
           been retrieved from the Web.
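The numeric-suffix behavior described above can be sketched like this (a toy illustration of the naming rule from the manual, not wget's actual code):

```python
def clobber_free_name(name, existing):
    """Return the name wget would save under when run without -N/-nc/-r/-p:
    the plain name if free, otherwise name.1, name.2, and so on."""
    if name not in existing:
        return name
    n = 1
    while f"{name}.{n}" in existing:
        n += 1
    return f"{name}.{n}"
```

So a third download of the same file in the same directory lands as file.2; adding -nc suppresses exactly this multiple-version saving.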

HFS ~ HTTP File Server / Re: Need some help please
« on: November 19, 2013, 02:21:53 AM »
I would recommend using a free DynDNS account to put an address to the computer, and/or vice versa.

Or use the Windows hosts file to fix this issue: point a custom name at your HFS machine.
(This would have to be done on all the machines, and you would need to know an IP address that you can dedicate to the machine running HFS.)
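For example, a hosts-file entry might look like this (the IP and the name "hfsbox" are made-up examples; on Windows the file lives at C:\Windows\System32\drivers\etc\hosts):

```
# maps a custom name to the LAN address you dedicated to the HFS machine
192.168.1.50    hfsbox
```

After that, http://hfsbox:port/ on any machine with this entry reaches the HFS box.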

As for the JavaScript: I believe you are talking about a RAWR template, or looking at the default HFS template that uses macros, and I believe you think it is JavaScript...

Yes, you can run HFS under Ubuntu, but it can be hard to set up traffic going through the interfaces.
Also, since HFS was designed as a Windows executable, some features will not work properly or not at all.

I have used HFS with Ubuntu, but it being Ubuntu, with man pages available, I would recommend using "LAMP" instead.

Windows Media Player doesn't support an "auth" stream (without some major fiddling); use VLC for it.

Also, you can do it with HTML coding and a playlist file. Have you ever heard of an "*.M3U"?
An M3U is a pre-made code list of songs that you put there via a program or Notepad...
Using an M3U file helps stream multiple files, and you can put authentication in the file itself. You create a list of music, put it in an M3U file, and the M3U becomes the single file that you play in a media player...
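A minimal M3U might look like this (the host name, port, and track names are made-up examples; point the URLs at your own HFS entries):

```
#EXTM3U
#EXTINF:215,Example Artist - Track One
http://myhfs.example.com:8080/music/track1.mp3
#EXTINF:189,Example Artist - Track Two
http://myhfs.example.com:8080/music/track2.mp3
```

Open that one file in VLC and it plays through the whole list.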

(Yes, these examples are about music, but video formats port over with them as well...)

Here's a site that may work for you as well:

please note:
Don't Feed the Lawyers:

You could be fined for broadcasting music/movies you don't own the copyright to, and in most cases the penalties are pretty severe. There are also plenty of rabid music industry lawyers with a history of going to great lengths to prosecute copyright violators. That said, be careful how you distribute your streams.
Some of the software solutions listed here have built-in password protection. Otherwise, you'll have to know some SSL and/or password protection web hosting configurations to lock down your stream.
Your other option -- and probably the safest -- is to just keep your station under wraps. You can listen to it freely, but don't go spreading the word. Remember, the internet has ears!
