rejetto forum

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Tsuna

Pages: 1
Not working. REEEEEEE

I want the file to be accessible only from a specific IP that I choose.

Like, a file in a folder can be accessed from that one IP, while no other IP can access the file.

I'm using the latest dev version and have hit a roadblock: I want to limit a folder so that only a specific IP can view/access it, but I've been unable to find anything like this in the docs or on the forum.

I think it may be possible via a script for that folder? Looking for some guidance on this.
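As a sketch of that script idea: assuming HFS 2.x macro syntax, something like this could go in the folder's "diff template" (192.168.1.10 is a placeholder for the allowed address), so every other client gets disconnected:

Code: [Select]
{.if not|{.match|192.168.1.10|%ip%.}|{:{.disconnect.}:}.}

This relies on the %ip% symbol and the match/disconnect macros; treat it as a starting point rather than a tested recipe.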

HFS ~ HTTP File Server / Re: [REQUEST]Show IP address by using forwarded IP
« on: December 21, 2020, 07:53:16 AM »
it was introduced in build #199.

+ show header(http_x_forwarded_for) instead of IP, but only if the IP is in a customizable ip-mask (::1 by default)

At the moment there's no graphical way to edit this mask; you must edit the ini file and search for the line starting with forwarded-mask.

You may even put * as the mask; that will enable the feature for every address.
This is not the default because the content of that header may be tampered with and contain anything rather than the real IP. It's up to you whether to trust what the proxy says.

When you said I may even put * as the mask, did you perhaps mean changing
"forwarded-mask=::1;" to "forwarded-mask=*" ?
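If the quoted key name is right, the edit in the ini file would simply be replacing the default line with:

Code: [Select]
forwarded-mask=*

(which, as noted above, means trusting the X-Forwarded-For header from any address).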

There is no solution: stunnel does not allow transmitting the external IP address, except in its Linux version.
What about a reverse proxy?
I set up a reverse proxy: Cloudflare worker >> DNS >> HFS server.
I tried passing the CF-Connecting-IP header and the X-Forwarded-For header; the best I could get was the origin IP showing in the "IP connected" log output.
But when I download a file, the log shows Cloudflare's IP instead of the origin IP.

Am I doing something wrong, or is HFS simply not accepting the header?

HFS ~ HTTP File Server / Re: Help starting and securing HFS
« on: April 15, 2020, 12:29:39 AM »
Thanks for all the quick replies, and for your patience.

I notice that HFS does include the "allow referer" option; can this now be used to control download mooching?
Properties for folder
Flags: disable Browsable

See also

Update: prevent editing of the URL
Code: [Select]
{.if not|{.header|Referer.}|{:{.disconnect.}:}.}
Will look into these.
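As a stricter variant of the snippet above: if merely requiring a Referer to be present isn't enough, the same macro style could match the header against the posting site (a sketch; example.com is a placeholder for your own domain, and the match macro is assumed to accept * wildcards):

Code: [Select]
{.if not|{.match|*example.com*|{.header|Referer.}.}|{:{.disconnect.}:}.}

This would drop any request whose Referer doesn't mention your domain, though a determined leecher can still forge the header.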

add /* at the end
I don't remember the necessary syntax right now, but I've found this old post about it.

Oh, I'm gonna try that; if it works, then I can prevent some level of mooching.

3. you can protect file2 from being guessed in 2 easy ways
- you can add something to the name, like file2-xyz
- you can leave the original name, but create a virtual folder xyz and put file2 there
I had the exact same idea; I suggested we add a small CRC code to the end of the file name, which makes guesswork moot.
I think virtual folders are a no-no: the app itself says not to use them because the folder is large (~8k files and ~800 GB).

Rejetto - ideally, all of this could be fixed if we add "expiring links": just include an expiring token for a session, so that links are only valid for that user for that session.
Denying the request if the IP doesn't match would work smoothly.

Pair this up with a referer match and boom, safe and secure.

HFS ~ HTTP File Server / Help starting and securing HFS
« on: April 13, 2020, 05:58:48 PM »
I wanted to run a file download server with HFS and got stuck with some issues.

The setup I'm aiming for is like this

for (where links will be posted)
and say.. (the URL for the files)
Links to stuff on-site >> a proxy server over Nginx (to secure my source server) >> links

Problems, major problems, that I'm stuck with:
1. How do I configure the referer function on rejetto, such that if a user takes a URL and steals it, they can't use it?
I've no idea how I can get the referer check to work; what should we even write here? (didn't work)
Maybe my Nginx isn't passing it, idk.
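For what it's worth, Nginx passes the Referer header through by default; a minimal proxy block (a sketch, with a placeholder address and port) that also forwards the client IP might look like:

Code: [Select]
location / {
    # forward the original client IP for HFS's forwarded-mask feature
    proxy_set_header X-Forwarded-For $remote_addr;
    proxy_pass http://127.0.0.1:8080;
}

So if the Referer isn't arriving, it's more likely being dropped elsewhere (e.g. browsers omit it when going from an HTTPS page to a plain-HTTP link) than stripped by Nginx itself.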

2. How do you control/disable the "get list" function?
I can edit the HTML and remove the button, but it won't matter if a user knows I'm using rejetto's HFS: go to \&recursive
and done, all the links are theirs (getting point 1 to work would be a great help here, though).

3. If a user downloads one file, he can figure out file2, file3, and so on. (This I actually asked in a different thread; if I can get points 1 and 2 to work, I can fix most of the damage.)

I really want to use this to deliver my files but can't, because once I do, people are bound to steal the links and I'm done for. :/
Kindly assist; if this can be worked around, I'd like to know how.
If it can't be worked around, I'd like to know that as well, so I can give up on this idea entirely.

router & port problems / Re: Query regarding "URL path masking"
« on: March 21, 2020, 01:48:49 PM »
Or we could base64-encode the paths (not a nice solution, tbh), but it seems like a dirty hack could do.
At the moment the only thing stopping me from using HFS is that the URLs are visible in the clear.

router & port problems / Query regarding "URL path masking"
« on: March 21, 2020, 06:44:58 AM »
Been a fan of HFS for years now, testing it out and using it for casual stuff.

Recently I tried to set up a small server to test things out and got stuck on a small issue with URL paths.
For example:

I have HFS running, and there are files I want to provide, existing at file1, 2, 3, and so on.
The problem is, if a user knows the path of file1, he can simply edit the URL and leech.

I can block "reading/viewing" of the root folder so a user can't view all my files, but he can still try guessing.

Is there any setting/method I missed in HFS that allows me to hide the paths in my URLs?
like maybe

Pages: 1