rejetto forum

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - Rom_1983

Pages: 1
1
Français / [SOLVED] Possible cache problem
« on: November 11, 2022, 07:43:28 PM »
I'm using an HTML page declared in HFS as an index to serve.
In this page, I use JavaScript to import HTML partials (it works, I've tested and validated it).
I then use a second JS script to operate on those snippets included in the main HTML.

Problem:
- The second script seems to fail because it can't find the nodes it's attempting to target: console.log() shows an empty node. Yet both scripts are loaded at the end of the page, and the second script even waits for the page to finish loading (during which the first script imports the partials). I've also tried various techniques such as async functions, pauses, and await (to wait for the first functions to finish), with no result.
- In the "Console" tab of the Google Chrome inspector, I can see that the node is empty (when using console.log() to debug).
- Sometimes it works, especially when I change the content of the 2nd JS script and refresh the page, which suggests a cache problem. But as soon as I press F5 again, the script fails again with the same error messages (again, a hint of a cache in action).


Debugging:
Everything works fine if I skip the first script and simply copy/paste the snippets into the main HTML, which means the 2nd script itself works.
It stops working as soon as I use the import system: yet I can see in the source code of the (fully loaded) page that the snippets are correctly included.
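For reference, the symptom described above is typical of a race between the import script and the consumer script rather than a cache. A minimal DOM-free sketch of the race (all names here are made up for illustration):

```javascript
// Simulated "partial injection": a shared store is filled asynchronously,
// just like innerHTML is only filled after a fetch() resolves.
const store = {};

function injectPartial(name) {
  return new Promise(resolve => setTimeout(() => {
    store[name] = `<div>${name}</div>`; // the "snippet" arrives late
    resolve();
  }, 10));
}

async function consumer() {
  // Reading store.header HERE, before awaiting, would give undefined --
  // the same "empty node" symptom seen in the console.
  await Promise.all([injectPartial("header"), injectPartial("footer")]);
  return Object.keys(store).length; // both partials are now present
}
```

The page's load event does not wait for fetch() calls started by a script, so having the second script "wait for the page to load" is not enough; the consumer has to await the promises returned by the import routine itself.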

Important information :
- I checked the "Disable cache" option in the "Network" tab of the inspector.
- I spammed F5, Shift+F5 and Ctrl+F5, with no result.
- I use a DynDNS name to reach my own machine, because: 1) the HTML must be hosted by a server for the JS imports to work (otherwise CORS blocks them), 2) I use a second machine to reach the HTML hosted on the first one, and 3) OBS Studio can't reach a LAN machine, it wants a WAN address. :-\

Current conclusion:
- If Google Chrome isn't caching, it may be the server serving the files that is.
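One quick way to rule a cache out from the client side is to request the partials with a cache-busting query string, so neither the browser nor an intermediary can serve a stale copy. A sketch (the helper name is made up):

```javascript
// Append a unique query parameter so every request uses a fresh URL,
// defeating any URL-keyed cache along the way.
function cacheBust(url) {
  const sep = url.includes("?") ? "&" : "?";
  return url + sep + "v=" + Date.now();
}

// e.g. fetch(cacheBust("partials/header.html"))
```

The Fetch API also accepts a cache option (`fetch(url, { cache: "no-store" })`) that bypasses the HTTP cache entirely; if the problem persists with both, a cache is probably not the culprit.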

QUESTION

Does HFS v2 cache anything? If so, how can I disable it for a specific served index?

2
Edit (2022-04-29): /!\ DO NOT USE THE PYTHON SCRIPT PROVIDED BY @NAITLEE UNTIL THE SECURITY OF THE SCRIPT HAS BEEN INVESTIGATED FURTHER AND THIS WARNING HAS BEEN REMOVED /!\


Hello,

I have set up several aliases on DUCKDNS.ORG, like:
  • project1.duckdns.org
  • project2.duckdns.org

all pointing to my home IP address. They are intended to be public websites.

In HFS, I have REAL folders for each of them:

/
|-- project1
|-- project2

PROBLEM

When I share the URLs, they look like:


Code: [Select]
project1.duckdns.org/project1
project2.duckdns.org/project2


and this is ugly.

This is even worse if I store the REAL folders inside an "empty" parent folder:

/
|-- public_websites /
    |-- project1
    |-- project2

which leads to URLs like:

Code: [Select]
project1.duckdns.org/public_websites/project1
project2.duckdns.org/public_websites/project2


REQUEST

I would like them to be just project1.duckdns.org and project2.duckdns.org, pointing to the REAL FOLDERS wherever they sit in the VFS tree.
For this, I see two solutions.

SOLUTION 1 : A ROUTING SYSTEM

This is basically what's called "URL rewriting".

HFS should provide a way to detect the requested URL and LINK it (not redirect!) to REAL FOLDERS, the way junctions (on Windows) or hard links (on Linux) remap resources on a hard disk.
One way to achieve this would be to right-click a folder and set its URL, letting us type "/" so that HFS "reroutes" the "/" root to that resource depending on the hostname detected.
There's already a macro to extract the relevant part of the URL:

Code: [Select]
{.header|host.}
But I don't see what to do with it in the DIFF TEMPLATE of the "/" node. :-[
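The rewriting logic itself is simple, whoever ends up performing it. Here is a sketch of the host-to-folder mapping in plain JavaScript (the hostnames and paths are this post's examples; the function name is made up):

```javascript
// Map an incoming Host header to the real VFS path to serve.
function routeForHost(host) {
  const routes = {
    "project1.duckdns.org": "/public_websites/project1",
    "project2.duckdns.org": "/public_websites/project2",
  };
  const name = String(host).split(":")[0].toLowerCase(); // strip any :port
  return routes[name] || "/"; // unknown hosts fall back to the root
}
```

A small reverse proxy in front of HFS could apply this mapping and forward the rewritten path, which would keep the ugly prefix out of the visitor's address bar without any change to HFS itself.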

SOLUTION 2 : SEVERAL ROOTS

HFS should provide several roots ("/") in separate UIs, as if it were hosting different websites.



If there is already a solution, I would be excited to hear it, because I'm actually desperate. DynDNS providers don't offer a way to attach a route suffix to an IP, like "/public_websites/project1", which would translate project1.duckdns.org into something like 70.56.33.81/public_websites/project1 and forward the request, letting HFS directly receive and interpret the "/public_websites/project1" route while keeping it hidden from the user in his browser.


3
Bug reports / [Solved] Robots are scanning my HFS server
« on: March 21, 2021, 09:56:13 PM »
Hi,

These images prove that a robot is actively scanning my HFS server. How can we avoid that? I know the robots.txt technique, but where do I place the file, given that the "/" path isn't a real path?
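For reference, the standard file that asks well-behaved crawlers to stay out of the whole site looks like this; it must be reachable at /robots.txt, which in HFS can be done by adding a real robots.txt file to the VFS root (note that hostile scanners simply ignore it):

```
User-agent: *
Disallow: /
```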





4
Hello,

The least one can say is that HFS puts your nerves to the test.

My server is hosted at home. Here is the folder that will be used in the description of the problem:
Code: [Select]
/
  public_host  (real folder)
      forum1
      forum2
      ...

I would like to do the following:
- Make files in any subfolder of public_host accessible from the web, mainly to host images to use on forums.
- Hide the contents of public_host (of course, I don't want people snooping around and reverse-engineering my internet activity, my images from various forums, etc.).
- And at the same time make the contents visible to any logged-in user with the proper rights (mostly me), so that I can easily get the links to the files.

Except that:
- HFS has three interesting things for this: the "Browsable" flag, the "Recursively hidden" flag, and user/pass protection.
- I started by testing "Recursively hidden": great, the contents are hidden as expected, but how am I supposed to get the file links so I can copy/paste them?
- Unchecking "Browsable" is cleaner (you get a nice "forbidden" message), but just as useless for getting the links.
- Enabling "Browsable", unchecking "Recursively hidden" and enabling user/pass protection brings back the file list: but then the links to the resources no longer work on the forums, and the login box pops up!!!
- Not enabling user/pass, but giving the user "Ignore limits" while hiding the contents by unchecking "Browsable" and/or checking "Recursively hidden"? Nope, doesn't work.

It drives you mad. :o Did none of the devs ever think "wouldn't it be nice to browse your own HFS server, right-click a file and copy/paste its link, while still hiding the folder contents from anonymous users?"

Because right now I'm stuck. I can't see how to get the link of a file I've added to my real folder, other than memorizing my server's URL, copying it somewhere, typing the path (folders/subfolders) to the file by hand, and pasting in the file name and extension. In short, hyyyyyyyyyyyyyperly tedious.

Why not offer the ability to browse the files directly from the HFS console? That way, you could keep your settings while still having the file list and the ability to copy their links.
When you right-click a real folder, all you get is "Browse it F9" or "Open it F8". Why, for heaven's sake??

The saddest part is that I was trying to host two images to post on another forum... to solve a problem with a piece of software that was itself supposed to help me fix another one!! And suddenly, while browsing the root of my HFS server, I realized that, having hidden its contents, I no longer had access to the links! So I end up with a new problem in the form of a dilemma, and I'm fed up. Nobody makes an effort to be logical... :-\

5
Français / [solved] Strange multiple connections on multiple ports
« on: February 18, 2021, 03:58:18 PM »
Hi,

I've just started using HFS to share some files on IRC (I created a public folder for temporary files), and I'm worried about a few things:

- Multiple connections FOR THE SAME FILE seem to occur from the same IPs.
- This occurs even after "fully downloaded" is reported.
- Those multiple connections can reach a hundred. This is flooding the log.
- Different ports are used for these strange connections.

See the screenshots I've shared as attachments.
The file "Mes_idées_de_jeux.mp4" is just 5 MB; it's not normal for such a number of connections to occur.

Thanks for helping me figure out whether this is normal.
