rejetto forum

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - NaitLee

Pages: 1 2 3 4 ... 14
I want to get suggestions about whether to write a part of an HFS 3 plugin in another language (like C, Go, C++).
Particularly, I want to write my template parser in Go, and communicate with a "shell" js via stdio.
This may gain performance. But what I actually want is to avoid the mess of Node.js.
My tpl plan has already been interrupted twice, mostly because I had no idea what to do or how to continue.
Whenever I want a small feature I need to request/install another big package. Whenever I need a small structure I end up with a large object/class.
Even if I want to call a C procedure natively I need a "node addon", a quite bloated structure in C++, which will break at any time in the future.
This is fine for big organized applications, but rather complicated for a simple language parser dealing with simple computational logic.
This defeats the whole purpose. Even in Python I have not seen such a dilemma.
JavaScript was doing well long ago, but was finally ruined by jquery/react/minifiers/obfuscators, and finally Node. Even TypeScript failed to rescue it.

... ... See what the Unix old guys say about: OO Programming, Node.js, All Software
Mostly for fun. But reasonable, given the cruel reality.

HFS 3 is in Node, and it works so well.
Let me try to do better in another way... :)

∵ E = mc²
∴ Errors = (More Code)²
∎ Q.E.D.

HFS ~ HTTP File Server / Re: a new beginning...
« on: March 17, 2022, 07:10:09 AM »
For the newest 0.14.2 code, I have to manually compile the "shared" folder in order to continue build-all:
cd shared && npx tsc && cd ..
Yes, it's easy. Put it in the build-all process :D

Current HFS 3 version successfully built and running on my android phone, with Termux :D

HFS ~ HTTP File Server / Re: Server name
« on: March 09, 2022, 04:20:57 AM »
Just a little hack: putting some entries in the "hosts" file adds some convenience... :D
On Windows it is at C:/windows/system32/drivers/etc/hosts
It's a simple text file; opening it with notepad.exe is sufficient.
Modifying it may require Administrator privileges.

By adding entries like the following (the IP addresses here are only examples; use your servers' actual ones):
Code: [Select]
192.168.1.10    home
192.168.1.20    work
you can browse those IPs by entering "home/" or "work/" in the browser address bar. Don't forget the port if it's not 80.
The trailing slash is required so the browser doesn't google for "home" or "work"... :)

HFS ~ HTTP File Server / Re: Server name
« on: March 09, 2022, 04:10:17 AM »

On the newer HFS 2.4 RC7 you can use a diff template on the root folder "/"

Right-click the root folder (house icon and slash), Properties, Diff template,
and inside put this:
Code: [Select]
title=Home Computer
HFS=Home Computer
Then the browser tab title "HFS" and the served page title "HTTP File Server" will become "Home Computer".

I don't remember when the "^ template section prepend" was introduced, but likely not in 2.4 rc2.
It's advised to update to 2.4 rc6 or rc7, since many problems (mostly Unicode and DDoS protection) were fixed in them.
Use a slightly older / community template to avoid some rc7 template problems, if they annoy you :)

For other templates: after switching, press F6 in the HFS window, then in the opened text editor find the title (typically "HFS" or "HTTP File Server") you want to modify, change it, and save the file.


HTML & templates / Re: archive not working
« on: February 23, 2022, 04:37:27 AM »

This problem was already found about half a year ago: the HFS 2.4 RC7 default template tries to post the archive selection data to a javascript section.
It is mostly caused by the transition away from jquery, and since then HFS 2.4 RC7 has "paused updates" :D

My related post:
People like @dj already found this even earlier :) But I can't find it...

- Let's help fix the code logic in the template
- If it matters, temporarily switch to an older version of the default template (one that uses jquery), or a satisfying third-party template.

HFS ~ HTTP File Server / Re: a new beginning...
« on: January 29, 2022, 02:42:57 PM »
My tpl plugin needs to get file list, better with streaming :)

I see the file_list function in api.file_list.ts; without the "sse" parameter I can get the list directly.
But first, it's not in the exported api, and more importantly, this method seems not to stream the list but to return it in one big shot.

I also noted the onDirEntry which a plugin can define. It seems a good entry point.
But is it for the frontend (exclusively)? Should I use it for a middleware plugin?

Also, should my plugin be able to call internal apis like authentication/permission checks, or make an http redirect / do other hacks?
... or should it simply let HFS itself handle them properly ::)

ok it's set to public :)
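As a sketch of the streaming wish above (not the real HFS api — listDir and Entry are illustrative names I made up), an async generator could yield directory entries one by one instead of returning one big array:

```typescript
// Hypothetical sketch: stream a directory listing entry by entry.
// listDir and Entry are made-up names, not part of HFS.
import { promises as fs } from 'fs';

interface Entry { name: string; isDir: boolean }

async function* listDir(dir: string): AsyncGenerator<Entry> {
    // withFileTypes gives Dirent objects, so no extra stat() calls are needed
    for (const d of await fs.readdir(dir, { withFileTypes: true })) {
        yield { name: d.name, isDir: d.isDirectory() };
    }
}

// usage: consume the stream without materializing the whole list
async function demo() {
    for await (const entry of listDir('.')) {
        console.log(entry.isDir ? 'dir ' : 'file', entry.name);
    }
}
```

A consumer could still collect everything into an array if it wants the "one big shot" behavior, so streaming is the more general interface.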

I'm thinking about parallel segment evaluation & a just-in-time compiler (to optimize the segment structure).
This will require more effort, so I'll try them much later :D

HFS ~ HTTP File Server / Re: a new beginning...
« on: January 20, 2022, 01:27:03 PM »
I've found a github repo mirroring service for me:
It works with git clone (and pull), but not in a browser. Though it's enough :D
So I decided to pull & build HFS3 by myself whenever there's an update.

Following the instructions, the build result is running well ;D
~/hfs $ cd dist && node src

... while there are notable alerts from npm:
Code: [Select]
22 vulnerabilities (3 low, 18 moderate, 1 high)

This is the output of npm audit, before fixing:
Code: [Select]
# npm audit report

follow-redirects  <1.14.7
Severity: high
Exposure of sensitive information in follow-redirects -
fix available via `npm audit fix`

1 high severity vulnerability

To address all issues, run:
  npm audit fix

Of course I ran npm audit fix

But if I try building again, the "22 vulnerabilities" alert is still there during the procedure.

Auditing again will output found 0 vulnerabilities
So which one is true ???

HFS ~ HTTP File Server / Re: Hosting and PDF viewing
« on: January 19, 2022, 09:05:47 AM »
There's pdf.js, which is the pdf viewer built into Firefox. :)
You'll find helpful resources on the web about including pdf.js in the served pages.

中文 - Chinese / Re: Someone attempted to download and run programs through HFS
« on: January 19, 2022, 08:59:12 AM »
It seems these attackers still never give up... :D
There have also been related reports recently: 1st 2nd
Indeed, HFS2 once had a vulnerability, but it was fixed long ago, and the affected versions are no longer seen around.

Create a file in the same directory as hfs.exe, named (without the .txt suffix):
Code: [Select]
{.if|{.any macro marker|%url%.}{.count substring|createobject|{.lower |%url%.}.}|{:{.disconnect|%ip%.}:}.}
(This method comes from report 1. You can replace createobject in it with other attack keywords.)
(You can also learn about "macro" programming from the rejetto wiki, to handle the actual requests.)

P.S. I usually write Simplified Chinese. Please forgive me if any terms here are incorrect ;)

HFS ~ HTTP File Server / Re: a new beginning...
« on: January 17, 2022, 03:49:07 PM »
This may be enough for most others. But in my own case I should consider whether srcDir is/can be absolute. ::)

Here is a unix-like environment.
My workdir is at ~/Documents/Tradit-TPL; I just symlinked it to ~/Applications/hfs0.6.0/plugins/Tradit-TPL
This way I don't need to move the workspace around to update hfs; I just create a new symlink in the new version.

But when the plugins are imported by HFS/Node, the symlink is resolved by Node,
so my plugin scripts can't import HFS libs with a relative path from the HFS/Node runtime: nothing is around the real workspace.

And that's why previously there's an "absRequire" thing...

Sure, I can define my own development config (as in-plugin debug configs), if I can make everything work by/for myself :)

btw, are there any convenient technical methods on Windows/other platforms for developing hfs plugins (or hfs itself)?
If there really are none, we could implement some at the HFS level, to help developers :)

PS. It seems my posts are "ignoring" normal users that just care about file sharing. I'll try my best to focus on the main target :)
But I have much enthusiasm for investigating HFS3, plugins, and macros... I want to grow and do well...

A note for passing-by guests: this is a technical topic. For template themes, see other topics :)

HFS3's default frontend is so fast.
But template makers like me want to make templates useful for both HFS versions (HFS2 and 3).
In my view it's not the disliked "compatible", but "universal", since there's no reason for a frequent casual user to move away from HFS2.

Now I am making a plugin for the new HFS3 to support "traditional" templates.
Macros are there in HFS2.3 to implement useful logic, making a (dynamic-pages-based) template "smarter".
I've already managed to parse macros in PHFS. But if you have tried it, you will find it's very slow compared to the Delphi HFS2.
Yes, Python itself is slow at basic operations like string batching,
but there are other reasons, including that every time a section is requested it parses through the raw strings again and again, even if the macro procedure is fixed.

I want to make things faster. Though it may still be slower than pure ajax, I want to try my best, at least as skill practice :)
I'm thinking about serializing macros, to waste the least work in each execution/evaluation.
And, after this, macro injection (attacks) will never work, even if there's an entrance for such an action.
For now I have some ideas, stated below in normal text and/or source code (with comments)...

Some concepts are made:

MacroSegment and MacroUnit
These will nest instances of each other to make the macro procedure clear & easy for the computer.
Get more details in the code snippets below. Be prepared for some thinking :D

MacroContext and MacroContextGlobal
These are for storing e.g. variables, and the stack for "linear macro execution"[1] (more info below).
The code in the snippets may be modified to add more things.

MacroExecutor and MacroExecutors
For defining static functions to execute macros. A MacroUnit has an executor attribute assigned to one in MacroExecutors.
This may change to a getter/setter in the future, to support a "dynamic executor"[2].

Also see the Footnotes, FAQ, and Trivia at the end of this post :D

Some (TypeScript) code snippets, for description (may be modified at any time):
Code: [Select]
class MacroContextGlobal {
    static globals: Record<string, string> = {};
    static cache: Record<string, string> = {};
}

class MacroContext extends MacroContextGlobal {
    variables: Record<string, string> = {};
    stack: MacroContextStackItem[] = [];
    shift(count: number = 1): MacroContextStackItem[] {
        // Array.from instead of new Array(count).map(...),
        // since map() skips the holes of a sparse array
        return Array.from({ length: count }, () => this.stack.shift() || null);
    }
    shiftAll(): MacroContextStackItem[] {
        return this.stack.splice(0, this.stack.length);
    }
}

interface MacroExecutorFunction {
    (ctx: MacroContext, args?: MacroExecutorArgs, kwargs?: MacroExecutorKwargs): MacroSegment;
}

class MacroExecutor {
    /** @this {MacroExecutor} */
    _function: MacroExecutorFunction;
    flags: MacroExecutorFlags;
    constructor(
        func: MacroExecutorFunction,
        flags: MacroExecutorFlags = C.NO_MULTI_FLAG
    ) {
        this._function = func.bind(this);
        this.flags = flags;
    }
    execute(ctx: MacroContext, args: MacroExecutorArgs, kwargs: MacroExecutorKwargs): MacroSegment {
        // NOTE: in the future, we may check some flags here before execution
        return this._function(ctx, args, kwargs);
    }
}

var macroExecutors = new MacroExecutors();

/**
 * A "part" of the whole macro expression, like a quote block, or a piece of string as argument of a macro.
 * A `MacroSegment` can be *evaluated* to produce a plain string, which is then sent to the client / put into `MacroUnit` args/kwargs.
 * The term *evaluate* can be understood as the original *dequote*, if there are items in `execOrder`.
 * In a section there's a root `MacroSegment`.
 * Concepts:
 * - `segOrder` and `execOrder`:
 *   - Macros are mixed with plain parts and executable parts.
 *     To produce the result, we first take a sub-segment from `segOrder` as text,
 *     then take a `MacroUnit` from `execOrder` and execute it, finally getting text.
 *     By repeating until the last `segOrder`, we are done.
 * - `isPlain`:
 *   - Marks the current segment as plain, i.e. it doesn't need to be executed.
 * - `isAlias`:
 *   - Marks the current segment as an alias from `[special:alias]`.
 */
class MacroSegment {
    // ... there are some attributes for plain representation as string, number, boolean. will change later
    segOrder: MacroSegment[];
    execOrder: MacroUnit[];
    isPlain: boolean;
    isAlias: boolean;
    isDynamic: boolean;
    private _inferTypesFromString(value: string): void {
        this._asString = value;
        let value_trimmed = value.trim();
        let possible_number = tryParseNumber(value_trimmed);
        this._asNumber = possible_number;
        this._asBoolean = !!possible_number;
    }
    constructor(
        raw: string = C.EMPTY_STRING,
        segOrder: MacroSegment[] = [],
        execOrder: MacroUnit[] = [],
        isPlain: boolean = true,
        isAlias: boolean = false,
        isDynamic: boolean = false
    ) {
        // if (raw === null) {}
        this.segOrder = segOrder;
        this.execOrder = execOrder;
        // this.isPlain = this.isAlias = (raw !== null);
        this.isPlain = isPlain;
        this.isAlias = isAlias;
        this.isDynamic = isDynamic;
    }
}

/**
 * A part of the whole macro expression that has a specified function, as a macro block.
 * A `MacroUnit` can be *executed*, for performing special operations.
 * Concepts:
 * - `executor`:
 *   - An instance of `MacroExecutor`, taken from `MacroExecutors`.
 * - `args`:
 *   - A list of arguments, as `MacroSegment`.
 *     They **may** be dynamically *evaluated* by the individual `MacroExecutor`.
 * - `kwargs`:
 *   - A list of keyword arguments, always optional, indexed with strings, also as `MacroSegment`.
 */
class MacroUnit {
    executor: MacroExecutor;
    args: MacroSegment[] = [];
    kwargs: Record<string, MacroSegment> = {};
    constructor(
        executor: MacroExecutor = MacroExecutors._unknown,
        args: MacroSegment[] = [],
        kwargs: Record<string, MacroSegment> = {}
    ) {
        this.executor = executor;
        this.args = args;
        this.kwargs = kwargs;
    }
}

[1] "linear macro execution"
Let's consider an example:
The normal way is to walk from the start, find the inner-most macro, pick it up, execute it, replace it with its result, then repeat until the end...
But in our way, after serialization, the instructions are ordered one by one:
execOrder = [ mul, add ]; (pseudo code. note that these are MacroUnits, each wrapping both an executor and arguments (as nested MacroSegments, plain or evaluatable))
... after the "mul" unit is executed, its result is pushed to the stack of the current MacroContext; then in the "add" unit we leave a mark to let it shift one element from the stack as an argument.
This is mind-exhausting, but the computer is really doing efficient linear work.
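For illustration only — names and types here are simplified stand-ins (numbers instead of MacroSegments, Unit instead of MacroUnit) — this is a tiny runnable sketch of that linear execution, for a hypothetical expression add(mul(2, 3), 4):

```typescript
// Minimal sketch of "linear macro execution": serialized units run in order,
// each pushing its result to the context stack; a later unit "shifts"
// earlier results off the stack as its leading arguments.
type Context = { stack: number[] };

interface Unit {
    shiftCount: number;              // how many stack results to consume
    args: number[];                  // plain args known at serialization time
    exec: (args: number[]) => number;
}

// add(mul(2, 3), 4), serialized inner-first:
const execOrder: Unit[] = [
    { shiftCount: 0, args: [2, 3], exec: ([a, b]) => a * b },  // mul
    { shiftCount: 1, args: [4],    exec: ([a, b]) => a + b },  // add: shifts mul's result
];

function run(order: Unit[], ctx: Context): number {
    for (const unit of order) {
        const shifted = ctx.stack.splice(0, unit.shiftCount);
        ctx.stack.push(unit.exec([...shifted, ...unit.args]));
    }
    return ctx.stack.pop()!;
}

console.log(run(execOrder, { stack: [] }));  // → 10, i.e. 2 * 3 + 4
```

The loop never re-parses anything: the nesting was flattened once, at serialization time.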

[2] "dynamic executor"
Another example:
I think most dynamic-language developers have tried a method like this to determine which function to use :D
(wantSub ? sub : add)(5, 3) (sub and add are functions)
While it just works, it may confuse a static computing rule.
So our MacroExecutor needs to be dynamic here, by making the executor attribute a getter.
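A sketch of that getter approach (wantSub, add, sub are illustrative names, not from the real codebase):

```typescript
// The executor is resolved on each access instead of being fixed at parse time.
type Exec = (a: number, b: number) => number;
const add: Exec = (a, b) => a + b;
const sub: Exec = (a, b) => a - b;

class DynamicUnit {
    constructor(public wantSub: boolean) {}
    // a getter: reading `unit.executor` re-evaluates the condition every time
    get executor(): Exec {
        return this.wantSub ? sub : add;
    }
}

const unit = new DynamicUnit(true);
console.log(unit.executor(5, 3));  // → 2
unit.wantSub = false;
console.log(unit.executor(5, 3));  // → 8
```

To outside code the attribute still looks static, so a serialized structure can keep a single `executor` slot while its meaning stays dynamic.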

= Why not publish the source code now?
- The source now only contains these "ideas" and is completely unusable. It takes some time to integrate something of this scale.
= Well... where will the source code be?
- Here, on GitLab. But it's empty now.
= What's wrong with GitHub?
- We have trouble accessing it here, across the whole mainland region. Successfully viewing it is a matter of luck.
= Mirror to GitHub?
- I'll consider/try that when the project becomes active.

I scribbled on my note paper in order to understand all of this myself.
This project is being developed on a new laptop with Manjaro GNU/Linux, bought for playing with cutting-edge stuff; it's now my main workstation.
I didn't want to touch Node.js, until I wanted to work on this. :)
The source code was full of the typo "executer" before I posted this. :P
I'm trying out Tabnine, an AI assistant for coders. It auto-completed many pieces of the code here. (No advertising intended at all, but it may help.)

HFS ~ HTTP File Server / Re: a new beginning...
« on: January 17, 2022, 12:42:55 PM »
ok, let's try to keep plugins "subjective"...

Now that the apis for a plugin are manipulated by HFS, and many more will be sooner or later,
whether a plugin is ready is determined by HFS.
Why doesn't HFS finally tell the plugin it is ready? It is as easy as calling init() on the plugin module, if there is one.
Of course the procedure of init() is up to plugin developers :) It seems better than a mysterious setTimeout thread trick.
I'll accept the "exports.api" if this kind of objective-or-subjective thing is dealt with well :)

Also consider asynchronously initialized plugins. Awaiting one, if that's the case, is necessary.
Consider telling HFS it's async / should be awaited (or other characteristics) by exposing something (like isAsync: true) in module.exports.
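A sketch of this suggestion (not the real HFS api; PluginModule and loadPlugin are illustrative names): the host calls an optional init() exported by the plugin module, awaiting it if it returns a Promise. Since await also works on plain values, one code path covers both sync and async plugins, and an isAsync flag becomes optional metadata:

```typescript
// Hypothetical host-side plugin loading with explicit readiness.
interface PluginModule {
    init?: () => void | Promise<void>;
    isAsync?: boolean;
    middleware?: (ctx: unknown) => boolean;
}

async function loadPlugin(mod: PluginModule): Promise<PluginModule> {
    // the plugin is only considered "ready" after this resolves,
    // instead of guessing readiness with a setTimeout trick
    if (mod.init) await mod.init();
    return mod;
}

// usage: a plugin that finishes some async setup before being used
const examplePlugin: PluginModule & { ready?: boolean } = {
    isAsync: true,
    init: async () => { examplePlugin.ready = true; },
};
```

The contract stays subjective: what init() does is entirely the plugin developer's business; the host only decides when to call it.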

I think we still need to tell plugins where the builtin files are, for edge cases.
Even if it means "at my own risk" or "not officially supported", it's a way to play with HFS in a kinder manner.
Of course we can limit it with something like debug flags... Just for completeness.
... If you really still don't want to do this, just omit it :D

btw, I chose to import builtin files (as a fallback layer of exports.api) for now in my tpl plugin, for maximum efficiency of the develop & test cycle.

HFS ~ HTTP File Server / Re: a new beginning...
« on: January 17, 2022, 09:34:17 AM »
Some random thoughts... :)

I see this:
Code: [Select]
= data = { }

While I can understand the changes to plugins, this "cloning" may not be as good as it seems.
It's better to consider the "prototype" in JavaScript.
Again, about the "api injection" technique: consider other acceptable methods :P

Also, before HFS 0.6, there was this in my tpl plugin:
Code: [Select]
var traditTpl = function(ctx) { template.serve(ctx); return true; }
traditTpl.prototype.middleware = function(ctx) { template.serve(ctx); return true; }
// ...
module.exports = traditTpl;
I just thought it might be useful, so I wrote it casually. And HFS 0.6 really does want a "middleware" attribute.
But this didn't work -- the object cloning doesn't clone the prototype.

... Wish these make some sense :D

Tip: manipulate prototypes with Object.setPrototypeOf(target, object), etc.
PS. I'm not afraid of changes. I just want to keep to usual practice and keep things as good :)
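The cloning pitfall can be shown in a few lines (traditTpl here is just a sample object, not my actual plugin code): spread/Object.assign copy own properties only, so anything living on the prototype silently disappears from the clone.

```typescript
// Cloning with spread keeps own properties but drops the prototype chain.
const proto = { middleware: () => true };
const original = Object.setPrototypeOf({ name: 'traditTpl' }, proto);
console.log(typeof original.middleware);  // → "function" (found via prototype)

const clone = { ...original };            // copies own props ({ name }) only
console.log(typeof clone.middleware);     // → "undefined"

// Re-attach the prototype explicitly, as the tip above suggests:
Object.setPrototypeOf(clone, proto);
console.log(typeof clone.middleware);     // → "function" again
```

So any host that clones plugin objects either has to preserve the prototype itself, or plugin authors must put everything in own properties.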

HFS ~ HTTP File Server / Re: a new beginning...
« on: January 17, 2022, 06:04:04 AM »
The ISP here seldom lets me (and other people) access GitHub; whether I can access the HFS repository is down to luck.
If possible, please mirror the repo to somewhere like GitLab, so people here have another way...
GitLab can also be self-hosted.
I'm moving to GitLab too.

I see the directory structure changed.
But it left me confused: should I access HFS libs by simply importing the original files?
In the last 2 days I structured my tpl plugin (with/for hfs 0.5); it uses this:
Code: [Select]
import process_1 from 'process';
import path_1 from 'path';

// Workaround for my workspace, which is symlinked to plugin directory.
const NODE_PATH = process_1.cwd();
export function absRequire(path: string): any {
    return require(path_1.join(NODE_PATH, path));
}
This is used to solve many things, with an absolute cwd.
And with it I am able to import HFS libs and use them directly.

Meanwhile, we are exposing some apis in another way (like const api = exports.api = {}), which means we'd better just use that.
But in theory it's never enough, and if something is missing, I will directly import HFS libs and use the stuff inside, with nothing going wrong.
This way such exposing becomes useless :'(

In my case:
Code: [Select]
import { absRequire } from './absRequire';
const frontEndApis_1: HFSFrontEndApis = absRequire('frontEndApis').frontEndApis;
const config_1: HFSConfig = absRequire('./config');
These are from before hfs 0.6. They work just fine.

So the suggestion is: simply make the HFS libs themselves usable in a standard way, then let plugin makers take the knowledge from some kind of documentation (or the source code).
EDIT: "including" the built-in stuff in a central lib is also an option, and this can keep the interfaces stable (not suddenly changed in the future).

PS. I personally feel uncomfortable about the exports.api with setTimeout. So I don't want to use it :(
