rejetto forum

a new beginning...

rejetto · 84 · 33298


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
i've noticed several problems with the previous distribution, so i've worked hard to try to solve them all.
https://github.com/rejetto/hfs/releases/tag/v0.6.1-alpha

i think the exe version can be a good solution for most users now.

a big change is that the frontend folder is "outside" now.
the main purpose is to allow a full replacement (if you have another frontend); it is still not designed to be edited.
Customization of the default frontend is to be done through plugins.
« Last Edit: January 16, 2022, 08:58:43 PM by rejetto »


Offline NaitLee

  • Tireless poster
  • ****
    • Posts: 203
  • Computer-brain boy
    • View Profile
The ISP here seldom allows me (and other people) to access GitHub, so whether I can reach the HFS repository is a matter of luck.
If possible, please mirror the repo to somewhere like GitLab, so people here have another way...
GitLab can also be self-hosted.
I'm moving to GitLab myself too.



I see the directory structure changed.
But it left me confused: should I access HFS libs by simply importing the original source files?
In the last 2 days I structured my tpl plugin (with/for hfs 0.5); it uses this:
Code: [Select]
import * as process from 'process';
import * as path from 'path';

// Workaround for my workspace, which is symlinked into the plugin directory:
// resolve modules against the process working directory instead of the plugin's real path.
const NODE_PATH = process.cwd();
export function absRequire(modulePath: string): any {
    return require(path.join(NODE_PATH, modulePath));
}
This is used to solve many things by resolving against the absolute cwd.
And with it I am able to import HFS libs and use them directly.

Meanwhile we are exposing some APIs in another way (like the const api = exports.api = {}), which means we'd better just use that.
But in theory it's never enough, and if something is missing I will just import the HFS libs directly and use the stuff inside, with nothing bad happening.
That way the exposed API becomes useless :'(

In my case:
Code: [Select]
import { absRequire } from './absRequire';
const frontEndApis: HFSFrontEndApis = absRequire('frontEndApis').frontEndApis;
const config: HFSConfig = absRequire('./config');
These are from before hfs 0.6. They work just fine.

So the suggestion is: simply make the HFS libs themselves usable in a standard way, then let plugin makers learn what they need from some kind of documentation (or the source code).
EDIT: "including" the built-in stuff in a central lib is also an option, and this could keep interfaces stable (not suddenly changed in the future)

PS. I personally feel uncomfortable with the exports.api plus setTimeout approach, so I don't want to use it :(
« Last Edit: January 17, 2022, 09:09:57 AM by NaitLee »
"Computation is not forbidden magic."
Takeback Template | PHFS


Offline NaitLee

  • Tireless poster
  • ****
    • Posts: 203
  • Computer-brain boy
    • View Profile
Some random thoughts... :)

I see this:
Code: [Select]
this.data = data = { ...data }

While I can understand changes to plugins, this "cloning" may not be as good as it seems.
It's better to consider the "prototype" concept in JavaScript.
Again, about the "api injection" technique: please consider other acceptable methods :P

Also, before HFS 0.6 there was this in my tpl plugin:
Code: [Select]
var traditTpl = function(ctx) { template.serve(ctx); return true; }
traditTpl.prototype.middleware = function(ctx) { template.serve(ctx); return true; }
// ...
module.exports = traditTpl;
I just thought it might be useful, so I wrote it casually. And hfs 0.6 really wants a "middleware" attribute.
But this didn't work -- the object cloning doesn't clone the prototype.

... I hope these make some sense :D

Tip: manipulate prototypes with Object.setPrototypeOf(target, object) etc.
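For instance, a quick sketch (plain Node.js; the names are only illustrative) of how a spread clone drops the prototype and how Object.setPrototypeOf can restore it:
Code: [Select]
// a "plugin" whose middleware lives on the prototype, as in the snippet above
function traditTpl(ctx) { return true }
traditTpl.prototype.middleware = function (ctx) { return true }

const instance = Object.create(traditTpl.prototype)
const clone = { ...instance }          // shallow clone copies own properties only
console.log('middleware' in clone)     // false -- the prototype chain is lost

Object.setPrototypeOf(clone, traditTpl.prototype)
console.log('middleware' in clone)     // true -- the prototype chain is restored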
PS. I'm not afraid of changes. I just want to keep to usual practice and keep things as good as they are :)
« Last Edit: January 17, 2022, 12:01:13 PM by NaitLee »
"Computation is not forbidden magic."
Takeback Template | PHFS


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
Quote
The ISP here seldom allows me (and other people) to access GitHub, so whether I can reach the HFS repository is a matter of luck.
If possible, please mirror the repo to somewhere like GitLab, so people here have another way...

can't you mirror it yourself by giving gitlab the address of my repo?
Anyway, with some research i've found this address. I hope it may be useful to you
https://github.123vip.workers.dev/-----https://github.com/rejetto/hfs/releases

Quote
I see the directory structure changed.

i try to keep this as infrequent as possible, of course

Quote
But it left me confused: should I access HFS libs by simply importing the original source files?

i think I should consider every plugin's real needs, and see if I can offer a way to solve them without depending on the HFS source structure.
As a fallback a plugin can consider importing HFS sources, but it's a risk, especially while we are in a fast-evolving situation like now.

Quote
In the last 2 days I structured my tpl plugin (with/for hfs 0.5); it uses this:

Quote
This is used to solve many things by resolving against the absolute cwd.

cwd is not a good solution IMO, as it is not directly related to the sources but to the runtime.

Quote
Meanwhile we are exposing some APIs in another way (like the const api = exports.api = {}), which means we'd better just use that.

exactly, that should be the first place you look. And if you don't find what you need, you should tell me, so we can discuss it.

Quote
But in theory it's never enough, and if something is missing I will just import the HFS libs directly and use the stuff inside, with nothing bad happening.

it's not "nothing bad": you depend on my source structure, which I may be forced to change, while I will make an extra effort to keep the plugin API stable.

Quote
So the suggestion is: simply make the HFS libs themselves usable in a standard way,

standard means that I should never change my sources so your imports will always work? this is practically impossible :)

Quote
EDIT: "including" the built-in stuffs to a central lib is also an option, and this can keep interfaces stable (not suddenly changed in the future)

It's not just a matter of where things are and what names they have. It's not enough, so your suggestion is not feasible.

Quote
PS. I personally feel uncomfortable with the exports.api plus setTimeout approach, so I don't want to use it :(

I tried other solutions before this, and they were worse; then I arrived at this one.
If you have a better solution I will be happy to consider it. I'm not in love with this either, so I won't try to defend it.
But what you just suggested is bad and doesn't take into account real-world needs.


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
Quote
While I can understand changes to plugins, this "cloning" may not be as good as it seems. It's better to consider the "prototype" concept in JavaScript.

i'm happy to see that you know this case could make use of prototypes :) It's a concept most JavaScript programmers ignore, but one I've always liked.
In THIS case anyway it may be better to keep the shallow-cloning, as (1) it is easy to read for most programmers, (2) there's no actual difference in performance, and (3) the two are not equivalent, so I would have to reconsider the rest of the code (for a start, the "delete" wouldn't do what I need, and there may be other consequences in the rest of the source).

Code: [Select]
var traditTpl = function(ctx) { template.serve(ctx); return true; }
traditTpl.prototype.middleware = function(ctx) { template.serve(ctx); return true; }
// ...
module.exports = traditTpl;

While this may work, the use it makes of prototypes seems pointless to me. Maybe you have some use for it that I don't see here.

Quote
I just thought it might be useful, so I wrote it casually. And hfs 0.6 really wants a "middleware" attribute.
But this didn't work -- the object cloning doesn't clone the prototype.

you are not supposed to do it like that.
there's an example provided, and it's very simple:
exports.middleware = function(ctx) { }
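For illustration only (the ctx fields and the meaning of the return value are assumptions drawn from the tpl snippet above, not from official docs), a whole plugin file can be as small as:
Code: [Select]
// hypothetical minimal plugin: answer one path and let hfs handle everything else
exports.middleware = function (ctx) {
    if (ctx.path === '/hello') {    // assumption: ctx is a koa-like request context
        ctx.body = 'hello from a plugin'
        return true                 // assumption: true signals "request handled", as in the tpl example
    }
}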


Offline NaitLee

  • Tireless poster
  • ****
    • Posts: 203
  • Computer-brain boy
    • View Profile
ok, let's try to keep plugins "subjective"...

Now that the APIs for a plugin are injected by HFS, and many more will be sooner or later,
whether a plugin is ready is determined by HFS.
So why doesn't HFS simply tell the plugin it is ready? It is as easy as calling init() on the plugin module, if there is one.
Of course what init() does is up to the plugin developer :) It seems better than a mysterious setTimeout trick.
I can accept "exports.api" if this objective-vs-subjective question is dealt with well :)

Also consider asynchronously initialized plugins. Await init() when that's the case and it's necessary.
Consider letting a plugin tell HFS it's async / should be awaited (or other characteristics) by exposing something (like isAsync: true) in module.exports.
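A rough sketch of what that could look like from the plugin side (purely hypothetical, just to illustrate the suggestion):
Code: [Select]
// hypothetical: HFS fills exports.api first, then calls (and awaits) init()
exports.api = {}
exports.init = async function () {
    // everything injected into exports.api is guaranteed to be available here,
    // with no setTimeout trick needed
    console.log('plugin initialized')
}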

I think we still need to tell plugins where the built-in files are, for edge cases.
Even if it means "at my own risk" or "not officially supported", it's a kinder way to play with HFS.
Of course we can gate it behind something like debug flags... just for completeness.
... If you really still don't want to do this, just ignore these lines :D

btw, for now I choose to import built-in files (as a fallback layer behind exports.api) in my tpl plugin, for maximum efficiency of the development & test cycle
"Computation is not forbidden magic."
Takeback Template | PHFS


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
i like the .init suggestion.
About async: it's ok, just define your init as async.
I will await the init so we are covered for the future. At the moment there's nothing done synchronously after that.

About importing HFS sources:
when executing "normally" with node you can probably require('../../src/config'),
like this
Code: [Select]
const { getConfig } = require('../../src/config')
console.log('ciao', getConfig('port'))

but this won't work with the exe distribution (just tried), as I noticed it virtualizes the position of the sources.
So my idea is that this will be the way to go from the next version:
Code: [Select]
exports.api = {}
exports.init = function() {
    const { getConfig } = require(exports.api.srcDir + '/config')
    console.log('ciao', getConfig('port'))
}

Good? Better ideas?
« Last Edit: January 17, 2022, 02:17:12 PM by rejetto »


Offline NaitLee

  • Tireless poster
  • ****
    • Posts: 203
  • Computer-brain boy
    • View Profile
This may be enough for most people. But in my own case I have to consider whether srcDir is/can be absolute. ::)

Here is a unix-like environment.
My workdir is at ~/Documents/Tradit-TPL, and I just symlinked it to ~/Applications/hfs0.6.0/plugins/Tradit-TPL.
This way I don't need to move the workspace around to update hfs; I just create a new symlink to the new version.

But when the plugins are imported by HFS/Node, the symlink is resolved by Node,
so my plugin scripts can't import HFS libs with a path relative to the HFS/Node runtime: nothing of HFS is around the real workspace.

And that's why there was previously an "absRequire" thing...
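To make the issue concrete (the paths are just the ones from the setup above; the second line assumes hfs is started from its own folder):
Code: [Select]
// why a relative require breaks under a symlinked workspace:
// Node resolves the symlink, so __dirname is the plugin's *real* path,
// and '../../src/config' no longer points inside the hfs installation
console.log(__dirname)       // ~/Documents/Tradit-TPL
console.log(process.cwd())   // ~/Applications/hfs0.6.0 -- which is why absRequire() resolves against cwd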

Sure, I can define my own development config (as an in-plugin debug config), if I can make everything work by/for myself :)

btw, are there any convenient technical methods on Windows/other platforms for developing hfs plugins (or hfs itself)?
If really not, we may implement some at the HFS level, to help developers :)

PS. it seems my posts are "ignoring" normal users who just care about file sharing. I'll try my best to focus on the main target :)
But I have a lot of enthusiasm for investigating HFS 3, plugins, and macros... I want to grow and do well...
"Computation is not forbidden magic."
Takeback Template | PHFS


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
I don't see a problem in making sure you directly get an absolute path. Seems easy.
It's cool that someone is playing with it.
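(for illustration only, not the actual implementation, and the names are made up -- but something like this would do it:)
Code: [Select]
const path = require('path')
// hypothetical: resolve the sources folder to an absolute path before handing it to plugins
pluginApi.srcDir = path.resolve(srcDir)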


Offline mandoz

  • Occasional poster
  • *
    • Posts: 34
    • View Profile
Hi guys,
I would like to try the new version but I have some problems...
I use a PC with Windows 10 Pro 64-bit. I tried as a simple user and as administrator.
Thanks.


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
yes, it seems to be working, but you are right: it doesn't tell you what to do.
open your browser and type "localhost" in the address bar.
I will add this message to the next version :)


Offline mandoz

  • Occasional poster
  • *
    • Posts: 34
    • View Profile
Hi rejetto,
I did it. The browser opens a normal search page;
it does not open hfs3.


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
i have no idea: if it says listening on port 80 it is supposed to work, but further investigation is complicated.
you could verify that the port is actually open. it can be done in several ways. An easy one is running: telnet localhost 80
if a black window without text opens, then the port is open.

also, can you post a single screenshot with both your browser on localhost and the hfs window running?


Offline mandoz

  • Occasional poster
  • *
    • Posts: 34
    • View Profile
maybe I understood!!
it seems not to work with portable software.
it works with installed software like Edge and Opera.
it does not work with Internet Explorer.
is that possible?


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
Internet Explorer? On Windows 10?  Seriously? 😆

Hfs is not compatible with it and probably never will be.