The ISP here seldom allows me (and other people) to access GitHub, so whether I can reach the HFS repository at any given time is a matter of luck.
If possible, please mirror the repo somewhere like GitLab, so people here have another way to get it...
GitLab can also be self-hosted.
I'm moving to GitLab as well.
I see the directory structure has changed, but it left me confused: should I access HFS libs by simply importing the original source files?
Over the last 2 days I structured my tpl plugin (written with/for hfs 0.5), and it uses this:
```ts
import { cwd } from 'process';
import { join } from 'path';

// Workaround for my workspace, which is symlinked to the plugin directory:
// resolve requires against the process's current working directory instead of the plugin's own folder.
const NODE_PATH = cwd();

export function absRequire(path: string): any {
    return require(join(NODE_PATH, path));
}
```
This solves a number of issues by resolving everything against the absolute cwd, and with it I am able to import HFS libs and use them directly.
Meanwhile, some APIs are exposed in another way (like `const api = exports.api = {}`), which implies we should just use that.
But in principle it will never be enough, and whenever something is missing I can directly import the HFS libs and use what's inside, with nothing going wrong.
At that point the exposed API becomes useless.
In my case:
```ts
import { absRequire } from './absRequire';

const frontEndApis: HFSFrontEndApis = absRequire('frontEndApis').frontEndApis;
const config: HFSConfig = absRequire('./config');
```
These are from before hfs 0.6, and they work just fine.
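(For context, `HFSFrontEndApis` and `HFSConfig` are just how I type those imports locally; roughly a sketch like this, where the members are placeholders and the real shapes have to be read from the HFS source:)

```ts
// local typings for the two imports above; the members are placeholders,
// not the real HFS interfaces
export interface HFSFrontEndApis {
    // each front-end api is some callable exposed by HFS
    [name: string]: (...args: any[]) => any;
}

// config module shape, kept loose on purpose
export type HFSConfig = Record<string, any>;
```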
So my suggestion is: simply make the HFS libs themselves usable in a standard way, and let plugin makers pick up the knowledge from some kind of documentation (or from the source code).
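For example, a plugin could then do something like the sketch below. The module paths and names (`hfs/frontEndApis`, `hfs/config`) are hypothetical, only to illustrate the idea of stable, importable entry points:

```ts
// Hypothetical: HFS libs published under stable module paths, so a plugin can
// import them directly and no absRequire workaround is needed.
import { frontEndApis } from 'hfs/frontEndApis';
import * as config from 'hfs/config';

// the plugin then uses them directly, relying on the documented surface
console.log(Object.keys(frontEndApis));
```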
EDIT: including (re-exporting) the built-in stuff in a central lib is also an option, and that can keep the interfaces stable (so they aren't suddenly changed in the future).
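A minimal sketch of that idea, assuming a single entry file on the HFS side (the file name and the list of re-exported modules are hypothetical):

```ts
// hypothetical src/lib.ts in HFS: one central entry point that re-exports the
// built-in modules, so plugins only ever depend on this stable surface
export * as frontEndApis from './frontEndApis';
export * as config from './config';
```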
PS. I personally feel uncomfortable with the exports.api-plus-setTimeout approach, so I don't want to use it.
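To be concrete about what I mean (this is only my reading of the pattern, not actual HFS code): the plugin ends up polling with setTimeout until the exposed api object shows up, something like:

```ts
// roughly the shape I am uncomfortable with (my reading, not actual HFS code):
// keep polling with setTimeout until the exposed api object exists, then use it
function whenApiReady(getApi: () => any, cb: (api: any) => void) {
    const api = getApi();
    if (api) return cb(api);
    setTimeout(() => whenApiReady(getApi, cb), 50);
}

// e.g. whenApiReady(() => someHfsModule.api, api => { /* plugin logic */ });
```

I'd rather just import what I need directly.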