rejetto forum

Software => HFS ~ HTTP File Server => Beta => Topic started by: rejetto on May 05, 2007, 01:54:49 PM

Title: Testing build #100
Post by: rejetto on May 05, 2007, 01:54:49 PM
download www.rejetto.com/temp/hfs100.exe

what's new...
Code: [Select]
+ Folder archive
- "Allowed referer..." now hidden in easy mode
Title: Re: Testing build #100
Post by: Flynsarmy on May 05, 2007, 02:04:12 PM
How does folder archiving work? Does it create a file on your computer that is the same size as all the files in the folder put together?

E.g. if you had 30GB of videos in a folder and someone tried to download the folder, would your computer create a redundant 30GB file?
Title: Re: Testing build #100
Post by: TSG on May 05, 2007, 02:11:17 PM
Nice, works perfectly so far for me. The only problem I can see is that someone will have the ability to download all my files at once... we need a template symbol, like the upload section, where we can 'restrict archived downloading' for different folders. :)

And when we set a folder's archived downloading to 'on', a %folder-archive% section would no longer be null for that folder :) or something...
Title: Re: Testing build #100
Post by: rejetto on May 05, 2007, 05:54:14 PM
Nice, works perfectly so far for me. The only problem I can see is that someone will have the ability to download all my files at once...

he could do it anyway by using any downloader... so, what's the gain in restricting this capability?
Title: Re: Testing build #100
Post by: rejetto on May 05, 2007, 05:55:17 PM
How does folder archiving work?

it's a virtual file. it never exists. every piece of it is built on demand and discarded after sending.
it's one of the coolest things i ever programmed ;D
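roughly, it works like this minimal sketch (in Python for readability - HFS itself is Delphi, and this is not its real code): each chunk is produced when it's time to send it, and thrown away right after.
Code: [Select]
# Illustrative sketch of an on-the-fly "virtual" tar: nothing is ever
# written to disk, each chunk is generated, sent, and discarded.
import io
import os
import tarfile

class ChunkSink(io.RawIOBase):
    """Collects what tarfile writes, so it can be yielded immediately."""
    def __init__(self):
        self.pending = []
    def writable(self):
        return True
    def write(self, b):
        self.pending.append(bytes(b))
        return len(b)
    def drain(self):
        data, self.pending = b"".join(self.pending), []
        return data

def stream_folder_as_tar(folder):
    sink = ChunkSink()
    with tarfile.open(fileobj=sink, mode="w|") as tar:  # "w|" = plain uncompressed stream
        for root, _dirs, files in os.walk(folder):
            for name in files:
                tar.add(os.path.join(root, name))
                chunk = sink.drain()
                if chunk:
                    yield chunk          # hand it to the socket, then forget it
    tail = sink.drain()                  # the tar's terminating zero blocks
    if tail:
        yield tail

# usage: for chunk in stream_folder_as_tar("shared"): connection.send(chunk)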
Title: Re: Testing build #100
Post by: TSG on May 05, 2007, 06:13:44 PM
Well, in my case I do not have very high bandwidth (256kbps upstream), so I would not like someone archiving a folder without my permission and hogging all my bandwidth for hours on end, even if it's a registered user. Also, in Flynsarmy's case, his uploads count towards his limit, so imagine he leaves his image gallery open, goes to uni, and finds someone has downloaded a few hundred MB because they archived his entire collection... get what I'm saying? We need the ability to disable the feature on some folders, and to be able to assign accounts to archiving :) for bandwidth's sake  :D
Title: Re: Testing build #100
Post by: rejetto on May 05, 2007, 06:16:01 PM
you didn't answer my objection
Title: Re: Testing build #100
Post by: Pearl051 on May 05, 2007, 06:23:53 PM
you didn't answer my objection

Agreed, he didn't. One answer could be that most users are not technical enough to know about Teleport Pro or other site downloaders, so it could work on them. Other than that, yeah, it seems like a waste of programming effort. I would rather see that time go towards something else.

Offtopic: Really nice tool! I used the earlier versions a long time ago, and today I needed something similar so I checked out the site. Much improved  ;D thx.
Title: Re: Testing build #100
Post by: rejetto on May 05, 2007, 06:26:59 PM
If he doesn't want to "give" too much data, he should put some limit on it, and if the limit doesn't exist, maybe we should create one.

problems require real solutions... not just masturbation ;)
Title: Re: Testing build #100
Post by: TCube on May 05, 2007, 06:35:46 PM
Regarding the use of downloaders, some limits already exist, am I right?
i.e. max "simultaneous downloads from a single IP address" -
that way no complete directory could be sent in one go...
Title: Re: Testing build #100
Post by: TSG on May 05, 2007, 06:37:54 PM
lol, well I just think having the button visible, which is something I'd like to do, would be a lot more tempting than going 'right click > download all with download manager'... Maybe just give the ability to assign accounts to the archive feature for each folder... :-\ My main concern is just bandwidth usage. I know a download manager can do the same thing, but at least if you decide to boot them, the files they already have are intact, and they won't come back to try again for them... I already have 3 IPs banned because they kept trying to come back and download all of my images in a folder with a download manager...

This leads to another question: does the archiver work in a way that the downloader can stop the download and it will cancel the last file, then save... or is it one continuous file that can hog bandwidth for long stretches, where if they accidentally cancel halfway it's a failed archive (say there was 200MB in the folder)? That would lead them to try to download it yet again and use even more bandwidth and time...

Anyhoo... sleep time. I will add this new feature to my template code tomorrow :D
Title: Re: Testing build #100
Post by: rejetto on May 05, 2007, 06:43:07 PM
Regarding the use of downloaders, some limits already exist, am I right?
i.e. max "simultaneous downloads from a single IP address" -
that way no complete directory could be sent in one go...

you are wrong: that limit will never block you from downloading the whole folder, nor will it slow you down in any way.
you will just download the files in sequence, which is exactly what you would do with the folder archive.
Title: Re: Testing build #100
Post by: TCube on May 05, 2007, 06:47:44 PM
never used it before, but now I see... that won't do... though!  ;D
Title: Re: Testing build #100
Post by: rejetto on May 05, 2007, 08:11:17 PM
the archive download is resumable.
of course as long as you don't change the content of the folder, because it would change the archive itself.
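roughly (continuing the Python sketch from my earlier post - stream_folder_as_tar is that made-up illustration, not HFS code): a resume is just the same deterministic stream rebuilt from the start, skipping the bytes the client already has; if the folder changed, the rebuilt bytes differ, which is why resuming then fails.
Code: [Select]
# Sketch of resuming the virtual archive (HTTP Range): rebuild the
# deterministic stream and skip the bytes that were already sent.
def stream_from_offset(folder, offset):
    sent = 0
    for chunk in stream_folder_as_tar(folder):
        if sent + len(chunk) <= offset:      # still before the resume point
            sent += len(chunk)
            continue
        yield chunk[max(0, offset - sent):]  # partial first chunk, then all
        sent += len(chunk)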

The point here is deciding if I should spend time working on this request, which will complicate the GUI with new commands. Adding commands to the account is more feasible, because accounts have few commands, so the GUI would not get crowded. You get the point?
We already have tens of commands for folders, so we must be careful about adding new ones.
Anyway, nothing has been decided yet.
Title: Re: Testing build #100
Post by: rejetto on May 05, 2007, 08:13:41 PM
i agree with the fact that a nice link to download it all can be tempting.
but there can be other ways, like asking for a confirmation with a warning message, or maybe other things.
Title: Re: Testing build #100
Post by: MarkV on May 06, 2007, 02:22:56 AM
the archive download is resumable.
of course as long as you don't change the content of the folder, because it would change the archive itself.

Resuming the download failed for me...

I'm testing with a 13GiB folder.

1st test was with FF2's internal downloader - 2 tests: the download stopped after 480MiB, give or take a few bytes... it just disconnected and said 'done' LOL
2nd test was with Free Download Manager - failed completely (unknown network error)
3rd test is with GetRight (still running) - when I pressed Pause to temporarily hold the download and then resumed, GetRight told me that it 'could not find the file on this server'
4th test is with the latest stable wget 1.10.2 - running great, but resuming not tested yet.

So far so good...


MarkV

P.S. A second test with GetRight was O.K., even resuming. Must have been a SNAFU with my system, I don't know...
Title: Re: Testing build #100
Post by: maverick on May 06, 2007, 03:40:19 AM
it's a virtual file. it never exists. every piece of it is built on demand and discarded after sending.
it's one of the coolest things i ever programmed ;D

Tested locally and it works just fine.  Fast!

Stop/Resume works well.  Tested with Opera on a 200MB archive consisting of 60 files.

There are many *old* items left in the to-do list.  If it were me, I wouldn't waste time working on a folder archiver that also supports rar and/or zip.  Tar is good enough and solves the need for an HFS folder archiver.  Archive tools like WinRAR and others support opening most formats, including tar archives.
Title: Re: Testing build #100
Post by: traxxus on May 06, 2007, 09:11:01 AM
OMG, I must be blind... how can I download a folder as an archive?  ???

EDIT: ok... it's in the default template, and not in the Terayon one.


Oh, and... why can I download a protected folder without an "access denied" message? The folder was empty, but this behaviour is nasty.
I'm confused... please take a look at  http://traxxus.dyndns.org:100/

every folder shows the same size if I want to download it as a tar archive (impossible). What's wrong?

Edit2: it's so strange with this feature... please add a function where I can choose whether a folder can be downloaded as an archive or not!
Title: Re: Testing build #100
Post by: Alons0 on May 06, 2007, 03:42:19 PM
rejetto, please make %number-addresses-ever% active in the tray message. Please add it in the next version. And a question: why is the tray message so big, with empty rows (after writing and deleting rows)? Now there are 3 rows and 3 empty rows :'(
Title: Re: Testing build #100
Post by: rejetto on May 06, 2007, 05:05:17 PM
"so big" ?
it's a standard 16x16 like every tray icon

rows? are you talking about the text message that pops up when you hover the mouse over it?
Title: Re: Testing build #100
Post by: maverick on May 06, 2007, 11:21:35 PM
Nice, works perfectly so far for me. The only problem I can see is that someone will have the ability to download all my files at once... we need a template symbol, like the upload section, where we can 'restrict archived downloading' for different folders. :)

True, with downloaders your files aren't that safe.  However, I have been successful so far (2+ years) by protecting the HTML of, for example, the external folders that store my wallpapers, which were a target in the past.  Downloaders don't even see them now.

To restrict archived downloading to certain folders by showing a public link, you don't need special template symbols.  Just use the diff template feature for the folders you want to have the folder archive link in.  No other folders are involved.  Works very well here. 
Title: Re: Testing build #100
Post by: give2me on May 07, 2007, 07:52:43 PM
oh boy .........
this is going to end in a wet dream  ;D

but can someone explain how to use the code?
I can't get this thing started  :-[

edit: I did some editing on my template, because that was the reason it did not work.
Now it's working superbly!!  ;)

offtopic: I noticed that resetting the hits on the log screen is already fixed.
Very nice job
RESPECT
Title: Re: Testing build #100
Post by: Esente on May 09, 2007, 02:05:21 PM
For the user that said this feature is only in the default template:

I don't know about the Terayon template?!? (The only template I use is ThunderChicken Of Glory.) But you can find the place for the header and add the following:

<a href="~folder.tar?recursive">Folder archive</a>

For example, in the ThunderChicken Of Glory template, find this code:

<a href="#bottom">Bottom</a>      
&nbsp;•&nbsp;

and replace with:

<a href="~folder.tar?recursive">Folder archive</a>
&nbsp;•&nbsp;

I'm not trying to impress anybody. I'm also a noob at this, but I found out that small thing. Hope it helps whoever needs it. Cheers.

BTW, cool feature. It works perfectly for me.
Title: Re: Testing build #100
Post by: TSG on May 09, 2007, 03:10:50 PM
Nice work Esente. The next build will have this feature and will also force people to upgrade their HFS, hehe. Or maybe I can just make a JS to hide it... meh lol
Title: Re: Testing build #100
Post by: Alons0 on May 09, 2007, 03:32:35 PM
"so big" ?
it's a standard 16x16 like every tray icon

rows? are you talking about the text message that pops up when you hover the mouse over it?
Yes, I'm talking about the tray message. Now there are three rows with info (the IP, uptime and downloads) and 3 empty rows (after I added symbols and then deleted them).
Title: Re: Testing build #100
Post by: Esente on May 09, 2007, 03:51:38 PM
Did I say "replace" in my previous post? My bad! What I really meant was to insert the 2nd code after the 1st, not replace it.

And, huh? I only have 3 rows with info in the tray message. No empty rows for me!
Title: Re: Testing build #100
Post by: Giant Eagle on May 09, 2007, 04:29:57 PM
I don't know about the Terayon template?!? (The only template I use is ThunderChicken Of Glory.) But you can find the place for the header and add the following:

<a href="~folder.tar?recursive">Folder archive</a>

HFS Terayon does not currently support this because I'm waiting for an official release of HFS that supports this function before I release an update of HFS Terayon.

Anyway, the same can be achieved for Terayon by just following Esente's description.

Look for the <span class="nav_top"> that looks 'almost' the same as the one below, and replace it with this one:

Code: [Select]
<span class="nav_top">
   <a href="/">Home</a> &#8226;
   <a href="">Refresh</a> &#8226;
   <a href="http://www.rejetto.com/forum/index.php?board=21.0" target="_blank">Forum</a> &#8226;
   <a href="%folder%~folder.tar?recursive">Download Archive</a> &#8226;
   <a href="">Current user: [%user%]</a> &#8226;
   <a href="~login">Login</a> &#8226;
   <a href="/template/about.html">About</a> &#8226;
   <a href="http://www.rejetto.com/forum/index.php?topic=4071.0" target="_blank">Help</a>
</span>

That should do it.
Title: Re: Testing build #100
Post by: rejetto unlogged on May 12, 2007, 12:24:55 PM
HFS Terayon does not currently support this because I'm waiting for an official release of HFS that supports this function before I release an update of HFS Terayon.

if you don't want to wait, or you want your tpl to work with older builds, you can detect the build with javascript:

if ("%build%" == "100") document.write("<a href='~folder.tar?recursive'>...</a>");
Title: Re: Testing build #100
Post by: Alons0 on May 14, 2007, 10:30:07 AM
...
And, huh? I only have 3 rows with info in the tray message. No empty rows for me!
I added symbols, then deleted them, and now there are 3 empty rows. Rejetto, please help :-[
Title: Re: Testing build #100
Post by: Rafi on May 18, 2007, 10:34:38 AM
Hi. can someone just tell me how to use this new archived downloading ? do I define something in the folder when I create it ?
Title: Re: Testing build #100
Post by: TSG on May 18, 2007, 12:56:59 PM
http://mooooooo/lol/~folder.tar

There. But you could have searched the thread for the answer.
Title: Re: Testing build #100
Post by: rejetto on May 23, 2007, 01:32:15 PM
to test this new feature, just use the DEFAULT template
Title: Re: Testing build #100
Post by: Rafi on May 23, 2007, 03:07:13 PM
hmmm... right... the "folder archive" text was so small that I didn't see it... :( 
Maybe add a nice new column/button/link at the right side of the directory list saying "download archived folder"?

BTW, something funny happened - inside the folder I wanted to archive was a file named "index.html"... the HFS template did not browse and show the file list, but opened this file instead... is that by design?
Title: Re: Testing build #100
Post by: MarkV on May 24, 2007, 03:22:50 AM
HFS has a feature called Default file mask... Maybe it is set for that folder? Note that this feature is inherited from parent folders by subfolders; if you set it somewhere higher in your structure, it may still be in effect. In the HFS GUI, hover your pointer over the folder in question and read the tooltip. If it mentions something like Default file mask: index.html, the option is in effect.
Title: Re: Testing build #100
Post by: Rafi on May 24, 2007, 04:12:54 AM
yes, you were right... index.* was inherited from above. What does this mask mean anyway, and why does it cause HFS to 'open' this file instead of listing all of them?
Title: Re: Testing build #100
Post by: Kalle B. on May 24, 2007, 06:50:12 PM
About this + Folder archive...  :-\

It is just what I've needed for so long, BUT when someone happens to click this link on my VFS root, it freezes my HFS for many hours. HFS starts scanning through my hundreds of gigabytes of data... some of which are on network drives (perhaps the biggest reason). So indeed there should be a way to 1) disable it for some folders and 2) make it not recursive.
Title: Re: Testing build #100
Post by: Kalle B. on May 24, 2007, 08:45:04 PM
Update to my last post:

The slow-down problem I reported in my last post was actually caused by the usage of the network drives... everything works great now that I've hidden all of those folders. But I still think there is a need for another menu item to disable downloading of the whole folder. And maybe something should be improved in the way HFS handles resources starting with "\\", as it currently freezes the program slightly in normal browsing too... some kind of simplified way of accessing them (not trying to find out file sizes and attributes), perhaps?
Title: Re: Testing build #100
Post by: TSG on May 24, 2007, 10:34:44 PM
~folder.tar is not recursive; it will only work on the files directly within the folder. The rest I agree with.

You must be using ~folder.tar?recursive or something...
Title: Re: Testing build #100
Post by: rejetto on May 25, 2007, 01:14:20 AM
what does this mask mean anyway, and why does it cause HFS to 'open' this file instead of listing all of them?

your "why" is the answer to your "what does".
why? because that's exactly what it does; some people need it, some don't ;)

It is just what I've needed for so long, BUT when someone happens to click this link on my VFS root, it freezes my HFS for many hours.

i think the same happens when you click "file list". isn't that true?
the problem is file listing, and the archive needs the file list to work.

And maybe something should be improved in the way HFS handles resources starting with "\\", as it currently freezes the program slightly in normal browsing too... some kind of simplified way of accessing them (not trying to find out file sizes and attributes), perhaps?

there is no simplified way i'm aware of.
Title: Re: Testing build #100
Post by: MarkV on May 25, 2007, 02:16:56 AM
yes, you were right... index.* was inherited from above. What does this mask mean anyway, and why does it cause HFS to 'open' this file instead of listing all of them?

This feature is only useful if you're serving a webpage with HFS. index.* means: IF there is a file called index.* (ATM index.htm[l]) THEN serve it in place of the folder index. If you're not serving any websites, just remove the index.*.
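Roughly, the behaviour is this (a sketch in Python, not HFS's actual code; serve_folder is a made-up name):
Code: [Select]
# Sketch of the "default file mask" behaviour described above
# (serve_folder is a made-up name, not an HFS function).
import fnmatch
import os

def serve_folder(folder, mask="index.*"):
    for name in sorted(os.listdir(folder)):
        if fnmatch.fnmatch(name, mask):
            # a matching file exists: serve it instead of the folder index
            return ("file", os.path.join(folder, name))
    return ("listing", sorted(os.listdir(folder)))  # otherwise list the folder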
Title: Re: Testing build #100 - Simplified ways to avoid long file-listing times
Post by: Rafi on May 25, 2007, 04:45:53 AM
there is no simplified way i'm aware of.
I'm aware of several ways (not sure they are simple...):
1. give a global option for whether to archive network drives or not
2. have the possibility to flag each HFS folder as included or not when archiving (defaulting to yes)
3. give the possibility to enable/disable archiving at the user-account level

 8)
Title: Re: Testing build #100
Post by: rejetto on May 25, 2007, 05:00:52 AM
that's not exactly what i was talking about... indeed this wouldn't solve the problem for a full archive
Title: Re: Testing build #100
Post by: Kalle B. on May 25, 2007, 01:45:34 PM
It is just what I've needed for so long, BUT when someone happens to click this link on my VFS root, it freezes my HFS for many hours.
i think the same happens when you click "file list". isn't that true?
the problem is file listing, and the archive needs the file list to work.
Yes, ~files.lst?recursive freezes HFS too if the network drives are visible; ~files.lst obviously doesn't. It's exactly the same thing as with ~folder.tar?recursive and ~folder.tar: ?recursive makes it freeze.

If there is no other way for HFS to gather the file listing from the network drive, then there is not much to be done...
Title: Re: Testing build #100
Post by: rejetto on May 25, 2007, 04:33:50 PM
how hard is the freezing? does HFS become unusable?
i may try to find a multithreading solution.
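the idea would be something like this rough sketch (Python standing in for whatever the Delphi code would do; list_files_async and on_done are made-up names): run the slow recursive listing in a worker thread so the main thread never blocks.
Code: [Select]
# Rough sketch of the multithreading idea (made-up names, not HFS code):
# the slow recursive listing runs in a worker thread, so the GUI and the
# other connections keep running; on_done receives the finished list.
import os
import threading

def list_files_async(folder, on_done):
    def worker():
        paths = []
        for root, _dirs, files in os.walk(folder):   # slow on \\network shares
            paths.extend(os.path.join(root, f) for f in files)
        on_done(paths)                               # deliver the result
    threading.Thread(target=worker, daemon=True).start()

# usage: list_files_async(r"\\server\share", lambda p: print(len(p), "files"))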
Title: Re: Testing build #100
Post by: Rafi on May 25, 2007, 04:39:54 PM
just an idea - an option for multiple-file rar... so you can start the download while it's still looking for the rest of the files...
Title: Re: Testing build #100
Post by: rejetto on May 25, 2007, 04:58:46 PM
you mean multi-volume archives?
Title: Re: Testing build #100
Post by: Rafi on May 25, 2007, 05:02:46 PM
yes...

edit: I really don't see why you should use this, unless you have some strange problems like those described before, or in the case of large file(s) (over 4GB?).
Title: Re: Testing build #100
Post by: Kalle B. on May 25, 2007, 08:15:55 PM
how hard is the freezing? does HFS become unusable?
i may try to find a multithreading solution.

Yes, the whole of HFS freezes: ongoing downloads, GUI, everything. It lasts for the whole time it takes HFS to get the list of files from the network drive. The time depends on the connection speed and the number of files & folders on the drive. With a 10M LAN connection and a small number of files it's like a minute or two, but when the count goes into the thousands, it's over half an hour... I haven't actually waited any longer than that, I just killed the hfs process from the task manager...

Multithreading would be great. I think that would fix this kind of freezing perfectly..
Title: Re: Testing build #100
Post by: rejetto on May 26, 2007, 03:15:14 PM
ok, i will work on it.

yes...

sadly that wouldn't solve the freezing problem :(
Title: Re: Testing build #100
Post by: Rafi on May 26, 2007, 03:33:54 PM
will it help in cases where the resulting archive file is very large (>4GB)?
Title: Re: Testing build #100
Post by: rejetto on May 26, 2007, 04:00:45 PM
nope.
"archiving" takes no time, it's real-time.
it's the listing that takes time.
Title: Re: Testing build #100
Post by: Rafi on May 26, 2007, 04:14:55 PM
rejetto, I think I am not making myself clear... when the file reaches the "other" side, it is created as ONE file, and it may be larger than 4GB. Wouldn't this be a problem if the remote OS does not support it?
Title: Re: Testing build #100
Post by: MarkV on May 26, 2007, 07:24:01 PM
rejetto, I think I am not making myself clear... when the file reaches the "other" side, it is created as ONE file, and it may be larger than 4GB. Wouldn't this be a problem if the remote OS does not support it?


Indeed, that's a point. Speaking of FAT16, it's even down to 2GiB. And these limits apply to both download and upload.

Rhetorical question: What if I download a ~folder.tar >4GiB and have FAT32? Baaad crash.

Solution #1: HFS detects the filesystem and disables the feature in case of FAT. Is there a way to detect the client's filesystem?
Solution #2: ~folder.tar always breaks into 2GiB pieces to cover all eventualities.

Code: [Select]
FAT16   - max. file size 2GiB
FAT32   - max. file size 4GiB-2B
NTFS4/5 - max. file size 2TiB (currently limited to volume size 2TiB)

see also http://www.ntfs.com/ntfs_vs_fat.htm


MarkV
Title: Re: Testing build #100
Post by: Giant Eagle on May 26, 2007, 08:21:22 PM
A simple solution would be to split the whole archive into 1GiB files... "-.folder.part1.tar" "-.folder.part2.tar"

But anyway... >_< are you really going to download an entire file server that is over 4GiB? I don't think so. Or can someone give me an example situation?
Title: Re: Testing build #100
Post by: Rafi on May 26, 2007, 08:26:00 PM
back to the beginning... that's what I suggested in the first place - an option to define whether and how to split the archives...
Title: Re: Testing build #100
Post by: rejetto on May 27, 2007, 03:34:41 PM
first, TAR doesn't support multi-volume.
second, do you really think i should spend time supporting mega-archives for FAT32?
Title: Re: Testing build #100
Post by: Rafi on May 27, 2007, 04:33:17 PM
Quote
second, do you really think i should spend time supporting mega-archives for FAT32?
edit: I really don't see why you should use this, unless you have some strange problems like those described before, or in the case of large file(s) (over 4GB?).
as I said - no...
... you could/might like to spend it on "tuning" the current implementation instead: archiving "permissions" per user and/or per folder, support for selecting multiple/single folder targets for archiving in the template (remote side, like the extra column I mentioned before), etc. :)

Again - a VERY nice feature !!!

Title: Re: Testing build #100
Post by: maverick on May 27, 2007, 05:34:57 PM
second, do you really think i should spend time supporting mega-archives for FAT32?

No, rejetto.  That time can be spent doing other things.  I haven't used a FAT32 system for years. 
Title: Re: Testing build #100
Post by: MarkV on May 27, 2007, 08:30:30 PM
first, TAR doesn't support multi-volume.
second, do you really think i should spend time supporting mega-archives for FAT32?

No, but HFS should at least have an option where you select the filesystem used. If you select FAT or FAT32, HFS disables ~folder.tar for folders larger than 2GiB or 4GiB-2B. At least prevent crashes or corrupt files in these cases. Maybe even a warning for the client:
Code: [Select]
Please note that if your filesystem is FAT32, folder archives larger than approximately 4GiB cannot be downloaded correctly.
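A hedged sketch of that gate (archive_allowed and the filesystem argument are made-up names, not existing HFS options; limits from the table above):
Code: [Select]
# Hedged sketch of the proposed gate, not an existing HFS option:
# refuse (or warn) when the folder's total size exceeds what the
# selected client filesystem can store as a single file.
FAT16_MAX = 2 * 2**30        # 2GiB
FAT32_MAX = 4 * 2**30 - 2    # 4GiB-2B

def archive_allowed(folder_size, client_fs):
    limit = {"FAT16": FAT16_MAX, "FAT32": FAT32_MAX}.get(client_fs)
    return limit is None or folder_size <= limit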

MarkV
Title: Re: Testing build #100
Post by: TSG on May 28, 2007, 05:21:20 AM
Yeah, good idea MarkV. I wouldn't go coding away to make FAT32 support; NTFS has been the way for Windows PCs for at least the last 7 years... even my sister's old Windows 2000 machine is on NTFS lol. Not many users would use anything older than Windows 2000 nowadays... if they do, then they are either on a system that has only like 6GB *wild guess at minimal HDD space*, which would be completely useless for 4GB files, or they REALLY should think about buying a new computer lol. I will admit, the external hard drive I use is formatted in FAT32, but I wouldn't go putting files bigger than 4GB on it anyway haha.
Title: Re: Testing build #100
Post by: MarkV on May 28, 2007, 10:45:13 PM
Yes, the point here is NOT to encourage the use of FAT/FAT32, but to prevent program crashes or corruption by disabling ~folder.tar for folders >2GiB (FAT16) / >4GiB-2B (FAT32).

The problem would be the detection. Server-side, no problem I guess; HFS could do this and act accordingly. But client-side... Maybe with JavaScript?


MarkV
Title: Re: Testing build #100
Post by: Rafi on May 29, 2007, 05:19:27 AM
maybe the HFS template could just provide a better save dialog, saying the expected save size, or just warn about it... if possible - maybe - split it itself? ...
Title: Re: Testing build #100
Post by: TSG on May 29, 2007, 07:52:55 AM
Yes, the point here is NOT to encourage the use of FAT/FAT32, but to prevent program crashes or corruption by disabling ~folder.tar for folders >2GiB (FAT16) / >4GiB-2B (FAT32).

The problem would be the detection. Server-side, no problem I guess; HFS could do this and act accordingly. But client-side... Maybe with JavaScript?


MarkV

It would be easy to do locally with HFS.

I doubt that JavaScript is possible though... it's a bit of a security risk for JavaScript to read what the filesystem is... I am unsure if browsers/firewalls even allow this functionality.

YOU CAN, however, detect the version of Windows they are using. I have seen it done before... somewhere :P
Title: Re: Testing build #100
Post by: Alons0 on May 29, 2007, 11:12:52 AM
...
And, huh? I only have 3 rows with info in the tray message. No empty rows for me!
I added symbols, then deleted them, and now there are 3 empty rows. Rejetto, please help :-[
Title: Re: Testing build #100
Post by: TSG on May 29, 2007, 02:39:49 PM
Alonso, I'm getting kind of tired of seeing your redundant posts about your tray message; you obviously have some line breaks in there. Try removing all the characters in there, hit Apply, then OK - you should have a blank tray message - then put the symbols you want back in. Trial-and-error it until you fix it. You have posted it... I think I have counted 4 times now... and haven't worked it out... 3 pages later... I would have tried re-installing HFS by that stage! haha.
Title: Re: Testing build #100
Post by: rejetto on May 31, 2007, 04:10:54 PM
Yes, the point here is NOT to encourage the use of FAT/FAT32, but to prevent program crashes or corruption by disabling ~folder.tar for folders >2GiB (FAT16) / >4GiB-2B (FAT32).

The problem would be the detection. Server-side, no problem I guess; HFS could do this and act accordingly. But client-side... Maybe with JavaScript?

before discussing whether it should be done client- or server-side:
there is no fast way to know the size of the archive in advance.
listing the files can take minutes.
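(side note: once the listing does exist, the size of a plain tar is just arithmetic - a sketch in Python, assuming basic ustar entries without long-name extensions - which is why the listing really is the only slow part)
Code: [Select]
# Sketch: size of an uncompressed tar from the file sizes alone
# (assumes plain ustar entries, no long-name extensions).
def tar_size(file_sizes):
    # 512-byte header + data padded to a 512 multiple, per file...
    members = sum(512 + ((s + 511) // 512) * 512 for s in file_sizes)
    return members + 1024   # ...plus two terminating zero blocks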
Title: Re: Testing build #100
Post by: rejetto on May 31, 2007, 04:15:08 PM
...
And, huh? I only have 3 rows with info in the tray message. No empty rows for me!
I added symbols, then deleted them, and now there are 3 empty rows. Rejetto, please help :-[

open a specific topic, and post a screenshot of your problem
Title: Re: Testing build #100
Post by: rejetto on May 31, 2007, 04:19:33 PM
No, but HFS should at least have an option where you select the filesystem used. If you select FAT or FAT32, HFS disables ~folder.tar for folders larger than 2GiB or 4GiB-2B.

"you" ... who?
the problem is for the client only (and you don't know its file system).
HFS doesn't need NTFS to support 8GB archives, because they are virtual.
Title: Re: Testing build #100
Post by: rejetto on May 31, 2007, 04:23:48 PM
Yeah, good idea MarkV. I wouldn't go coding away to make FAT32 support; NTFS has been the way for Windows PCs for at least the last 7 years...

i know several XP laptops with a FAT32 file system

Quote
the external hard drive I use is formatted in FAT32, but I wouldn't go putting files bigger than 4GB on it anyway haha.

if you are interested, run "convert /?" from the command line, and you'll see you can switch to NTFS
Title: Re: Testing build #100
Post by: MarkV on May 31, 2007, 11:16:44 PM
No, but HFS should at least have an option where you select the filesystem used. If you select FAT or FAT32, HFS disables ~folder.tar for folders larger than 2GiB or 4GiB-2B.

"you" ... who?
the problem is for the client only (and you don't know its file system).
HFS doesn't need NTFS to support 8GB archives, because they are virtual.

I'm talking about uploads - not only archives but all types of files. Unfortunately, there is no way for HTML to determine the size of an upload before it's done. Uploading a ~folder.tar is not implemented yet.

LZMA (without compression) supports multi-volume archives. All popular archivers support LZMA (.7z), too. Maybe an alternative?

PS. My laptop (Acer Aspire 5101AWLMi) came with FAT32-formatted drives, too. I had to convert them myself. Most users don't know about 'convert' and just go with FAT32.
Title: Re: Testing build #100
Post by: rejetto on June 03, 2007, 01:03:26 PM
LZMA (without compression) supports multi-volume archives. All popular archivers support LZMA (.7z), too. Maybe an alternative?

for uploading, i guess .ZIP will be enough, and compression will be supported (but not multi-volume).
the no-compression restriction for downloading is related to the "virtuality" of the archive,
and there's no virtual archive in uploading.