rejetto forum

testing 2.1



ANTS

  • Guest
Rejetto, I have found a big security flaw with this. If a folder is password protected and you use ~folder.tar?recursive, the protected folder is included in the download even if the user doesn't enter a password. I love this download feature, but it really needs to be fixed when you get a chance.


Offline maverick

  • Tireless poster
  • Posts: 1052
  • Computer Solutions
Quote from: "rejetto"
beta14
+ tar archives
The download tray icon doesn't show during downloads, and downloads aren't counted. The 'Let Download' setting isn't respected: if 'Let Download' is disabled for a folder, the download is allowed anyway. This needs to be fixed. Since 'Let Browse' is closely related to 'Let Download', it should be checked as well, to make sure downloads from non-browseable folders aren't allowed for users who don't have access. With beta14, an upload folder with [nofiles] can't be opened? HFS would time out if I waited long enough.

Folder archives were brought up many months ago as a suggested feature. At that time it was also suggested to use an archive format that is more common on the net. You mentioned the zip format might be too hard to implement, and when the rar format was brought up, you said you would check into it. The rar format is commonly used worldwide. Any chance of rar archives?

Tested OK with STunnel.

Quote from: "ANTS"
Rejetto, I have found a big security flaw with this. If a folder is password protected and you use ~folder.tar?recursive, the protected folder is included in the download even if the user doesn't enter a password.
I don't see the security flaw you mention. If I password protect certain folders inside a sub-directory containing other folders, then use the ~folder.tar?recursive HFS archiving method, all of the folders are archived except the ones I password protected, which I would think is the expected behavior.
maverick


Offline rejetto

  • Administrator
  • Tireless poster
  • Posts: 13510
Quote from: "maverick"
The download tray icon doesn't show during downloads, and downloads aren't counted. The 'Let Download' setting isn't respected.
...
thanks for mentioning, it will help.
as i said, this is an experimental version and was meant just for testing the archive streaming.


Quote
Any chance of rar archives?

 15/06/2006 01.35.43, Computer Guy
 heh still tar eh :)
 can't get PK to work?

 15/06/2006 01.38.39, rejetto
that's because you think the problem is only the format
the tar took many hours because you have to solve many other problems that have nothing to do with the format
if you take on too many problems in a single shot, you just fail

 15/06/2006 01.39.24, Computer Guy
 libraries and stuff like that?

 15/06/2006 01.39.30, rejetto
no lib can help me
because none of them work in streaming
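
(For context on why tar is the streaming-friendly format: a tar archive is just a sequence of 512-byte headers, each followed by that file's data, so it can be written straight to the connection without ever seeking back; zip, roughly speaking, wants sizes and CRCs recorded up front or a central directory at the end. Below is a minimal sketch using Python's standard tarfile module in stream mode; the file names are made up, and this is only an illustration, not HFS's actual code:)

    import sys
    import tarfile

    # mode "w|" selects pure streaming output: tarfile never seeks,
    # so the archive can be generated on the fly as it downloads.
    with tarfile.open(fileobj=sys.stdout.buffer, mode="w|") as tar:
        for path in ("readme.txt", "docs/manual.pdf"):  # hypothetical files
            tar.add(path)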


Quote
Tested OK with STunnel.
nice


Offline rejetto

  • Administrator
  • Tireless poster
  • Posts: 13510
Quote from: "ANTS"
Rejetto I have founda big security flaw with this. If a folder is password protected and you use ~folder.tar?recursive the protected folder is included in the download even if the user doesn't enter a password. I love the feature of downloading files but this really needs to be fixed when you get a chance.
yes, i just noticed. as i said above, it was just to test if the stream works. that's why i didn't update the first post with this version. i will now work on the rest to get a complete beta.


Offline mastabog

  • Occasional poster
  • Posts: 18
Another suggestion well worth looking into, in my opinion: link fingerprints, to automate file verification.

They are starting to be supported by download managers (e.g. GetRight), browser extensions (e.g. mdhashtool for Firefox) and others. It would be nice if this were an option in HFS, so that URLs are copied with the hash anchors and the anchors are also available in directory file listings.

I was thinking of adding a command to the right-click menu, say 'Copy URL with MD5 hash', or a checkbox option so that whenever the user double-clicks, the MD5 hash is computed and added to the link (both in the clipboard and in HFS's address bar). You might even make a 'Copy URL options' submenu and add the option in there.

Here is what I was thinking of:

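(For illustration, with a made-up hash and one proposed anchor syntax, #!md5!<hash>, a fingerprinted link would look like:)

    http://myserver:8080/somefile.zip#!md5!d41d8cd98f00b204e9800998ecf8427e

(And a minimal sketch of what a 'Copy URL with MD5 hash' command would have to compute; Python, with a hypothetical helper name and the anchor syntax assumed as above:)

    import hashlib

    def url_with_md5(url, local_path):
        # Hash the local file in chunks and append the result as a
        # link-fingerprint anchor (assumed syntax: #!md5!<hash>).
        h = hashlib.md5()
        with open(local_path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return url + "#!md5!" + h.hexdigest()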

Since it's just a link anchor (#), it doesn't create problems for browsers or download managers that do not support link fingerprints. However, I don't think it's a good idea to have this enabled at all times, as large files will cause heavy disk activity and CPU usage. You could also have a global option for this, like the ones in the global "Menu > IP address" menu ... some users sharing smaller files might want it globally enabled (not my case though).

Many of the friends I share large files with could benefit from this, as they have slow connections. Right now I have to use external tools to create separate md5 files, have my friends download those as well, and then have them verify the files locally with yet other third-party tools. With link fingerprints and the Firefox extension or GetRight, they could do it all in one click, and when the download finishes they would get prompted if the file hash verification failed ... no more third-party tools and separate files to be hosted and downloaded by either me or my users.

I think this would make a great addition to HFS and it would probably be one of the first web servers (if not the first ever) to support link fingerprints natively ;)

Anyway, thanks for reading :)


Offline rejetto

  • Administrator
  • Tireless poster
  • Posts: 13510
Quote from: "maverick"
I don't see the security flaw you mention. If I password protect certain folders inside a sub-directory containing other folders, then use the ~folder.tar?recursive HFS archiving method, all of the folders are archived except the ones I password protected, which I would think is the expected behavior.
it happens only if you have "list protected items..." disabled. i guess you have it enabled.


Offline maverick

  • Tireless poster
  • Posts: 1052
  • Computer Solutions
Quote from: "rejetto"
it happens only if you have "list protected items..." disabled. i guess you have it enabled.
Yes, I have it enabled. That's the behavior I want: the only folders my users see are the ones pertaining to their access level.
maverick


ANTS

  • Guest
Quote from: "maverick"
With beta14, an upload folder with [nofiles] can't be opened?  HFS would time out if I waited long enough.

I just tested that myself.
Rejetto, folders in the VFS that have no files in them cannot be opened. It just times out.


Offline rejetto

  • Administrator
  • Tireless poster
  • Posts: 13510
i fixed many things and added initial support for link fingerprints

redownload if you want, or wait for the final beta14


ANTS

  • Guest
Thanks rejetto.

However, tar downloads don't seem to be working for Virtual Folders. It's just a 512-byte damaged file.


Offline maverick

  • Tireless poster
  • Posts: 1052
  • Computer Solutions
Quote from: "rejetto"
i fixed many things and added initial support for link fingerprints
Quote from: "ANTS"
However, tar downloads don't seem to be working for Virtual Folders. It's just a 512-byte damaged file
I can confirm that the new beta14 archiving isn't working at all. I just get a damaged file, like ANTS said.

Support for 'link fingerprints': I might have missed it, but I saw nothing at all pertaining to this in the new beta14 menus. Question: if no mirrors are involved, why would link fingerprints be necessary?
maverick


Offline mastabog

  • Occasional poster
  • Posts: 18
Link fingerprints were my suggestion (I wrote a long and detailed post above in this thread, here).

File verification through md5 or sha1 checks is not tied to mirror downloading but to broken downloads. When downloading big files, or on a slow connection, disconnections can lead to broken downloads; md5/sha1/sfv/etc. files are the usual means to verify that the download was OK.

A lot of websites provide md5 files in the same directory as the files to download (check any Linux distro, Apache, etc). Link fingerprints automate this process without the need for additional files: the md5 hash is simply added as a link anchor to the original file's link. If you want to understand file verification better, please refer to http://microformats.org/wiki/hash-examples. Link fingerprints are a new idea and they are getting popular. You can read more about them in my post above.
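
(To make the client side concrete, here is a minimal sketch of the check a download manager or browser extension performs once the download finishes; the #!md5!<hash> anchor syntax is an assumption, as above:)

    import hashlib

    def verify_download(local_path, fingerprint_url):
        # Extract the expected hash from the URL's anchor and
        # compare it against the MD5 of the downloaded file.
        expected = fingerprint_url.rsplit("#!md5!", 1)[1].lower()
        h = hashlib.md5()
        with open(local_path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest() == expected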


Offline ants

  • Occasional poster
  • Posts: 20
I have posted a lot and now I am registered :D :D

Anyway, thanks for the link mastabog, I never quite understood what md5 and sha1 were about.

Moving on: maverick, have you tried tar downloads with real folders? It works for me, and password-protected/restricted directories are no longer downloaded unless the user has logged in or supplied the password, which is great. I'm not sure if the file is counted as a download though.

EDIT: However, I have 2.36gb in the root folder (just a single 'Real' folder), and while the download box comes up as 2.36gb, what I actually get is only a 512-byte damaged file.

EDIT 2 *Just tested*: ~folder.tar?recursive does not work with over 2gb worth of files. If I have under 2gb, it works (Internet Explorer 7.0 Beta 2).


Offline maverick

  • Tireless poster
  • Posts: 1052
  • Computer Solutions
Quote from: "ants"
Moving on, and maverick, have you tried tar downloads with real folders?
No. There is a problem with this beta. It shouldn't matter whether the files are in virtual or real folders.

Quote
However, I have 2.36gb in the root folder (just a single 'Real' folder), and while the download box comes up as 2.36gb, what I actually get is only a 512-byte damaged file.
I'm not bothering to test this beta any further.  Will wait until rejetto checks things out and does some fixing.

Quote
EDIT 2 *Just tested*: ~folder.tar?recursive does not work with over 2gb worth of files. If I have under 2gb, it works (Internet Explorer 7.0 Beta 2).
Don't forget about the limitations of the browser: IE = 2gb, Opera = 2gb, Firefox = 4gb. Workaround: use a good download manager.
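
(Those ceilings most likely come from 32-bit byte counters, which is an assumption rather than anything documented here: a signed 32-bit length tops out at 2^31 = 2,147,483,648 bytes, i.e. 2gb, and an unsigned one at 2^32 = 4,294,967,296 bytes, i.e. 4gb, so any component that stores the content length in 32 bits will break past those sizes.)

    # 32-bit length limits, matching the 2gb / 4gb ceilings above
    print(2**31)  # 2147483648 bytes -> signed 32-bit limit (2gb)
    print(2**32)  # 4294967296 bytes -> unsigned 32-bit limit (4gb)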
maverick


Offline rejetto

  • Administrator
  • Tireless poster
  • Posts: 13510
i need to know what the *.MD5 file format is
i found nothing
does anyone know?
for now i only support single <filename>.md5 files
i don't know if there are other grouping formats
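
(For reference, and as far as I know this is only the de facto convention written by GNU md5sum rather than an official spec: each line of a .md5 file is the 32-hex-digit MD5, two spaces (or a space plus '*' for binary mode), then the filename. One .md5 file can list many entries, which covers the "grouping" case. The filenames below are made up:)

    d41d8cd98f00b204e9800998ecf8427e  readme.txt
    9e107d9d372bb6826bd81d3542a419d6 *archive.tar

(A small sketch that writes such a file, for completeness:)

    import hashlib

    def write_md5_file(paths, out_path):
        # Emit md5sum-compatible lines: "<hash>  <filename>".
        with open(out_path, "w") as out:
            for p in paths:
                h = hashlib.md5()
                with open(p, "rb") as f:
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        h.update(chunk)
                out.write(h.hexdigest() + "  " + p + "\n")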