rejetto forum

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - ninjapimp

Pages: 1
1
Bug reports / download limit problem
« on: April 06, 2011, 02:44:04 PM »
In HFS I've made sure there are no limits of any kind, yet users (and I) still hit the problem that when trying to download a file, it says a download limit has been reached.

How do I fix this?

I can see fewer than 3 people on my HFS downloading files, and when I log in to download something on the same account, I'm denied.

The idea is that anyone can log in and download, but for some reason HFS denies them even though they aren't already downloading anything.

This problem happens with HFS running behind SSL.

Here is a copy-and-paste of the error:
Download limit
On this server there is a limit on the number of simultaneous downloads.
This limit has been reached. Retry later.

I notice the problem happens when I try to download a 5th file:
it allows 4 files to download, but after that no more,
yet the limit for simultaneous downloads is already set to disabled.


Here is my hfs.ini:
--------------------
HFS 2.3 beta - Build #272
active=yes
only-1-instance=no
window=93,7,779,511
window-max=no
easy=no
port=44300
files-box-ratio=0.206489675516224
log-max-lines=2000
log-read-only=yes
log-file-name=hfs_log_SSL_Year_%y%_month-%m%.log
log-font-name=
log-font-size=0
log-date=yes
log-time=yes
log-only-served=no
log-server-start=no
log-server-stop=no
log-connections=yes
log-disconnections=no
log-bytes-sent=yes
log-bytes-received=no
log-replies=yes
log-requests=yes
log-uploads=yes
log-deletions=yes
log-full-downloads=yes
log-dump-request=no
log-browsing=yes
log-icons=yes
log-progress=no
log-banned=no
log-others=yes
log-file-tabbed=no
log-apache-format=%h %l %u %t "%r" %>s %b "%{Referer}i %z"
tpl-file=D:\hfs_2\hfs.tpl
tpl-editor=
delete-dont-ask=no
free-login=no
confirm-exit=no
keep-bak-updating=no
include-pwd-in-pages=no
ip=127.0.0.1
custom-ip=127.0.0.1
listen-on=127.0.0.1
external-ip-server=
dynamic-dns-updater=
dynamic-dns-user=
dynamic-dns-host=
search-better-ip=no
start-minimized=no
connections-height=163
files-stay-flagged-for-minutes=0
auto-save-vfs=yes
folders-before=yes
links-before=yes
use-comment-as-realm=yes
getright-template=yes
auto-save-options=yes
dont-include-port-in-url=no
persistent-connections=yes
modal-options=yes
beep-on-flash=no
prevent-leeching=yes
delete-partial-uploads=no
rename-partial-uploads=
enable-macros=yes
use-system-icons=yes
minimize-to-tray=yes
tray-icon-for-each-download=no
show-main-tray-icon=yes
always-on-top=no
quit-dont-ask=no
support-descript.ion=yes
oem-descript.ion=no
enable-fingerprints=yes
save-fingerprints=no
auto-fingerprint=0
encode-pwd-url=yes
stop-spiders=yes
backup-saving=yes
recursive-listing=yes
send-hfs-identifier=yes
list-hidden-files=no
list-system-files=no
list-protected-items=no
enable-no-default=no
browse-localhost=no
add-folder-default=
default-sorting=
last-dialog-folder=\\thecus\raid5\FTP
auto-save-vfs-every=600
last-update-check=40206.8042147454
allowed-referer=
forwarded-mask=127.0.0.1
tray-shows=connections
tray-message=%ip%\nUptime: %uptime%\nDownloads: %downloads%
speed-limit=-1
speed-limit-ip=-1
max-ips=0
max-ips-downloading=0
max-connections=0
max-connections-by-ip=0
max-contemporary-dls=0
max-contemporary-dls-ip=0
login-realm=
open-in-browser=*.htm;*.html;*.jpg;*.jpeg;*.gif;*.png;*.txt;*.swf;*.svg;*.avi;*.mpg;*.mkv;*.m4v;*.mp4;*.mpeg
flash-on=
graph-rate=10
graph-size=59
graph-visible=yes
no-download-timeout=0
connections-timeout=20
no-reply-ban=no
ban-list=
add-to-folder=
last-file-open=D:\hfs_2\SSL File System.vfs
reload-on-startup=yes
https-url=no
find-external-on-startup=no
encode-non-ascii=yes
encode-spaces=no
mime-types=*.htm;*.html|text/html|*.jpg;*.jpeg;*.jpe|image/jpeg|*.gif|image/gif|*.png|image/png|*.bmp|image/bmp|*.ico|image/x-icon|*.mpeg;*.mpg;*.mpe|video/mpeg|*.avi|video/x-msvideo|*.txt|text/plain|*.css|text/css|*.js|text/javascript
in-browser-if-mime=yes
icon-masks=.ipa|18||.rar|10001||
icon-masks-user-images=10001:eNpz93SzsEwUYBBgSGNgUPzJwsjA4M6gwwAEILG2H7
address2name=
recent-files=D:\hfs_2\SSL File System.vfs|D:\hfs\iphonoe.vfs
trusted-files=D:\hfs\iphonoe.vfs|D:\hfs_2\SSL File System.vfs
leave-disconnected-connections=no
[color=blue]Edit by SilentPliz: removed private account information[/color]
account-notes-wrap=yes
tray-instead-of-quit=no
compressed-browsing=no
use-iso-date-format=no
hints4newcomers=yes
save-totals=yes
log-toolbar-expanded=no
number-files-on-upload=yes
do-not-log-address=
last-external-address=
min-disk-space=0
out-total=2023065555573
in-total=997068128
hits-total=2264211
downloads-total=27843
upload-total=0
many-items-warning=yes
load-single-comment-files=yes
copy-url-on-start=no
connections-columns=IP address;120|File;180|Status;180|Speed;60|Time left;55|Progress;84|
auto-comment=no
update-daily=no
delayed-update=no
tester-updates=no
copy-url-on-addition=yes
ip-services=http://www.dovedove.it/hfs/ip.php|;http://www.checkip.org|color=green>;http://www.whatismyip.org/|;http://www.melauto.it/public/rejetto/ip.php|;http://checkip.dyndns.org|:;http://2ip.ru|#0033FF">;http://www.canyouseeme.org|td><b>
ip-services-time=40326.5595210301
update-automatically=no
prevent-standby=no
ips-ever-connected=127.0.0.1;196.201.51.14;
----------------------

2
How do I change the path of an existing virtual file system entry?

My external RAID server crashed:
\\thecus\raid5\ftp

So I replaced the failed hardware and created a new array, but instead of being called \\thecus\raid5 it is now \\thecus\raid.

I cannot rename the RAID once it's created; renaming it would mean breaking the array and losing all the data just to rename it.

So instead I need to reconfigure HFS to point at the new location,

but I'm having a hard time changing the path.

I tried adding the new path, but that won't work either, as HFS won't find the target, yet Windows Explorer sees it fine.

I can use Windows Explorer to drag and drop the folder in again, but then I lose all the settings on the original one,
and I don't want to set everything up again (users, permissions, etc.).

All I want to do is edit the path, but HFS won't let me.

Can anyone help me, please?
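Not an official fix, but if the .vfs file turns out to store its paths as plain text (an assumption worth checking in a text editor first), a backup-and-replace script could repoint every entry at once. The function name and encoding below are mine, not HFS's; a sketch only:

```python
import shutil

def repoint_vfs(vfs_path, old_root, new_root):
    """Back up the .vfs file, then replace every occurrence of old_root with new_root."""
    shutil.copy2(vfs_path, vfs_path + ".bak")   # keep a backup in case the format is not plain text
    with open(vfs_path, "r", encoding="latin-1") as f:
        data = f.read()
    with open(vfs_path, "w", encoding="latin-1") as f:
        f.write(data.replace(old_root, new_root))

# usage (close HFS first, or it may overwrite the edited file on exit):
# repoint_vfs(r"D:\hfs_2\SSL File System.vfs", r"\\thecus\raid5", r"\\thecus\raid")
```

If the file looks binary rather than textual, skip this and rebuild the entry instead.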

3
66.249.68.167
A reverse lookup on that IP shows: crawl-66-249-68-167.googlebot.com

5/12/2010 11:06:20 PM 66.249.68.167:52928 Requested GET /WALLPAPERS/H.R.Giger/?sort=e&rev=1
5/12/2010 11:06:20 PM 66.249.68.167:52928 Sent 1460 bytes
5/12/2010 11:06:20 PM 66.249.68.167:52928 Served 53.81 KB

In my options I have "Prevent Spiders" enabled,

so I'm wondering how Google is getting past it.
I'm using beta build 260, and I don't remember this being a problem in previous builds.

Is there something I can do to block it?
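As far as I know (worth verifying for build 260), the stop-spiders option works by answering requests for robots.txt with a deny-all file, which polite crawlers such as Googlebot only honor after they next re-fetch robots.txt; it is not an IP block. A minimal deny-all robots.txt looks like this:

```
User-agent: *
Disallow: /
```

For a hard block, the ban-list setting (visible in the full hfs.ini earlier on this page) could hold the crawler's address instead.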



4
I run two HFS sites

One is a regular non-SSL HFS, and the other runs behind stunnel.
There's a small problem I noticed that I'm not sure how to fix.
When I reboot my computer, I launch d:/hfs/hfs.exe, which is the normal version,
then I launch the SSL version from d:/hfs_2/hfs_ssl.exe,
but the second instance of HFS starts with the settings and template of the first.
So then I have to set the port back to 44300, change the IP to 127.0.0.1, and lastly change the template.

Is there a way to make each HFS load its settings from its own folder? I want them to run independently of each other, each with its own settings and template.
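If I remember HFS's option storage right (worth verifying): options are kept in the registry unless an hfs.ini sits next to the exe, and Menu > Save options > to file creates that file. Saving options to file once in each folder should then let each instance keep its own values. The lines below are just the per-instance keys from the ini pasted earlier:

```
; in d:\hfs_2\hfs.ini (each instance reads the ini sitting next to its exe)
port=44300
ip=127.0.0.1
tpl-file=D:\hfs_2\hfs.tpl
```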

The other question is about the log file.
I noticed the log file was not generated for March.
HFS only seems to generate a new monthly log after I close it; if I leave it running, it will NOT make a new log for the month. Instead I must close HFS and restart it, and then it creates the new monthly log. Looking at the log, it appears to have gotten stuck on Feb 25th, because there are pages and pages of downloads for that date. I saw someone download a file just now; the log registered it as Feb 25th, yet it's March 3rd. So clearly something has gone wrong: it did not create a new log, and it kept writing to the February log for some reason. When I closed HFS and started it right away, it made a new log.

Is there a way to make HFS create a new log on the day the month changes?
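One stopgap, assuming the %m% macro in log-file-name is only re-evaluated when HFS starts: schedule a restart on the 1st of each month. The task name and restart_hfs.bat below are hypothetical placeholders, not anything that ships with HFS:

```bat
:: create a Windows scheduled task that runs a (hypothetical) restart script
:: just after midnight on the 1st of every month
schtasks /create /tn "HFS monthly restart" /sc monthly /d 1 /st 00:01 /tr "d:\hfs_2\restart_hfs.bat"
```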

I'm using HFS beta build 253.

Thanks for the help.

5
HFS ~ HTTP File Server / hfs not responding..thinking..when ya do a search
« on: February 12, 2010, 11:09:00 PM »
I use the RAWR template because it has a search function.
With beta build 253, I notice that when I type something to search, it just sits there forever:
the page says "loading",
and if I go to HFS, it says "not responding".
After about 30 seconds it lets me kick all connections, which say "thinking...",
but it never displays the search results.

For example, I go into a folder that has 257 files in it,
and one of the files is named "sim city".
I search for "sim", but it never displays results: HFS says "thinking" on the IP address connection, the window title bar says "not responding", and after a bit that goes away, but the web page keeps saying "loading" and never displays results.

Is it a problem with HFS or with the RAWR template?

6
I've tried for a few hours, but I can't make it work.

I get this error in the stunnel log:
2010.02.10 14:27:13 LOG7[3828:2352]: Snagged 64 random bytes from C:/.rnd
2010.02.10 14:27:13 LOG7[3828:2352]: Wrote 1024 new random bytes to C:/.rnd
2010.02.10 14:27:13 LOG7[3828:2352]: RAND_status claims sufficient entropy for the PRNG
2010.02.10 14:27:13 LOG7[3828:2352]: PRNG seeded successfully
2010.02.10 14:27:13 LOG7[3828:2352]: Certificate: stunnel.pem
2010.02.10 14:27:13 LOG7[3828:2352]: Certificate loaded
2010.02.10 14:27:13 LOG7[3828:2352]: Key file: stunnel.pem
2010.02.10 14:27:13 LOG7[3828:2352]: Private key loaded
2010.02.10 14:27:13 LOG7[3828:2352]: SSL context initialized for service https
2010.02.10 14:27:13 LOG5[3828:2352]: stunnel 4.29 on x86-pc-mingw32-gnu with OpenSSL 0.9.8l 5 Nov 2009
2010.02.10 14:27:13 LOG5[3828:2352]: Threading:WIN32 SSL:ENGINE Sockets:SELECT,IPv6
2010.02.10 14:27:13 LOG5[3828:1048]: No limit detected for the number of clients
2010.02.10 14:27:13 LOG7[3828:1048]: FD 200 in non-blocking mode
2010.02.10 14:27:13 LOG7[3828:1048]: SO_REUSEADDR option set on accept socket
2010.02.10 14:27:13 LOG3[3828:1048]: Error binding https to 0.0.0.0:443
2010.02.10 14:27:13 LOG3[3828:1048]: bind: Permission denied (WSAEACCES) (10013)

2010.02.10 14:27:13 LOG3[3828:1048]: Server is down
--------------------------------------------------------
I downloaded stunnel from here: ftp://stunnel.mirt.net/stunnel/
I created a fresh PEM file from here: http://www.stunnel.org/pem/
Here is the list of what I've done to set it up thus far:
1. make a new folder called HFS_2
2. put a copy of hfs253.exe in it and rename it to HFS_SSL_253.exe
3. download stunnel and extract it to the same folder as HFS
4. place the created PEM file in the same folder as HFS
5. read http://www.rejetto.com/wiki/index.php?title=HFS:_Secure_your_server
6. edit/create the stunnel.conf file, which looks like this:
; Lines preceded with a ";" are comments
; Empty lines are ignored
; For more options and details, see the manual (stunnel.html)

; File with certificate and private key
cert = stunnel.pem
key = stunnel.pem

; Log level (1=minimal, 5=recommended, 7=all) and log file
; Precede with a ";" to disable logging
debug = 5
output = stunnel.log

; Some performance tuning
socket = l:TCP_NODELAY=1
socket = r:TCP_NODELAY=1

; Data compression algorithm: zlib or rle
compression = zlib

; SSL bug options / no SSLv2 (SSLv3 and TLSv1 are enabled)
options = ALL
options = NO_SSLv2

; Service-level configuration
; Stunnel listens to port 443 (HTTPS) to any IP
; and connects to port 44300 (HFS) on localhost
[https]
accept = 0.0.0.0:443
connect = 127.0.0.1:44300
TIMEOUTclose = 0

7. in HFS, change the IP address to 127.0.0.1
8. in HFS, change the port to 443
9. my router firewall is also forwarding port 443
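Comparing steps 7 and 8 with the stunnel.conf above: the [https] section already binds 443 and forwards to 44300, so, as I read the wiki's setup (an assumption worth double-checking), HFS itself should stay on 44300 and only stunnel should touch 443. The "bind: Permission denied" on 0.0.0.0:443 is also what you get when another service, such as IIS, already holds port 443. The intended split:

```
; stunnel.conf: stunnel owns port 443 and decrypts the traffic...
[https]
accept  = 0.0.0.0:443
connect = 127.0.0.1:44300

; hfs.ini: ...while HFS listens in plain HTTP on 44300, localhost only
port=44300
listen-on=127.0.0.1
```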


If I type the URL of my site, it fails:
https://xxx.server.com:443    says page not found.
I'm not having much luck; stunnel seems to accept the stunnel.pem file fine.
http://xxx.server.com:443    this works, but it's not over SSL. How do I make it use SSL?

What am I doing wrong? I believe I have all the proper files in the folder as they should be, and I followed the wiki step by step, except that I created my key online; I did not use openssl.

Can someone take pity and point out the obvious mistake I'm making?

Note: I did go back and try to create my own PEM file using openssl, but could not make it work:
openssl.exe req -new -x509 -days 3650 -nodes -config pem.conf -out stunnel.pem -keyout stunnel.pem

The problem is there's no such file as openssl.exe.
I downloaded openssl-0.9.8l.tar.gz, extracted it, and searched for openssl.exe, but it's not there.

Can I bypass using openssl to make my key and just use the website to create one for me? I was told that would work just as well.
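On the openssl.exe question: the source tarball (openssl-0.9.8l.tar.gz) has to be compiled first; openssl.exe only exists in prebuilt Win32 binary packages. With any openssl binary on the PATH, a self-signed cert and key in one PEM can be generated non-interactively like this (the -subj value is a placeholder; swap in your hostname):

```shell
# one-shot self-signed certificate + key in a single PEM,
# valid ~10 years, no passphrase on the key (-nodes)
openssl req -new -x509 -days 3650 -nodes -subj "/CN=xxx.server.com" \
  -out stunnel.pem -keyout stunnel.pem
```

A PEM generated by the stunnel.org web page should work equally well; stunnel only needs cert= and key= to point at a valid PEM.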



7
Bug reports / constant "new version available"
« on: January 29, 2010, 04:02:07 AM »
Every time it checks, it says there's a new version, but I'm already running beta 252.
I update it anyway...
and the same thing happens again the next time it checks. It's like it's in some kind of loop.

8
HFS ~ HTTP File Server / how to hide a file in HFS
« on: January 10, 2010, 04:47:39 AM »
For some reason, Windows always makes a Thumbs.db file when I view movies or pictures,
and I've not figured out how to prevent this yet.

I did a search and found this:
1.  In the HFS VFS gui right click on the shared folder
2.  Select 'Properties' then 'File Masks'
3.  In the 'Files Filter' column add the file extensions you want to share
    example - *.jpg  (if you have more than one, separate the extensions with a ;)
    example - *.jpg;*.avi
4.  Click on Apply

but it does not work for me.
In the File Masks tab, I see "Files filter",
so I put thumbs.db in this field and apply.

But now, when I browse into the folder via a web browser, I get odd behavior: when I try to list a folder that contains Thumbs.db, it asks me to save the file.

The expected behavior is for it not to be displayed.

Am I missing something? Did something change in the beta builds?
I tried putting *.db in other fields, but no luck.

It seems putting *.db in any field either makes the folder unbrowsable or asks to save the Thumbs.db file.

Any ideas?

Thanks





9
Today I added a new folder (a real folder),
but when I try to browse to it, I get: Access Denied
This resource is not accessible, sorry.

I'm not sure why it won't work; it works for the other folders I've set up.

Any ideas?


This error comes up whether I use localhost or the full internet address;
via localhost it lets me go one folder deep, then the error appears.

I can verify the path is valid, as I can access it via Windows Explorer.

I'm using the latest beta build.

10
Bug reports / why are there multiple downloads of the same file from the same IP?
« on: December 06, 2009, 08:58:42 PM »


Can someone help me understand how to fix this, or why it happens?
I see one user who is downloading the exact same file 5 times.
Shouldn't it appear as 1 user per file? I don't understand why for some users it spams multiple downloads for the exact same IP and file, yet for others it displays only one download.
If you look closely, it shows the user downloading one file 5 times, all at different progress amounts.
Is this caused by the download accelerator being turned off? Because I tried with it on and off and got the same result.


Here is a thread about my problems with limits:
http://www.rejetto.com/forum/index.php/topic,7806.msg1047386.html#msg1047386


So I've removed the limits, and HFS works great when there are no limits in place.

My goal is to allow 10 people to download at any given time, and only 1 download PER IP, but when I try this it causes HFS to load the site garbled.

Using beta #250 on Win2k8.
I tried setting limits, like 1 per IP, but when I do this it breaks HFS. I've tried several combinations of the limits.
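For reference, the goal described (10 simultaneous downloads overall, 1 per IP) maps onto these hfs.ini keys, the same ones that appear set to 0 (disabled) in the full ini earlier on this page:

```
; 10 simultaneous downloads server-wide, at most 1 per IP
max-contemporary-dls=10
max-contemporary-dls-ip=1
```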

below are my debug log settings:
easy=no
# default: yes

port=2000
# default:

files-box-ratio=0.206489675516224
# default: 0

log-file-name=hfs_log_month-%m%.log
# default:

log-date=yes
# default: no

log-bytes-sent=yes
# default: no

log-apache-format=%h %l %u %t "%r" %>s %b "%{Referer}i %z"
# default:

tpl-file=D:\hfs\RAWR-Template-0.1.2\RAWR-Template-0.1.2.tpl
# default:

auto-save-vfs=yes
# default: no

tray-icon-for-each-download=no
# default: yes

last-dialog-folder=\\thecus\raid5\FTP\iPhone
# default: D:\hfs

auto-save-vfs-every=600
# default: 0

last-update-check=40153.3965076389
# default: 0

tray-shows=connections
# default: downloads

connections-timeout=20
# default: 60

last-file-open=D:\hfs\iphonoe.vfs
# default:

find-external-on-startup=yes
# default: no

encode-spaces=no
# default: yes

icon-masks=.ipa|18||
# default:

recent-files=D:\hfs\iphonoe.vfs
# default:

trusted-files=D:\hfs\iphonoe.vfs
# default:

compressed-browsing=no
# default: yes

out-total=73339793255
# default: 0

in-total=37295112
# default: 0

hits-total=90955
# default: 0

downloads-total=1759
# default: 0

connections-columns=IP address;120|File;180|Progress;84|Status;180|Speed;60|Time left;55|
# default: IP address;120|File;180|Status;180|Speed;60|Time left;55|Progress;70|

ip-services=http://www.dovedove.it/hfs/ip.php|;http://www.checkip.org|color=green>;http://www.whatismyip.org/|;http://www.melauto.it/public/rejetto/ip.php|;http://checkip.dyndns.org|:;http://2ip.ru|#0033FF">;http://www.canyouseeme.org|td><b>
# default:

ip-services-time=40151.6628440625
# default: 0


11
Bug reports / unable to stream movie files with 250
« on: December 01, 2009, 06:15:48 PM »
I've noticed that while using beta build 250,
if I share a folder that contains media files (.avi, .wmv, .mpg, .mkv), rather than streaming the movie so you can watch it on the spot, it forces you to download it first.

I run IIS7, and when I click on a file there, it simply opens a media player and streams the movie on the spot, but HFS won't do this.

I also notice that with a custom template there is a preview pane; I click on it, but it sits there forever with the spinning wheel of death. That may be related to the RAWR template, though.

Has anyone had luck watching movies from HFS? Will it stream, or does it download first?

12
HFS ~ HTTP File Server / analog won't parse HFS apache style log
« on: December 01, 2009, 05:37:08 PM »
Is there anyone here who is good with analog? http://www.analog.cx/
I use it a lot to parse my IIS logs, but I'm unable to get analog to parse HFS logs.
I read this: http://www.analog.cx/docs/logfmt.html
and applied the proper log format, but no dice.

I set HFS to log in Apache-style format: %h %l %u %t "%r" %>s %b "%{Referer}i %z"
logfile:
76.xxx.xx.229 - - [01/Dec/2009:00:04:00 -0600] "GET /iPhone/App-Store/Cydia%20downloads/lockinfo.jpg HTTP/1.1" 200 21520 -http://www.xxxxxxxx.com/showthread.php?t=6370&page=2 -

But when I run analog, it parses nothing;
however, when I run analog on my IIS logs, it works perfectly.
#Software: Microsoft Internet Information Services 7.0
#Version: 1.0
#Date: 2009-12-01 01:09:08
#Fields: date time cs-method cs-uri-stem cs-uri-query cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status sc-bytes cs-bytes time-taken
2009-12-01 01:09:08 GET /movies - - 76.xx.xx.240 Mozilla/5.0+(PLAYSTATION+3;+1.00) 301 0 0 366 224 561


my analog config info for HFS:
LOGFILE C:\hfs\hfs_log_month-12.log
OUTFILE Report.html

and lastly my analog config for IIS:
LOGFILE C:\inetpub\logs\LogFiles\W3SVC2\u_ex0912.log

Other than the file location, everything else is the same, but analog simply won't parse the HFS log.
What do I need to change or do to make it work?
If someone could point me in a good direction, I'd be grateful.

I tried adding this to the config file:
APACHELOGFORMAT (%h %l %u %t "%r" %>s %b "%{Referer}i %z")
LOGFILE C:\hfs\hfs_log_month-12.log
OUTFILE Report.html
but no luck.
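One thing worth checking before blaming analog: whether the log lines actually match the declared format. The format string ends with a quoted "%{Referer}i %z", but the sample line above shows the referer without surrounding quotes (21520 -http://... -), and analog drops lines that don't match the format exactly. A quick sanity check, with the regex being my own rough translation of that Apache format string, not anything from analog:

```python
import re

# rough regex for: %h %l %u %t "%r" %>s %b "%{Referer}i %z"
# (the trailing quoted group is where the sample HFS line seems to diverge)
COMBINED = re.compile(
    r'^(\S+) (\S+) (\S+) \[([^\]]+)\] "([^"]*)" (\d{3}) (\S+) "([^"]*)"'
)

sample = ('76.xxx.xx.229 - - [01/Dec/2009:00:04:00 -0600] '
          '"GET /iPhone/App-Store/Cydia%20downloads/lockinfo.jpg HTTP/1.1" '
          '200 21520 -http://www.xxxxxxxx.com/showthread.php?t=6370&page=2 -')

print(bool(COMBINED.match(sample)))  # False: no quotes around the referer field
```

If the check fails on real lines like it does here, the fix would be on the logging side (or a preprocessing step that re-quotes the referer), not in the analog config.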

13
Bug reports / slow listing of files in folder when more than 100 files
« on: November 28, 2009, 04:08:43 PM »
Using beta 249.
Please forgive me if this has been mentioned before; I did a search but found nothing.

I have a folder with, say, 1000 files. When I browse into this folder, HFS just sits there like it has hung, but if you wait, say 30 to 60 seconds, it will finally display the list.
I guess this is because there are too many files to list.

When I use IIS7 and go into the same folder, it lists everything instantly; there is no lag.

One simple workaround:
create subfolders a-e, f-m, n-s, t-z
and place the files into those folders, the idea being to lessen the burden on HFS of listing so many files at one time.

The thing is, I have dozens of folders like this.

Is this a known issue? Because I could not find a known-issues list for HFS.

Or is there something in the settings I could change so it lists files quicker?
Even with 100 files in one folder, it takes a good 15 seconds to display.
Perhaps a work-in-progress indicator would help: what I'm seeing is that people lose their patience and leave the website thinking it has crashed, when in reality HFS is still building the file list to display.

Any ideas on how to speed it up?

Thanks again.

14
HTML & templates / the custom template loads garbled and incorrectly
« on: November 24, 2009, 02:12:35 AM »
I think this has been mentioned before, but I could not find a resolution for this custom-templates problem.

For me, when I use those fancy RAWR and ThunderChicken templates, they start fine, but after browsing around for a while I notice that icons and images won't load, as if the .css file did not run; everything is wacky on screen and it's all white!

Please help if you can; my search turned up little for my odd situation.

I'll install RAWR, or ThunderChicken, or any of those fancy templates.

They are very nice, and I love the search function on them, which is the main reason I want to use them, but:
sometimes the page won't load properly and graphics don't load;
I hit refresh and sometimes it loads completely;
I hit refresh again and only some of the page loads.

I tested this from multiple computers and multiple IP addresses on different ISPs,
and the problem persists.

A simple fix: revert to a simple template, like HFS Damn, Beta Black, or the default template.

Whom do I pursue for a fix on this: the developer of the template (RAWR) or HFS?

Is it HFS? I don't think so, because if I use the default template I never see the refresh problem or improper page loads.

My guess is that the fancy custom templates by RAWR are not completely compatible with beta 249,

and you can't use a non-beta, as those templates require beta builds.

If someone could point me in the right direction, it would be helpful, as I'd really like to get this RAWR template working; in fact, any template with a good search function in it would suffice.

I'm not very good at working with .css and HTML, so I'm hoping it's just a matter of the templates needing updating for the latest beta.

I notice problems even while no one is connected. I've got wide-open bandwidth, so I know it can't be ISP or DNS issues; other websites refresh perfectly. I'm sure it's the template that needs correcting, in my opinion.

I did a search and found two people with the same issues.

-----
beta build 249
Win2k8 quad-core server
running IIS7
limits:
max simultaneous downloads: 10
max connections: disabled

15
HFS ~ HTTP File Server / [request] download logs in W3C format
« on: November 21, 2009, 11:38:44 PM »
I'm a heavy user of IIS7.
One feature of IIS7 I find helpful is the logs it generates from users hitting my website;
I then run a freeware app called analog, which helps me determine how much traffic there was and how much has been downloaded. It's quite informative to parse the logs so one can see what's really going on.

Here is a small cut-and-paste of my IIS7 logs to give you an idea:

#Software: Microsoft Internet Information Services 7.0
#Version: 1.0
#Date: 2009-11-17 23:33:06
#Fields: date time cs-method cs-uri-stem cs-uri-query cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status sc-bytes cs-bytes time-taken
2009-11-17 23:33:06 GET /iphone/Customization+images+&+files/Winterboard/Themes/Un-UnLockable.theme/Folders/SpringBoard.app/ru.lproj/ - - 66.249.68.144 Mozilla/5.0+(compatible;+Googlebot/2.1;++http://www.google.com/bot.html) 200 0 0 845 396 1092
2009-11-17 23:37:01 GET /iphone/xxxxxxx/An+Origami+Crane+Learning+1.1-Cuibap.ipa - - 118.100.93.197 Mozilla/4.0+(compatible;+MSIE+6.0;+Windows+NT+5.1;+SV1;+.NET+CLR+1.1.4322;+.NET+CLR+2.0.50727) 404 0 2 1382 342 358
2009-11-17 23:37:02 GET /robots.txt - - 193.47.80.38 Mozilla/5.0+(compatible;+Exabot/3.0;++http://www.exabot.com/go/robot) 404 0 2 1401 261 280
2009-11-17 23:37:02 GET /iphone/xxxxxxxxx/An+Origami+Crane+Learning+1.1-Cuibap.ipa - - 118.100.93.197 Mozilla/4.0+(compatible;+MSIE+6.0;+Windows+NT+5.1;+SV1;+.NET+CLR+1.1.4322;+.NET+CLR+2.0.50727) 404 0 2 1401 331 702


The IIS log captures all the data you need, which can then be parsed by a third-party app.

It would be helpful if you could add this feature, and perhaps put it in W3C format, so we could use already-established apps to parse the data.

Currently it's like driving blind with HFS: there seems to be no log, nor any way to capture a history of bandwidth and traffic, which is crucial when hosting files on the net. At least for me it is, as it lets me plan ahead; this way I know how much bandwidth to allocate.

Thanks again.
