rejetto forum

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - rxr

Pages: 1
1
HFS ~ HTTP File Server / Re: upload with filename inside URL
« on: April 07, 2009, 01:35:55 PM »
thanks for sharing rxr!
on my side, your post made me discover there's a CURL binary for Windows. may be very useful for the future.

Hey! I'm glad the info proved to be of value.

Indeed there is a binary for Windows as well as Linux (I've tested both and they seem to work just as expected --- note that the command "curl --manual" echoes a large formatted text file to the console explaining the many curl options).  Here are a few extra bits of info I've learned on the subject...

1) The curl command shown below, without a trailing "/" after the <HFSfolder>, works better in that it returns the "301" code to the console when everything works properly (with a trailing "/" you get an echo of the HFS HTML upload page instead; the upload still works fine, but having the return code available for error checking in a script is probably better).  You probably know better than I why there is this slight difference between having the "/" at the end and not having it.

    curl  -F fileupload1=@MyFile.iso  -F press="Upload files"  http://<HFSipaddress>/<HFSfolder>
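
For anyone scripting this, curl's -w option can print just the status code, which makes the error checking mentioned above easy. Here's a sketch (untested beyond my own setup; <HFSipaddress> and <HFSfolder> are placeholders, and the 301 check is just what I observed on my server):

```shell
#!/bin/sh
# Sketch: upload to HFS and check the HTTP status code via curl's -w option.
# <HFSipaddress>/<HFSfolder> are placeholders for your own server and folder.

upload_ok() {
  # HFS answered my uploads with a 301 redirect when everything went well
  [ "$1" = "301" ]
}

status=$(curl -s -o /dev/null -w '%{http_code}' \
  -F fileupload1=@MyFile.iso -F press="Upload files" \
  "http://<HFSipaddress>/<HFSfolder>")

if upload_ok "$status"; then
  echo "upload OK"
else
  echo "upload failed (HTTP $status)" >&2
fi
```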

2) I've tested curl's ability to take input from stdin via a command, for example, like this:

    cat testfile.gz  |  curl -F "fileupload1=@-;filename=atestfile.gz" -F press="Upload files" http://<HFSipaddress>/<HFSfolder>

and got it to work A-OK.  The result was the file being uploaded to HFSfolder under the name "atestfile.gz", as expected.  (Note that I'm working from memory, so the above may not be the exact syntax I used, but I did get stdin uploads with the filename option naming the file on the HFS server to work --- again, a useful bit of knowledge for those wanting to build cmd scripts to upload files, as the original poster was asking about.)
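
One nice use of the stdin form is streaming data into the upload without ever writing a temporary file. A sketch of what that could look like (the directory name, archive name, and server address are all placeholders; the -F argument must be quoted or the shell will split it at the ";"):

```shell
# Pack a directory and upload the archive in one pipeline -- no temp file on disk.
# The -F argument is quoted so the shell does not split it at the ';'.
tar czf - ./mydata | curl -F "fileupload1=@-;filename=mydata.tar.gz" \
  -F press="Upload files" "http://<HFSipaddress>/<HFSfolder>"
```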

3) The last bit of testing I did with curl was with the "-G" option, testing whether it can be used like "wget" to download from an HFS server --- and whether the curl result can be piped to other programs that accept stdin (like gunzip).  It works A-OK with the HFS server, and indeed the download can be piped.  For example, I downloaded a gzip file, piped it to gunzip, and got the uncompressed output just as expected.
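
That download-and-pipe test looked roughly like this (file name, server address, and folder are placeholders; I'm reconstructing from memory):

```shell
# Fetch a gzip file from the HFS server and decompress it on the fly.
curl -s "http://<HFSipaddress>/<HFSfolder>/testfile.gz" | gunzip > testfile
```

The same pattern works with any stdin-accepting program in place of gunzip.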

I'm sure there are more tricks that can be done by those that are more expert in all this than me.

HFS server is a very good program.   You should be proud of your work. 

Thanks.








2
HFS ~ HTTP File Server / Re: upload with filename inside URL
« on: April 03, 2009, 07:25:37 PM »
Sure was a fight to get this posted!
Hope it has not double (or triple) posted!
 
Poster "yu le" asked a good question and deserves an answer. I'm not an expert in this area but here is what I found....
   
I suggest you read up on a program called "curl" at this web site:  http://curl.haxx.se/
And here is a link on this forum on the subject:  http://www.rejetto.com/forum/index.php?topic=6463.0
I've tried what was suggested and it worked for me (the HFS folder was set to allow any users to upload, no login):
 
             curl  -F fileupload1=@MyFile.iso  -F press="Upload files"  http://<HFSipaddress>/<HFSfolder>/
 
Now here is a brief cut and paste from the curl manual (with some examples) to provide more insight:
 
 curl is a tool to transfer data from or to a server, using one of the supported protocols (HTTP, HTTPS, FTP, FTPS, SCP, SFTP, TFTP, DICT, TELNET, LDAP or FILE). The command is designed to work without user interaction.
 
Curl Option:
 
-F/--form <name=content>
 
(HTTP) This lets curl emulate a filled-in form in which a user has pressed the submit button. This causes curl to POST data using the Content-Type multipart/form-data according to RFC1867. This enables uploading of binary files etc. To force the 'content' part to be a file, prefix the file name with an @ sign. To just get the content part from a file, prefix the file name with the symbol <. The difference between @ and < is then that @ makes a file get attached in the post as a file upload, while the < makes a text field and just get the contents for that text field from a file.
 
Example, to send your password file to the server, where 'password' is the name of the form-field to which /etc/passwd will be the input:
                  curl -F password=@/etc/passwd www.mypasswords.com
 
To read the file's content from stdin instead of a file, use  -  where the file name should've been. This goes for both @ and < constructs.
 
You can also tell curl what Content-Type to use by using 'type=', in a manner similar to:
 
                  curl -F "web=@index.html;type=text/html" url.com
--or--
                  curl -F "name=daniel;type=text/foo" url.com
 
You can also explicitly change the name field of a file upload part by setting filename=, like this:
 
                  curl -F "file=@localfile;filename=nameinpost" url.com
 
See further examples and details in the Curl MANUAL.
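
To make the @ versus < difference from the manual concrete, here is a little sketch of my own (the host, form path, field names, and file name are all made up for illustration):

```shell
# '@' attaches the file as a file-upload part of the multipart POST;
# '<' sends the file's CONTENTS as the value of an ordinary text field.
# Host, path, and field names below are made-up placeholders.
echo "some notes" > notes.txt
curl -F "attachment=@notes.txt" http://example.com/form   # file upload part
curl -F "comment=<notes.txt"    http://example.com/form   # text field filled from file
```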

                                                       -----------------

Ok, now I'd also like to point out a few other things on how to do this (based on what I found via Google)...
 
It seems it is possible to construct HTTP POST request via Java, VBA, etc... or even plain old text files.
You can do more Googling to find out more about the HTTP protocol and the POST method (along with GET and the other methods defined in the HTTP protocol syntax).  The fact is, you can just telnet to a port an HTTP server is listening on and feed it commands in the correct format, and it will work just like a browser does.  Here are some links of interest I found via Googling...

Basic Http Protocol explained:       http://en.kioskea.net/contents/internet/http.php3

More on the HTTP protocol, with some good examples of just "cat"-ing a text file of HTTP syntax through a pipe into a telnet session, can be found at this link:    http://www.foureleven.org/art/art_netcat.html

(Note: Those interested might want to research the subject of "netcat" as that might be of use itself, but be aware that some AntiVirus software thinks it is a virus).
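
The "cat a text file of HTTP syntax into a pipe" idea from that link can be sketched like this (host and path are placeholders; requires netcat to be installed):

```shell
# Hand-written HTTP GET request fed straight to netcat -- it mimics what a
# browser sends on the wire.  <HFSipaddress>/<HFSfolder> are placeholders.
printf 'GET /<HFSfolder>/testfile.gz HTTP/1.0\r\nHost: <HFSipaddress>\r\n\r\n' \
  | nc <HFSipaddress> 80 > response.bin
# response.bin then holds the raw reply: status line, headers, and file body.
```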

Ok, that should be enough info for anybody to get a start on this!
 
By the way, I've been able to get the HFS server to work in WinPE 2.0 from the WAIK (Microsoft's free Vista-based preinstallation environment).  Combine the HFS server with curl inside a VistaPE (with a GUI desktop window manager like LiteStep) and you're on your way to a nice rescue/recovery/setup LiveCD.

Hope this helped some.





 
                                         
