rejetto forum

Queues


Offline cybrey

  • Occasional poster
  • *
    • Posts: 5
    • View Profile
Fantastic product by the way, I've been looking for something like this for a while (I even started writing my own a while back in PHP but never got around to finishing it).

I'm hoping I'm just being stupid, but does the file server support queues? By this I mean that the server might only allow 1 simultaneous download, but 10 users could request a file, and as the slot frees up the server sends the next file in the queue to the user who requested it.
« Last Edit: April 21, 2009, 05:13:30 PM by cybrey »


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
welcome!
i'm not sure i got what you are asking, but we are using HTTP here.
did you ever see a website or web service doing what you are asking?
if the answer is no, it's likely it cannot be done.

anyway, HFS supports unlimited simultaneous downloads.


Offline cybrey

  • Occasional poster
  • *
    • Posts: 5
    • View Profile
Hi, thanks for the welcome and the quick reply.

Yes, nearly all file serving websites do what I'm asking; FilePlanet and FileShack both implement a queueing system. The way they work (and the way the system I half wrote worked) is that when a file is requested it's added into a queue on the server. Periodically the client (in this case the web browser) polls the server to see if its turn is up. If not, the page refreshes displaying the new position; if it is, the server sends the file in the usual manner.

Of course (and this is the bit I never finished) the server maintains a record of when it last heard from each client, and removes a client if it hasn't been heard from in a while. This takes care of a client navigating away from the page. (Actually, the two sites I mentioned both throw up a second window to poll the server from.)
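To show what I mean, here's a minimal sketch of that bookkeeping in Python. This isn't HFS code, and names like DownloadQueue and the 30-second timeout are just made up for illustration:

```python
import time
from collections import OrderedDict

class DownloadQueue:
    """FIFO queue of waiting clients, polled periodically by each client."""

    def __init__(self, slots=1, timeout=30):
        self.slots = slots            # max simultaneous downloads
        self.timeout = timeout        # seconds before a silent client is dropped
        self.waiting = OrderedDict()  # client_id -> time of last poll, oldest first
        self.active = set()           # clients currently downloading

    def poll(self, client_id):
        """Handle one client poll: return 'download' or the 1-based queue position."""
        now = time.time()
        # forget waiters we haven't heard from (client closed the page)
        for cid, last_seen in list(self.waiting.items()):
            if now - last_seen > self.timeout:
                del self.waiting[cid]
        if client_id in self.active:
            return "download"
        self.waiting[client_id] = now  # enqueue, or just refresh the timestamp
        # hand any free slots to the longest-waiting clients
        while len(self.active) < self.slots and self.waiting:
            cid, _ = self.waiting.popitem(last=False)
            self.active.add(cid)
        if client_id in self.active:
            return "download"
        return list(self.waiting).index(client_id) + 1

    def finished(self, client_id):
        """Called when a transfer completes, freeing a slot."""
        self.active.discard(client_id)
```

Each browser would hit a poll URL every few seconds; a "download" response triggers the actual file request, and the server calls finished() when the transfer completes.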
« Last Edit: April 21, 2009, 08:52:45 PM by cybrey »


Offline Mars

  • Operator
  • Tireless poster
  • *****
    • Posts: 2059
    • View Profile
HFS can't work the way you want; on the other hand, it is possible to select several files and download them as a TAR archive.

Isn't it, rejetto? ;)


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
i'm sorry, but such a feature is not present in HFS.


Offline cybrey

  • Occasional poster
  • *
    • Posts: 5
    • View Profile
HFS can't work the way you want; on the other hand, it is possible to select several files and download them as a TAR archive.

Isn't it, rejetto? ;)
Yes, but serving 100 people's files sequentially is a lot quicker than serving 100 people's files simultaneously (provided the disk isn't too badly fragmented).

i'm sorry, but such a feature is not present in HFS.
ah ok .. thanks once again.

For background, I was intending (and may still do so) to use the software at a LAN party to serve patch files (these range from a few MB to a few GB in size). There are normally 250+ people at the LAN; obviously that volume of simultaneous requests on a file server will kill a box.
« Last Edit: April 21, 2009, 09:22:59 PM by cybrey »


Offline Kremlin

  • Tireless poster
  • ****
    • Posts: 137
    • View Profile
This is what I've experienced so far (I also serve about 200 people): when you limit the number of simultaneous downloads, the first person to send a request upon a file completion gets the spot, but if someone loses their connection to the server and meanwhile someone else sends a request, they can also steal the spot (remember, download managers send requests every so often when trying to reach the server).

Speed wise, technically seeding 100 people at the same time is faster in the sense that you get more MBs sent over time (from the server's point of view), but it slows the download down for each individual user (obviously). It comes down to this: if seeding too many people at once makes downloads take too long for users, cut back on the number of simultaneous downloads allowed. LAN wise, speeds can go pretty high, so there's no need to limit, although I find that when 15+ people are downloading at the same time the available bandwidth will probably hit 100%; in my case 8 MB/s (LAN).
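To put rough numbers on that (assuming the 8 MB/s link is the only bottleneck and the bandwidth splits evenly): with 15 simultaneous downloads each user gets about 8/15 ≈ 0.53 MB/s, so a 700 MB patch, say, takes around 22 minutes; capped at 5 downloads, each gets about 1.6 MB/s and finishes in roughly 7 minutes, but everyone else has to wait for a slot.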
« Last Edit: April 21, 2009, 10:35:07 PM by Kremlin »


Offline Foggy

  • Tireless poster
  • ****
    • Posts: 806
    • View Profile
Currently what you are asking is not possible with HFS out of the box, but I do believe such a system could be created with macros.


Offline cybrey

  • Occasional poster
  • *
    • Posts: 5
    • View Profile
This is what I've experienced so far <snip>
Cheers for the information, really useful. I think I'll have to play with the simultaneous download limit during the event to try to make the best use of the bandwidth.


Offline Kremlin

  • Tireless poster
  • ****
    • Posts: 137
    • View Profile
Glad to help. Beware that if you are sharing over a wireless network at such high speeds, even though the traffic is internal it can slow down internet speed due to the AP's capacity (intranet and internet data travel over the same links to your PC).


Offline SamePaul

  • Occasional poster
  • *
    • Posts: 72
    • View Profile
Actually it does exist in HFS. It's called "Max simultaneous downloads from single address" = 1 and "Max simultaneous addresses" = 100. This way you get almost exactly the same behavior: 100 users will each be able to download a single file at a time.


Offline rejetto

  • Administrator
  • Tireless poster
  • *****
    • Posts: 13510
    • View Profile
your suggestion is for a queue that won't respect any order.


Offline cybrey

  • Occasional poster
  • *
    • Posts: 5
    • View Profile
Actually it does exist in HFS. It's called "Max simultaneous downloads from single address" = 1 and "Max simultaneous addresses" = 100. This way you get almost exactly the same behavior: 100 users will each be able to download a single file at a time.
This isn't a queue .. it's a free-for-all. What I need is to set up the server so it sends at most 5 files simultaneously. If a user requests a file and it's already sending 5 files ... instead of saying "No you can't, please try later", it adds them to a queue. Then when one of those 5 simultaneous sends finishes, the person who has waited longest gets sent their file.
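To make it concrete, here's a minimal Python sketch of that behaviour (not anything HFS actually provides; FairSlots and the 5-slot limit are just illustrative):

```python
import threading
from collections import deque

class FairSlots:
    """At most `limit` concurrent downloads; extra requests wait in FIFO order."""

    def __init__(self, limit=5):
        self.limit = limit
        self.in_use = 0
        self.lock = threading.Lock()
        self.queue = deque()  # one Event per waiting request, oldest first

    def acquire(self):
        """Take a send slot, blocking in arrival order if all slots are busy."""
        with self.lock:
            if self.in_use < self.limit and not self.queue:
                self.in_use += 1
                return
            ticket = threading.Event()
            self.queue.append(ticket)
        ticket.wait()  # sleep until a finishing download hands us its slot

    def release(self):
        """Free a slot, passing it straight to the longest-waiting request."""
        with self.lock:
            if self.queue:
                self.queue.popleft().set()  # wake the head of the queue
            else:
                self.in_use -= 1
```

Each request-handler thread would call slots.acquire() before sending and slots.release() in a finally block, so a freed slot always goes to whoever has waited longest instead of whoever retries first (which would also sidestep the spot stealing Kremlin mentioned).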