Download Server
Haiku could include a dedicated API and an additional server whose task would be to manage and optimise simultaneous downloads at a global level.
Details
In order to optimise downloads from a Haiku system, it is proposed that all mass downloads be conducted through a dedicated download kit. Any software that wants to download large files would use the download kit API to queue the download, query its progress and perform other management tasks. Classes such as 'BFTP' would be required.
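As an illustration only, a client of such a kit might queue a download and poll its progress roughly as below. All of these names (DownloadQueue, Download) are hypothetical; the proposal itself names only 'BFTP'.

```cpp
#include <cstdint>
#include <map>
#include <string>

// Hypothetical sketch of the client side of a download kit. Every name
// here is illustrative, not part of any existing Haiku API.
enum class Status { Queued, Active, Done };

struct Download {
	std::string url;
	std::string destination;	// location on local storage
	int64_t totalBytes = 0;
	int64_t doneBytes = 0;
	Status status = Status::Queued;
};

class DownloadQueue {
public:
	// Queue a download; the returned id is how callers reference it later.
	int Queue(const std::string& url, const std::string& dest)
	{
		int id = fNextId++;
		fDownloads[id] = Download{url, dest};
		return id;
	}

	// Query progress as a fraction in [0, 1].
	double Progress(int id) const
	{
		const Download& d = fDownloads.at(id);
		if (d.totalBytes == 0)
			return 0.0;
		return double(d.doneBytes) / double(d.totalBytes);
	}

	Download& Get(int id) { return fDownloads.at(id); }

private:
	int fNextId = 1;
	std::map<int, Download> fDownloads;
};
```

In a real kit the queue would live in the server and be driven over inter-application messaging, with the client holding only the id.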
The server would be composed of a basic driver which loads multiple add-on components, described in more detail below (note that the following categories are arbitrary and used only to ease discussion). Its own tasks would be restricted to managing the host machine: taking action in the event of low disk space, prioritisation, and so on.
Protocol Add-ons
Each supported download protocol would be handled by its own add-on. Protocols include both P2P (such as BitTorrent) and direct client-server downloads (such as FTP). Each add-on would be required to implement, at minimum, the ability to:
- Download a specified object to a location on local storage
- Reference distinct downloads
- Query progress of downloads
- Assign speed limits / priorities
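A minimal version of that required interface could be sketched as follows. The names and signatures are entirely hypothetical, since the proposal defines none; a tiny in-memory stub is included only to show the shape of a conforming add-on.

```cpp
#include <cstdint>
#include <map>
#include <string>

// Hypothetical minimum interface for a protocol add-on (FTP, BitTorrent,
// ...). Every name and signature here is illustrative.
class ProtocolAddOn {
public:
	virtual ~ProtocolAddOn() {}

	// Download a specified object to a location on local storage;
	// the returned id references this distinct download.
	virtual int32_t Start(const std::string& url,
		const std::string& localPath) = 0;

	// Query progress of a download (bytes transferred so far).
	virtual int64_t BytesTransferred(int32_t id) const = 0;

	// Assign a speed limit (bytes/second, 0 = unlimited) and a priority.
	virtual void SetSpeedLimit(int32_t id, int64_t bytesPerSec) = 0;
	virtual void SetPriority(int32_t id, int priority) = 0;
};

// Stub implementation, only to show what a conforming add-on provides.
class StubAddOn : public ProtocolAddOn {
public:
	int32_t Start(const std::string& url, const std::string& localPath)
	{
		fBytes[fNextId] = 0;
		return fNextId++;
	}
	int64_t BytesTransferred(int32_t id) const { return fBytes.at(id); }
	void SetSpeedLimit(int32_t id, int64_t bytesPerSec) {}
	void SetPriority(int32_t id, int priority) {}

private:
	int32_t fNextId = 1;
	std::map<int32_t, int64_t> fBytes;
};
```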
Centralised management of downloads could thus benefit the user in the following ways:
- More accurate 'time remaining' estimates
- Efficient management of limited bandwidth, so that urgent files are retrieved first
Management Add-ons
Additional add-ons could be used to automatically ration bandwidth according to particular needs, or to perform other such housekeeping tasks. A theoretical example is a remote-identification algorithm that automatically prioritises friends' downloads above those from the general population.
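One very simple rationing policy such an add-on could apply — a hypothetical sketch, not anything the proposal specifies — is to share the total available bandwidth among active downloads in proportion to their priorities:

```cpp
#include <vector>

// Hypothetical policy for a management add-on: split total bandwidth
// among downloads proportionally to their (non-negative) priorities.
std::vector<double> RationBandwidth(double totalBytesPerSec,
	const std::vector<int>& priorities)
{
	int sum = 0;
	for (int p : priorities)
		sum += p;

	std::vector<double> shares;
	for (int p : priorities) {
		if (sum == 0)
			shares.push_back(0.0);	// all downloads idle
		else
			shares.push_back(totalBytesPerSec * p / sum);
	}
	return shares;
}
```

A friend-prioritising add-on would then simply raise the priority value of downloads identified as coming from friends before this split is computed.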
Issues
- Is it acceptable to utilise resources for a dedicated server? Could it be implemented as load-on-demand?
- Would it be better implemented as a library set, rather than a server? Is there enough demand to manage this many downloads at once?

Comments
Re: Download Server
I am planning to implement something like this to go along with my Haiku web browser. I do not know if it will get included in the core Haiku, though I'm sure it would be included on most "distros" or whatever we will call them.
If the original author of this RFC has any more input, let me know.
Of course if anyone else has input as well, please post :)
I have also been in discussion with "anarchos" in the #haiku IRC channel. He is going to write up some of the features he would be interested in, though we all seem to be on one page in regards to the core features (download_server to handle all downloads, add-ons for protocols, BitTorrent support, bandwidth throttling, etc.)
Re: Download Server
How does the browser/client pass on authentication information to the download server? The HTTP download would need both HTTP digest auth information and the cookie.
Would there be any problems with SSL or other types of encrypted connections? I could imagine that some protocols would require passing on the private/public keys that are used for the connection to the server.
What about servers that limit the number of connections from the same host? E.g., downloads via FTP might have to reuse the same socket (imagine you start the download from your browser).
Is this worth all the trouble and increased code complexity?
Re: Download Server
I don't think it would be a problem to pass that information if needed.
I don't see why the download_server's protocol handlers couldn't renegotiate the secure connection. I don't think it is necessary to pass the information from the browser.
You would never start a download in one application and finish it in another.
There are a lot of download managers on Windows and Linux that somehow seem to function despite all the above perceived problems.
I don't believe it will be any more complex than having the same sort of code in the browser. In case you don't know, WebKit does not really provide much of an infrastructure for handling downloads, so it will need to be written either way. I think having a global download manager will provide several benefits, such as downloads that are independent of the browser (so you can restart or endure a crash of the browser with no effect on downloads) and also a global bandwidth throttle, so you don't have to play hide-and-go-seek trying to find what is eating your bandwidth.
But if it turns out to be unduly complicated I have no problem just putting it in the browser if I have to.
Thanks for the input.
Re: Download Server
Wow, it's amazing!
What's the general status of WebKit, Ryan?
regards
Michael
Re: Download Server
There is still quite a bit of work required before we have a full-fledged browser. But it is much closer than it has ever been in the past.
I need to update the Haiku WebKit port with the latest code from the main WebKit project, and then I need to get the code for my port put in their repo.
I will then start working on a browser shell, referencing and maybe using some of the code from previous BeOS browser projects (Themis, NetOptimist, etc.). As I do this I will improve aspects of our WebKit port that are lacking.
I have used all the major browsers quite a bit (Firefox, Safari, Opera, IE a little) and plan to include the useful features they have without getting too bloated. Plus I don't want to be working on the browser forever.
Sometime while working on the above I will start experimenting with the download_server idea too.