Haiku could include a dedicated API and an additional server whose task would be to manage and optimise simultaneous downloads on a global level.
In order to optimise downloads from a Haiku system, it is proposed that all mass downloads be conducted through a dedicated download kit. Any software that wants to download large files would use the download kit API to queue the download, query its progress and perform other management tasks. Classes such as 'BFTP' would be required.
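To make the shape of such an API concrete, here is a minimal sketch of what a client-side download class might look like. None of these classes exist yet; `BDownload`, its method names and the `fMember` fields are all assumptions standing in for proposed protocol classes like 'BFTP'.

```cpp
#include <cstdint>
#include <string>

// Hypothetical sketch of a download kit class; names are assumptions.
// A protocol class such as BFTP would offer an interface along these lines.
class BDownload {
public:
	BDownload(const std::string& url, const std::string& target)
		: fURL(url), fTarget(target), fBytesDone(0), fBytesTotal(0) {}

	// Would hand the job over to the download server's queue.
	void Queue() {}

	// Per-download speed limit, one of the proposed management tasks.
	void SetSpeedLimit(uint32_t bytesPerSec) { fLimit = bytesPerSec; }

	// Progress query: fraction complete in the range 0.0 .. 1.0.
	float Progress() const
	{
		if (fBytesTotal == 0)
			return 0.0f;
		return float(fBytesDone) / float(fBytesTotal);
	}

private:
	std::string fURL;
	std::string fTarget;
	uint64_t fBytesDone;
	uint64_t fBytesTotal;
	uint32_t fLimit = 0;
};
```

A caller would construct a `BDownload`, optionally set a speed limit, then call `Queue()` and poll `Progress()` while the server does the actual transfer.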
The server would be composed of a basic driver which loads an add-on component for each supported download protocol, covering both P2P protocols (such as BitTorrent) and direct client-server downloads (such as FTP). The server's own tasks would be restricted to managing the host machine: taking action in the event of low disk space, prioritisation, etc. (Note that the categories used here are arbitrary and serve only to ease discussion.) Each add-on would be required to implement, at minimum, the ability to:
- Download a specified object to a location on local storage
- Reference distinct downloads
- Query progress of downloads
- Assign speed limits / priorities
Centralised management of downloads could benefit the user in the following ways:
- Increased accuracy of 'time remaining' estimates
- Efficient management of limited bandwidth, so that urgent files are retrieved first
Additional add-ons could automatically ration bandwidth according to particular needs, or perform other such housekeeping tasks. A theoretical example is a remote identification algorithm that automatically prioritises friends' downloads above those of the general population.
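One simple rationing policy such an add-on might apply is proportional sharing: each active download receives a slice of the total bandwidth in proportion to its priority weight. The function below is purely illustrative; the name and the policy itself are assumptions, not part of the proposal.

```cpp
#include <cstdint>
#include <map>

// Sketch of priority-weighted bandwidth rationing (illustrative only).
// Each download ID gets totalBytesPerSec * (its weight / sum of weights).
std::map<int32_t, uint32_t>
RationBandwidth(uint32_t totalBytesPerSec,
	const std::map<int32_t, uint32_t>& priorities)
{
	uint64_t sum = 0;
	for (const auto& entry : priorities)
		sum += entry.second;

	std::map<int32_t, uint32_t> limits;
	if (sum == 0)
		return limits;

	for (const auto& entry : priorities) {
		limits[entry.first] = uint32_t(
			uint64_t(totalBytesPerSec) * entry.second / sum);
	}
	return limits;
}
```

With a 400 KiB/s link and two downloads weighted 3 and 1, the first would be limited to 300 KiB/s and the second to 100 KiB/s; a friend-prioritising add-on would simply assign friends a higher weight.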
- Is it acceptable to dedicate resources to a standalone server? Could it be implemented as load-on-demand?
- Would it be better implemented as a library set rather than a server? Is there enough demand for managing this many simultaneous downloads?