Let's say you have a large game. The old-school rpm/deb process would be: download the repository list, get the package file, resolve its dependencies, download all the files (network heavy), extract all the files (disk heavy), and install all the files (CPU heavy). A next-gen approach would be: make a remote call to determine what is needed for the given package (yes, servers can and should be smarter than a dumb HTTP file server), then stream it, extracting on the fly and installing in parallel (I don't mean installing multiple packages at once, but having the download/extract/install phases overlap), with the ability to resume if the connection is lost. All the while it could skip fetching parts it already has (say from a previous install, rsync-style) or doesn't need, to minimize download size. Such a process would use less CPU, less network bandwidth, and be faster than what is possible with yum/rpm (or apt/deb, etc.). Yes, bold and smelling slightly of a brand-new wheel, but it would really be nice.
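To make the "phases overlapping" part concrete, here's a minimal Python sketch of the idea: three threads connected by bounded queues, so a chunk can be getting extracted and installed while the next one is still downloading, and chunks we already hold locally are skipped rsync-style. Everything here is made up for illustration, including the manifest format, the chunk hashes, and the use of zlib; it's a sketch of the pipelining, not a real package manager.

```python
# Pipelined download -> extract -> install, with already-present chunks skipped.
# All names and data here are hypothetical stand-ins for a real package server/client.
import hashlib
import queue
import threading
import zlib

SENTINEL = object()

# Pretend one smart remote call returned this manifest: the chunks the
# package consists of, plus their hashes (so the client can skip known ones).
MANIFEST = [
    {"id": i, "sha256": None, "payload": zlib.compress((b"data-%d" % i) * 1000)}
    for i in range(5)
]
for entry in MANIFEST:
    entry["sha256"] = hashlib.sha256(entry["payload"]).hexdigest()

# Chunks we already have on disk from a previous install (rsync-style skip).
already_have = {MANIFEST[0]["sha256"], MANIFEST[3]["sha256"]}


def download(out_q):
    """Stream chunks from the 'server', skipping ones we already have."""
    for entry in MANIFEST:
        if entry["sha256"] in already_have:
            print(f"chunk {entry['id']}: already present, skipping download")
            continue
        # A real client would do a resumable HTTP range request here.
        out_q.put(entry)
    out_q.put(SENTINEL)


def extract(in_q, out_q):
    """Decompress chunks as they arrive, without waiting for the full download."""
    while (entry := in_q.get()) is not SENTINEL:
        data = zlib.decompress(entry["payload"])
        out_q.put((entry["id"], data))
    out_q.put(SENTINEL)


def install(in_q):
    """'Install' each chunk as soon as it is extracted (here: just report its size)."""
    while (item := in_q.get()) is not SENTINEL:
        chunk_id, data = item
        print(f"chunk {chunk_id}: installed {len(data)} bytes")


# Bounded queues give backpressure, so no phase races far ahead of the others.
dl_q, ex_q = queue.Queue(maxsize=2), queue.Queue(maxsize=2)
threads = [
    threading.Thread(target=download, args=(dl_q,)),
    threading.Thread(target=extract, args=(dl_q, ex_q)),
    threading.Thread(target=install, args=(ex_q,)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The point isn't the threading details; it's that the client never has to hold the whole package on disk before it starts installing, and the server-provided manifest is what lets it skip or resume at chunk granularity instead of re-downloading the whole thing.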