My new work situation brought up an old debate with a good friend (perhaps a good debate with an old friend? Works either way, I suppose... but I digress): the future topology of data and computing models on the network.
Or to put it another way: where do the leaf nodes connect to the edge of the network? Locally, in the home, as a gateway for experience (or CPE, customer premises equipment, in my new lingua franca), or remotely, that is, "directly" to remote applications and data stores?
This was/is partly a "client-side computing" debate: where and how are performance, security, and storage best optimized?
But the observation at the end of it was this: "the world only needs six servers" arguments are currently in vogue (with consumers, who vote with their time) because, well, IT management sucks. To wit, allow me to posit: it is easier (i.e., better) for most users to use remote applications with remote data, because it pushes the information-management pain onto professionals.
In descending order of "pain in the ass to maintain": Windows, Mac, cell phone... not coincidentally, that is also a measure of how closed each software and hardware ecosystem is in practice. Game consoles are particularly interesting in this regard (I rate them as easier than cell phones, even), since everything but the VERY top layers of the stack is single-sourced. Sounds suspiciously like the RIA platform arguments, no?
(And all the User Account Control prompts in Vista, and the installation hurdles for Apollo, only argue against the edge being at the desktop for most applications...)