Could you also possibly have some processing handled by the clients too? Say the server has 4GB of RAM, and part of the processing is given to the client - maybe 1GB of RAM - and the server and client communicate to share the info they process?
However, if this is possible (I have no idea if it is), it would require a lot of internets.
We barely even have shared threads in the SAME computer, let alone NETWORKED threading in the consumer market.
Wait a few dozen years.
...90% of the clients wouldn't have the kind of bandwidth available for that anyway.
Well, the only problem with this is that if you offload things from the server to the clients, you lose security.
It isn't inconceivable to have a sort of Peer-to-Peer network for this purpose. It'd just be damn hard to do, and it'd be significantly less secure.
"Damn hard to do" is right. Even if we assume that nobody has a firewall, you've still got to assume that there's something the server can safely offload
and not need back any time in the conceivable future. Remember, you've got latency to contend with - anything the server sends 'away for handling' is effectively lost for anywhere up to half a second (or more, for laggy situations).
If you've still got a centralized server, nothing of any importance could be dealt with this way. You might be able to offload connection requests... maybe some chunk requests, depending on how they're handled... but most features would be restricted - you've got two sources of latency instead of one, and having to wait 1-2 seconds every time you placed a block for the server to recognize it would be game-breaking.

If you've got a totally networked and decentralized server, we're talking about something that's never, AFAIK, been successfully attempted at any reasonable scale, and that would probably require a substantial amount of original research to successfully design and implement. (There's also the security concern you've already mentioned - any software on a user's computer can be hacked, period. There's a reason hackers keep circumventing DRM 'solutions' - if the code is on their machine, they can eventually break anything. If SecuROM, with dozens of security-focused engineers, can't stop hackers from manipulating their code, there's no way Notch will have a chance.)
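The latency argument above is easy to put in numbers. Here's a back-of-the-envelope sketch - the round-trip times are made-up assumptions for illustration, not measurements - showing why every offloaded task pays for two network hops instead of one:

```python
# Rough latency sketch: offloading work to a peer adds a second round trip.
# Both timing constants are assumptions, not measurements.

DIRECT_RTT_MS = 100   # assumed client <-> server round trip
PEER_RTT_MS = 150     # assumed server <-> peer round trip (consumer uplinks are slower)

def direct_latency():
    """Client asks the server; the server handles it itself."""
    return DIRECT_RTT_MS

def offloaded_latency():
    """Client asks the server; the server forwards to a peer and relays the answer."""
    return DIRECT_RTT_MS + PEER_RTT_MS

print(direct_latency())     # 100 ms
print(offloaded_latency())  # 250 ms - every offloaded action pays both hops
```

Under these (optimistic) assumptions, offloading more than doubles the wait on every affected action, which is why block placement handled this way would feel broken.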
Never mind that, assuming you have a 32-bit OS with access to ~3GB of RAM (4GB, minus what Windows reserves for address space), you should be able to support roughly 60 players at a time. (Since players being spread out is a very real possibility, we'll assume the worst case.) If you load the server on a 64-bit machine, you could potentially host thousands of players, depending on just how much RAM you've got - you'll have the potential for those massive, semi-MMO servers some people are demanding. So, really, there's no need for such a complicated system in the first place.
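The capacity estimate above works out like this - the per-player memory figure is just derived from the post's own numbers (~3GB supporting ~60 players), so treat it as an assumption, not a benchmark:

```python
# Back-of-the-envelope player capacity from available RAM.
# The ~51 MB-per-player figure is derived from the claim that ~3 GB
# supports ~60 players in the worst case; it is an assumption.

USABLE_RAM_32BIT_MB = 3 * 1024                   # ~3 GB addressable on 32-bit Windows
RAM_PER_PLAYER_MB = USABLE_RAM_32BIT_MB // 60    # ~51 MB, worst case (players spread out)

def max_players(total_ram_gb):
    """Roughly how many players a server with this much RAM could support."""
    return (total_ram_gb * 1024) // RAM_PER_PLAYER_MB

print(max_players(3))    # 60 - the 32-bit case from the post
print(max_players(64))   # a 64-bit box with 64 GB: over a thousand players
```

Same arithmetic, scaled up: a single beefy 64-bit machine already covers the "semi-MMO" scenario without any distributed trickery.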
So yeah, I really don't think Notch is going to consider a heavily-distributed networked system.