Yeah, the big problem with the idea is a security one. In general, your security issues aren't the NSA or whoever searching and archiving your giant server; they're people not updating their software, or small bugs like that, allowing access on the smaller/client side of things. Consolidating your data into a single location means you can layer level after level of security on it, and if any holes are found you only need to patch them in one place (plus it makes physical access easier to keep secure as well). Having a distributed network means that where before you had only a single server to secure, you now have dozens or hundreds of machines that all have to be running the absolute latest security code, something anyone who has worked in IT will tell you is practically impossible in the vast majority of situations.
We can even draw some parallels to real-life situations. Many people might remember the recent "App Store infected apps" debacle (XcodeGhost). That one was only possible in the first place because the developers involved were circumventing Apple's control over how Xcode is distributed, by downloading it from unofficial mirrors. But looking past that point: because all of the distribution was controlled by a single party, Apple was able to quickly "patch" the problem once it was discovered by purging the infected apps from the App Store. In a worst-case scenario Apple could even have purged them from people's phones, because at the time it had that capability. The end result? Vulnerability found and covered, problem solved and done within a few weeks.

On the other hand, take a look at the very famous Heartbleed exploit announced back in early 2014. Because every single company was responsible for its own use of the vulnerable OpenSSL library (a far more "distributed" arrangement), there was no single "patch" that fixed everyone. Sure, the OpenSSL project could issue a fix, but it was still up to every company using the library to manually update to the patched version. The result? A year later, something like 74% of large companies with vulnerable code had still failed to fully remediate the problem. I have little doubt that if you went digging even today you could still find some fairly popular websites that are vulnerable; after all, updating to newer versions of software takes time and development effort that, in the eyes of the business-oriented, could be better spent elsewhere.
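To make the contrast concrete: after Heartbleed was disclosed, there was no central kill switch. Every individual operator had to run something like the check below on their own machines and then schedule their own upgrade. This is just a minimal sketch in Python, assuming only the well-known fact that OpenSSL 1.0.1 through 1.0.1f were the vulnerable releases (1.0.1g was the fix); it's illustrative, not an actual audit tool.

```python
# Sketch: the kind of per-machine check every operator had to do themselves
# after Heartbleed, because there was no single point of distribution to fix.
import ssl

# Vulnerable releases were OpenSSL 1.0.1 through 1.0.1f; 1.0.1g patched it.
VULNERABLE_PREFIXES = tuple(
    "OpenSSL 1.0.1" + suffix + " " for suffix in ("", "a", "b", "c", "d", "e", "f")
)

def looks_heartbleed_vulnerable(version_string: str) -> bool:
    # Crude string match against the library's reported version banner.
    return version_string.startswith(VULNERABLE_PREFIXES)

if __name__ == "__main__":
    version = ssl.OPENSSL_VERSION  # e.g. "OpenSSL 1.0.1e 11 Feb 2013"
    print(version)
    print("Possibly Heartbleed-vulnerable:", looks_heartbleed_vulnerable(version))
```

Multiply that little chore by every server at every company still linking the old library, and you get the 74%-still-unpatched figure above; Apple, by contrast, could do the equivalent cleanup once, centrally.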
In short, while for things like physical protection you definitely want a backup in a physically distant location in case of disaster, in terms of electronic security your best bet is often to consolidate as much as you can so that your security effort can be concentrated in one place (within reason, of course; there's a point where consolidation becomes harmful and you fall prey to the "all your eggs in one basket" problem).
And making it volunteer-run just makes the problem worse: now anybody who actually wants to steal the data simply has to volunteer, and the biggest security barrier they'd otherwise face, actually getting their hands on the protected data in the first place, is automatically overcome as the data passes through their machine.