All your Google Base are belong to us.
"Automated controls will monitor the sealed reactor, Smith says, adjusting its electrical output and shutting it down if faults or tampering are detected. Alerts will be sent over secure satellite radio channels to the DoE or to an international agency overseeing the reactors."
From New Scientist via Slashdot
Also from Slashdot, an article about a Google project involving ISO shipping containers full of servers. These would complement Google's ongoing, legendary buy-up of dark fiber: the fiber provides the capacity, and the mobile datacenters provide the processing power for who knows what. Some say a parallel internet, a continuously updated copy/archive/index of the "real thing"; at the very least, such a system would cut the latency inherent in running network applications as if they were on the desktop.
The comments to the Slashdot post include speculation that this mobile datacenter/dark fiber combo would be a good way to roll out an instant first world internet in rapidly developing countries like China. A few commenters also wonder whether the thing would be laced with alarms like Bruce Sterling's Perfect Crackhouse.
I wish I knew more about that Ars Electronica presentation from Sterling; crack is a great metaphor for this kind of distributed infrastructure. Data is addictive. So is cheap energy, and these proposals imply first world powers rolling into the third world with power and data in alarmed, guarded boxes, like the heroin dealers in Requiem for a Dream, selling dime bags from sunny Florida in freezing New York out of the back of a semi, ready to fire a few shots and split if some junkie fucks up and tries to build a nuclear bomb.
This is a long way from the attitude behind the $100 laptop project, where the Freedom to Tinker is so important that MIT turned down a software donation from Apple because it wasn't open source. There are two ways of looking at mobile infrastructure flowing from the first world to the third: teach someone to fish, or start a cargo cult. Either these things can be taken apart and learned from, or they're totems dropped from the sky that deliver the goods until some native tries to mess with them.
The distributed/centralized dichotomy in processing and storage is another model of this split. The old network-computing model of a central processor with distributed terminals keeps popping up: keep your data and your applications online, and access them anywhere from a dumb interface box. This is how Google Earth works, but it's also how .mac and .net were intended to be used; the extreme example is WebTV. It never seems to catch on completely because storage and processing power keep getting smaller, faster, and cheaper: why should I put all of my data on .mac when I can carry it around in my iPod? The best strategy would be to use both, but over time I'd rather keep my data and my processing power local, because otherwise I'd risk giving up some of my ability to mess with them. Sony's attempt to take over a little bit of control of users' local machines with a rootkit wouldn't be half as outrageous if that data were stored online somewhere (like Google Base). Any scheme to centralize data and distribute access has the (sometimes unintended) consequence of controlling what users can do with that data.