Monday, October 23

The computer-on-a-planet

George Gilder's "The Information Factories" in the latest issue of Wired was, for me, a tour de force.

Have you heard of Moore's Law? It has some pretty interesting corollaries. While processor speed gains have been slowing, memory and storage capacity have been surging. So has power consumption.
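Moore's Law itself is just compounding arithmetic. A quick sketch, assuming the textbook statement that transistor counts double roughly every two years (the baseline and period here are illustrative, not figures from Gilder's piece):

```python
# Moore's law as commonly stated: transistor counts double
# roughly every two years.

def transistors(years, baseline=1.0, doubling_period=2.0):
    """Transistor count after `years`, relative to the baseline."""
    return baseline * 2 ** (years / doubling_period)

# A decade is five doublings: a 32x increase over the baseline.
print(transistors(10))
```

That relentless 32x-per-decade curve is what turns storage and bandwidth from scarce to abundant, which is the surplus the rest of the article is about.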

(Makes me wonder about all of these pseudo Greens who whine about An Inconvenient Truth, or promote it (like MySpace), but consume massive amounts of power. Are MySpace Tom and Sergey and Larry fighting to reduce tech power consumption? I don't think so. But I digress...)

Page and Brin – with Microsoft, Yahoo, and Barry "QVC" Diller's Ask.com hot on their heels – are frantically taking the computer-on-a-chip and multiplying it, in massively parallel arrays, into a computer-on-a-planet.

Google appears to have attained one of the holy grails of computer science: a scalable massively parallel architecture that can readily accommodate diverse software.

Google's core activity remains Web search. Having built a petascale search machine, though, the question naturally arose: What else could it do? Google's answer: just about anything.

In every era, the winning companies are those that waste what is abundant – as signalled by precipitously declining prices – in order to save what is scarce. Google has been profligate with the surfeits of data storage and backbone bandwidth. Conversely, it has been parsimonious with that most precious of resources, users' patience.

As energy analysts Peter Huber and Mark Mills projected in 1999, the planetary machine is on track to be consuming half of all the world's output of electricity by the end of this decade.

Not that I think it's going to go down that way. Things change. But those numbers evoke Skynet.

"A power company could give away PCs and make a substantial profit selling power."
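That quoted claim is easy to sanity-check with a back-of-envelope calculation. Every figure below is my own illustrative assumption, not a number from Gilder's article, and the verdict swings entirely on which assumptions you pick:

```python
# Back-of-envelope: does lifetime electricity revenue from one PC
# exceed the cost of giving the PC away? All figures are assumed.

pc_cost = 500.0            # assumed cost of the given-away PC, dollars
draw_kw = 0.15             # assumed average draw: 150 watts, always on
hours_per_year = 24 * 365  # assumed running around the clock
years = 4                  # assumed service life
price_per_kwh = 0.10       # assumed retail electricity price

revenue = draw_kw * hours_per_year * years * price_per_kwh
print(f"PC cost ${pc_cost:,.0f}; electricity revenue over {years} years ${revenue:,.2f}")
```

With those numbers the revenue only barely edges out the hardware cost; halve the duty cycle and the giveaway loses money. "Substantial profit" takes either heavier machines or longer lifetimes.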

Hydropower is a limited and localized resource, while nuclear power promises centuries of nearly limitless energy that can be produced almost anywhere. China is moving forward with plans to build as many as 30 new nuclear plants; perhaps the next wave of data centers will be sited in Shenzhen.

Have you heard Thomas Watson's famous quote about world demand for computers being maybe five mainframes? I never knew it might have come from this thinking:

This triumph of centralization is a strange, belated vindication of Grosch's law, the claim by IBM's Herbert Grosch in 1953 that computer power rises by the square of the price. That is, the more costly the computer, the better its price-performance ratio. Low-cost computers could not compete. In the end, a few huge machines would serve all the world's computing needs.
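Grosch's law, as summarized above, is easy to put into numbers. A minimal sketch, where the only thing taken from the passage is the quadratic relationship (the constant and the price points are arbitrary):

```python
# Grosch's law (1953): computing power rises as the square of price.
# Consequence: power per dollar grows linearly with price, so a machine
# costing ten times as much delivers ten times the price-performance.

def grosch_power(price, k=1.0):
    """Computing power predicted by Grosch's law: power = k * price**2."""
    return k * price ** 2

for price in (1_000, 10_000, 100_000):
    power = grosch_power(price)
    print(f"${price:>7,}: power {power:,.0f}, power per dollar {power / price:,.0f}")
```

Under that curve the cheap machine is always the worst deal, which is exactly why Grosch's logic pointed toward a few huge centralized computers.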

What if personal computers were a brief detour, a tangent on the way to the planetary computer? What if they made it possible by creating systems that could build its massively parallel components?

Kottke's Pop!Tech wrap-up today touches on this topic:

Chris Anderson talked about, ba ba baba!, not the long tail. Well, not explicitly. Chris charted how the availability of a surplus in transistors (processors are cheap), storage (hard drives are cheap), and surplus in bandwidth (DSL is cheap) has resulted in so much opportunity for innovation and new technology.
 
Where will all of this stuff lead? Not sure. But it might be a fun ride...