robust physical computation

von Neumann showed us that it was possible to make reliable systems out of unreliable components, igniting the computer revolution.

And then a half century focused on efficiency has shown us how to do the reverse...
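To make the von Neumann point concrete, here is a minimal sketch (mine, not anything from the original argument or the class) of the simplest version of the idea: triple modular redundancy, where a majority vote over three independently unreliable components fails far less often than any single one of them.

    import random

    def unreliable_gate(correct_value, p_fail):
        # A component that produces the right bit, except that with
        # probability p_fail it flips the answer.
        return correct_value ^ (random.random() < p_fail)

    def voted_gate(correct_value, p_fail, copies=3):
        # Majority vote over several independent unreliable copies of the gate.
        votes = sum(unreliable_gate(correct_value, p_fail) for _ in range(copies))
        return int(votes > copies // 2)

    def failure_rate(gate, trials=100000):
        # Estimate how often the gate gets the answer wrong.
        return sum(gate(1, 0.1) != 1 for _ in range(trials)) / trials

    if __name__ == "__main__":
        print("single component:", failure_rate(unreliable_gate))  # about 0.10
        print("3-way majority  :", failure_rate(voted_gate))       # about 0.028

With a per-component failure rate of p = 0.1, the voted result fails at roughly 3p^2 - 2p^3, about 0.028; the price is three times the hardware, which is exactly the efficiency-versus-robustness tension this page is about.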


Computers used to be expensive, slow, and big. They were housed in special facilities and had dedicated staff caring for them. A primary consideration was how to use them efficiently to solve problems.

But now computers are cheap, fast, and small. Increasingly we place them not in protected environments but in the world at large, and increasingly we expect them to administer themselves.

In other words, computers need to grow up.

But, like bad parents spoiling their children, we have left our computers woefully unprepared.

The legacy of von Neumann — a reliable programmable substrate to build on — has allowed, even encouraged, a lack of robustness in software and higher-level design. That, combined with the relentless focus on techniques for increasing efficiency, has left us with increasingly brittle systems that we are nonetheless empowering with increasingly valuable and sensitive tasks, and deploying into the wild.


I offered a class at UNM in the Fall of 2009, where we explored robustness in the context of physical computation. We used the Illuminato X Machina as our basic 'cell', and worked to build robust multicellular systems to perform tasks. We did not abandon concerns for efficiency, but we explored ways of balancing robustness and efficiency, with the recognition that

Efficiency And Robustness are Mortal Enemies.

Not new ways of balancing them, necessarily, because living systems have been tuning robustness and efficiency since the beginning. But perhaps more appropriate ways of balancing them, in the changing computational world we are creating around us.
