For a historian, all this technoptimism is hard to swallow. The harsh reality, as far as I can see, is that the next 25 years (2013-2038) are highly unlikely to see more dramatic changes than science and technology produced in the last 25 (1987-2012).

For a start, the end of the Cold War and the Asian economic miracle provided one-off, nonrepeatable stimuli to the process of innovation in the form of a massive reduction in labor costs and therefore in the price of hardware, not to mention all those ex-Soviet Ph.D.s who could finally do something useful. The IT revolution that began in the 1980s was important in terms of its productivity impact inside the U.S.—though this shouldn’t be exaggerated—but we are surely now in the realm of diminishing returns (the symptoms of which are deflation and underemployment, due in part to the automation of unskilled work).

The breakthroughs in medical science we can expect as a result of the successful mapping of the human genome will probably extend the average lifespan further. But if we make no commensurate advances in neuroscience—if we succeed only in protracting the life of the body, but not of the mind—we will simply increase the number of dependent elderly.

My pessimism is supported by a simple historical observation.