Teschler on Topic
Leland Teschler • Executive Editor
On Twitter @DW_LeeTeschler
Back in the dark days around the end of the year 2002, the NASDAQ-100 had dropped 78% from its peak, and the country was in the middle of a recession. Many early dot-com companies had run out of capital and gone bankrupt. Supporting industries such as advertising were hurting as well because demand for their services had fallen drastically.
I can still recall an elevator conversation I had back then with one of the executives where I worked. “The internet isn’t coming back,” he said dejectedly. “All the analysts we talk to say there’s nothing on the horizon that’s going to make online commerce anything like what it once was.”
A few months after this prediction of doom-and-gloom, a Harvard University student put together a website he called Facemash.
No one in 2002 foresaw the rise of a company that, ten years later, had 955 million monthly active users and revenues of $7.8 billion. Ditto for social media in general, though early social media platforms such as GeoCities and Classmates.com had been around since the mid-1990s.
That brings us to the current call by U.S. policy makers for an industrial policy to fix perceived market failures and counter China’s growing economic prowess. A principal part of this idea is subsidies for U.S. producers of semiconductors, electric vehicles, and other goods, as well as legislation to subsidize American industrial R&D.
There’s only one thing wrong with calls for a U.S. industrial policy: Past government interventions of this sort have a terrible track record. Past attempts by politicians to identify critical technologies failed because government bureaucrats are unlikely to do the equivalent of foreseeing the next Facebook.
History is rife with examples of federal industrial programs that contributed little in the way of technological advances but endured long after the technical handwriting was on the wall. Examples include synthetic fuel from coal, the Clinch River Breeder Reactor, and the Supersonic Transport in the 1960s. But a more recent example concerns supercomputers made by Cray Research Inc.
In the 1990s, politicians decided supercomputers were a strategic technology. There was considerable pressure by the U.S. government on Japan to buy Cray machines. The fear was that Japanese makers of supercomputers would dominate the industry. U.S. officials seemed to think that keeping a functioning supercomputer industry in the U.S. hinged on making Cray viable.
The U.S. eventually did arm-twist Japan into buying a few Cray boxes. Meanwhile, competitors such as Thinking Machines Corp. and Intel claimed the government was singling out Cray for special treatment. They complained that Cray lagged in developing the massively parallel supercomputer technology that signaled where the industry was headed.
Cray Research was eventually bought out by what is now Hewlett Packard Enterprise. On the current list of the top ten supercomputers in the world, a machine from Japan occupies the top spot. Of the remaining machines in the top ten, two are made by IBM, two by Dell, and two by Nvidia; one is Chinese and one is French.
Not every U.S. industrial policy effort has resulted in disaster. But most past efforts at boosting “critical” industries have had expensive, underwhelming outcomes. So it pays to be skeptical of policies that only work if politicians prophesy the technological future. DW