Thursday, August 28, 2008

Growth of Productivity in the US Economy 1870 - 1995

This graph was used by Louis Uchitelle of the New York Times to illustrate the "missing productivity" that the information revolution was supposed to bring to business. It cleverly shows the increase in productivity in the US economy from 1870 to the mid-'90s, with major technological revolutions highlighted along the way. The Internet Revolution was just getting underway when Mr. Uchitelle first penned his thoughts; it would later make visible the pent-up productivity that the computer was about to unleash.

Growth of the US economy 1870 - 1995 tied to major industrial innovations

Here is one of Mr. Uchitelle's essays on the subject from December 1996.

December 8, 1996

Measured in Productivity Gains,
The Computer Is a Disappointment



At the end of the 19th century, railroads and electric motors were expected to transform America, making a young industrial economy far more productive than any seen before. And they did.

At the end of the 20th century, computers were supposed to perform the same miracle. They haven't.

Computers do wonderful things. But in purely economic terms, their contribution has been less than a transforming force: they have failed to bring back the strong growth that characterized so many decades of the American Century. By that standard, they have been a disappointment.

"It is a pipe dream to think that computers will lead us back to a promised land," said Alan Krueger, a Princeton University economist.

The issue is productivity. Those who look to computers for economic miracles, and there are many, insist that measuring their contribution only in dollars misses the less tangible improvement in quality that computers have made possible. But quality is often in the eyes of the beholders rather than in their wallets.

Through decades of invention and change, productivity has been measured as the amount of "output," in dollars, that comes from an hour of labor.

A worker who makes 100 pencils in an hour, each valued at 50 cents, produces $50 of output. And the more output from each of the nation's workers, the greater the national wealth.

Or, put more broadly, productivity is the amount of output in dollars that comes from various "inputs," not only a worker's labor, but the tools he or she uses to carry out that labor: a machine or a computer or a wrench or an air conditioner that makes work more comfortable in summer.
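The definition above reduces to a one-line formula. A minimal sketch, using the article's pencil example (the function name is illustrative, not from the article):

```python
def productivity(output_dollars, input_hours):
    """Labor productivity: dollars of output per hour of labor input."""
    return output_dollars / input_hours

# The pencil maker: 100 pencils in an hour, each valued at 50 cents.
pencil_output = 100 * 0.50
print(productivity(pencil_output, 1.0))  # 50.0 dollars of output per hour
```

The more output per hour from each of the nation's workers, the greater the measured national wealth.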

When people work faster or concentrate better, that shows up quickly in tangible output.

By this definition, the output resulting from the computer revolution of the last 25 years has been disappointing.

Computers have, of course, contributed to productivity and economic growth. But that contribution has failed to register in government statistics as the kind of robust catalyst that made the 1950s and 1960s such prosperous years.

If computers have fallen short of expectations, that would help explain an apparent paradox that has puzzled economists and policy makers for two decades: how rapid technological progress and a booming stock market took place during a period of sluggish economic performance -- sluggish, that is, relative to earlier decades.

One possibility is that the statistics are wrong. A panel of economists came to this conclusion in a report to Congress last week, suggesting that growth has actually been quite robust but that this fact has been obscured because official statistics overstate inflation and thus understate real output.

This happened, the panel hinted, partly because the beneficial economic role of computers was not correctly taken into account. Some price increases that registered as inflation should really have registered as increases in output from computers.

But there is another explanation. Perhaps the computer is one of those inventions, like the light bulb early in the century, that makes life much better without adding as much to tangible national wealth as appearances might suggest.

That is because, while the light bulb allowed factories to operate night shifts and students to study more easily, the measurable result was less impressive than the great improvement in the quality of life that electric light made possible.

Given the computer's ubiquity and convenience, should the calculation of productivity and wealth be changed to give more dollar value to the conveniences the computer has wrought?

That kind of recalculation has not been done over generations of technological change, largely because convenience is too hard to quantify and translate into dollars. Too often, convenience increases consumption more than production. With computers, "most of the recent use has been on the consumption side," said Zvi Griliches, a Harvard economist. "The time you waste surfing the Internet is not an output."

Others take a broader view. Children using home computers for schoolwork -- gathering data from the Internet, for example -- become better students, they say.

In time, that will translate into rising workplace skills and greater measurable output. But it hasn't yet, and standard practice dictates that the nation wait until it shows up in the numbers before proclaiming the computer's great contribution to productivity.

"People have high expectations of this happening overnight," said Nathan Rosenberg, an economic historian at Stanford University. "Computers are a major innovation, but absorbing so great an innovation involves many changes in work practices and behavior."

Right now, much of a personal computer's power goes untapped, or is employed in low-output tasks like sending and sorting through junk E-mail, compiling electronic Rolodexes and playing solitaire in the office.

Harnessing a computer's spectacular ability to deliver and manipulate information is not easy. Edward McKelvey, a senior economist at Goldman Sachs, offers a hypothetical illustration:

A consultant who charged $50 an hour 10 years ago to forecast trends in the economy now has a powerful desktop computer at his fingertips, feeding him information that in theory should make his forecasts more accurate. But he still charges clients $50 an hour because the forecasts, despite the computer, are not more accurate.

Perhaps the consultant will never get that good at forecasting, even with a computer; or perhaps he will become so adept at extracting data from its depths that his forecasts begin to hit the bull's-eye. That accuracy would allow him to raise his hourly fee, or "output," to $70 an hour, a handsome improvement in his productivity.
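In the article's own terms, the consultant's hypothetical gain is straightforward arithmetic:

```python
old_fee = 50.0  # dollars per hour of forecasting -- his output today
new_fee = 70.0  # fee after the hypothetical gain in accuracy
gain = (new_fee - old_fee) / old_fee
print(f"{gain:.0%}")  # a 40% rise in output per hour worked
```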

There are other problems. The automated teller machine, for example, illustrates how measurable productivity has failed to respond fully to computer investment. A half-dozen machines installed in a bank's lobby permit the bank to cut its teller staff by half. That is clearly measurable productivity.

The bank's income, or output, from bank transactions remains unchanged, but the input in teller hours goes down. The idled tellers can shift to other income-producing activities, perhaps becoming loan officers.
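The arithmetic behind that claim can be sketched directly; the dollar and hour figures below are hypothetical, since the article gives none:

```python
income = 10_000.0            # weekly output from transactions, dollars (assumed)
teller_hours_before = 400.0  # weekly teller hours before the ATMs (assumed)
teller_hours_after = teller_hours_before / 2  # staff cut by half

before = income / teller_hours_before  # 25.0 dollars of output per teller hour
after = income / teller_hours_after    # 50.0 dollars per teller hour
print(after / before)  # 2.0 -- measured productivity doubles
```

The ratio doubles no matter what the assumed figures are: output holds steady while the labor input is halved.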

To make the productivity rate continue rising, however, the bank must continue cutting teller hours as it installs more ATMs. Instead, the next machines go to a dozen outlying neighborhoods, so that customers can bank at odd hours, almost at their doorsteps, or verify the balances in their checking accounts, something they did not bother to do very often before ATMs.

That is convenience. Most banks don't charge extra fees for this convenience. If they had no neighborhood ATMs, then customers would have found themselves forced to use the machines already installed in the lobbies of their banks.

"The question is, how much would you have been willing to pay in fees for the convenience of having that neighborhood ATM if the banks refused to furnish them otherwise?" said Erik Brynjolfsson, an economist at the Massachusetts Institute of Technology's Sloan School of Management. "That would then enter into measurable output."

Through a survey, Brynjolfsson tried to calculate what additional amounts Americans would pay for hundreds of conveniences that computers make possible. He came up with a total of $70 billion in additional output.

That would add only about one percent to the national wealth, which is the value of all the goods and services produced in the United States in a year -- hardly enough to get economic growth back to the rates (at least 3 percent a year) that were characteristic of the 1950s and 1960s.

Still, computers and software in all their various forms make an important contribution. The national wealth -- also known as the gross domestic product -- has risen at an annual rate of less than 2.5 percent, on average, in recent years.

That includes a contribution of roughly four-tenths of a percentage point from computers and their trappings, according to the calculations of two Federal Reserve economists, Stephen D. Oliner and Daniel E. Sichel. Manufacturing and the telecommunications industry have benefited especially from computerization.

But why haven't computers lifted the overall economy the rest of the way back to 3 percent growth? One reason is that they represent only 2 percent of the nation's capital stock, which is all the existing machinery, equipment, factories and buildings that business uses to produce goods and services.

By comparison, railroads in their heyday represented more than 12 percent. And they became the tool for opening up frontier lands to agriculture, and to new cities and industries.

At the same time, electric motors, replacing steam, gave the nation a much more flexible and efficient source of power, and made possible the assembly line. The output resulting from railroads and electric motors became enormous.

Perhaps there is some set of conditions, having no direct connection to computers, that must develop before American productivity and economic growth can return to the old levels -- conditions like greater demand for the potential output from computers, or hegemony again in the global economy.

Or perhaps, as some economists say, we should lower our expectations.
