The Fed’s Reckless Gamble
WHEN PIMCO founder William Gross coined the term the “new normal,” he both stated the obvious and offered a fresh insight. Most people understand in a visceral way that jobs and economic opportunity have changed dramatically since the financial crisis of 2008. But more than something new, the current state of the U.S. economy represents a reversion to the old normal—the price deflation and slack job market that existed in the 1920s and 1930s—which was interrupted by World War II and the subsequent decades of the Cold War and massive government spending.
It is safe to say that everyone wishes for a return to business as usual, at least insofar as “normal” is understood by most Americans. Plentiful jobs, along with rising home and stock prices, worked for most of us. The problem is that in the old-normal economy of the 2000s, prices for homes, stocks and other asset classes grew at rates that were clearly unsustainable. When home prices in the United States rose at annual double-digit rates in the mid-2000s, the one thing you could be sure of was that this pace of appreciation was unsound and probably a function of external factors such as low interest rates and easy credit.
Since the 2008 financial bust, the U.S. economy has been anything but normal. The housing market, for example, rebounded at double-digit rates in 2011–2013, but now seems to be losing momentum rapidly. Near-zero interest rates maintained by the Federal Open Market Committee (FOMC) prevented an immediate apocalypse in the form of a 1930s-style price deflation, but this is both good and bad news. The lack of a true debt deflation commensurate with the degree of excess prior to 2008 has left the U.S. economy hanging in a form of economic stasis. Without price deflation and debt restructuring, there is no economic “bounce” and thus no recovery in demand or jobs.
TODAY, THE U.S. economy is like a cardiac patient on artificial life support. Flat employment, flat credit growth (at least for productive purposes) and falling inflation-adjusted incomes are the attributes of the new normal. Nobel laureate Robert Shiller draws an explicit parallel between today’s “new normal” of no or slow wage and job growth and the late 1930s, when the U.S. economy began to sink under the weight of FDR’s New Deal experiment:
The depression that followed the stock-market crash of 1929 took a turn for the worse eight years later, and recovery came only with the enormous economic stimulus provided by World War II, a conflict that cost more than 60 million lives. By the time recovery finally arrived, much of Europe and Asia lay in ruins.
Shiller’s point about how World War II rescued America from the deflation of the late 1930s is often deliberately overlooked by economists. FDR’s antibusiness rhetoric during the New Deal actually made the deflation of the 1930s worse by chasing private capital out of the U.S. economy. In their classic book A Monetary History of the United States, 1867–1960, Milton Friedman and Anna Jacobson Schwartz documented how private capital formation in the United States essentially went to zero by the late 1930s, leaving the public sector as the only engine of growth into the 1950s and 1960s. Large corporations and banks aligned with the federal government were the most significant source of credit and economic prosperity in that period. It took until the 1970s for private risk taking to truly reemerge in the U.S. economy, driving growth for decades thereafter. After these dramatic swings in growth and demand, however, we still have a muddled view of what constitutes long-term economic expansion.
While politicians and central banks can artificially increase the nominal growth rate for relatively short periods of time—we know this as a “bubble”—such machinations create no real wealth. We feel wealthier for a time, as in the Roaring Twenties and the 2000s. Yet when any significant proportion of the population tries to take its chips off the gaming table, the good times end. Given that an economy only truly grows wealth at the rate of real GDP growth, as Alex Pollock of the American Enterprise Institute observes, why do so many economists and the members of the FOMC call for policies to push higher and unsustainable rates of economic growth? The answer comes down to a basic difference between conservatives and liberals when it comes to inflation, a conflict of visions that has its roots in the dark days of the Great Depression.
Some on the left, like author William Greider, believe that a little inflation is good for working people and debtors, even if it erodes the purchasing power of wages. But just as a steady 2 percent increase in real wealth compounds into enormous benefits for a society, a steady 2 percent annual inflation rate compounds against workers and families, steadily eroding their ability to meet basic needs. For example, an item that cost $20 in 1930 would cost $283 as of this writing, reflecting a cumulative rate of inflation of 1,315 percent, according to the Consumer Price Index (CPI) maintained by the Bureau of Labor Statistics (BLS).
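The arithmetic behind these figures is simple compounding. A minimal sketch, using the $20 and $283 price levels from the text (the 84-year span from 1930 to the mid-2010s is an assumption for illustration):

```python
# Cumulative and annualized inflation implied by the CPI figures above:
# an item costing $20 in 1930 costs $283 "as of this writing".
start_price = 20.0
end_price = 283.0
years = 84  # assumed span, 1930 to the mid-2010s

# Cumulative inflation: total percentage rise in the price level.
cumulative_pct = (end_price / start_price - 1) * 100

# Annualized inflation: the constant yearly rate that compounds to the same total.
annualized_pct = ((end_price / start_price) ** (1 / years) - 1) * 100

print(f"cumulative inflation: {cumulative_pct:,.0f}%")   # matches the 1,315% in the text
print(f"annualized inflation: {annualized_pct:.1f}%")
```

Note that the headline 1,315 percent figure corresponds to a compound annual rate of only about 3 percent; it is the compounding over decades, not any single year, that produces the fourteenfold price increase.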
Remember that because of various adjustments and omissions from the underlying data, the CPI greatly understates the actual rate of inflation experienced by individual consumers. Inflation, after all, is a monetary phenomenon that occurs when the value of money declines relative to the goods and services it can purchase. Small wonder that Americans have seen a steady decrease in real income over the past several decades. And yet the Federal Reserve and other central banks explicitly target inflation levels that are ultimately destroying consumer purchasing power.
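The purchasing-power erosion described above follows directly from deflating nominal income by the price level: a flat nominal wage under steady inflation shrinks in real terms every year. A sketch of that mechanism, with an illustrative wage and the 2 percent target rate mentioned earlier (the $50,000 figure is an assumption, not data from the text):

```python
# Illustrative: a flat $50,000 nominal wage under a steady 2% annual inflation rate.
# Real income = nominal income deflated by the cumulative price level.
nominal_wage = 50_000.0
inflation = 0.02

for year in (0, 5, 10, 20):
    price_level = (1 + inflation) ** year   # cumulative price index, base year = 1.0
    real_wage = nominal_wage / price_level
    print(f"year {year:2d}: real wage = ${real_wage:,.0f}")
```

After a decade the same paycheck buys roughly 18 percent less; after two decades, roughly a third less. If the CPI understates actual inflation, as the text argues, the true erosion is larger still.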