It was Sunday night, Aug. 15, 1971, and many Americans were watching television — the most popular show that evening being the Western series “Bonanza.” (Older readers will recall that it chronicled the adventures of the Cartwright family — Ben, his three sons and their Chinese cook — on their Ponderosa Ranch in Nevada.) At 9 p.m. Eastern time, the Cartwrights and their rivals on the other two networks were interrupted by the somewhat less popular figure of President Richard Nixon.
The word “bonanza,” according to the Oxford English Dictionary, was introduced into American English in the 1880s to describe a highly productive or profitable mine, such as the silver mines of the Comstock Lode in Cartwright country. Ironically, Nixon was disrupting Sunday evening to tell Americans that the days of precious metal were over. The link between the U.S. dollar and gold — a link that dated back to the country’s adoption of the gold standard nearly a century before — was to be severed. The age of fiat money — that is, of currency backed by nothing more than the credibility of the U.S. Treasury — had dawned.
Not that Nixon put it like that. It’s worth watching a clip of his address to remind yourself just how terrible the production values of U.S. politics used to be. Nixon looks as if he is addressing the nation from a passport photo booth, a nasty blue curtain all but matching his equally nasty blue suit and tie. There were no teleprompters then, so he constantly looks down at his script. You would not know from his flat delivery how many hours he and his advisers and speechwriters had devoted to this historic text.
Americans by now were used to presidential addresses about Vietnam. It was less usual to have a lecture on the economy on a Sunday night. However, as Jeffrey E. Garten explains in his gripping account of the speech’s origins and consequences, “Three Days at Camp David,” the announcement had to go out before financial markets opened on Monday. In his own charmless way, Nixon was dropping a bombshell.
“The time has come,” Nixon declared, “for a new economic policy for the United States. Its targets are unemployment, inflation and international speculation.” There followed a succession of presidential pledges, in ascending order of radicalism: to introduce tax breaks to encourage investment; to repeal the excise tax on automobiles (but only U.S.-made ones); to bring forward planned income tax reductions (though with offsetting spending cuts); to impose a 90-day “freeze” on all prices and wages; and — the bombshell — “to suspend temporarily the convertibility of the dollar into gold.” Finally, Nixon announced a 10% tax on all imports — in a word, a tariff.
For foreign leaders, finance ministers and central bankers, this was stunning. Not only would the U.S. dollar cease to be convertible into gold; the U.S. was apparently turning away from the free trade it had embraced at the end of World War II and reverting to protectionism — though this proved to be just a threat to get the Europeans and Japanese to accept the dollar devaluation. In the words of Henry Brandon, the chief Washington correspondent of the London Sunday Times, this was the “moment of the formal dethronement of the Almighty Dollar.”
Except that it wasn’t.
From the distance of half a century, the most surprising thing about what the Japanese called “the Nixon shock” was precisely that it did not mark the end of the era of dollar dominance. On the contrary, the U.S. currency has only grown more important — its privilege even more exorbitant — since Nixon severed its link to gold.
There is an important lesson here for every commentator who is tempted to speculate about the dollar’s demise (and I have done it myself more than once). My old friend Steve Roach, the former chairman of Morgan Stanley Asia, made the standard case in January. Since then, the dollar has essentially flatlined, according to the trade-weighted indices produced by the Bank for International Settlements (BIS).
The arguments for a dollar crisis back in 1971 are familiar to modern ears. Inflation was rising. The budget deficit was worrisome. The trade deficit was growing. And Asian and European competitors were eroding U.S. economic leadership. Nixon’s economic bombshell needs to be seen in the broader context. He and his national security adviser, Henry Kissinger, were struggling to extricate the U.S. from an unpopular war in Vietnam. They were in the midst of a bold attempt to deal directly with China’s communist government in the hope of putting pressure on the North Vietnamese and their Soviet backers.
You might say that Joe Biden confronts a somewhat similar landscape (for Vietnam, read Afghanistan) except that the deficits of 2021 make the deficits of 1971 look trifling. The federal deficit in Nixon’s first term peaked at 2.1% of GDP. In the words of a July 21 Congressional Budget Office report, “At 13.4% of GDP, the deficit in 2021 would be the second largest since 1945, exceeded only by the 14.9 percent shortfall recorded last year.” And that doesn’t include the $1 trillion infrastructure bill that the Senate just passed, which the CBO thinks would widen the budget deficit by another $256 billion over 10 years. Nor does it include the $3.5 trillion antipoverty and climate package that the Senate majority leader, Chuck Schumer, would also like to enact this year.
As for the trade deficit, you have to squint to see one in 1971. It was a negligible $1.4 billion — true, the first trade deficit since 1893, but still tiny in a $1.2 trillion economy. The overall current account deficit at the time of the Nixon shock was 0.2% of GDP. Today it’s 3.5%.
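A rough back-of-the-envelope calculation, using only the nominal figures quoted above (so an approximation, not an independent estimate), shows just how hard the squinting has to be:

$$\frac{\$1.4\ \text{billion}}{\$1{,}200\ \text{billion}} \approx 0.12\%\ \text{of GDP}$$

In other words, the 1971 trade gap was barely a rounding error against national output.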
As Garten tells the story, 15 white guys (as I said, the 1970s were different) repaired to Camp David and thrashed out Nixon’s new economic policy. The Texan political force of nature that was Treasury Secretary John Connally got most of what he wanted: in particular, “to screw the foreigners before they screw us.”
The losers were Paul Volcker, then a Treasury undersecretary, and the other financial technocrats who had hoped to re-engineer the Bretton Woods system — with the International Monetary Fund’s special drawing rights (a synthetic reserve currency) taking the place of gold. Yet Connally was playing the part of a wrecking ball, as Kissinger pointed out when he came to understand what was being cooked up. (He was on his way to Paris during that fateful weekend, for secret peace negotiations with the North Vietnamese.)
“I will be perfectly frank with you,” Connally told reporters after Nixon’s TV address. “None of us know for certain what will occur.” Politically, the speech delivered the boost to the administration’s popularity that Connally and Nixon had anticipated. But the collateral damage to American foreign policy — as Asian and European markets and currencies went haywire — took many months to repair. Not until the Smithsonian Agreement in late December were new exchange rate arrangements in place, whereby everyone else accepted the reality of dollar devaluation.
And even this did not last. First the Brits devalued, then the Italians (prompting Nixon’s famous outburst, “I don’t give a shit about the lira”). The dollar had to be devalued again in February 1973. By the end of that year most major currencies were floating — the outcome always preferred by Connally’s far more sophisticated successor as Treasury secretary, George Shultz.
You can see why journalists such as Henry Brandon thought it was the end of the line for the dollar. The 1970s became a horror show of double-digit inflation. At its nadir, the dollar had depreciated by around 50% compared with the Japanese yen and German Deutschmark. Yet neither currency displaced the dollar, despite numerous prophecies of that outcome.
The dollar rallied strongly in the first half of the 1980s — to the extent that there had to be coordinated intervention to weaken it under the September 1985 Plaza Accord. It had another wave of strength in the second half of the 1990s. And contrary to most predictions before the global financial crisis of 2008-9 — including the influential warnings of my old friend Nouriel Roubini, New York University’s so-called Dr. Doom — the dollar strengthened rather than weakened at times of economic stress, from the bankruptcy of Lehman Brothers Holdings Inc. to the plague of Covid-19.
Why was this? Why has the dollar remained dominant despite the apparent instability of this “nonsystem” (as the economist John Williamson called it) of sometimes floating, sometimes pegged exchange rates? I offer a three-part answer.
First, although the “great inflation” of the 1970s was disruptive, it proved to be curable. As Federal Reserve chair, Volcker administered the bitter medicine of higher interest rates and a recession that, combined with the supply-side reforms of President Ronald Reagan’s administration, fundamentally reset expectations. Independent central banks succeeded so well in reducing inflation that in 2004 Ben Bernanke, then a Fed governor, boasted of a “Great Moderation.”
Second, the system of liberalized capital markets born around this time — beginning with the eurodollar market — gave the dollar even more international utility than it had enjoyed under Bretton Woods. As the dominant currency not only in central bank international reserves but also in a rising share of international trade transactions, the dollar was more than ever the sun around which the other currencies of the world revolved.
Third, the terrorist attacks of Sept. 11, 2001, strengthened rather than weakened the U.S.-centered international financial system. Direct hits on the World Trade Center, a short distance from the New York Stock Exchange, could only briefly disrupt the smooth operation of American financial markets. And when the U.S. government went after those who had financed al-Qaeda and other extremist groups, it discovered a hitherto underestimated superpower: the ability to impose financial sanctions on any country or entity that defied Washington.
The increasing exertion of this superpower in response to a variety of different challenges to U.S. power — from the Russian annexation of Crimea to Swiss bank secrecy — revealed the full extent of American financial paramountcy. Excluding any actor from the dollar payment system proved a more effective (and much cheaper) geopolitical lever than sending an aircraft carrier strike group. True, the U.S. could not restore Crimea to Ukraine. But it could inflict real pain on the Russian economy and the Russian political elite. Here was a powerful incentive to retain dollar dominance.
Yet the core of this financial power was and remains the U.S. banking system. And two recent developments have exposed the weakness of this core. First, the financial crisis originated in the undercapitalization and poor management of the American banks and their European counterparts. Second, and less obvious, technological innovations began to expose the banks’ fundamental inefficiency. As the Princeton historian Harold James insightfully argued last month:
The dollar’s long preeminence is being challenged, not so much by other currencies … as by new methods of speaking the same cross-border monetary language as the dollar. As the digital revolution accelerates, the national era in money is drawing to a close. … the demand for a monetary revolution is growing.
That revolution will be driven by digital technologies that enable not only new forms of government-issued fiat currencies … but also private currencies generated in innovative ways, such as through distributed ledgers. … The world is quickly moving to money based on information rather than on the credibility of a particular government.
In James’s neat framing, “Nixon’s closing of the gold window marked the end of a commodity-based monetary order, and the beginning of a new world of fiat currencies.” Now, however, “we are moving toward another new monetary order, based on information.”
Or are we? The past 18 months have been an exciting phase of the monetary revolution. The pandemic has sped up both innovation in decentralized finance and adoption by a wider range of investors and institutions of established cryptocurrencies such as Bitcoin and Ethereum. In recent months, however, I have been depressed to see a wave of attacks on cryptocurrency by the custodians of the established order.
Among the standard-bearers of the backlash against crypto is Hyun Song Shin, with whom I once shared a staircase when we were students at the same Oxford college. In the latest BIS annual report, Shin denounces cryptocurrencies as “speculative assets rather than money … used to facilitate money laundering, ransomware attacks and other financial crimes.” Dismissing both Bitcoin and stablecoins, he argues that central banks must instead expedite the adoption and issuance of their own digital currencies, following China’s lead.
Martin Wolf of the Financial Times sounded an even more combative note last month. Central banks and governments, he argued, “have to get a grip on the new Wild West of private money,” and the best way would be to introduce digital currencies of their own. “The state must not abandon its role in ensuring the safety and usability of money,” Wolf went on, echoing Shin: “Bitcoin in particular has few redeeming public interest attributes … In my view, such ‘currencies’ should be illegal.”
These messages are being received and amplified in Washington. The President’s Working Group on Financial Markets, which is led by Treasury Secretary Janet Yellen, has expressed concerns about two stablecoins: Tether, which is under investigation by the Justice Department, and Facebook’s Diem, which was supposed to launch last month. Along with others such as Circle’s USDC, these stablecoins are backed by dollar assets. This has led some — for example, the former chairman of the Commodity Futures Trading Commission, Timothy Massad — to argue that stablecoins are like unstable money market funds.
Another talking point (used, for example, by Fed Governor Lael Brainard) is that stablecoins are analogous to the notes issued by wildcat banks in the 19th-century U.S. This is very bad financial history, as George Selgin has pointed out.
Perhaps the most startling illustration of this new mood was the speech given by Gary Gensler, chairman of the Securities and Exchange Commission, at the Aspen Security Forum on Aug. 3:
Primarily, crypto assets provide digital, scarce vehicles for speculative investment. Thus, in that sense, one can say they are highly speculative stores of value. … We also haven’t seen crypto used much as a medium of exchange. To the extent that it is used as such, it’s often to skirt our laws with respect to anti-money laundering, sanctions, and tax collection. … Right now, we just don’t have enough investor protection in crypto. Frankly, at this time, it’s more like the Wild West.
The use of stablecoins on these platforms may facilitate those seeking to sidestep a host of public policy goals connected to our traditional banking and financial system: anti-money laundering, tax compliance, sanctions, and the like. This affects our national security, too.
As Kissinger quipped after a comparable litany of congressional complaints about abuses by the intelligence agencies: “Except for that, there is nothing wrong with my operation?”
Gensler went on to argue that pretty much everything that moves in the world of crypto is almost certainly an unregistered security. Likewise, any platform where crypto tokens are traded or lent is subject to securities laws — and possibly also to commodities laws and banking laws. All he asked of Congress was “additional plenary authority to write rules for and attach guardrails to crypto trading and lending.”
As if to answer that classic plea by a regulator for yet more power, the Biden administration seized the opportunity presented by its own bipartisan infrastructure bill to insert a provision that, in the name of increasing tax revenue, would treat many, if not all, crypto participants as “brokers,” potentially imposing 1099-issuing and IRS-reporting requirements on them. Many of these participants merely serve as nodes in a network, processing encrypted information, and do not even have access to the information required by the bill.
A bipartisan group of senators — Republicans Pat Toomey and Cynthia Lummis, and Democrat Ron Wyden — rode to the rescue with a compromise amendment, which, while far from perfect, would have spared Bitcoin and Ethereum miners, validators, hardware makers and, most importantly, programmers themselves. Another bipartisan pairing, Senators Mark Warner and Rob Portman, proposed a competing amendment that would have created a carveout only for Bitcoin miners.
Yellen and the White House backed the Warner amendment, as it offered a legislative basis for the universal digital financial surveillance they seek without the political battle that standalone legislation would likely require. While Bitcoin is the most widely held and most valuable cryptocurrency, it is the rapidly growing decentralized financial system built on Ethereum’s smart contracts that worries the Treasury.
Overblown claims, such as Democratic Senator Elizabeth Warren’s warning that “shadowy super coders” would wreck the financial system, recall the alarmist reasoning used by the State Department in the 1990s when it attempted to restrict cryptography — an attempt overturned by the courts (in Bernstein v. United States), which deemed code to be protected free speech. The Warner amendment was an analogous attempt to choose “which foundational technologies are OK and which are not in crypto,” to quote Coinbase chief executive Brian Armstrong, a sentiment echoed by Tesla founder Elon Musk. In the end, the amendments fell by the wayside and the original language stands.
The right response came from Senator Ted Cruz, who proposed striking all crypto language from the bill. As “no more than five” senators could answer “what the hell a cryptocurrency even is,” he said, “the barest exercise of prudence would say we shouldn’t regulate something we don’t yet understand, we should actually take the time to try to understand it.”
I agree. And I also agree with the venture investor Adam Cochran (one of many “crypto bros” commenting on these developments) that “there is currently no greater way to risk the supremacy of the U.S. dollar, than by introducing anti-crypto legislation … The risk of cryptocurrency replacing the sovereignty of the U.S. dollar is *NOT* that people will start to denote everything in Bitcoin. It’s that this industry will set up shop elsewhere and it will use that currency.”
No doubt Cochran is talking his own DeFi book. But I like this argument for historical reasons. As Harold James says, we are living through a monetary revolution as profound as the one that swept away the remains of the gold standard. But there is a difference. In the 1970s and 1980s, the attempts by governments to regulate the revolution were swept away. Nixon’s price and wage controls were an abject failure, just as the economist Milton Friedman (and Shultz) had foreseen. Under Reagan, it was deregulation that enabled American financial institutions to become the dominant players in international markets.
The winners of my boyhood have become the bloated incumbents of my middle age. The innovative energy has passed to the crypto bros, leaving the established banks and their friends in Washington scrambling to make the barriers to competition even higher. If cryptocurrency is indeed the internet of money, then we are still at quite an early stage of its development. Restrictive regulation in the mid-1990s might have strangled the commercialization of the World Wide Web in its infancy. Restrictive regulation of crypto could turn out to be a very expensive mistake.
I feel in my bones that trying to compete with China to build the best central bank digital currency is a mug’s game. The American way is to let innovation rip. Avichal Garg of Electric Capital is right in thinking that the best strategy to preserve the dominance of the dollar is precisely to encourage the international adoption of dollar-linked stablecoins, rather than to stamp them out. As the internet of money grows, the dollar is well placed to be the preferred global on- and off-ramp, connecting the nascent “metaverse” to the physical world where we still pay our taxes in fiat.
If we have learned nothing else from the past half-century, it is surely that the best way to win a race with totalitarian rivals is not to copy them, but to out-innovate them. Make the wrong decision at this historic turning point, and we shall be interrupting a much bigger bonanza than Nixon did.