Reconsidering the Gold Standard

The gold standard is a monetary system in which a currency's value is tied directly to a fixed quantity of gold. Under such an arrangement, a nation can issue currency only up to the value of its gold reserves. Internationally, exchange rates between countries on the gold standard remain fixed.
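Why exchange rates stay fixed follows from simple arithmetic: once each country sets its own gold price ("mint parity"), the ratio of those prices determines the exchange rate. A rough sketch, using the approximate pre-1914 mint parities of the United States (about $20.67 per troy ounce) and Britain (about £4.25 per troy ounce):

```python
# Under a gold standard, the exchange rate between two currencies is
# implied by each country's fixed gold price (mint parity).
# Approximate historical pre-1914 figures:
USD_PER_OZ = 20.67   # U.S. mint price of gold, dollars per troy ounce
GBP_PER_OZ = 4.2477  # British mint price, pounds per troy ounce

# Implied fixed exchange rate: dollars per pound
usd_per_gbp = USD_PER_OZ / GBP_PER_OZ
print(f"${usd_per_gbp:.4f} per pound")  # close to the historical $4.8665 parity
```

As long as both countries redeem their currency in gold at these prices, any market rate far from this ratio would let traders profit by shipping gold, pulling the rate back to parity.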

Gold has historically been treasured for its lustrous beauty, durability, and malleability. This remarkable metal helped shape the course of history, triggering wars, inspiring timeless works of art, and launching voyages of exploration. Originally used in jewelry and the decorative arts, gold became the standard medium of exchange for international trade, and gold coins began to circulate around 700 B.C.

By the 16th century, paper currency had been introduced in Europe. These notes were backed by precious metals, usually some combination of silver and gold. In 1821, Britain became the first country to formally adopt the gold standard. An international gold standard emerged during the 1870s as Germany, France, the United States, and other powers followed Britain’s lead. For roughly the next four decades, the global economy flourished. According to economist and professor Michael D. Bordo, “It was also a period of unprecedented economic growth with relatively free trade in goods, labor, and capital… Between 1880 and 1914, the period when the United States was on the ‘classical gold standard,’ inflation averaged only 0.1 percent per year.” (The Concise Encyclopedia of Economics, “Gold Standard”)

Congressman and former presidential candidate Ron Paul stated this another way: “Not only did gold facilitate exchange of goods and services, it served as a store of value for those who wanted to save for a rainy day… When gold was used, and the rules protected honest commerce, productive nations thrived.” (Representative Ron Paul of Texas, speaking before the U.S. House of Representatives, February 15, 2006: “The End of Dollar Hegemony”)

Many countries suspended the gold standard at the outbreak of World War I. The conflict disrupted international trade and capital flows, and combatants needed greater financial flexibility to fund their war efforts. A number of nations resumed the gold standard after the war, only to abandon it again during the Great Depression. In 1934, President Franklin D. Roosevelt signed the Gold Reserve Act, which nationalized America’s non-jewelry gold supply and placed it in the hands of the U.S. Treasury. The measure, following the 1933 ban on private gold ownership, effectively ended the gold standard for American citizens.

The Bretton Woods agreement, negotiated in 1944, established a modified gold standard under which foreign governments could convert U.S. dollars to gold at a fixed rate of $35 per ounce. The Nixon administration ended this policy in 1971, and no major country has used the gold standard since. Today’s U.S. dollar is fiat money: a medium of exchange with no intrinsic value, backed by nothing other than the creditworthiness of the United States.

Amid deep uncertainty over federal deficit spending and the weakening dollar, many are starting to rethink the role of gold in our monetary system. In a 2008 briefing for the Cato Institute, Prof. Lawrence H. White described the gold standard as “a policy option that deserves serious consideration.” Advocates point to several advantages. Governments would be constrained in their ability to expand the money supply, an automatic check on deficit spending. Domestic price levels would remain steady, and currency exchange rates would be fixed rather than floating, lending stability to international trade. And because gold is widely considered a secure asset, a return to the gold standard would restore badly needed confidence in currencies and monetary systems.
