Milton Friedman (1951, 1960) provided influential back-of-the-envelope estimates of the resource costs of extracting gold under what he called a “strict” gold standard. I have criticized those estimates elsewhere (White 1999, pp. 42-48) for exaggerating the volume of gold reserves used by actual gold-standard economies, and thus exaggerating the resource cost. Friedman’s estimates assumed a 100 percent gold reserve ratio against demand deposits (1951) or against all bank deposits (1960), whereas sophisticated banking systems in gold-standard economies historically operated on small prudential reserve ratios (as he elsewhere recognized). Plugging in a historically observed 2 percent reserve ratio against all the bank liabilities in the broad monetary aggregate M2, rather than Friedman’s 100 percent, yields a resource cost estimate one-fiftieth of his 1960 figure, namely 0.05 percent rather than 2.5 percent of national income.
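The arithmetic behind the revised figure can be sketched as follows. The numbers are those cited above; the assumption that the resource cost scales linearly with the gold reserve ratio is an assumption of this illustration, not a claim from Friedman’s original calculation:

```python
# Friedman's 1960 estimate: a 100 percent gold reserve ratio against
# all bank deposits, implying a resource cost of 2.5% of national income.
friedman_reserve_ratio = 1.00
friedman_cost_share = 0.025

# Historically observed prudential reserve ratio against M2 liabilities
# (White 1999): about 2 percent.
observed_reserve_ratio = 0.02

# Sketch assumption: resource cost scales linearly with the reserve ratio.
revised_cost_share = friedman_cost_share * (observed_reserve_ratio / friedman_reserve_ratio)

print(f"Revised cost: {revised_cost_share:.2%} of national income")        # 0.05%
print(f"Fraction of Friedman's figure: 1/{friedman_cost_share / revised_cost_share:.0f}")  # 1/50
```

Linear scaling is the natural reading of the back-of-the-envelope method: if the cost of a gold standard is proportional to the stock of monetary gold that must be held and replenished, cutting the required reserve ratio from 100 percent to 2 percent cuts the cost estimate by the same factor of fifty.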
President Nixon closed the gold window in 1971, ushering in our present era of fiat money. As inflation rose and became more variable, people began acquiring gold and silver coins and bullion as hedges against fiat money. Friedman, to his credit, was alert to the resource costs of the gold and silver acquisition that resulted. In a note on “The Resource Costs of Irredeemable Paper Money” he observed (Friedman 1986, p. 644):
Real resources are employed … in the production of the gold and silver absorbed into the hoards accumulated by [individuals] who have come to regard gold or silver as a prudent component of their asset portfolios. Since the end of Bretton Woods, even the direct resource cost of the gold and silver accumulated in private hoards may have been as great as or greater than it would have been under an effective gold standard. That depends on whether gold production since 1971 has been greater or less than it would have been under an effective gold standard—a promising research topic whose conclusion is by no means obvious from casual empiricism.