Looking at the postings — the last offering was sometime last year — it’s hard to believe that this practice may someday be a linchpin of solutions to a host of contemporary problems, including economic inequality, stagnating living standards, and climate change, just to name a few.
We live in an era where national and international currencies serve as the fundamental medium of all economic activity. The liquidity and freedom of movement associated with these forms of capital have been credited with setting the conditions of the modern industrial world. And for a certain period of time, some of this credit was justified.
Now, however, the evidence suggests that national currencies’ only-game-in-town status has enabled the worst excesses of our plutocracy and environmental irresponsibility. Critics have begun arguing that reviving the practice of local currencies will be a necessary ingredient in curbing these plutocratic excesses and in realizing the decentralizing potential of a host of cutting-edge technological trends, such as blockchain, crowdsourcing, and digital platforms.
The first, and only, time I’ve been to Europe was for an academic conference in Germany a few years ago. I can recall wandering inside the massively ornate cathedrals and thinking that whoever built these churches took their religion seriously. Given the populations of those Medieval towns at the time, at best a few hundred people typically attended services. And to make things even weirder, the cathedrals were often built by the townspeople, not by rich, aristocratic sponsors.
That kind of religious fanaticism puts Jonestown to shame. The piety is impressive, sure, but it seems like overkill.
It turns out this thought has occurred to other people, and it has even produced some interesting scholarship. Scholars have come to consider the period between 1040 and 1290, what’s called the High, or Central, Middle Ages, to be a kind of golden age of the premodern Western world. This was an era of prosperity and leisure for most common people. Community life and public works were dramatically expanded across the continent. It was also the era in which many of the most famous cathedrals were built. In fact, they were intended as the Medieval equivalent of roadside tourist attractions for the era’s many religious pilgrims — and their travel funds.
Scholars have argued that what drove this outpouring of public investment on the part of common people was the far more prominent role local currencies played in this era’s economic system. Most economic activity took place locally, and this gave community members a far better sense of the value of local public investments.
Scholars have especially emphasized the practice of “demurrage,” a kind of negative interest rate scheme that encouraged those holding local currencies to spend them sooner rather than later. Since, unlike in the current era, currency was not understood to be a store of wealth or value when saved, community members sought other, more community-based ways to invest their surplus currency, such as in public works that benefited everyone.
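For readers who want to see the arithmetic, here is a small sketch of how a demurrage fee erodes an idle balance. The 5 percent quarterly rate and the starting amount are illustrative assumptions, not historical figures; the point is simply that holding such a currency was costly, so spending or investing it locally became the rational choice.

```python
# A minimal sketch of how demurrage erodes an idle currency balance.
# The 5% quarterly fee and the starting balance are illustrative
# assumptions, not historical figures.

def apply_demurrage(balance: float, rate_per_period: float, periods: int) -> float:
    """Return the balance left after a per-period demurrage fee is charged."""
    for _ in range(periods):
        balance -= balance * rate_per_period
    return balance

if __name__ == "__main__":
    start = 100.0
    after_two_years = apply_demurrage(start, rate_per_period=0.05, periods=8)
    # Roughly 66.3 of the original 100 remains after two years of hoarding.
    print(f"100.0 held idle for 8 quarters at 5% demurrage -> {after_two_years:.2f}")
```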
During the fourteenth and fifteenth centuries, monarchies and national governments increasingly sought to consolidate power over their territories, and one of their key tactics was enforcing the adoption of national currencies. It’s widely believed that this led to a series of disastrous monetary policies in the years to come, leading into the true “dark ages” of the late Medieval Era.[1] This should come as no surprise. The economic arrangements of the premodern era were largely restricted to local activities. Local currencies were better able to reflect the real economic growth and inflationary influences of the local economy.
But, while national currencies were probably, on balance, a drag on economic performance in the premodern era, they turned out to be highly effective during the modern industrializing era.[2] This is because many of the technological advances of the modern era lent themselves to long-distance economic activity. The industrial era allowed for mass production, mass transportation, and nation-wide economic organization.
Indeed, the twentieth century saw an explosion of revolutionary technologies that, when taken together, stood to profoundly alter the human condition. These included the electricity grid, the light bulb, indoor plumbing, the internal combustion engine, the telephone, air conditioning, major medical advances of all sorts, the interstate highway system and commercial air transportation, industrial agriculture and synthetic fertilizer — indeed, the list could go on for pages.
During this era, worker productivity skyrocketed, as did economic expansion and median income, while living standards, self-reported life satisfaction, and life expectancy all progressively rose. National currencies, especially as they were employed in modern capitalist countries, served several useful purposes during this period. They allowed for a national, and even global, scale of economic exchange, thus offering these powerful technologies the chance to fully exploit their economic potential. At the same time, the high levels of liquidity offered by national currencies and their extensive banking systems allowed capitalist countries to put these technologies to work in a relatively short period of time.
Unfortunately, this extraordinary period of technological development and economic expansion was relatively brief — probably lasting around fifty years, from around 1920 to 1970.[3] Since that time, the performance of developed capitalist economies has been decidedly mixed — arguably even poor, if considered from an ecological perspective. National currencies have not just exacerbated these problems. They are arguably a fundamental driver.
Since the early seventies, living standards and median household income have stagnated for the vast majority of families and workers, while both wealth and income inequality have skyrocketed.[4] Real GDP per capita in the U.S. grew from around $28,500 in 1981 to $51,000 in 2011.[5] Yet the vast majority of households received very little of this newly created wealth. Median household income doubled in real dollars between 1949 and 1973, growing from $25,000 to $50,000, but has since risen only to $61,000, much of this being a result of the larger percentage of women in the workforce.[6]
But while the middle and working classes have been squeezed by stagnating wages and the higher costs of health care, retirement, child care, housing, and college, the rich and upper-middle classes have engaged in an unprecedented explosion of consumer activity. Though, as noted, real GDP per capita in the U.S. nearly doubled over the last 35 years, the percentage of U.S. GDP spent on consumer goods rose from 60% to 69%.[7] Since middle-class and working-class wages have stagnated and been squeezed by costs of living, this means that most of this growth in consumption has come from the upper classes.
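A bit of back-of-the-envelope arithmetic, using the figures cited above, makes the point. The calculation below is a rough illustration, not a precise decomposition of official statistics.

```python
# Back-of-the-envelope arithmetic using the figures cited above.
# Rough illustrative calculation, not a precise statistical decomposition.

gdp_per_capita_1981 = 28_500   # real GDP per capita, 1981
gdp_per_capita_2011 = 51_000   # real GDP per capita, 2011
consumption_share_1981 = 0.60  # consumer spending as a share of GDP
consumption_share_2011 = 0.69

consumption_pc_1981 = gdp_per_capita_1981 * consumption_share_1981  # ~17,100
consumption_pc_2011 = gdp_per_capita_2011 * consumption_share_2011  # ~35,190

growth = consumption_pc_2011 / consumption_pc_1981
print(f"Real consumer spending per capita is roughly {growth:.1f}x higher")  # ~2.1x

# Median household income over a comparable span rose only from about
# $50,000 to $61,000 (about 1.2x), so most of that doubling in consumption
# must have come from households well above the median.
```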
This massive growth in consumption has come at a brutal cost to the environment. Many have hoped that economic growth and resource consumption would eventually be “decoupled,” as the modern economy becomes more efficient. However, the economist Tim Jackson has demonstrated that, as of 2015, there is simply no evidence this is occurring, largely because the evidence offered for decoupling fails to account for the developing countries where much of today’s production actually takes place.[8]
The result has been that global warming appears to be occurring at an ever more accelerated pace.[9] We are already beginning to see the famines, droughts, sea-level rise, and extreme weather events predicted by scientists. If warming eventually reaches the 2°C danger zone, many believe political chaos and the deaths of millions of poor people will follow.
Stephen Cheney, former Marine brigadier general and CEO of the American Security Project, writes, “Climate change is what we in the military call a ‘threat multiplier.’ Its connection to conflict is not linear. Rather, it intensifies and complicates existing security risks, increasing the frequency, scale, and complexity of future missions…. [Its] effects will be particularly destabilizing in already-volatile situations, exacerbating challenges like weak governance, economic inequality, and social tensions — and producing truly toxic conflicts.”[10]
Can we really blame something as banal as a monetary system for all this? First, consider the circumstances that allowed national currencies to be so beneficial. Revolutionary technologies, such as those the economist Robert Gordon highlights in the early- to mid-twentieth century, tend to create enormous societal disruption, dramatically impacting social hierarchies and wealth distributions.
Especially in that unprecedented era of technological development and application, innumerable new jobs, professions, skills, and opportunities for entrepreneurship and social mobility became available. The wealth created during this era was widely distributed, and workers in ordinary professions, such as teachers, manufacturing-plant foremen, and skilled technicians, earned upper-middle-class incomes.[11]
In these circumstances, the liquidity and nation-wide convertibility of a national currency like the dollar allowed this social disruption and organic redistribution of wealth to take place, probably even abetting and intensifying it. But as the pace of technological change has slowed since 1970,[12] we’ve seen a reversion to the norm of human history, as the richest twenty percent have captured virtually all the wealth created in the period since.[13]
This has been done largely through a variety of “rent-seeking” activities, that is, economic activities that are not productive — they do not contribute to economic growth — but are instead geared toward capturing an ever-larger share of the economic pie. These include exploiting stock options; the ability of high-level executives to effectively set their own (gigantic) salaries; exploiting negative externalities; union-busting; influencing the design of tax policy, regulation, worker protections, and the minimum wage; holding down wages while inflation persists, working hours rise, and benefits decrease; charging unnecessarily high interest rates; imposing transaction fees; and a host of other activities that serve no positive economic purpose.[14]
Of course, a variety of reforms have been proposed to address these types of abuses of economic power. They certainly might be worth a try. Nevertheless, there is good reason to be skeptical that these reforms will ultimately be effective. Piketty has demonstrated, perhaps better than anyone else, just how anomalous the twentieth-century era of economic life truly was, and, moreover, that throughout the rest of human history inequalities of wealth have tended to compound and expand as the possessors of wealth constantly seek out ways to leverage their assets for control of their society’s politics and economic life.[15]
In fact, history offers no real examples of cases where redistributive policy was the driving force in creating widespread economic equality. While twentieth-century progressive policies have long been the go-to example, Gordon has probably rendered that example untenable by showing that it was the explosion of revolutionary technological innovation, not progressive policy, that drove economic equality in the twentieth century.
Consider just how effective the modern financial sector has become in positioning itself to capture, rather than produce, wealth. It uses sophisticated technology and access to high-level expertise to purchase equity in future investments that most everyday people will never be aware of. It uses its massive resource base to buy access to politicians, indirectly control public policy and legislation, and influence public opinion. Finally, big banks, rather than the Federal Reserve, are responsible for the vast majority of money creation, thanks to their capacity to hold only a small percentage of their deposits in reserve. It’s hardly surprising, then, that Michael Hudson has come to consider our current economy hardly a capitalist economy at all, but rather a system of central planning by bankers.
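To see how that money creation works, consider the standard textbook picture of fractional-reserve lending, sketched below. The 10 percent reserve ratio and the $1,000 initial deposit are illustrative assumptions; actual reserve requirements and bank behavior are considerably messier.

```python
# A simplified textbook sketch of fractional-reserve money creation.
# The 10% reserve ratio and $1,000 initial deposit are illustrative
# assumptions; real reserve rules and bank behavior are messier.

def total_deposits_created(initial_deposit: float, reserve_ratio: float,
                           rounds: int = 100) -> float:
    """Simulate repeated cycles in which everything above reserves is lent
    out and then redeposited elsewhere in the banking system."""
    total = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # the lent-out portion is redeposited
    return total

if __name__ == "__main__":
    created = total_deposits_created(1_000.0, reserve_ratio=0.10)
    # Approaches $10,000, i.e. the initial deposit times 1 / reserve_ratio.
    print(f"${created:,.0f}")
```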
How much of an effect does this have on everyday communities? Consider the famous “Walmart effect.” Many small communities welcome big-box stores, thinking they will create jobs and spur local investment. Yet just the opposite occurs. Several studies have shown that locally based businesses spend as much as fifty-four percent of their revenue in the local economy (on goods, professional services, wages, benefits, etc.), while big-box stores spend around fourteen percent, virtually all of it on (uncompetitively low) wages.[16]
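A rough sketch of how those recirculation percentages compound may help. The dollar amount and the number of spending rounds below are illustrative assumptions, but the gap between the two multipliers is the essence of the Walmart effect.

```python
# A rough sketch of how the recirculation percentages cited above compound
# into very different local economic impacts. The dollar amount and the
# number of spending rounds are illustrative assumptions.

def local_impact(dollars: float, recirculation_rate: float, rounds: int = 5) -> float:
    """Total follow-on local spending as each round recirculates a share locally."""
    total = 0.0
    spent = dollars
    for _ in range(rounds):
        spent *= recirculation_rate  # the share that stays in the local economy
        total += spent
    return total

if __name__ == "__main__":
    revenue = 100.0
    print(f"Local business: ${local_impact(revenue, 0.54):.2f}")  # ~$112
    print(f"Big-box store:  ${local_impact(revenue, 0.14):.2f}")  # ~$16
```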
It’s worth considering whether a more creative and experimental approach would be more effective at curbing the immense power the financial sector has achieved over the last several decades. Because national currencies are effectively the only game in town, big banks and multinational corporations are always in position to control capital flows and liquidity, and consequently, the future of the national economy.
All this is interesting, you might say, but is this idea of shifting toward local currencies at all realistic?
Obviously, I can’t answer that question definitively. I’m not exactly advocating for local currencies here, but I am suggesting that there is a good case to be made that they deserve to be explored and experimented with. However, let me suggest a few reasons why I think their plausibility should be taken seriously.
The first is that the performance of national currencies, at least from a longer-term historical perspective, has been vastly overrated. If you set aside their performance during, say, the last sixty years of the twentieth century, national currencies have had, at best, a mixed record. As often as not they were sources of economic mismanagement, right up through the Great Depression.
You might make a case for the last couple of decades, but I would argue that their exacerbation of economic inequality, and their poisonous impact on political and social life and the environment, effectively wipes out the positive impact they’ve had on economic development — which has mainly been wasted on hyper-consumerism and overconsumption anyway.
The second reason is that national-scale economic systems tend to rely heavily on unsustainable levels of resource consumption — especially of hydrocarbon fuels. It’s worth wondering whether there might be something fundamentally unsustainable in the very structure of the global economy. Yes, it’s possible we could have a green revolution within the existing structure of the economy, but I have my doubts that it can be done in time to save the Earth’s climate from catastrophe.
Finally, local currencies are much more in line with current technological trends toward greater decentralization and dematerialization. If you think about the kinds of revolutionary technologies Gordon points to (the automobile, electric light, industrial agriculture, etc.), these were, inherently, the kinds of technologies that demanded investment on a national, and even global, scale, particularly given the existing modes of production. In that technological context, and employing those modes of mass production, national currencies just made good economic sense.
However, many emerging technologies are directing us toward ever more decentralized modes of production and exchange. Of course, many industries (biotech, transportation, utilities, etc.) still require the kind of massive outlays of investment that can only be achieved through large-scale banking and federal government spending, which is why we won’t want to eliminate national currencies completely. Nevertheless, smartphones, the Internet, platform technologies, crowdsourcing, and blockchain are just a few of the trends leading toward more decentralized economic activity.
And these are only the organizationally oriented technologies. Juliet Schor, Andrew McAfee, and Erik Brynjolfsson have all recently argued that numerous technological trends are leading toward more decentralized local production. These include advanced analytics, robotics, drone technology, artificial intelligence, 3D printers, food computers, and vertical farming. This is on top of the general trend toward dematerialized consumption (e.g., email instead of regular mail, iTunes instead of CDs), which requires less large-scale economic organization.
Blockchain technology, in particular, holds enormous potential for increasing the viability and utility of local currencies. National currencies are preferred to local currencies because they are thought to be more stable and reliable, and thus more likely to maintain their value when saved. At the same time, they offer greater economic flexibility, allowing the holder to spend their value virtually anywhere. Cryptocurrencies, because of their objective, decentralized, and transparent architecture, have in many ways solved the trust problem for local currencies.
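To make the trust point concrete, here is a minimal sketch of the kind of transparent, append-only ledger a local cryptocurrency could run on. Everything in it (the class name, the fields, the absence of signatures and consensus) is an illustrative assumption rather than a description of how networks like Colu’s or Bancor’s actually work; it only shows why a shared, tamper-evident record can stand in for trust in a central issuer.

```python
# A minimal, hypothetical sketch of a hash-chained community ledger.
# Real local-cryptocurrency systems add signatures, consensus, and much more.

import hashlib
import json
import time

class CommunityLedger:
    """An append-only, hash-chained record of token transfers."""

    def __init__(self):
        self.chain = []
        self._append({"type": "genesis"})  # first entry anchors the chain

    def _append(self, payload):
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        block = {"payload": payload, "prev_hash": prev_hash, "time": time.time()}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        self.chain.append(block)

    def transfer(self, sender, receiver, amount):
        """Record a token transfer that every participant can later audit."""
        self._append({"type": "transfer", "from": sender,
                      "to": receiver, "amount": amount})

    def verify(self):
        """Recompute each block's hash to confirm nothing has been altered."""
        for prev, curr in zip(self.chain, self.chain[1:]):
            body = {k: curr[k] for k in ("payload", "prev_hash", "time")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if curr["hash"] != expected or curr["prev_hash"] != prev["hash"]:
                return False
        return True

ledger = CommunityLedger()
ledger.transfer("bakery", "farmer", 25.0)
print(ledger.verify())  # True: the shared record itself supplies the trust
```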
And we are already beginning to see the realization of this potential. The summer of 2018 has seen a jump in local cryptocurrency initiatives. The city of Berkeley, California, announced it would begin to experiment with municipal bonds based on blockchain technology. Elsewhere, Israeli-based cryptocurrency network Colu established their first networks of local cryptocurrencies in London, Liverpool, Tel Aviv, and Haifa. And finally, Bancor announced it would partner with the nonprofit Grassroots Economics to establish a network of local cryptocurrencies in Kenya.
These currencies all operate on the complementary currency model, that is, they are intended to allow local actors to conduct transactions and collaborate economically when national currencies are scarce, thus helping spur sustainable economic development.
Thoughtful people, and even a number of scholars, are working to think through the possibilities local currencies hold for the coming era, especially in the wake of the cryptocurrency revolution. One thing both supporters of local currencies and of cryptocurrencies probably need to reconcile themselves to is the fact that, regardless of their decentralizing ideals, politics and economics will probably never be decoupled.
Local currencies have always had the most success when they are supported and encouraged by governing entities. As noted earlier, local currencies tend mainly to function as an alternative economic system when the national economy is in trouble. But this is largely because national governments have traditionally been so hostile to local currencies.
On the other hand, when state and local governments lend support and encouragement, local currencies tend to flourish, opening new avenues of community engagement and local economic growth. In recent decades Japan has conducted a nation-wide experiment with an enormous variety of local currency activities, involving hundreds of different initiatives.[17] The scale of this experimentation came about as a direct result of the support and encouragement it received from state and local officials.
At the same time, applying blockchain to local currency may well be a more appropriate paradigm than the radically libertarian one blockchain enthusiasts have traditionally embraced. Arguably, the fatal flaw that led to fiascos such as the failure of the Ethereum-based DAO and China’s near-takeover of Bitcoin was this essentially anarchist approach.
The libertarian approach sees economic life as simply a matter of isolated individuals seeking to maximize their utility. But the reality of economic life is that it takes place in concrete communities, with goals and agendas that transcend individual self-interest. Isolating individuals, as modern political-economic trends have increasingly sought to do, only makes them vulnerable to organized conspiracies, such as rent-seeking on the part of powerful financial actors, or the leveraging of the resources of authoritarian regimes.
There is certainly still a place for radical decentralization in economic life, but it will probably never escape the need to engage politics, at the very least, at the local level. After all, local communities have a variety of options for incentivizing and encouraging greater participation in local currencies, such as negative interest rates, tax incentives, and zoning policies.
For better or worse, national currencies will likely play a role in modern economic life for the foreseeable future. For a relatively short period of time, they had an essential role in establishing the modern world. Their usefulness, however, is no longer what it used to be. It might be time to consider whether there are better options available.
[1] Stephen Belgin & Bernard Lietaer, New Money for a New World [Electronic Edition], (Boulder, CO: Qiterra Press, 2011), ch. 6, “Back to the Future.”
[2] Robert J. Gordon, The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War, (Princeton: Princeton University Press, 2016).
[3] Ibid., 2ff.
[4] Martin Ford, Rise of the Robots: Technology and the Threat of a Jobless Future, (New York: Basic Books, 2015), 35; Tyler Cowen, The Great Stagnation: How America Ate All the Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better, (New York: Penguin, 2011), 15. For information on inequality, see Thomas Piketty, Capital in the Twenty-First Century, A. Goldhammer (trans.), (Cambridge, MA: Harvard University Press, 2014); Matthew O’Brien, “How Economic Growth (and the 1%) Left the Middle Class Behind,” The Atlantic, 10 May 2012 (accessed 5 Oct. 2018): https://www.theatlantic.com/business/archive/2012/05/how-economic-growth-and-the-1-left-the-middle-class-behind/256998/
[5] Federal Reserve Bank of St. Louis. Source: U.S. Bureau of Economic Analysis. (Accessed: 15 July 2016): https://fred.stlouisfed.org/graph/?g=hh3.
[6] Ford (2015), 35; Cowen (2011), 15.
[7] Federal Reserve Bank of St. Louis. Source: U.S. Bureau of Economic Analysis. (Accessed: 15 July 2016): https://fred.stlouisfed.org/graph/?g=hh3.
[8] Tim Jackson, Prosperity without Growth: Economics for a Finite Planet [Electronic Edition] (New York: Earthscan, 2011), ch. 5, “The Myth of Decoupling.”
[9] Naomi Klein, This Changes Everything: Capitalism vs. The Climate, (New York: Simon & Schuster, 2014).
[10] Stephen Cheney, “Trump’s Choice on Climate Change,” Project Syndicate, 12 Dec. 2016 (accessed: 16 Jan. 2017): https://www.project-syndicate.org/commentary/trump-climate-change-security-risk-by-stephen-cheney-2016-12
[11] Piketty (2014), 276–279.
[12] See Gordon (2016) and Cowen (2011).
[13] Joseph Stiglitz, The Price of Inequality: How Today’s Divided Society Endangers Our Future [Electronic Edition] (New York: W. W. Norton & Co., 2013), ch. 1, “America’s 1 Percent Problem.” Also see Piketty (2014).
[14] Stiglitz (2013), ch. 2, “Rent-Seeking and the Making of An Unequal Society.”
[15] Piketty (2014).
[16] Belgin & Lietaer (2011), ch. 13, “Sustainable Development.”
[17] Belgin & Lietaer (2011), ch. 15, “Social-Purpose Currencies.”