Second in a series of articles on the international monetary regime, reprinted from the NY Sun.
Not sure I would agree with all of this. Net exports are different from manufacturing exports and manufacturing employment, especially in the global information economy. I believe the problem here is that reserve-currency status allows the US central bank to issue too many US$ credit liabilities without paying the direct consequence. Our trading partners are not exactly happy about this either, since they surrender control of their currencies to the dominance of the US Federal Reserve and US politics. I think we need to rein in political discretion over the value of money.
Journalism thrives on simple narratives and round numbers. So I must note that what President Nixon ended 50 years ago was not the international gold standard, which persisted despite interruptions for more than two millennia to 1914, but its complicated parody: the gold-exchange standard, established 99, not 50, years ago by a 1922 agreement at Genoa.
Prime Minister David Lloyd George convened the Genoa Conference in an effort to restore the economies of Central and Eastern Europe, modify the schedule of German reparations owed to France, and begin the re-integration of Soviet Russia into the European economy. Lacking any American support, the conference was a failure on all those counts.
The gold-exchange standard, John Maynard Keynes's idea, was Genoa's one tangible result. Keynes had proposed in 1913 that the monetary system of British colonial India be adopted worldwide. The British pound would remain convertible into gold, but India's and other countries' domestic payments would be backed by ostensibly gold-convertible claims on London. Following Genoa, the pound could be exchanged for gold, and other national currencies could be exchanged for pounds.
But there was a complication: unlike most currencies, the Indian rupee actually was based on silver, not gold, and British officials, including Keynes, overvalued the silver rupee, hoping to reduce heavy demands for British gold. British monetary experts inserted this scheme (without the silver wrinkle) in the 1922 Genoa accord, incidentally forestalling impecunious Britain's repayment of its World War I debts in gold.
While working 35 years ago for Congressman Jack Kemp, I first coined the term "the reserve-currency curse." I was tutored in the subject by Lewis E. Lehrman, who in turn was influenced by the French economist Jacques Rueff (1896-1978). Keynes had claimed that what matters is only the value, not the kind, of monetary reserves. It was Rueff who countered in 1932 that foreign exchange is qualitatively different from an equal value of precious metal.
With the creation of, say, dollar reserves, purchasing power has simply been duplicated, and thus the American market is in a position to buy in Europe, and in the United States, at the same time. This credit duplication causes prices to rise faster in the reserve-currency country than in its trading partners, precipitating the reserve-currency country's deindustrialization. That fate soon befell Great Britain, and then the United States after the dollar replaced the pound under the 1944 Bretton Woods agreement.
Other countries' backing their currencies with dollar-denominated securities led to a dilemma for America. The United States is the only major country with negative net monetary reserves (foreign official assets minus foreign official liabilities). All others, even those whose currencies are used by foreign central banks, have positive net reserves; that is, those countries' foreign official assets exceed their foreign official liabilities.
There is a correlation of more than 90% between America's net reserves and its manufacturing employment. American net reserves had been positive before but turned negative by 1960, and manufacturing jobs have since disappeared in direct proportion to the decline in our net reserves. Focusing on one bilateral trade balance or another (say, the US and China) is a mug's game. What matters is the total balance, not bilateral subsets.
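For readers who want to see what a "correlation of more than 90%" means concretely, here is a minimal sketch of computing a Pearson correlation coefficient between two series. The figures below are invented for illustration only; they are not the actual historical net-reserve or employment data the article refers to, merely numbers shaped to mimic the claimed pattern (reserves turning increasingly negative while manufacturing jobs decline).

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical series (NOT real data): net reserves in $bn,
# manufacturing jobs in millions, both drifting down together.
net_reserves = [10, 0, -50, -200, -800, -2000, -4000]
mfg_jobs_mm = [17, 17, 16.5, 16, 14.5, 13, 12]

r = pearson_r(net_reserves, mfg_jobs_mm)
print(round(r, 3))  # a value above 0.9 for these invented series
```

A correlation near +1 means the two series move down (or up) together almost in lockstep, which is the article's claim; correlation alone, of course, does not establish which way the causation runs.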
How could an American president reverse the reserve-currency curse? By making honesty the best policy: negotiating and starting repayment of all outstanding dollar reserves over several decades. Since international payments must ultimately be settled in real goods, not IOUs, the necessary production of American goods for export is the surest way to revive America's manufacturing employment.
To increase our manufacturing jobs back to the peak of 17 million from today's 12 million, it would be necessary to repay most outstanding official dollar reserves. If President Biden is as ineffectual as most of his recent predecessors in responding to the reserve-currency curse, he, too, will have to get used to the title ex-President.
________
Mr. Mueller is the Lehrman Institute Fellow in Economics at the Ethics and Public Policy Center in Washington, D.C., and author of Redeeming Economics. Image: Conferees at the Genoa Conference, with Prime Minister Lloyd George of Britain front and center. Detail of a British Government photo, via Wikimedia Commons.
This looks to be an excellent series of articles concerning the most important policy issue of the past 50 years. The global monetary regime that uses the US$ as the reserve currency and gives the world’s central banks discretion and control over the supply of fiat currency drives current global events, for better and worse. The effects range from economic crises and financial meltdowns to inequality, political conflict, and environmental degradation. Given the importance of money, I print the following article from the NY Sun in full…
God and Money: ‘A Perfect and Just Measure Shalt Thou Have’
The following begins a new series of columns marking the 50th anniversary of the collapse of the Bretton Woods gold exchange standard established in the closing months of World War II. A related editorial appears nearby.
* * *
The 50th anniversary of the collapse, on August 15, 1971, of the Bretton Woods monetary system is a milestone in the history of money. It should provide an occasion for thoughtful discussion focused on the road to reform, our priceless constitutional foundation, and the restoration of honest money.
Let us avoid an academic food fight among economists over prior international monetary systems. We should not be arguing about the classical gold standard versus the Bretton Woods pegged exchange-rate system, as these are just variations on the more significant theme of gold convertibility and the role of government in regulating money.
We can't even usefully revert to debating the old fixed-versus-flexible arguments that were part of Milton Friedman's justification for freely floating rates in the 1960s; the theoretical models for both positions have been mugged by reality.
Instead, we should be talking about money itself: what its basic purpose is, how it relates to productive economic growth, and whether today's dysfunctional international monetary regime deserves to be designated any kind of system at all.
As the former chief of the International Monetary Fund, Jacques de Larosiere, noted at a conference in February 2014 at Vienna, today's central bank-dominated monetary arrangements foster "volatility, persistent imbalances, disorderly capital movements, currency misalignments."
These, he warned, were all major factors in the explosion of credit and leverage that precipitated the 2008 global financial crisis. Such an unanchored approach, he said, amounts not merely to a non-system but to something considerably worse: an anti-system.
It is time to think creatively about money. We need to remind ourselves what it means as a measure, how it facilitates voluntary commerce and opportunity, and how it can lead to greater shared prosperity while remaining compatible with liberty, individualism, and free enterprise. We're at a moment when everything is on the table, for the wisdom of central-bank mechanisms for conducting monetary policy is being called into question just as private alternative monies are making ever more credible bids for legitimacy.
Looking back and looking ahead, we can see that the most relevant and stimulating views emphasize the importance of productive economic activity and an open global marketplace. Money’s crucial role is to provide clear price signals to optimize the rewards of entrepreneurial endeavor and increased human knowledge.
Adam Smith wrote his treatise The Wealth of Nations during an age when nations forged a global monetary system by defining their currencies in terms of precise weights of gold and silver. A level monetary playing field, arising from a system inherently disciplined by forces outside the control of government, wherein the economic decisions of private individuals are not held hostage to the ambitions of politicians, served profoundly liberal goals such as the rule of law, private property, and the equal protection of human rights.
Modern-day visionaries likewise focus on the integrity of market signals conveyed through money. When Elon Musk says, "I think about money as an information system," he goes to the heart of money's unit-of-account function and underscores the importance of price-signal clarity. When he tweets that "goods and services are the real economy; any form of money is simply the accounting thereof," he illuminates the same reasoning that caused our constitutional Framers to include the power to coin money and regulate the value of American money, and of foreign coin, in the same sentence of our Constitution that grants Congress the power to fix our standard of weights and measures.
Money is meant to be a reliable measure, a meaningful unit of account, and a dependable store of value. When those qualities are undermined, especially by government for purposes of redirecting economic outcomes at the risk of global financial instability, the dynamism and productive potential of free-market forces are diminished.
Political arguments in favor of maintaining government control over the issuance of money tend to invoke short-term objectives couched in words such as stimulus and the need for central bank support for an economy. Such calls are met with somber warnings about long-term unsustainability from the monetary authorities who nevertheless indulge them.
But thou shalt have a perfect and just weight, a perfect and just measure shalt thou have, goes the passage from the Book of Deuteronomy (25:15), that thy days may be lengthened in the land which the LORD thy God giveth thee. The biblical injunction against dishonest measures can be interpreted as alluding to sustainability not only in economic terms but also in the moral realm.
As noted by Robert Bartley, editor of the editorial page of The Wall Street Journal for more than 30 years, economist Robert Mundell was correct in his assessment that "the only closed economy is the world economy." It's time to start building an ethical international monetary system.
________
Judy Shelton, an economist, is a senior fellow at the Independent Institute and author of Money Meltdown. Image: The conference room at the Mount Washington Hotel, Bretton Woods, New Hampshire, where, in 1944, the Bretton Woods Treaty was crafted. Via Wikimedia Commons.
This incredible article is by Thomas Frank, a respected journalist and author (What’s the Matter With Kansas? – an exercise in urban political myopia) who is a well-educated and well-read member of the liberal urban media. Here’s an excerpt of his political touchstones:
Like everyone else I know, I spent the pandemic doing as I was told. A few months ago I even tried to talk a Fox News viewer out of believing in the lab-leak theory of Covid’s origins. The reason I did that is because the newspapers I read and the TV shows I watched had assured me on many occasions that the lab-leak theory wasn’t true, that it was a racist conspiracy theory, that only deluded Trumpists believed it, that it got infinite pants-on-fire ratings from the fact-checkers, and because (despite all my cynicism) I am the sort who has always trusted the mainstream news media.
[Ah, yes, it’s Trump’s fault. LOL.]
If an individual whose entire career and livelihood is wrapped up in "getting it right" is so easily misled by our dominant media sources, what hope is there for the rest of us who have better things to do? Now he's wondering if he's gotten it all wrong, and about the larger consequences.
This is the problem with the urban corporate media, which started to seriously degenerate after the 2000 election. But we have also seen how the decline started long before, as alternative media such as cable news, talk radio, and the Internet presented an existential financial challenge to traditional media outlets, especially print newspapers and broadcast news.
Mr. Frank and his colleagues in corporate media (NYT, WaPo, LAT, Fox) need to undergo a serious bit of soul searching to discover if they have a role as the Fourth Estate in our information economy, or if they should just go pursue a career in real estate somewhere. Journalists today have to understand that nobody is going to hero-worship them as the modern-day Woodward and Bernstein. Honest journalism and reputational capital are their own reward and can actually be lucrative on platforms like Substack.
So here is what Frank has discovered:
Lab leaks happen. They aren’t the result of conspiracies: “a lab accident is an accident,” as Nathan Robinson points out; they happen all the time, in this country and in others, and people die from them.
There is evidence that the lab in question, which studies bat coronaviruses, may have been conducting what is called “gain of function” research, a dangerous innovation in which diseases are deliberately made more virulent. By the way, right-wingers didn’t dream up “gain of function”: all the cool virologists have been doing it (in this country and in others) even as the squares have been warning against it for years.
There are strong hints that some of the bat-virus research at the Wuhan lab was funded in part by the American national-medical establishment — which is to say, the lab-leak hypothesis doesn’t implicate China alone.
There seem to have been astonishing conflicts of interest among the people assigned to get to the bottom of it all, and (as we know from Enron and the housing bubble) conflicts of interest are always what trip up the well-credentialed professionals whom liberals insist we must all heed, honor, and obey.
The news media, in its zealous policing of the boundaries of the permissible, insisted that Russiagate was ever so true but that the lab-leak hypothesis was false false false, and woe unto anyone who dared disagree. Reporters gulped down whatever line was most flattering to the experts they were quoting and then insisted that it was 100% right and absolutely incontrovertible — that anything else was only unhinged Trumpist folly, that democracy dies when unbelievers get to speak, and so on.
The social media monopolies actually censored posts about the lab-leak hypothesis. Of course they did! Because we're at war with misinformation, you know, and people need to be brought back to the true and correct faith — as agreed upon by experts.
With this we get Mr. Frank’s revelation:
If it does indeed turn out that the lab-leak hypothesis is the right explanation for how it began — that the common people of the world have been forced into a real-life lab experiment, at tremendous cost — there is a moral earthquake on the way.
Because if the hypothesis is right, it will soon start to dawn on people that our mistake was not insufficient reverence for scientists, or inadequate respect for expertise, or not enough censorship on Facebook. It was a failure to think critically about all of the above, to understand that there is no such thing as absolute expertise.
Yeah, no kidding. And that’s a bad thing? It’s doubly ironic that most of the voices haranguing us to “follow the science” were really constraining true science. Critical thinking is merely what real scientists have been telling us all along, as opposed to those succumbing to “political” science. There are no absolutes in science, only skepticism and hypothesis testing – this applies to the pandemic as well as climate change and systemic racism and Modern Monetary Theory. And mea culpas won’t save journalists from the anvils of “I told you so’s” that will rain down upon their heads.
I've been watching a bit of Amazon's original TV miniseries The Underground Railroad, because I always enjoy learning something new and interesting from historical narratives. Just today I read this article on The Conversation, which offers a nice review of the motivations and intentions of the writers and director. It also provoked some thoughts I'll share here.
I was struck by the following quotes about the director’s intention to present “slaves not as objects who were acted upon, but as individuals who maintained identities and agency – however limited – despite their status as property.”
The reviewer goes on to say,
In the past three decades there has been a movement among academics to find suitable terms to replace “slave” and “slavery.”
In the 1990s, a group of scholars asserted that “slave” was too limited a term – to label someone a “slave,” the argument went, emphasized the “thinghood” of all those held in slavery, rendering personal attributes apart from being owned invisible.
This makes perfect sense and should seem obvious. However, I believe the misuse or overuse of the label “slavery” has happened through associating it solely with the African/American experience, whereas enslavement has been inflicted upon many individuals and peoples across the world and across history. For sure, this docudrama is a narrative of the experience of black slaves on the North American continent, but its universalism should not be lost in that singular application.
I have emphasized the ideas of personal “identities and agency” in bold text above because this is really what applies to all people regardless of race or ethnicity. It also struck me that the appropriate term we are looking for is “The Unfree.” Every individual and oppressed group can relate to the idea of being unfree, if not enslaved. When you are unfree, you are deprived of free choice, free will, free agency, and the outward self-dignity imbued in that truest sense of human freedom. Historically and currently this condition is usually the result of a gross imbalance of power and a certain pathology of those who impose their unequal power over others. The history of the unfree applies to the ancient story of Spartacus, as well as any employee today preyed upon by an unreasonable boss.
This status of the unfree also highlights the fundamental condition of human identity, which is freedom. Freedom is what delineates our identities and personal agency in our lives, and it is sufficient in itself. In recent decades this truth has been twisted a bit to suggest that our chosen identities establish and signal our freedom, when actually it is only our freedom that helps guarantee the free and open expression of our identities. For example, one can assert one's identity as "non-binary," and the freedom of self-expression under the law defends the right to whatever that might be. But one cannot force others to use one's preferred pronoun; that is not within the power of the state or any other entity without violating the basic tenets of freedom.
This is important because politics can intervene with laws and enforcement to guarantee our freedoms, but it cannot define or defend our personal identities or our self-dignity. As The Underground Railroad narrative demonstrates, slavery could not deprive the unfree slaves of their identities and their self-dignity, unless the individual allowed it. The oppressors can take away physical freedom, humiliate, and even impose a death sentence, but they cannot take away the freedom to think freely and the self-dignity of the oppressed. We witness these truths again and again in the stories of Holocaust and Gulag survivors.
It is also interesting to note that ideologically the primacy of freedom as a value tends to delineate today's liberals and conservatives, as noted by Jonathan Haidt in his studies of political identity. Liberty is the primary moral value to those who identify on the right, while fairness and human caring are the dominant values asserted by many on the left. Leftists might argue that one cannot be free in an unfair society, but that only means we have to focus on freedom as a precondition to fairness. The issue of slavery and the unfree, in universal world history as well as African American history, should enlighten us to the primary ordering of moral values: one cannot have fairness without the precondition of freedom, and without the precondition of freedom, fairness has no meaningful relation to our concepts of justice. (Unfortunately, this only hints at another discussion on the differences between fairness and justice, and the unnecessary qualifiers applied to the universal singular idea of moral justice.)
Lastly, this rich portrayal of the unfree escaping the bonds that defined them by preserving and expressing their self-dignity and personal agency provides the correct lesson on the true legacy of the American experiment – not that one group of our forebears oppressed another, but that both evolved under a constitutional system of laws to continue to progress toward a society of true liberty and justice for all. We have not arrived, but we are on the right track.
On Maverick: A Biography of Thomas Sowell, by Jason L. Riley.
Thomas Sowell is one of the towering American intellectuals of our time. An economist trained at the University of Chicago and a social theorist of the first rank, he has been a senior fellow at the Hoover Institution at Stanford University since 1980.
He has written an astonishing fifty books (if you count revised and expanded editions), numerous essays, and a long-running, twice-a-week newspaper column. Extraordinarily wide-ranging, he has covered everything from the rudiments of economics to race relations, the housing crisis of 2008 to late-talking children.
His best known book, Basic Economics (2000), a best-selling, chart-, graph-, and jargon-free introduction to the subject, is now in its fifth edition and has been translated into seven languages.
No less an authority than Milton Friedman, who taught Sowell at the University of Chicago, has said that “The word ‘genius’ is thrown around so much that it’s becoming meaningless, but nevertheless I think Tom Sowell is close to being one.”
So it's about time for there to be a biography of this remarkable man, although it should be noted that Maverick is far more an intellectual biography than a personal one.1 And we should be grateful to Jason L. Riley for writing a very good one. Riley is the author of Please Stop Helping Us: How Liberals Make It Harder for Blacks to Succeed (Encounter). He is also a senior fellow at the Manhattan Institute and a columnist at The Wall Street Journal.
Sowell’s life did not get off to an easy start, to put it mildly. In 1930, the year he was born into a black family in Gastonia, North Carolina, the Great Depression was gathering strength. And Jim Crow was in full force, so he seldom encountered white people in his early years. As Riley explains, “He’d been turned away from restaurants and housing because of his skin color. He’d felt the pain and humiliation of racism firsthand throughout his life. He needed no lectures from anyone on the evils of Jim Crow.”
His father had died a few months before his birth, and his mother, a housemaid, already had four children. So he was raised by a great-aunt.
The family moved to Harlem when he was nine, part of the great migration of black families from the South to the North in search of greater opportunity in those years. Forced to drop out of high school to get a job, he only went to college after a stint in the Marines during the Korean War.
He was the first member of his family to get beyond the seventh grade, and he was ignorant of even the basics of higher education. At first he thought that professors who were addressed as “doctor” were physicians as well as professors. “It came as a revelation to me that there was education beyond college,” he wrote, “and it was some time before I was clear whether an M.A. was beyond a Ph.D. or vice versa. Certainly I had no plans to get either.”
At first he attended night classes at the historically black Howard University. There, his professors noted his remarkable intellect and capacity for hard work and helped him transfer to Harvard the next year. He thrived there intellectually and graduated at the age of twenty-eight magna cum laude.
But he was less enamored of the social atmosphere in Cambridge. Sowell noted that he “resented attempts by some thoughtless Harvardians to assimilate me, based on the assumption that the supreme honor they could bestow was to allow me to become like them.”
He got his master’s degree the next year at Columbia and intended to get his doctorate there as well, so he could study under George Stigler, who had written an essay on the early economist David Ricardo that Sowell had greatly admired. (It might be noted that the very first quotation in Sowell’s Basic Economics, written many years later, is from George Stigler.) But when Stigler (who won a Nobel Prize in 1982) moved to the University of Chicago, Sowell followed him there. He was very glad he did.
For while Sowell thought Columbia was a sort of a “watered-down” version of Harvard, Chicago was not an imitation of anything. It was wholly itself.
And the economics department was extraordinarily rigorous. Ross Emmett, an authority on the economics department at Chicago, told Riley that “During that period of time, Harvard took in twenty-five to twenty-seven students and graduated twenty-five of them, whereas Chicago took in seventy students and graduated twenty-five of them.” In the fifty-two years that Nobel Prizes in economics have been awarded, no fewer than thirteen have gone to scholars associated with the University of Chicago.
Although Chicago has long been the center of the study of free-market economics, Sowell was a Marxist in his twenties. He explained that, when working as a Western Union messenger after he left high school, he would sometimes ride the bus from the Wall Street area to his home in Harlem. The ride took him past the upscale department stores on Fifth Avenue, past Carnegie Hall, and through the affluent residential neighborhoods of Riverside Drive. “And then,” Sowell wrote, “somewhere around 120th Street, it would cross a viaduct and onto 135th Street, where you have the tenements. And that’s where I got off. The contrast between that and what I’d been seeing most of the trip really baffled me. And Marx seemed to explain it.”
But then he took a summer job at the U.S. Department of Labor in 1960, when he turned thirty. Even after a year at the University of Chicago, including a course under Milton Friedman, Sowell had “remained as much a Marxist as I had been before arriving.”
He spent the summer analyzing the sugar industry in Puerto Rico, where a minimum wage was set by the U.S. Government. It wasn’t long before he noticed that as the minimum wage had risen, the number of sugar workers fell. He had always supported minimum wages, assuming they helped the poor earn a decent living. But now he realized that minimum-wage laws cost jobs and were a net detriment to the poor.
“From there on,” Sowell wrote, “as I learned more and more from both experience and research, my adherence to the visions and doctrines of the left began to erode rapidly.”
Soon, Sowell was “rethinking the whole notion of government as a potentially benevolent force in the economy and society.” He also couldn’t help noticing that his fellow bureaucrats did not care if the minimum wage helped workers. Their job was to enforce the laws. It was not to see if the laws did any good.
"It forced me to realize," Sowell wrote, "that government agencies have their own self-interest to look after, regardless of those for whom a program has been set up." Marxist theory ignores the powerful force of self-interest in the working of economies, and Sowell came to realize the centrality of self-interest to the human universe.
At Chicago, Sowell studied the history of ideas under the great Friedrich Hayek, but it was Hayek’s own ideas that had lasting consequences for him. Hayek’s essay “The Use of Knowledge in Society” dealt with how the information used to make economic decisions spreads through an economy. Its central insight is that knowledge is highly dispersed and no one person or group can possess all the knowledge needed to make good economic decisions. Therefore, he argued, the decision-making process should also be decentralized, the opposite of what Marx argued for.
Later, when Sowell was asked to teach a course on the Soviet economy, the significance of Hayek’s essay hit home:
I could see what the factors were that led the Soviets to do what they were doing, and why it wasn’t working. There was a knowledge problem that was inherent in that system. In a nutshell, those with the power didn’t have the knowledge, and those with the knowledge didn’t have the power.
Out of this came one of Sowell’s most important books, Knowledge and Decisions (1980), which extended Hayek’s work and, as Riley says, “would do so in ways that even Hayek had never contemplated.”
In hopes of reaching a wider audience than Hayek, who wrote in the technical language of economics, Sowell’s book, in “lieu of graphs and equations . . . offers rich metaphors and copious real-world examples that make the weightier concepts under discussion not merely digestible but tasty.” This appeal to a wider audience is no small part of the reason that Sowell has been so influential.
Another is that, while an economist by training, Sowell’s mastery of subjects is far wider. Gerald Early, of Washington University, noted that his expertise extends to sociology and history as well. “He had some kind of mastery of other fields to do the kind of comprehensive stuff he was doing. Whether you agree totally with his ideas or not, it was impressive what he was doing. Who knew an economist could write that stuff?”
Indeed, far too many economists can’t write, period. Sowell most certainly can. Early, who is black himself, noted that “I knew lots of black people who were not academics and who had heard about him and were reading his stuff because it was accessible.”
Another thing that distinguishes Sowell from all too many other economists is his insistence that theory be tested in the real world. Gunnar Myrdal, who won the Nobel Prize in economics in 1974, for instance, argued that third-world countries could not develop without extensive foreign aid and much central planning, despite the fact that post-war Japan, Taiwan, South Korea, and Singapore developed spectacularly in the late twentieth century without following that prescription.
“I got no sense,” Sowell wrote, “that Myrdal actually investigated these theories of his and compared them with anything that actually happened. I myself, of course, started out on the left and believed a lot of this stuff. The one thing that saved me was that I always thought facts mattered. And once you think that facts matter, then of course that’s a very different ball game.”
Myrdal and his type are essentially theoretical in their approach to economics. Sowell, like Stigler, Hayek, and Friedman, is empirical, demanding real-world proof, not just elegant ideas.
Sowell has always regarded himself as fortunate that his higher education came before the era of affirmative action, which he regards as an unmitigated disaster for blacks. In his memoir, My Grandfather’s Son (2007), the Supreme Court Justice Clarence Thomas recalled how shocked he had been when his law degree from Yale and his sterling grades failed to impress the white-shoe law firms where he applied for a job. “Now I knew what a law degree from Yale was worth when it bore the taint of racial preference,” he wrote.
But Sowell had predicted this in the very first days of affirmative action. “The double standard of grades and degrees is an open secret on many college campuses, and it is only a matter of time before it is an open secret among employers as well,” he predicted in 1970. “The market can be ruthless in devaluing degrees that do not mean what they say. It should be apparent to anyone not blinded by his own nobility that it also devalues the student in his own eyes.”
One of Sowell’s most important contributions has been to notice how wide the gap often is between ordinary black Americans and black intellectuals and civil rights leaders. In a pair of op-eds in The Washington Post in 1981, Sowell wrote that
Historically, the black elite has been preoccupied with symbolism rather than pragmatism. Like other human beings, they have been able to rationalize their special perspective and self-interest as a general good. Much of their demand for removing racial barriers was a demand that they be allowed to join the white elite and escape the black masses.
In other words, they have been all too anxious to do what Sowell had spurned doing many years before at Harvard.
In fact, Sowell doesn’t have much use for the pretensions of intellectuals of whatever color. Perhaps my favorite quote in Maverick is used by Riley to open his chapter on “Sowell’s Wisdom”: “Some of the biggest cases of mistaken identity are among intellectuals who have trouble remembering that they are not God.”
In this short, well-written book, Jason Riley leads the reader on an enlightening tour of the thought and experiences of one of the most luminous minds this country has produced.
It should cause many readers to explore the works of Thomas Sowell. They will be richly rewarded for doing so.
1. Maverick: A Biography of Thomas Sowell, by Jason L. Riley; Basic Books, 304 pages, $30.
Uncertainty is the nature of the universe. Get used to it.
Change is the only constant.
Those are far more profound statements than they appear to most people. Once we introduce time into the three-dimensional realm of space and matter, change is inevitable, and with change comes the unpredictable nature of the future. Mankind is merely a bystander in this universe of uncertain change, and it is this puzzle that has confounded the most brilliant minds in history: from Ptolemy to Copernicus to Einstein to Hawking.
The puzzle becomes more salient when we realize how uncertainty has shaped natural biology through biodiversity, adaptation, and the survival instincts that help species perpetuate themselves in an environment of constant, uncertain change. Humans are different only in that we are cognitively aware of the uncertainty. Together with our naturally endowed survival instincts, this awareness helps determine our behavioral adaptations. It applies to individual behavior and to aggregate social behavior.
The social sciences, especially economics and finance, have now started to focus on this uncertain nature of the universe and how it shapes the way we cohere and interact in social communities characterized by economic exchange and social and political organization.
We can perhaps appreciate the extent that uncertainty infuses our lives by noting its influence on shaping our cultures and institutions. Religion, for example, is a faith-based belief system that not only shapes social behavior through doctrine, but also offers succor through prayer for the fears that uncertainty provokes. Modern democratic government has been called upon to manage the societal and economic risks of uncertainty through social insurance programs for retirement and healthcare, welfare, income maintenance, environmental risk and national defense.
A few authors have explicated a world characterized by uncertainty, most notably Nassim Nicholas Taleb with his compendium of books, including The Black Swan and Antifragile, as well as John Kay and Mervyn King with their collaboration, Radical Uncertainty.
I highly recommend these books, along with Peter Bernstein’s classic, Against the Gods. These frameworks for analysis help us understand the nature of uncertainty, risk and reward, and the imperatives of risk and loss aversion. Then we can decide how best to manage its inevitable effect on our lives.
Link to an excellent podcast of a symposium at the London School of Economics with the two authors of Radical Uncertainty:
I recently read or heard various critics of Bitcoin compare it unfavorably to the US dollar. This short article explains why fundamentally they are not that different. Each relies on the trust people have in the currency to be able to use it as a store of value or a medium of exchange. Trust can be fractured in either case. The main difference between crypto and fiat currency is the fact that governments usually demand that we pay taxes in the national currency, but that can easily change. Crypto has the added trust factor in that it doesn’t rely on the prudence of politicians.
Biden commands trillions in the way previous presidents have commanded billions
by Lionel Shriver
May 6, 2021 | 8:24 am
I like Bill Maher. He’s a rare practicing left-wing comic who’s actually funny. But last week, his routine on cryptocurrency hit eerie harmonics.
‘I fully understand that our financial system isn’t perfect, but at least it’s real,’ he began. By contrast, crypto is ‘just Easter bunny cartoon cash. I’ve read articles about it. I’ve had it explained to me. I still don’t get it, and neither do you’.
Bitcoin is ‘made up out of thin air’ and is comparable with ‘Monopoly money’. As for conventional legal tender: ‘We knew money had to originate from and be generated by something real, somewhere. Cryptocurrency says, “No, it doesn’t”… Or as another analyst put it, “It’s an open Ponzi scheme”. It’s like having an imaginary best friend who’s also a banker.
‘Our problem here is at root not economic but psychological. People who have been raised in a virtual world are starting to believe they can really live in it. Much of warfare is a video game now; why not base our economy the same way? Cryptocurrency is literally a game.
‘Do I need to spell this out? There is something inherently not credible about creating hundreds of billions in virtual wealth, with nothing ever actually being accomplished, and no actual product made or service rendered. It’s like Tinkerbell’s light. Its power source is based solely on enough children believing in it.’
That monologue was broadcast in the same week Joe Biden promoted the third of his gargantuan spending programs, bringing his first 100 days’ total discretionary spending proposals to $6 trillion. (Context: total US GDP is $21 trillion.) This lavish largesse would be slathered atop the annual (and growing) nondiscretionary budget of nearly $5 trillion, against $3.5 trillion in tax revenue. Let’s tweak Maher’s routine, then:
‘I fully understand that our financial system isn’t perfect, but at least, or so I’ve imagined, it’s real. But the American dollar increasingly resembles Easter bunny cartoon cash. I’ve read articles about Modern Monetary Theory. I’ve had it explained to me. I still don’t get it, and neither do you.
‘Dollars are now made up out of thin air and comparable with Monopoly money. We thought we knew that money had to originate from and be generated by something real, somewhere. Modern Monetary Theory says, “No, it doesn’t”… Or as another analyst put it, “Quantitative easing is an open Ponzi scheme”. The Federal Reserve is like having an imaginary best friend who’s also a banker.
‘Our problem here is at root not economic but psychological. People who have been raised in a virtual world are starting to believe they can really live in it. Much of warfare is a video game now; why not base our economy the same way? The conjuring of “borrowed” money from ether, only to have that debt swallowed by a central bank and disappear, is literally a game.
‘Do I need to spell this out? There is something inherently not credible about the Fed creating not just hundreds of billions, but trillions in wealth, with nothing ever actually being accomplished, and no actual product made or service rendered. It’s like Tinkerbell’s light. Its power source is based solely on enough infantilized citizens believing in it.’
Somehow that monologue isn’t as funny in the second version.
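The fiscal arithmetic behind the rewritten routine is worth checking. Here is a minimal sketch using the round figures cited above ($6 trillion in first-100-days proposals, a $5 trillion nondiscretionary budget, $3.5 trillion in revenue, $21 trillion GDP); treating the multi-year proposals as a single year of spending is a deliberate simplifying assumption:

```python
# Back-of-envelope fiscal arithmetic using the article's round figures.
# Simplifying assumption: treat the $6T in proposals as one year of spending.
gdp = 21.0              # US GDP, $ trillions
proposals = 6.0         # first-100-days discretionary proposals
nondiscretionary = 5.0  # annual nondiscretionary budget
tax_revenue = 3.5       # annual tax revenue

share_of_gdp = proposals / gdp
implied_gap = proposals + nondiscretionary - tax_revenue

print(f"Proposals as a share of GDP: {share_of_gdp:.0%}")         # → 29%
print(f"Implied annual shortfall: ${implied_gap:.1f} trillion")   # → $7.5 trillion
```

Even under this crude framing, proposed outlays exceed revenue by more than two to one.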
While Maher decries the electricity squandered on crypto ‘mining’, at least the color of the Fed’s money is genuinely green. Tap a few keys, and voilà: trillions from pennies on the energy bill. So in the past year, the Fed effortlessly increased the world’s supply of dollars by 26 percent and is on track for a similar surge in 2021. But is drastic monetary expansion truly without cost?
I’ve made Maher’s Tinkerbell analogy myself, but to explain how traditional currency functions. I noted in an essay accompanying my novel The Mandibles, about America’s 2029 economic apocalypse: ‘Currency is a belief system. It maintains its value the way Tinkerbell is kept aloft by children believing in fairies in Peter Pan.’
In the novel, a fictional economics professor pontificates: ‘Money is emotional. Because all value is subjective, money is worth what people feel it’s worth. They accept it in exchange for goods and services because they have faith in it. Economics is closer to religion than science. Without millions of individual citizens believing in a currency, money is colored paper. Likewise, creditors have to believe that if they extend a loan to the US government they’ll get their money back or they don’t make the loan in the first place. So confidence isn’t a side issue. It’s the only issue.’
My confidence is going wobbly. Biden commands trillions the way previous presidents have commanded billions, while the public is so dazzled by zeros that they don’t know the difference.
I’ve my quibbles with the particulars. Spending in inconceivable quantity courts waste and fraud. Biden’s American Families Plan casts so many freebies upon the waters as to constitute a de facto universal basic income, and government dependency doesn’t seem characteristic of a good life. Pandemic-relief unemployment supplements (which many Democrats would make permanent) are so generous that small businesses can’t find employees willing to work even for two to three times the minimum wage. Biden is effectively reversing Clinton-era welfare reforms, which moved so many poor Americans from state benefits to self-respecting employment. Financing all these goodies by hiking corporate taxes is popular, but only because few people realize that everyone pays corporate taxes through lower pension-fund returns, job losses from corporate flight, lower wages and higher prices.
But it’s the bigger picture that unnerves me. Zero interest rates have installed an accelerating debt loop. Governments, companies and individuals borrow because money is free. Central banks won’t raise interest rates, lest the cost of servicing all this burgeoning debt bankrupt the debtors. Governments, companies and individuals borrow still more because money is free. The Federal Reserve has already announced it won’t raise interest rates even if inflation climbs, while refusing to cite what level inflation would have to hit before reconsidering. I’ve plotted this story before. It doesn’t end well.
These remarks from the symposium offer a revealing analysis of US financial and fiscal policy. I include in this post the essay by David Goldman, as I believe it offers a foundation for understanding the predicament we have created with poorly conceived financial policies that are now being accelerated and amplified.
May 5, 2021
Massive demand-side stimulus combined with constraints on the supply-side in the form of higher taxes is a sure recipe for inflation and eventual recession. The Fiscal Year 2021 US budget deficit will amount to 15% of US GDP after the passage of an additional $1.9 trillion in demand stimulus, according to the Committee for a Responsible Federal Budget, a proportion that the United States has not seen since World War II.
It is hard to avoid the conclusion that the Biden Administration’s fiscal irresponsibility arises from a cynical political calculation. It evidently proposes to employ the federal budget as a slush fund to distribute benefits to various political constituencies, gambling that the avalanche of new debt will not cause a financial crisis before the 2022 Congressional elections. The additional $2.3 trillion in so-called infrastructure spending that the Administration has proposed consists mainly of handouts to Democratic constituencies.
Where is Foreign Money Going?
During the 12 months ending in March, the deficit stood at 19% of GDP. Even worse, the Federal Reserve absorbed virtually all the increase in outstanding debt on its balance sheet. In the aftermath of the 2009 recession, when the deficit briefly rose to 10% of GDP, foreigners bought about half the total new issuance of Treasury debt. During the past 12 months, foreigners have been net sellers of US government debt. (See Figure 1.) The US dollar’s role as the world’s principal reserve currency is eroding fast, and fiscal irresponsibility of this order threatens to accelerate the dollar’s decline.
The Federal Reserve has kept short-term interest rates low by monetizing debt, but long-term Treasury yields have risen by more than a percentage point since July. Markets know that what can’t go on forever, won’t. At some point, private holders of Treasury debt will liquidate their holdings—as foreigners have begun to do—and rates will rise sharply. (See Figure 2.) For every percentage point increase in the cost of financing federal debt, the US Treasury will have to pay an additional quarter-trillion dollars in interest. The United States well may find itself in the position of Italy in 2018, but without the rich members of the European Union to bail it out.
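The quarter-trillion figure checks out with back-of-envelope arithmetic; the roughly $25 trillion debt level below is an assumption consistent with the claim, not a number given in the essay:

```python
# Sanity check of the "quarter-trillion per percentage point" claim.
# Assumed figure: roughly $25 trillion of federal debt outstanding (2021).
federal_debt = 25e12   # dollars
rate_increase = 0.01   # one percentage point rise in average financing cost

extra_annual_interest = federal_debt * rate_increase
print(f"${extra_annual_interest / 1e12:.2f} trillion per year")  # → $0.25 trillion per year
```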
The flood of federal spending has had a number of dangerous effects already:
The US trade deficit in goods as of February 2021 reached an annualized rate of more than $1 trillion, an all-time record. China’s exports to the US over the 12 months ending in February also reached an all-time record. Federal stimulus created demand that US productive facilities could not meet, and produced a massive import boom.
Input prices to US manufacturers in February rose at the fastest rate since 1973, according to the Philadelphia Federal Reserve’s survey. And the gap between input prices and finished goods prices rose at the fastest rate since 2009. (See Figure 3.)
The Producer Price Index for final demand rose at an annualized 11% rate during the first quarter. The Consumer Price Index shows year-on-year growth of only 1.7%, but that reflects dodgy measurements (for example, the price of shelter, which comprises a third of the index, supposedly rose just 1.5% over the year, although home prices rose by 10%).
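The effect of the shelter measurement on the headline number can be sketched from the figures just cited. This is a rough, hypothetical substitution, not how the BLS actually computes the CPI:

```python
# Hypothetical back-of-envelope: what the 1.7% CPI figure would look like
# if shelter (about a third of the index) were measured at home-price
# appreciation instead of the reported 1.5%. Not the BLS methodology.
cpi_reported = 1.7        # year-on-year CPI, percent
shelter_weight = 1 / 3    # shelter's approximate share of the index
shelter_measured = 1.5    # shelter inflation as measured, percent
home_price_growth = 10.0  # cited home-price appreciation, percent

cpi_adjusted = cpi_reported + shelter_weight * (home_price_growth - shelter_measured)
print(f"Adjusted CPI: {cpi_adjusted:.1f}%")  # → roughly 4.5%
```

On this crude substitution, measured inflation would be well above the Fed's target.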
If foreigners are net sellers of US Treasury securities, how is the United States financing an external deficit in the range of $1 trillion a year? The US has two deficits to finance, the internal budget deficit, and the balance of payments deficit, and here we refer to the second. The answer is: By selling stocks to foreigners, according to Treasury data. (See Figure 4.) Foreign investors have been dumping low-yielding US Treasuries and corporate bonds during the past year, according to the Treasury International Capital (TIC) system. Foreign investors bought $400 billion of US equities and nearly $500 billion of US agency securities (backed by home mortgages) during the 12 months through January, but sold $600 billion of Treasuries and $100 billion of corporate bonds.
This is a bubble on top of a bubble.[Double Bubble = Trouble.] The Federal Reserve buys $4 trillion of Treasury securities and pushes the after-inflation yield below zero. That pushes investors into stocks. Foreigners don’t want US Treasuries at negative real yields, but they buy into the stock market which keeps rising, because the Fed is pushing down bond yields, and so forth.
At some point, foreigners will have a bellyful of overpriced US stocks and will stop buying them. When this happens, the Treasury will have to sell more bonds to foreigners, but that means allowing interest rates to rise, because foreigners won’t buy US bonds at extremely low yields. Rising bond yields will push stock prices down further, which means that foreigners will sell more stocks, and the Treasury will have to sell more bonds to foreigners, and so forth.
The 2009 crisis came from the demand side. When the housing bubble collapsed, trillions of dollars of derivative securities backed by home loans collapsed with it, wiping out the equity of homeowners and the capital base of the banking system. The 2021 stagflation—the unhappy combination of rising prices and falling output—is a supply-side phenomenon. [Back to the Future of That 70s Show] That’s what happens when governments throw trillions of dollars of money out of a helicopter, while infrastructure and plant capacity deteriorate.
The present situation is unprecedented in another way: Not in the past century has the United States faced a competitor with an economy as big as ours, growing much faster than ours, with ambitions to displace us as the world’s leading power.
The source of the 2008 crisis was overextension of leverage to homeowners and corporations. I was one of a small minority of economists who predicted that crisis.
Federal debt in 2008 was 60% of GDP, not counting the unfunded liabilities of Medicare and the Social Security System. As of the end of 2020, Federal debt had more than doubled as a percentage of GDP, to 130%. The Federal Reserve in 2008 owned only $1 trillion of securities. US government debt remained a safe harbor asset; after the Lehman Brothers bankruptcy in September 2008, the 30-year US Treasury yield fell from 4.7% to 2.64%, as private investors bought Treasuries as a refuge.
The Treasury: Not a Refuge from, but a Cause of Crisis
Today the US Treasury market is the weak link in the financial system, supported only by the central bank’s monetization of debt. If the extreme fiscal profligacy of the Biden Administration prompts private investors to exit the Treasury market, there will be no safe assets left in dollar financial markets. The knock-on effects would be extremely hard to control.
The overwhelming majority of over-the-counter (privately traded) derivatives contracts serve as interest-rate hedges. Market participants typically pledge Treasury securities as collateral for these contracts. The notional value of such contracts now exceeds $600 trillion, according to the Bank for International Settlements. Derivatives contracts entail a certain amount of market risk, and banks will enter into them with customers who want to hedge interest-rate positions only if the customers put up collateral (like the cash margin on a stock bought on credit). (See Figure 5.) The market value (after netting for matching contracts that cancel each other out) is about $15 trillion. If the prices of Treasury securities fall sharply, the result will be a global margin call in the derivatives market, forcing the liquidation of vast amounts of positions.
Something like this occurred between March 6 and March 18, 2020, when the yield on inflation-protected US Treasury securities (TIPS) jumped from about negative 0.6% to positive 0.6% in two weeks. The COVID-19 crash prompted a run on cash at American banks, as US corporate borrowers drew down their credit lines. US banks in turn cut credit lines to European and Japanese banks, who were forced to withdraw funding to their customers for currency hedges on holdings of US Treasury securities. The customers in turn liquidated US Treasury securities, and the Treasury market crashed. That was the first time that a Treasury market crash coincided with a stock market crash: Instead of acting as a crisis refuge, the US Treasury market became the epicenter of the crisis.
The Federal Reserve quickly stabilized the market through massive purchases of Treasury securities, and through the extension of dollar swap lines to European central banks, which in return restored dollar liquidity to their customers. These emergency actions were justified by the extraordinary circumstances of March 2020: An external shock, namely the COVID-19 pandemic, upended financial markets, and the central bank acted responsibly in extending liquidity to the market. But the Federal Reserve and the Biden Administration now propose to extend these emergency measures into a continuing flood of demand. The consequences will be dire.
The present situation is unprecedented in another way: Not in the past century has the United States faced a competitor with an economy as big as ours, growing much faster than ours, with ambitions to displace us as the world’s leading power. China believes that America’s fiscal irresponsibility will undermine the dollar’s status as world reserve currency.
Here is what Fudan University Professor Bai Gang told the Observer, a news site close to China’s State Council:
Simply put, this year the United States has issued a massive amount of currency, which has given the US economy, which has been severely or partially shut down due to the COVID-19 epidemic, a certain kind of survival power. On the one hand, it must be recognized that this method . . . is highly effective. . . . The US stock market once again hit a record high.
But what I want to emphasize is that this approach comes at the cost of the future effectiveness of the dollar lending system. You do not get the benefit without having to bear its necessary costs.
A hegemonic country can maintain its currency hegemony for a period of time even after the national hegemony has been lost. After Britain lost its global hegemony, at least in the 1920s and 1930s, the pound sterling still maintained the function of the world’s most important currency payment method. To a certain extent, the hegemony of the US dollar is stronger than any currency before it. . . .
We see that the US dollar, as the most important national currency in the international payment system, may still persist for a long time even after US hegemony ends. Since this year, the US has continued to issue more currency to ease the internal situation. The pressure will eventually seriously damage the status of the US dollar as the core currency in the international payment system.
America has enormous power, but the Biden Administration and the Federal Reserve are abusing it. And China is waiting for the next crisis to assert its primacy in the world economy.
There are sections of the public health and scientific community that have become infected with the general level of fear and anxiety in the population. You see that very clearly in the recent open letter signed by teaching unions and various behavioral scientists – more or less a coalition of the anxious – demanding that face masks be mandatory in schools until at least 21 June. These people are looking for a risk-free world. Anybody who understands the nature of risk knows that there is no such thing. The eradication of risk comes at unacceptable social and economic costs wherever you try to do it.
The propagation of fear is visible not just in advertising but also in the constant reiteration of certain symbols. You will find public health specialists say that they know masks don’t achieve very much, but they do remind people that there’s a pandemic going on. They address the question of how to keep people in a constant state of fear.
Democracy is messy, it’s uncontrolled, it can be disruptive. But all of those things are actually really important for a good society. It’s out of the messiness that creativity and change and innovation come. We are erring too far towards elitist control, which is what edges us closer to the Chinese model of the party and the party scientists decreeing what a good life for citizens is and devising systems and structures to enforce it. That should concern us.
It doesn’t seem so long ago when the promise of an inter-connected world was all the rage, with the free sharing of information being praised as the dawning of a new age for global communication and community. An audacious young Mark Zuckerberg praised his nascent social network, proclaiming that “Connecting the world is really important, and that is something that we want to do. That is why Facebook is here on this planet.”
The equally precocious founders of Google offered a motto of “Don’t be evil” while organizing the world’s information to make it universally accessible and useful. Likewise, their youthful optimism gushed about “the potential for technology to remake the world into a better place.” Early outcomes were hopeful, as Facebook’s network grew quickly to more than 2 billion users, while the Arab Spring was heralded politically as “the Twitter revolution,” and “google” became a verb.
But how fast the wheel has turned. Today we find ourselves blaming social media for disseminating misinformation as propaganda (“fake news”), destroying objective journalism, invading user privacy, corrupting elections, enabling and fomenting ideological extremism, canceling political dissent by censoring free speech, cornering markets and suppressing competition, crushing small businesses, and harming the physical and mental health of its users. Whew!
No, it really hasn’t been that long. It’s as if we imbibed a heady drug, woke up, and found ourselves addicted, wondering how we got here and how to break this compulsive habit before it breaks us. To paraphrase David Byrne, “How did we get here?”
There are two things we need to understand to better answer this question. One is the nature of social interaction, both individual and collective, and how it flows from our behavioral instincts. The second is how the business model and logic of large-scale social media networks manipulate and profit from those natural instincts.
“Man is by nature a social animal” – Aristotle, Politics
As Aristotle noted, humans crave social interaction. As young people, we seek the approval of elders and peers as a form of bonding and belonging. We form families, tribes, neighborhoods, and communities to fulfill this instinctual need for social engagement and mutual protection. Modern social networking technology feeds on that need, but what technology offers today differs from centuries of traditional social interaction in terms of scale and the implications for identity, trust, and commitment.
Evolutionary psychologist Robin Dunbar has observed that human face-to-face relationships have an upper bound of about 150 before they dissipate. Beyond that we lose track of our personal networks, so institutional structures must be established to cohere the community network. We understand intuitively that friendships formed through in-person relationships are fundamentally different from “friends” on Facebook. Nobody has 6,000+ “friends.” Friends connected through online social networks (OSNs) come too easily, requiring little or no commitment. As the degrees of separation increase, peers on OSNs become virtually anonymous. So, as defined by trust and commitment, our social media friends are not really true friends at all. OSNs allow us to make connections with people on the other side of the globe, so we don’t really ‘know’ whom we are engaging. This means that on social media we can shed the constraints of reputation, integrity, and trust, which opens up social engagement to all kinds of malicious intent.
“If you can’t say anything nice, then don’t say anything at all.” ~ Aesop (c.620-560 BC)
“Gossiping and lying go hand in hand.” ~ Proverb
“Whoever gossips to you, will gossip about you.” ~ Spanish Proverb
Comparing OSNs to traditional gossip networks can provide valuable insights. As these quotes show, gossiping has a long, checkered history. In traditional societies, the human propensity to gossip can serve a useful purpose in reinforcing a community’s cultural norms and values by calling attention to and ostracizing those who violate those norms. These practices can range from the harmless to the quite ruthless, all in the name of solidifying the community under those accepted norms. It’s how the traditional community survives and maintains stability. Individuals living in modern liberal societies often find such conformity stifling. But gossip can also be positive, promoting one’s good character, as indicated by the phrase, “Your reputation precedes you.” (The key here is how we value reputation.)
Scientific studies show the reward center in the brain—the caudate nucleus—is activated in response to gossip, especially malicious gossip. For instance, subjects seem to be amused or entertained by celebrity scandals. We all know this form of Schadenfreude as the basis of the business model for supermarket tabloids, as exemplified by the modern fascination with the British monarchy and its human foibles.
Furthermore, studies have shown that subjects get a dopamine hit from superficial engagement on social media. The likes, the emojis, the comments, all offer instant gratification that somebody out there approves of us or at least notices us. Popularity metrics signal our status on the social media hierarchy. All this feeds our sense of self-worth and self-esteem and helps shape our identity. This dopamine rush is exploited by social media user interfaces in order to provoke and prolong user engagement on the platforms.
By appealing to our base instincts social media has transformed itself into this role of spreading gossip, but on a far larger scale with far less restraint. In this respect, Facebook and Twitter have become little more than global gossip networks, where those gossiping and being gossiped about have no relationships to a shared community. We see this today in the attempts to cancel those who disagree with an accepted narrative or ideology, where perceived transgressors are set upon by Twitter mobs and trolls. We see it with teenage bullying and exploitation. We see it with constant virtue signaling.
With large-scale, anonymous networks, where no one can be held accountable for attacking another, bad behavior becomes far too easy and tempting, perhaps irresistible. Peoples’ careers and lives are being destroyed by what can be viewed as an unserious game with very serious consequences. In one study conducted in Germany, researchers found that Facebook’s own engagement tools were tied to a significant rise in membership in extremist organizations. In the US, Facebook has been blamed, rightly or wrongly, for the rise of white supremacist groups.
Paradoxically, this scaling effect, enabling anonymity and a lack of accountability, is what really makes today’s social media anti-social. There is no trust or reputational capital to be lost, and removing these constraints can bring out the worst in us. OSNs have developed to the point where we are getting all the negative effects of gossip with few of the positive effects of the shared community that was promised. In this respect, large-scale OSNs make no sense as a social institution. Nevertheless, the psychological and emotional allure of online social engagement is overpowering, while the financial power associated with OSNs is formidable. In terms of economic power and global reach, our social media giants can go head-to-head with most countries. The top five company valuations on US financial markets are all Big Tech, with Google and Facebook ranked at #4 and #5.
The Primacy of Technology
The big social media platforms today, such as Facebook, YouTube, and Twitter, make money through targeted advertising, creating specialized interfaces to keep users engaged and to collect as much data as possible to sell to targeted advertisers. The more users there are, and the more information flows through the network, the more advertising revenue grows, and company valuations along with it.
With this profit incentive, the OSN platforms' engagement strategies go beyond user-initiated behaviors, using machine-learning algorithms and click-bait to solicit engagement among users by suggesting friends, games, similar content, contests, and memes. This is why Facebook asks users to play what seem to be silly games: such engagement, no matter how meaningless, can be instantly monetized. Most of these interactions are fairly innocuous, but, because sensationalism and conflict attract engagement, many are meant to provoke political conflict or collusion. In addition, the engagement strategies depend upon keeping attention siloed. If users are regularly exposed to different points of view, and if they develop healthy habits for weighing fact versus fiction, they become tougher targets for engagement.
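The engagement-first logic described above can be made concrete with a deliberately simplified sketch. Everything here is hypothetical (the post titles, the weights, the scoring function are invented for illustration): the point is only that a feed ranked purely by predicted engagement will surface sensational content first, with no term anywhere for accuracy or user well-being.

```python
# Toy illustration of engagement-first feed ranking.
# All data and weights are hypothetical, chosen only to show the incentive.

posts = [
    {"title": "Local library extends hours",    "clicks": 120, "shares": 5},
    {"title": "You won't BELIEVE this outrage", "clicks": 900, "shares": 400},
    {"title": "City budget passes quietly",     "clicks": 80,  "shares": 2},
]

def engagement_score(post):
    # Shares weighted more heavily: provocation spreads, and spread is revenue.
    return post["clicks"] + 10 * post["shares"]

# Sort the feed by predicted engagement, highest first.
feed = sorted(posts, key=engagement_score, reverse=True)
print([p["title"] for p in feed])  # the outrage item ranks first
```

Note what is absent: nothing in the objective penalizes falsehood or rewards civility, which is exactly the design problem the essay identifies.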
At best, OSN click-bait strategies and targeting algorithms yield an endless cacophony of digital noise that competes against any positive human interaction. At worst, and more often than not, we get warring tribes that only venture outside their walled silos to engage with the enemy.
Furthermore, social media is a winner-take-all industry as OSNs have become virtual monopolies through network effects. Much like national languages and computer operating systems, the more users on the network, the more new users want to join the party, the more personal data is harvested, and the more valuable the growing network is to advertisers. This creates a significant barrier to entry for competitors, where any successful new platform is quickly swallowed up by the giants, as when Facebook bought up WhatsApp and Instagram, and Google purchased YouTube.
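The winner-take-all dynamic described above can be sketched with a back-of-the-envelope model. Under Metcalfe-style assumptions (an illustration, not a claim from the essay), a network's value grows roughly with the number of possible connections among its users, so a platform ten times larger is not ten times but closer to a hundred times more valuable, and the gap compounds as new users follow the crowd.

```python
# Toy Metcalfe-style model of network effects (assumed for illustration):
# value is proportional to the number of possible user-to-user connections.

def network_value(users: int) -> int:
    """Value proportional to possible user pairs: n * (n - 1) / 2."""
    return users * (users - 1) // 2

incumbent, challenger = 1_000_000, 100_000
ratio = network_value(incumbent) / network_value(challenger)

# A 10x user advantage becomes roughly a 100x value advantage.
print(f"user ratio: {incumbent / challenger:.0f}x, value ratio: {ratio:.0f}x")
```

This quadratic gap is why a successful challenger is more likely to be acquired than to catch up, as with WhatsApp, Instagram, and YouTube.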
Their dominance grants Facebook and Google immense bargaining leverage over publishers, content creators, and other stakeholders, who often have no choice but to hand over their own proprietary data to satiate the platforms’ thirst for content. This bargaining power has crushed many creative professions and independent publishers. As far as users who provide all this valuable content go, well, they get a free profile page and a few tools to deepen their engagement.
When it comes to business practices and power over the global internet, Big Tech is unrivaled. As plainly stated by one recent study:
Facebook and Google use their dominant position as gatekeepers to the internet to surveil users and businesses, amass unrivaled stores of data, and rent out targeting services to third parties who can then target content – from ads for shoes to racist propaganda – at users with a perceived precision unrivaled by any other entity. …The longer users remain on the platform – hooked on sensationalist content, which the platforms’ algorithms prioritize – the more money Facebook and Google make from advertising.[1]
Despite the apparent toxicity of these social media platforms, for those with a sincere desire for wider social connection and engagement, there is no other game in town. Without meaningful competition, Big Tech has transformed its platforms not to help us communicate, but to addict us to its services in order to sell more advertising. For the rest of us, the result has not been the promised congenial, global community, but rather a malevolent battle for primacy and survival.
The result is that Big Tech has acquired its own acronym for its five biggest players—the FAANGs—referring to Facebook, Amazon, Apple, Netflix, and Google. Fangs have never been warm and fuzzy.
The Nature of Political Engagement in Democracy
As suggested in the title of this essay, we need to address what all this means for political democracy. Current events offer a clue: a summer of urban riots across the country directed against local government and law enforcement, followed by a January protest at the Capitol against the 2020 presidential election that turned violent. In both cases, the chaos can be traced in part to social media provocation and coordination.
Democracy is a form of political order that relies on a social choice mechanism called voting to support and manage self-government. The social choice challenge is always how to distill an inestimable number of personal preferences and interests down to a single pragmatic social policy agenda. It's not a simple task, nor an obvious one. Neither an authoritarian hierarchy nor a chaotic populist mob accomplishes the objective. Democracy is a messy business; as Churchill quipped, it is the worst form of government, except for all the others.
American democracy is built on a decentralized structure that seeks to best fulfill the goal of self-government while adhering to our stated values of liberty and justice. This is an especially difficult challenge in a large population made up of diverse cultural, ethnic, and racial groups spread over a large landmass, like the USA.
When systems grow large and complex, nature, technology, and history show us that the best way to manage is to decentralize the process. So, in the US we have fifty states made up of thousands of counties and municipalities to decentralize government. What is also required to facilitate the process is a method of ranking policy priorities that can converge on a workable ordering of those priorities. The final condition is a voting process that allows people to compromise on the big issues and find convergence on decisions the entire population finds acceptable, if not ideal.
Our decentralized system of representative governance seeks to fulfill these objectives while also imposing some necessary trade-offs. Our voting system of winner-take-all plurality yields a two-party system where the winning strategy is to acquire more than 50% of the vote. This design eschews proportional representation with a multitude of competing interests by forcing voters to set priorities and move toward a centrist coalition. This design seeks a majority mandate through a process acceptable to the minority as well, the incentive being to capture the center of American politics.
After capturing that center through elections, the governing coalition must then govern the entire populace while adhering to the accepted process to maintain legitimacy. This requires, above all else, convergence through compromise.
The beauty of a two-party system is that voter choices are pushed toward the center of compromise, so a winning strategy appeals more to commonalities among voters than to differences. The alternative of proportional representation with multiple parties creates more responsive but fragile coalitions; by contrast, the two-party structure, anchored by a dominant centrist coalition, trades responsiveness for stability and greater resistance to change.
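The centripetal pull of plurality voting can be illustrated with a toy median-voter sketch (an illustration under simplifying assumptions, not the essay's own model): place voters on a one-dimensional ideological spectrum, let each voter pick the nearer of two candidates, and the candidate closer to the median voter wins. This is the incentive, described above, to capture the center.

```python
# Toy median-voter model: voters on a 0.0-1.0 ideological spectrum each
# vote for the nearer of two candidate positions. Assumed for illustration.

def plurality_winner(voters, cand_a, cand_b):
    """Return whichever candidate position attracts more voters."""
    votes_a = sum(1 for v in voters if abs(v - cand_a) < abs(v - cand_b))
    return cand_a if votes_a > len(voters) / 2 else cand_b

# 101 voters spread evenly across the spectrum; the median voter sits at 0.5.
voters = [i / 100 for i in range(101)]

# A near-center candidate (0.45) defeats a candidate far from the median (0.80).
print(plurality_winner(voters, 0.45, 0.80))
```

In this toy setup the candidate nearer 0.5 always wins, so both parties are rewarded for converging toward the center rather than the fringes.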
Naturally, this process favors the status quo (i.e., conservatism?) rather than change (progressivism?) and thus the trade-off is unappealing to those agents of change among us. Understandable, but all societies survive by following time-tested values and practices until they no longer serve, so the burden of change is always on those eager to embrace it. While time is on their side, the change agents often cannot wait.
Given America’s profile as a large country with a large culturally, ethnically and racially diverse population, democratic governance is no small task. Convergence is far easier with a smaller population, a smaller land area, and a more homogeneous culture, with shared racial and ethnic identities. The USA has none of these advantages, but, starting with a relatively small population and land area, the designers of the US Constitution displayed remarkable foresight in their design.
So, the million-dollar question is whether our social media technology is making this task easier or harder.
As discussed above, social media is making us more tribal, more isolated from those different from us, more alienated from a common national identity. The face-to-face appreciation of each "other" is lost, and technology's depreciation of humanity allows us to cast that "other" in dark shadows instead of bright enlightenment. It replaces true meaning with a false sense of tribal identity and differentiation. And where we cannot find this differentiation, we create it. It's ironic that our commonalities far outweigh our differences, yet these small differences are what we magnify through much of our social media engagement.
We can easily see that these behaviors are short-circuiting our political democracy. We are creating a bimodal distribution of political preferences rather than a unified, centrist “national” one. Ultimately, we are adrift, wondering what American democracy is all about. Without the strength of conviction, we are weak and vulnerable. And as we drift, those with anti-democratic tendencies, whether authoritarian or anarchic, are harnessing these tools to overcome our institutional constraints and undermine our foundations of liberty and justice. We have seen how some of these interests have used the unique crisis of a global pandemic to advance their narrow agenda. It is particularly shocking how some narrow interests employ science as a political weapon, but then completely dismiss scientific skepticism when it doesn’t serve their purpose.
In the social media space, we are seeing censorship of opinion, even informed opinion; canceling of those we disagree with professionally and socially (this is a modern form of ostracism, banishment, and exile from the community); invasions of privacy; collusion; attacks on personal liberties; and the incitement of social disorder and chaos. What is worse is that our traditional media platforms in news journals and television/radio broadcasting have been sucked into this vile vortex, spreading propaganda as objective news.
These developments expose two serious threats to free democracy:
An ideological ‘tribal’ civil war among citizens inflamed by information media, making democratic compromise impossible; and
A danger of collusion between Big Tech and Big Govt to infringe upon constitutional freedoms and privacy by co-opting social media platforms, such as we have seen in China.
This second danger seems particularly acute, since it has recently been discussed in the US Senate as a solution to the first. We cannot allow unaccountable governments to co-opt unaccountable technology platforms with the idea that "they" will keep us safe. That flips the definition of a people's democracy on its head.
Remedies?
Are there remedies that can halt this disintegration of our social and political institutions or do things just fall apart? As a free democracy, we need to defend free speech as the basis of communication and comprehension of differing viewpoints. How else to find compromise? We also need objective sources of information we can trust. And we need the integrity of objective national media.
There are many policy proposals that address the problems of Big Tech, from rewriting the Digital Millennium Copyright Act to make OSNs more accountable and liable for the information spread on their networks, to breaking up the Big Tech monopolies, to changing the revenue model. This essay is intended to bring attention to and explain the problem without going deeply into possible regulatory solutions, but the author's impressions are these: online "search" is likely a public good, just like the public library, and should be regulated like a public utility; broad and deep vertical integration of product markets is likely subject to anti-trust laws; and barriers to entry should be reduced to counter the network externalities that create quasi-monopolies, fostering greater competition and innovation in technology markets. The advertising revenue model relies on harvesting free data from users, so a more just model would share that data's value with the users who create it.
But just as important is an appeal on the personal level to voluntary behavioral modifications among social media users, much like those promoted to decrease tobacco consumption. This is necessary for our personal mental health and our social peace of mind. We know the nihilistic and narcissistic behaviors we engage in on social media are unhealthy. We are fighting for attention, we are competing for status, we are allowing ourselves to become smug with our own created self-image. We are in zero-sum, finite games. But I doubt any of this brings us a sense of meaning, purpose, or fulfillment, no matter how many “likes” we get.
We also know that we crave the affirmation of our unique personal identities and a sense of belonging in our social communities. We need positive-sum, infinite games. (War is a finite game, peace is an infinite game.) Technology can serve us in this capacity, but only if we create social media that makes sense. What makes sense is small scale, inter-personal, commonality of interests, and a great deal of empathy and open-mindedness. What makes sense are positive social interactions that reward our human social and creative instincts.
Lastly, we need to reject ideological politics as personal identity. Political differences are natural, but fused with identity they become threatening and lead to self-defensive reactions. Our partisan identities should mean relatively little compared to our identities as creative, intelligent, interesting, empathetic individuals.
Our online, interconnected world will only become more so, but we need to ensure it does not become a more conflicted and contentious one. We do not want a world that wages war by cyber means. More crucially, we need to ensure that technology enhances our humanity by safeguarding our treasured values of liberty and justice for all.