War, US Hegemony, and Geopolitics

This article in Foreign Affairs by Robert Kagan addresses the current geopolitical crisis in historical context using the standard analysis of national security and international relations. However, I think Kagan misses the tail that wags the dog, the force that will shape the near future of geopolitics: the hegemonic role played by the US dollar, and by association the C5 central banks in the West, over the international monetary system.

Finance unleashes centripetal forces on the flows of capital, concentrating capital in the core of the market system, which, in the case of US$ policy, is the US financial system. All those dollars created over the past 30-40 years must flow back to the US in some way to be reinvested in US$ assets. What this does is suck capital from the periphery.

The capital-starved periphery has relied on the cheap cost of its labor relative to the developed world, but about 35 years ago that advantage began to be absorbed by China’s liberalized mercantilist trading policies. So the periphery has been starved of both capital and labor income. This leads to political strife and eventually to armed conflict along the borders of these periphery countries. In some cases, like Mexico, Afghanistan, and Colombia, it has led to failed democracies being transformed into narco-trafficking states run by the criminal enterprises that control their governments.

Russia has been on the wrong side of this widening divide between rich and poor. So have countries like Iran, Iraq and North Korea. The continuation of US$ central bank easy credit policies and the CCP labor policies will only aggravate geopolitical conflict across the globe. And some will still wonder why. The global pandemic has merely thrown gasoline on the fire.

It’s also important to note that these two forces operate at the micro as well as the national and international levels, causing a growing divide between the asset-rich and the asset-poor within nations as well as between them. We’ve seen the imbalance between rich and poor grow in our own cities and communities: more billionaires, more poverty, and a hollowing out of the middle class. We can expect more conflict to come as the direct consequence of our misguided monetary, credit, and fiscal policies, and Ukraine is just another canary in the coal mine.

The Price of Hegemony

Can America Learn to Use Its Power?

By Robert Kagan, May/June 2022

For years, analysts have debated whether the United States incited Russian President Vladimir Putin’s interventions in Ukraine and other neighboring countries or whether Moscow’s actions were simply unprovoked aggressions. That conversation has been temporarily muted by the horrors of Russia’s full-scale invasion of Ukraine. A wave of popular outrage has drowned out those who have long argued that the United States has no vital interests at stake in Ukraine, that it is in Russia’s sphere of interest, and that U.S. policies created the feelings of insecurity that have driven Putin to extreme measures. Just as the attack on Pearl Harbor silenced the anti-interventionists and shut down the debate over whether the United States should have entered World War II, Putin’s invasion has suspended the 2022 version of Americans’ endless argument over their purpose in the world.

Inflation? What Inflation?

Not that we haven’t been writing here for the past dozen years: central bank policy is the key to mismanaging public finance. MMT? Stupid is as stupid does.

We’re Paying for All of That ‘Free’ Money Now, Aren’t We?

Why is everything so darn expensive? A deep dive noting that economic minds on the right and the left are coming to an agreement — sure, the supply-chain issues and the labor shortage didn’t help, but the biggest factor in our runaway inflation was “vast amounts of government rescue aid, including three rounds of stimulus checks” — way more aid than the European Union, Canada, or the United Kingdom gave to their citizens — and lo and behold, we’ve got much worse inflation rates than they do. We’ve borrowed and spent ourselves into this inflation crisis. 

Fake Money?

This article published in Vox is not very insightful (like most Vox articles, it is mostly partisan political parroting), but it does raise plenty of questions for the layperson to contemplate. First off, all fiat money is fake until it’s made real, which means that until you trade currency for real assets, it’s only worth the paper it’s printed on. Value is not in money; it’s in the positive returns that an asset delivers over time. That return can come in additional currency, time, or energy.

Money Has Never Felt More Fake

Some excerpts:

“Money feels cold and mathematical and outside the realm of fuzzy human relationships. It isn’t,” he wrote. “Money is a made-up thing, a shared fiction. Money is fundamentally, unalterably social.”

Yes, as stated above. Money is actually stored time: the more money you have, the more time you’ve stored up. But it doesn’t extend your time on earth; it only allows you to trade it for more free time within the uncertain lifetime we all have.

GameStop has come to epitomize an era of meme investing, where ordinary investors are piling into stocks and cryptocurrencies and digital assets not necessarily because they believe in the underlying value of the thing they’re buying (though some do) but instead because it just seems like a thing to do. Dogecoin or NFTs or stock in theater chain AMC get popular online or in their social circles, and they turn around and think, why not?

Yes, the psychological nature of fiat money means that psychology can drive the prices of goods and services. This is especially true with speculation based on the greater fool theory of value.

Value is ultimately a story, one we tell to ourselves and to others. In the United States, we’ve convinced ourselves of the story of the dollar, which is backed by the full force of the US government. But it’s ultimately just a piece of paper. Cryptocurrencies and NFTs and AMC all come with their own stories, which, admittedly, can be on the kooky side.

Well, yes, it’s a story, but whether that story is truth or fiction is borne out by subsequent experience. Value is a function of a positive stream of desired “goods,” much like a bond delivers coupon payments (i.e., interest) every quarter. If the bond fails, well then, the story was fiction.
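The bond analogy can be made concrete. As a rough sketch (the face value, coupon rate, and discount rate below are hypothetical numbers chosen for illustration), a bond’s value is just the discounted sum of its expected coupon stream plus the return of principal; if the issuer defaults and the stream stops, the “story” of those future payments turns out to have been fiction:

```python
# Sketch: value as the discounted stream of future payments.
# All figures are hypothetical, for illustration only.

def bond_value(face, annual_coupon_rate, annual_discount_rate,
               years, payments_per_year=4):
    """Present value of a bond paying quarterly coupons plus principal at maturity."""
    coupon = face * annual_coupon_rate / payments_per_year
    r = annual_discount_rate / payments_per_year  # per-period discount rate
    n = years * payments_per_year                 # total number of coupon payments
    pv_coupons = sum(coupon / (1 + r) ** t for t in range(1, n + 1))
    pv_principal = face / (1 + r) ** n
    return pv_coupons + pv_principal

# A bond discounted at exactly its coupon rate trades at par:
print(round(bond_value(1000, 0.04, 0.04, 10), 2))  # 1000.0
```

Raise the discount rate above the coupon rate (say, because the market starts doubting the story) and the same promised stream is worth less than par today.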

There’s more to the current money landscape than dogecoin and meme stocks that makes the whole thing seem a little fake. The stock market soared during much of 2020 and 2021, even during the depths of the pandemic, making it hard not to wonder what the whole thing is for. The federal government was able to deliver a lot of money through monetary and fiscal relief to keep the markets — and regular people — afloat.

Yes, money printed by the Fed to monetize excess government borrowing is fake unless it is converted to real value through the conversion of time and energy into real goods and services. Paying people not to work is converting money into fake value that will evaporate in time.

“If it’s just a dot-com bubble, it sucks for the people who invested,” says Hilary Allen, a law professor at American University who specializes in financial regulation. “But if it’s 2008, then we’re all screwed, even those of us who aren’t investing, and that’s not fair. It really depends on who’s getting into this and how integrated it’s getting with the rest of the financial system.”

Well, Prof. Allen doesn’t quite get it. In the early 20th century, market meltdowns bankrupted the speculators, financiers, and rich who saw their assets devalued. That is no longer the case: just about everyone now has a stake in financial assets through their pensions, real estate, and income flows. We’re all “invested,” and those most hurt today are those without asset portfolios. The Fed protects the asset-rich. That’s why, when Mr. Market eventually ends this game, there will be nowhere to hide except far off the grid. Maybe that’s why the tech billionaires want to colonize outer space? Good luck.

Crony Crapitalism at its Worst

A brief look at the Citigroup case and court rulings from Matt Taibbi’s TK News on Substack:

Meet Jed Rakoff, the Judge Who Exposed the “Rigged Game”

“We have mass incarceration for the poor, and it’s totally hands-off for the rich, and that’s pretty hard to stomach.” Justice Jed Rakoff on his new book, and his famous challenge to the system

“Heads we win, tails you lose,” is not capitalism!

The Other

I find this book and its review oddly obtuse. The root cause remains mysterious? Perhaps the fact that these behaviors are documented across time, place, and culture might suggest a root cause in human nature? The author labels this xenophobia, but that is not the accurate word, because it suggests “otherism” is rooted in human psychology, whereas we observe the same behaviors in other species, especially pack animals like wolves, hyenas, and apes.

Instead I would attribute “otherism” to a natural survival instinct that sees the other as a possible threat, especially when the invasion involves rivalry over scarce resources. This would apply across many species that exhibit a sense of “insider vs. outsider” groups.

The difference with human society is that we are aware of the moral implications of ostracizing or persecuting the “other,” who are fellow homo sapiens. We also have a multitude of characteristics we can use to differentiate groups, such as skin color, race, ethnicity, language, gender, and cultural habits. In fact, this multitude implies that our current obsession with race may not be the most important factor. I would guess that several of these characteristics coalesce around the nuances of cultural antipathy. In other words, it may not be skin color that matters most; it’s just the most obvious.

For example, a black male who attends Ivy League schools and works on Wall St. can assimilate easily into mainstream society, and apparently can easily become President, whereas a black rapper who speaks urban dialect, sports tattoos, and sags his pants below his posterior has almost no chance of assimilating into the dominant cultural milieu, no matter how rich he is. This would apply, on a less obvious scale, to those who cannot fluently speak the dominant language of a society.

The challenge for a diverse society is to manage the cultural conflicts that arise from our differences. These conflicts cannot be managed with platitudes and bromides about tolerance, or by focusing solely on chosen identities. Unfortunately, this is where our author and reviewer end up: quoting polls about how people feel about national identity in Europe and the USA. It’s an odd comparison because the historical definition of being French or German is categorically different from that of being American. For centuries, people’s identities were defined by being born into a dominant local culture. The American experiment is a complete departure from that because it is a land of immigrants (and, in the case of slavery, involuntary servitude). The true difference between indigenous tribes and European settlers is really a matter of when they arrived on the continent. The struggle for dominance between insiders and outsiders is a global historical phenomenon, not just a North American one.

How can we meet this challenge of the “other” when globalization is turning us all into “others”? First, we must recognize that antipathy toward the other is partly driven by fear, and the fear may very well be rational. Fear of Middle Eastern terrorists touting the conquering mission of Islam is not an irrational fear. Neither is fear of an invasion of migrants across borders. The point is that rational fears can be overcome, but not by denying or condemning them.

Some have wrongly assumed that because nationalism can engender a negative attitude toward the other, the nation-state must be a detriment to peace and harmony. This is exactly wrong: nation-states with borders and defined governance are what prevent chaos and conflict by defining the rights to scarce resources. It is why the nation-state has been so durable over the last 400 years. In this respect, One Worldism makes no sense and is a dangerous flirtation.

Second, as this idea of the nation-state suggests, we need to understand that multiculturalism can be detrimental to a free, democratic society. All communities develop and maintain cultural norms and values that make it easier to live together in peace. Accepting the dominant values of the society we live in is merely to understand this; it is not an impediment to celebrating one’s own cultural heritage. America has been more successful than other developed democracies because being American is not defined by skin color, language, or race, but by the voluntary acceptance of the American credo of individual rights and freedoms. It is truly the melting pot. Anyone from anywhere in the world can adopt this spirit, even if they cannot transplant themselves. But this fact also underscores the importance of assimilation to the dominant values of a society’s culture, and the USA is no exception. In the USA we might classify these values according to the constitutional principles of liberty, justice, and law, as well as according to commonly accepted behavioral norms. It does not mean surrendering to any “other’s” cultural heritage, but merely accepting those attributes easily assimilated without sacrificing our individual identities.

We can see that uncontrolled borders with uncontrolled waves of migrants only undermine the goodwill people harbor for embracing the other. They create uncertainty, disruption of stable societal norms, and anxiety over scarce material resources. They also threaten the touchstones of national identity. Unfortunately, the southern border crisis is now something American society will have to manage, and it is not helped by wrongly attributing the problem to systemic racism. That is merely a tragic fallacy. A free, diverse society can embrace, and has embraced, a tolerant attitude toward newcomers, but a prudent pace of adaptation is crucial. No society can peacefully absorb a horde of migrants completely unassimilated to its cultural values and norms. That only invites chaos and conflict.

One can only pray that our national leaders in Washington D.C. wake up to these realities.

‘Of Fear and Strangers’ Review: The Others

Many of history’s most nightmarish episodes are rooted in humanity’s propensity for hatred of ‘The Other.’ But the root cause remains mysterious.

wsj.com/articles/of-fear-and-strangers-history-xenophobia-book-review-the-others-11634912374

By Adam Kuper Oct. 22, 2021 10:20 am ET

George Makari’s concern with xenophobia goes back to a childhood trauma. In 1974, at the age of 13, he was taken on a family visit to his parents’ native Beirut. Suddenly, the travelers found themselves caught in the midst of what would become a civil war. “To me, it was bizarre,” Dr. Makari recalls in “Of Fear and Strangers: A History of Xenophobia.” He continues: “All these bewildering sects were far more alike than different. All were Levantines who spoke the same dialect; all loved the same punning humor, devoured the same cuisine, abided by strict rules of hospitality, and approached any purchase as a three-act play: bargain, stage a walk-out, then settle. They were quick with proverbs and went agog when Fairuz sang. And yet, subtle distinctions in their identities now meant life or death.”

Of Fear and Strangers: A History of Xenophobia

By George Makari

W.W. Norton

Today, Dr. Makari, a psychiatrist, psychoanalyst and the director of Weill Cornell’s DeWitt Wallace Institute of Psychiatry, sees xenophobia as a threat to social peace, not only in the Middle East but also in Europe and North America, where recent political convulsions have been driven by a bristling hostility toward strangers and outsiders. Dr. Makari is clear that a lot of different impulses are often conflated here: “ethnocentrism, ultranationalism, racism, misogyny, sexism, anti-Semitism, homophobia, transphobia, or Islamophobia.” What might they have in common? “Is there any one term specific enough to not be meaningless, while broad enough to allow us to consider whatever common strands exist between these phenomena?” He thinks that there is: xenophobia. And if all these disorders are variants of the same affliction, then perhaps they have the same cause and might be susceptible to the same treatment.

Dr. Makari traces the invention of “xenophobia” to the 1880s, when psychiatrists came up with a variety of “phobias” apparently caused by traumatic experience. “Hydrophobia”—a fear of water—was an old term for rabies. There followed a rash of other phobias, from claustrophobia to my personal favorite, phobophobia—the fear of being frightened. (One commentator remarked that the number of phobias seemed limited only by an Ancient Greek dictionary.) Xenophobia entered a medical dictionary in 1916 as a “morbid dread of meeting strangers.”

Like many psychiatric classifications, early definitions of xenophobia covered too much ground. What began as a psychiatric diagnosis would soon be used to describe the fury with which colonized populations often turned on settlers. These settlers, in turn, would be accused of xenophobia by the critics of colonialism, as waves of migrations in the years leading up to World War I provoked fears of a loss of national identity.



In the U.S., three confrontations between different segments of the population proved formative. The first pitted the Puritans, who were themselves refugees from religious persecution, against Native Americans. The second was the forced migration and enslavement of millions of Africans by descendants of the country’s European settlers. The third was provoked by the migrants, first from Europe, then from Asia, who arrived after the Civil War largely for economic reasons.

Dr. Makari notes that in 1860, 60% of the white population in the U.S. was of British origin, while 35% were broadly classified as German. By 1914, after 20 million immigrants had passed through American ports, 11% of the white population had British roots, 20% German, 30% Italian and Hispanic, and 34% Slavic. The settled sense of identity enjoyed by established white American Protestants was threatened. There was, in particular, a panic about Chinese immigration, even though the number of arriving Chinese was relatively small. This led to the passage of the Chinese Exclusion Act in 1882, prohibiting the immigration of Chinese laborers. In 1892, 241 lynchings were recorded in America. Two-thirds of the victims were black; the remaining third were mostly Chinese and Italian. In 1908, the Harvard philosopher Josiah Royce asked: “Is it a ‘yellow peril,’ or ‘black peril,’ or perhaps, after all, is it not some form of ‘white peril’ which threatens the future of humanity in this day of great struggles and complex issues?”

Dr. Makari’s whirlwind historical survey tells a compelling story of racial and ethnic animosity, but he might have paid more attention to religious conflicts. Europe in the 16th and 17th centuries was torn by bloody wars between Catholics and Protestants, a feud that still festered in 20th-century Ireland. The Partition of India in 1947 was accompanied by violent Hindu-Muslim confrontations and the displacement of more than 10 million people. When communist Yugoslavia fell apart, Orthodox Christians and Muslims waged war in the Balkans. The Middle East is currently going through another cycle of Shiite-Sunni wars. Are these religious hatreds also to be considered xenophobia?

Then there are sometimes ferocious confrontations between political parties, or fratricidal quarrels between factions within parties. And what about all those brawling sports fans? So many apparently irrational fears and hatreds. Could they all possibly come down to the same psychic or social forces?

One idea is that there is something fundamentally human here. Early human groups competed for territory. All intruders were enemies. The more you feared and hated outsiders, the better your chances of survival. So xenophobia bestowed an evolutionary advantage. Sports fans are simply expressing inherited tribal instincts. Even babies are frightened by a strange face.

This is a popular one-size-fits-all explanation. But it is problematic. For one thing, anthropologists do not agree that constant strife was the norm during the 95% of human history when small nomadic bands lived by hunting and gathering. The Victorian anthropologist Edward Burnett Tylor said that early humans would have had to choose between marrying out or being killed out. When Europeans in the early 19th century made contact with surviving communities of hunter-gatherers, different bands were observed forming marriage alliances and trading partnerships that generally kept feuds from raging out of control.

In the aftermath of World War II and the Holocaust, however, a better explanation of mass hatreds was needed. The orthodox theory in American psychology at the time was behaviorism, which explained habitual attitudes and responses as the products of conditioning: Pavlov’s dogs salivated at the sound of a bell because they had been conditioned to recognize this as a cue for food. In the same sort of way, children are warned against strangers and so conditioned to fear others.

Less orthodox, but more influential in the long run, is the notion of projection. Each of us half-recognizes our shameful desires, infantile fears, aggressive impulses. Instead of dealing with them, we may accuse someone else of harboring those same feelings, cleansing ourselves by shifting the blame onto a scapegoat.

According to yet another analytic theory, the people most susceptible to collective paranoia are the children of strict and demanding fathers whom they feared and adored. Theodor Adorno, the lead author of the classic account “The Authoritarian Personality,” wrote that the typical subject “falls, as it were, negatively in love.” Cowed by the father-figure, he is masochistically submissive to authority and sadistically takes out his anger on the weak.

These psychoanalytic theories all seek to explain the personal traumas and particular pathologies of individuals. But how do whole populations come to share common anxieties and antipathies? In 1928, the sociologist Emory Bogardus published the landmark study “Immigration and Race Attitudes.” One of its disconcerting findings was that the most widely disliked people in the U.S. at the time were “Turks.” Though very few Americans had actually encountered a member of that group, they had heard about them. And what they had heard about was the massacre of Armenians in Turkey after World War I, which was presented in the press as a slaughter of Christians at the hands of Muslims.

It was this power of the media to shape popular sentiment that the journalist Walter Lippmann came to dread. An early supporter of American involvement in World War I, Lippmann had joined the Inter-Allied Propaganda Board in London. In 1922 he published “Public Opinion,” his study of “how public opinion is made.” In it, he borrowed a term from the printing press: stereotype. We all share ready-made ideas that facilitate snap judgments about people and situations. These stereotypes are crude but may be useful in a pinch. They save time and trouble.

Effective propaganda weaponizes stereotypes. Lippmann’s work inspired Sigmund Freud’s American nephew Edward Bernays, who set up the first public relations business. Bernays in turn influenced Joseph Goebbels, who made terrible use of his ideas. Social media now serves up propaganda on steroids.

Yet surely not everyone is gulled—at least not all the time. How then to explain what is going on when strangers are demonized? Dr. Makari suggests that some combination of these psychological and sociological theories may be cobbled together to guide our thinking. This is probably the best that we can manage at present. What then can be done to limit the damage? Here Dr. Makari is less helpful. He suggests that all will be well if society becomes more equal, open and informed. He might as well add that social media should be better regulated, and the public better equipped for critical thought. Failing that, we may have to relive these nightmares of collective hatred again and again for a long time to come.

Yet there are grounds for hope. A study released in May this year by the Pew Research Center reported that conceptions of national identity in the U.S. and Western Europe have recently become more inclusive. Compared with 2016, “fewer people now believe that to truly be American, French, German or British, a person must be born in the country, must be a Christian, has to embrace national customs, or has to speak the dominant language.” This may suggest that xenophobia waxes and wanes with recent events, and that politicians can fan or tamp down outbreaks of public fear and fury. Wise and prudent leaders really might spare us a great deal of trouble.

—Mr. Kuper, a specialist on the ethnography of Southern Africa, has written widely on the history of anthropology.

Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved. 

Abandon Identity Politics – Now.

This is an interesting study on political tolerance and intolerance across the US. Click the link to view the interactive maps. Some of the data seems obvious as the most partisan counties in America are also the least tolerant. Some of the data presents puzzles, like what’s going on in FL? And NY seems to have no data. But I think the general conclusion is correct: Identity politics destroys democracy. We need to end it now, one citizen at a time.

The Geography of Partisan Prejudice

theatlantic.com/politics/archive/2019/03/us-counties-vary-their-degree-partisan-prejudice/583072

A guide to the most—and least—politically open-minded counties in America

By Amanda Ripley, Rekha Tenjarla, and Angela Y. He

March 4, 2019

Editor’s note: The maps in this article have been corrected to address problems with two entries in the underlying data. People searching for some counties were shown different counties, and some saw information that didn’t match the county they’d searched for.


We know that Americans have become more biased against one another based on partisan affiliation over the past several decades. Most of us now discriminate against members of the other political side explicitly and implicitly—in hiring, dating, and marriage, as well as in judgments of patriotism, compassion, and even physical attractiveness, according to recent research.

But we don’t know how this kind of stereotyping varies from place to place. Are there communities in America that are more or less politically forgiving than average? And if so, what can we learn from the outliers?

To find out, The Atlantic asked PredictWise, a polling and analytics firm, to create a ranking of counties in the U.S. based on partisan prejudice (or what researchers call “affective polarization”). The result was surprising in several ways. First, while virtually all Americans have been exposed to hyper-partisan politicians, social-media echo chambers, and clickbait headlines, we found significant variations in Americans’ political ill will from place to place, regardless of party.

We might expect some groups to be particularly angry at their political opponents right now. Immigrants have been explicitly targeted by the current administration, for example; they might have the most cause for partisan bias right now. But that is not what we found.

In general, the most politically intolerant Americans, according to the analysis, tend to be whiter, more highly educated, older, more urban, and more partisan themselves. This finding aligns in some ways with previous research by the University of Pennsylvania professor Diana Mutz, who has found that white, highly educated people are relatively isolated from political diversity. They don’t routinely talk with people who disagree with them; this isolation makes it easier for them to caricature their ideological opponents. (In fact, people who went to graduate school have the least amount of political disagreement in their lives, as Mutz describes in her book Hearing the Other Side.) By contrast, many nonwhite Americans routinely encounter political disagreement. They have more diverse social networks, politically speaking, and therefore tend to have more complicated views of the other side, whatever side that may be.

We see this dynamic in the heat map. In some parts of the country, including swaths of North Carolina and upstate New York, people still seem to give their fellow Americans the benefit of the doubt, even when they disagree. In other places, including much of Massachusetts and Florida, people appear to have far less tolerance for political difference. They may be quicker to assume the worst about their political counterparts, on average. (For an in-depth portrait of one of the more politically tolerant counties in America, see our accompanying story on Watertown, New York.)

To do this assessment, PredictWise first partnered with Pollfish to run a nationwide poll of 2,000 adults to capture people’s feelings about the other party. The survey asked how people would feel if a close family member married a Republican or a Democrat; how well they think the terms selfish, compassionate, or patriotic describe Democrats versus Republicans; and other questions designed to capture sentiments about political differences.

Based on the survey results, Tobias Konitzer, the co-founder of PredictWise, investigated which demographic characteristics seemed to correlate with partisan prejudice. He found, for example, that age, race, urbanicity, partisan loyalty, and education did coincide with more prejudice (but gender did not). In this way, he created a kind of profile of contemporary partisan prejudice.

Next, Konitzer projected this profile onto the broader American population, under the assumption that people with similar demographics and levels of partisan loyalty, living in neighborhoods with comparable amounts of political diversity, tend to hold similar attitudes about political difference. He did this using voter files acquired by PredictWise from TargetSmart, a commercial vendor. Voter files are essentially data snapshots about all American adults, based on publicly available records of voter registration and turnout from past elections, along with data about neighborhood variables and demographic traits. In this way, PredictWise was able to rank all 3,000 counties in the country based on the estimated level of partisan prejudice in each place. “What I find most striking is that we find a good degree of variation,” Konitzer says. Some states, like Texas, show a real mix of prejudiced and nonprejudiced counties; whereas Florida is very consistent—and fairly prejudiced—from place to place.

Nationwide, if we disregard the smallest counties (which may be hard to pin down statistically, since they have fewer than 100,000 people), the most politically intolerant county in America appears to be Suffolk County, Massachusetts, which includes the city of Boston. In this part of the country, nine out of every 10 couples appear to share the same partisan leaning, according to the voter-file data. Eight out of every 10 neighborhoods are politically homogeneous. This means that people in Boston may have fewer “cross-cutting relationships,” as researchers put it. It is a very urban county with a relatively high education level. All these things tend to correlate with partisan prejudice.

We now assume that the other political side is much more extreme than it actually is, as Matthew Levendusky and Malhotra have found. In a 2012 survey, they found that Republicans rate fellow Republicans as more hard-line on taxes, immigration, and trade than they actually are; and Democrats rate Republicans as even further to the right.

These distortions lead us to make worse decisions. Most obviously, politicians refuse to compromise on things like border walls and budgets, even when it hurts the country. But regular people’s judgments get warped too. For example, parents are less likely to vaccinate their children when the other party’s president is in the White House, according to a 2019 working paper by the Stanford Ph.D. candidate Masha Krupenkin. Regardless of who is in power, mutual-fund managers are more likely to invest in funds handled by fellow partisans, a bias that does not lead to better returns.

The irony is that Americans remain in agreement on many actual issues. Eight out of 10 Americans think that political correctness is a problem; the same number say that hate speech is a concern too. Most Americans are worried about the federal budget deficit, believe abortion should be legal in some or all cases, and want stricter gun regulation. Nevertheless, we are more and more convinced that the other side poses a threat to the country. Our stereotypes have outpaced reality, as stereotypes tend to do.

By contrast, the North Country, in far upstate New York, just east of Lake Ontario, seems to be more accepting of political differences. The same seems to be true in parts of North Carolina, including Randolph, Onslow, and Davidson Counties. In these places, you are more likely to have neighbors who think differently than you do. You are also more likely to be married to someone from the other side of the aisle. It’s harder to caricature someone whom you know to be a complicated person.

Other research has also found that more educated and politically engaged people tend to be more politically prejudiced. But the PredictWise analysis also detected a correlation with urbanicity and life stage. Older Americans and people living in or near sizable cities, from Dallas, Texas, to Seattle, Washington State, seem to be more likely to stereotype and disdain people who disagree with them politically.

We don’t know what is causing what, unfortunately, as is often the case in sociological research. We just know that being older and living in or near a city seem to go along with partisan prejudice in general. This may be because, according to decades of research into how prejudice operates, humans are more likely to discriminate against groups of people with whom they do not have regular, positive interactions. (In Europe, some research suggests that anti-immigrant sentiments tend to be higher in people who live in homogeneous neighborhoods near—but not among—immigrants.)

And in America, people who live in cities (particularly affluent, older white people) can more easily construct work and home lives with people who agree with them politically. They may be cosmopolitan in some ways and provincial in others.

Americans now routinely guess one another’s partisan leanings based on what they eat, drive, and drink (Dunkin’ Donuts? Republican; Starbucks? Democrat), according to a working paper by the University of Pennsylvania Ph.D. candidate Hye-Yon Lee. And based on these unreliable cues, they say they’d be more or less likely to want to live, work, or hang out with one another.

We are now judging one another’s fundamental decency based on whether we eat at Chipotle or Chick-fil-A. This may seem silly—harmless, even. But it is uncomfortably reminiscent of stories from conflict zones abroad. In Northern Ireland, for example, an outsider visiting during the Troubles had no way to tell unionists and nationalists apart. They were pretty much all white Christians, after all. But the locals themselves routinely guessed one another’s identity based on their names, the spacing of their eyes, their sports jerseys, the color of their hair, their neighborhood, or even how much jewelry they wore­. This process came to be known as “telling.” If a reliable cue didn’t exist, people would make one up. It was a way to move about in the world in a time of profound tribalism, during which 3,600 people were killed.

In parts of America, it is markedly more uncomfortable to be perceived as a Democrat right now. In other places, it is very isolating to be outed as a Republican. To get a sense of these differences, we asked PredictWise to do two other rankings—this time reflecting the directional flow of partisan prejudice. The resulting maps reveal places where Democrats are the most dismissive of Republicans and vice versa.

In general, Republicans seem to dislike Democrats more than Democrats dislike Republicans, PredictWise found. We don’t know why this is, but this is not the only study to have detected an imbalance. For example, in a 2014 survey by the Pew Research Center, half of consistently conservative respondents said it was important for them to live in a place where most people share their political views—compared with just 35 percent of consistent liberals. But a more recent survey, conducted in December by The Atlantic and the Public Religion Research Institute, found that Democrats were the ones showing more ill will—with 45 percent saying they’d be unhappy if their child married a Republican (versus 35 percent of Republicans saying they’d be unhappy if their child married a Democrat). So it’s hard to know exactly what’s going on, but what’s clear is that both sides are becoming more hostile toward one another.

Conflict and protest are vital to democracy. But whenever people begin to caricature one another, anywhere in the world, predictable tragedies occur. Fixable problems do not get fixed. Neighbors become estranged, embittered, and sometimes violent. Everyone ends up worse off, sooner or later. “This is the great danger America faces,” Representative Barbara Jordan of Texas said in 1976. “That we will cease to be one nation and become instead a collection of interest groups: city against suburb, region against region, individual against individual. Each seeking to satisfy private wants.”

Partisan prejudice is different from other forms of prejudice. It is not yet embedded in all of our institutions, the way racism has been. But the evidence shows that it distorts our thinking, just like other kinds of prejudice. “Just like with race, the problem is that when people stereotype, they miss the variation within a group,” says Stanford University’s Neil Malhotra, who has researched political behavior for more than a decade.

Fundamentally, partisan prejudice is another way for one group of humans to feel superior to another. New research suggests that it is now more acceptable in some areas of life than racial prejudice. In a 2012 experiment, the political scientists Shanto Iyengar and Sean Westwood gave nearly 2,000 Americans implicit-bias tests and found that partisan bias was more widespread than racial bias. About 70 percent of Democrats and Republicans showed a reflexive bias for their own party.

Of course, it can be harder to tell someone’s political leanings than someone’s skin color. And it’s hard to develop an implicit-bias test that mimics realistic, everyday encounters. But when people think they can guess someone’s political leanings, they discriminate accordingly.

In a 2014 study, Karen and Thomas Gift at Duke University sent out 1,200 resumes, tweaking some to suggest a candidate with previous experience in a Democratic or Republican organization. And employers seemed to notice. In a conservative county in Texas, a Republican applicant had to submit about five resumes for each positive callback. By contrast, a Democratic applicant needed to submit seven resumes to get a callback. (And the Republican candidates had a similar disadvantage in a liberal California county.)

What makes this kind of prejudice unusual is that it is currently very easy to defend. What is wrong with discriminating against someone based on political values? After all, unlike race or sexuality, politics is something you choose. If you choose unwisely, maybe you deserve to be judged accordingly.

Yes and no. We have more choice over our politics than over our sexuality, without a doubt. But the vast majority of people follow their parents’ lead when it comes to party affiliation, just as they do with religion. In fact, some researchers have even found that political tendencies are significantly influenced by genetics, with identical twins sharing even more political opinions than fraternal twins.

Most people adopt a political team at a young age and very rarely change—regardless of whether they make more money or need more government help at different life stages. Political preferences are not rational or linear decisions, even though they feel that way. “People bind themselves into political teams that share moral narratives,” Jonathan Haidt writes in his book The Righteous Mind. “Once they accept a particular narrative, they become blind to alternative moral worlds.”

About four in 10 Americans identify as independent today, but even they pick sides. Most independents consistently lean either right or left in their voting behavior over time and tend to exhibit similar prejudices as people who claim a specific party.

As politics have become more about identity than policy, partisan leanings have become more about how we grew up and where we feel like we belong. Politics are acting more like religion, in other words.

This is partly because partisan identities have begun to line up with other identities, as Lilliana Mason describes in her book, Uncivil Agreement. Making assumptions about people’s politics based on their race or religiosity is easier than it was in the past. Black people get typed as Democrats; people who go to church on Sunday are assumed to be Republicans. (But as always, stereotypes still mask complexity: About half of black Americans go to church at least once a week, for example, a far higher rate than that of white Americans.)

In other words, partisan prejudice now includes a bunch of other prejudices, all wrapped up into one tangled mess. “Americans are really divided, but not in terms of policy; they’re divided in terms of identity,” Mason says. “And the more identities come into play, the more salient they are, the harder it will be to agree, even if policy positions shift.” Politics are becoming a proxy battle for other deep divisions that have almost nothing to do with environmental regulation or tax policies.

Hope is embedded in all these maps: This kind of prejudice is malleable. That is why it varies so much from place to place. By cultivating meaningful relationships across divides, by rewarding humility and curiosity over indignation and righteousness, people can live wiser, fuller lives. They can also learn to speak one another’s language, which means they might one day even change one another’s minds. This happens organically in some places, we now know. Maybe it’s time to think of these outliers as rare and interesting, worthy of our attention, before they become extinct.

Freedom = Choice + Autonomy + Protection

I came across this article in this past weekend’s WSJ. It discusses the transformation of work instigated by the pandemic lockdown. I have long maintained that people living in a free society crave the positive freedoms of choice and personal autonomy, subject to the negative freedom of security against the fear of risk and loss. The pandemic has brought that behavior to the fore with the demands for freedom to work when and where we want, with whom, and on what as the fulfillment of life’s meaning. At the same time we have demanded protection through government from healthcare risks we cannot control ourselves. Now that we have been forced to reexamine our lives and search for new meaning, we are mostly unwilling to give it up and go back to how we worked and lived before. This is social progress, because the post-industrial employment and career path was an economic imperative imposed on our human nature, and therefore faintly unnatural. We all felt it inside.

Within this new structural paradigm of freedom, people are free to imagine, create, share, and connect. This is the vision of tuka, the social network platform that connects the creative…

I reprint the article in full:

The Real Meaning of Freedom at Work

wsj.com/articles/the-real-meaning-of-freedom-at-work-11633704877

October 8, 2021

By Adam Grant

As the Covid-19 pandemic moves into a new phase, many companies have started insisting that we come back to the office full-time. In response, people are quitting their jobs in droves. Flexibility is now the fastest-rising job priority in the U.S., according to a poll of more than 5,000 LinkedIn members. More than half of Americans want their next job to be self-employed—some as entrepreneurs, others as freelancers in the gig economy or content curators in the creator economy.

When Covid untethered us from our offices, many people experienced new forms of flexibility, and the taste of freedom left us hungry for more. We started rethinking what we wanted out of work. But the Great Resignation is not a mad dash away from the office; it’s the culmination of a long march toward freedom. More than a decade ago, psychologists documented a generational shift in the centrality of work in our lives. Millennials were more interested in jobs that provided leisure time and vacation time than Gen Xers and baby boomers. They were less concerned about net worth than net freedom.

In a classic 1958 lecture, the philosopher Isaiah Berlin distinguished between two types of freedom. Negative liberty is freedom from obstacles and interference by others. Positive liberty is freedom to control your own destiny and shape your own life. If we want to maximize net freedom in the future of work, we need to expand both positive and negative liberty.

The debate about whether work should be in-person, remote-first or hybrid is too narrow. Yes, people want the freedom to decide where they work. But they also want the freedom to decide who they work with, what they work on and when they work. Real flexibility is having autonomy to choose your people, your purpose and your priorities.

Remote work has granted us some negative liberties. It can release employees from the manacles of micromanagers, the trap of traffic jams and the cacophony of open offices. But it has also created new constraints on time. Even before Covid, many people reported spending the majority of their work time in meetings and on emails. Once everyone was reachable around the clock, collaboration overload only got worse.

In a study led by economist Michael Gibbs, when more than 10,000 employees of a large Asian IT company started working from home during the pandemic, productivity fell even as working hours increased. The researchers didn’t measure the physical and emotional toll of Covid, but the data showed that people got less done because they had less time to focus. They were stuck in more group meetings and got interrupted more often.

To free people from these constraints, we need better boundaries. There’s evidence that working from home has been more stressful for “segmentors” who prefer to separate the different spheres of life than for “integrators” who are happy to blur the lines. Good segmentation policies allow people to commit to predictable time off that shields them from work intrusions into their lives. For example, the healthcare company Vynamic has a policy called “zzzMail” that discourages sending emails on nights and weekends.

We need boundaries to protect individual focus time too. On remote teams, it’s not the frequency of interaction that fuels productivity and creativity—it’s the intensity of interaction. In a study of virtual software teams by collaboration experts Christoph Riedl and Anita Woolley, the most effective and innovative teams didn’t communicate every hour. They’d spend several hours or days concentrating on their own work and then start communicating in bursts. With messages and bits of code flying back and forth, their collaborations were literally bursting with energy and ideas.

One effective strategy seems to be blocking quiet time in the mornings as a window for deep work, and then coming together after lunch. When virtual meetings are held in the afternoon, people are less likely to multitask—probably in part because they’ve been able to make progress on their own tasks. For the many workplaces rolling out hybrid schedules of one or two remote days each week, it might also help to have teams coordinate on-site days so they can do individual work at home and collaborate when they’re in the same room.

Over the past year and a half, we’ve discovered a new constraint of remote work: Zoom fatigue is real. Yes, turning off your self-view can make you less self-conscious, but it doesn’t remove the cognitive load of worrying about how other people will perceive you and trying to read their facial expressions. Turning the camera off altogether can help. In a summer 2020 experiment led by organizational psychologists Kristen Shockley and Allison Gabriel, when employees at a healthcare company had the freedom to turn their video off during virtual meetings, it reduced fatigue—especially for women and new hires, who generally face more pressure to monitor their image.

New research reveals that having voice-only conversations isn’t just less exhausting; there are times when it can be more effective. When two people working on a problem together only hear each other’s voices, they’re more likely to pause to listen to each other, which translates into more equal speaking time and smarter decisions. And if you’re trying to read someone’s emotions, you’re more accurate if you close your eyes or turn off the lights and just listen to their voice.

This doesn’t mean cameras should never be on. Seeing human faces can be helpful if you’re giving a presentation, building trust or trying to coordinate in a big group. But videos can also be a constraint—you don’t need them in every meeting. The most underused technology of 2021 might be the phone call.

In a world of rising inequality, remote work has released some restraints. Many working mothers have struggled during the pandemic, in large part because of the responsibility of child care when schools were closed. But research suggests that in normal circumstances, the option to work remotely is especially helpful for working mothers, giving them the flexibility to excel in their jobs. And working from home, Black employees have reported less stress. One survey found that 97% of Black knowledge workers currently working from home want to remain partially or fully remote for the foreseeable future.

But going remote runs the risk of limiting positive liberties. In a landmark 2014 experiment at a call center in China, a team led by economist Nicholas Bloom randomly assigned hundreds of employees to work from home. Although remote workers were 13% more productive, they were only half as likely to be promoted—likely because they didn’t have enough face time with senior managers.

It’s well documented that many managers mistake visibility for value and reward presence instead of performance. The very employees who gain freedom from constraints thanks to remote work may end up missing out on the freedom to develop their skills and advance their careers.

One source of positive liberty is the freedom to choose who we interact with and learn from. After more than 60,000 employees at Microsoft transitioned to remote work during the pandemic, researchers found that their personal networks became more siloed and static. There were fewer new connections between people, fewer bridges between teams and fewer real-time conversations within groups. That made it tougher to acquire and share knowledge.

To give people the freedom to learn, we need to work harder to open doors. In the summer of 2020, researchers teamed up with a large company that hired more than a thousand interns to work remotely in 16 cities. They found that scheduling “virtual water coolers”— informal meetings with senior managers—elevated interns’ satisfaction as well as their performance ratings and their odds of getting a return offer. Just three or four virtual meetings with senior managers was enough to open the door to learning, mentoring and trust. What if more leaders hosted virtual office hours?

Another source of positive liberty is the freedom to decide what work we do. A few years ago, I visited a California tomato paste company called Morning Star to understand how they’ve managed to sustain success for several decades without bosses. When you first arrive at Morning Star, you’re assigned the job of your predecessor. After a year, you’re invited to rewrite your job description, with two conditions. You have to explain how your revamped job will advance the company’s mission, and you have to get the people who work with you most closely to agree to it.

Organizational psychologists Amy Wrzesniewski and Jane Dutton call this “job crafting,” and it enables people to become active architects of their own tasks and interactions. Extensive research suggests that when employees have the flexibility to customize their work, they’re more effective, more satisfied and more likely to stay.

The biggest source of positive liberty may be the freedom to decide when and how much we work. If we’ve learned anything from the pandemic about going remote, it’s that people aren’t shirking from home—they’re working overtime. But the 40-hour workweek was not ordained from above; it’s a human invention that grew out of the Industrial Revolution. Anthropologists find that for more than 95% of human history, people enjoyed more leisure time than we do now. Generations of hunter-gatherers subsisted on 15-hour workweeks. When we started treating humans like machines, we began confusing time spent with value created.

At the Brazilian manufacturing company Semco, leaders noticed that when retirement finally gives people the freedom to pursue their passions for travel, sports, arts and volunteering, their health often stands in the way. So the company started a Retire-A-Little program, inviting workers to trade 10% of their salaries for Wednesdays off. They expected it to be popular with employees in their 50s, but it was actually employees in their 20s who jumped at the opportunity to trade money for time.

When people have the flexibility to work less, they often focus better and produce more. In the U.S. alone, researchers estimate that companies waste $100 billion a year paying for idle time. When Microsoft Japan tested a 4-day workweek, productivity climbed by 40% and costs declined. The Icelandic government tested reducing workweeks from 40 to 36 hours at the same pay in offices, hospitals and police stations over a four-year period. It found that well-being and work-life balance improved, while productivity was sustained across the board—and in some cases heightened.

Offering the freedom to work less is an opportunity to attract, motivate and retain talented people. From 2018 to 2021, the number of job postings offering a four-day workweek has tripled, but they are still less than one in 100 jobs. Along with shortening the workweek, it’s worth rethinking the workday. What if we finished at 3 p.m. so that working parents could be with their children when they came home from school? Would we see better results—and higher quality of life—in six focused hours than eight unfocused hours?

Flexible work is here to stay, but companies that resist it may not be. One of the biggest mistakes I saw companies make before Covid was failing to experiment with new forms of freedom. As employers contemplate a return to the workplace, a good place to start might be to ask people about the experiments they’ve run in the past year and a half and the ones they’d love to try moving forward. What old constraints should we try removing, and what new freedoms could we test?

Work isn’t just our livelihood. It can be a source of structure, belonging and meaning in our lives. But that doesn’t mean our jobs should dictate how we spend most of our waking hours. For several generations, we’ve organized our lives around our work. Our jobs have determined where we make our homes, when we see our families and what we can squeeze in during our downtime. It might be time to start planning our work around our lives.

Appeared in the October 9, 2021, print edition as ‘The Value of Liberty for Workers.’

The Fed Is Playing With Fire

The following WSJ article warns again of the excessive monetary policies of the Federal Reserve…

The Fed Is Playing With Fire

Clinging to an emergency policy after the emergency has passed, Chairman Powell courts asset bubbles.

By Christian Broda and Stanley Druckenmiller May 10, 2021 6:16 pm

With Covid uncertainty receding fast, and several quarters deep into the strongest recovery from any postwar recession, the Federal Reserve’s guidance continues to be the most accommodative on record, by a mile. Keeping emergency settings after the emergency has passed carries bigger risks for the Fed than missing its inflation target by a few decimal points. It’s time for a change.

The American economy is back to prerecession levels of gross domestic product and the unemployment rate has recovered 70% of the initial pandemic hit in only six months, four times as fast as in a typical recession. Normally at this stage of a recovery, the Fed would be planning its first rate hike. This time the Fed is telling markets that the first hike will happen in 32 months, 2½ years later than normal. In addition, the Fed continues to buy $40 billion a month in mortgages even as housing is clearly running out of supply. And the central bank still isn’t even thinking about ending $120 billion a month of bond purchases.

Not only is the recovery happening at record speed, excesses of fiscal policy are already visible. Consumers are spending like never before, construction is booming, and labor shortages are ubiquitous, thanks to direct government transfers. Two-thirds of all relief checks were sent after the vaccines were proved effective and the recovery was accelerating. Opportunistic politicians didn’t let the pandemic go to waste. Especially after the Trump years, Congress has decided to satisfy its long list of unmet desires.

Isn’t the Fed’s independence supposed to act as a counterbalance to these political whims?


The emergency conditions are behind us. Inflation is already at historical averages. Serious economists soundly rejected price controls 40 years ago. Yet the Fed regularly distorts the most important price of all—long-term interest rates. This behavior is risky, for both the economy at large and the Fed itself.

Future fiscal burdens will put the kind of political pressure on the central bank that hasn’t been seen in decades. The federal government has added 30% of GDP in extra fiscal deficits in only two years, right as the baby-boomer retirement wave is beginning to accelerate. The Congressional Budget Office projects that in 20 years almost 30% of all yearly fiscal revenues will have to be used solely to pay interest on government debt, up from a current level of 8%. More taxes simply won’t be enough to bridge the gap, so pressures to monetize the deficit will inevitably rise over the years. The Fed should be adapting policy today to minimize these risks.

The risks are no longer hypothetical. For decades Treasurys have been the preferred asset for foreigners looking to hedge global portfolios. It was therefore shocking and unprecedented that in the midst of last year’s stock-market meltdown and while the Cares Act was being debated, foreigners aggressively sold Treasurys. This was dismissed by the Fed as a problem in the plumbing of financial markets. Even after trillions spent to prop up the bond market, foreigners have continued to be net sellers. The Fed chooses to interpret this troubling sign as the result of technicalities rather than doubts about the soundness of current and past policies.

America’s deep divisions also make the central bank’s independence crucial. Fighting inequality and climate change are very far from the Fed’s central mission. There’s a reason central bankers are supposed to be unpopular. Inflation is often the result of a fragmented society that feels unrepresented by weak political leadership. Eventually, the choice between fiscal discipline—higher taxes or lower spending—and forcing the central bank’s hand becomes an easy one for politicians to make.

With these risks in mind, and with unambiguous evidence of a strong recovery, the Fed should be doing more than just reanchoring inflation expectations to a slightly higher level. Fed policy has enabled financial-market excesses. Today’s high stock-market valuations, the crypto craze, and the frenzy over special-purpose acquisition companies, or SPACs, are just a few examples of the response to the Fed’s aggressive policies. The central bank should balance rather than fuel asset prices. The pernicious deflationary episodes of the past century started not because inflation was too close to zero but because of the popping of asset bubbles.

With its narrow focus on inflation expectations, the Fed seems to be fighting the last battle. Just because the Fed hasn’t faced big trade-offs in recent decades doesn’t mean trade-offs aren’t coming or that they no longer exist. Chairman Jerome Powell needs to recognize the likelihood of future political pressures on the Fed and stop enabling fiscal and market excesses. The long-term risks from asset bubbles and fiscal dominance dwarf the short-term risk of putting the brakes on a booming economy in 2022.

Mr. Broda is a partner at Duquesne Family Office LLC, where Mr. Druckenmiller is chairman and CEO.

Do We Need a Social Welfare State?

One must evaluate all the trade-offs.

The following article in today’s NY Times asks the provocative question of whether we can afford a major shift to a social welfare state. One must also ask if the USA needs such a level of social welfare spending and what trade-offs it might impose. This is a question that must be answered through the democratic political process because the economic trade-offs are real.

See my comments inline below.

Can America Afford to Become a Major Social Welfare State?

nytimes.com/2021/09/15/opinion/biden-spending-plan-welfare.html

By N. Gregory Mankiw

September 15, 2021

In the reconciliation package now being debated in Washington, President Biden and many congressional Democrats aim to expand the size and scope of government substantially. Americans should be wary of their plans — not only because of the sizable budgetary cost, but also because of the broader risks to economic prosperity.

The details of the ambitious $3.5 trillion social spending bill are still being discussed, so it is unclear what it will end up including. In many ways, it seems like a grab bag of initiatives assembled from the progressive wish list. And it may be bigger than it sounds: Reports suggest that some provisions will arbitrarily lapse before the end of the 10-year budget window to reduce the bill’s ostensible size, even though lawmakers hope to extend those policies at a later date.

People of all ages are in line to get something: government-funded pre-K for 3- and 4-year-olds, expanded child credits for families with children, two years of tuition-free community college, increased Pell grants for other college students, enhanced health insurance subsidies, paid family and medical leave, and expansions in Medicare for older Americans. A recent Times headline aptly described the plan’s coverage as “cradle to grave.”

If there is a common theme, it is that when you need a helping hand, the government will be there for you. It aims to assist people who are struggling in our rough-and-tumble market economy. On its face, that instinct doesn’t sound bad. Many Western European nations have more generous social safety nets than the United States. The Biden plan takes a big step in that direction.

Can the United States afford to embrace a larger welfare state? From a narrow budgetary standpoint, the answer is yes. But the policy also raises larger questions about American values and aspirations, and about what kind of nation we want to be.

The issue Prof. Mankiw addresses here is whether the costs of such programs yield the benefits desired. There is a lot of talk on the left that Modern Monetary Theory demonstrates that deficits don’t constrain government spending, so politicians should spend whatever is needed to achieve whatever objective they choose. This is a bit of wishful fantasy. What matters economically and financially is whether such spending yields a greater return in terms of freedom and quality of life for society as a whole. If such spending merely increases the deficit but does not invest in the productivity of the economy, then it is a dead weight upon society. It’s not much different from one’s personal choice between buying a new car and investing in education: one must compare what each choice will yield in terms of financial freedom and happiness over the longer term.

The Biden administration has promised to pay for the entire plan with higher taxes on corporations and the very wealthy. But there’s good reason to doubt that claim. Budget experts, such as Maya MacGuineas, president of the Committee for a Responsible Federal Budget, are skeptical that the government can raise enough tax revenue from the wealthy to finance Mr. Biden’s ambitious agenda.

The United States could do what Western Europe does — impose higher taxes on everyone. Most countries use a value-added tax, a form of a national sales tax, to raise a lot of revenue efficiently. If Americans really want larger government, we will have to pay for it, and a VAT could be the best way.

The costs of an expanded welfare state, however, extend beyond those reported in the budget. There are also broader economic effects.

Arthur Okun, the former economic adviser to President Lyndon Johnson, addressed this timeless issue in his 1975 book, “Equality and Efficiency: The Big Tradeoff.” According to Mr. Okun, policymakers want to maximize the economic pie while slicing it equally. But these goals often conflict. As policymakers attempt to rectify the market’s outcome by equalizing the slices, the pie tends to shrink.

Mr. Okun explains the trade-off with a metaphor: Providing a social safety net is like using a leaky bucket to redistribute water among people with different amounts. While bringing water to the thirstiest may be noble, it is also costly as some water is lost in transit. 

In the real world, this leakage occurs because higher taxes distort incentives and impede economic growth. And those taxes aren’t just the explicit ones that finance benefits such as public education or health care. They also include implicit taxes baked into the benefits themselves. If benefits decline as income rises, people are discouraged from working. This implicit tax distorts incentives just as explicit taxes do. That doesn’t mean there is no point in trying to help those in need, but it does require being mindful of the downsides of doing so.
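The implicit-tax point can be made concrete with a toy calculation. The numbers below (a 20 percent explicit tax and a benefit that shrinks 30 cents for each extra dollar earned) are hypothetical, chosen only for illustration; they are not from the article.

```python
# Toy illustration with hypothetical rates: an explicit income tax plus a
# benefit phase-out combine into one effective marginal tax rate on work.

def effective_marginal_rate(explicit_tax_rate, benefit_phaseout_rate):
    """Share of each extra dollar lost to taxes plus benefit reductions."""
    return explicit_tax_rate + benefit_phaseout_rate

# Hypothetical: 20% explicit tax; benefits fall 30 cents per extra dollar.
rate = effective_marginal_rate(0.20, 0.30)

extra_earnings = 1000.0
kept = extra_earnings * (1 - rate)  # take-home from an extra $1,000 earned

print(f"effective marginal rate: {rate:.0%}")
print(f"kept from an extra $1,000: ${kept:.0f}")
```

Under these assumed rates, a worker keeps only half of each additional dollar, even though the visible tax is just 20 percent. This is the sense in which a phase-out "distorts incentives just as explicit taxes do."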

Yes, we must reconcile the trade-off, but I would also characterize it as the freedom and liberty to pursue one’s personal happiness versus the individual economic security promised by the collective. The fulfillment of that promise is often costlier than anticipated, and the benefits disappointing.

Which brings us back to Western Europe. Compared with the United States, G.D.P. per person in 2019 was 14 percent lower in Germany, 24 percent lower in France and 26 percent lower in the United Kingdom.

Economists disagree about why European nations are less prosperous than the United States. But a leading hypothesis, advanced by Edward Prescott, a Nobel laureate, in 2003, is that Europeans work less than Americans because they face higher taxes to finance a more generous social safety net.

In other words, most European nations use that leaky bucket more than the United States does and experience greater leakage, resulting in lower incomes. By aiming for more compassionate economies, they have created less prosperous ones. Americans should be careful to avoid that fate.

The point, of course, is not that leisure time is undesirable but that people can choose how they invest their time and energy, rather than have state policy reward or penalize that choice arbitrarily. In a free and just society, this choice should be left to the individual. Liberty and security are not mutually exclusive goals.

Compassion is a virtue, but so is respect for those who are talented, hardworking and successful. Most Americans are descended from immigrants, who left their homelands to find freedom and forge their own destinies. Because of this history, we are more individualistic than Europeans, and our policies rightly reflect that cultural difference.

That is not to say that the United States has already struck the right balance between compassion and prosperity. It is a continuing tragedy that children are more likely to live in poverty than the overall population. That’s why my favorite provision in the Biden plan is the expanded child credit, which would reduce childhood poverty. (I am also sympathetic to policies aimed at climate change, which is an entirely different problem. Sadly, the Biden plan misses the opportunity to embrace the best solution — a carbon tax.)

But the entire $3.5 trillion package is too big and too risky. The wiser course is to take more incremental steps rather than to try to remake the economy in one fell swoop.

Actually, I would suggest that the choice between liberty and security is a false one, and that the assumption that security can only be provided to the individual by the state is also false. The leftist assumption is that the state must intervene to redistribute wealth after the fact, when instead we can design policies that empower citizens to share in the distribution of that wealth by participating in the risk-taking venture before the fact. Then the distribution of resources in society will mostly take care of itself. As it is now, and with this social welfare expansion, we prevent most of the individuals who need to participate from participating, forcing them to depend on the largesse of the state or the dictates of the market. This is hardly optimal in the search for liberty and justice. In light of my preference to preserve my liberty and take care of my own security, my answer to Prof. Mankiw’s question would be NO.

N. Gregory Mankiw is a professor of economics at Harvard. He was the chairman of the Council of Economic Advisers under President George W. Bush from 2003 to 2005.