The Other

I find this book and its review oddly obtuse. The root cause remains mysterious? Perhaps the fact that these behaviors are documented across time, place, and culture might suggest a root cause in human nature? The author labels this xenophobia, but that is not the accurate word because it suggests “otherism” is rooted in human psychology, whereas we observe the same behaviors in other species, especially pack animals such as wolves, hyenas, and apes.

Instead I would attribute “otherism” to a natural survival instinct that sees the other as a possible threat, especially when the encounter involves rivalry over scarce resources. This would apply across many species that exhibit a sense of “insider vs. outsider” groups.

The difference with human society is that we are aware of the moral implications of ostracizing or persecuting the “other,” who are fellow Homo sapiens. We also have a multitude of characteristics we can use to differentiate groups, such as skin color, race, ethnicity, language, gender, cultural habits, etc. In fact, this multitude implies that our current obsession with race may not be the most important factor. I would guess that several of these characteristics coalesce around the nuances of cultural antipathy. In other words, it may not be skin color that matters most; it is merely the most obvious marker.

For example, a black male who attends Ivy League schools and works on Wall St. can assimilate easily into mainstream society and apparently can easily become President, whereas a black rapper who speaks an urban dialect, sports tattoos, and sags his pants below his posterior has almost no chance of assimilating into the dominant cultural milieu, no matter how rich he is. This would apply, on a less obvious scale, to those, say, who cannot fluently speak the dominant language of a society.

The challenge for a diverse society is to manage the cultural conflicts that arise from our differences. These conflicts cannot be managed with platitudes and bromides about tolerance or by focusing solely on chosen identities. Unfortunately this is where our author and reviewer end up: quoting polls about how people feel about national identity in Europe and the USA. It’s an odd comparison because the historical definition of being French or German is categorically different from being American. For centuries people’s identities were defined by where they were born into a dominant local culture. The American experiment is a complete departure from that because it is a land of immigrants (and involuntary servitude in the case of slavery). The true difference between indigenous tribes and European settlers is really a matter of when they arrived on the continent. The struggle for dominance between insiders and outsiders is a global historical phenomenon, not just a North American one.

How can we meet this challenge of the “other” when globalization is turning us all into “others”? First, we must recognize that antipathy toward the other is partly driven by fear, and the fear may very well be rational. Fear of Middle Eastern terrorists touting the conquering mission of Islam is not an irrational fear. An invasion of migrants across borders is a rational fear. The point is that rational fears can be overcome, but not by denying or condemning them.

Some have wrongly assumed that because nationalism can engender a negative attitude toward the other, the nation-state must be a detriment to peace and harmony. This is exactly wrong: nation-states with borders and defined governance are precisely what prevent chaos and conflict by defining the rights to scarce resources. It is why the nation-state has been so durable over the last 400 years. In this respect, One Worldism makes no sense and is a dangerous flirtation.

Second, as this idea of the nation-state suggests, we need to understand that a multicultural society can be detrimental to a free, democratic one. All communities develop and maintain cultural norms and values that make it easier to live together in peace. Accepting the dominant values of the society we live in is merely to understand this and not an impediment to celebrating one’s own cultural heritage. America has been more successful than other developed democracies because being American is not defined by skin color, or language, or race, but by the voluntary acceptance of the American credo of individual rights and freedoms. It is truly the melting pot. Anyone from anywhere in the world can adopt this spirit, even if they cannot transplant themselves. But this fact also underscores the importance of assimilation to the dominant values of a society’s culture, and the USA is no exception. In the USA we might classify these values according to constitutional principles of liberty, justice and law as well as according to commonly accepted behavioral norms. It does not mean surrendering to any “other’s” cultural heritage, but merely accepting those attributes easily assimilated without sacrificing our individual identities.

We can see that uncontrolled borders with uncontrolled waves of migrants only undermine the goodwill people harbor for embracing the other. They create uncertainty, disruption of stable societal norms, and anxiety over scarce material resources. They also threaten the touchstones of national identity. Unfortunately, the southern border crisis is now something American society will have to manage, and it is not helped by wrongly attributing the problem to systemic racism. That is merely a tragic fallacy. A free, diverse society can embrace and has embraced a tolerant attitude toward newcomers, but a prudent pace of adaptation is crucial. No society can peacefully absorb a horde of migrants completely unassimilated to its cultural values and norms. Doing so only invites chaos and conflict.

One can only pray that our national leaders in Washington D.C. wake up to these realities.

‘Of Fear and Strangers’ Review: The Others

Many of history’s most nightmarish episodes are rooted in humanity’s propensity for hatred of ‘The Other.’ But the root cause remains mysterious.

wsj.com/articles/of-fear-and-strangers-history-xenophobia-book-review-the-others-11634912374

By Adam Kuper Oct. 22, 2021 10:20 am ET

George Makari’s concern with xenophobia goes back to a childhood trauma. In 1974, at the age of 13, he was taken on a family visit to his parents’ native Beirut. Suddenly, the travelers found themselves caught in the midst of what would become a civil war. “To me, it was bizarre,” Dr. Makari recalls in “Of Fear and Strangers: A History of Xenophobia.” He continues: “All these bewildering sects were far more alike than different. All were Levantines who spoke the same dialect; all loved the same punning humor, devoured the same cuisine, abided by strict rules of hospitality, and approached any purchase as a three-act play: bargain, stage a walk-out, then settle. They were quick with proverbs and went agog when Fairuz sang. And yet, subtle distinctions in their identities now meant life or death.”

Of Fear and Strangers: A History of Xenophobia

By George Makari

W.W. Norton

Today, Dr. Makari, a psychiatrist, psychoanalyst and the director of Weill Cornell’s DeWitt Wallace Institute of Psychiatry, sees xenophobia as a threat to social peace, not only in the Middle East but also in Europe and North America, where recent political convulsions have been driven by a bristling hostility toward strangers and outsiders. Dr. Makari is clear that a lot of different impulses are often conflated here: “ethnocentrism, ultranationalism, racism, misogyny, sexism, anti-Semitism, homophobia, transphobia, or Islamophobia.” What might they have in common? “Is there any one term specific enough to not be meaningless, while broad enough to allow us to consider whatever common strands exist between these phenomena?” He thinks that there is: xenophobia. And if all these disorders are variants of the same affliction, then perhaps they have the same cause and might be susceptible to the same treatment.

Dr. Makari traces the invention of “xenophobia” to the 1880s, when psychiatrists came up with a variety of “phobias” apparently caused by traumatic experience. “Hydrophobia”—a fear of water—was an old term for rabies. There followed a rash of other phobias, from claustrophobia to my personal favorite, phobophobia—the fear of being frightened. (One commentator remarked that the number of phobias seemed limited only by an Ancient Greek dictionary.) Xenophobia entered a medical dictionary in 1916 as a “morbid dread of meeting strangers.”

Like many psychiatric classifications, early definitions of xenophobia covered too much ground. What began as a psychiatric diagnosis would soon be used to describe the fury with which colonized populations often turned on settlers. These settlers, in turn, would be accused of xenophobia by the critics of colonialism, as waves of migrations in the years leading up to World War I provoked fears of a loss of national identity.

In the U.S., three confrontations between different segments of the population proved formative. The first pitted the Puritans, who were themselves refugees from religious persecution, against Native Americans. The second was the forced migration and enslavement of millions of Africans by descendants of the country’s European settlers. The third was provoked by the migrants, first from Europe, then from Asia, who arrived after the Civil War largely for economic reasons.

Dr. Makari notes that in 1860 60% of the white population in the U.S. was of British origin, while 35% were broadly classified as German. By 1914, after 20 million immigrants had passed through American ports, 11% of the white population had British roots, 20% German, 30% Italian and Hispanic, and 34% Slavic. The settled sense of identity enjoyed by established white American Protestants was threatened. There was, in particular, a panic about Chinese immigration, even though the number of arriving Chinese was relatively small. This led to the passage of the Chinese Exclusion Act in 1882, prohibiting the immigration of Chinese laborers. In 1892, 241 lynchings were recorded in America. Two-thirds of the victims were black; the remaining third were mostly Chinese and Italian. In 1908, the Harvard philosopher Josiah Royce asked: “Is it a ‘yellow peril,’ or ‘black peril,’ or perhaps, after all, is it not some form of ‘white peril’ which threatens the future of humanity in this day of great struggles and complex issues?”

Dr. Makari’s whirlwind historical survey tells a compelling story of racial and ethnic animosity, but he might have paid more attention to religious conflicts. Europe in the 16th and 17th centuries was torn by bloody wars between Catholics and Protestants, a feud that still festered in 20th-century Ireland. The Partition of India in 1947 was accompanied by violent Hindu-Muslim confrontations and the displacement of more than 10 million people. When communist Yugoslavia fell apart, Orthodox Christians and Muslims waged war in the Balkans. The Middle East is currently going through another cycle of Shiite-Sunni wars. Are these religious hatreds also to be considered xenophobia?

Then there are sometimes ferocious confrontations between political parties, or fratricidal quarrels between factions within parties. And what about all those brawling sports fans? So many apparently irrational fears and hatreds. Could they all possibly come down to the same psychic or social forces?

One idea is that there is something fundamentally human here. Early human groups competed for territory. All intruders were enemies. The more you feared and hated outsiders, the better your chances of survival. So xenophobia bestowed an evolutionary advantage. Sports fans are simply expressing inherited tribal instincts. Even babies are frightened by a strange face.

This is a popular one-size-fits-all explanation. But it is problematic. For one thing, anthropologists do not agree that constant strife was the norm during the 95% of human history when small nomadic bands lived by hunting and gathering. The Victorian anthropologist Edward Burnett Tylor said that early humans would have had to choose between marrying out or being killed out. When Europeans in the early 19th century made contact with surviving communities of hunter-gatherers, different bands were observed forming marriage alliances and trading partnerships that generally kept feuds from raging out of control.

In the aftermath of World War II and the Holocaust, however, a better explanation of mass hatreds was needed. The orthodox theory in American psychology at the time was behaviorism, which explained habitual attitudes and responses as the products of conditioning: Pavlov’s dogs salivated at the sound of a bell because they had been conditioned to recognize this as a cue for food. In the same sort of way, children are warned against strangers and so conditioned to fear others.

Less orthodox, but more influential in the long run, is the notion of projection. Each of us half-recognizes our shameful desires, infantile fears, aggressive impulses. Instead of dealing with them, we may accuse someone else of harboring those same feelings, cleansing ourselves by shifting the blame onto a scapegoat.

According to yet another analytic theory, the people most susceptible to collective paranoia are the children of strict and demanding fathers whom they feared and adored. Theodor Adorno, the lead author of the classic account “The Authoritarian Personality,” wrote that the typical subject “falls, as it were, negatively in love.” Cowed by the father-figure, he is masochistically submissive to authority and sadistically takes out his anger on the weak.

These psychoanalytic theories all seek to explain the personal traumas and particular pathologies of individuals. But how do whole populations come to share common anxieties and antipathies? In 1928, the sociologist Emory Bogardus published the landmark study “Immigration and Race Attitudes.” One of its disconcerting findings was that the most widely disliked people in the U.S. at the time were “Turks.” Though very few Americans had actually encountered a member of that group, they had heard about them. And what they had heard about was the massacre of Armenians in Turkey after World War I, which was presented in the press as a slaughter of Christians at the hands of Muslims.

It was this power of the media to shape popular sentiment that the journalist Walter Lippmann came to dread. An early supporter of American involvement in World War I, Lippmann had joined the Inter-Allied Propaganda Board in London. In 1922 he published “Public Opinion,” his study of “how public opinion is made.” In it, he borrowed a term from the printing press: stereotype. We all share ready-made ideas that facilitate snap judgments about people and situations. These stereotypes are crude but may be useful in a pinch. They save time and trouble.

Effective propaganda weaponizes stereotypes. Lippmann’s work inspired Sigmund Freud’s American nephew Edward Bernays, who set up the first public relations business. Bernays in turn influenced Joseph Goebbels, who made terrible use of his ideas. Social media now serves up propaganda on steroids.

Yet surely not everyone is gulled—at least not all the time. How then to explain what is going on when strangers are demonized? Dr. Makari suggests that some combination of these psychological and sociological theories may be cobbled together to guide our thinking. This is probably the best that we can manage at present. What then can be done to limit the damage? Here Dr. Makari is less helpful. He suggests that all will be well if society becomes more equal, open and informed. He might as well add that social media should be better regulated, and the public better equipped for critical thought. Failing that, we may have to relive these nightmares of collective hatred again and again for a long time to come.

Yet there are grounds for hope. A study released in May this year by the Pew Research Center reported that conceptions of national identity in the U.S. and Western Europe have recently become more inclusive. Compared with 2016, “fewer people now believe that to truly be American, French, German or British, a person must be born in the country, must be a Christian, has to embrace national customs, or has to speak the dominant language.” This may suggest that xenophobia waxes and wanes with recent events, and that politicians can fan or tamp down outbreaks of public fear and fury. Wise and prudent leaders really might spare us a great deal of trouble.

—Mr. Kuper, a specialist on the ethnography of Southern Africa, has written widely on the history of anthropology.

Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved. 

Freedom = Choice + Autonomy + Protection

I came across this article in this past weekend’s WSJ. It discusses the transformation of work instigated by the pandemic lockdown. I have long maintained that people living in a free society crave the positive freedoms of choice and personal autonomy, subject to the negative freedom of security against the fear of risk and loss. The pandemic has brought that behavior to the fore with the demands for freedom to work when and where we want, with whom, and on what as the fulfillment of life’s meaning. At the same time we have demanded protection through government from healthcare risks we cannot control ourselves. Now that we have been forced to reexamine our lives and search for new meaning, we are mostly unwilling to give it up and go back to how we worked and lived before. This is social progress, because the post-industrial employment and career path was an economic imperative imposed on our human nature, and therefore faintly unnatural. We all felt it inside.

Within this new structural paradigm of freedom, people are free to imagine, create, share, and connect. This is the vision of tuka, the social network platform that connects the creative…

I reprint the article in full:

The Real Meaning of Freedom at Work

wsj.com/articles/the-real-meaning-of-freedom-at-work-11633704877

October 8, 2021

By Adam Grant

As the Covid-19 pandemic moves into a new phase, many companies have started insisting that we come back to the office full-time. In response, people are quitting their jobs in droves. Flexibility is now the fastest-rising job priority in the U.S., according to a poll of more than 5,000 LinkedIn members. More than half of Americans want their next job to be self-employed—some as entrepreneurs, others as freelancers in the gig economy or content curators in the creator economy.

When Covid untethered us from our offices, many people experienced new forms of flexibility, and the taste of freedom left us hungry for more. We started rethinking what we wanted out of work. But the Great Resignation is not a mad dash away from the office; it’s the culmination of a long march toward freedom. More than a decade ago, psychologists documented a generational shift in the centrality of work in our lives. Millennials were more interested in jobs that provided leisure time and vacation time than Gen Xers and baby boomers. They were less concerned about net worth than net freedom.

In a classic 1958 lecture, the philosopher Isaiah Berlin distinguished between two types of freedom. Negative liberty is freedom from obstacles and interference by others. Positive liberty is freedom to control your own destiny and shape your own life. If we want to maximize net freedom in the future of work, we need to expand both positive and negative liberty.

The debate about whether work should be in-person, remote-first or hybrid is too narrow. Yes, people want the freedom to decide where they work. But they also want the freedom to decide who they work with, what they work on and when they work. Real flexibility is having autonomy to choose your people, your purpose and your priorities.

Remote work has granted us some negative liberties. It can release employees from the manacles of micromanagers, the trap of traffic jams and the cacophony of open offices. But it has also created new constraints on time. Even before Covid, many people reported spending the majority of their work time in meetings and on emails. Once everyone was reachable around the clock, collaboration overload only got worse.

In a study led by economist Michael Gibbs, when more than 10,000 employees of a large Asian IT company started working from home during the pandemic, productivity fell even as working hours increased. The researchers didn’t measure the physical and emotional toll of Covid, but the data showed that people got less done because they had less time to focus. They were stuck in more group meetings and got interrupted more often.

To free people from these constraints, we need better boundaries. There’s evidence that working from home has been more stressful for “segmentors” who prefer to separate the different spheres of life than for “integrators” who are happy to blur the lines. Good segmentation policies allow people to commit to predictable time off that shields them from work intrusions into their lives. For example, the healthcare company Vynamic has a policy called “zzzMail” that discourages sending emails on nights and weekends.

We need boundaries to protect individual focus time too. On remote teams, it’s not the frequency of interaction that fuels productivity and creativity—it’s the intensity of interaction. In a study of virtual software teams by collaboration experts Christoph Riedl and Anita Woolley, the most effective and innovative teams didn’t communicate every hour. They’d spend several hours or days concentrating on their own work and then start communicating in bursts. With messages and bits of code flying back and forth, their collaborations were literally bursting with energy and ideas.

One effective strategy seems to be blocking quiet time in the mornings as a window for deep work, and then coming together after lunch. When virtual meetings are held in the afternoon, people are less likely to multitask—probably in part because they’ve been able to make progress on their own tasks. For the many workplaces rolling out hybrid schedules of one or two remote days each week, it might also help to have teams coordinate on-site days so they can do individual work at home and collaborate when they’re in the same room.

Over the past year and a half, we’ve discovered a new constraint of remote work: Zoom fatigue is real. Yes, turning off your self-view can make you less self-conscious, but it doesn’t remove the cognitive load of worrying about how other people will perceive you and trying to read their facial expressions. Turning the camera off altogether can help. In a summer 2020 experiment led by organizational psychologists Kristen Shockley and Allison Gabriel, when employees at a healthcare company had the freedom to turn their video off during virtual meetings, it reduced fatigue—especially for women and new hires, who generally face more pressure to monitor their image.

New research reveals that having voice-only conversations isn’t just less exhausting; there are times when it can be more effective. When two people working on a problem together only hear each other’s voices, they’re more likely to pause to listen to each other, which translates into more equal speaking time and smarter decisions. And if you’re trying to read someone’s emotions, you’re more accurate if you close your eyes or turn off the lights and just listen to their voice.

This doesn’t mean cameras should never be on. Seeing human faces can be helpful if you’re giving a presentation, building trust or trying to coordinate in a big group. But videos can also be a constraint—you don’t need them in every meeting. The most underused technology of 2021 might be the phone call.

In a world of rising inequality, remote work has released some restraints. Many working mothers have struggled during the pandemic, in large part because of the responsibility of child care when schools were closed. But research suggests that in normal circumstances, the option to work remotely is especially helpful for working mothers, giving them the flexibility to excel in their jobs. And working from home, Black employees have reported less stress. One survey found that 97% of Black knowledge workers currently working from home want to remain partially or fully remote for the foreseeable future.

But going remote runs the risk of limiting positive liberties. In a landmark 2014 experiment at a call center in China, a team led by economist Nicholas Bloom randomly assigned hundreds of employees to work from home. Although remote workers were 13% more productive, they were only half as likely to be promoted—likely because they didn’t have enough face time with senior managers.

It’s well documented that many managers mistake visibility for value and reward presence instead of performance. The very employees who gain freedom from constraints thanks to remote work may end up missing out on the freedom to develop their skills and advance their careers.

One source of positive liberty is the freedom to choose who we interact with and learn from. After more than 60,000 employees at Microsoft transitioned to remote work during the pandemic, researchers found that their personal networks became more siloed and static. There were fewer new connections between people, fewer bridges between teams and fewer real-time conversations within groups. That made it tougher to acquire and share knowledge.

To give people the freedom to learn, we need to work harder to open doors. In the summer of 2020, researchers teamed up with a large company that hired more than a thousand interns to work remotely in 16 cities. They found that scheduling “virtual water coolers”— informal meetings with senior managers—elevated interns’ satisfaction as well as their performance ratings and their odds of getting a return offer. Just three or four virtual meetings with senior managers was enough to open the door to learning, mentoring and trust. What if more leaders hosted virtual office hours?

Another source of positive liberty is the freedom to decide what work we do. A few years ago, I visited a California tomato paste company called Morning Star to understand how they’ve managed to sustain success for several decades without bosses. When you first arrive at Morning Star, you’re assigned the job of your predecessor. After a year, you’re invited to rewrite your job description, with two conditions. You have to explain how your revamped job will advance the company’s mission, and you have to get the people who work with you most closely to agree to it.

Organizational psychologists Amy Wrzesniewski and Jane Dutton call this “job crafting,” and it enables people to become active architects of their own tasks and interactions. Extensive research suggests that when employees have the flexibility to customize their work, they’re more effective, more satisfied and more likely to stay.

The biggest source of positive liberty may be the freedom to decide when and how much we work. If we’ve learned anything from the pandemic about going remote, it’s that people aren’t shirking from home—they’re working overtime. But the 40-hour workweek was not ordained from above; it’s a human invention that grew out of the Industrial Revolution. Anthropologists find that for more than 95% of human history, people enjoyed more leisure time than we do now. Generations of hunter-gatherers subsisted on 15-hour workweeks. When we started treating humans like machines, we began confusing time spent with value created.

At the Brazilian manufacturing company Semco, leaders noticed that when retirement finally gives people the freedom to pursue their passions for travel, sports, arts and volunteering, their health often stands in the way. So the company started a Retire-A-Little program, inviting workers to trade 10% of their salaries for Wednesdays off. They expected it to be popular with employees in their 50s, but it was actually employees in their 20s who jumped at the opportunity to trade money for time.

When people have the flexibility to work less, they often focus better and produce more. In the U.S. alone, researchers estimate that companies waste $100 billion a year paying for idle time. When Microsoft Japan tested a 4-day workweek, productivity climbed by 40% and costs declined. The Icelandic government tested reducing workweeks from 40 to 36 hours at the same pay in offices, hospitals and police stations over a four-year period. It found that well-being and work-life balance improved, while productivity was sustained across the board—and in some cases heightened.

Offering the freedom to work less is an opportunity to attract, motivate and retain talented people. From 2018 to 2021, the number of job postings offering a four-day workweek has tripled, but they are still less than one in 100 jobs. Along with shortening the workweek, it’s worth rethinking the workday. What if we finished at 3 p.m. so that working parents could be with their children when they came home from school? Would we see better results—and higher quality of life—in six focused hours than eight unfocused hours?

Flexible work is here to stay, but companies that resist it may not be. One of the biggest mistakes I saw companies make before Covid was failing to experiment with new forms of freedom. As employers contemplate a return to the workplace, a good place to start might be to ask people about the experiments they’ve run in the past year and a half and the ones they’d love to try moving forward. What old constraints should we try removing, and what new freedoms could we test?

Work isn’t just our livelihood. It can be a source of structure, belonging and meaning in our lives. But that doesn’t mean our jobs should dictate how we spend most of our waking hours. For several generations, we’ve organized our lives around our work. Our jobs have determined where we make our homes, when we see our families and what we can squeeze in during our downtime. It might be time to start planning our work around our lives.

Appeared in the October 9, 2021, print edition as ‘The Value of Liberty for Workers.’

The UnFree

I’ve been watching a bit of the original TV miniseries on Amazon, The Underground Railroad, because I always enjoy learning something new and interesting from historical narratives. Just today I read this article on The Conversation which is a nice review of the motivations and intentions of the writers and director. It also provoked some thoughts I’ll share here.

The Conversation – The Underground Railroad

I was struck by the following quotes about the director’s intention to present “slaves not as objects who were acted upon, but as individuals who maintained identities and agency – however limited – despite their status as property.”

The reviewer goes on to say,

In the past three decades there has been a movement among academics to find suitable terms to replace “slave” and “slavery.”

In the 1990s, a group of scholars asserted that “slave” was too limited a term – to label someone a “slave,” the argument went, emphasized the “thinghood” of all those held in slavery, rendering personal attributes apart from being owned invisible.

This makes perfect sense and should seem obvious. However, I believe the misuse or overuse of the label “slavery” has happened through associating it solely with the African-American experience, whereas enslavement has been inflicted upon many individuals and peoples across the world and across history. For sure, this docudrama is a narrative of the experience of black slaves on the North American continent, but its universalism should not be lost in that singular application.

I have highlighted the ideas of personal “identities and agency” above because this is really what applies to all people regardless of race or ethnicity. It also struck me that the appropriate term we are looking for is “The Unfree.” Every individual and oppressed group can relate to the idea of being unfree, if not enslaved. When you are unfree, you are deprived of free choice, free will, free agency, and the outward self-dignity imbued in that truest sense of human freedom. Historically and currently this condition is usually the result of a gross imbalance of power and a certain pathology among those who impose their unequal power over others. The history of the unfree applies to the ancient story of Spartacus as well as to any employee today preyed upon by an unreasonable boss.

This status of the unfree also highlights the fundamental condition of human identity, which is freedom. Freedom is what delineates our identities and personal agency in our lives, and it is sufficient in itself. In recent decades this truth has been twisted a bit to suggest that our chosen identities establish and signal our freedom, when actually it is only our freedom that helps guarantee the free and open expression of our identities. For example, one can assert one’s identity as “non-binary,” and the freedom of self-expression under the law defends the right to whatever that might be, but one cannot force others to use the preferred pronoun; that compulsion is not within the power of the state or any other entity without violating the basic tenets of freedom.

This is important because politics can intervene with laws and enforcement to guarantee our freedoms, but it cannot define or defend our personal identities or our self-dignity. As The Underground Railroad narrative demonstrates, slavery could not deprive the unfree slaves of their identities and their self-dignity, unless the individual allowed it. The oppressors can take away physical freedom, humiliate, and even impose a death sentence, but they cannot take away the freedom to think freely and the self-dignity of the oppressed. We witness these truths again and again in the stories of Holocaust and Gulag survivors.

It is also interesting to note that ideologically the primacy of freedom as a value tends to delineate today’s liberals and conservatives, as noted by Jonathan Haidt in his studies of political identity. Liberty is the primary moral value to those who identify on the right, while fairness and human caring are the dominant values asserted by many on the left. Leftists might argue that one cannot be free in an unfair society, but that only means we have to focus on freedom as a precondition to fairness. The issue of slavery and the unfree, in universal world history as well as African American history, should enlighten us to the primary ordering of moral values: one cannot have fairness without the precondition of freedom, and without that precondition, fairness has no meaningful relation to our concepts of justice. (Unfortunately, this only hints at another discussion on the differences between fairness and justice, and the unnecessary qualifiers applied to the universal, singular idea of moral justice.)

Lastly, this rich portrayal of the unfree escaping the bonds that defined them by preserving and expressing their self-dignity and personal agency provides the correct lesson on the true legacy of the American experiment – not that one group of our forebears oppressed another, but that both evolved under a constitutional system of laws to continue progressing toward a society of true liberty and justice for all. We have not arrived, but we are on the right track.

The Deconstruction of the West

What concerns me most in the following article is the misguided notion that pan-nationalism and global citizenship have displaced the sovereign nation-state international system. The sovereign nation-state is all we have to manage global affairs in a representative, democratic, people-centered global society. Without it we are all vulnerable to constellations of power among political elites and authoritarians of all stripes.

Reprinted from The American Interest:

The Deconstruction of the West

ANDREW A. MICHTA

April 12, 2017

The greatest threat to the liberal international order comes not from Russia, China, or jihadist terror but from the self-induced deconstruction of Western culture.

To say that the world has been getting progressively less stable and more dangerous is to state the obvious. But amidst the volumes written on the causes of this ongoing systemic change, one key driver barely gets mentioned: the fracturing of the collective West. And yet the unraveling of the idea of the West has degraded our ability to respond with a clear strategy to protect our regional and global interests. It has weakened the NATO alliance and changed not just the global security calculus but now also the power equilibrium in Europe. If anyone doubts the scope and severity of the problem, he or she should ask why it has been so difficult of late to develop a consensus between the United States and Europe on such key issues as defense, trade, migration, and how to deal with Russia, China, and Islamic jihadists.

The problem confronting the West today stems not from a shortage of power, but rather from the inability to build consensus on the shared goals and interests in whose name that power ought to be applied. The growing instability in the international system is not, as some argue, due to the rise of China as an aspiring global power, the resurgence of Russia as a systemic spoiler, the aspirations of Iran for regional hegemony, or the rogue despotism of a nuclear-armed North Korea; the rise and relative decline of states is nothing new, and it doesn’t necessarily entail instability. The West’s problem today is also not mainly the result of the economic decline of the United States or the European Union, for while both have had to deal with serious economic issues since the 2008 meltdown, they remain the two largest economies in the world, whose combined wealth and technological prowess are unmatched. Nor is the increasing global instability due to a surge in Islamic jihadism across the globe, for despite the horrors the jihadists have wrought upon the peoples of the Middle East and North Africa, and the attendant anxiety now pervading Europe and America, they have nowhere near the capabilities needed to confront great powers.

The problem, rather, is the West’s growing inability to agree on how it should be defined as a civilization. At the core of the deepening dysfunction in the West is the self-induced deconstruction of Western culture and, with it, the glue that for two centuries kept Europe and the United States at the center of the international system. The nation-state has been arguably the most enduring and successful idea that Western culture has produced. It offers a recipe to achieve security, economic growth, and individual freedom at levels unmatched in human history. This concept of a historically anchored and territorially defined national homeland, having absorbed the principles of liberal democracy, the right to private property and liberty bound by the rule of law, has been the core building block of the West’s global success and of whatever “order” has ever existed in the so-called international order. Since 1945 it has been the most successful Western “export” across the globe, with the surge of decolonization driven by the quintessentially American precept of the right to self-determination of peoples, a testimony to its enduring appeal. Though challenged by fascism, Nazism, and communism, the West emerged victorious, for when confronted with existential danger, it defaulted to shared, deeply held values and the fervent belief that what its culture and heritage represented were worth fighting, and if necessary even dying, to preserve. The West prevailed then because it was confident that on balance it offered the best set of ideas, values, and principles for others to emulate.

Today, in the wake of decades of group identity politics and the attendant deconstruction of our heritage through academia, the media, and popular culture, this conviction in the uniqueness of the West is only a pale shadow of what it was a mere half century ago. It has been replaced by elite narratives substituting shame for pride and indifference to one’s own heritage for patriotism. After decades of Gramsci’s proverbial “long march” through the educational and cultural institutions, Western societies have been changed in ways that make social mobilization around the shared idea of a nation increasingly problematic. This ideological hollowing out of the West has been accompanied by a surge in confident and revanchist nationalisms in other parts of the world, as well as religiously inspired totalitarianism.

National communities cannot be built around the idea of collective shame over their past, and yet this is what is increasingly displacing a once confident (perhaps overconfident, at times) Western civilization. The increasing political uncertainty in Europe has been triggered less by the phenomenon of migration than it has by the inability of European governments to set baselines of what they will and will not accept. Over the past two decades Western elites have advocated (or conceded) a so-called “multicultural policy,” whereby immigrants would no longer be asked to become citizens in the true sense of the Western liberal tradition. People who do not speak the national language, do not know the nation’s history, and do not identify with its culture and traditions cannot help but remain visitors. The failure to acculturate immigrants into the liberal Western democracies is arguably at the core of the growing balkanization, and attendant instability, of Western nation-states, in Europe as well as in the United States.

Whether one gives the deconstruction of the Western nation-state the name of postmodernism or globalism, the ideological assault on this very foundation of the Western-led international system has been unrelenting. It is no surprise that a poorly resourced radical Islamic insurgency has been able to make such vast inroads against the West, in the process remaking our societies and redefining our way of life. It is also not surprising that a weak and corrupt Russia has been able to shake the international order by simply applying limited conventional military power. Or that a growing China casts an ever-longer shadow over the West. The greatest threat to the security and survival of the democratic West as the leader and the norm-setter of the international system comes not from the outside but from within. And with each passing year, the deconstruction of Western culture, and with it the nation-state, breeds more internal chaos and makes our international bonds across the West ever more tenuous.

Andrew A. Michta is the dean of the College of International and Security Studies at the George C. Marshall European Center for Security Studies. Views expressed here are his own.

Brexit: Failure of the Central State

This is the best article I’ve seen on Brexit. Basically we’re witnessing the failure of statism, politically and economically…and a desperate reassertion of the principles of democracy, sovereignty and freedom.

Brexit: A Very British Revolution

The vote to leave the EU began as a cry for liberty and ended as a rebuke to the establishment

By FRASER NELSON
The Wall Street Journal, June 24, 2016 4:33 p.m. ET

The world is looking at Britain and asking: What on Earth just happened? Those who run Britain are asking the same question.

Never has there been a greater coalition of the establishment than that assembled by Prime Minister David Cameron for his referendum campaign to keep the U.K. in the European Union. There was almost every Westminster party leader, most of their troops and almost every trade union and employers’ federation. There were retired spy chiefs, historians, football clubs, national treasures like Stephen Hawking and divinities like Keira Knightley. And some global glamour too: President Barack Obama flew to London to do his bit, and Goldman Sachs opened its checkbook.

And none of it worked. The opinion polls barely moved over the course of the campaign, and 52% of Britons voted to leave the EU. That slender majority was probably the biggest slap in the face ever delivered to the British establishment in the history of universal suffrage.

Mr. Cameron announced that he would resign because he felt the country had taken a new direction—one that he disagrees with. If everyone else did the same, the House of Commons would be almost empty. Britain’s exit from the EU, or Brexit, was backed by barely a quarter of his government members and by not even a tenth of Labour politicians. It was a very British revolution.

Donald Trump’s arrival in Scotland on Friday to visit one of his golf courses was precisely the metaphor that the Brexiteers didn’t want. The presumptive Republican presidential nominee cheerily declared that the British had just “taken back their country” in the same way that he’s inviting Americans to do—underscoring one of the biggest misconceptions about the EU referendum campaign. Britain isn’t having a Trump moment, turning in on itself in a fit of protectionist and nativist pique. Rather, the vote for Brexit was about liberty and free trade—and about trying to manage globalization better than the EU has been doing from Brussels.

The Brexit campaign started as a cry for liberty, perhaps articulated most clearly by Michael Gove, the British justice secretary (and, on this issue, the most prominent dissenter in Mr. Cameron’s cabinet). Mr. Gove offered practical examples of the problems of EU membership. As a minister, he said, he deals constantly with edicts and regulations framed at the European level—rules that he doesn’t want and can’t change. These were rules that no one in Britain asked for, rules promulgated by officials whose names Brits don’t know, people whom they never elected and cannot remove from office. Yet they become the law of the land. Much of what we think of as British democracy, Mr. Gove argued, is now no such thing.

Instead of grumbling about the things we can’t change, Mr. Gove said, it was time to follow “the Americans who declared their independence and never looked back” and “become an exemplar of what an inclusive, open and innovative democracy can achieve.” Many of the Brexiteers think that Britain voted this week to follow a template set in 1776 on the other side of the Atlantic.

Mr. Gove was mocked for such analogies. Surely, some in the Remain camp argued, the people who were voting for Leave—the pensioners in the seaside towns, the plumbers and chip-shop owners—weren’t wondering how they could reboot the Anglo-Scottish Enlightenment for the 21st century. Perhaps not, but the sentiment holds: Liberty and democracy matter. As a recent editorial in Der Spiegel put it, Brits “have an inner independence that we Germans lack, in addition to myriad anti-authoritarian, defiant tendencies.”

Mr. Cameron has been trying to explain this to Angela Merkel for some time. He once regaled the German chancellor with a pre-dinner PowerPoint presentation to explain his whole referendum idea. Public support for keeping Britain within the EU was collapsing, he warned, but a renegotiation of its terms would save Britain’s membership. Ms. Merkel was never quite persuaded, and Mr. Cameron was sent away with a renegotiation barely worthy of the name. It was a fatal mistake—not nearly enough to help Mr. Cameron shift the terms of a debate he was already well on the way to losing.

The EU took a gamble: that the Brits were bluffing and would never vote to leave. A more generous deal—perhaps aimed at allowing the U.K. more control over immigration, the top public concern in Britain—would probably have (just) stopped Brexit. But the absence of a deal sent a clear and crushing message: The EU isn’t interested in reforming, so it is past time to stop pretending otherwise.

With no deal, all Mr. Cameron could do was warn about the risks of leaving the EU. If Brits try to escape, he said, they’d face the razor wire of a recession or the dogs of World War III. He rather overdid it. Instead of fear, he seemed to have stoked a mood of mass defiance.

Mr. Obama also overdid it when he notoriously told the British that, if they opted for Brexit, they would find themselves “in the back of the queue” for a trade deal with the U.S. That overlooked a basic point: The U.K. doesn’t currently have a trade deal with the U.S., despite being its largest foreign investor. Moreover, no deal seems forthcoming: The negotiations between the U.S. and the EU over the trans-Atlantic Trade and Investment Partnership are going slowly, and the Brits involved in the talks are in despair.

Deals negotiated through the EU always move at the pace dictated by the most reluctant country. Italy has threatened to derail a trade deal with Australia over a spat about exports of canned tomatoes; a trade deal with Canada was held up after a row about Romanian visas. Brexit wasn’t a call for a Little England. It was an attempt to escape from a Little Europe.

Many British voters felt a similar frustration on security issues, where the EU’s leaders have for decades now displayed a toxic combination of hunger for power and incompetence at wielding it. When war broke out in the former Yugoslavia in 1991, the then-chair of the European Community’s Council of Ministers declared that this was “the hour of Europe, not the hour of the Americans—if one problem can be solved by the Europeans, it is the Yugoslav problem.” It was not to be.

Nor did the EU acquit itself much better in more recent crises in Ukraine and Libya. Field Marshal Lord Charles Guthrie, a former chief of the British military, put it bluntly last week: “I feel more European than I do American, but it’s absolutely unrealistic to think we are all going to work together. When things get really serious, we need the Americans. That’s where the power is.” Brits feel comfortable with this; the French less so.

Throughout the campaign, the Brexit side was attacked for being inward-looking, nostalgic, dreaming of the days of empire or refusing to acknowledge that modern nations need to work with allies. But it was the Brexiteers who were doing the hardest thinking about this, worrying about the implications of a dysfunctional EU trying to undermine or supplant NATO, which remains the true guarantor of European security.

In the turbulent weeks and months ahead, we can expect a loud message from the Brexiteers in the British government: The question is not whether to work with Europe but how to work with Europe. Alliances work best when they are coalitions of the willing. The EU has become a coalition of the unwilling, the place where the finest multilateral ambitions go to die. Britain’s network of embassies will now go into overdrive, offering olive branches in capital after capital. Britain wants to deal, nation to nation, and is looking for partners.

Even the debate about immigration had an internationalist flavor to it. Any member of any EU state has had the right to live and work in Britain; any American, Indian or Australian needs to apply through a painstaking process. Mr. Cameron’s goal is to bring net immigration to below 100,000 a year (it was a little over three times that at last count). So the more who arrive from the EU, the more we need to crack down on those from outside the EU. The U.K. government now requires any non-European who wants to settle here to earn an annual salary of at least £35,000 (or about $52,000)—so we would deport, say, a young American flutist but couldn’t exclude a Bulgarian convict who could claim (under EU human-rights rules) that he has family ties in the U.K.

To most Brits, this makes no sense. In a television debate last week, Mr. Cameron was asked if there was “anything fair about an immigration system that prioritizes unskilled workers from within the EU over skilled workers who are coming from outside the EU?” He had no convincing answer.

The sense of a lack of control over immigration to Britain has been vividly reinforced by the scenes on the continent. In theory, the EU is supposed to protect its external borders by insisting that refugees claim asylum in the first country they enter. In practice, this agreement—the so-called Dublin Convention—was torn up by Ms. Merkel when she recklessly offered to settle any fleeing Syrians who managed to make it over the German border. The blame here lies not with the tens of thousands of desperate people who subsequently set out; the blame lies with an EU system that has proven itself hopelessly unequal to such a complex and intensifying challenge. The EU’s failure has been a boon for the people-trafficking industry, a global evil that has led to almost 3,000 deaths in the Mediterranean so far this year.

Britain has been shielded from the worst of this. Being an island helps, as does our rejection of the ill-advised Schengen border-free travel agreement that connects 26 European countries. But the scenes on the continent of thousands of young men on the march (one of which made it onto a particularly tasteless pro-Brexit poster unveiled by Nigel Farage, the leader of the anti-immigration UK Independence Party) give the sense of complete political dysfunction. To many voters in Britain, this referendum was about whether they want to be linked to such tragic incompetence.

The economists who warned about the perils of Brexit also assure voters that immigration is a net benefit, its advantages outweighing its losses. Perhaps so, but this overlooks the human factor. Who loses, and who gains? Immigration is great if you’re in the market for a nanny, a plumber or a table at a new restaurant. But to those competing with immigrants for jobs, houses or seats at schools, it looks rather different. And this, perhaps, explains the stark social divide exposed in the Brexit campaign.

Seldom has the United Kingdom looked less united: London and Scotland voted to stay in the EU, Wales and the English shires voted to get out. (Scottish First Minister Nicola Sturgeon has already called a fresh vote on secession “highly likely.”) Some 70% of university graduates were in favor of the EU; an equally disproportionate 68% of those who hadn’t finished high school were against it. Londoners and those under age 30 were strongly for Remain; the northern English and those over 60 were strongly for Leave. An astonishing 70% of the skilled working class supported Brexit.

Here, the Brexit battle lines ought to be familiar: They are similar to the socioeconomic battles being fought throughout so many Western democracies. It is the jet-set graduates versus the working class, the metropolitans versus the bumpkins—and, above all, the winners of globalization against its losers. Politicians, ever obsessed about the future, can tend to regard those left unprotected in our increasingly interconnected age as artifacts of the past. In fact, the losers of globalization are, by definition, as new as globalization itself.

To see such worries as resurgent nationalism is to oversimplify. The nation-state is a social construct: Done properly, it is the glue that binds society together. In Europe, the losers of globalization are seeking the protection of their nation-states, not a remote and unresponsive European superstate. They see the economy developing in ways that aren’t to their advantage and look to their governments to lend a helping hand—or at least attempt to control immigration. No EU country can honestly claim to control European immigration, and there is no prospect of this changing: These are the facts that led to Brexit.

The pound took a pounding on the currency markets Friday, but it wasn’t alone. The Swedish krona and the Polish zloty were down by about 5% against the dollar; the euro was down 3%. The markets are wondering who might be next. In April, the polling firm Ipsos MORI asked voters in nine EU countries if they would like a referendum on their countries’ memberships: 45% said yes, and 33% said they’d vote to get out. A Pew poll recently found that the Greeks and the French are the most hostile to the EU in the continent—and that the British were no more annoyed with the EU than the Swedes, the Dutch and the Germans.

The Brexit campaign was led by Europhiles. Boris Johnson, the former London mayor turned pro-Brexit firebrand who now seems likely to succeed Mr. Cameron, used to live in Brussels and can give interviews in French. Mr. Gove’s idea of perfect happiness is sitting on a wooden bench listening to Wagner in an airless concert hall in Bavaria. Both stressed that they love Europe but also love democracy—and want to keep the two compatible. The Brexit revolution is intended to make that point.

Mr. Gove has taken to borrowing the 18th-century politician William Pitt’s dictum about how England can “save herself by her exertions and Europe by her example.” After Mr. Cameron departs and new British leadership arrives, it will be keen to strike new alliances based on the principles of democracy, sovereignty and freedom. You never know: That might just catch on.

Unicorns, Tooth Fairies, and Free Markets

The most frequent criticism of free markets lies in their comparison to unicorns, fairies, and leprechauns. In other words, they exist only in our imaginations and thus are unworthy of serious discussion. This is sheer nonsense. It is like denying the value of Plato’s ideal forms as a means of comparison and judgment. Democracy also does not exist in its pure, idealistic form; is it therefore a useless figment of our imaginations? I don’t believe so.

Free markets should be thought of as markets for free people, much like democracy is a political market for free people. In terms of exchange, people are free when buyer and seller can either mutually agree on an exchange or walk away. This freedom obtains best when there are many competing and comparable alternatives to any particular good or service. Voluntary action is also enhanced when the terms of the transaction are transparent to both parties. Some markets offer more and better options than others, and some are more transparent than others, so markets fall along a continuum from “free” to “unfree.” The whole thrust of free market theory is to point us in the right direction.

Oddly enough, the failure of unfree, or controlled, markets is often cited as proof that markets don’t work. This is like pulling the wheels off my car and then declaring that since it can’t go anywhere, cars are a poor form of transportation. Such arguments should be the butt of jokes, not serious debate. Some may qualify this argument by saying that certain markets are easier for narrow interests to manipulate or control, or are less transparent, and thus need to be regulated by a disinterested third party such as a government bureaucracy. But that just raises the more important question: what means will ensure that any particular market becomes freer?

Regulation vs. Competition

We can’t really answer this question without a careful analysis of behavioral incentives, of both the economic and political variety. It is widely accepted that economic and political actors pursue their narrow self-interests, with economic behavior driven by loss aversion and profit/utility maximization, and political behavior more influenced by power, status, and control. These behaviors dovetail in real people as we all seek to survive by pursuing wealth, power, and control over our own destinies. When we scrape beneath the surface, we find that survival is more about not losing (loss aversion) than about winning (big rewards).

We would assume from these incentives that most actors would like to manipulate markets to their personal advantage, so how do we constrain or redirect this?

Most people would look to contract law as the explanation for what keeps us honest, but that offers only a narrow understanding of how markets work. We make dozens, if not hundreds, of market transactions every day and very few ever reach that threshold where we feel the need to consult legal counsel or call our Congressperson in Washington. Instead, we rely on more efficient means, such as trust, reciprocity, the implicit value of repeat business, and competing alternatives to guide our choices.

This point about competing alternatives is crucial because while trust, reciprocity, and relationships help reduce the need for transparency, competition gives us freedom of choice. Anti-competitive monopolies are considered economic evils because they control the market for their product or service, so consumers must pay their price or go without. (This is likewise why we despise political tyranny.) Critics often deride market capitalism as a competitive, conflictual system, but that too is a myopic point of view. Markets foster cooperation as much as competition, and many of the transactions in economic markets are win-win positive-sum games rather than win-lose zero-sum games.

Think about it: Sellers compete among themselves in order to develop a long-term cooperative relationship with their buyers and suppliers. Ever wonder why a department store takes back that dress or pair of shoes you bought last week because you changed your mind? That doesn’t seem in their immediate profit interest, but it does when you consider how the retailer values repeat business against the freedom you have to take your business elsewhere.

It is not laws or regulatory watchdogs but open competition under accepted market rules that constrains most of our selfish economic behavior. In addition, market competitors have the biggest incentive to ensure all play by the established rules; thus they are the watchdogs. This implies the need for transparency. Third-party regulatory agencies are inadequate to the task of monitoring the multitude of transactions in markets, especially financial markets. For example, the banking industry is the most regulated industry in the US, and yet the financial crisis of “too big to fail” revealed how ineffective that regulation was. So the test should not be regulation OR competition, but regulation FOR competition. Financial markets in particular need to be open, transparent, and competitive to constrain behavior that risks the integrity of the financial system. In financial markets, failure is a big inducement for prudence.

For an illustrative case, consider the policy response to the financial crisis of 2008, the 2010 Dodd-Frank law. Under that legislation, “too big to fail” banks have gotten even bigger, while 1,500 community banks—the source of half of all loans to local businesses—reportedly have been destroyed. The remaining community banks have had to hire 50% more compliance staff just to keep up with the regulations. That means far less competition among lenders to serve borrowers and more concentrated finance that does not respond to the bankruptcy constraint. It means a far less efficient and just credit market and far more control centralized in a financial oligopoly seeking to influence the policymaking process in its favor. Through the practice of regulatory capture – where lobbyists “buy” politicians with campaign contributions to write policy that constrains their competitors – we’ve discovered too often that big government mostly works for big business, the powerful, the wealthy, and the well-connected, to the detriment of open and honest competition among free people.

One could make the same argument against health care reform under the Affordable Care Act. Has policy made the market more open, transparent, and competitive, or less? Health care provision is really about competition and abundance in the supply of health care rather than the price and distribution of access. With an abundance of competitive health care providers, price and access take care of themselves.

The most important argument in favor of markets is the crucial role they play in providing information feedback signals. Free markets provide the most accurate and essential signals consumers and producers need to make efficient economic decisions, such as which alternatives best satisfy their preferences, where to invest, how much to produce, and how to adapt to changing market conditions. These signals are embodied in prices and inventory quantities; without them, producers operate in the dark about what people want. Hayek was the first to point out that the absence of private exchange markets would make central planning under socialism untenable over time. He was right.

Market failures do exist and we don’t live in a world of idealized free markets. But in addressing those failures we should strive not to make the perfect the enemy of the good, because free markets support free people and that’s the bottom line.

Besides, it would be heartbreaking to admit to our children that this is the best we can do when it comes to unicorns and tooth fairies:


Time is Money?


Yes, but no. The truism is better stated in reverse: “Money is Time.” The difference, of course, is that time, not money, is the ultimate value. (The saying probably persists in its usual form because most people are confused about the ultimate value of life, and thus respond better to the admonition that they are wasting money, not just time.)

Time is egalitarian. It is the great equalizer because, over the course of a lifetime, an hour of time is worth the same to rich and poor alike, to the powerful and the powerless. Not equivalent as measured in the currency of money, but equivalent as measured in time value.

“Money is Time” is also probably the most profound statement one can make in economics, because, in theory, economics uses money as the true measure of the value of time.

Think about this a little more deeply. What explains the differences in value between a horse, a car, and an airplane? The difference in monetary value is explained by the efficiencies gained by a car over a horse, and by an airplane over both. A horse can get one rider from Los Angeles to New York in probably about 2-3 months. A car can get maybe five or six people from LA to NY in about 3 days. An airplane can get 300+ people across the continent in about five and a half hours. If we compute and compare the three options in terms of man-hours expended, we can see why an airplane is valued so much more highly than a horse.
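This comparison can be sketched as a back-of-the-envelope calculation; the figures are the rough illustrative ones above, not measured data:

```python
# Back-of-the-envelope comparison of coast-to-coast travel,
# using the rough, illustrative figures from the text.

options = {
    # mode: hours of travel time per person moved
    "horse":    75 * 24,  # ~2.5 months for a single rider
    "car":      3 * 24,   # ~3 days, shared among passengers
    "airplane": 5.5,      # ~5.5 hours across the continent
}

baseline = options["horse"]
for mode, hours in options.items():
    print(f"{mode:>8}: {hours:7.1f} h per traveler "
          f"({baseline / hours:5.0f}x better than the horse)")
```

On these numbers, the airplane costs each traveler roughly 5.5 hours against the horse’s 1,800, a gain of more than 300-fold in time, which is exactly the kind of difference the market prices in.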

One could see this just as simply by comparing the productivity (in terms of time) of a tractor vs. a plow horse, or a computer vs. a typewriter, or a smartphone vs. a telegraph. Technologies that allow us to make the most of our time are valued accordingly and displace less efficient technologies. And the time we gain is measured in monetary wealth.

This truism, that Money is Time, also has profound implications for how we control money as a measure of time. Money has been defined by its three functions: a unit of account, a store of value, and a medium of exchange. What money really does is tell us how much time value we have produced, saved, and stored up for future consumption. As such, money is merely an information signal that tells us if we are on the right track or not. If we are on the wrong track, being unproductive and wasteful, ultimately we have squandered time, not money.

I recently read a monograph by George Gilder, The New Information Theory of Money, that explores this relationship between time and money in depth. He observes that Neanderthal Man had the same natural resources that we have today, since all matter is conserved. Homo sapiens today is much wealthier because we live longer, spend less time working for food and shelter, and have much more opportunity for leisure and cultural pursuits. Our wealth is really a measure of how productive we have become with our time.

Gilder’s monograph analyzes what this means for our concepts of money. When we think of money as wealth, we come up with all sorts of schemes to increase the supply of money in order to increase wealth. When we consider the actions of the central banks for the past hundred years, we can see that this fallacy defines our misguided policies. This should be clear from the actions of the US Federal Reserve since the 2008 financial crisis, both leading up to that crisis and in reaction to it. Fed policy, referred to as Zero Interest Rate Policy (ZIRP) and Quantitative Easing (QE), has merely goosed the nominal prices of assets such as houses, collectibles, land, stocks, and bonds with the idea that more nominal wealth as measured in US$ will lead to greater productivity and real wealth as measured by the value of time.

It hasn’t quite worked that way. Why? Because the Fed is focused on managing inaccurate statistical measures of real wealth as denoted by GDP, money incomes, CPI price changes, etc. Policymakers focus on the monetary economy rather than the real economy because that’s what is measured by their statistical information. Perhaps it is the best proxy we have, but it is still a proxy.

You must ask yourself – are you richer in terms of time? More time for you and your family to spend as you see fit? Those few beneficiaries (the 1%) who have benefited directly from this misguided monetary policy can certainly answer yes, but for the aggregate body politic, the answer is no.

Money supply today is controlled by governments with their ability to expand and contract credit through the banking system. Thus, our monetary economies really operate according to the calculus of political power and influence. It is no accident that ZIRP taxes small savers in order to recapitalize large banks that made the bad loans that crashed the financial system. No wonder the majority of voters are disgruntled with the results.

Gilder explains that the true value of money (as opposed to wealth) is as an information signal that helps us be efficient and productive with our time. When we distort this information source (which is exactly what the Federal Reserve does when it manipulates interest rates), we can only become less efficient and productive. He notes that gold was a more accurate basis for money information because its value was a direct function of the time and effort it took to get it out of the ground. Governments and private actors could not easily manipulate its value.

He applies this reasoning to what he sees as an even better foundation for money: Bitcoin. Bitcoin is a digital currency that is “mined” by solving cryptographic puzzles that get more and more difficult over time. This means that for the supply of bitcoins to increase, we must become more and more efficient in terms of computing power; in other words, more productive with our time. You see, the more productive we are with our time, the greater the wealth the monetary information signal should represent.
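Bitcoin’s real protocol is considerably more involved (double SHA-256 over block headers, difficulty retargeting every 2016 blocks), but the core “proof-of-work” idea behind mining can be sketched as a toy hash puzzle. Everything here, including the `mine` function and its parameters, is illustrative rather than Bitcoin’s actual code:

```python
import hashlib

def mine(data: bytes, difficulty_bits: int) -> int:
    """Toy proof-of-work: find a nonce whose SHA-256 hash has the
    required number of leading zero bits. Each extra bit roughly
    doubles the expected work, which is the sense in which mining
    gets harder as the difficulty knob is turned up."""
    target = 1 << (256 - difficulty_bits)  # hashes below this value win
    nonce = 0
    while True:
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# A low difficulty keeps this demo fast; real Bitcoin difficulty is
# vastly higher and adjusts so blocks arrive about every ten minutes.
winning_nonce = mine(b"block header", difficulty_bits=12)
print("winning nonce:", winning_nonce)
```

The point of the sketch is that the money supply grows only when someone spends real computational effort, so the currency’s scarcity is anchored in work and time rather than in a policymaker’s discretion.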

Currently, governments face little constraint on how much money they can create, meaning there is little hard discipline being imposed on political power. This can only be a dangerous state of affairs, as we know that power corrupts and absolute power corrupts absolutely. As I mentioned in a previous post, floating fiat currencies were intended (à la Milton Friedman’s monetarism) to discipline politics, but have failed miserably to do so. In fact, they have achieved the opposite, creating more volatility and chaos in the global economy.

But with digital currencies, power, influence, and wealth have little or no sway over the supply of money, which means they cannot manipulate its value to suit their narrow interests. Of course, those who hold political or economic power are loath to surrender it, so we can expect powerful forces to oppose taking control of the money supply away from them. But a money that is directly connected to the value of time must be the most efficient and productive information signal, one that will increase the value of wealth measured in time while ensuring both liberty and justice for all. Remember, time is the great equalizer.

Digital currencies today are not yet developed to the point of replacing fiat currencies, but if this discussion captures your interest I would recommend reading up on technologies such as Bitcoin. Much of the literature focuses on the efficiency of a digital payment system, but the real payoff in throwing off the yoke of fiat currencies will be in terms of liberty, justice, and true egalitarian democracy.

Land of the Free?


Good to take note of these things…from the WSJ:

America’s Dwindling Economic Freedom

Regulation, taxes and debt knock the U.S. out of the world’s top 10.

By Terry Miller
World economic freedom has reached record levels, according to the 2014 Index of Economic Freedom, released Tuesday by the Heritage Foundation and The Wall Street Journal. But after seven straight years of decline, the U.S. has dropped out of the top 10 most economically free countries.

For 20 years, the index has measured a nation’s commitment to free enterprise on a scale of 0 to 100 by evaluating 10 categories, including fiscal soundness, government size and property rights. These commitments have powerful effects: Countries achieving higher levels of economic freedom consistently and measurably outperform others in economic growth, long-term prosperity and social progress. Botswana, for example, has made gains through low tax rates and political stability.

Those losing freedom, on the other hand, risk economic stagnation, high unemployment and deteriorating social conditions. For instance, heavy-handed government intervention in Brazil’s economy continues to limit mobility and fuel a sense of injustice.

It’s not hard to see why the U.S. is losing ground. Even marginal tax rates exceeding 43% cannot finance runaway government spending, which has caused the national debt to skyrocket. The Obama administration continues to shackle entire sectors of the economy with regulation, including health care, finance and energy. The intervention impedes both personal freedom and national prosperity.

But as the U.S. economy languishes, many countries are leaping ahead, thanks to policies that enhance economic freedom—the same ones that made the U.S. economy the most powerful in the world. Governments in 114 countries have taken steps in the past year to increase the economic freedom of their citizens. Forty-three countries, from every part of the world, have now reached their highest economic freedom ranking in the index’s history.

Hong Kong continues to dominate the list, followed by Singapore, Australia, Switzerland, New Zealand and Canada. These are the only countries to earn the index’s “economically free” designation. Mauritius earned top honors among African countries and Chile excelled in Latin America. Despite the turmoil in the Middle East, several Gulf states, led by Bahrain, earned designation as “mostly free.”

A realignment is under way in Europe, according to the index’s findings. Eighteen European nations, including Germany, Sweden, Georgia and Poland, have reached new highs in economic freedom. By contrast, five others—Greece, Italy, France, Cyprus and the United Kingdom—registered scores lower than they received when the index started two decades ago.

The most improved players are in Eastern Europe, including Estonia, Lithuania and the Czech Republic. These countries have gained the most economic freedom over the past two decades. And it’s no surprise: Those who have lived under communism have no trouble recognizing the benefits of a free-market system. But countries that have experimented with milder forms of socialism, such as Sweden, Denmark and Canada, also have made impressive moves toward greater economic freedom, with gains near 10 points or higher on the index scale. Sweden, for instance, is now ranked 20th out of 178 countries, up from 34th out of 140 countries in 1996.

The U.S. and the U.K., historically champions of free enterprise, have suffered the most pronounced declines. Both countries now fall in the “mostly free” category. Some of the worst performers are in Latin America, particularly Venezuela, Argentina, Ecuador and Bolivia. All are governed by crony-populist regimes pushing policies that have made property rights less secure, spending unsustainable and inflation ever more threatening.

Despite financial crises and recessions, the global economy has expanded by nearly 70% in 20 years, to $54 trillion in 2012 from $32 trillion in 1993. Hundreds of millions of people have left grinding poverty behind as their economies have become freer. But it is an appalling, avoidable human tragedy how many of the world’s peoples remain unfree—and poor.

The record of increasing economic freedom elsewhere makes it inexcusable that a country like the U.S. continues to pursue policies antithetical to its own growth, while wielding its influence to encourage other countries to chart the same disastrous course. The 2014 Index of Economic Freedom documents a world-wide race to enhance economic opportunity through greater freedom—and this year’s index demonstrates that the U.S. needs a drastic change in direction.

Mr. Miller is the director of the Center for International Trade and Economics at the Heritage Foundation.

Friedman on Freedom


Good quote by Milton Friedman:

Freedom is a rare and delicate plant. Our minds tell us, and history confirms, that the great threat to freedom is the concentration of power. Government is necessary to preserve our freedom, it is an instrument through which we can exercise our freedom; yet by concentrating power in political hands, it is also a threat to freedom. Even though the men who wield this power initially be of good will and even though they be not corrupted by the power they exercise, the power will both attract and form men of a different stamp.