Pondering National Governance

This is a recent article published in the NY Times. Making sense of our answers to its central question requires some ideological and historical clarity. [Blog comments appear in brackets.]

Is the United States Too Big to Govern?

By Neil Gross

May 11, 2018

Last month the Pew Research Center released a poll showing that Americans are losing faith in their system of government. Only one-fifth of adults surveyed believe democracy is working “very well” in the United States, while two-thirds say “significant changes” are needed to governmental “design and structure.” [Because nobody really knows what these words mean, or people disagree about which of their many meanings apply, polling results are questionable indicators.]

The 2016 election is one explanation for these findings. Something is not right in a country where Donald Trump is able to win the presidency. [Well, that’s a selective value judgment – one could easily substitute in the names Hillary Clinton or Bernie Sanders. The point of a democratic society is that the people get to make those decisions and the people agree to abide by them or revolt. Are the people revolting against themselves or against their political representatives?]  

But here’s another possibility: What if trust in American democracy is eroding because the nation has become too big to be effectively governed through traditional means? With a population of more than 325 million and an enormously complex society, perhaps this country has passed a point where — no matter whom we elect — it risks becoming permanently dissatisfied with legislative and governmental performance. [There’s an implicit assumption here that the original intent of the founders is that some central authority should “govern” the affairs of the population and manage the national interest (“traditional means”?). This is probably half true in that a national interest must be represented as the sum of its many parts. We have a Federal government. What was not intended was an all-powerful Federal government.]

Political thinkers, worried about the problem of size, have long advocated small republics. Plato and Aristotle admired the city-state because they thought reason and virtue could prevail only when a polis was small enough that citizens could be acquaintances. Montesquieu, the 18th-century French political philosopher, picked up where the ancient Greeks left off, arguing for the benefits of small territories. “In a large republic,” he wrote, “the common good is sacrificed to a thousand considerations,” whereas in a smaller one the common good “is more strongly felt, better known, and closer to each citizen.” [I suspect Dunbar’s number is at work here.]

The framers of the United States Constitution were keenly aware of these arguments. As the political scientists Robert Dahl and Edward Tufte noted in their 1973 book, “Size and Democracy,” the framers embraced federalism partly because they thought that states were closer in scale to the classical ideal. Ultimately, however, a counterargument advanced by James Madison won the day: Larger republics better protected democracy, he claimed, because their natural political diversity made it difficult for any supersized faction to form and dominate. [With Federalism and the separation of powers and overlapping jurisdictions, I think the founders split the difference here.]

More than two centuries later, the accumulated social science suggests that Madison’s optimism was misplaced. Smaller, it seems, is better. [This is a false and impossible choice. When complex networks grow too large, they break up into smaller, more manageable pieces, but these smaller entities are vulnerable to competitive pressures. This is true in industrial organization, economic and financial markets, and digital and social networks. It also applies to social choice and governance. The founders’ idea was to create a coordinated network of states, counties, and municipalities to manage affairs at the appropriate jurisdictional level. National issues are the sole responsibility of a Federal government balanced by parochial interests. This would secure the strongest union to guarantee citizens’ rights and freedoms. As that task grows in complexity, the need for decentralization and coordination reasserts itself.]

There are clear economic and military advantages to being a large country. But when it comes to democracy, the benefits of largeness — defined by population or geographic area — are hard to find. Examining data on the world’s nations from the 19th century until today, the political scientists John Gerring and Wouter Veenendaal recently discovered that although size is correlated with electoral competition (in line with the Madisonian argument), there is no association between size and many other standard measures of democratic functioning, such as limits on executive power or the provision of human rights. [Another question raised here is what exactly we mean by democracy. Strictly speaking, democracy means government by the people, but popular democracy is a narrow offshoot of that definition. It also raises the question of what a government by the people is trying to accomplish. Our founders made it clear they thought it was life, liberty, and the pursuit of happiness. Note: the pursuit of happiness, not its guarantee.]

In fact, large nations turn out to have what the political scientist Pippa Norris has called “democratic deficits”: They don’t fully satisfy their citizens’ demands for democracy. [Again, what is that demand? Is it coherent?] For one thing, citizens in large nations are generally less involved in politics and feel they have less of a voice. [Are they unable to secure life, liberty and pursue happiness or do they just not like the results?] Voter turnout is lower. [Low voter turnout could mean that voters are happy with the status quo, or don’t believe voting matters to their individual fates.] According to the political scientist Karen Remmer, smaller-scale political entities encourage voting in ways large ones can’t by “creating a sense of community” and “enforcing norms of citizenship responsibility.” [Perhaps because they enjoy more intrinsic rewards to participation. This would suggest more localized control over politics.] In addition, small countries promote political involvement by leaning heavily on forms of direct democracy, like referendums or citizen assemblies. [This is a feature of scale. Direct democracy on a large scale can empower the tyranny of the popular majority because the effects are so far removed from that majority.]

A second problem is political responsiveness: The policies of large nations can be slow to change, even if change is needed and desired. In a book published last year, the sociologists John Campbell and John Hall compared the reactions to the 2007-2008 financial crisis in Denmark, Ireland, and Switzerland. These three small countries didn’t cause the crisis; a homegrown Irish housing bubble notwithstanding, the shock wave they dealt with came from America. But though the countries were economically vulnerable, Mr. Campbell and Mr. Hall observed, this vulnerability fostered unexpected resilience and creativity, generating in each nation “a sense of solidarity or ‘we-ness’” that brought together politicians, regulators, and bankers eager to do whatever was necessary to calm markets. [Again, a sense of “we-ness” is one of scale. Cultural homogeneity helps.] 

With the United States lacking the same sense of shared fate and vulnerability, American policymakers could organize only a tepid response, which helps explain why the recovery here was so slow. This theory sheds light as well on developments in environmental and social welfare policy, where it is increasingly common to find a complacent America lagging behind its smaller, more innovative peers. [Complexity plus centralization leads to sclerosis, which is why centralizing authority in a large, diverse, pluralist society may be unworkable.]

Finally, largeness can take a toll on citizen trust. The presence of a wide variety of social groups and cultures is the primary reason for this. Nearly all scholars who study country size recognize, as Madison did, that large nations are more socially heterogeneous, whether because they represent an amalgamation of different regions, each with its own ethnolinguistic, religious or cultural heritage; or because their economic vitality encourages immigration; or because population size and geographic spread promote the growth of distinctive subcultures; or because they have more differentiated class structures. [Agreed, which is why persuading a large, diverse population of the virtues of multiculturalism may actually be a detriment. I believe the original idea, or at least the one that prevailed in past influxes of cultural groups, was the melting pot of gradual, voluntary assimilation.]

It isn’t inevitable that a large amount of social variation would undermine trust. Well-governed societies like Canada address the issue by stitching diversity and multiculturalism into their national identities. Yet in the absence of cultural and institutional supports, heterogeneity and trust are frequently in tension, as different ways of life give rise to suspicion and animosity. Without at least a veneer of trust among diverse social groups, politics spirals downward. [This characterization of Canada seems counter-intuitive. Stitching ethnic diversity and multiculturalism into a national identity means that national identity must be based not on ethnicity, race, or diverse cultures but on universal principles and social contracts. In other words, on something called patriotism and fealty to the larger community, subsuming ethnic, racial, and cultural differences.]

The challenges of American largeness are here to stay. The task now is for individuals, civic organizations and institutions to commit themselves to building stronger communities and a renewed sense of shared responsibility and trust among different groups. Within the constraints of our nation’s size, we can create conditions for as much democracy as possible. [So, we converge on the idea that it is inevitable we decentralize power and assume the responsibility of self-governance? What then is the real political conflict of interest?]

Neil Gross is a professor of sociology at Colby College.



The Death of Text?

 

The following short essay was published in the NY Times feature called The Fate of the Internet. Frankly, it’s difficult to take these arguments too seriously, despite the transformative effects of technology.

Welcome to the Post-Text Future

by Farhad Manjoo, NY Times

I’ll make this short: The thing you’re doing now, reading prose on a screen, is going out of fashion. [Which means what? Its popularity is fading as a communication channel?]

We’re taking stock of the internet right now, with writers [Hmm, what’s a writer without a reader?] who cover the digital world cataloging some of the most consequential currents shaping it. If you probe those currents and look ahead to the coming year online, one truth becomes clear. The defining narrative of our online moment concerns the decline of text, and the exploding reach and power of audio and video. [Yes, but where does real “power” really reside? In cat videos and selfies? Those behind the curtain are really smiling.]

This multimedia internet has been gaining on the text-based internet for years. But last year, the story accelerated sharply, and now audio and video are unstoppable. The most influential communicators online once worked on web pages and blogs. They’re now making podcasts, Netflix shows, propaganda memes, Instagram and YouTube channels, and apps like HQ Trivia.

Consider the most compelling digital innovations now emerging: the talking assistants that were the hit of the holidays, Apple’s face-reading phone, artificial intelligence to search photos or translate spoken language, and augmented reality — which inserts any digital image into a live view of your surroundings.

These advances are all about cameras, microphones, your voice, your ears and your eyes.

Together, they’re all sending us the same message: Welcome to the post-text future. [No, they are welcoming us to the distractions of circuses. That’s what entertainment is.]

It’s not that text is going away altogether. Nothing online ever really dies, and text still has its hits — from Susan Fowler’s whistle-blowing blog post last year about harassment at Uber to #MeToo, text was at the center of the most significant recent American social movement.

Still, we have only just begun to glimpse the deeper, more kinetic possibilities of an online culture in which text recedes to the background, and sounds and images become the universal language.

The internet was born in text because text was once the only format computers understood. Then we started giving machines eyes and ears — that is, smartphones were invented — and now we’ve provided them brains to decipher and manipulate multimedia. [Yes, but civilization was not born with the ASCII character set. Computers are becoming clever TVs, but they still deliver a lot of trivia as content, and video formats probably amplify that. Perhaps we are seeing the trivialization of popular culture? Has it ever not been trivial?]

My reading of this trend toward video as a substitute for text is that it applies to certain types of media and content. Some commentators have adapted readily to YouTube channels to transmit knowledge and ideas, and the educational potential is only beginning to be tapped. But true power in the world of ideas belongs to those who can work with text to understand the abstract ideas that govern our world.

The question is, is technology turning us into sheep or shepherds? Because for sure, there are wolves out there.

As John Maynard Keynes wrote,

The ideas of economists and political philosophers, both when they are right and when they are wrong are more powerful than is commonly understood. Indeed, the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back…


Order vs. Chaos: How We Choose

(The Towers of San Gimignano)

Below is a thought-provoking essay by historian Niall Ferguson examining the fluid structure of societies that swing from hierarchies to decentralized networks.

Anyway, this is a subject dear to my heart, as it is the overriding theme of several of my fiction books. See interjections below…

In Praise of Hierarchy – The Wall Street Journal
https://apple.news/A3UEyEvI-SnuHNdt8fLLjzg (paywall)

The Saturday Essay
Established, traditional order is under assault from freewheeling, networked disrupters as never before. But society craves centralized leadership, too.

It is a truth universally acknowledged that we now live in a networked world, where everyone and everything are connected. The corollary is that traditional hierarchical structures—not only states, but also churches, parties, and corporations—are in various states of crisis and decline. Disruption, disintermediation, and decentralization are the orders of the day. Hierarchy is at a discount, if not despised.

Networks rule not only in the realm of business. In politics, too, party establishments and their machines have been displaced by crowdfunded campaigns and viral messaging. Money, once a monopoly of the state, is being challenged by Bitcoin and other cryptocurrencies, which require no central banks to manage them, only consensus algorithms.

But is all this wise? In all the excitement of the age of hyper-connection, have we perhaps forgotten why hierarchies came into existence in the first place? Do we perhaps overestimate what can be achieved by ungoverned networks—and underestimate the perils of a world without any legitimate hierarchical structure?

True, few dare shed tears for yesterday’s hierarchies. Some Anglophile viewers of “The Crown” may thrill at the quaint stratification of Elizabeth II’s England, but the nearest approximations to royalty in America have lately been shorn of their gilt and glamour. Political dynasties of the recent past have been effaced, if not humiliated, by the upstart Donald Trump, while Hollywood’s elite of exploitative men is in disarray. The spirit of the age is revolutionary; the networked crowd yearns to “smack down” or “shame” each and every authority figure.

Nevertheless, recent events have called into question the notion that all will be for the best in the most networked of all possible worlds. “I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place,” Evan Williams, a co-founder of Twitter, told the New York Times last May. “I was wrong about that.”

Far from being a utopia in which we all become equally empowered “netizens,” free to tweet truth to power, cyberspace has mutated into a nightmare realm of ideological polarization, extreme views and fake news. The year 2016 was the annus horribilis of the liberal internet, the year when the network platforms built in Silicon Valley were used not only by Donald Trump’s election campaign but also by the proponents of “Brexit” in the United Kingdom to ends that appalled their creators. In 2017, research (including some by Facebook itself) revealed the psychological harm inflicted by social media on young people, who become addicted to the network platforms’ incessant, targeted stimuli.

Most alarming was the morphing of cyberspace into Cyberia, not to mention the Cyber-caliphate: a dark and lawless realm where malevolent actors ranging from Russian trolls to pro-ISIS Twitter users could work with impunity to subvert the institutional foundations of democracy. As Henry Kissinger has rightly observed, the internet has re-created the human state of nature depicted by 17th-century English philosopher Thomas Hobbes, where there rages a war “of every man against every man” and life (like so many political tweets) is “nasty, brutish, and short.”

We should not be surprised. Neither history nor science predicted that everything would be awesome in a world of giant, online networks—quite the contrary. And now that it becomes clear that a networked world may be an anarchic world, we begin to see—as previous generations saw—the benefits of hierarchy.

The word hierarchy derives from ancient Greek (hierarchia, literally the “rule of a high priest”) and was first used to describe the heavenly orders of angels and, more generally, to characterize a stratified order of spiritual or temporal governance. Up until the 16th century, by contrast, the word “network” signified nothing more than a woven mesh made of interlaced thread.

For most of history, hierarchies dominated social networks, a relationship exemplified by the looming Gothic tower that overshadows the Tuscan town of Siena’s central piazza.


Siena’s torre

This is roughly how most people think about hierarchies: as vertically structured organizations characterized by centralized and top-down command, control and communication. Historically, they began with family-based clans and tribes, out of which more complicated and stratified institutions evolved: states, churches, corporations, empires.

The crucial incentive that favored hierarchical order was that it made the exercise of power more efficient. Centralizing control in the hands of the “big man” eliminated or at least reduced time-consuming arguments about what to do, which might at any time escalate into internecine conflict. The obvious defect of hierarchy—in the late-19th-century words of Lord Acton, “power tends to corrupt, and absolute power corrupts absolutely”—was not by itself sufficient to turn humanity away from the rule of “big men.”

There have been only two eras of enhanced connectedness, when new technology helped social networks gain the upper hand. The second is our own age. The first began almost exactly half a millennium ago, in 1517, and lasted for the better part of three centuries.


The epic story of chaos vs. order during the Savonarola-Machiavelli era, foreshadowing Martin Luther.

When the printing press empowered Martin Luther’s heresy, a network was born. Luther’s dream was of a “priesthood of all believers.” The actual result of the Reformation he inspired was not harmony, but 130 years of polarization and conflict. But it proved impossible to kill Protestant networks, even with mass executions. Hierarchy had to be restored in the form of the princely states whose power the Peace of Westphalia affirmed, but this restoration was fleeting.

Like the Reformation, the 18th-century Enlightenment was a network-driven phenomenon that challenged established authority. The amazing thing was how much further the tendrils of the Enlightenment extended: as far afield as Voltaire’s global network of correspondents, and into the depths of Bavaria, where the secret network known as the Illuminati was founded in 1776.

In Britain’s American colonies, Freemasonry was a key network that connected many of the Founding Fathers, including George Washington and the crucial “node” in the New England revolutionary network, Paul Revere.


Freemasons in today’s Washington, D.C.?

At the same time, the American revolutionaries—Franklin, Jefferson, Lafayette—had all kinds of connections to France, land of the philosophes. The problem in France was that the ideas that went viral were not just “liberty, equality and fraternity,” but also the principle that terror was justifiable against enemies of the people. The result was a descent into bloody anarchy.

 

Those who lived through the wars of the 1790s and early 1800s learned an important lesson that we would do well to relearn: unless one wishes to reap one revolutionary whirlwind after another, it is better to impose some kind of hierarchical order on the world and to give it some legitimacy. At the Congress of Vienna, the five great powers who defeated Napoleon agreed to establish such an order, and the “pentarchy” they formed provided a remarkable stability over the century that followed.

Just over 200 years later, we confront a similar dilemma. Those who favor a revolutionary world run by networks will end up not with the interconnected utopia of their dreams but with Hobbes’s state of nature, in which malign actors exploit opportunities to spread virus-like memes and mendacities. Worse, they may end up entrenching a new but unaccountable hierarchy. For here is a truth that is too often glossed over by the proponents of networked governance: Many networks are hierarchically structured.

Nothing illustrates this better than the way the internet has evolved from being an authentically distributed, decentralized network into one dominated by a few giant technology companies: Facebook, Amazon, Netflix and Alphabet’s Google—the so-called FANGs. This new hierarchy is motivated primarily by the desire to sell—above all, to sell the data that their users provide. Dominance of online advertising by Alphabet and Facebook, coupled with immunity from civil liability under legislation dating back to the 1990s, has created an extraordinary state of affairs. The biggest content publishers in history are regulated as if they are mere technology startups; they are a new hierarchy extracting rent from the network.

The effects are pernicious. According to the Pew Research Center, close to half of Americans now get their news from Facebook, whose incentive is to promote news that holds the attention of users, regardless of whether it is true or false, researched by professional journalists or cooked up by Russian trolls. Established publishers—and parties—were too powerful for too long, but is it really a better world if there are no authorities to separate real news from fake, or decent political candidates from rogues? The old public sphere had its defects, but the new one has no effective gatekeepers, so the advantage now lies not with leaders but with misleaders.

The alternative is that another pentarchy of great powers recognizes their common interest in resisting the threat posed by Cyberia, where jihadism and criminality flourish alongside cyberwarfare, to say nothing of nuclear proliferation. Conveniently, the architects of the post-1945 order created the institutional basis for such a new pentarchy in the form of the permanent members of the United Nations Security Council, an institution that retains the all-important ingredient of legitimacy, despite its gridlocked condition throughout the Cold War.

It is easy to be dismissive of the UNSC. Nevertheless, whether or not these five great powers can make common cause once again, as their predecessors did in the 19th century, is a great geopolitical question of our time. The hierarchical Chinese leader Xi Jinping likes to talk about a “new model of great power relations,” and it may be that the North Korean missile crisis will bring forth this new model. But the crucial point is that the North Korean threat cannot be removed by the action of networks. A Facebook group can no more solve it than a tweet storm or a hashtag.

Our age may venerate online networks, to the extent of making a company such as Facebook one of the most valuable in the world. Yet there is a reason why armies have commanding officers. There is a reason why orchestras have conductors. There is a reason why, at great universities, the lecturers are not howled down by social justice warriors. And there is a reason why the last great experiment in networked organization—the one that began with the Reformation—ended, eventually, with a restoration of hierarchy.

There is hope for hierarchies yet. “The Crown” is not mere fiction; the hierarchy of the monarchy has continued to elevate the head of the British state above party politics. In a similar way, the papacy remains an object of authority and veneration, despite the tribulations of the Roman Catholic Church. Revolutions repeatedly sweep the countries of the Middle East, yet the monarchies of the region have been the most stable regimes.

Even in the U.S., ground zero for disruptive networks, there still is respect for hierarchical institutions. True, just 32% of Americans still have “a great deal” or “quite a lot” of confidence in the presidency and 12% feel that way about Congress. But for the military the equivalent percentage is 72% (up from 50% in 1981), for the police it is 57%, for churches 41%, and for the Supreme Court 40%. By comparison, just 16% of Americans have confidence in news on the internet.

We humans have been designed by evolution to network—man is a social animal, of course—but history has taught us to revere hierarchy as preferable to anarchy, and to prefer time-honored hierarchs to upstart usurpers.

Mr. Ferguson’s new book, “The Square and the Tower: Networks and Power, from the Freemasons to Facebook,” will be published by Penguin Press on Jan. 16.

 

Finite and Infinite Games: the Internet and Politics

In 1986, James Carse, a scholar of religion and history, wrote a philosophical text titled Finite and Infinite Games. As he explained, there are two kinds of games. One could be called finite, the other infinite. A finite game is played for the purpose of winning, an infinite game for the purpose of continuing the play.

This simple distinction invites some profound thought. War is a finite game, as is the Super Bowl. Peace is an infinite game, as is the game of love. Finite games end with winners and losers, while infinite games seek perpetual play. Politics is a finite game; democracy, liberty, and justice are infinite games.

Life itself, then, could be considered a finite or infinite game depending on which perspective one takes. If ‘he who dies with the most toys wins,’ one is living in a finite game that ends with death. If one chooses to create an entity that lives beyond the grave, a legacy that perpetuates through time, then one is playing an infinite game.

One can imagine that we often play a number of finite games within an infinite game. This supports the idea of waging war in order to attain peace (though I wouldn’t go so far as saying it validates destroying the village in order to save it). The taxonomy also relates to the time horizon of one’s perspective in engaging in the game. In other words, are we playing for short-term gain or the long-term payoff?

I find Carse’s arguments compelling when I relate them to the new digital economy and how the digital world is transforming how we play certain games, especially those of social interaction and the monetization of value. That sounds a bit hard to follow, but what I’m referring to is the value of the information network (the Internet) as an infinite game.

I would value the internet according to its power to help people connect and share ideas. (I recently wrote a short book on this power called The Ultimate Killer App: The Power to Create and Connect.) The more an idea is shared, the more powerful and valuable it can be. In this sense, the internet is far more valuable than the sum of its various parts, and for it to end as the victim of a finite game would be a tragedy for all. So, I see playing on the information network as an infinite game.

The paradox is that most of the big players on the internet – the Googles, Facebooks, Amazons, etc. – are playing finite games on and with the network. In fact, they are using the natural monopoly of network dynamics to win finite games for themselves, reaping enormous value in the process. But while they are winning, many others are losing. Yes, we do gain in certain ways, but the redistribution of information and data power is leading to the redistribution of monetary gains and losses across the population of users. In many cases those gains and losses are redistributed quite arbitrarily.

For instance, let us take the disruption of the music industry, or the travel industry, or the publishing industry. One need not lament the fate of obsolete business models to recognize that for play to continue, players must have the possibility of adapting to change in order to keep the infinite game on course. Most musicians and authors believe their professions are DOA. What does that say for the future of culture?
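To make the phrase “natural monopoly of network dynamics” concrete, here is a toy simulation of my own. It is only an illustrative sketch, not drawn from Carse or from any of the platforms, and it assumes Metcalfe-style value that grows with the number of possible connections among users; the starting sizes and cohort numbers are hypothetical. Under that assumption, a network with even a modest early lead attracts each new cohort of users, and the gap compounds into a winner-take-all outcome.

```python
# Toy sketch (assumption: Metcalfe-style value ~ n*(n-1)/2) of how a small
# early lead in users compounds into a near-monopoly when newcomers join
# whichever network already offers them more connection value.

def value(users: int) -> int:
    """Proxy for network value: the number of possible pairwise connections."""
    return users * (users - 1) // 2

def simulate(rounds: int = 10, newcomers: int = 1000) -> tuple[int, int]:
    big, small = 1100, 1000  # hypothetical starting sizes; 'big' has a 10% lead
    for _ in range(rounds):
        # Each cohort of newcomers joins the network that is worth more to them.
        if value(big) >= value(small):
            big += newcomers
        else:
            small += newcomers
    return big, small

if __name__ == "__main__":
    big, small = simulate()
    print(f"users: {big} vs {small}")              # ends at 11100 vs 1000
    print(f"value: {value(big)} vs {value(small)}")
```

After ten rounds the larger network has absorbed every newcomer. Nothing about the smaller network’s quality changed, only its relative connection value, which is the arbitrary-feeling redistribution of gains and losses described above.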

Unfortunately, this disruption across the global economy wrought by digitization is being reflected in the chaotic politics of our times, mostly across previously stable developed democracies.

These economic and political developments don’t reflect much foresight, and one can only speculate about how the game plays out. But to relate it to current events, many of us are playing electoral politics as a finite game that has profound implications for the more important infinite game we should be playing.

 

Bonfire of the Humanities

 

This is how it starts…

This is sad, no matter how you see it. We start by burning the ideas (books) and end with burning those who adopt or even challenge those ideas. From the WSJ:

Bonfire of the Humanities

Christine Lagarde is the latest victim of the ritualistic burning of college-commencement heretics.

It’s been a long time coming, but America’s colleges and universities have finally descended into lunacy.

Last month, Brandeis University withdrew its plan to award an honorary degree at commencement to Somali-born feminist Ayaan Hirsi Ali, claiming that “Ms. Hirsi Ali’s record of anti-Islam statements” violates Brandeis’s “core values.”

This week higher education’s ritualistic burning of college-commencement heretics spread to Smith College and Haverford College.


On Monday, Smith announced the withdrawal of Christine Lagarde, the French head of the International Monetary Fund. And what might the problem be with Madame Lagarde, considered one of the world’s most accomplished women? An online petition signed by some 480 offended Smithies said the IMF is associated with “imperialistic and patriarchal systems that oppress and abuse women worldwide.” With unmistakable French irony, Ms. Lagarde withdrew “to preserve the celebratory spirit” of Smith’s commencement.

On Tuesday, Haverford College’s graduating intellectuals forced commencement speaker Robert J. Birgeneau to withdraw. Get this: Mr. Birgeneau is the former chancellor of UC Berkeley, the big bang of political correctness. It gets better.

Berkeley’s Mr. Birgeneau is famous as an ardent defender of minority students, the LGBT community and undocumented immigrants. What could possibly be wrong with this guy speaking at Haverford??? Haverfordians were upset that in 2011 the Berkeley police used “force” against Occupy protesters in Sproul Plaza. They said Mr. Birgeneau could speak at Haverford if he agreed to nine conditions, including his support for reparations for the victims of Berkeley’s violence.

In a letter, Mr. Birgeneau replied, “As a longtime civil rights activist and firm supporter of nonviolence, I do not respond to untruthful, violent verbal attacks.”

Smith president Kathleen McCartney felt obliged to assert that she is “committed to leading a college where differing views can be heard and debated with respect.” And Haverford’s president, Daniel Weiss, wrote to the students that their demands “read more like a jury issuing a verdict than as an invitation to a discussion or a request for shared learning.”

Mr. Birgeneau, Ms. McCartney, Mr. Weiss and indeed many others in American academe must wonder what is happening to their world this chilled spring.

Here’s the short explanation: You’re all conservatives now.

Years ago, when the academic left began to ostracize professors identified as “conservative,” university administrators stood aside or were complicit. The academic left adopted a notion espoused back then by a “New Left” German philosopher—who taught at Brandeis, not coincidentally—that many conservative ideas were immoral and deserved to be suppressed. And so they were.

This shunning and isolation of “conservative” teachers by their left-wing colleagues (with many liberals silent in acquiescence) weakened the foundational ideas of American universities—freedom of inquiry and the speech rights in the First Amendment.

No matter. University presidents, deans, department heads and boards of trustees watched or approved the erosion of their original intellectual framework. The ability of aggrieved professors and their students to concoct behavior, ideas and words that violated political correctness got so loopy that the phrase itself became satirical—though not so funny to profs denied tenure on suspicion of incorrectness. Offensive books were banned and history texts rewritten to conform.

No one could possibly count the compromises of intellectual honesty made on American campuses to reach this point. It is fantastic that the liberal former head of Berkeley should have to sign a Maoist self-criticism to be able to speak at Haverford. Meet America’s Red Guards.

Nazi book burning

These students at Brandeis, Smith, Haverford and hundreds of other U.S. colleges didn’t discover illiberal intolerance on their own. It is fed to them three times a week by professors of mental conformity. After Brandeis banned Ms. Hirsi Ali, the Harvard Crimson’s editors wrote a rationalizing editorial, “A Rightful Revocation.” The legendary liberal Louis Brandeis (Harvard Law, First Amendment icon) must be spinning in his grave.

Years ago, today’s middle-aged liberals embraced in good faith ideas such as that the Western canon in literature or history should be expanded to include Africa, Asia, Native Americans and such. Fair enough. The activist academic left then grabbed the liberals’ good faith and wrecked it, allowing the nuttiest professors to dumb down courses and even whole disciplines into tendentious gibberish.

The slow disintegration of the humanities into what is virtually agitprop on many campuses is no secret. Professors of economics and the hard sciences roll their eyes in embarrassment at what has happened to once respectable liberal-arts departments at their institutions. Like some Gresham’s Law for Ph.D.s, the bad professors drove out many good, untenured professors, and that includes smart young liberals. Most conservatives were wiped out long ago.

One might conclude: Who cares? Parents are beginning to see that this is a $65,000-a-year scam that won’t get their kids a job in an economy that wants quantification skills. Parents and students increasingly will flee the politicized nut-houses for apolitical MOOCs—massive open online courses.

Still, it’s a tragedy. The loonies are becoming the public face of some once-revered repositories of the humanities. Sic transit whatever.

(Giordano Bruno burned in Rome as a heretic.)

This is how it ends…
