Networks and Hierarchies

This is a review of British historian Niall Ferguson’s new book, The Square and the Tower: Networks, Hierarchies and the Struggle for Global Power. In this age of global communication networks, which might seem to herald the permanent dominance of networks over hierarchies, it is worth taking the long arc of history into account. That history, Ferguson argues, cautions us otherwise.

Ferguson identifies two predominant ages of networks. The first began with the advent of the printing press in 1452, which led to an explosion of networks across the world that lasted until around 1800. This was the Enlightenment period that helped transform economics, politics, and social relations.

Today, the second age of networks consumes us. It began around 1970 with microchip technology and continues to the present: the age of telecommunications, digital technology, and global networks. Ours is an age where it seems “everything is connected.”

Ferguson notes that, beginning with the invention of written language, all that has happened is that new technologies have facilitated our innate, ancient urge to network – in other words, to connect. This seems to affirm Aristotle’s observation that “man is a social animal,” as well as a large library of psychological behavioral studies over the past century. He also notes that most networks may follow a power law distribution and be scale-free. In other words, large networks grow larger and become more valuable as they do so. This means the rich get richer, and most social networks are profoundly inegalitarian. This implies that the Google-Amazon-Facebook-Apple (GAFA) oligarchy may be taking over the world, leaving the rest of us as powerless as feudal serfs.
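The “rich get richer” dynamic behind scale-free networks is usually modeled as preferential attachment: each newcomer links to an existing node with probability proportional to that node’s current number of connections. A minimal sketch (my own illustration, not from Ferguson’s book) shows how quickly connections concentrate in a few early hubs:

```python
import random
from collections import Counter

def preferential_attachment(n_nodes, seed=42):
    """Grow a network where each new node links to one existing node,
    chosen with probability proportional to its current degree."""
    rng = random.Random(seed)
    # Start with two connected nodes. Each node appears in `endpoints`
    # once per incident edge, so a uniform draw from this list is a
    # degree-proportional draw: the well-connected get more connected.
    endpoints = [0, 1]
    degree = Counter({0: 1, 1: 1})
    for new in range(2, n_nodes):
        target = rng.choice(endpoints)  # rich-get-richer step
        endpoints += [new, target]
        degree[new] += 1
        degree[target] += 1
    return degree

deg = preferential_attachment(10_000)
# Share of all connections held by the best-connected 1% of nodes.
top_share = sum(d for _, d in deg.most_common(100)) / sum(deg.values())
print(f"Top 1% of nodes hold {top_share:.0%} of all connections")
```

Even in this toy model the top 1% of nodes end up with a wildly disproportionate share of the links, which is the “profoundly inegalitarian” structure Ferguson describes.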

But there is a fatal weakness inherent in this futuristic scenario: complex networks create interdependent relationships that can lead to catastrophic cascades, such as the global financial crisis of 2008 or the explosion of “fake news” and misinformation spewed out by global gossip networks.

We are also seeing a gradual deconstruction of networks that compete with the power of nation-state sovereignty. This is reflected in the rise of nationalistic politics in democracies and authoritarian monopoly control over information in autocracies.

However, from the angle of hierarchical control, Ferguson notes that the administrative state, with its failures of democratic governance, “represents the last iteration of political hierarchy: a system that spews out rules, generates complexity, and undermines both prosperity and stability.”

These historical paths imply that the conflict between distributed networks and concentrated hierarchies is likely a natural tension in search of an uneasy equilibrium.

Ferguson notes, “if Facebook initially satisfied the human need to gossip, it was Twitter – founded in March 2006 – that satisfied the more specific need to exchange news, often (though not always) political.” But when I read Twitter feeds, it strikes me that Twitter may be more a tool for disruption than for constructive dialogue. In other words, we can use these networking technologies to tear things down, but not so much to build them back up again.

As a Twitter co-founder confesses:

‘I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place,’ said Evan Williams, a co-founder of Twitter, in May 2017. ‘I was wrong about that.’

Rather, as Ferguson asserts, “The lesson of history is that trusting in networks to run the world is a recipe for anarchy: at best, power ends up in the hands of the Illuminati, but more likely it ends up in the hands of the Jacobins.”

Ferguson is quite pessimistic about today’s dominance of networks, with one slim ray of hope. As he writes,

“…how can an urbanized, technologically advanced society avoid disaster when its social consequences are profoundly inegalitarian?

“To put the question more simply: can a networked world have order? As we have seen, some say that it can. In the light of historical experience, I very much doubt it.”

That slim ray of hope? Blockchain technology!

A thought-provoking book.


How the Enlightenment Ends

Surviving the Digital Economy

 

This will be a crucial issue for a free society going forward: giving away your personal data is like giving away your labor.

Want Our Personal Data? Pay for It

The posting, tagging and uploading that we do online may be fun, but it’s labor too, and we should be compensated for it

By Eric A. Posner and Glen Weyl

WSJ, April 20, 2018 11:19 a.m. ET

Congress has stepped up talk of new privacy regulations in the wake of the scandal involving Cambridge Analytica, which improperly gained access to the data of as many as 87 million Facebook users. Even Facebook chief executive Mark Zuckerberg testified that he thought new federal rules were “inevitable.” But to understand what regulation is appropriate, we need to understand the source of the problem: the absence of a real market in data, with true property rights for data creators. Once that market is in place, implementing privacy protections will be easy.

We often think of ourselves as consumers of Facebook, Google, Instagram and other internet services. In reality, we are also their suppliers—or more accurately, their workers. When we post and label photos on Facebook or Instagram, use Google maps while driving, chat in multiple languages on Skype or upload videos to YouTube, we are generating data about human behavior that the companies then feed into machine-learning programs.

These programs use our personal data to learn patterns that allow them to imitate human behavior and understanding. With that information, computers can recognize images, translate languages, help viewers choose among shows and offer the speediest route to the mall. Companies such as Facebook, Google, and Microsoft (where one of us works) sell these tools to other companies. They also use our data to match advertisers with consumers.

Defenders of the current system often say that we don’t give away our personal data for free. Rather, we’re paid in the form of the services that we receive. But this exchange is bad for users, bad for society and probably not ideal even for the tech companies. In a real market, consumers would have far more power over the exchange: Here’s my data. What are you willing to pay for it?

An internet user today probably would earn only a few hundred dollars a year if companies paid for data. But that amount could grow substantially in the coming years. If the economic reach of AI systems continues to expand—into drafting legal contracts, diagnosing diseases, performing surgery, making investments, driving trucks, managing businesses—they will need vast amounts of data to function.

And if these systems displace human jobs, people will have plenty of time to supply that data. Tech executives fearful that AI will cause mass unemployment have advocated a universal basic income funded by increased taxes. But the pressure for such policies would abate if users were simply compensated for their data.

The data currently compiled by Facebook and other companies is of pretty low quality. That’s why Facebook has an additional army of paid workers who are given dedicated tasks, such as labeling photos, to fill in the gaps left by users. If Facebook paid users for their work, it could offer pay tied to the value of the user’s contribution—offering more, for example, for useful translations of the latest Chinese slang into English than for yet another video labeled “cat.”

So why doesn’t Facebook already offer wages to users? For one, obviously, it would cost a lot to pay users for the data that the company currently gets for free. And then Google and others might start paying as well. Competition for users would improve the quality of data but eat away at the tech companies’ bottom line.

It’s also true that users simply aren’t thinking this way. But that can change. The basic idea is straightforward enough: When we supply our personal data to Facebook, Google or other companies, it is a form of labor, and we should be compensated for it. It may be enjoyable work, but it’s work just the same.

If companies reject this model of “data as labor,” market pressure could be used to persuade them. Rather than sign up directly with, say, Facebook, people would sign up with a data agent. (Such services, sometimes referred to as personal data exchanges or vaults, are already in development, with more than a dozen startups vying to fill this role.) The data agent would then offer Facebook access to its members and negotiate wages and terms of use on their behalf. Users would get to Facebook through the agent’s platform. If at any time Facebook refused reasonable wages, the data agent could coordinate a strike or a boycott. Unlike individual users, the data agent could employ lawyers to review terms and conditions and ensure that those terms are being upheld.

With multiple data agents competing for users’ business, no one could become an abusive monopolist. The agent’s sole purpose would be managing workers’ data in their interests—and if there were a problem, users could move their data to another service without having to give up on their social network.

Companies such as Apple and Amazon also could get into the act. Currently, their business models are very different from those of Facebook and Google. For the most part, their focus is on selling products and services, rather than offering them without charge. If Facebook and Google refuse to pay users for their data, these other companies are big and sophisticated enough to pay for data instead.

Would the “data as labor” model put the tech giants out of business? Hardly. Their vast profits already reflect their monopoly power. Their margins would certainly be tighter under this new regime, but the wider economy would likely grow through greater productivity and a fairer distribution of income. The big companies would take a smaller share of a larger pie, but their business model would be far more sustainable, politically and socially. More important, they would have to focus on the value that their core services bring to consumers, rather than on exploiting their monopoly in user data.

As for Congress, it could help by making it simpler for individuals to have clear property rights in their own data, rights that can’t be permanently signed away by accepting a company’s confusing terms and conditions. The European Union has already taken steps in this direction, and its new regulations—which require data to be easily portable—are a leading stimulus for the rise of data agent startups. Government can also help by updating labor law to be more consistent with modern data work while protecting data workers from exploitation.

Most of us already take great satisfaction in using social media to connect with our friends and family. Imagine how much happier and prouder we would be if we received fair pay for the valuable work we perform in doing that.

Prof. Posner teaches at the University of Chicago Law School. Dr. Weyl teaches at Yale University and is a principal researcher at Microsoft (whose views he in no way represents here). Their new book is “Radical Markets: Uprooting Capitalism and Democracy for a Just Society,” which will be published on May 8 by Princeton University Press.

https://www.wsj.com/articles/want-our-personal-data-pay-for-it-1524237577 (Paywall)

The Death of Text?

 

The following short essay was published in the NY Times feature called The Fate of the Internet. Frankly, it’s difficult to take these arguments too seriously, despite the transformative effects of technology.

Welcome to the Post-Text Future

by Farhad Manjoo, NY Times

I’ll make this short: The thing you’re doing now, reading prose on a screen, is going out of fashion. [Which means what? Its popularity is fading as a communication channel?]

We’re taking stock of the internet right now, with writers [Hmm, what’s a writer without a reader?] who cover the digital world cataloging some of the most consequential currents shaping it. If you probe those currents and look ahead to the coming year online, one truth becomes clear. The defining narrative of our online moment concerns the decline of text, and the exploding reach and power of audio and video. [Yes, but where does real “power” really reside? In cat videos and selfies? Those behind the curtain are really smiling.]

This multimedia internet has been gaining on the text-based internet for years. But last year, the story accelerated sharply, and now audio and video are unstoppable. The most influential communicators online once worked on web pages and blogs. They’re now making podcasts, Netflix shows, propaganda memes, Instagram and YouTube channels, and apps like HQ Trivia.

Consider the most compelling digital innovations now emerging: the talking assistants that were the hit of the holidays, Apple’s face-reading phone, artificial intelligence to search photos or translate spoken language, and augmented reality — which inserts any digital image into a live view of your surroundings.

These advances are all about cameras, microphones, your voice, your ears and your eyes.

Together, they’re all sending us the same message: Welcome to the post-text future. [No, they are welcoming us to the distractions of circuses. That’s what entertainment is.]

It’s not that text is going away altogether. Nothing online ever really dies, and text still has its hits — from Susan Fowler’s whistle-blowing blog post last year about harassment at Uber to #MeToo, text was at the center of the most significant recent American social movement.

Still, we have only just begun to glimpse the deeper, more kinetic possibilities of an online culture in which text recedes to the background, and sounds and images become the universal language.

The internet was born in text because text was once the only format computers understood. Then we started giving machines eyes and ears — that is, smartphones were invented — and now we’ve provided them brains to decipher and manipulate multimedia. [Yes, but civilization was not born with ASCII. Computers are becoming clever TVs, but they still deliver a lot of trivia as content, and video formats probably amplify that. Perhaps we are seeing the trivialization of popular culture? Has it ever not been trivial?]

My reading is that this trend toward video as a substitute for text applies only to certain types of media and content. Some commentators have adapted readily to YouTube channels to transmit knowledge and ideas, and that educational potential is only beginning to be tapped. But true power in the world of ideas belongs to those who know how to manipulate text to grasp the abstract intellectual concepts that govern our world.

The question is, is technology turning us into sheep or shepherds? Because for sure, there are wolves out there.

As John Maynard Keynes wrote,

The ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed, the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back…


Order vs. Chaos: How We Choose

(The Towers of San Gimignano)

Below is a thought-provoking essay by historian Niall Ferguson examining the fluid structure of societies that swing from hierarchies to decentralized networks.

Anyway, this is a subject dear to my heart, as it is the overriding theme of several of my fiction books. See interjections below…

In Praise of Hierarchy – The Wall Street Journal
https://apple.news/A3UEyEvI-SnuHNdt8fLLjzg (paywall)

The Saturday Essay
Established, traditional order is under assault from freewheeling, networked disrupters as never before. But society craves centralized leadership, too.

It is a truth universally acknowledged that we now live in a networked world, where everyone and everything are connected. The corollary is that traditional hierarchical structures—not only states, but also churches, parties, and corporations—are in various states of crisis and decline. Disruption, disintermediation, and decentralization are the orders of the day. Hierarchy is at a discount, if not despised.

Networks rule not only in the realm of business. In politics, too, party establishments and their machines have been displaced by crowdfunded campaigns and viral messaging. Money, once a monopoly of the state, is being challenged by Bitcoin and other cryptocurrencies, which require no central banks to manage them, only consensus algorithms.

But is all this wise? In all the excitement of the age of hyper-connection, have we perhaps forgotten why hierarchies came into existence in the first place? Do we perhaps overestimate what can be achieved by ungoverned networks—and underestimate the perils of a world without any legitimate hierarchical structure?

True, few dare shed tears for yesterday’s hierarchies. Some Anglophile viewers of “The Crown” may thrill at the quaint stratification of Elizabeth II’s England, but the nearest approximations to royalty in America have lately been shorn of their gilt and glamour. Political dynasties of the recent past have been effaced, if not humiliated, by the upstart Donald Trump, while Hollywood’s elite of exploitative men is in disarray. The spirit of the age is revolutionary; the networked crowd yearns to “smack down” or “shame” each and every authority figure.

Nevertheless, recent events have called into question the notion that all will be for the best in the most networked of all possible worlds. “I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place,” Evan Williams, a co-founder of Twitter, told the New York Times last May. “I was wrong about that.”

Far from being a utopia in which we all become equally empowered “netizens,” free to tweet truth to power, cyberspace has mutated into a nightmare realm of ideological polarization, extreme views and fake news. The year 2016 was the annus horribilis of the liberal internet, the year when the network platforms built in Silicon Valley were used not only by Donald Trump’s election campaign but also by the proponents of “Brexit” in the United Kingdom to ends that appalled their creators. In 2017, research (including some by Facebook itself) revealed the psychological harm inflicted by social media on young people, who become addicted to the network platforms’ incessant, targeted stimuli.

Most alarming was the morphing of cyberspace into Cyberia, not to mention the Cyber-caliphate: a dark and lawless realm where malevolent actors ranging from Russian trolls to pro-ISIS Twitter users could work with impunity to subvert the institutional foundations of democracy. As Henry Kissinger has rightly observed, the internet has re-created the human state of nature depicted by 17th-century English philosopher Thomas Hobbes, where there rages a war “of every man against every man” and life (like so many political tweets) is “nasty, brutish, and short.”

We should not be surprised. Neither history nor science predicted that everything would be awesome in a world of giant, online networks—quite the contrary. And now that it becomes clear that a networked world may be an anarchic world, we begin to see—as previous generations saw—the benefits of hierarchy.

The word hierarchy derives from ancient Greek (hierarchia, literally the “rule of a high priest”) and was first used to describe the heavenly orders of angels and, more generally, to characterize a stratified order of spiritual or temporal governance. Up until the 16th century, by contrast, the word “network” signified nothing more than a woven mesh made of interlaced thread.

For most of history, hierarchies dominated social networks, a relationship exemplified by the looming Gothic tower that overshadows the Tuscan town of Siena’s central piazza.


Siena’s torre

This is roughly how most people think about hierarchies: as vertically structured organizations characterized by centralized and top-down command, control and communication. Historically, they began with family-based clans and tribes, out of which more complicated and stratified institutions evolved: states, churches, corporations, empires.

The crucial incentive that favored hierarchical order was that it made the exercise of power more efficient. Centralizing control in the hands of the “big man” eliminated or at least reduced time-consuming arguments about what to do, which might at any time escalate into internecine conflict. The obvious defect of hierarchy—in the 19th-century words of Lord Acton, “power tends to corrupt, and absolute power corrupts absolutely”—was not by itself sufficient to turn humanity away from the rule of “big men.”

There have been only two eras of enhanced connectedness, when new technology helped social networks gain the upper hand. The second is our own age. The first began almost exactly half a millennium ago, in 1517, and lasted for the better part of three centuries.


The epic story of chaos vs. order during the Savonarola-Machiavelli era, foreshadowing Martin Luther.

When the printing press empowered Martin Luther’s heresy, a network was born. Luther’s dream was of a “priesthood of all believers.” The actual result of the Reformation he inspired was not harmony, but 130 years of polarization and conflict. But it proved impossible to kill Protestant networks, even with mass executions. Hierarchy had to be restored in the form of the princely states whose power the Peace of Westphalia affirmed, but this restoration was fleeting.

Like the Reformation, the 18th-century Enlightenment was a network-driven phenomenon that challenged established authority. The amazing thing was how much further the tendrils of the Enlightenment extended: as far afield as Voltaire’s global network of correspondents, and into the depths of Bavaria, where the secret network known as the Illuminati was founded in 1776.

In Britain’s American colonies, Freemasonry was a key network that connected many of the Founding Fathers, including George Washington and the crucial “node” in the New England revolutionary network, Paul Revere.


Freemasons in today’s Washington, D.C.?

At the same time, the American revolutionaries—Franklin, Jefferson, Lafayette—had all kinds of connections to France, land of the philosophes. The problem in France was that the ideas that went viral were not just “liberty, equality and fraternity,” but also the principle that terror was justifiable against enemies of the people. The result was a descent into bloody anarchy.

 

Those who lived through the wars of the 1790s and early 1800s learned an important lesson that we would do well to relearn: unless one wishes to reap one revolutionary whirlwind after another, it is better to impose some kind of hierarchical order on the world and to give it some legitimacy. At the Congress of Vienna, the five great powers who defeated Napoleon agreed to establish such an order, and the “pentarchy” they formed provided a remarkable stability over the century that followed.

Just over 200 years later, we confront a similar dilemma. Those who favor a revolutionary world run by networks will end up not with the interconnected utopia of their dreams but with Hobbes’s state of nature, in which malign actors exploit opportunities to spread virus-like memes and mendacities. Worse, they may end up entrenching a new but unaccountable hierarchy. For here is a truth that is too often glossed over by the proponents of networked governance: Many networks are hierarchically structured.

Nothing illustrates this better than the way the internet has evolved from being an authentically distributed, decentralized network into one dominated by a few giant technology companies: Facebook, Amazon, Netflix and Alphabet’s Google—the so-called FANGs. This new hierarchy is motivated primarily by the desire to sell—above all, to sell the data that their users provide. Dominance of online advertising by Alphabet and Facebook, coupled with immunity from civil liability under legislation dating back to the 1990s, has created an extraordinary state of affairs. The biggest content publishers in history are regulated as if they are mere technology startups; they are a new hierarchy extracting rent from the network.

The effects are pernicious. According to the Pew Research Center, close to half of Americans now get their news from Facebook, whose incentive is to promote news that holds the attention of users, regardless of whether it is true or false, researched by professional journalists or cooked up by Russian trolls. Established publishers—and parties—were too powerful for too long, but is it really a better world if there are no authorities to separate real news from fake, or decent political candidates from rogues? The old public sphere had its defects, but the new one has no effective gatekeepers, so the advantage now lies not with leaders but with misleaders.

The alternative is that another pentarchy of great powers recognizes their common interest in resisting the threat posed by Cyberia, where jihadism and criminality flourish alongside cyberwarfare, to say nothing of nuclear proliferation. Conveniently, the architects of the post-1945 order created the institutional basis for such a new pentarchy in the form of the permanent members of the United Nations Security Council, an institution that retains the all-important ingredient of legitimacy, despite its gridlocked condition throughout the Cold War.

It is easy to be dismissive of the UNSC. Nevertheless, whether or not these five great powers can make common cause once again, as their predecessors did in the 19th century, is a great geopolitical question of our time. The hierarchical Chinese leader Xi Jinping likes to talk about a “new model of great power relations,” and it may be that the North Korean missile crisis will bring forth this new model. But the crucial point is that the North Korean threat cannot be removed by the action of networks. A Facebook group can no more solve it than a tweet storm or a hashtag.

Our age may venerate online networks, to the extent of making a company such as Facebook one of the most valuable in the world. Yet there is a reason why armies have commanding officers. There is a reason why orchestras have conductors. There is a reason why, at great universities, the lecturers are not howled down by social justice warriors. And there is a reason why the last great experiment in networked organization—the one that began with the Reformation—ended, eventually, with a restoration of hierarchy.

There is hope for hierarchies yet. “The Crown” is not mere fiction; the hierarchy of the monarchy has continued to elevate the head of the British state above party politics. In a similar way, the papacy remains an object of authority and veneration, despite the tribulations of the Roman Catholic Church. Revolutions repeatedly sweep the countries of the Middle East, yet the monarchies of the region have been the most stable regimes.

Even in the U.S., ground zero for disruptive networks, there still is respect for hierarchical institutions. True, just 32% of Americans still have “a great deal” or “quite a lot” of confidence in the presidency and 12% feel that way about Congress. But for the military the equivalent percentage is 72% (up from 50% in 1981), for the police it is 57%, for churches 41%, and for the Supreme Court 40%. By comparison, just 16% of Americans have confidence in news on the internet.

We humans have been designed by evolution to network—man is a social animal, of course—but history has taught us to revere hierarchy as preferable to anarchy, and to prefer time-honored hierarchs to upstart usurpers.

Mr. Ferguson’s new book, “The Square and the Tower: Networks and Power, from the Freemasons to Facebook,” will be published by Penguin Press on Jan. 16.

 

Finite and Infinite Games: the Internet and Politics

In 1986, James Carse, a religious scholar and historian, wrote a philosophical text titled Finite and Infinite Games. As he explained, there are two kinds of games. One could be called finite, the other infinite. A finite game is played for the purpose of winning, an infinite game for the purpose of continuing the play.

This simple distinction invites some profound thought. War is a finite game, as is the Super Bowl. Peace is an infinite game, as is the game of love. Finite games end with winners and losers, while infinite games seek perpetual play. Politics is a finite game; democracy, liberty, and justice are infinite games.

Life itself, then, could be considered a finite or infinite game depending on which perspective one takes. If ‘he who dies with the most toys wins,’ one is living in a finite game that ends with death. If one chooses to create an entity that lives beyond the grave, a legacy that perpetuates through time, then one is playing an infinite game.

One can imagine that we often play a number of finite games within an infinite game. This supports the idea of waging war in order to attain peace (though I wouldn’t go so far as saying it validates destroying the village in order to save it). The taxonomy also relates to the time horizon of one’s perspective in engaging in the game. In other words, are we playing for the short-term gain or the long-term payoff?

I find Carse’s arguments compelling when I relate them to the new digital economy and how the digital world is transforming how we play certain games, especially those of social interaction and the monetization of value. That sounds a bit hard to follow, but what I’m referring to is the value of the information network (the Internet) as an infinite game.

I would value the internet according to its power to help people connect and share ideas. (I recently wrote a short book on this power called The Ultimate Killer App: The Power to Create and Connect.) The more an idea is shared, the more powerful and valuable it can be. In this sense, the internet is far more valuable than the sum of its various parts, and for it to end as the victim of a finite game would be a tragedy for all. So, I see playing on the information network as an infinite game.

The paradox is that most of the big players on the internet – the Googles, Facebooks, and Amazons – are playing finite games on and with the network. In fact, they are using the natural monopoly of network dynamics to win finite games for themselves, reaping enormous value in the process. But while they are winning, many others are losing. Yes, we do gain in certain ways, but the redistribution of data power is leading to a redistribution of monetary gains and losses across the population of users. In many cases those gains and losses are redistributed quite arbitrarily.

For instance, let us take the disruption of the music industry, or the travel industry, or the publishing industry. One need not lament the fate of obsolete business models to recognize that for play to continue, players must have the possibility of adapting to change in order to keep the infinite game on course. Most musicians and authors believe their professions are DOA. What does that say for the future of culture?

Unfortunately, this disruption across the global economy wrought by digitization is being reflected in the chaotic politics of our times, mostly across previously stable developed democracies.

These economic and political developments don’t seem particularly farsighted and one can only speculate how the game plays out. But to relate it to current events, many of us are playing electoral politics in a finite game that has profound implications for the more important infinite game we should be playing.

 

Unforgettable Economics Lessons in Tombstone

The lessons of history are there for us to learn…

Economics One

Last night Yang Jisheng was awarded the 2012 Hayek Prize for his book Tombstone about the Chinese famine of 1958-1962.  It’s an amazing book. It starts with Yang Jisheng returning home as a teenager to find a ghost town, trees stripped of bark, roots pulled up, ponds drained, and his father dying of starvation. He thought at the time that his father’s death was an isolated incident, only later learning that tens of millions died of starvation and that government policy was the cause.

Then you read about the Xinyang Incident: people tortured for simply suggesting that the crop yields were lower than the exaggerated projections. Those projections led the government to take the grain from the farmers who grew it and let many starve; and there are the horrific stories of cannibalism.

You also find out what life was like as a member of a communal kitchen. With free meals people…


Obamacare and “Information”

Good quote applying Hayek and market theory to the illusion of centralized health care:

Perhaps ObamaCare will be remembered as the breaking point for top-down planning. There is not enough information available for the government to micromanage a system as complex as health care, which represents more than 15% of the economy. Austrian economist Friedrich Hayek wrote some 50 years ago about the “pretence of knowledge,” meaning the conceit that planners could know enough about complex markets to dictate how they operate. He warned against “the belief that we possess the knowledge and the power which enable us to shape the processes of society entirely to our liking, knowledge which in fact we do not possess.”

True enough, ObamaCare was built on an unworkable foundation. The original sin in health care goes back to the wage and price controls in effect during World War II. The federal government let employers avoid wage controls by adding health insurance as an untaxed benefit for employees. Employer-provided insurance has since insulated most Americans from the cost of care. The predictable result is endless demand for increasingly inefficient services.

When was the last time you saw prices posted in a doctor’s office or hospital? Yet price is the key means through which information is transmitted, at least in functioning markets. There are many ways to make sure that the poor and seriously ill get medical care, including direct subsidies that don’t undermine the price mechanism. But the complexity of accomplishing this goal in a hyperregulated health-care industry overwhelmed the system.

If the justices do send ObamaCare back to be rethought, politicians should address the problem with more humility. We’ll know health care is on the road to recovery when basic information such as clear rules and transparent prices is again part of the system.

Full article here.
