Highlights from David Bollier: Think Like a Commoner: A Short Introduction to the Life of the Commons

Serious wealth can also be a community asset and the rich set of social relationships that make community possible. The Linux story is a stunning proof that the commons can be highly generative and contemporary as well as being entirely practical and effective. 283

Commoners are intent on addressing difficult practical questions such as, What’s the best way to irrigate these forty acres when water is scarce? and What’s a fair way to allocate access to a dwindling fishery in this coastal bay? Commoners are also not afraid to tackle the problem of shirkers, vandals and free riders: individuals who want benefits without corresponding responsibilities. The point is that the commons is a practical paradigm for self-governance, resource management and “living well.” Commoners can often negotiate satisfactory resolutions to meet their common purposes without getting markets or government bureaucracies involved. They struggle to figure out the best structures for managing a collective resource, the procedures for making rules and operational norms that work. They understand the need to establish effective practices to prevent over-exploitation of their forest or lake or farmland. They can negotiate fair allocations of duties and entitlements. They like to ritualize and internalize their collective habits and stewardship ethic, which over time ripen into a beautiful culture. 288

Commons certainly include physical and intangible resources of all sorts, but they are more accurately defined as paradigms that combine a distinct community with a set of social practices, values and norms that are used to manage a resource. Put another way, a commons is a resource + a community + a set of social protocols. The three are an integrated, interdependent whole. 304

the lesson from the Wolfpak and the parking commons is that local commons can provide types of management and order that government bureaucracies and formal law cannot. Boston snowplows may not reliably clear the streets of snow, and the city government’s enforcement of parking rules may be unreliable or expensive. Hawaiian authorities may not wish to hire a police officer or lifeguard to patrol Banzai Pipeline Beach (leaving a void of governance?), or such tasks may be seen as too impractical or “small” for a large bureaucracy to address. But the commoners? They often have their own deep stores of knowledge, imagination, resourcefulness and commitment. Their informal governance may in fact outperform official forms of government. In fact, as explicit negotiations among commoners become so engrained that they settle into habit, custom becomes a kind of invisible “vernacular law.” Vernacular law originates in the informal social zones of society — coffeehouses, schools, the beach, “the street” — and becomes a source of effective order and moral legitimacy in its own right. Social norms such as queuing up in a line (and punishing those who cut in line) and meal etiquette (never take the last helping) are a kind of passive commoning that most of us have internalized as “the way things are done.” They constitute an implicit mode of commoning for managing access to limited resources. 340

EACH OF THE COMMONS described above arose spontaneously, without the direction or oversight of centralized institutions or government. Each is committed to a larger collective purpose while also providing personal benefits for individuals. None is driven by a quest for money or personal fortune, at least not directly. In most commons, in fact, the market is a rather peripheral presence. Yet even without the direct involvement of markets or the state, serious production and governance occur. The beauty of the commons as a “rediscovered” paradigm is both its generality and its particularity. It embodies certain broad principles — such as democratic participation, transparency, fairness and access for personal use — but it also manifests itself in highly idiosyncratic ways. For these reasons, I like to compare the commons to DNA. Scientists will tell you that DNA is ingeniously under-specified precisely so that the code of life can adapt to local circumstances. DNA is not fixed and deterministic. It is partial and adaptable. It grows and changes. A commons is like a living organism in that it co-evolves with its environment and context. It adapts to local contingencies. A forest commons in Vermont is likely to be quite different from one in Nepal or Germany, because the local ecosystems, tree types, economies, cultural histories and much else vary. And yet commons in each of these places are nonetheless commons: stable regimes for managing shared resources in fair ways for the benefit of participating commoners. The “diversity within unity” principle that commons embody is what makes the commons paradigm so versatile and powerful — and so confusing to conventional economists and policymakers. What’s critical in creating any commons, as mentioned earlier, is that a community decides that it wants to engage in the social practices of managing a resource for everyone’s benefit. This is sometimes known as commoning. 
The great historian of the commons Peter Linebaugh has noted that “there is no commons without commoning.” It’s an important point to remember because it underscores that the commons is not only about shared resources; it’s mostly about the social practices and values that we devise to manage them. Commoning acts as a kind of moral, social and political gyroscope. It provides stability and focus. When people come together, share the same experiences and practices and accumulate a body of practical knowledge and traditions, a set of productive social circuits emerges. 351

escorted through a ritual shudder, the professor whisks them along to the main attraction, the virtues of private property and free markets. Here, finally, economists reveal, we may surmount the dismal tragedy of a commons. The catechism is hammered home: individual freedom to own and trade private property in open markets is the only way to produce enduring personal satisfaction and social prosperity. Hardin explains the logic this way: we can overcome the tragedy of the commons through a system of “mutual coercion, mutually agreed upon by the majority of the people affected.” For him, the best approach is “the institution of private property coupled with legal inheritance.” He concedes that this is not a perfectly just alternative, but he asserts that Darwinian natural selection is ultimately the best available option, saying, “those who are biologically more fit to be the custodians of property and power should legally inherit more.” We put up with this imperfect legal order, he adds, “because we are not convinced, at the moment, that anyone has invented a better system. The alternative of the commons is too horrifying to contemplate. Injustice is preferable to total ruin.” 393

There is just one significant flaw in the tragedy parable. It does not accurately describe a commons. Hardin’s fictional scenario sets forth a system that has no boundaries around the pasture, no rules for managing it, no punishments for over-use and no distinct community of users. But that is not a commons. It is an open-access regime, or a free-for-all. A commons has boundaries, rules, social norms and sanctions against free riders. A commons requires that there be a community willing to act as a conscientious steward of a resource. Hardin was confusing a commons with “no-man’s-land” — and in the process, he smeared the commons as a failed paradigm for managing resources. To be fair, Hardin was following a long line of polemicists who projected their unexamined commitments to market individualism onto the world. As we will see later, the theories of philosopher John Locke have been widely used to justify treating the New World as terra nullius — open, unowned land — even though it was populated by millions of Native Americans who managed their natural resources as beloved commons with unwritten but highly sophisticated rules. 413

are. Commons scholar Lewis Hyde dryly notes, “Just as Hardin proposes a herdsman whose reason is unable to encompass the common good, so Lloyd supposes persons who have no way to speak with each other or make joint decisions. Both writers inject laissez-faire individualism into an old agrarian village and then gravely announce that the commons is dead.” 427

basis for a large literature of “prisoner’s dilemma” experiments that purport to show how “rational individuals” behave when confronted with “social dilemmas,” such as how to allocate a limited resource. Should the “prisoner” cooperate with other potential claimants and share the limited rewards? Or should they defect by grabbing as much for themselves as possible? Needless to say, the complications are endless. But the basic premise of such social science experiments is rigged at the outset. Certain assumptions about the selfishness and rational calculation of individuals, and about their lack of context (test subjects have no shared social history or culture), are embedded in the very design of the “game.” Test subjects are not allowed to communicate with each other or develop bonds of trust and shared knowledge. They are given only limited time and opportunity to learn to cooperate. They are isolated in a lab setting for a single experiment, and have no shared history or future together. Aghast at the pretzel logic of economic researchers, Lewis Hyde puckishly suggested that the “tragedy” thesis be called, instead, “The Tragedy of Unmanaged, Laissez-Faire, Common-Pool Resources with Easy Access for Noncommunicating, Self-Interested Individuals.” The dirty little secret of many prisoner’s dilemma experiments is that they subtly presuppose a market culture of “rational” individuals. Most give little consideration to the real-life ways in which people come to cooperate and share in managing resources. That is changing now that more game theory experiments are incorporating the ideas of behavioral economics, complexity theory and evolutionary sciences into their design. 432
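The design choices the passage criticizes can be made concrete. Below is a minimal sketch, not from the book, of a one-shot prisoner’s dilemma using the standard illustrative payoff values (temptation 5, reward 3, punishment 1, sucker’s payoff 0), plus a repeated version in which players can see each other’s history. The payoff numbers and the strategies are hypothetical illustrations, not anything Bollier or the researchers he cites specify.

```python
# A sketch of the prisoner's dilemma payoff structure discussed above.
# "C" = cooperate, "D" = defect. Payoff values are the standard
# textbook illustration, chosen here for concreteness.

PAYOFFS = {  # (my_move, their_move) -> my payoff
    ("C", "C"): 3,  # mutual cooperation: reward
    ("C", "D"): 0,  # I cooperate, they defect: sucker's payoff
    ("D", "C"): 5,  # I defect, they cooperate: temptation
    ("D", "D"): 1,  # mutual defection: punishment
}

def best_response(their_move):
    """With no communication and no shared history, pick the move that
    maximizes my own payoff against a fixed move by the other player."""
    return max("CD", key=lambda mine: PAYOFFS[(mine, their_move)])

# In the one-shot game, defection dominates whatever the other player does:
assert best_response("C") == "D" and best_response("D") == "D"

def repeated_game(rounds, strategy_a, strategy_b):
    """Total payoffs when the game is repeated and each player can react
    to the other's past moves -- the kind of context the lab experiments
    described above typically strip away."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(history_b)  # each strategy sees the opponent's history
        b = strategy_b(history_a)
        score_a += PAYOFFS[(a, b)]
        score_b += PAYOFFS[(b, a)]
        history_a.append(a)
        history_b.append(b)
    return score_a, score_b

# Two illustrative strategies: cooperate first and then mirror the
# opponent's last move, versus unconditional defection.
tit_for_tat = lambda opp: "C" if not opp else opp[-1]
always_defect = lambda opp: "D"
```

Run over, say, ten rounds, a pair of tit-for-tat players sustains cooperation (3 points each per round) while a pair of unconditional defectors earns only 1 point each per round, which echoes the text’s point: once memory and the prospect of future interaction are restored, the “rational” case for defection weakens.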

Paradoxically enough, the heedless quest for selfish gain — “rationally” pursued, of course, yet indifferent toward the collective good — is a better description of the conventional market economy than a commons. In the run-up to the 2008 financial crisis, such a mindset propelled the wizards of Wall Street to maximize private gains without regard for the systemic risks or local impacts. The real tragedy precipitated by “rational” individualism is not the tragedy of the commons, but the tragedy of the market. Happily, contemporary scholarship has done much to rescue the commons from the memory hole to which it has been consigned by mainstream economics. The late American political scientist Elinor (“Lin”) Ostrom of Indiana University deserves special credit for her role in expanding the frame of analysis of economic activity. In the 1970s, the economics profession plunged into a kind of religious fundamentalism. It celebrated highly abstract, quantitative models of the economy based on rational individualism, private property rights and free markets. A child of the Depression, Ostrom had always been interested in cooperative institutions working outside of markets. As a young political scientist in the 1960s, she began to question some of the core assumptions of economics, especially the idea that people are unable to cooperate in stable, sustainable ways. Sometimes working with political scientist Vincent Ostrom, her husband, she initiated a new kind of cross-disciplinary study of institutional systems that manage “common-pool resources,” or CPRs. CPRs are collective resources over which no one has private property rights or exclusive control, such as fisheries, grazing lands and groundwater. All of these resources are highly vulnerable to over-exploitation because it is difficult to stop people from using them. 
We might call it the “tragedy of open access.” (Hardin himself later acknowledged that he should have entitled his essay “The Tragedy of an Unmanaged Commons” — an oxymoron, but never mind.) 451

What distinguished Ostrom’s scholarship from that of so many academic economists was her painstaking empirical fieldwork. She visited communal landholders in Ethiopia, rubber tappers in the Amazon and fishers in the Philippines. She investigated how they negotiated cooperative schemes, and how they blended their social systems with local ecosystems. As economist Nancy Folbre of the University of Massachusetts, Amherst, explained, “She would go and actually talk to Indonesian fishermen or Maine lobstermen, and ask, ‘How did you come to establish this limit on the fish catch? How did you deal with the fact that people might try to get around it?’” From such empirical findings, Ostrom tried to figure out what makes for a successful commons. How does a community overcome its collective-action problem? The recurring challenge facing a group of principals in an interdependent situation, she wrote, is figuring out how to “organize and govern themselves to obtain continuing joint benefits when all face temptations to free-ride, shirk, or otherwise act opportunistically. Parallel questions have to do with the combinations of variables that will (1) increase the initial likelihood of self-organization, (2) enhance the capabilities of individuals to continue self-organized efforts over time, or (3) exceed the capacity of self-organization to solve CPR [common-pool resource] problems without external assistance of some form.” Ostrom’s answer was Governing the Commons, a landmark 1990 book that set forth some of the basic “design principles” of effective, durable commons. These principles have been adapted and elaborated by later scholars, but her analysis remains the default framework for evaluating natural resource commons. The focus of Ostrom’s work, and of the legions of academics who now study commons, has been how communities of resource users develop social norms — and sometimes formal legal rules — that enable them to use finite resources sustainably over the long term. 
Standard economics, after all, declares that we are selfish individuals whose wants are unlimited. The idea that we can depend on people’s altruism and cooperation, economists object, is naive and unrealistic. The idea that commons can set and enforce limits on usage also seems improbable because it rejects the idea of humans having unbounded appetites. Ostrom nonetheless showed how, in hundreds of instances, commoners do in fact meet their needs and interests in collective, cooperative ways. The villagers of Törbel, Switzerland, have managed their high alpine forests, meadows and irrigation waters since 1224. Spaniards have shared irrigation waters through huerta social institutions for centuries while, more recently, diverse water authorities in Los Angeles learned how to coordinate their management of scarce groundwater supplies. Many commons have flourished for hundreds of years, even in periods of drought or crisis. Their success can be traced to a community’s ability to develop its own flexible, evolving rules for stewardship, oversight of access and usage, and effective punishments for rule-breakers. Ostrom found that commons must have clearly defined boundaries so that commoners can know who has authorized rights to use a resource. Outsiders who do not contribute to the commons obviously have no rights to access or use the common-pool resource. She discovered that the rules for appropriating a resource must take account of local conditions and must include limits on what can be taken and how. 466

What is fascinating is the parallel development, outside of academia, of an eclectic, transnational corps of activists and project leaders who have embraced the commons as an organizing principle for their campaigns for social change. This, arguably, is what is making the commons a significant force in politics, economics and culture today. New movements of people worldwide are beginning to see how the commons paradigm describes their lives and their relationships to other people and resources. Software programmers, urban gardeners, indigenous peoples, academic researchers, permaculturists, Indian textile makers, Istanbul residents defending Gezi Park, the users of public libraries and parks, Slow Food activists: the affinity of these groups for the commons is not necessarily intellectual or scientific; it’s personal and passionate. For many of these commoners, the commons is not a “management system” or “governance regime”; it’s a cultural identity, a personal livelihood and a way of life. It’s a way to revive democratic practice. It’s a way to live a more satisfied life. 556

I like to think of this as a vernacular movement more than a political movement or ideological perspective. The term “vernacular” was given a special meaning by iconoclastic social critic Ivan Illich in his 1981 book Shadow Work. As a critic of the dehumanizing tendencies of institutions, Illich saw vernacular spaces as informal cultural zones where people naturally come to their own moral judgments and act out of their own sovereign humanity. The vernacular flourishes in the realm of householding and subsistence, and of family life and child rearing. It lives in the shared spaces of a community in which people assert their collective moral values and political interests, over and above those of the state, the corporation and other institutional powers. As one of Illich’s students, Trent Schroyer, put it, the vernacular realm evokes a “sensibility and rootedness . . . in which local life has been conducted throughout most of history and even today in a significant proportion of subsistence- and communitarian-oriented communities.” The vernacular consists of “places and spaces where people are struggling to achieve regeneration and social restoration against the forces of economic globalization.” There is a certain timelessness and mystery associated with the vernacular, and as you have probably guessed, it has a lot to do with the commons. The commons is a fragile social institution and sensibility that naturally arises from vernacular culture, as if driven by a life force. It invariably tries to assert and maintain itself in the face of powerful institutions that have other priorities and interests. Sometimes commoners succeed in negotiating a rapprochement with those institutions, and carve out a protected zone for commoning. Urban gardens in New York City had to struggle to maintain themselves in the face of development pressures, for example. 
Coastal fishery commons must often struggle against large-scale industrial trawlers who swoop through their waters extracting fish for global markets rather than local consumption. Digital commoners must contend with copyright laws and corporate demagoguery that equate sharing with criminal activity (“piracy”). History has shown that the forces of market enclosure are cruel and relentless in deconstructing and destroying commons; they don’t like the competition. A successful commons is a “bad example” because it bears witness to better practical alternatives. Sharing is also objectionable because it is an affront to the ideology of private property rights (with the exception of tech companies like Google and Facebook, whose business model relies upon monetizing social sharing). For their part, governments and bureaucracies are often wary of the commons as an independent, potentially threatening power base, preferring the certainties and rewards of market-based allies. Governments generally prefer to manage resources through strict standardized systems of control. To them, commoning appears to be altogether too informal, irregular and unreliable — even if the actual successes of commons refute that prejudice. Any basic 575

This process is often called the enclosure of the commons. It’s a process by which corporations pluck valuable resources from their natural contexts, often with government support and sanction, and declare that they be valued through market prices. The point is to convert resources that are shared and used by many to ones that are privately owned and controlled, and treat them as tradeable commodities. To talk about enclosure is to open up a conversation that standard economics rarely entertains — the dispossession of commoners as market forces seize control of common resources, often with the active collusion of government. The familiar debate of “privatization versus government ownership” does not really do justice to this process because government ownership, the supposed antidote to privatization, is not really a solution. In many instances, the state is only too eager to conspire with industries to seize control of common resources for “private” (i.e., corporate) exploitation. Regulation is too often a charade that does more to legalize than eradicate market abuses. To talk about enclosure, then, is a way to point to the commons and reframe the discussion. The language of enclosure makes visible the antisocial, anti-environmental effects of “free markets” and validates commoning as an appropriate, often-effective alternative. A few years ago, I learned of a contemporary enclosure that eerily replicated the medieval pattern of land enclosure. For more than a century, the village of Camberwell, in the fertile Hunter Valley region of New South Wales, Australia, had used part of an open flood plain around Glennies Creek as a commons. It was a place for residents to keep their horses and dairy cows, and to let their children fish, swim and ride horses. 
In April 2005, according to the Sydney Morning Herald, “a pair of officers from the Department of Lands arrived, called together members of the [Camberwell] Common Trust, and told them the Crown land would be immediately resumed and turned over to the Ashton mine that looms over the Upper Hunter village in the form of a hollowed-out hill on the other side of the creek.” The action was just another instance of government using its authority to seize common lands for corporate purposes. The secretary of the Camberwell Common Trust told a reporter, “When we go to community meetings with the mines they are always talking about what they will do ‘when’ they get approval. They never say ‘if’ they get approval.” Both mining companies and government make out well from enclosures. The mining companies get cheap access to minerals and lax environmental oversight. The Australian government earned about $1.5 billion in royalties and fees at the time of the Camberwell enclosure. Commoners are generally not so lucky. In Camberwell, blasts from the mining hollowed out the hills around the village. Parts of the commons cracked, according to the Morning Herald. Nearly two-thirds of the village population gave up fighting the mining companies and moved elsewhere. The Camberwell experience is a classic example of state-assisted market enclosure. In the US, the government allows mining interests to extract mineral wealth on public lands under the Mining Act of 1872. Unchanged for more than 140 years, this law lets mining companies extract gold, silver and iron ore for five dollars an acre, period. It’s been estimated that Americans have lost more than $245 billion worth of revenues over the years from this law — while ruining beautiful mountains and rivers with mine tailings and other wastes. 
Similar stories from around the world can be told about timber companies raping public forests, oil companies drilling in pristine wilderness areas, industrial trawlers decimating coastal fisheries and transnational water bottlers sucking groundwater dry. In Latin America, transnational corporations are working with neoliberal governments to impose aggressive “neo-extractivist” policies. As Argentinian professor Maristella Svampa explains, the idea is to build… 606

A Brief History of the English Enclosure Movement

The term “enclosure” is generally associated with the English enclosure movement, which occurred at various times in medieval history and through the nineteenth century. To put it plainly, the king, aristocracy and/or landed gentry stole the pastures, forests, wild game and water used by commoners, and declared them private property. Sometimes the enclosers seized lands with the formal sanction of Parliament, and sometimes they just took them by force. To keep commoners out, it was customary to evict them from the land and erect fences or hedges. Sheriffs and gangs of thugs made sure that no commoner would poach game from the king’s land. Enclosure was irresistible to the 1 percent of medieval England because it was an easy way to grab more wealth and power with the full sanction of the law. It could help struggling barons and upwardly mobile gentry consolidate their political power and increase their holdings of land, water and game. An anonymous protest poem from the eighteenth century put it well:

The law locks up the man or woman
Who steals the goose from off the common
But leaves the greater villain loose
Who steals the common from off the goose.

The law demands that we atone
When we take things we do not own
But leaves the lords and ladies fine
Who take things that are yours and mine.

The poor and wretched don’t escape
If they conspire the law to break;
This must be so but they endure
Those who conspire to make the law.

The law locks up the man or woman
Who steals the goose from off the common
And geese will still a common lack
Till they go and steal it back.

As enclosures swept the villages of England, commoners suffered serious hardships. They depended upon the forest for their firewood and roof thatches, and on acorns to feed their pigs. They relied on shared fields to grow vegetables, and on open meadows for wild fruits and berries. An entire rural economy was based upon access to the commons. 
Barred from using their commons, villagers migrated to cities, where the emerging industrial revolution turned them into wage slaves, if they were lucky, and beggars and paupers if they weren’t. Charles Dickens drew upon the social disruptions and injustices of enclosures in writing Oliver Twist, Great Expectations and his other novels about London’s troubled underclass. One important goal of the English enclosures was to transform commoners with collective interests into individual consumers and employees. Which is to say: creatures of the marketplace. The satanic mills of the Industrial Revolution needed obedient and desperate wage slaves. One of the lesser-noticed aspects of enclosures was the separation of production and governance. In a commons, both were part of the same process, and all commoners could participate in both. After enclosures, markets took charge of production and the state took charge of governance. The modern liberal state was born. And while the new order brought about vast improvements in material production, those gains came at a terrible cost: dissolution of communities, deep economic inequality, an erosion of self-governance and a loss of social solidarity and identity. Governance became a matter of government, the province of professional politicians, lawyers, bureaucrats and monied special interest lobbies. Democratic participation became mostly a matter of voting, a right limited to men (and at first, property owners). Enclosure also isolated people from direct encounters with the natural world and marginalized social and spiritual life. During the course of a hundred and fifty years, from the late 1600s to the mid-1800s, about one-seventh of all English common land was carved up and privatized. As a result, deep inequalities took root in society and urban poverty soared. The foundations of the modern market order were being laid, and the masters of this new world had no need for the commons. 
The hallmarks of the new order would be individualism, private property and “free markets.” Karl Polanyi was an economic historian who… 667

David Johnson’s claim that law amounts to a “self-referential, organizational identity” that belongs to the people who make it. “If law has a life of its own,” he writes, “and in some sense causes its own form of order and persistence, we should be studying its biography rather than pretending that we can design and repair its mechanisms from the outside.” In other words, we must understand the subjective, socially internal dynamics of commons and recognize that this is where law originates. When law is seen in this perspective — not just as a series of formal constitutions and statutes but as a self-organized system that a community creates to manage itself and its resources in orderly fair ways — it is easy to see that the commons itself is a living embodiment of law. It amounts to an evolving social contract. Individuals come together to negotiate the rules and norms that will govern their community. They specify how members may access and use shared resources. They set about making rules for managing land, water, fish and wild game, and for monitoring usage and punishing vandals and free riders. In this broader sense, the law of the commons extends into the mists of time and precedes formal written law by many millennia. 1298

Some of the most astute commentators on these problems are autonomous Marxists such as Massimo De Angelis, editor of The Commoner website; George Caffentzis, founder of the Midnight Notes Collective; Silvia Federici, an historian who concentrates on the feminist implications of the commons; Peter Linebaugh, author of The Magna Carta Manifesto and other histories of English commons; and Michael Hardt and Antonio Negri, the political theorists and authors of Multitude, Empire and Commonwealth. Each in different ways has noted that the core problem of unfettered capitalist markets is their tendency to erode the authentic social connections among people (cooperation, custom, tradition) and to liquidate the organic coherence of society and individual commons. Capital breaks commons into their constituent parts — labor, land, capital, money — and treats them as commodities whose value is identical with their price. This has caused a persistent moral and political crisis because market capitalism cannot answer the questions, What can bind people together beyond the minimal social and civic ties needed to participate in market exchanges? Can a market-based society survive without the commons? 1418

How we define property rights matters because they influence the sorts of personal and social entitlements we may enjoy, affect the kind of social relations we will have and have enormous effects on our sense of well-being (or alienation). In a much-quoted definition, the eighteenth-century jurist William Blackstone described property rights as “the sole and despotic dominion which one man claims and exercises over the external things of the world, in total exclusion of the right of any other individual in the universe.” He implied that property rights belong solely to individuals. But of course property need not be defined this way. As the cruise ship passengers showed, they could choose to exercise temporary individual “use rights” to the same resource instead of exclusive possession. (To be technical, the cruise ship owner is arguably the “owner” of the deck chairs, but the passengers possess them for limited periods of time and in this case are free to set their own rules.) Different property rights schemes have very different implications for how people’s needs are met (or not met). Such choices influence the nature of the social order and the general attitudes among people. This may be the real point of the allegory of the deck chairs — that property rights are more malleable than most people suspect; that their design can be altered; and that such choices have far-reaching effects on how we relate to each other and how we use resources. 1444

People like to think of property as a fairly self-evident category. By default they tend to see it as a private right to exercise exclusive control over physical objects such as land, cars and smartphones. A landowner typically sees his plot of land as a fixed, individual parcel of inert soil over which he may do whatever he wants. But the conceit that “property” has no social or ecological implications is a fantasy of modern life. In reality, a piece of land is a living part of a living ecosystem. Even as a commodity, its value is dependent upon the character of adjacent pieces of land and the larger ecosystem. A country home with sweeping views of the surrounding countryside alive with chirping birds and friendly neighbors is more valuable than an identical house located next to a factory and a belching smokestack. In this sense, land is really a fictional commodity, as we have seen. It may be treated as private property, and we maintain the illusion that it is truly self-contained and fungible. But it is not really a bounded unit whose fullest value can be expressed by a price, in isolation from its context. Property is a kind of social fiction — an agreed-upon system for allocating people’s rights to use a resource or exclude access to it. Individual property rights are by no means the only or best way to manage a resource. Land can be well managed as a trust on behalf of the public and future generations. It can be managed through cultural practices and traditions that treat it as a sacred gift of nature, as indigenous peoples often do. Specific and limited rights can be allocated to people in various ways, as farming collectives and conservation easements often do. 1455

The Inalienable Rights of Commoners Property rights do not arise naturally, as the great Digger leader Gerrard Winstanley noted in 1649. They are the result of conquest: “For the power of enclosing land and owning property was brought into the creation by your ancestors by the sword.” 1475

the modern tendency to assert absolute individual property rights is a libertarian fantasy. One person’s property rights invariably end up affecting another person’s property rights; everyone’s freedom cannot be limitless. Indigenous peoples help us see that Western conceptions of property reflect some deep-seated cultural attitudes toward nature and social relationships. We moderns presume that humans can commoditize water, land, genes and other elements of nature as if they are inert objects that can be isolated from their natural context and owned as chattel. 1498

The problem is that dominant market-based forms of law usually privilege individual rights and ignore collective rights and needs. Law does not usually recognize the commons as an institutional form, so it can be difficult to achieve a collective purpose while working within the straitjacket of individual property rights. That’s why protecting commons from enclosures has generally required legal ingenuity, at least within the context of the modern liberal state: the commons exists within a lexical void, rendering it unnamed and inscrutable. It’s important to see 1506

the commons is not simply another variant of property. Its character is quite different. First, the commons is less about ownership as we usually understand it than about stewardship. Ask indigenous peoples if they “own” the land and they will reply that the land owns them. To talk about ownership brings to mind the “sole and despotic dominion” over a resource that Blackstone described. A commons implies a more personal engagement with a resource and a longer-term perspective. It also implies a richer ongoing set of ethical and cultural relationships than private property normally entails. A commons is about the shared management of a resource by many — something that may or may not require formal property law to achieve. 1513

The commons asks us to consider a different paradigm of social and moral order. It asks us to embrace social rules that are compatible with a more cooperative, civic-minded and inclusive set of values, norms and practices. The commons bids us to reject Homo economicus as the default ideal of human behavior. It asks us to entertain the idea that certain rights should be inalienable — that is, not for sale — and to elevate certain social values over private property rights. This is the challenge faced by so much of the human rights movement — to recognize human dignity, respect, social reciprocity and social justice as elemental human needs that law must protect. Traditionally, human rights have been seen as an abstract, universal norm selectively enforced by the nation-state (depending upon political circumstances). The commons proposes a more local, “on the ground” reconceptualization of human rights: a way for communities to meet basic needs more directly and, quite possibly, more reliably. 1533

Locke’s theory of property merits our attention because it still sets the framework for how we see and justify property rights. If the labor that we expend in discovering or improving a piece of land entitles us to own it, then land that is “undeveloped” belongs to no one and is therefore free for the taking. This was a convenient idea for eighteenth-century European explorers eager to seize the riches of the New World. By the logic of Locke’s philosophy, such lands should be considered terra nullius, or empty land (sometimes referred to as res nullius, or a nullity), because land becomes valuable only as individuals apply their labor and ingenuity to it (by improving it, making it marketable, etc.). It is Locke’s conceit that nature is an inert object that can be privately owned without regard for its connections to its existing inhabitants or larger natural ecosystems. Thus even though indigenous peoples and peasants have managed land, water, fisheries, forests and other natural resources as commons from time immemorial — without formal legal titles — Western imperialists have taken comfort in the legal fiction that the land doesn’t belong to anyone — so we can march right in and take it! In this way, Locke’s theory of private property deliberately ignores the prior use rights and customs of indigenous peoples, the rights of future generations and the inherent needs of nature itself. Using Lockean logic, it has become customary to talk about oceans, outer space, biodiversity and the Internet as if they too are resources that belong to no one. The logic of res nullius justifies unchecked private plunder. Tellingly, Locke added a brief qualification to his theory stating that any private appropriation is limited to “at least where there is enough, and as good, left in common for others.” He raises an awkward issue that is too obvious to ignore: the exercise of private property rights may encroach on and even destroy resources that belong to everyone. 
In other words, there is an unresolved tension between private property and the commons. This “Lockean proviso,” as it is often called, is mostly treated as a symbolic, throwaway gesture, however. Philosophers and legal scholars may invoke it to show their intellectual rigor, but in practice politicians and the investor class don’t care a whit about honoring it. Transnational bottling companies are still sucking groundwater supplies dry without leaving enough, and as good, in common. Agriculture–biotechnology (ag-biotech) companies are still marketing proprietary genetically modified crops that destroy sustainable seed-sharing. Industrial trawlers are still overexploiting ocean fisheries to the point of exhaustion, dispossessing small coastal fishing communities. Whatever one makes of his proviso, Locke’s singular intent was to justify private property, not assure the longevity of the commons. In this tradition, private property laws today continue to ignore or criminalize commoners who use resources in a collective fashion. Nonmarket subsistence commoning is not seen as “adding value” in a Lockean sense; by this logic, commoners are entitled to no property rights protection. This is how the “freedom” of private property is used to dispossess and violate commoners, as seen in the international land grab of customary lands in Africa. It is important to understand the Lockean analysis because it has become the central moral justification of modern capitalism and its enclosures. As a number of commons scholars, such as Wolfgang Hoeschele and Roberto Verzola, have noted, capitalism is about the engineering of scarcity. To maximize profits and market share, businesses deliberately create scarcity by finding novel ways to limit supplies or access to resources. 
Copyright and patent law, for example, take resources that are cheap and easy to reproduce — information and knowledge — and deliberately give limited-term monopolies to authors and inventors whose creativity is presumed to be wholly novel and original. The ag–biotech industry likes… 1550

The price system typically fails to take account of all sorts of value that are external to the marketplace. For example, price cannot easily represent types of value that are subtle, qualitative, long-term and complicated — precisely the attributes of nature. What’s the market value of the atmosphere? Of a clean river? Of babies born without pollution-induced birth defects? Markets have trouble answering such questions because there is no meaningful market price for such things. Price only measures exchange value, after all; it doesn’t really measure use value. And so the grand narrative of conventional economics celebrates Gross Domestic Product as the height of human progress by totaling the value of all market activity. It doesn’t really care if that activity is beneficial to society or not — in fact, it doesn’t even ask that question! Instead it just measures if money has changed hands, which is its moronic definition of wealth creation. By this reckoning, the Gulf of Mexico oil spill and the Fukushima nuclear disaster should be considered good, because they ended up stimulating economic activity. Ida Kubiszewski, Robert Costanza and a team of other economists vividly demonstrated the shortcomings of GDP in a 2013 study of the net social benefits of economic activity in 17 countries, representing 53 percent of the world’s population. Using a new index, the Genuine Progress Indicator or GPI, they explicitly took into account dozens of factors that GDP ignores, such as negative activities like crime, pollution and social problems as well as positive nonmarket activities such as volunteering and household work. Their conclusion? The economists found that the costs of economic growth globally have outweighed the benefits since 1978! That year was also the point at which the global ecological footprint of human activity exceeded global biocapacity. 
And despite a three-fold global increase in GDP since 1950, life satisfaction in nearly all 17 countries surveyed had not improved significantly since 1975. John Ruskin called the unmeasured, unintended harms caused by markets “illth.” The problem with the price system, as yoked to private property, is that it generates as much illth as wealth — but hardly any of this illth gets counted. It’s off the books. A company’s bottom line and a nation’s GDP reflect only the monetized wealth generated by markets; they deliberately omit the nonmarket illth. This damage is borne mostly by commons as markets take what they can from nature, for free, without acknowledging its actual value (because nature is seen as res nullius). Once profits have been taken and privatized, the market then dumps its wastes and disruptions back onto the commons, leaving commoners and governments to mop up the mess. As mentioned earlier, this might be called the “tragedy of the market” — the unmetered, hidden subsidies and costly “externalities” that markets, in the service of private property, impose upon the commons. This should not be surprising in a society that looks to price as the highest, most reliable metric of value. If a resource does not have a price or property rights, it naturally will be regarded as “not valuable” or “free for the taking.” No wonder normal market activity frequently rides roughshod over ecological values; nature’s wealth does not come with price tags. 1597

Market externalities are easy to ignore, too, because they tend to be diffused among many people and across large geographic areas. No individual or locality can take effective action against air pollution, say, or against pesticide residues in food. Externalities also tend to lurk on the frontiers of scientific knowledge (Does this vaccine cause autism? Do cell phones cause brain cancer?), which means that identifying and confirming negative externalities can be scientifically difficult. And industries actively resist the scientific verification of harmful externalities, lest unwelcome news trigger angry political responses and costly reparations. For all these reasons, a system of stewardship, not ownership, is more likely to take conscientious precautions to prevent harms. In a commons, the structural pressures to earn money are reduced and the incentives to take into account subtle, long-term factors are greater. As a social institution, a commons is also more likely to care about the long-term sustainability of a resource than a market, because the very identities and cultures of commoners are wrapped up in the management of the resource. Markets tend to care primarily about financial returns, and see everything else (working conditions, product safety, ecological concerns, etc.) as secondary. The basic problem is that the signals communicated by prices are too crude and impersonal to alter management practices. 1623

adequate systems to protect the commons from market encroachments. What steps can commoners take to protect the things they love? This is an urgent issue so long as private property and price are the default definitions of value in public policy, because as we have seen, the price system, however valuable in certain contexts, overrides most ecological, social and moral values. How can commoners assure protection for human dignity and respect, over and above that enabled by private property rights? How can they secure the right to engage in nonmarket social exchange — gift economies, informal collaborations, new forms of collective action — whose value is barely recognized by the modern liberal polity? How can commoners uphold social justice and human rights as inalienable values that may have to trump corporate property rights? 1635

the open educational resources (OER) movement has pioneered the cooperative development of open textbooks, curricula and course materials. 1666

law professor Yochai Benkler in his landmark 2006 book The Wealth of Networks, “is the emergence of more effective collective action practices that are decentralized but do not rely on either the price system or a managerial structure for coordination.” Benkler’s preferred term is “commons-based peer production,” by which he means systems that are collaborative, nonproprietary and based on “sharing resources and outputs among widely distributed, loosely connected individuals who cooperate with each other.” 1683

in 2009 and after, a wide array of open educational resources, or OER, emerged as the next turn of the viral spiral. All levels of education and learning communities — not just scholarly publishing — got wise to the fact that proprietary control of knowledge is antithetical to their core values: to learn and grow through participation and sharing. Academia is a commons. Community colleges were dismayed to learn that many students were dropping out or delaying their educations because they could not afford their textbooks. It is not unusual for textbook publishers to bring out new editions every two or three years simply to make the existing used books “obsolete” and promote new textbook purchases. Some farsighted OA educators have responded by forming the Community College Consortium for Open Educational Resources, which helps identify and publicize open textbooks. Such books are CC-licensed and available for the cost of a print-on-demand copy. This has reduced students’ expenses by hundreds of dollars apiece. The Massachusetts Institute of Technology (MIT) pioneered open educational resources in 2001 when it produced the first major body of curricular materials — syllabi, readings, videos, datasets — for free online use. MIT’s innovation has profoundly influenced the teaching of physics and other scientific fields in China as well as many small countries with isolated rural populations. It has also spawned the OpenCourseWare Consortium, which now has more than 120 member universities and educational institutions worldwide. The viral spiral that started with free software and the CC licenses continues to expand. The very term “open source” has become a widely used cultural meme to celebrate production that is open, participatory, transparent and accountable. Open source principles now animate a robust “open design” movement that invites anyone to help design clothing, furniture, computer components, even automobiles. 
A group called Arduino now designs and produces scores of printed-circuit boards and computer components, which enable cheap and easy customization by techies. An Open Prosthetics Project invites anyone to contribute to the design of prosthetic limbs — or to the specifications for limbs that ought to be designed even if the designer doesn’t know how to do it herself. Among the designs: prosthetic limbs for rock climbers and a prosthetic arm for fishing. One of the more fascinating open-network projects is Wikispeed, a Seattle-based automotive prototyping and manufacturing start-up project that has collaborators in fifteen countries. Its goal is to use open source principles to design and build a modular, lightweight race car that can travel a hundred miles on a gallon of gasoline. Community networks like Open Source Ecology are now building shareable, low-cost equipment for off-the-grid “resilient communities.” One of its prime projects is the LifeTrac, a low-cost, multipurpose open source tractor whose components are modular, inexpensive and easy to build and maintain. In other words, not complex, expensive or proprietary. Open source design and manufacturing of physical things has reached a large enough scale that the community of innovators has formed its own association, the Open Hardware and Design Alliance. Digital commons now pop up in the most unlikely places. A self-organized group called Crisis Commons is a network of tech volunteers who provide humanitarian aid in response to natural disasters. Following the Haiti earthquake of 2010, thousands of volunteers associated with Crisis Commons swiftly built Web-based translation tools, people finders and maps showing routes to empty hospital beds. There is also a range of what I call “eco-digital commons,” in which Internet technologies are being used to help monitor and manage the environment. 
Some websites now invite individuals to use mobile phones, motion sensors, GPS tracking and other electronic systems to monitor local sightings of birds, butterflies and invasive species, or to monitor pollution levels in… 1785

corporations only support “sharing” if they can make money from it. That’s not commoning. 1832

subsistence commons, operating outside of the market system without private property rights or money, are vitally important to an estimated two billion people worldwide, according to the International Association for the Study of the Commons. 1887

It’s worth emphasizing that subsistence commons vary a great deal and are not without their problems. Many need better management; others are poorly managed and could be improved; still others struggle in unsupportive political environments. Yet they remain an important means of everyday sustenance and dignity that strive to respect ecological limits. That’s an impressive accomplishment that markets and states have trouble emulating. 1889

A South African lawyers’ group called Natural Justice has developed a legal instrument known as “biocultural community protocols” (BCP) that is a novel attempt to protect cultural traditions and practices from appropriation by outsiders. BCPs set forth the specific values and customary procedures that a community has chosen to manage its natural resources. The protocols also spell out the procedural and substantive rights of commoners to participate in decisionmaking, and to demand free, prior and informed consent to specific public policies that might be imposed on them. The BCPs also ensure that people can monitor and evaluate the impact of projects in their community. It is difficult to generalize about indigenous peoples’ commons because they embody so many different types of landscapes, tribal cosmologies and cultural practices. Still, legal scholar Rebecca Tsosie has noted striking similarities among indigenous systems of knowing and interacting with the natural world. Indigenous peoples’ commons tend to reflect “a perception of the earth as an animate being; a belief that humans are in a kinship system with other living things; a perception of the land as essential to the identity of the people; and a concept of reciprocity and balance that extends to relationships among humans, including future generations, and between humans and the natural world.” Indigenous peoples have developed remarkably stable socio-ecological models precisely because they focus on long-term social relationships, not irregular market transactions. Westerners often dismiss indigenous peoples’ commons out of hand because they are not based on strict individualism, private property rights and market notions of “value” (i.e., a price for everything). As N. Bruce Duthu, a leading scholar of Native American law, has written, “The idea of ‘property’ in the Western tradition . . . implies an orientation toward the Market use of resources without special regard for the long-term ecological consequences or the social meanings of nature to people; the price system presumes a basic equivalence among like-priced elements of nature. Societies that have a more direct, subsistence relationship to nature may therefore find property- and market-based sensibilities alien and even offensive.” Not surprisingly, the industrialized nations of the world scoff at Bolivia’s proposal that the United Nations recognize “nature’s rights,” an idea that lies at the heart of so many indigenous peoples’ commons. Honoring “Mother Earth” — as the Pachamama movement in Latin America advocates — is seen by the industrialized world as ridiculous, impractical nonsense, but this prejudice simply illustrates the West’s alarming cultural myopia. It 1913

In gift economies, however, as Lewis Hyde noted in his classic book The Gift, social boundaries are blurred or even eradicated through gift exchange. There is no self-serving calculation of whether the value given and received is strictly equal; the point is to establish ongoing social relationships and sympathies. The subtitle of Hyde’s book — Imagination and the Erotic Life of Property — captures this idea nicely: gifts bring people closer together, especially when the exchange is indirect and staggered over time. So long as gifts continue to circulate among people, without a clear reckoning of what one is “owed,” the social commons thrives. 1956

It is often cheaper, easier and more reliable to coordinate an activity through a trusted community. This is surely one reason that “collaborative consumption” is growing as a new hybrid sector of the market economy; artfully designed Web systems let people coordinate the (cash-based) “sharing” of cars, commuting rides, bikes and tools. One of the more remarkable Internet-based gift economies is CouchSurfing, a free, informal system of overnight hospitality used by travelers (and the people who host them) in more than ninety-seven thousand cities and towns around the world. Cash exchange between host and visitor is explicitly prohibited. CouchSurfing is a vast Web-mediated gift economy that helps more than five million strangers a year give and receive hospitality in each other’s homes, often forging new friendships in the process. 1963

Cities are an especially fertile environment for social commons because of the great diversity and density of people there. San Francisco has been something of a leader. After a local organization, Shareable magazine, issued a policy paper, “Policies for a Shareable City,” Mayor Ed Lee appointed a Sharing Economy Working Group to explore ways to encourage a “shareable city.” Among the ideas: resource-sharing among citizens (e.g., ride sharing), coproduction assisted by the city government (urban agriculture) and mutual aid among citizens (eldercare). In Naples, Italy, Mayor Luigi de Magistris has appointed an Assessor of the Commons to take account of local commons systems and has rallied municipal officials throughout Italy to improve city government support for local commons. 1971

In Rome, Italy, the former employees of a grand public theater and former opera house, Teatro Valle, took over the premises in 2011 after the city government had failed to support it, and managed it as a self-organized commons. The protest was part of a larger complaint about the government’s failure to maintain civic and recreational spaces even as it privatizes cherished public properties, leading to higher rents and evictions. The occupation of Teatro Valle, still underway, has inspired other citizen groups to mount direct action protests that have reclaimed other buildings and spaces. Instead of simply fighting privatization, aggrieved Romans have come to realize that they need active, ongoing self-governance beyond representative government. 1977

There are other more ambitious initiatives to try to promote social commons in urban areas. Urban designers Nikos A. Salingaros, Federico Mena-Quintero and others are seeking to apply the principles of peer-to-peer production to urban environments. “P2P Urbanism,” as it is called, seeks to make city design and daily life more hospitable to ordinary people. Instead of the dehumanizing monumentalism that “starchitects” have inflicted on many cities, P2P Urbanism proposes collaborative design and user participation in urban planning, drawing upon the wisdom of pattern theory guru and architect Christopher Alexander. The initiative also seeks to make urban design more adaptable to local conditions and individual needs in the style of open source software and peer production. 1983

and accountable to them. However, in contemporary life, commerce is so often integrated with vast national or global markets and driven by the “divine right of capital,” as Marjorie Kelly puts it. Capital-driven markets tend to produce enormous structural disparities of power that disenfranchise consumers, workers and communities. They plunder nature with little concern for the long-term consequences. The good news is that it is becoming easier for many communities to assert greater control over the structure and behavior of markets. For example, community-supported agriculture farms (CSAs) and local farmers’ markets have a deep stake in their communities. These social relationships and the local accountability of markets mean that a community can meet many needs while avoiding the rapacious ethic of global capitalism. Markets need not be predatory and socially corrosive; they can become socially integrated into a community and made locally responsive. Other examples include cooperatives, the Slow Food movement and mutual businesses (owned by their member-consumers), all of which try, in different ways, to incorporate larger social values with market activity. One of the most successful commons-based business enterprises I have encountered is Cecosesola, the Central Cooperative for Social Services of Lara, in Venezuela. For more than forty years, this self-organized, self-financed project has run over eighty cooperatives — banks, farms, factories — as well as civic associations and organizations. Cecosesola deliberately avoids hierarchical relationships and bosses by moving tasks and production among its 1,200 associate workers. Deliberations take place in assemblies that strive for consensus — a process that requires a great deal of mutual education, communication and dialogue. Prices at Cecosesola’s five local food markets are not based on demand but on “fairness.” All vegetables are sold at the same price per kilo, for example. 
Cultivating trust, commitment to the common good and the courage to take risks — all within a flexible, evolving organizational structure — lie at the heart of Cecosesola’s improbable success. 1998

The trick in melding commons and markets, to my mind, consists in nourishing a distinct culture of commoning while devising “defensible boundaries” around the commons so that it can maintain its basic autonomy. In medieval times, commoners would often “beat the bounds” — walk the perimeter of their forest or piece of communal land — as part of an annual community celebration that doubled as an occasion to patrol the boundaries of their commons. If they came upon a private fence or hedge that had enclosed the commons, the commoners would knock it down, re-establishing the integrity of their land. Community enforcement of the “perimeters” of commons is essential. Our task today is to devise modern-day equivalents of beating the bounds. Two successful examples in cyberspace are the General Public License for software and the Creative Commons licenses. Both ensure that commoners can retain control over the fruits of their shared labors by prohibiting private appropriations of code and digital content, respectively. The biocultural protocols developed by the South African advocacy group Natural Justice have a similar purpose — to prevent transnational corporations from appropriating the specialized ethnobotanical knowledge and agro-ecological practices of indigenous peoples. Commoners today “beat the bounds” when they devise formal rules and ethical norms as ways to preserve their commons. The elaborate governance rules of Wikipedia editing, the customs that Maine lobster fishers have negotiated among themselves, the rules for New Mexican acequias — all have the goal of preserving the resource and the community while excluding outsiders who have not invested their energies in cultivating the commons or who may act as vandals or free riders. Equipped with self-devised rules and governance systems, commoners achieve something else as well: they can pressure markets to be more responsive to the consumers who must rely upon them. 
One might call these “commons-based markets” — coherent communities with enough power to influence and tame markets. Such markets are more prevalent on the Internet, where social communities (or loose networks) can self-organize as passionate affinity groups before turning to markets to meet certain needs. 2014

commons from capitalist exploitation. How can the commons be structured so that its logic is decoupled from that of capitalist markets — and yet still be able to interact with markets as needed? For my colleague Silke Helfrich, the key is to ensure that a commons has the capacity to protect and reproduce itself. The commons must have within its very structure the capacity to assure its own longevity and self-protection. It must be able to protect its resources and community norms. This could be achieved through legal rules that prevent outsider appropriation or interventions. It could be achieved through social practices and norms that constitute commons governance. It could be achieved through geographic isolation from markets or through technological barriers (fences around a resource; digital “gates” for authorized commoners). Without such protections, commons are vulnerable to capitalist appropriations, a problem that can be seen in the Google Books Library Project, Facebook and other open platforms. In such situations, commoning becomes another type of “market input” that can be alienated from commoners and privatized. It is therefore important that commons develop the means to protect the fruits of their labor and reproduce themselves and other commons. What we need, says Helfrich, is “a shift from commons-based peer production to commons-creating peer production.” Ultimately, she insists, “the commons is not about organizational form or property rights. It’s about the purpose. If commoning ends with a sale on the market, then what happens to all the other people who have a stake in the process of commons-based production?” “Open” systems give no guarantee that the long-term social or ecological interests of contributors will be respected or protected. State 2038

the U.S. Patent and Trademark Office’s Peer to Patent project, which invites people to submit instances of “prior art” for inventions. This is a way to improve the quality of patents by helping to identify prior innovations that might call into question a patent application claiming ownership of a novel invention. The wiki-style crowdsourcing helps prevent the government from giving out unwarranted patent monopolies that could inhibit future innovation. Given the proper support, citizen-commoners with expertise and interests in given fields could evolve into active constituencies that act as agency watchdogs. They could come up with their own innovations and pressure government agencies to fulfill their missions better. 2084

In our book, Green Governance: Ecological Survival, Human Rights and the Law of the Commons, my colleague Burns H. Weston and I tried to imagine new sorts of minimalist, flexible policy structures that could encourage the work of commons at all levels — local, regional, national, transnational and global. This takes us beyond the state trustee commons to entirely novel modes of state support for commons. The goal is to unleash the great self-reinforcing energies of commons as a valuable form of governance without stifling them through top-down micromanagement or political interference. The design challenge is to find a way to govern common-pool resources (CPRs) at the lowest levels feasible — a principle often known as “subsidiarity” — and with multiple centers of authority. Levels of commons would be diversified and “nested within” higher levels of governance — the concept of “polycentricity,” an idea that Elinor Ostrom explored in her work. 2094

While skeptics may scoff at such ideas as too speculative and far-fetched for dealing with global environmental problems, it is surely more utopian to think that centralized state institutions of limited competence and declining social trust will be able to force people to adopt changes that the Market/State itself does not really wish to implement in the first place. By contrast, commons have shown their capacity to energize people to take direct responsibility; set limits on market activity; model a new vision of human development; and nurture an ethic of sufficiency. However the new global commons are structured (and this is a longer discussion than we can deal with here), the new state-mediated systems will have to open up new spaces that let commons-based governance flourish. That, at least, is the vision that Burns Weston and I propose. 2101

Government agencies — long accustomed to doling out subsidized assets and infrastructure to voracious corporations — must be structured to act as conscientious and transparent administrative and fiduciary trustees of common assets. Purists may object that government-managed systems for shared resources cannot truly be considered commons. But we should remember that even commons such as open source software or academic research depend upon government and markets in all sorts of indirect ways. Government funding supported the development of the Internet and still funds a vast amount of academic research; most personal computers are still acquired through commercial vendors; and so on. The question is not so much whether markets or governments have some role in commons but rather to what degree and under what terms. The preeminent challenge is to assure the greatest integrity of commons, so that the fruits of commoning are not siphoned away by clever, covetous businesses and governments. For now, the idea of a state-authorized Commons Sector may seem politically quixotic. After all, the state is generally indifferent or hostile to most collective enterprises except corporations. Thus a serious ongoing challenge for commoners is to self-organize themselves into quasi-sovereign collectives — a wiki, a seed-sharing collective, a water commons — committed to building and protecting their various resources and to insisting that the State recognize and respect them. We need new federations within the Commons Sector that can mobilize politically. We must devise legal innovations that can give the commons real standing in law. Until such things are achieved, the empire of capital will continue to impose its suffocating logic as widely as possible. 2119

TO ANDREAS WEBER, a theoretical biologist in Germany, the commons is not simply a matter of public policy or economics. It is an existential condition of life in all its forms, from cellular matter to human beings. “The idea of the commons provides a unifying principle that dissolves the supposed opposition between nature and society/culture,” he writes. “It cancels the separation of the ecological and the social.” According to Weber, the commons provides us with the means to reimagine the universe and our role in it. If we are to truly transform our economic and political systems, Weber argues, then we must also address some unquestioned, deeply embedded premises of those systems. In effect we must reassess the nature of reality itself. As creatures immersed in the liberal political paradigm and the principles of Darwinian evolution, most of us implicitly see life as a fierce, competitive struggle and the economy as a kind of machine in which countless individuals strive to maximize their personal wealth and advantage. Competitive triumph is all. We also see, implicitly, a Newtonian universe in which large abstract forces buffet the inanimate particles of nature. In this view, human consciousness and meaning are insignificant if not moot in the cosmic scheme of things. Our tacit metaphysical commitments, argues Weber, are the very basis for our “free market” economic and political structures. What’s so intriguing is that many scientists are starting to see the natural world and evolution through a different metaphysical prism, one that sees life as a system of cooperative agents constantly striving to build meaningful relationships and exchange “gifts.” Competition still exists, of course, but it is interwoven with deep, stabilizing forms of cooperation. In this new theoretical scheme, the subjective experiences of an organism matter. That’s because, in the emerging scheme of biological thought, all organisms are “meaning-making” living systems. 
Life is seen as an evolutionary process in which embodied subjects interact with their environment and other living organisms to create meaningful relationships. Subjectivity is not an illusion or an inconsequential side-story, as our existing metaphysics claims; it is not a mere bubble of ephemeral, trivial feelings in an empty universe. Rather, subjectivity is the centerpiece of a new “existential ecology” whose primary concern is subjects, not objects alone. Human beings are not isolated atoms adrift in a vast indifferent universe. Our human subjectivity is not separate from a nature that exists as an alien, unfathomable “other.” The subjective and the objective, the individual and the collective, blur into each other — just as in a commons! Weber, speaking as a scientist, calls his new evidence-based theory “biopoetics.” It is both a metaphysics and a biological theory that can explain “the deep relationship between felt experience and biological principles.” Weber argues that the “science of life” as traditionally studied is no longer an adequate methodology for understanding living things. Conventional science fails to address the realities of consciousness and subjectivity in living organisms; indeed, these topics have been more or less banished from the field of study. But, as Weber writes, “only if we understand organisms as feeling, emotional, sentient systems that interpret their environments and do not slavishly obey stimuli, can we ever expect answers to the great enigmas of life.” For him, biopoetics has the potential to provide “a new holistic account of biology as the interaction of subjects producing and providing meaning and hence laying the ground for understanding the meaningful cosmos of human imagination.” 2139

The commons is central to this vision. Only through commoning do we start to reintegrate ourselves with nature and with each other. Our challenge, Weber contends, is to bring about a new “Enlivenment” — a new type of rebirth to succeed the three-hundred-year-old Enlightenment. Our calling is to enact a vision of the universe that honors our subjective identities and need for meaning as biological necessities. We can do this by engaging in “the rituals and idiosyncrasies of mediating, cooperating, sanctioning, negotiating and agreeing, to the burdens and the joy of experienced reality,” says Weber. “It is here where the practice of the commons reveals itself as nothing less than the practice of life.” While Weber’s biological theories, like the commons, remain outside of the mainstream, to me they help explain the deep visceral appeal of the commons paradigm. They confirm that the commons is no PR gambit or “messaging” strategy, but rather a prism for seeing the world anew, and more profoundly: in its totality. In all its diversity. With a realistic understanding of humanity as it works on the ground. Weber’s analysis situates the individual as a conscious, subjective agent in the world. It recognizes the role of actual history, local circumstances, culture and individuals in shaping human evolution and in creating commons. To see the commons — to really see the commons — we need to escape the highly reductionist mindset of market-based economics and culture. We have to learn to see that a cooperative logic can animate human institutions, and that, with the right social structures and norms, this humanistic ethic actually works. Market culture has insidiously narrowed our imaginations. By privileging the interests of private property, capital and markets as governing priorities, our very language marginalizes the idea of working together toward common goals. 2167

Taking the commons seriously, however, means changing some of the ways that we see the world. Our choices are not confined to being employees, consumers, entrepreneurs or investors seeking to maximize our personal economic well-being. We can begin to imagine ourselves as commoners. We can begin to become protagonists in our lives, applying our own considerable talents, aspirations and responsibilities to real-life problems. We can begin to act as if we have inalienable stakes in the world into which we were born. We can assert the human right and capacity to participate in managing resources critical to our lives. 2188

The commons challenges some of the myths that lie at the heart of liberalism, market economics and modernity. It rejects the idea that technological innovation, economic growth and consumerism will inexorably improve our lives if only we try harder and give ourselves more time. As noted earlier, normal economic activity arguably generates as much illth as it does wealth. In this sense, the commons dares to challenge the commodity logic that enshrines price as the supreme arbiter of value and material progress as the linchpin of all progress. Commons scholar James Quilligan helps us understand this when he writes: “The notion of ‘goods and services’ in traditional economics is a reduction of the social relations among individuals — and of the individuals themselves — into commodifiable and fungible things. But a commons-based economics raises the possibility of experiencing value through the practical relationships that arise among individuals, the resources of the world, and that which exists between people and the world” (emphasis in original). 2201

Norms cannot be easily generalized or made universal. This is precisely why it is so difficult to commodify the fruits of the commons without destroying the commons; its value is socially embedded and not readily converted into cash. Monetizing resources in a commons threatens to corrode the social relationships that hold a commons together. As we saw in Chapter 9, indigenous peoples tend to have very different attitudes toward property. When a transnational corporation attempts to patent traditional knowledge or genetic material, they consider such propertization both fatuous and outrageous. No individual can claim to be the sole “author” of collective resources (as copyrights and patents imply) because these resources required generations of stewardship, inherited innovation and culture to develop and refine! No one can appropriate and sell for private gain something entrusted to a commons as a sacred trust. Hence the term “biopiracy.” 2216

Indigenous peoples generally see individuals as nested within a larger network of people; the very idea of the “self-made” person is somewhat ridiculous or even delusional. Not surprisingly, the idea of private property tends to be nonsensical for them because property is not so much a description of a thing as it is a description of social relationships with others. The idea of “sole and despotic dominion” over a resource, as Western law has come to think of property, denies our inescapable dependence on nature and our interdependence on each other. Indigenous people tend to see their resources and knowledge as embedded in a community of reciprocal care and group stewardship. Modern industrial societies presume (incorrectly) that such arrangements are archaic and unnecessary, and that markets can provide what we need. “Monetize the resource and split the income. What could be fairer?” 2229

the “monoculture of knowledge of the 20th Century,” as anthropologist Marianne Maeckelbergh has put it. The knowledge generated by large centralized institutions and disciplines is too brittle, monochromatic and remote from the diverse lived realities of real people. The dominant systems of thought in our time, especially those of bureaucracies, conventional economics and scientific inquiry, have delegitimized vernacular culture — the practice-based ways of knowing and being. We need to understand ourselves as corporeal, situated human beings if we are to surmount our many ecological and social challenges. The loss of diverse languages around the world represents a major setback in humanity’s quest to come to terms with the more-than-human world. Most of Australia’s two hundred and fifty aboriginal languages have disappeared, as have one hundred native languages in the area now known as California. As Daniel Nettle and Suzanne Romaine point out, “the extinction of languages is part of the larger picture of near-total collapse of the worldwide ecosystem.” Native languages represent invaluable storehouses of particularized knowledge, especially about specific ecological systems. “Every language is an old-growth forest of the mind,” as ethnobotanist Wade Davis memorably puts it. 2238

Maeckelbergh has studied a range of activist and networked communities to identify the “alternative ways of knowing” that self-organized communities are developing. This “knowledge is collectively constructed,” she notes. It is “context-specific, partial and provisional.” And it makes a distinction “between knowing something and knowing better.” At the heart of the struggle for self-determination, then, is what anthropologist Arturo Escobar calls “a micro-politics for the production of local knowledge. . . . This micro-politics consists of practices of mixing, re-using, and re-combining of knowledge and information.” Commoners rarely presume that there is a fixed body of canonical knowledge whose authority must be respected. They create their own (situational) types of knowledge through engagement with each other and their common resources. Why should some abstract, self-serving bureaucratic or economic framework automatically prevail when local expertise and experience-rich traditions may be more trustworthy, responsive and practical? 2250

humanity and ecological responsibility. Wendell Berry, the poet and ecologist, has put it this way: “Only the purpose of a coherent community, fully alive both in the world and in the minds of its members, can carry us beyond fragmentation, contradiction, and negativity, teaching us to preserve, not in opposition but in affirmation and affection, all things needful to make us glad to live.” Or as Berry said on another occasion, quoting Alexander Pope, “Consult the genius of the place in all.” This approach resonates so deeply with commoners because global commerce has diminished so much that was once distinctive and fecund about individual places. 2262

Wendell Berry said it well: “The great enemy of freedom is the alignment of political power with wealth. This alignment destroys the commonwealth — that is, the natural wealth of localities and the local economies of household, neighborhood, and community — and so destroys democracy, of which the commonwealth is the foundation and practical means.” We should not romanticize the local as an easy or automatic solution to the problems caused by global markets, however. The need for responsive “top-down” structures remains. Some collective-action problems can only be solved with appropriate high-level policies or infrastructures. Centralized bodies are often needed to assure a rough equality of opportunity and resources, or to oversee redistributions of wealth. It doesn’t make sense for every community to replicate functions that might be performed effectively (and without harmful externalities) at a state or national level, or even by larger markets. On the other hand, a certain redundancy and inefficiency are essential to a system’s long-term resilience. For the time being, however, we don’t really have a rich typology of larger-scale commons infrastructures. We don’t really know how to design or build them. Such functions are usually considered the province of government. But I think it is time for commoners themselves to imagine how infrastructures and large governing protocols should be engineered. This could be politically difficult. Governments are jealous of their sovereignty and are not generally predisposed to understand and support commons. The idea of letting bottom-up, network-driven decisions emerge and prevail is threatening to traditional institutions of control. Yet that may be the only way that the energy, imagination and social legitimacy of commoners will be available to solve our myriad problems. We’ve already seen in countless ecological and social crises that the state and market, as constituted, are not up to the job. 
Let’s begin to acknowledge this simple fact. 2277

Commons-based models are not just “policy mechanisms” that are inserted into a situation to “solve” a problem; they generally embody a very different vision of life than that of Western industrialization and consumerism. 2307

In Ecuador and Bolivia, buen vivir — “good living” — is a discourse that attempts to name a different development vision and way of being in the world. Buen vivir honors the ideas of community autonomy, social reciprocity, respect for natural ecosystems and a cosmic morality. In various ways, indigenous peoples, traditional cultures and commoners caught up in market systems are trying to express a worldview beyond the rational instrumentalism and economic mentality of market capitalism. In this sense, the commons is not just about managing resources; it’s an ethic and inner sensibility. This inner conviction ultimately empowers people to take responsibility for the Earth’s resources and to nourish their own sense of stewardship. People discover that it is not only personally enlivening and culturally wholesome to participate in a commons; it is a way to encourage people to set and enforce sustainable limits on markets. Commoning provides a credible alternative to the growth- and consumer-based visions of development peddled by the World Bank. It provides a path for reducing inequality and insecurity in marginalized nations while highlighting the vital role of local ecosystems and commons-based governance. 2308

The basic problem is that the state has strong incentives to ally itself with market forces in order to advance the privatization and commodification of public resources. Enclosures + economic growth = power and tax revenues. To disrupt this logic, we must reconceptualize the role of the State so that it acts to authorize and support commons-based provisioning. As Professor Burns Weston and I explain in our book Green Governance, political pressure must be brought to bear on states to recognize a number of “macro-principles and policies” to support the commons. These include recognition of:

- commons- and rights-based ecological governance as a practical alternative to the state and market;
- the principle that the Earth belongs to all;
- a state duty to prevent enclosures of commons resources;
- state trustee commons as a way to protect large-scale common-pool resources;
- state chartering of commons;
- legal limitations on private property as needed to ensure the long-term viability of ecological systems; and
- the human right to establish and maintain ecological commons. 2322

The commons, to the extent it is considered at all, is often equated with “the citizenry” or “the public,” and not with distinct communities of commoners. It may take some cultural imagination, therefore, to entertain the idea of the commons as an independent sector separate from the State, with its own moral compass and political identity. There are legitimate policy questions about how national and provincial governments can formally recognize the commons in law. It is not self-evident how the State could assure that local commons, absent intervention, would not abuse their authority or the environment, or discriminate unfairly against some people. These are serious questions, but I do not consider them insuperable. After all, the State has delegated considerable authority to corporations to perform certain functions while retaining ongoing oversight. If the State can charter corporations as a vehicle for serving the public good, in principle it ought to be able to delegate similar authority to commons. Diverse sorts of commons demonstrably serve the public good every bit as much as State-chartered corporations do (and at far less cost to the environment and public resources). And properly structured commons are generally more responsive than legislatures and State bureaucracies, which tend to be geographically remote, inaccessible to the layperson and heavily influenced by monied special interests. 2346

law. The State could and should do more to recognize the authority of commons as vehicles for serving the public interest. But calibrating the level of State involvement is tricky. It is important that the State not become too involved in overseeing the commons lest it overwhelm the will of commoners to manage things themselves, which is the very point. Yet the State should not simply use the existence of commons to shirk its own responsibilities by withdrawing legal, administrative or financial support for them. This is a criticism made of UK Prime Minister David Cameron’s “Big Society” policy gambit, which has celebrated community control while cutting public funding to assist it. 2366

As I see it, the proper model for State support of commons should be “State policies in the service of commons formation and stewardship.” The State should openly recognize that self-organized commons can perform certain functions more effectively than the State or Market, and with greater perceived legitimacy, fairness and participation. 2371

it is abundantly clear that commoners using digital networks can now amass, organize and deploy knowledge more rapidly and reliably than large centralized bureaucracies (examples abound in the use of wikis, crisis-relief coordination, reporting via social networks and crowdsourcing of research). The real challenge may be how to find new ways for bureaucratic institutions and digital commons to collaborate. Ecosystem resources, too, are often more effectively and responsively managed by local commoners with the direct authority and responsibility to supervise their own forests, fisheries or water systems without outside interference. 2374

Makers and Takers: The Rise of Finance and the Fall of American Business by Rana Foroohar

You have 87 highlighted passages

You have 0 notes

Last annotated on July 13, 2016

Finance holds a disproportionate amount of power in sheer economic terms. (It represents about 7 percent of our economy but takes around 25 percent of all corporate profits, while creating only 4 percent of all jobs.) But its power to shape the thinking and the mind-set of government officials, regulators, CEOs, and even many consumers (who are, of course, brought into the status quo market system via their 401(k) plans) is even more important. This “cognitive capture,” as academics call it, was a crucial reason that the policy decisions taken by the administration post-2008 resulted in large gains for the financial industry but losses for homeowners, small businesses, workers, and consumers. It’s also the reason that the rules of our capitalist system haven’t yet been rewritten in a way that would force the financial markets to do what they were set up to do: support Main Street. As the conversation detailed above shows, when all the people in charge of deciding how market capitalism should operate are themselves beholden to the financial industry, it’s impossible to craft a system that will be fair for everyone. 59

Wage growth is flat. Six out of the top ten fastest-growing job categories pay $15 an hour and workforce participation is as low as it’s been since the late 1970s.4 It used to be that as the fortunes of American companies improved, the fortunes of the average American rose, too. But now something has broken that relationship. That something is Wall Street. 118

the business of America isn’t business anymore. It’s finance. From “activist investors” to investment banks, from management consultants to asset managers, from high-frequency traders to insurance companies, today, financiers dictate terms to American business, rather than the other way around. 127

Wealth creation within the financial markets has become an end in itself, rather than a means to the end of shared economic prosperity. The tail is wagging the dog. Worse, financial thinking has become so ingrained in American business that even our biggest and brightest companies have started to act like banks. 130

They are, in essence, acting like banks, but they aren’t regulated like banks. 136

only the tip of the iceberg. In fact, American firms today make more money than ever before by simply moving money around, getting about five times the revenue from purely financial activities, such as trading, hedging, tax optimizing, and selling financial services, than they did in the immediate post–World War II period.8 147

Eight years on from the financial crisis of 2008, we are finally in a recovery, but it has been the longest and weakest recovery of the postwar era. The reason? Our financial system has stopped serving the real economy and now serves mainly itself, 151

capitalism is sick, and the big-picture symptoms—slower-than-average growth, higher income inequality, stagnant wages, greater market fragility, the inability of many people to afford middle-class basics like a home, retirement, and education—are being felt throughout our entire economy and, indeed, our society. 154

The financialization of America includes everything from the growth in size and scope of finance and financial activity in our economy to the rise of debt-fueled speculation over productive lending, to the ascendancy of shareholder value as a model for corporate governance, to the proliferation of risky, selfish thinking in both our private and public sectors, to the increasing political power of financiers and the CEOs they enrich, to the way in which a “markets know best” ideology remains the status quo, even after it caused the worst financial crisis in seventy-five years. 160

It’s a shift that has even affected our language, our civic life, and our way of relating to one another. We speak about human or social “capital” and securitize everything from education to critical infrastructure to prison terms, a mark of our burgeoning “portfolio society.”9 164

Takers, those that use our dysfunctional market system mainly to enrich themselves rather than society at large. These takers include many (though certainly not all) financiers and financial institutions, as well as misguided leaders in both the private and the public sector, including numerous CEOs, politicians, and regulators who don’t seem to understand how financialization is undermining our economic growth, our social stability, and even our democracy. 170

Today finance engages mostly in alchemy, issuing massive amounts of debt and funneling money to different parts of the financial system itself, rather than investing in Main Street.10 “The trend varies slightly country by country, but the broad direction is clear: across all advanced economies, and the United States and the UK in particular, the role of the capital markets and the banking sector in funding new investment is decreasing. Most of the money in the system is being used for lending against existing assets,” says Adair Turner, former British banking regulator, financial stability expert, and now chairman of the Institute for New Economic Thinking, whose recent book, Between Debt and the Devil, explains the phenomenon in detail.11 In simple terms, what Turner is saying is that rather than funding the new ideas and projects that create jobs and raise wages, finance has shifted its attention to securitizing existing assets (like homes, stocks, bonds, and such), turning them into tradable products that can be spliced and diced and sold as many times as possible—that is, until things blow up, as they did in 2008. Turner estimates that a mere 15 percent of all financial flows now go into projects in the real economy. The rest simply stays inside the financial system, enriching financiers, corporate titans, and the wealthiest fraction of the population, which hold the vast majority of financial assets in the United States and, indeed, the world. 176

the fact that we are in the longest and weakest economic recovery of the post–World War II period, despite the trillions of dollars of monetary and fiscal stimulus that our government has shelled out since 2008, shows that our model is broken. Our ability to offer up the appearance of growth—via low interest rates, more and more consumer credit, tax-deferred debt financing for businesses, and asset bubbles that make us all feel richer than we really are, until they burst—is at an end. What we need isn’t virtual growth fueled by finance, but real, sustainable growth for Main Street. 189

How did this sector, which was once meant to merely facilitate business, manage to get such a stranglehold over it? 197

rapid-fire computerized trading that now makes up about half of all US stock market activity.13 The entire value of the New York Stock Exchange now turns over about once every nineteen months, a rate that has tripled since the 1970s.14 No wonder the size of the securities industry grew fivefold as a share of gross domestic product (GDP) between 1980 and the mid-2000s, while ordinary bank deposits shrank from 70 to 50 percent of GDP.15 217

With the rise of the securities and trading portion of the industry came a rise in debt of all kinds, public and private. Debt is the lifeblood of finance; it is where the financial industry makes its money. At the same time, a broad range of academic research shows that rising debt and credit levels stoke financial instability.16 And yet, as finance has captured a greater and greater piece of the national pie—its share of the US economy has tripled in the postwar era17—it has, perversely, all but ensured that debt is indispensable to maintaining any growth at all in an advanced economy like the United States, where 70 percent of output is consumer spending. Stagnating wages and historically low economic growth can’t do the trick, so debt-fueled finance becomes a saccharine substitute for the real thing, an addiction that just gets worse and worse.18 As the economist Raghuram Rajan, one of the most prescient seers of the 2008 financial crisis, argued in his book Fault Lines, credit has become a palliative to address the deeper anxieties of downward mobility in the middle class. As he puts it sharply, “let them eat credit” could well summarize the mantra of the go-go years before the economic meltdown.19 222

financial fees are rising, even as financial efficiency falls.23 So much for efficient markets.24

STEALING THE SEED CORN OF THE FUTURE

But as credit and fees have risen inexorably, lending to business—and in particular small business—has come down over time. Back in the early 1980s, when financialization began to gain steam, commercial banks in the United States provided almost as much in loans to industrial and commercial enterprises as they did in real estate and consumer loans; that ratio stood at 80 percent. By the end of the 1990s, the ratio fell to 52 percent, and by 2005, it was only 28 percent.25 Lending to small business has fallen particularly sharply,26 as has the number of start-up firms themselves. In the early 1980s, new companies made up half of all US businesses. By 2011, they were just a third,27 a trend that numerous academics and even many investors and businesspeople have linked to the financial industry’s change in focus from lending to speculation.28 The waning of entrepreneurship means less economic vibrancy, given that new businesses are the nation’s foremost source of job creation and GDP growth. As Warren Buffett once summed it up to me in his folksy way, “You’ve now got a body of people who’ve decided they’d rather go to the casino than the restaurant” of capitalism. In lobbying for short-term share-boosting management, finance is also largely responsible for the drastic cutback in research and development outlays in corporate America, investments that are the seed corn for the future. 
Indeed, if you chart the rise in money spent on share buybacks and the fall in corporate spending on productive investments like R&D, the two lines make a perfect X.29 The former has been going up since the 1980s, with S&P 500 firms now spending $1 trillion a year on buybacks and dividends—equal to more than 95 percent of their net earnings—rather than investing that money back in research, product development, or anything that could contribute to long-term company growth. Indeed, long-term investment has fallen precipitously over the past half century. In the 1950s, companies routinely set aside 5–6 percent of profits for research. Only a handful of firms do so today. Analysis funded by the Roosevelt Institute, for example, shows that the relationship between cash flow and corporate investment began to fall apart in the 1980s, as the financial markets really took off.30 240
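The buyback figures above imply how little is left over for reinvestment. The arithmetic below is a back-of-envelope sketch built solely from the passage's own numbers (roughly $1 trillion a year in payouts, equal to more than 95 percent of net earnings):

```python
# Back-of-envelope implied by the passage's figures: if ~$1 trillion in
# buybacks and dividends equals roughly 95 percent of S&P 500 net
# earnings, how much remains for R&D and other long-term investment?
payouts = 1.0e12      # ~$1 trillion/year in buybacks + dividends
payout_share = 0.95   # "more than 95 percent of net earnings"

implied_net_earnings = payouts / payout_share
left_for_reinvestment = implied_net_earnings - payouts

print(f"Implied net earnings:  ~${implied_net_earnings / 1e12:.2f} trillion")
print(f"Left for reinvestment: ~${left_for_reinvestment / 1e9:.0f} billion")
```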

We have made a Faustian bargain, in which we depend on the markets for wealth and thus don’t look too closely at how the sausage gets made. 281

The number of new initial public offerings (IPOs) is about a third of what it was twenty years ago. Part of this is about the end of the unsustainable, Wall Street–driven tech stock boom of the 1990s. But another reason is that firms simply don’t want to go public. That’s because an IPO today is likely to mark not the beginning of a new company’s greatness, but the end of it. According to a Stanford University study, innovation tails off by 40 percent at tech companies after they go public, often because of Wall Street pressure to keep jacking up the stock price, even if it means curbing the entrepreneurial verve that made the company hot in the first place.35 284

In the first half of 2015, the United States boasted $81.7 trillion worth of financial assets—more than the combined total of the next three countries (China, Japan, and the United Kingdom).38 We are at the forefront of financialization; our financiers and politicians like to brag that America has the world’s broadest and deepest capital markets. But contrary to the conventional wisdom of the last several decades, that isn’t a good thing.39 All this finance has not made us more prosperous. Instead, it has deepened inequality and ushered in more financial crises, which destroy massive amounts of economic value each time they happen. Far from being a help to our economy, finance has become a hindrance. More finance isn’t increasing our economic growth—it is slowing it.40 Indeed, studies show that countries with large and quickly growing financial systems tend to exhibit weaker productivity growth.41 That’s a huge problem, given that productivity and demographics together are basically the recipe for economic progress. One influential paper published by the Bank for International Settlements (BIS) put the issue in quite visceral terms, asking whether a “bloated financial system” was like “a person who eats too much,” slowing down the rest of the economy. The answer is yes—and in fact, finance starts having these kinds of adverse effects when it’s only half of its current size in the United States.42 Other reports by groups like the Organisation for Economic Co-operation and Development (OECD)43 and the International Monetary Fund (IMF)44 have come to a similar conclusion: the industry that was supposed to grease the wheels of growth has instead become a headwind to it. Part of this adverse impact stems from the decrease in entrepreneurship and economic vibrancy that has gone hand in hand with the growth of finance. Another part is about the mounting monopoly power of large banks, whose share of all banking assets has more than tripled since the early 1970s. 
(America’s five largest banks now make up half its commercial banking industry.)45 300
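The claim above that "productivity and demographics together are basically the recipe for economic progress" reflects a standard growth decomposition: output equals workers times output per worker, so the two growth rates roughly add. A minimal sketch, using illustrative rates rather than actual US data:

```python
# Standard decomposition: GDP = workers x output-per-worker, so
# (1 + GDP growth) = (1 + labor-force growth) x (1 + productivity growth).
# The rates below are illustrative assumptions, not actual US figures.
productivity_growth = 0.010   # 1.0% annual growth in output per worker
labor_force_growth = 0.005    # 0.5% annual growth in the workforce

gdp_growth = (1 + productivity_growth) * (1 + labor_force_growth) - 1
print(f"Implied GDP growth: {gdp_growth:.2%}")  # close to the simple sum, 1.5%
```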

That growing dominance means that financial institutions can increasingly funnel money where they like, which tends to be toward debt and speculation, rather than productive investment on which it takes longer to reap a profit. Power—in terms of both size and influence—is also the reason the financial sector’s lobby is so effective. Finance regularly outspends every other industry on lobbying efforts in Washington, D.C.,46 which has enabled it to turn back key areas of regulation (remember the trading loopholes pushed into the federal spending bill by the banking industry in 2014?) and change our tax and legal codes at will. 321

changes, our economy is gradually becoming “a zero-sum game between financial wealth-holders and the rest of America,” says former Goldman Sachs banker Wallace Turbeville, who runs a multiyear project on financialization at the nonprofit think tank Demos.48 328

Indeed, one of the most pernicious effects of the rise of finance has been the growth of massive inequality, the likes of which haven’t been seen since the Gilded Age. The two trends have in fact moved in sync. Financial sector wages—an easy way to track the two variables’ relationship—were high relative to everyone else’s in the run-up to the market crash of 1929, then fell precipitously after banking was reregulated in the 1930s, and then grew wildly from the 1980s onward as finance was once again unleashed.49 The share of financiers within the top 1 percent of the income distribution nearly doubled between 1979 and 2005.50 331

gap. Financiers and the corporate supermanagers whom they enrich represent a growing percentage of the nation’s elite precisely because they control the most financial resources. These assets (stocks, bonds, and such) are the dominant form of wealth for the most privileged,51 which actually creates a snowball effect of inequality. 338

tome, Capital in the Twenty-First Century, the returns on financial assets greatly outweigh those from income earned the old-fashioned way: by working for wages.52 Even when you consider the salaries of the modern economy’s supermanagers—the CEOs, bankers, accountants, agents, consultants, and lawyers that groups like Occupy Wall Street rail against—it’s important to remember that somewhere between 30 and 80 percent of their income is awarded not in cash but in incentive stock options and stock shares. This type of income is taxed at a much lower rate than what most of us pay on our regular paychecks, thanks to finance-friendly shifts in tax policy in the past thirty-plus years. That means the composition of supermanager pay has the effect of dramatically reducing the public sector take of the national wealth pie (and thus the government’s ability to shore up the poor and middle classes) while widening the income gap in the economy as a whole. The top twenty-five hedge fund managers in America make more than all the country’s kindergarten teachers combined, a statistic that, as much as any, reflects the skewed resource allocation that is part and parcel of financialization.53 342

wealth built on financial markets is “more abstracted from the real world” and thus more volatile, contributing to a cycle of booms and busts (which of course hurt the poor more than any other group).56 As Piketty’s work so clearly shows, in the absence of some change-making event, like a war or a severe depression that destroys financial asset value, financialization ensures that the rich really do get richer—a lot richer—while the rest become worse off. That’s bad not only for those at the bottom, but for all of us. Research proves that more inequality leads to poorer health outcomes, lower levels of trust, more violent crime, and less social mobility—all of the things that can make a society unstable.57 As Piketty told me during an interview in 2014, there’s “no algorithm” to predict when revolutions happen, but if current trends continue, the consequences for society in terms of social unrest and economic upheaval could be “terrifying.”58 359

the depth and breadth of correlations between the rise of finance and the growth of inequality, the fall in new businesses, wage stagnation, and political dysfunction strongly suggest that finance is not just pulling ahead, but is also actively depressing the real economy. On top of this, it’s quantifiably increasing market volatility and risk of the sort that wiped out $16 trillion in household wealth during the Great Recession.59 Evidence shows that the number of wealth-destroying financial crises has risen in tandem with financial sector growth over the last several decades. In their book This Time Is Different: Eight Centuries of Financial Folly, academics Carmen Reinhart and Kenneth Rogoff describe how the proportion of the world affected by banking crises (weighted by countries’ share of global GDP) rose from some 7.5 percent in 1971 to 11 percent in 1980 and to 32 percent in 2007. 371
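The Reinhart–Rogoff measure cited above (the proportion of the world in banking crises, weighted by each country's share of global GDP) is easy to state precisely. Here is a minimal sketch of the calculation, using made-up toy weights rather than their actual data:

```python
# GDP-weighted crisis share, in the spirit of Reinhart and Rogoff's
# measure: sum the world-GDP shares of the countries currently in a
# banking crisis. Country names and weights below are hypothetical.

def crisis_share(gdp_shares, in_crisis):
    """World-GDP share of the countries flagged as in crisis."""
    return sum(share for country, share in gdp_shares.items()
               if country in in_crisis)

gdp_shares = {"A": 0.25, "B": 0.20, "C": 0.10, "D": 0.05, "rest": 0.40}
print(round(crisis_share(gdp_shares, {"B", "C"}), 2))  # 0.3
```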

In the period following the Great Depression, banking was a cornerstone of American prosperity. Back then, banks built the companies that created the products that kept the economy going. If you had some initiative and a great idea, you went to a bank, and the bank checked out your business plan, tracked your credit record, and, with any luck, helped you build your dream. Banks funded America—that’s what we grew up to believe. And that’s what we were told in 2008, when our government pledged some $700 billion of taxpayer money (enough to rebuild the entire Interstate Highway System from scratch and then some) to bail out the American financial system. The resultant Troubled Asset Relief Program, or TARP, was meant to quell the subprime mortgage crisis, brought on, of course, by colossal malfeasance by some of the very banks being saved. But, no surprise, that hasn’t fixed the problem. Wall Street is not only back, but bigger than it was before. The ten largest banks in the country now make up a greater percentage of the financial industry and hold more assets than they did in 2007, nearly two-thirds as much as the entire $18-trillion US economy itself. Main Street, meanwhile, continues to struggle. 385

Financialization is behind the shifts in our retirement system and tax code that have given banks ever more money to play with, and the rise of high-speed trading that has allowed more and more risk and leverage in the system to serve up huge profits to a privileged few. It is behind the destructive deregulation of the 1980s and 1990s, and the failure to reregulate the banking sector properly after the financial crisis of 2008. Individuals from J.P. Morgan and Goldman Sachs may (or, more often, may not) go to jail for reckless trading, but the system that permitted their malfeasance remains in place. The problems are so blatant, in fact, that even a number of Too Big to Fail bankers themselves, including former Citigroup chairman Sandy Weill, have admitted that the system is unsafe, that finance needs much stricter reregulation, and that big banks should be broken up. 422

Not only are many regulators disinclined to police the industry, but they are also woefully underpaid, understaffed, and underfunded. Consider the Commodity Futures Trading Commission (CFTC), which has about the same staff size today as it did in the 1990s, despite the fact that the swaps market it oversees has ballooned to more than $400 trillion.68 It’s not easy for regulators on five-figure salaries, with modest research budgets and enforcement assets, to stay ahead of the algorithmic misdeeds of traders making seven figures. And that’s a shame, because a 2015 survey of hundreds of high-level financial professionals found that more than a third had witnessed instances of malfeasance at their own firms and 38 percent disagreed that the industry puts a client’s best interests first.69 441

(The neuroscience of traders’ brains, which respond to deal making similarly to how addicts’ brains respond to cocaine, is in itself a fascinating area of scholarly inquiry.)70 Other academics, like University of Michigan scholar Gerald Davis, focus on the importance of new management theories such as our notion of shareholder value that puts the investor before everyone and everything else in society, including customers, employees, and the public good. 452

anthropological research that explores the way in which Wall Street culture has come to dominate society and the economy, providing yet another theater for financialization. The anthropologist Karen Ho’s book Liquidated: An Ethnography of Wall Street, for example, looks at how Wall Street’s own labor practices, characterized by volatility and insecurity, have become status quo for the rest of the country.73 “In many ways investment bankers and how they approach work became a model for how work should be conducted. Wall Street shapes not just the stock market but also the very nature of employment and what kinds of workers are valued,” says Ho, who worked in banking before becoming an academic. “What [Wall Street values] is not worker stability but constant market simultaneity. If mortgages aren’t the best thing, it’s, ‘Let’s get rid of the mortgage desk and we’ll hire them back in a year.’ People [in finance are] working a hundred hours a week, but constantly talking about job insecurity. Wall Street bankers understand that they are liquid people.”74 Now, as a consequence, so do we all. Moreover, financialization has bred a business culture built around MBAs rather than engineers and entrepreneurs. Because Wall Street salaries are 70 percent higher on average than in any other industry, many of the best minds are drawn into its ranks and away from anything more useful to society.75 468

reforming business education, which is still filled with academics who resist challenges to the false gospel of efficient markets in the same way that medieval clergy dismissed scientific evidence that challenged the existence of God. It’s about changing a tax system that treats one-year investment gains the same as longer-term ones and induces financial institutions to push overconsumption and speculation, rather than healthy lending to small businesses and job creators. It’s about rethinking retirement, crafting smarter housing policy, and restraining a money culture filled with lobbyists who violate the essential principles of democracy. This book is about connecting those 514

the vast majority of bankers, businesspeople, and economic policy makers aren’t venal—far from it. They are simply part of a very large, complex, and (for them) lucrative system, one that has unfortunately become so dysfunctional that it actively prevents us from making the best and fairest use of our nation’s resources. 528

as the banks got bailed out and swiftly recovered, things in the real economy grew worse. Bank profits reached record heights, yet loans to businesses and consumers shrank. Corporate earnings were high, yet few companies wanted to invest their cash in Main Street. Instead, managers beholden to the markets disgorged it mainly to rich investors and Wall Street.5 Meanwhile, America’s largest financial institutions remained as focused as ever on securities trading, the “casino” part of the banking business, since there was no reason not to be. Regulators had yet—and still have yet—to compel bankers to eschew this more profitable type of business in favor of boring, old-fashioned lending. The very riskiest portion of the markets, derivatives trading, actually grew following the crisis. Globally, it was 20 percent bigger in late 2013 than in late 2007 (and US regulators are trying to police it with budgets that haven’t increased much since then).6 And that’s just what we can see. 599

The great liberal economist John Maynard Keynes, for one, worried that market capitalism might be able to function quite well without actually employing many people, particularly if money went to speculation rather than productive investment. (He called on the government to boost long-term investment through special incentives.) Other thinkers, like Hyman Minsky, Harry Magdoff, and Paul Sweezy, took that idea further, arguing that finance itself creates bubbles and draws money away from the real economy as a matter of course. As Minsky put it, “capitalism is a flawed system in that, if its development is not constrained, it will lead to periodic deep depressions and the perpetuation of poverty.”9 He also believed that the government would be forced to act as a lender of last resort during such periods, a position that would become untenable as public debt levels rose, leading to more public pressure to allow more speculation, which would unleash renewed instability, and so on. This story of a “symbiotic embrace” between finance and underlying economic malaise, one that the markets can’t stave off forever, finds resonance in the fact that every recovery of the post–World War II period has been longer and weaker than the one before.10 628

2015 paper by BIS senior economist Enisse Kharroubi and Brandeis University professor Stephen Cecchetti, who examined how finance affected growth in fifteen countries. They found that productivity—the value that each worker creates in the economy, which, along with demographics, is basically the driver of economic progress—declines in markets with rapidly expanding financial sectors. What’s more, the industries most likely to suffer are those, like advanced manufacturing, that are most critical for long-term growth and jobs. That’s because finance would rather invest in areas like real estate and construction, which are far less productive but offer quicker, more reliable short-term gains (as well as collateral that can be sold in crisis or securitized in boom times).11 No wonder twin booms in credit and real estate were a defining characteristic of many economies worst hit by the 2008 financial crisis.12 Government has a huge role to play in all this. Deregulation from the 1970s onward encouraged banks to move away from their traditional role of enabling investment, and toward embracing speculation. It also paved the way to the so-called shareholder revolution, which enriched investors but pushed corporations into debt and toward short-term decision making. Both trends have redirected capital to less socially useful areas of the economy and created a vicious cycle that’s increasingly difficult to break via the usual methods like monetary policy. Witness the fact that despite the $4.5 trillion the Fed injected into the economy and six years of historically low interest rates, corporations are reinvesting just 1–2 percent of their assets into Main Street.13 Much of the rest is going straight into the pockets of the richest 10 percent of the population—mostly in the form of rising asset prices—and those people are unlikely to spend as much of it as the middle and working classes would. 640

The financial industry is the world’s ultimate power and information hub, the tiny middle portion of an hourglass that represents the larger global economy. All the money in the world, and all the information about who’s making and taking it, passes through that tiny middle. Financiers sit in what is the most privileged position, extracting whatever rent they like for passage. It’s telling that technology, which usually decreases industries’ operating costs, has failed to deflate the costs of financial intermediation. Indeed, finance has become more costly and less efficient as an industry as it deployed new and more advanced tools over time.16 It’s also telling that during the last few decades financiers have earned three times as much as their peers in other industries with similar education and skills.17 As Thomas Piketty put it in Capital in the Twenty-First Century, financiers are, in some ways, like the landowners of old. Instead of controlling labor, they regulate access to things even more important in the modern economy: capital and information. As a result, they represent the largest single group of the richest and most powerful people on earth. Even more so than Silicon Valley titans or petro-czars, financiers are truly masters of our capitalist universe. 679

For several decades now, “the main function of the financial system with respect to corporate America has not been raising funds for investment, but compelling corporations to ‘disgorge the cash’ in the form of payments to shareholders,” says economist J. W. Mason, who studies financialization at the Roosevelt Institute.20 This shift spells bad news for the average worker. Over the last forty years, as finance has grown, the traditional relationship between productivity and wages has gone out the window. Conventional economic wisdom holds that as productivity grows, so too should wages. But during the time that finance has been ascendant, since the late 1970s on, even as productivity per worker doubled, real wages have stalled.21 719

If you want to understand how an industry that creates only 4 percent of the jobs in this country came to represent 7 percent of the economy and take almost 25 percent of all corporate profits, there’s no better place to start than with the history of Citigroup. 737

Dodge v. Ford not only established a legal justification for shareholders’ rights above anyone else’s; it also set a terrible precedent for labor relations that would haunt American business. It was a precedent that chimed with another major business idea of the era: Taylorism.

THE PRINCIPLES OF SCIENTIFIC (MIS)MANAGEMENT

Even before Henry Ford was battling the Dodge brothers, Frederick Winslow Taylor, a mechanical engineer from Philadelphia, was gaining fame and fortune for his ideas about how to improve American industry. Those ideas, which came to be known as “efficiency theory” or, as critics put it, “Taylorism,” were laid out in his seminal work, The Principles of Scientific Management, published in 1911. Like the Dodge brothers, Taylor didn’t think much of labor. His theories were built around the notion that workers were a lazy and rather stupid bunch who needed to be managed closely if the American economy was to become more efficient. His book laid out his disdain for labor in ways that are hard to imagine any business leader openly articulating today. “One of the very first requirements for a man who is fit to handle pig iron as a regular occupation is that he shall be so stupid and so phlegmatic that he more nearly resembles in his mental make-up the ox than any other type,” wrote Taylor. “The man who is mentally alert and intelligent is for this very reason entirely unsuited to what would, for him, be the grinding monotony of work of this character. Therefore the workman who is best suited to handling pig iron is unable to understand the real science of doing this class of work.”18 It’s easy to see, in reading this, how Taylor’s ideas were eventually used to justify racist philosophies like eugenics. But around the turn of the century and for the three decades that followed, Taylorism was considered the state of the art in business theory (Taylor himself had a stint teaching the ideas at Harvard Business School). 
His teachings spread like wildfire in firms eager to become faster, more efficient, and more profitable. Doing so, according to Taylor, involved putting workers in much more rigidly defined boxes, and keeping a tight lid on them. His “time and motion” studies became the basis for new job categorizations in which workers would do one very specific task in a very specific way. Using a stopwatch, Taylor famously stood over factory 1314

large body of research shows, public companies are almost always more conservative in this way than private ones, because the former are under pressure from Wall Street to make quarterly earnings and keep shareholders happy, whether or not that involves decisions that are good for long-term growth. Private companies, for example, invest about twice as much as equivalent public firms do in things like factories, worker training, R&D, and other long-term investments39 (something that belies a claim made by many large public firms today—namely, that a lack of investment into the US economy is the result of high tax rates). 1488

This shift from private to public also engendered shifts in the company’s labor and compensation policies. Ford wanted at all costs to avoid margin-killing strikes, so the firm tended to cut deals with unions that increased pay, while refusing to adopt the more collaborative methods of production that were already being employed in postwar Europe. There, companies like Daimler had adopted a “codetermination” style of management in which labor actually sat on the corporate board and helped make decisions about how the firm was run and how cars were made—a model that ultimately proved more productive and globally competitive. But in the United States, the traditionally Taylorist approach meant management was inclined not to collaborate with labor but to pacify it. Workers got raises but little control, in a Faustian bargain that ultimately backfired decades later as jobs and skills were sent abroad to Asia, where things could be made more cheaply. Adversarial is perhaps the best word to describe relations between management and labor in America, not only at GM but also in most of the auto industry and, indeed, in US corporations as a whole. One of the many reasons that American auto manufacturers are still struggling to implement the sort of collaborative approaches to production that have made some Asian and European companies so successful is that doing so requires a profound mental reset. For example, back in the 1980s and ’90s—when GM and other firms tried to put into place a Japanese-style “andon cord” system that would allow any worker to stop the line if something went wrong—American workers would regularly be yelled at by bosses for actually pulling the cord. They were, some managers thought, just trying to get themselves a free work break.40 The idea that they might take pride in their products and want them to be top-notch seemed an imaginative leap too far. 
Even when bosses were actually willing to create more collaboration, their efforts were subject to Wall Street approval and the fluctuations of market conditions. Back in 1991, for example, when Robert Stempel, then CEO of GM and a respected “car guy,” tried to roll out lean production methods throughout the firm, he was unable to persuade his board or the analysts on the Street that it was worth the effort. They wanted the company to beat their quarterly numbers now, and so Stempel eventually had to revert to the usual way of doing so. He announced plans to close twenty-one plants and cut 74,000 workers—moves that boosted the company’s stock price but cost it trust with labor, which was of course subsequently less interested in negotiating compromises in compensation in exchange for control over the production process. While there has been some incremental improvement, the basic lack of trust in these relationships, which has been largely broken by financialization, persists to this day. All of it resulted in an increasingly dysfunctional business model, one that discouraged all manner of innovation and long-term thinking. Managers used the fact that they were paying higher salaries to justify cuts in R&D spending. Profits were increasingly bolstered not with truly new products and technologies, but by nipping and tucking costs. “The Ford Motor Company was becoming a stagnant place at which to work,” writes Halberstam in The Reckoning.41 “The impulse of product, to make the best and most modern cars possible, was giving way to the impulse of profit, to maximize the margins and drive both the profit and the stock up. It did not happen overnight. It had begun with McNamara and his systems.” Keeping stock prices up meant keeping costs down, and no one was better at that than the Whiz Kids. Yet even before McNamara was named president of Ford in 1960, his strategy of putting finance ahead of design and engineering was having a major effect on quality. 
It’s no surprise, then, that the Edsel, the most notorious flop in all of automotive history, happened under the watch of his team. 1495

Hayes and William J. Abernathy, published in 1980, looked at the problem not only in the auto business, but throughout American industry.47 It found that US firms’ research and development spending had been falling since the mid-1960s, even as the percentage of company leaders coming out of finance, relative to any other area, had been increasing. Money spent on mergers amounted to nearly two-thirds of the entire amount of R&D spending by American industry. Companies were hoarding cash rather than investing, and executives spent the majority of their time on “sophisticated and exotic techniques used for managing their cash hoard,” treating “technological matters simply as if they were adjuncts to finance or marketing decisions.” The article, which was entitled “Managing Our Way to Economic Decline,” could have been written today; the only difference would be that the numbers supporting its thesis would be more striking. No wonder it was re-released in 2007 to popular acclaim. The legacy of the Whiz Kids ensured that the top-down, financialized approach to management became the de facto approach at most firms, a misguided practice that still plagues many American companies. This is in large part because the Whiz Kids themselves took over so many top firms in the years following their revamping of Ford. By 1573

It’s about building strong teams, rather than manufacturing balance sheets that look good to investors. Indeed, there’s a large and growing body of research showing that great teams, not all-powerful leaders, are the connective tissue of companies that perform better over the long haul, and that financialization and team building are antithetical.51 1613

hourly employee, who was wearing a World War II–era “We Can Do It” T-shirt, 1619

It’s just one of many examples of how business education doesn’t prepare our future business leaders for the reality of what happens in actual firms and in the capital markets. Instead of turning the finest business minds into innovators and job creators, it’s turning the people who will run the next generation of American businesses into glorified number crunchers. Business education, it turns out, is failing business. 1688

As Nitin Nohria, dean of the Harvard Business School, admits, “anyone can teach you how to read a P&L [profit-and-loss statement] or value a derivative; those kinds of things have become commoditized.”12 The bigger challenge is to teach America’s future business leaders how to be curious, humane, and moral; how to think outside the box about problems like funding the research for a new blockbuster drug. And how to be strong enough to stand up to Wall Street when it demands the opposite. 1713

“The premise of financial theory [taught in MBA programs] is bogus,” says Robert Johnson, an economist and former quantitative trader for George Soros’s Quantum fund who now heads the Institute for New Economic Thinking, an influential group that, among other things, is trying to broaden the nature of economics and business education. “That’s why we end up living with very thin margins of safety—because of the pretense of knowledge and precision about the future which does not exist.”15 1729

I knew any better, kind of like sailors get tattoos,” jokes former GM vice chairman Bob Lutz, whose book Car Guys vs. Bean Counters decries the rise of the MBAs. The problem with business education, according to him, is that students are taught not what happens in real business—which tends to be unpredictable and messy—but a series of techniques and questions that should take them to the right answers, no matter what the problem is. “The techniques, if you read the Harvard Business School cases, they are all about finding efficiencies, cost optimization, reducing your [product] assortment, buying out competitors, improving logistics, getting rid of too many warehouses, or putting in more warehouses. It’s all words, and then there’s a sea of numbers, and you read it all and analyze your way through this batch of charts and numbers, and then you figure out the silver bullet: the problem is X. And you’re then considered brilliant.” The real problem, says Lutz, is that the case studies are static—they don’t reflect the messy, emotional, dynamic world of business as it is. “In these studies, annual sales are never in question. I’ve never seen a Harvard Business School case study that says, ‘Hey, our sales are going down and we don’t know why. Now what?’ ”20 As we read in chapter 2, Lutz believes this kind of approach was one of the things that tanked the American automobile industry and manufacturing in general from the 1970s onward. He’s not alone. Many of America’s iconic business leaders believe an MBA degree makes you less equipped to run a business well for the long term, particularly in high-growth, innovation-driven industries like pharmaceuticals or technology, which depend on leaders who are willing to invest in the future. MBAs are everywhere, yet the industries where you find fewer of them tend to be the most successful. America’s shining technology and innovation hub—Silicon Valley—is relatively light on MBAs and heavy on engineers. 
MBAs had almost nothing to do with the two major developments in the American business landscape over the last forty years: the Japanese-style quality revolution in manufacturing and the digital revolution.21 1755

Why has business education failed business? Why has it fallen so much in love with finance and the ideas it espouses? It’s a problem with deep roots, which have been spreading for decades. It encompasses issues like the rise of neoliberal economic views as a challenge to the postwar threat of socialism. It’s about an academic inferiority complex that propelled business educators to try to emulate hard sciences like physics rather than take lessons from biology or the humanities. It dovetails with the growth of computing power that enabled complex financial modeling. The bottom line, though, is that far from empowering business, MBA education has fostered the sort of short-term, balance-sheet-oriented thinking that is threatening the economic competitiveness of the country as a whole. If you wonder why most businesses still think of shareholders as their main priority or treat skilled labor as a cost rather than an asset—or why 80 percent of CEOs surveyed in one study said they’d pass up making an investment that would fuel a decade’s worth of innovation if it meant they’d miss a quarter of earnings results22—it’s because that’s exactly what they are being educated to do. 1779

in places like Germany or France today, there was a focus on industry-specific expertise and the practical problems of real firms in the real world. Students had to understand each sector from the ground up, and there was substantial focus on areas like labor relations, government relations, and engineering. Classes on ethics flourished. Joseph Wharton, the Philadelphia industrialist and devout Quaker who founded his namesake school, felt that commerce had a crucial role to play in solving the social problems of the day, namely growing inequality, job disruption, and urbanization. “No country,” he argued, “can afford to have this inherited wealth and capacity wasted for want of that fundamental knowledge which would enable the possessors to employ them with advantage to themselves and to the community.”25 1796

As recently as 1990, the Business Roundtable, a group of CEOs from America’s largest and most powerful companies, said in its mission statement that it was “the directors’ responsibility to carefully weigh the interests of all stakeholders as part of their responsibility to the corporation or to the long-term interests of its shareholders.” Seven years later, though, the group had finally caved, rewriting the statement to say that “the paramount duty of management and of boards of directors is to the corporation’s stockholders; the interests of other stakeholders are relevant as a derivative of the duty to stockholders.”43 Today, whether they believe it or not, it’s rare to find a CEO of a public company who doesn’t publicly buy into the idea of shareholder value. Indeed, the only leaders who can openly question this notion and get away with it tend to be high-profile founder-owners who have a certain cult of personality (Alibaba’s Jack Ma and Starbucks’s Howard Schultz are two who regularly accomplish that feat). 1942

academics, the Center for Higher Ambition Leadership, which aims to find ways to humanize business curriculums and build case studies of how to run more economically inclusive and sustainable businesses. That’s a big deal, because when Bertolini first posed the idea of his wage hike to a group of Harvard Business School professors whom he regularly consulted with, they responded negatively. The CEO, who grew up working class in Detroit and worked a welding line for years before going to college on scholarship, has continued to push forward his agenda within Aetna; in 2015 he gave all his top executives something that’s not yet on the typical MBA reading list—a copy of Thomas Piketty’s Capital in the Twenty-First Century. Companies, says Bertolini, shouldn’t just be moneymaking machines. They also have to invest in people, the real economy, and society as a whole if they want to succeed in the long term. “Capital is the resource that we often manage well, but in my opinion, the scarce resource is a talented and engaged workforce.” Creating that requires thinking bigger, looking at people as assets, not just costs on a balance sheet, and knowing how to think beyond the quarter. “One of my goals as CEO is to help reestablish the credibility of corporate America,” says Bertolini. “That means leaning into the recovering economy and working to bring everyone along, not just a few.” 2118

“The Icahn/Apple situation is a great example of how financial markets are no longer about raising money for investment, but for arbitrage,” the Nobel Prize–winning economist Joseph Stiglitz told me.9 Money is stuck in all the wrong places and flowing to all the wrong people and things. Corporate winners like Apple accumulate vast amounts of cash, yet rather than paying their taxes or giving workers a pay hike (which would also bolster our consumption-oriented economy), they simply turn over cash to investors, who are unlikely to spend it in a way that creates real growth. 2240

Apple and the rest of America’s export-oriented corporate giants may make plenty of cash, but as a group they have created almost no net new jobs since at least 1990, according to an influential study done for the Council on Foreign Relations.12 That is largely because, despite ebbs and flows in economic growth, corporate earnings, and the credit environment in the United States over the last dozen years, America’s largest firms taken as a whole haven’t invested more than 1–2 percent of their total assets per year into the real economy—real jobs, real goods, real services—over that time.13 This lack of productive investment has nothing to do with bank lending, GDP figures, the rise of China, the failure of Europe, increased government regulation, or partisan politics at home. The biggest economic conundrum of our age—why American companies aren’t investing the $2 trillion in cash they have sitting on their balance sheets (most of which is held overseas) in factories, workers, and wages—turns out to have an easy answer: they are using it to bolster markets and enrich the 1 percent instead. This isn’t just a matter of social justice. You don’t have to view inequality as a moral issue to appreciate that this consolidation of wealth isn’t good for growth—ample data show that affluent people, like companies, tend to hoard cash (in bank accounts, stocks, bonds, etc.) rather than spend it. When money goes mainly to the 1 percent, it stays in that closed circuit of financial markets that was described earlier. By and large, it doesn’t (despite claims to the contrary) trickle down into the sort of new investments—in businesses, factories, and jobs—that create real economic growth. That’s not the way financial markets were supposed to work. They were supposed to funnel money to new assets and ventures. The great irony of financialization is that it produces bad finance.
How did we get to a place in which financial markets have become an insulated system that enriches mainly the wealthy? It’s all part of a shift from a system in which corporations retain their earnings and reinvest to one in which firms distribute profits almost entirely to shareholders and downsize everything else—people, pay, growth-enhancing capital investments, and tax contributions. It’s a shift that is partly enabled by job-displacing technology and globalization but is fundamentally about the pervasiveness of short-term and balance-sheet-oriented thinking throughout the economy. “Financialization has polluted the entire physical investment process, the labor markets, and the innovation cycle of firms,” says Andrew Haldane, the chief economist of the Bank of England and one of the deepest thinkers on the topic of financialization today. “The damage it inflicts on investments in physical and human capital [meaning factories and workers] is hugely important, because that’s what slows down growth.”14

THE RISE OF CREATIVE ACCOUNTING

Shareholder activism by people like Carl Icahn and the sort of buybacks being done by Apple and other large public firms are currently one of the best windows into the rise of finance. Back in the 1960s and ’70s, companies invested about 40 percent of each additional earned or borrowed dollar into the real economy.15 All that changed in the Reagan era. “Since the mid-1980s, in aggregate, corporations have funded the stock market rather than vice versa,” says William Lazonick, a University of Massachusetts Lowell professor who has done extensive research on buybacks.16 The legislative change that allowed this destructive shift happened in 1982, which was a crucial year for all kinds of market deregulation. The Supreme Court struck down a key antitakeover law in Illinois—and, by implication, similar laws in all other states.
The Justice Department relaxed limits on concentration within industries, making it possible for large, more monopoly-oriented firms to emerge. The floodgates were opened for corporate raiders, 2263

Not only has growth in our economy been higher during times with more regulation, namely the 1950s to the 1970s, but government itself has funded the underlying resources that have allowed private firms like Apple and others to become as profitable as they are. As academics William Lazonick, Mariana Mazzucato, and Oner Tulum have argued in a paper outlining why more of Apple’s cash should go to taxpayers rather than investors, government innovations have been the very fuel of capitalism. “Apple did not have to invent the integrated circuit,” they write. “It did not have to invent the graphical user interface. It did not have to invent the Internet. Moreover, Apple did not have to build universities to educate engineers or roads to allow those graduate engineers to commute to work or the airplanes to carry goods and people around the world. Nor did Apple have to negotiate trade deals with the governments in Japan during the 1980s and in China during the 1990s to ensure access to growth markets for their products.”21 The government did all those things. But it was executives and investors who profited from price hikes in Apple stock. These people are the biggest beneficiaries of the wealth of such corporations, wealth that has been built up by many stakeholders over decades. 2341

the bulk of buybacks since 2001 were done during market peaks, belying the notion that such purchases represent firms’ own belief in a rising share price.25 Why would executives buy back their firms’ stock at such inopportune moments? Many experts believe it’s because buybacks are done at the end of a true growth cycle or, in the case of the most recent boom, at the end of a cycle of easy monetary policy—when the good times are about to end, and buybacks are a way of keeping the party going just a little bit longer. But the buybacks do nothing to help make companies more competitive; in fact, they waste corporate cash, given that they are done when the market is high rather than low. Nonetheless, they do enrich executives, who took from 66 percent to 82 percent of their compensation in stock between 2006 and 2012.26 “It is surely difficult to praise buybacks as being good for shareholders when they are made at such disadvantageous times,” says Andrew Smithers, a British economist and financial consultant. (His book The Road to Recovery makes a convincing case that buybacks and the bonus culture are responsible for slow growth not just in the United States but in many rich countries, because they encourage executives to pay themselves, rather than investing in things that will actually make their companies more profitable.) “Buying overpriced shares is a way of destroying value and spending more money when the market is most overpriced is particularly egregious.”27 What this means on a practical level is 2368

corporations invested at exactly the same rate in the five years after the meltdown as they did in the few years preceding it: around 10 cents of each borrowed dollar. The other 90 cents (which varied in dollar amounts depending on credit and growth conditions) primarily went to shareholder payouts.28 That means that far from funding the economy that you and I live and work in, stock markets now basically fund payouts to the wealthy. This “shareholder revolution,” based on the Chicago School notion that maximizing shareholder value is the purpose of corporate America (as covered in chapter 3), is the single most important reason why high corporate profits and unprecedented cash hoards have failed to translate into jobs, wage growth, and innovation. All of this raises a profound question: What is a company for? We thought that companies, like banks themselves, were supposed to be entities that generated wealth broadly, by allocating capital to productive purposes—investing in people, factories, new ideas, and businesses. But something in that cycle is broken. Today’s companies, like banks, are keeping their wealth in a closed feedback loop where it enriches only a few at the expense of the many. This is a serious dysfunction in our market system, one with multiple growth-destroying effects that we are only just beginning to understand.

PRIVATE IS THE NEW PUBLIC

One pernicious effect of all this pressure from activist investors is that public markets now are a much less attractive place to raise innovation capital than they used to be, given that they push for short-term gains over long-term objectives. Far from being a place where firms go to fund their best new ideas, markets today have become a place where entrepreneurs and their backers go to cash out via an IPO, or where large public firms manipulate their own stock price via buybacks to please investors. Either way, innovation suffers.
For example, Stanford University research shows that tech firms scale back innovation by 40 percent after an IPO, since once a firm goes public, it must focus on pleasing shareholders rather than on investing in the future.29 When it comes to the financial sector’s dampening effect on growth, private and family-owned firms are in many ways the exception that proves the rule. Research shows that privately owned firms invest more than twice as much in the real economy as public firms of similar size operating in the same sector.30 They have also weathered the 2008 crisis much better than many public firms and are now on the upswing rather than the downswing; even as profit margins have flattened or contracted at most big public firms, a 2015 survey of major privately held companies found that 31 percent have raised their margins, and the majority expected to achieve growth rates double the average for their industries in the coming year. 2385

The rich keep their money in banks or in the secondary markets, buying stocks and bonds that already exist, rather than, say, starting new businesses or purchasing new things. The money stays in the financial sector, in other words, instead of being invested in the real economy that we live in. 2568

Yet even as buybacks are likely to slow, mergers and acquisitions are at a record high. Global M&A transactions in early 2015 were worth more than $900 billion, the highest level since 2007.50 Activists love M&A because it always generates quick-hit growth (and bankers love it, too, because it allows them to pocket large fees, both from putting companies together and, later, from breaking them apart if the mergers don’t work). But the game has changed since the infamous wheeling and dealing of the barbarians in the 1980s, evolving to encompass more subtle pressures, too. 2614

Ford, and many other large firms focused on retail and consumer products have created stand-alone lending units. These units “were originally intended to support consumer purchases of their products by offering installment financing but [they] eventually became financial behemoths that overshadowed the manufacturing or retailing activities of the parent firm,” says University of Michigan academic Greta Krippner, whose book Capitalizing on Crisis provides the best quantitative analysis of the shift.6 Krippner tallies the extent to which nonfinancial firms derive revenues from financial investments as opposed to more traditional productive activities (read: making stuff) and finds an alarming trend. While the share of revenues from financial activities vis-à-vis everything else was relatively stable in the 1950s and 1960s, it began to climb in the 1970s and then increased sharply over the course of the 1980s. By the late 1980s, the ratio peaked at about five times the levels of the previous postwar decades. After some brief retreats during periods of market boom and bust, the ratio was rising again by the second half of the 1990s and has never stopped.7 Today’s firms represent the apex of this trend. Corporate borrowing is at an all-time high, as are share buybacks, dividend payments, outsourcing, and tax optimizing—all factors that increase the share of financial activities in companies’ revenues. And of course, firms’ investment into jobs, factories, and innovation is near record lows, a decline that has run concurrently with the rise of financial activity within all American businesses.8 Proof of the shift is everywhere. Automakers often generate large chunks of their profits by selling consumers loans to purchase cars, rather than by simply selling the vehicles themselves.9 Energy companies regularly try to boost their profits by speculating in oil futures, a shift that actually undermines their own core business models by creating more volatility in the oil markets.
And airlines commonly make more money from hedging on oil prices than on selling airfare, although this strategy can also unexpectedly backfire.10 Indeed, with the 50 percent plunge in oil prices from 2014 to 2015, the airline industry has lost more than $1 billion to bad bets on fuel prices.11 “If I were on the board of a [large airline] I would not hedge oil,” says Warren Buffett, who indeed limited fuel hedging at Burlington Northern Santa Fe railway when he took over the company in 2009. “It doesn’t make any sense to me. If you really think you have an edge in predicting the price of oil futures,” just get into that business, he says. Otherwise, buy your fuel on the spot market, know what your costs will be, and be done with it.12 2777

“The old view is that if you’re in the bolt business, you take risks in the bolt business,” one investment banker proclaimed in BusinessWeek in 1986. “You don’t take risks with the cash.”20 The new view was that cash was there to be poured back into the markets, where it would always earn more than you would make by selling more bolts, or so the thinking went. Pretty soon corporations were nothing more than portfolios, bundles of assets to be managed like stocks. Finally free to turn money into more money—to spin straw, as it were, into gold—CFOs would frequently nix investments in real, tangible products in favor of new financial products like money-market mutual funds, “stripped” Treasuries, offshore dollar accounts, foreign currency hedges, and the most high-risk, high-return asset of all: futures contracts and other sorts of derivatives. The trading culture infected all operations; productive assets became merely commodities to be traded, and quarterly profits were all. One of the most poignant examples of how this kind of thinking tanked once-competitive firms is Kodak, which famously decided to forgo investment into digital film technology. 2897

Sara Lee, another firm that shed jobs and ultimately lost market value after switching from manufacturing to brand management, is another case in point. As its CEO once summed it up, “Wall Street can wipe you out. They are the rule-setters. They do have their fads…and they have decided to give premiums to companies that harbor the most profits for the least assets. I can’t argue with that.”22 GE’s Jack Welch was, of course, the master of figuring out how to make the most money with the fewest possible assets. “My gut told me that compared to the industrial operations I did know, this business [meaning financial operations like lending and credit] seemed an easy way to make money,” he wrote in his autobiography. “You didn’t have to invest heavily in R&D, build factories, and bend metal day after day.”23 While previous GE leaders had poured excess profits into their firm and shared them with employees (in the form of raises) or customers (in the form of price cuts), Welch bought wholesale into the Chicago School thinking around creating shareholder value. To him stock owners were king, and soon after taking over the company, he promised to deliver them an unheard-of 15 percent earnings increase per year. Over the course of his tenure, he fulfilled that promise by cutting tens of thousands of jobs, slashing R&D spending as a percentage of sales by half,24 selling off GE’s famous small appliance division (yes, the one that made televisions and toasters), and imposing a single rule for top managers: Be first or second in your products’ market, and raise your profits every quarter. At GE, it was up or out. In another vicious cycle, expectations of those kinds of returns, combined with now common multimillion-dollar executive pay packages (which resembled Wall Street bankers’ pay more than the comfortable corporate salaries of the time), all but necessitated a move to financial activities. 
That, after all, was the only way to generate such profits in such a short amount of time. But as GE moved into finance, it began to resemble Wall Street in not just its business model but also its culture, starting with a spate of scandals that began under Welch: improper time card charges on a defense contract; accusations of tax evasion; defrauding the US government on an Israeli Air Force deal; dumping tons of toxic waste; and a revelation that the firm’s investment banking unit booked $350 million in phantom profits from fake trades.25 The list goes on, including charges of insider trading, fraud, and racketeering. Then there were the everyday accounting high jinks that were a disconcertingly common way of doing business in the pre-Enron era. To keep up his promise of 15 percent growth in earnings every year, Welch frequently resorted to moving money (not to mention jobs) offshore. He also booked income and expenses in ways that were technically legal but were designed to obscure what was really happening within various divisions, allowing him to “pull profits seemingly out of thin air,” as the business-friendly Economist magazine once put it.26 One telling example: Over the last five years of Welch’s reign, between 1996 and 2001, GE earnings per share grew by 90.2 percent, an unprecedented figure for a large conglomerate. But without massive under-reserving at its reinsurance unit (meaning, it didn’t put aside enough for the possibility that many claims would be called in at once), the company would have shown a gain of only 5.6 percent.27 Meanwhile, GE Capital regularly allowed GE to manipulate its quarterly statements by engaging in trades right before reporting day, which would artificially push up the company’s earnings. Such number games were technically forbidden by corporate governance watchdogs, but it was an open secret that GE played them.
The company didn’t try particularly hard to hide such maneuvers—after all, Wall Street just wanted its take; it didn’t much care how it was generated.28 2914

Sadly, “by doing exactly what Wall Street wanted, they actually increased risk,” says Harvard Business School professor Gautam Mukunda (who’s trying to help his students understand the downside of financialization).32 So much for shareholder value.33

CULTURE SHOCK

Financialization doesn’t just shift how and where companies do business; it changes their very culture, making organizations value risky gambles and quick, easy wins over steady product quality and a consumer-oriented approach. In finance, the trader is at the top of the food chain—he’s the alpha man (they are almost entirely men) who brings home the bear he’s slaughtered and decides who gets to eat what. This culture encourages the glamorization of the individual trader over the firm as a whole, rewarding leaders for large deals and killer trades, rather than for slow and steady growth. Yet these hunters are not penalized for the risk that such actions bring to the firm. Consider the rise of the winner-take-all system of corporate compensation that has prevailed on Wall Street and is now becoming more pervasive throughout corporate America. The last several decades have seen the rise of a ridiculously unfair compensation paradigm in American business, in which CEOs are paid more than three hundred times the salary of their average workers, and an up-or-out system of talent management that rewards type-A stars rather than encouraging team play. Never mind the mounting evidence, offered by academic research, that it’s almost always successful teams, rather than single individuals, that drive a corporation’s success.34 2998

slower growth and lower investment in the real economy as firms jockeyed to make higher margins from speculating with their cash. Even Ronald Reagan, one of the biggest advocates of unfettered capitalism around, knew it. Changes in the regulations governing mergers and acquisitions during the early part of his presidency, in 1982, had opened the way for the proliferation of giant conglomerates prone to financial wheeling and dealing. Between 1980 and 1990, a full 28 percent of the Fortune 500’s largest manufacturing firms received tender bids, most of them hostile, from people like T. Boone Pickens, Carl Icahn, and other corporate raiders. Firms would be sliced and diced, their component parts sold off to the highest bidder, who would often promptly bleed the companies dry. By the end of this period, around a third of the largest firms in the United States had ceased to exist as independent companies.38 Yet even as these laws were helping to create a climate ripe for financialization, Reagan and his advisers had begun to worry about the results, which included the eroding market share of US manufacturing firms vis-à-vis their Asian and European competitors. One report released by Reagan’s Commission on Industrial Competitiveness in 1985 sounds, amazingly enough, like something that could have been written by President Obama’s Council on Jobs and Competitiveness today: “In the 1960s, the real rates of return earned by manufacturing assets were substantially above those available on financial assets. Today, the situation is reversed.
Passive investment in financial assets has pretax returns higher than the rates of return on manufacturing assets….As a result, the relative attractiveness of investing in our vital manufacturing core has been compromised.” 39 A little-known and truly stupefying fact is that Reagan was so worried about the financialization of the US economy and its impact on competitiveness that he actually launched a secret project to develop a US industrial policy—a term that is still so strongly associated in the public consciousness with Soviet Russia that it has become (wrongly) a third-rail topic even among most American liberals, not to mention conservatives. The idea behind that Reagan administration effort, known as Project Socrates, was to figure out why America’s foreign competitors were succeeding in establishing highly efficient, thriving corporations while their US counterparts were withering. 3043

The official figure for US manufacturing employment, 9 percent, belies the true importance of the sector. Manufacturing represents a whopping 69 percent of private-sector R&D spending as well as 30 percent of the country’s productivity growth.49 And, according to US government figures, every $1 of manufacturing activity returns $1.37 to the economy. “The ability to make things is fundamental to the ability to innovate things over the long term,” says Willy Shih, a Harvard Business School professor and coauthor of Producing Prosperity: Why America Needs a Manufacturing Renaissance. “When you give up making products, you lose a lot of the added value.” In other words, what you make makes you, economically anyway.50 Certainly, all this was on GE executives’ minds when they made the decision to move manufacturing back to upstate New York. “In the old days, we spent a lot of time looking for sources [globally] for everything we did, and then we picked the cheapest source,” says GE’s Little. “Now we’ve realized that if we can control things and vertically integrate our activities in local markets, then we don’t have to give up that margin that we might have once given to a vendor,” not to mention the value of any intellectual property created as a result.51 It’s a smarter way to think about business, and it’s certainly better for local economies. The question is just how many good new jobs such operations will actually create in America in the coming years. The Boston Consulting Group’s 2014 annual survey of senior manufacturing executives found that the number of respondents bringing production back from China to the United States had risen 20 percent from the previous year. But that won’t come close to replacing the 2.3 million manufacturing jobs lost in the recession that followed the 2008 financial crisis. 3127

Arab Spring, a change that has totally transformed the Middle East. It began, as so many revolutions do, with food riots. “Food is a radically different threat [than other kinds of financial crises], because it affects so many of the world’s poor so profoundly,” Erwann Michel-Kerjan, managing director of the Risk Management and Decision Processes Center at the Wharton School, told me at the time. Food is also an amplifier of many other kinds of risk, particularly political risk. And today its effects are traveling much more rapidly because of the increasing interconnectedness of the world, as well as the increasing power that Wall Street has over the price of a loaf of bread.9 3234

“I’m sure that Goldman used the information they had about aluminum to influence the market between 2010 and 2013,” says Cornell law professor Saule Omarova. (Her paper on the problems inherent in banks both owning and trading commodities, “The Merchants of Wall Street: Banking, Commerce, and Commodities,” first sparked serious media interest in the topic.) “But can I prove it? No. Can the CFTC? I doubt it. And if that’s the case, should Goldman be doing any of this? Absolutely not.”22 3352

Are such rules coming anytime soon? And even if they do, will they definitively separate banking from commerce in risky areas like commodities? Certainly those proposed so far don’t. Many reform advocates are skeptical that they ever will. “I’m not expecting any giant news on this front,” says Lisa Donner, executive director of Americans for Financial Reform, a coalition advocating for tighter regulation of Wall Street.66 Part of the problem, she says, is that no regulators, including the Fed, seem to be asking the profound questions: Why do we have a system that allows finance to be a hindrance to commerce rather than a lubricant to it? How is it that banks could create a bottleneck in raw materials and then profit from it at the expense of their customers? How might we reshape things in a systemic way so that can’t happen? Instead, regulators have so far focused on tweaking the administrative aspects of existing laws while maintaining the silos that make the system so hard to police. They might have good intentions, but they are failing at fixing the problem. “In my view, the Fed has both the legal mandate and the regulatory capacity to address these issues in a more comprehensive way, which is necessary in order to bring the financial markets back in service to the real economy,” says Omarova, 3664

Unfortunately, says Omarova, “regulators are taking on these relatively low-stakes technical questions about additional capital and insurance, but they aren’t asking the big questions—is it a good policy to allow big banks to accumulate so much power, not only over finances, but also over our food, fuel, and other raw materials? What kind of a society will we have if a handful of banking giants end up controlling the country’s energy, metals, and agricultural supply chains?” It’s quite telling that the law gives the Federal Reserve the power to determine what a “complementary” activity is, and whether conducting it would pose “a substantial risk to the safety or soundness of depository institutions or the financial system generally.” But it doesn’t say anything about what impact such activities might have on businesses, consumers, and American families. It’s a legal issue that goes far beyond even the crucial commodities markets. Remember, the Bank Holding Company Act of 1956 (the legislation that was so unfortunately tweaked by Gramm-Leach-Bliley) was really about power—and specifically, about ensuring that banks don’t have too much of it relative to the rest of the economy and to society. If there’s anything that the aluminum fiasco showed, it was that the balance surely isn’t yet right. One telling detail is how many companies affected by the aluminum scandal have been reluctant to speak out about the issue. 3679

Before moving on from her post at the World Food Programme, Josette Sheeran gave a moving TED Talk on the problem of global hunger. “If we look at the economic imperative here, this isn’t just about compassion,” she said. “The fact is studies show that the cost of malnutrition and hunger—the cost to society, the burden it has to bear—is on average six percent, and in some countries up to 11 percent, of GDP a year. And if you look at the 36 countries with the highest burden of malnutrition, that’s 260 billion lost from a productive economy every year. Well, the World Bank estimates it would take about 10.3 billion dollars to address malnutrition in those countries. You look at the cost-benefit analysis, and my dream is to take this issue, not just from the compassion argument, but to the finance ministers of the world, and say we cannot afford to not invest in the access to adequate, affordable nutrition for all of humanity.”69 It’s a laudable goal. However, tackling it will first require not only compassion from world leaders, but real change in Wall Street’s business model. 3714

result. According to Harvard’s Joint Center for Housing Studies, the share of moderately to severely cost-burdened renters (meaning those who pay at least 30 percent of their income in rent) grew to represent half of all American renters in 2013, up from 38 percent in 2000.13 “We get lots of people coming to us saying, we wanted to own, but all the affordable properties have been bought up, so now, we’re renting from Blackstone for more than the price of a mortgage,” says Atlanta-based Tony Romano, the organizing director for the nonprofit Right to the City alliance, which has produced a number of studies on the consequences of private equity moving into the housing market.14 3809

Since their birth four decades ago, private equity firms have perfected a business model that is designed to extract as much wealth from every target company with as little capital or risk to themselves as possible. The current business model “emerged out of the shareholder-value revolution and the leveraged buyout (LBO) movement of the 1970s and 1980s,” say Eileen Appelbaum, a senior economist at the Center for Economic and Policy Research (CEPR) in Washington, and Cornell University professor Rosemary Batt in their influential book, Private Equity at Work.21 This mirrors what we’ve already learned in chapters 3 to 5; as Appelbaum and Batt put it, the rise of private equity represents “a fundamental shift in the concept of the American corporation—from a view of it as a productive enterprise and stable institution serving the needs of a broad spectrum of stakeholders to a view of it as a bundle of assets to be bought and sold with an exclusive goal of maximizing shareholder value.” If the markets are an ocean, private equity firms like Blackstone are the great white sharks that have perfected the use of debt, leverage, asset stripping, tax avoidance, and legal machinations to maximize profits for themselves at the expense of almost everyone else—their investors, their limited partners, their portfolio companies and the workers in them, and certainly society at large.22 During the 2012 presidential race, Mitt Romney’s candidacy spurred a vigorous debate over whether private equity firms create or destroy jobs on a net basis. The research can be spun in many ways, but the upshot is that employment generally declines in companies that spend too much time in private equity’s hands. 
Job destruction is particularly bad in the retail sector, although the other end of the spectrum has some firms in which private equity’s overall effect on jobs is modest at best.23 But what’s clear is that the private equity model, even more so than most Wall Street practices, enriches a few investors at great cost to others. Let’s not forget that while private equity firms may operate as owners (though they often aren’t regulated or taxed as such), they are essentially financial intermediaries; they make money not necessarily by growing the pie, but by taking an ever-larger slice of 3855

Charles Calomiris and Stephen Haber meticulously outline in their book, Fragile by Design: The Political Origins of Banking Crises & Scarce Credit, the subprime crisis itself was “the outcome of a series of spectacular political deals that distorted the incentives of both bankers and debtors.”44 Unfortunately, nothing about this paradigm has changed—indeed, things have arguably gotten worse. Given that some of the largest players in the game now are outside the formal banking sector, you have a particularly dangerous tethering of the country’s most important industry, housing, with the fastest-growing and least transparent part of the financial sector, shadow banking. This time around, though, financial institutions aren’t just trading risky securities. They actually own and operate real properties—meaning, they have an even more direct impact on the lives of average Americans. Private equity firms already control many people’s jobs; now they control the roofs over people’s heads, too. 4045

Worse yet, for such short-term gains, we all get charged massive fees. In one study on the topic, Stanford professor emeritus and Nobel laureate William Sharpe calculated that investing in low- or no-cost funds rather than actively managed funds could result in a 20 percent higher standard of living in retirement, in part due to the fees commanded by the managed funds.9 Bogle has run his own numbers, including not only the lower returns for active funds but additional hidden fees from portfolio turnover costs, charges for investment advice, and other such expenses. As he put it in 2014 testimony to the Senate Finance Committee, “the high costs of ownership of mutual fund shares, over the long-term, are likely to confiscate as much as 65 percent or more of the wealth that retirement plan investors could otherwise easily earn, simply by diverting market returns from fund investors to fund managers.” He added, “many of the infirmities of our retirement system are the result of the heavy costs incurred by investors because of our bloated financial system.” Popping that finance bubble, the one that takes nearly 25 percent of corporate profits and creates only 4 percent of the jobs, is crucial to getting both retirement and the economy back on track.10 It’s an enormous task, one as big as the fund business itself. 4293
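The compounding arithmetic behind Bogle's claim can be sketched in a few lines. The inputs below (a $5,000 annual contribution over 40 years, a 7 percent gross return, and the two fee levels) are illustrative assumptions, not figures from the book; the point is only to show how a seemingly small annual cost difference compounds into a large share of final wealth.

```python
# Illustrative sketch of the fee-drag argument. The contribution size,
# gross return, and fee levels are assumptions chosen for illustration.
def final_balance(years, contribution, gross_return, annual_fee):
    """Accumulate annual contributions, compounding at the net-of-fee return."""
    net = gross_return - annual_fee
    balance = 0.0
    for _ in range(years):
        balance = (balance + contribution) * (1 + net)
    return balance

low_cost = final_balance(40, 5000, 0.07, 0.001)   # index-fund-like costs
high_cost = final_balance(40, 5000, 0.07, 0.025)  # all-in active-fund costs
lost = 1 - high_cost / low_cost
print(f"low-cost fund:  ${low_cost:,.0f}")
print(f"high-cost fund: ${high_cost:,.0f}")
print(f"share of potential wealth lost to fees: {lost:.0%}")
```

Under these assumptions the high-cost investor ends up with well under two-thirds of the low-cost investor's balance; stretch the horizon or the fee gap further and the loss approaches the range Bogle describes.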

Then, as now, there was little data to show that any actively managed fund could regularly outperform the market, particularly when you factored in fees. (Morningstar, the respected purveyor of mutual fund analysis services, basically conceded this point in 2010.)13 A particularly telling recent piece of research done by law academics at Yale and the University of Virginia found that after considering costs, not only did index funds outperform actively managed portfolios by a significant amount, but 16 percent of the time the impact of high fees would actually offset the entire tax benefit of investing in a 401(k) plan for young workers over the course of their careers.14 Investors then and now were better off simply linking their investments to the market via an index fund, an industry that Bogle and others had begun to develop. But active fund management was much more profitable, and the industry worked hard to convince average Joe investors that they needed to pay for professional guidance through this wild world of investing. “You wouldn’t settle for an ‘average’ brain surgeon,” said one index fund critic. “So why would you settle for an ‘average’ mutual fund?”15 Another fund management firm papered Wall Street with posters showing an angry Uncle Sam putting a rubber stamp across index funds. “Index funds are un-American!” the ad screamed. “Help stamp out index funds.” 4326
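The Yale/UVA finding (that fees can offset the entire 401(k) tax advantage) can be roughed out with a deliberately crude comparison. Every rate below (7 percent gross return, a 2 percent active-fund fee inside the 401(k), a 0.1 percent index-fund fee, 25 percent income tax, 15 percent capital gains taxed annually) is an assumption for illustration, not a figure from the study, and the tax treatment is heavily simplified.

```python
# Rough sketch: a high-fee tax-deferred account vs. a low-fee taxable
# index fund. All rates are illustrative assumptions; the annual
# capital-gains treatment is deliberately crude.
def high_fee_401k(contribution, years, r, fee, income_tax):
    """Pretax contributions compound at r - fee; the whole balance is
    taxed as income at withdrawal."""
    bal = 0.0
    for _ in range(years):
        bal = (bal + contribution) * (1 + r - fee)
    return bal * (1 - income_tax)

def low_fee_taxable(contribution, years, r, fee, income_tax, cap_gains):
    """After-tax contributions to an index fund; gains crudely taxed
    every year at the capital gains rate."""
    net = (r - fee) * (1 - cap_gains)
    bal = 0.0
    for _ in range(years):
        bal = (bal + contribution * (1 - income_tax)) * (1 + net)
    return bal

k401 = high_fee_401k(5000, 45, 0.07, 0.02, 0.25)
taxable = low_fee_taxable(5000, 45, 0.07, 0.001, 0.25, 0.15)
print(f"high-fee 401(k):       ${k401:,.0f}")
print(f"low-fee taxable index: ${taxable:,.0f}")
```

With these assumed rates, the taxable low-fee account ends up ahead over a young worker's 45-year career: the fee drag more than consumes the tax deferral, which is the mechanism the researchers identified.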

By the 1970s, the die was basically cast. Fund management firms themselves began to move away from the unlimited liability partnership structure and to go public, creating further conflict between the needs of clients and the incentives for the firm to maximize profits for itself, whatever the risks to investors. This has created a dizzying amount of wheeling and dealing, which may be good for the fund companies but is of questionable interest to investors. 4351

“The staggering economies of scale that characterize money management have been largely arrogated by fund managers to themselves, rather than shared with their fund shareholders,” concludes Bogle. Or, as the great economist Paul Samuelson put it presciently in 1967, “I decided that there was only one place to make money in the mutual fund business—as there is only one place for a temperate man to be in a saloon, behind the bar and not in front of the bar. And I invested in…[a] management company.”23 In lieu of pouring all your money into Fidelity or BlackRock, there are any number of studies that tell us how much better off we are investing the Vanguard way, in low- or no-fee index funds. The world’s smartest investors buy it—Warren Buffett recently told me that upon his death, his wife’s inheritance would be invested 90 percent in Vanguard’s S&P 500 equity index funds.24 So why don’t we all follow that wisdom? The answer has its roots in behavioral economics, which tells us that the “rational man” is more than capable of making irrational decisions. As The Economist put it recently, “everyone knows that if you go to a casino, the odds are rigged in favour of the house. But people still dream of making a killing. The same psychology seems to apply to fund management, where investors flock to high-cost mutual funds even though the odds are against them.”25 Most of us simply don’t trust ourselves about investing, and it’s in the interests of the industry to make it seem much more confusing than it is. That gives them more business; as we have seen, one of the great tricks of finance is to advance the cult of the expert. By cloaking what is essentially a pretty simple decision (put your money in index funds and forget about it until you are sixty-five) in all sorts of complicated jargon, the industry convinces people that they need someone to explain it to them. 
No wonder a recent survey found that 7 out of 10 wealthy individuals say their financial adviser, the person who sells them on all those high-fee funds, is as important to them as their doctor.26 4378

This perplexing phenomenon—wherein individuals who are unable to make the best market decisions nonetheless get forced into taking more personal responsibility for their retirement future—became more common through the 1980s. That was when the private 401(k) retirement investing system, on which about half of us depend today, really took off. Amazingly, the creation of this entire system was an accident. Like our bloated healthcare system, our modern 401(k) retirement savings arrangement was a fluke that some clever people came up with in an effort to exploit parts of the tax code. In 1980, a benefits consultant working on a cash bonus plan for bankers had the idea to take advantage of an obscure provision in the tax code passed two years earlier, which allowed employees to set aside money to be matched by employers, thereby increasing the tax deductions the corporation could take. That moment was the beginning of the 401(k), a 4397

When the bankruptcy of Detroit began in late 2013, the terms of the settlement quickly took center stage and became “a discussion between an emergency manager, from a law firm dedicated to the financial sector, and the financial sector,” explains Wallace Turbeville, a former Goldman Sachs banker who is now a senior fellow at the nonprofit think tank Demos. “The people [meaning pensioners] tried to get a seat at the table, but the emergency manager had a monopoly on the information [on city finances] and for the first four months of the process his was the only story available.”34 That, says Turbeville, along with what he believes were dubiously calculated numbers (crunched by emergency manager Kevyn Orr’s team) that overestimated pension liabilities, resulted in a widespread belief that oversize pensions had caused Detroit’s demise. In fact, he says, it was the financiers who cut the dubious bond deals with the city in the first place that put Detroit into bankruptcy. That Wall Street debt was “the biggest contributing factor to the increase in Detroit’s legacy expenses,” explains Turbeville, who wrote an influential report in 2013 outlining the role that finance had played in Detroit’s demise.35 The long and short of it was that the people negotiating the debt settlement on behalf of the city were completely outsmarted and outflanked by financiers, who cut deals for millions of dollars of extremely long-term interest rate swaps that were subject to immediate termination if the city’s credit deteriorated, which of course it quickly did. The termination of the contracts required immediate payment of all projected profits that would be earned by the banks had the contract not been terminated. That meant that Detroit was suddenly on the hook for a huge lump-sum payment that made its cash flow position completely untenable. If this contract sounds Kafkaesque, you’re right, it is.
But this sort of “heads I win, tails you lose” wording is woefully common in municipal finance deals, which pit Wall Street against Main Street in a completely unfair way and tend to include all kinds of tax loopholes that further add to the complexity. The Detroit story has something of a happy ending. Thankfully, activists and local politicians decided to fight back, and federal judge Steven Rhodes eventually approved the city’s bankruptcy plan, threw out the initial settlement of the $800 million derivatives deal, and made financial institutions settle for a fraction of that amount, as part of a larger settlement to rid the city of $7 billion in debt. After sixteen months of legal wrangling, fighting, and soul-searching, a group of private donors, including family foundations with landmark names like Ford and Kellogg, banded together with community development agencies, big businesses, and the state itself. They decided that it was inconceivable that the onetime heart of American economic power—which had already lost much of its tax base, more than half its population, and a devastating portion of its labor pool—should fall further. They came up with the $800 million to offset some of the pension pain and save Detroit’s art—a “grand bargain,” as it has become known, that gave the city a future. Suddenly the government, its workers, and Detroit’s creditors were more willing to come to terms. Residents got creative, and financial institutions took payment in assets that represented a bet on Motown’s future, rather than grabbing what cash they could before fleeing. Union reps accepted decreases of 5–10 percent in pension payments, a painful and contentious decision, but much less draconian than what Orr had originally proposed. In the end, no major stakeholders refused to be part of the almost universally praised settlement, which turned the page on the largest municipal bankruptcy in history. 
As Michigan’s Republican governor, Rick Snyder, told me, “none of this would have been possible without the grand bargain. If people were going to accept this kind of pain, they had to feel that the private… 4474
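The mechanics of that swap-termination clause can be sketched with toy numbers: on early termination, the city owes the present value of the bank's projected profit on every remaining payment date. The notional amount, rates, remaining term, and discount rate below are illustrative assumptions, not Detroit's actual contract terms.

```python
# Hedged sketch of why a termination clause produces a huge lump sum.
# All figures are illustrative, not the actual Detroit swap terms.
def termination_payment(notional, fixed_rate, floating_rate,
                        years_left, discount):
    """Present value of the bank's projected profit on each remaining
    annual payment date: the city pays fixed, receives floating."""
    payment = notional * (fixed_rate - floating_rate)
    return sum(payment / (1 + discount) ** t
               for t in range(1, years_left + 1))

# A city locked into paying 6% fixed while market rates collapse to 1%,
# with 20 years left on an $800 million notional:
owed = termination_payment(800e6, 0.06, 0.01, 20, 0.03)
print(f"lump sum due on termination: ${owed:,.0f}")
```

Even with these modest assumed rates, the lump sum runs to hundreds of millions of dollars, which is why triggering the clause made the city's cash position untenable overnight.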

Given the growing public awareness of the issue, now would be an ideal moment for the next American president to announce major root-and-branch reform of the retirement system and take the country to higher ground before the tsunami hits. How to do this? We should start by committing to maintaining Social Security in its present form, given that it’s the only part of the system that’s unfailingly effective at keeping most elderly people secure. To protect the solvency of the system, we could increase the maximum income level subject to the payroll tax, change the formula for benefit levels while limiting payouts to wealthy people who don’t need them, and raise the retirement age for some Americans. (This last issue needs to be examined carefully to avoid creating hardships for people like manual laborers and others who do physical work that can’t be sustained over a certain age.) Public pension plans, for their part, will need to come clean about overoptimistic return projections—they should start basing their forecasts on 5 percent a year, not 8 percent, and be obligated to tell both pensioners and the public whether they can meet them. Getting out in front of shortfalls early will help cities avoid the fate of Detroit. As for Americans who currently save via 401(k)s or IRAs, there is a much more straightforward solution. They should simply move their savings into programs that are dominated by low- or no-fee index funds (and be auto-enrolled in such programs unless they choose otherwise). For their part, the purveyors of these funds should ensure that fee loads and returns are clearly stated and easily accessible so individuals don’t have to hunt for and guess about them. Regulators should also strictly limit early withdrawals or loans against such private savings, which should do what the 401(k) was intended to do in the first place—protect us in our golden years. 
There’s already a model that provides many of these things and could be easily copied: the Thrift Savings Plan used by federal workers, which is large, cheap, and effective. If Congress has a problem replicating its own successful savings model for the broader public, then voters should ask who, exactly, their elected officials are serving—Wall Street or Main Street? At the state level, the shift has already begun. The California Secure Choice (CSC) Retirement Savings plan, for example, aims to guarantee every Californian working in the private sector a living wage in retirement. CSC was signed into law in 2012 by Governor Jerry Brown. It combines the best of old-style defined-benefit plans (traditional pensions that guarantee workers a certain level of yearly income in retirement) with the flexibility and mobility of a 401(k). This plan will cover workers in California who don’t currently have access to formal retirement savings via their work, which is a particularly big number, since California has more immigrants, freelancers, and young people working without benefits than many other states. “I’m a big fan,” says the Economic Policy Institute’s Monique Morrissey. “It’s probably the farthest along of all the retirement reform ideas in terms of practical implementation.” Details of the plan, which might be unveiled as early as 2016, are now being hashed out in consultation with a variety of industry and academic experts. It’s likely that CSC will use behavioral nudges to get as many eligible people as possible to participate, for instance by making enrollment automatic unless a worker opts out, rather than requiring a sign-up to opt in.45 Participants in CSC would sock away at least 3 percent of their income, most likely in a conservative index fund like an S&P 500 fund, where the pooled money is invested in all 500 stocks in that index. 
Index funds are considered a simple way to ensure that investors see the same return as the overall stock market—and they’re cheaper, too, since index funds don’t employ stock-picking wizards and charge the related fees. 4555

…the powerful financial services industry exploits vulnerable individual investors.”49 Information asymmetry will always work in favor of Wall Street. But asset managers could be forced to take a fiduciary pledge, a kind of Hippocratic oath for bankers, just like the one medical professionals must take, which would bind them to serving their customers rather than primarily themselves. Those found to be in violation of such an oath could be subject to strict penalties and big fines. 4631

the kind of market system Adam Smith had hoped for, one in which “the interest of the consumer [must be] the ultimate end and object of all industry and commerce.”52 4662

As a recent Harvard Business School alumni survey summed up the problem, we’re stuck in an economy that’s “doing only half its job.”9 Says Michael E. Porter, a coauthor of the study: “The United States is competitive to the extent that firms operating here do two things—win in global markets and lift the living standards of the average American.” At the moment, we’re still succeeding at the first but failing at the second. But “business leaders and policymakers need a strategy to get our country on a path toward broadly shared prosperity,” Porter adds. And one of the biggest challenges in that respect will be creating a tax code that stops rewarding taking over making.10

PERVERSE INCENTIVES

The US tax code is nearly seventy-five thousand pages long. That fact right there tells you a lot about what’s wrong with it, but its perversity can also be summed up in a very simple way: The American tax code rewards debt over equity, at both the corporate and consumer level, a structure that has contributed mightily to the rise of finance and the fall of American business. The US tax code has made it much more advantageous for both companies and consumers to borrow than to save. That has in turn added fuel to the fire of financialization, since banks and other financial institutions are basically in the business of issuing debt—the main way they make their money. It has also contributed to slower growth, as capital is misallocated and mispriced, flowing to all the wrong places for all the wrong reasons.11 Consider the way the tax code works today. Corporations can deduct interest payments on debt from their taxes, yet their dividends and retained profits can’t be written off and are taxed at the full corporate load. Jason Furman, the chairman of the Council of Economic Advisers, has estimated that these and other similar tax breaks make corporate debt as much as 42 percent cheaper than corporate equity. 4757
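The corporate-level piece of that debt bias is simple arithmetic: interest is deductible, so the tax shield lowers the effective cost of debt, while dividends are paid out of fully taxed profits. The 35 percent corporate rate and 7 percent pretax cost of capital below are assumptions for illustration; Furman's 42 percent figure also folds in investor-level taxes, which this sketch omits.

```python
# Minimal sketch of the debt-over-equity tax bias. The rates are
# illustrative assumptions; investor-level taxes are omitted.
CORPORATE_TAX = 0.35
pretax_cost = 0.07

# Interest on debt is deductible, so the firm's effective cost is
# reduced by the tax shield; dividends and retained earnings get
# no such deduction.
after_tax_debt_cost = pretax_cost * (1 - CORPORATE_TAX)
equity_cost = pretax_cost  # paid from fully taxed profits

discount = 1 - after_tax_debt_cost / equity_cost
print(f"after-tax cost of debt:     {after_tax_debt_cost:.2%}")
print(f"cost of equity (no shield): {equity_cost:.2%}")
print(f"debt is {discount:.0%} cheaper at the corporate level")
```

At the corporate level alone, the deduction makes debt cheaper by exactly the tax rate; layering on the differing investor-level treatment of interest, dividends, and capital gains is what pushes the total gap toward the 42 percent Furman estimates.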

The world is more awash in debt now than ever before in history. And who benefits from all this? The financial industry, of course. Today four-fifths of all the stock of global financial assets is in debt or deposits.19 That’s due in large part to the fact that we have a tax code that rewards debt so disproportionately. Unfortunately, all this has made our economy extremely prone to bubbles, crises, and stagnation. As we have learned throughout this book, the number of financial crises in the modern era has increased in lockstep with rising debt.20 Indeed, the famous economist John Kenneth Galbraith believed that all financial crises stemmed from too much debt and credit, and there is a wealth of research to back up that notion.21 One of the biggest myths put forward during the financial meltdown of 2008 was that we had to save banks to save the economy, but in fact, a growing body of academic research has found that just the opposite is true. In their seminal book House of Debt, economists Atif Mian of Princeton and Amir Sufi of the University of Chicago make a very strong case that what we need isn’t more debt-fueled finance and credit to fix the economy, but much, much less of it.22 Marshaling a large body of research and data stretching over one hundred years, they present a compelling case that the financial crisis started not with the failure of the banks, but with the collapse of consumer spending, which began well before the fall of Lehman Brothers—as early as two years before in some areas of the country. The fall in consumer spending was most pronounced and happened earliest in communities most heavily saddled with debt. (It was by no means limited to those areas, however. As the authors explain, when consumption in one city or region collapses, it has a knock-on effect of job losses and slower growth throughout the nation.) 
Basically, House of Debt makes a very convincing argument that recessions occur when finance arrives on the scene and offers middle- and lower-income people more and more debt. That brews up asset bubbles, which eventually burst, hitting the biggest debt holders—which also tend to be the poorest people—hardest. Everyone hunkers down and stops spending, and unemployment grows. The perverse cycle continues, as out-of-work people with even less spending power are buried under mounds of debt (no federally subsidized bailouts for them). Indeed, House of Debt paints a fascinating picture of how similar the periods leading up to the Great Depression and the Great Recession were in this regard. 4806

Why does it reward takers over makers so thoroughly? There’s no clear or complete answer. Certainly a large part of it is politics. Our entire financial system is based on debt, and, as we know, the financial lobby wields tremendous political power. Financiers have used that power over these many decades to push for a system that rewards the creation of debt, which is the core of their business model. But the truth is that like many large and complex systems (take healthcare, or the retirement system, which I covered in chapter 8), the American tax code’s favoring of debt over equity isn’t the result of some grand design, but rather “the unintended consequence of an extended series of discrete, reactive, short-term political decisions,” in one economic historian’s words.26 Interest groups lobby for this or that loophole, and little by little, rules and regulations that might seem to make sense in a vacuum combine to create a system of perverse incentives that reward exactly the kind of behavior the economy doesn’t need. 4847