Magstadt, T. M. (2017). Understanding politics: Ideas, institutions, and issues. Australia: Cengage Learning.
Chapter 1. Introduction: The Study of Politics
· 1. Discuss the value of studying politics.
· 2. Identify the three basic elements of politics, as well as the dynamics of each.
· 3. Analyze the methods, models, and approaches for studying politics.
· 4. Evaluate whether politics brings out the best or the worst in human nature—or both.
Politics is not for the faint-hearted. There is virtually never a day without a crisis at home or abroad. Whenever we catch the news on our radio, TV, or computer, we are reminded that we live in a dangerous world.
In 2012, the spectacle of the world’s only superpower paralyzed by extreme partisanship and teetering on the brink of a “fiscal cliff” loomed like a gathering storm. No sooner had that danger receded than a new threat arose in the Middle East in the form of the so-called Islamic State of Iraq and Syria (ISIS). There were even rumors of a coming end-of-the-world apocalypse—December 21, 2012, to be exact, the final day of the old Mayan calendar.
The politically charged atmosphere and the pervasive sense of an impending crisis were nothing new, but two events dominated the news in 2008. First, a financial meltdown and plummeting stock market wiped out fortunes and rocked the global economy to its very foundations. Second, Barack Obama became the first African American elected to the nation’s highest office.
Political culture plays a big role in shaping public policy, and optimism is part of America’s political DNA. Despite a deepening recession, there was a new sense of hope—perhaps it was the beginning of the end of two costly wars and the dawn of a new era in America. But by 2012 hope had given way to anger and disappointment.
What happened? In 2009, President Obama had moved to revive the U.S. economy, which had fallen into the deepest recession since the Great Depression of the 1930s. But the economic stimulus package he pushed through Congress, where the Democrats enjoyed a solid majority in both the House and Senate, was widely viewed as a Wall Street “bailout”—a massive multibillion-dollar gift to the very financial institutions that had caused the problem. It was also criticized as a “jobless recovery”; unemployment rose to nearly 10%, and youth unemployment (16- to 19-year-olds) rose to about 25% in 2010. Nearly half of young people aged 16 to 24 did not have jobs, the highest proportion since World War II.
The conservative media (most notably FOX News) and the amorphous Tea Party movement eagerly exploited growing public discontent, handing the Democrats a crushing defeat in the 2010 midterm elections. Republicans regained control of the House and cut deeply into the Democrats’ majority in the Senate (see especially Chapters 11 and 13).
Obama also spearheaded a controversial health care reform that satisfied few, confused everyone, and angered many voters on both sides of the acrimonious debate. His decision to order a “surge” in Afghanistan, committing 30,000 more U.S. troops to an unpopular and unwinnable war, did not placate Congress or greatly improve his standing in the opinion polls, nor did his decision to withdraw the last U.S. combat troops from Iraq in December 2011.
Despite a constant chorus of criticism and a vicious media campaign of attack ads from the right, Obama was elected to a second term in 2012. He defeated Republican Mitt Romney by a margin of 5 million votes (51% to 47% of the popular vote) while taking 61% of the electoral votes. The embattled president’s troubles in dealing with a recalcitrant Republican majority in Congress, however, continued unabated. His decision in the fall of 2014 to launch a major bombing campaign against ISIS in Iraq and Syria—in effect, resuming a war that had officially ended three years earlier—did not appease the opposition or boost his popularity, which fell to new lows in 2014.
The president’s popularity—or lack thereof—was a major factor in setting the stage for the Republican victory in the 2014 midterm elections when voters gave the GOP a majority in the Senate. Republicans also gained seats in the House (where they had won back control in 2010). But President Obama acted decisively in the days following the election, confounding his critics and commentators who had branded him a “lame duck.”
We know politics is something that happens in Washington, D.C., or in Austin, Texas, and other state capitals, but some of us forget that politics is a pervasive fact of life—others never forget it. That very fact often gives those “others” a big advantage, which can be the difference between success and failure.
For any democracy to succeed in the long run, it is vital that citizens pay attention, learn to think for themselves, and vote intelligently. Political literacy is vital to a viable and sustainable representative government—what we commonly call “democracy.”
The alternative is revolution, a drastic measure and a last resort—one American colonists chose in 1776 and the Confederate South chose in 1860. As we will see in Chapter 14, revolutions are convulsive and quixotic. They often result in less freedom for the people, not more.
A popular slogan (and bumper sticker) reminds us that “Freedom Isn’t Free.” It’s true. At a minimum, being a good citizen requires us to have a basic understanding of the ideas, institutions, and issues that constitute the stuff of politics. This book is an attempt to foster just such an understanding.
Why Study Politics?
The belief that anybody with a college education will have a basic understanding of political ideas, institutions, and issues is wishful thinking; a mountain of empirical evidence shows it’s simply not true. To begin to understand the power of politics—and the politics of power—we have to make a careful study and, above all, keep an open mind.
Because personal happiness depends in no small degree on what government does or does not do, we all have a considerable stake in understanding how government works (or why it is not working). Federal work-study programs, state subsidies to public education, low-interest loans, federal grants, and court decisions that protect students’ rights are but a few examples of politics and public policy that directly affect college students. For farmers, crop subsidies, price supports, and water rights are crucial policy issues. Environmental regulations are often the target of intense lobbying on the part of power companies, the oil and gas industry, and mining interests.
Taxes are a hot button for nearly everybody. Most people think they pay too much and others pay too little. Do you know anybody who wants to pay more in taxes? Can you think of one wealthy individual who argues that people in his income bracket ought to pay more? (Hint: His initials are W.B.)
Through the study of politics, we become more aware of our dependence on the political system and better equipped to determine when to favor and when to oppose change. At the same time, such study helps to reveal the limits of politics and the obstacles to bringing about any major change in a society. It is sobering to consider that each of us is only one person in a nation of millions (and a world of billions), most of whom have opinions and prejudices no less firmly held than our own.
The Public Interest
What could be more vital to the public interest in any society than the moral character and conduct of its citizens? Civil society is defined by and reflected in the kinds of everyday decisions and choices made by ordinary people leading ordinary lives. At the same time, people are greatly influenced by civil society and the prevailing culture and climate of politics. We are all products of our circumstances to a greater extent than most of us realize (or care to admit). Politics plays a vital role in shaping these circumstances, and it is fair to say the public interest hangs in the balance.
Basic Concepts of Politics
Politics has been defined as “the art of the possible,” as the study of “who gets what, when, and how,” as the “authoritative allocation of values,” and in various other ways. Many people think politics is inherently corrupt and corrupting—hence the term “dirty politics.” Is this true? Can you think of any exceptions?
We may not agree on how to define politics, but we know what it is when we see it—and we don’t like what we see. We are quick to blame “politics” as the main cause of problems not only in society but also in families, schools, and the workplace. Yet college students are often unaware of the anger and tumult that frequently animate campus politics.
Like other disciplines, political science has a lexicon and language all its own. We start our language lesson with three words that carry a great deal of political freight: power, order, and justice.
Power is the currency of all politics. Without power, no government can make and enforce laws, provide security, regulate the economy, conduct foreign policy, or wage war. There are many kinds of power. In this book, we are interested in political power. Coercion plays an important role in politics, but political power cannot be equated with force. Indeed, the sources of power are many and varied. A large population, a booming economy, a cohesive society, and wise leadership—all are examples of quite different power sources.
We often define power in terms of national wealth or military spending. We once called the most formidable states Great Powers; now we call them “superpowers.” Power defined in this way is tangible and measurable. Critics of this classical view make a useful distinction between “hard power” and “soft power.” Hard power refers to the means and instruments of brute force or coercion, primarily military and economic clout. Soft power is “attractive” rather than coercive: the essence of soft power is “the important ability to get others to want what you want.”*
Power is never equally distributed. Yet the need to concentrate power in the hands of a few inevitably raises three big questions: Who wields power? In whose interests? And to what ends?
The most basic question of all is “Who rules?” Sometimes we have only to look at a nation’s constitution and observe the workings of its government to find the answer. But it may be difficult to determine who really rules when the government is cloaked in secrecy or when, as is often the case, informal patterns of power are very different from the textbook diagrams.
The terms power and authority are often confused and even used interchangeably. In reality, they denote two distinct dimensions of politics. According to Mao Zedong, the late Chinese Communist leader, “Political power grows out of the barrel of a gun.” Political power is clearly associated with the means of coercion (the regular police, secret police, and the army), but power can also flow from wealth, personal charisma, ideology, religion, and many other sources, including the moral standing of a particular individual or group in society.
Authority, by definition, flows not only (or even mainly) from the barrel of a gun but also from the norms society accepts and even cherishes. These norms are moral, spiritual, and legal codes of behavior, or good conduct. Thus, authority implies legitimacy—a condition in which power is exercised by common consensus through established institutions. Note this definition does not mean, nor is it meant to imply, that democracy is the only legitimate form of government possible. Any government that enjoys the consent of the governed is legitimate—including a monarchy, military dictatorship, or theocracy.
The acid test of legitimate authority is not whether people have the right to vote or to strike or dissent openly, but how much value people attach to these rights. If a majority of the people are content with the existing political order just as it is (with or without voting rights), the legitimacy of the ruler(s) is simply not in question. But, as history amply demonstrates, it is possible to seize power and to rule without a popular mandate or public approval, without moral, spiritual, or legal justification—in other words, without true (legitimate) authority.
A military power seizure—also known as a coup d’état—typically involves a plot by senior army officers to overthrow a corrupt, incompetent, or unpopular civilian ruler. One well-known recent example happened in Egypt in July 2013, following many months of turmoil and the outcome of a presidential election that became unacceptable to the military.
Power seizures also occurred in Mauritania and Guinea in 2008 and in Thailand as recently as 2014; many contemporary rulers, especially in Africa, have come to power in this manner. Adolf Hitler’s failed “Beer Hall Putsch” in 1923 is a famous example of an attempted power seizure. Such attempts often fail, but they are usually evidence of political instability—as the case of Weimar Germany illustrates.
Claiming authority is useless without the means to enforce it. The right to rule—a condition that minimizes the need for repression—hinges in large part on legitimacy or popularity.
Legitimacy and popularity go hand in hand. Illegitimate rulers are unpopular rulers. Such rulers are faced with a choice: relinquish power or repress opposition. Whether repression works depends, in turn, on the answer to three questions. First, how widespread and determined is the opposition? Second, does the government have adequate financial resources and coercive capabilities to defeat its opponents and deter future challenges? Third, does the government have the will to use all means necessary to defeat the rebellion?
If the opposition is broadly based and the government wavers for whatever reason, repression is likely to fail. Regimes changed in Russia in 1917 and 1991 following failed attempts to crush the opposition. Two other examples include Cuba in 1958, where Fidel Castro led a successful revolution, and Iran in 1978, where a mass uprising led to the overthrow of the Shah. A similar pattern was evident in many East European states in 1989, when repressive communist regimes collapsed like so many falling dominoes.
If people respect the ruler(s) and play by the rules without being forced to do so (or threatened with the consequences), the task of maintaining order and stability in society is going to be much easier. It stands to reason that people who feel exploited and oppressed make poorly motivated workers. The perverse work ethic of Soviet-style dictatorships, where it was frequently said, “We pretend to work and they pretend to pay us,” helps explain the decline and fall of Communism in the Soviet Union and Eastern Europe, dramatized by the spontaneous tearing down of the Berlin Wall in 1989.
Order exists on several levels. First, it denotes structures, rules, rituals, procedures, and practices that make up the political system embedded in every society. What exactly is society? In essence, society is an aggregation of individuals who share a common identity. Usually that identity is at least partially defined by geography, because people who live in close proximity often know each other, enjoy shared experiences, speak the same language, and have similar values and interests. The process of instilling a sense of common purpose or creating a single political allegiance among diverse groups of people is complex and works better from the bottom up than from the top down. The breakup of the Soviet Union and Yugoslavia in the early 1990s, after more than seven decades as multinational states, suggests new communities are often fragile and tend to fall apart quickly if there are not strong cultural and psychological bonds under the political structures.
The Russian-backed secessionist movement that threatened to break up Ukraine in 2014–15 also illustrates the obstacles to maintaining order in a newly independent country where a national minority group is geographically concentrated. Russian-speakers in parts of eastern Ukraine bordering on Russia constitute a solid majority and remain fiercely loyal to Moscow. The same is true in Crimea (previously part of Ukraine), where most people welcomed Russia’s armed intervention. Russia annexed this strategically important region (the whole of the Crimean Peninsula) in March of 2014.
The idea that individuals become a cohesive community through an unwritten social contract has been fundamental to Western political thought since the seventeenth century. Basic to social contract theory is the notion that the right to rule is based on the consent of the governed. Civil liberties in this type of community are a matter of natural law and natural rights—that is, they do not depend on written laws but rather are inherent in Nature. Nature with a capital N is a set of self-evident truths that, in the eyes of social contract theorists, can be known through a combination of reason and observation. A corollary of this theory is that whenever government turns oppressive, when it arbitrarily takes away such natural rights as life, liberty, and (perhaps) property, the people have a right to revolt (see Chapter 14).
Government is a human invention by which societies are ruled and binding rules are made. Given the rich variety of governments in the world, how might we categorize them all? Traditionally we’ve distinguished between republics, in which sovereignty (see below) ultimately resides in the people, and governments such as monarchies or tyrannies, in which sovereignty rests with the rulers. Today, almost all republics are democratic (or representative) republics, meaning political systems wherein elected representatives responsible to the people exercise sovereign power.*
Some political scientists draw a simple distinction between democracies, which hold free elections, and dictatorships, which do not. Others emphasize political economy, distinguishing between governments enmeshed in capitalist or market-based systems and governments based on socialist or state-regulated systems. Finally, governments in developing countries face different kinds of challenges than do governments in developed countries. Not surprisingly, more economically developed countries often have markedly more well-established political institutions—including political parties, regular elections, civil and criminal courts—than most less developed countries, and more stable political systems.
In the modern world, the state is the sole repository of sovereignty. A sovereign state is a community with well-defined territorial boundaries administered by a single government capable of making and enforcing laws. In addition, it typically claims a monopoly on the legitimate use of force; raises armies for the defense of its territory and population; levies and collects taxes; regulates trade and commerce; establishes courts, judges, and magistrates to settle disputes and punish lawbreakers; and sends envoys (ambassadors) to represent its interests abroad, negotiate treaties, and gather useful information. Entities that share some but not all of the characteristics of states include fiefdoms and chiefdoms, bands and tribes, universal international organizations (such as the United Nations), and regional supranational organizations (such as the European Union).
In the language of politics, state usually means country. France, for instance, may be called either a state or a country. (In certain federal systems of government, a state is an administrative subdivision, such as New York, Florida, Texas, or California in the United States; however, such states within a state are not sovereign.)
The term nation is also a synonym for state or country. Thus, the only way to know for certain whether state means part of a country (for example, the United States) or a whole country (say, France or China) is to consider the context. By the same token, context is the key to understanding what we mean by the word nation.
A nation is made up of a distinct group of people who share a common background, including geographic location, history, racial or ethnic characteristics, religion, language, culture, or belief in common political ideas. Geography heads this list because members of a nation typically exhibit a strong collective sense of belonging associated with a particular territory for which they are willing to fight and die if necessary.
Countries with relatively homogeneous populations (with great similarity among members) were most common in old Europe, but this once-defining characteristic of European nation-states is no longer true. The recent influx of newcomers from former colonial areas, in particular the Muslim-majority countries of North Africa, the Arab world, and South Asia, and post–Cold War east-west population movements in Europe have brought the issue of immigration to the forefront of politics in France, Germany, the United Kingdom, Spain, Italy, the Netherlands, and even the Scandinavian countries. Belgium, on the other hand, provides a rare example of a European state divided culturally and linguistically (French-speaking Walloons and Dutch-speaking Flemish) from the start.
India, Russia, and Nigeria are three highly diverse states. India’s constitution officially recognizes no fewer than eighteen native tongues! The actual number spoken is far larger. As a nation of immigrants, the United States is also very diverse, but the process of assimilation eventually brings the children of newcomers, if not the newcomers themselves, into the mainstream.*
The nation-state is a state encompassing a single nation in which the overwhelming majority of the people form a dominant in-group who share common cultural, ethnic, and linguistic characteristics; all others are part of a distinct out-group or minority. This concept is rooted in a specific time and place—that is, in modern Western Europe. (See “Landmarks in History” for the story of the first nation-state.) The concept of the nation-state fits less comfortably in other regions of the world, where the political boundaries of sovereign states—many of which were European colonies before World War II—often do not coincide with ethnic or cultural geography. In some instances, ethnic, religious, or tribal groups that were bitter traditional enemies were thrown together in new “states,” resulting in societies prone to great instability or even civil war.
Decolonization after World War II gave rise to many polyglot states in which various ethnic or tribal groups were not assimilated into the new social order. Many decades later, the all-important task of nation-building in these new states is still far from finished. Thus, in 1967, Nigeria plunged into a vicious civil war when one large ethnic group, the Igbo, tried unsuccessfully to secede and form an independent state called Biafra. In 1994, Rwanda witnessed one of the bloodiest massacres in modern times when the numerically superior Hutus slaughtered hundreds of thousands of Tutsis, including women and children. In early 2008, tribal violence in Kenya’s Rift Valley and beyond claimed the lives of hundreds of innocent people following the outcome of a presidential election that many believed was rigged.
In India, where Hindus and Muslims frequently clash, sporadic violence breaks out among militant Sikhs in Punjab, and hundreds of languages and dialects are spoken, characterizing the country as a nation-state misses the point altogether. In Sri Lanka (formerly Ceylon), Hindu Tamils long waged a terrorist guerrilla war against the majority Sinhalese, who are Buddhist.
Even in the Slavic-speaking parts of Europe, age-old ethnic rivalries have caused the breakup of preexisting states. The Soviet Union, Yugoslavia, and Czechoslovakia were multinational states that self-destructed in the 1990s—and in 2014–15 centrifugal tendencies threatened to split Ukraine in half.
Landmarks in History The Peace of Westphalia (1648): The Origins of the Modern Nation-State System
Most historians believe the Peace of Westphalia marks the beginning of the modern European state system. The main actors in forging the peace, which ended the Thirty Years’ War in 1648, were Sweden and France as the challengers, Spain and the dying Holy Roman Empire as the defenders of the status quo, and the newly independent Netherlands.
At first glance, the map of Europe in the mid-seventeenth century does not look much like it does today. However, on closer inspection, we see the outlines of modern Europe emerge (see Figure 1.1)—visual proof that the treaty laid the foundations of the nation-state as we see it in Europe today.
The emergence of the nation-state system transformed Europe from a continent of territorial empires to one based on relatively compact geographic units that share a single dominant language and culture. This pattern was unprecedented and it would shape both European and world history in the centuries to come.
France under Napoleon attempted to establish a new continental empire at the beginning of the nineteenth century but ultimately failed. Two other empires—Austria-Hungary and Russia—remained, but they were eclipsed by a rising new nation-state at the end of the nineteenth century and perished in World War I. After World War I, only the newly constituted Soviet empire existed in Europe. After World War II, what remained of Europe’s overseas colonial empires also disintegrated. Today, the entire world, with few exceptions, is carved up into nation-states—the legacy of a treaty that, for better or worse, set the stage for a new world order.
Finally, stateless nations such as the Palestinians, Kurds, and Native Americans (known as First Nations in Canada) share a sense of common identity but no longer control the homelands or territories they once inhabited. The tragic reality of nations without states has created highly volatile situations, most notably in the Middle East.
We willingly accept the rule of the few over the many only if the public interest—or common good—is significantly advanced in the process. The concept of justice is no less fundamental than power in politics, and it is essential to a stable order. Is power exercised fairly, in the interest of the ruled, or merely for the sake of the rulers? For more than two thousand years, political observers have maintained the distinction between the public-spirited exercise of political power on one hand and self-interested rule on the other. This distinction attests to the importance of justice in political life.
Not all states and regimes allow questions of justice to be raised; in fact, throughout history, most have not. Even today, some governments brutally and systematically repress political dissent because they fear the consequences.
Often, criticism of how a government rules implicitly or explicitly raises questions about its moral or legal right to rule. One of the most important measures of liberty is the right to question whether the government is acting justly.
Questions about whether a particular ruler is legitimate or a given policy is desirable stem from human nature itself. The Greek philosopher Aristotle (384–322 BCE) observed that human beings alone use reason and language “to declare what is advantageous and what is just and unjust.” Therefore, “it is the peculiarity of man, in comparison with the rest of the animal world, that he alone possesses a perception of good and evil, of the just and unjust.”*
The same human faculties that make moral judgment possible also make political literacy—the ability to think and speak intelligently about politics—necessary. In other words, moral judgment and political literacy are two sides of the same coin.
The Problem of Dirty Hands
Based on everyday observation, it’s easy to get the impression that politics and morality operate in separate realms of human experience, that power always corrupts, and that anyone who thinks differently is hopelessly naïve. Political theorists have long recognized and debated whether it is possible to exercise power and still remain true to one’s principles. It’s called the problem of “dirty hands.”
In politics, anything is possible, including the unthinkable. When morality is set aside, justice is placed entirely at the mercy of raw power.
The rise and fall of Nazi Germany (1933–1945) under Adolf Hitler illustrates the tremendous impact a regime can have on the moral character of its citizens. At the core of Nazi ideology was a doctrine of racial supremacy. Hitler ranted about the superiority of the so-called Aryan race. The purity of the German nation was supposedly threatened with adulteration by inferior races, or Untermenschen. Policies based on this maniacal worldview resulted in the systematic murder of millions of innocent men, women, and children. Approximately six million Jews and millions of others, including Poles, Gypsies, homosexuals, and people with disabilities, were killed in cold blood.
During the Nazi era, the German nation appears, at first glance, to have become little more than an extension of Hitler’s will—in other words, the awesome moral responsibility for the Holocaust somehow rested on the shoulders of one man, Adolf Hitler. But some dispute this interpretation. For example, according to Irving Kristol,
When one studies the case of The Nazi there comes a sickening emptiness of the stomach and a sense of bafflement. Can this be all? The disparity between the crime and the criminal is too monstrous.
We expect to find evil men, paragons of wickedness, slobbering, maniacal brutes; we are prepared to trace the lineaments of The Nazi on the face of every individual Nazi in order to define triumphantly the essential features of his character. But the Nazi leaders were not diabolists, they did not worship evil. For—greatest of ironies—the Nazis, like Adam and Eve before the fall, knew not of good and evil, and it is this cast of moral indifference that makes them appear so petty and colorless and superficial.*
One such person, according to the late German-born political theorist Hannah Arendt, was Otto Adolf Eichmann, the Nazi officer in charge of Jewish affairs in the Third Reich, who engineered and directed the genocide or extermination program known in history as the Holocaust. In Arendt’s view, Eichmann was not a particularly unusual man.*
Following Eichmann’s capture in 1960 and his subsequent trial for war crimes, Arendt wrote a famous series of articles for The New Yorker later published in a book entitled Eichmann in Jerusalem: The Banality of Evil. The subtitle of the book underscored Arendt’s central argument: namely, that far from being one of the masterminds of the Holocaust, Eichmann was an ordinary man with no original ideas, great ambitions, or deep convictions. Rather, he had a strong desire to get ahead, to be a success in life. He took special pride in his ability to do a job efficiently.
Politics and Pop Culture Schindler’s List
Not all Germans, or Europeans, were as indifferent or self-serving in the face of evil as Adolf Eichmann. One notable example was Oskar Schindler, who is now widely renowned thanks largely to the movie Schindler’s List.
Schindler was a German businessman who belonged to the Nazi Party. Schindler was no saint, but he used his business and political connections to save the lives of the Jewish workers he had first exploited.*
No doubt most of us would identify more with Schindler and other Christians who rescued Jews than with Eichmann, but the disturbing fact remains that far more Germans (including tens of thousands of Hitler Youth), mesmerized by Hitler’s message of hate, behaved more like Eichmann than like Schindler.
At his trial for war crimes, Eichmann claimed to have no obsessive hatred toward Jews. In fact, we know now that Eichmann’s “little man” self-portrait was a clever act designed to save him from the gallows.
Although Eichmann was not the mere functionary or “cog” he claimed to be, many Germans who participated directly in the Holocaust do fit this description—they were following orders, full stop. The fact that so many Germans blindly obeyed Eichmann and Hitler’s other top lieutenants illustrates the fine line between indifference and immorality—and how easily the former can lead to the latter.
· Eichmann exemplifies the worst in human nature; Schindler exemplifies the best. Both men were caught up in the same set of circumstances. Except for a depraved but ingenious demagogue named Hitler, Eichmann would not have become a war criminal and Schindler would not have become a paragon. If Hitler does not deserve the credit for producing an exemplar like Schindler, does he deserve the blame for producing a monster like Eichmann? Think about it.
Hint: If we are all products of the circumstances we are born (or thrust) into, we are thereby absolved of individual moral responsibility. On the other hand, if there is such a thing as free will, then we cannot blame society for our misdeeds.
Although not particularly thoughtful or reflective in Arendt’s view, Eichmann was intelligent in practical ways, attentive to details, a competent administrator capable of managing a major operation like the systematic mass murder of millions of Jews and other “enemies” and “degenerates.” Arendt also describes Eichmann as somewhat insecure, but not noticeably more so than many “normal” people (see “Politics and Pop Culture”).
More recently, scholars have unearthed a treasure trove of research materials that challenge Arendt’s thesis. In a well-documented 579-page tome entitled Eichmann Before Jerusalem: The Unexamined Life of a Mass Murderer (New York: Alfred A. Knopf, 2014), German philosopher Bettina Stangneth shows clearly that Eichmann was a thinking man, a fanatical believer in German racial superiority who believed himself to have been involved in “creative” work and who—as a fugitive hiding out in Argentina after the war—was determined to secure his rightful place as a hero in German history. The notion that in Kristol’s words “he knew not of good and evil” is no longer credible. Eichmann did not lose any sleep over dirty hands; instead, he gloried in having bloody hands.
How to Study Politics
Aristotle is the father of political science.* He not only wrote about politics and ethics, but he also described different political systems and suggested a scheme for classifying and evaluating them. For Aristotle, political science simply meant political investigation; thus, a political scientist was one who sought, through systematic inquiry, to understand the truth about politics. In this sense, Aristotle’s approach to studying politics more than two thousand years ago has much in common with what political scientists do today. Yet the discipline has changed a great deal since Aristotle’s time.
There is no consensus on how best to study politics. Political scientists can and do choose among different approaches, ask different kinds of questions, and address different audiences. This fact is often a source of some dismay within the discipline, but it is hardly surprising and probably unavoidable given the vast universe of human activity the study of politics encompasses. Let us explore why and how contemporary political scientists study politics.
For What Purposes?
Some of the most important questions in politics are “should” and “ought” questions, the kind that scientists seeking objective truth tend to avoid. These are the great normative political questions that resonate throughout human history: When is war justified? Do people have a right to revolt? Is the right to life absolute? Is state repression always wrong? Does government have a right to keep secrets from the people? To invade the privacy of its citizens? What about censorship? Is government ever justified in placing limits on freedom of expression and freedom of the press? Should every citizen pay taxes at the same rate? If not, why not? Who should pay more and who less?
Such questions may seem too abstract or theoretical to have any practical value, but in fact they are behind the most controversial political issues of the day—abortion, gun control, gay rights, legalization of marijuana, capital punishment, and the list goes on. (See if you can think of more issues to add to this list and connect each issue to some fundamental question of justice or fairness.)
Some issues lend themselves to empirical analysis more than others. Studying elections, for example, can reveal flaws in the voting process—such as skewed voting districts or impediments to voter registration—and lead to appropriate changes or reforms, such as redistricting or switching from written ballots to voting machines. Opinion polls help leaders gauge the mood of the public and better understand the effect of government policies (see Chapter 11).
However, answers to many of the most basic questions in politics can only be discovered via a thorough knowledge of the facts and a rigorous process of analysis involving reason, logic, and dialogue. There are no shortcuts, and given that we are talking about the health and well-being of society, the stakes are too high to settle for anything less than our best efforts.
By What Methods?
Should political science strive to predict or forecast events? Is the study of politics a science akin to physics or chemistry? Answers to such questions lie in the realm of methodology. There are many ways to classify political scientists. We will focus on one basic distinction—the difference between positivism and normativism.
Positivism emphasizes empirical research (which relies on observation) and couches problems in terms of variables we can measure. Behaviorism is an offshoot of positivism that focuses mainly on the study of political behavior. Behaviorists use quantitative analysis to challenge conventional wisdom about, for example, what motivates voters or why a given election turned out the way it did. Following the facts—statistical data—wherever they may lead is the hallmark of the so-called hard sciences. The results of empirical research can cast long-standing “truths” into serious doubt or expose “facts” as fallacies.
Normativism is based on an idea closely associated with the German political philosopher Immanuel Kant: that the “ought” and the “is” are distinct from one another and that the “ought” cannot be derived from the “is.” Sticking strictly to the facts, a trademark of positivism, thus raises a serious problem for the adherents of normative theory, who are interested not only in describing actions and consequences but also in prescribing policies and remedies. Seen in this light, values are at the core of political analysis. In studying Congress, for example, value-based political science might ask: Did special interests unduly influence health care reform in 2009-2010? Or with respect to U.S. foreign policy: Was the invasion of Iraq in 2003 necessary?
Scholars and policy analysts seeking answers to such questions often resort to philosophy, history, constitutional law, court cases, treaties, declassified documents, and expert opinion. For example, in explaining why the Constitution adopted in 1787 did not abolish slavery, scholars often skip over the question of whether or why slavery is wrong. Instead, they examine the writings and speeches of the founding fathers, the economic interests they represented, the social class to which they all belonged, and the like. The reason they (we) don’t dwell on the moral question is that today every sane and sensible person knows slavery is wrong. Slavery is an extreme case, but many political issues are at least as much about values as about facts.
In truth, it is not always easy to distinguish between a fact and a value. Moreover, in politics, values are facts. We all bring certain values to everything we do. At the same time, however, we can never get at the truth if we don’t place a high value on facts.
For example, the belief that abortion is a sin, which is held by an influential segment of the population, is a value based on a religious belief or moral conviction. We can argue all day long whether abortion is an American’s right or always wrong, but there is no escaping the fact that it is controversial and that politicians, government officials, and judges have no choice but to deal with it. No matter what legislation or jurisprudence is brought to bear on this question, it will have far-reaching consequences for society. This is but one simple example among many, illustrating the reality in which facts and values are entangled in the political life of every society, always have been, and always will be.
The Study of Human Behavior
Political scientists tend to be wary of “subjective” value judgments that often fly in the face of objective facts. In the social sciences, so-called behaviorists use the type of quantitative methods common in the natural sciences such as biology, physics, and chemistry, asking questions that can only be answered empirically. Constructing a research design, collecting data, using the objective tools of statistical analysis to test hypotheses—these are the essential elements of the scientific method. In this manner, behavioral scientists develop mathematical models to try to explain voting behavior, coalition-building, decision making, even the causes of war.
In a study done nearly two decades ago but still relevant, researchers asked: Is it really true, as is widely believed, that high voter turnout favors Democrats?* The prevailing belief that Democrats benefit from high voter turnout assumes several things:
people with lower socioeconomic status (SES) vote less often than people with higher SES;
as voter turnout rises, more people on the lower end of the SES ladder vote; and
lower-end voters are likely to vote for the party they trust to advance working-class interests—namely, the Democratic Party.
This belief is reinforced whenever low voter turnout coincides with Republican victories. It also explains why most Democrats favored (and Republicans opposed) the 1993 National Voter Registration Act—popularly known as the Motor Voter Bill—which eased voter registration procedures.
Researchers examined 1,842 state elections going all the way back to 1928: 983 for senator and 859 for governor. Applying a mathematical test, they concluded that from 1928 to 1964 high voter turnout did aid the Democrats, as generally believed, but after 1964 there was no such correlation either in senatorial or gubernatorial races.
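The logic of such a test can be illustrated with a toy calculation. The figures below are invented for illustration (the actual study analyzed 1,842 real elections); the point is simply to show what it means to test for a correlation between turnout and a party’s share of the vote.

```python
# Toy illustration of a turnout-correlation test; the data are hypothetical.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical races: turnout (%) and the Democratic share of the vote (%)
turnout = [48, 52, 55, 61, 64, 70]
dem_share = [44, 47, 49, 52, 55, 58]

r = pearson_r(turnout, dem_share)
print(f"correlation between turnout and Democratic share: r = {r:.2f}")
```

A value of r near +1 would support the conventional wisdom that high turnout helps Democrats; a value near 0, as the researchers found for post-1964 races, would undercut it.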
Why? Although this question was beyond the scope of the study, its findings were consistent with another complex theory of voting behavior. The rise in the number of independents since 1964 (and the resulting decline in party identification and partisan voting) made it difficult to calculate which party would benefit from a large voter turnout in any given race. In 2011, a Gallup poll found that 40% of all voters identified themselves as independents, and ticket splitting and swing voting have become common (see Chapter 11). In the 2010 midterm elections, for example, Republicans were the beneficiaries of a huge swing vote, as they were once again in 2014.
Behaviorists, like other research scientists, are typically content to take small steps on the road to knowledge. Each step points the way to future studies.
Studying human behavior can be as frustrating as it is fascinating. There are almost always multiple explanations for human behavior, and it is often difficult to isolate a single cause or distinguish it from a mere statistical correlation. For instance, several studies indicate that criminals tend to be less intelligent than law-abiding citizens. But is low intelligence a cause of crime? What about social factors such as poverty, drug or alcohol addiction, or a history of being abused as a child? What about free will? Many reject the idea that society—rather than the criminal—is somehow responsible for the crime.
Political scientists often disagree not only about how to study politics but also about which questions to ask. Behaviorists typically prefer to examine specific and narrowly defined questions, answering them by applying quantitative techniques—sophisticated statistical methods such as regression analysis and analysis of variance.
Many broader questions of politics, especially those raising issues of justice, lie beyond the scope of this sort of investigation. Questions such as “What is justice?” or “What is the role of the state in society?” require us to make moral choices and value judgments. Even if we cannot resolve such questions scientifically, they are worth asking. Confining the study of politics only to the kinds of questions we can subject to quantitative analysis risks turning political science into an academic game of Trivial Pursuit.
Given the complexity of human behavior, it is not surprising that experts argue over methodology, or how to do science. The lively debate sparked by the behavioral revolution divided the discipline for several decades; although it has since cooled, it is likely to continue in some form for years to come.
The Political (Science) Puzzle
Political science, like politics, means different things to different people. The subject matter of politics is wide-ranging and thus difficult to study without breaking it down into more manageable parts and pieces. Subfields include political theory, U.S. government and politics, public administration, public policy, political economy, comparative politics, and international relations.
The origins of what we now call political science are to be found in Greek philosophy and date back to Socrates and Plato (circa 400 BCE). The Socratic method of teaching and seeking Truth was to ask a series of pithy questions—What is the good life? Is there a natural right to liberty?—while questioning every answer in order to expose logical fallacies.
Political theory seeks answers to such questions through reason, logic, and experience. Famous names in the history of political thought include Aristotle, Thomas Hobbes, John Locke, Jean-Jacques Rousseau, and John Stuart Mill, among others. These thinkers ranged far and wide but met at the intersection of politics and ethics.
Because people on opposite sides of the political fence believe that they are right and the other people are wrong, understanding politics requires us, at minimum, to be open-minded and familiarize ourselves with pro and con arguments.* Knowledge of the costs and moral consequences of political action is essential to a clear sense of purpose and coherent policy.
Are we humans rational by nature or are we driven by passions such as love, hate, anger, and prejudice? Advocates of rational choice theory emphasize the role of reason over emotion in human behavior. Political behavior, arguably, follows logical and even predictable patterns. The key to understanding politics is self-interest. This approach, which forms the basis for a theory of international relations known as political realism (see Chapter 17), holds that individuals and states alike act according to the iron logic of self-interest.
Other political scientists argue that rational choice theory is an oversimplification because states and groups are composed of human beings with disparate interests, perceptions, and beliefs. The key is not self-interest pure and simple but culture and shared values. In this view, we cannot explain political behavior by reference to logic and rationality alone. Instead, the behavior of individuals and of groups is a product of specific influences that vary from place to place—in other words, political behavior is a product of political culture.
Of course, it is not necessary to adhere dogmatically to one theory or the other. Both contain important insights and we can perhaps best see them as complementary rather than conflicting.
U.S. Government and Politics
Understanding our own political institutions is vitally important. Because the United States is a federal system, our frame of reference changes depending on whether we mean national, state, or local politics. Similarly, when we study political behavior in the United States, it makes a big difference whether we are focusing on individual behavior or the behavior of groups such as interest groups, ethnic groups, age cohorts, and the like. Teaching and learning about one’s own government is, in effect, an exercise in civic education.
Citizens in a democracy need to know how the government works, what rights they are guaranteed by the Constitution, and how to decide what to believe. We need to remember that the United States is home to the oldest written constitution, a behemoth economy, and the most potent military capability of all time. Prestige, power, and wealth have political and moral consequences: namely, an obligation to act responsibly as citizens of both a powerful country and an interdependent world.
Public administration is all about how governments organize and operate, about how bureaucracies work and interact with citizens and each other. In federal systems, intergovernmental relations is a major focus of study. Students of public administration examine budgets, procedures, and processes in an attempt to improve efficiency and reduce waste and duplication. One perennial question deals with bureaucratic behavior: How and why do bureaucracies develop vested interests and special relationships (such as between the Pentagon and defense contractors, or the Department of Commerce and trade associations) quite apart from the laws and policies they are established to implement?
Political scientists who study public administration frequently concentrate on case studies, paying attention to whether governmental power is exercised in a manner consistent with the public interest. Public administration shares this focus with policy studies and political science as a whole.
Policy Studies and Analysis
Public policy places a heavy emphasis on the outputs of government. However, the politics of public policy involves inputs as well. Before any policy can be formulated and finalized, much less implemented, all sorts of ideas and interests must be brought forward, congressional hearings held, consultants hired, and studies undertaken, published, digested, and debated. Not only special interests but also institutional interests and bureaucratic politics are further complicating factors. Once a policy is put into effect, policy analysts study the effects and look for signs—evidence—that it’s working or not working. The whole process is highly political both because public policy carries a price tag denominated in taxpayer dollars and, not least, because it often carries a lot of ideological freight.
The study of political economy is a particularly well-developed discipline in the United Kingdom, but it has migrated across the Atlantic and now occupies a prominent place in the curriculum at many colleges and universities in the United States. As the name implies, this subfield resides at the intersection of politics and economics. The genius of this marriage of two disciplines arises from the fact that so much of what governments do involves monetary and fiscal policy (taxes and spending), which have a major impact on the distribution of wealth in society, inflation and interest rates, employment levels, the business cycle, the investment climate, bank regulations, and the like.
Comparative politics seeks to contrast and evaluate governments and political systems. Comparing forms of government, stages of economic development, domestic and foreign policies, and political traditions enables political scientists to formulate meaningful generalizations. Some comparativists specialize in a particular region of the world or a particular nation. Others focus on a particular issue or political phenomenon, such as terrorism, political instability, or voting behavior.
All political systems share certain characteristics. Figure 1.2 depicts one famous model, developed by political scientist David Easton in 1965. This model suggests that all political systems function within the context of political cultures, which consist of traditions, values, and common knowledge. It assumes citizens have expectations of and place demands on the political system, but they also support the system in various ways: They may participate in government, vote, or simply obey the laws of the state. The demands they make and supports they provide in turn influence the government’s decisions, edicts, laws, and orders.
Simplified Model of the Political System.
Countries and cultures differ in countless ways. Focusing on these differences makes it possible to classify or categorize political systems in ways that can aid our understanding of the advantages and disadvantages of each type. This book distinguishes among democratic, authoritarian, and totalitarian states.
Typologies change over time, reflecting new trends and seismic shifts in world politics or the global economy. For example, after the fall of Communism, the distinction between established liberal democracies and “transitional states” gained currency (see Chapter 8). It also became fashionable to distinguish between viable states and so-called failed states (see Chapter 9). The main types of totalitarian systems—the Nazi or Fascist model on the right and the Communist model on the left—are either defunct (most notably Hitler’s Germany and Stalin’s Russia) or depend on foreign investment and access to global markets (China and Vietnam). As a result, there is a tendency to gloss over or ignore the totalitarian model today even though some unreconstructed examples of this extremely repressive system still exist (North Korea and Cuba). And perhaps because many countries (including the United States and our NATO allies in Europe) are now locked in an interdependent relationship with China, there is also a tendency to sweep gross human rights violations under the rug.
Specialists in international relations analyze how nations interact. Why do nations sometimes live in peace and harmony but go to war at other times? The advent of the nuclear age, of course, brought new urgency to the study of international relations, but the threat of an all-out nuclear war now appears far less menacing than other threats, including international terrorism, global warming, energy security, and, most recently, the economic meltdown.
Although war and peace are ever-present problems in international relations, they are by no means the only ones. The role of morality in foreign policy continues to be a matter of lively debate. Political realists argue that considerations of national interest have always been paramount in international politics and always will be.* Others argue that enlightened self-interest can lead to world peace and an end to the cycle of war. Realists often dismiss such ideas as too idealistic in a dog-eat-dog world. Idealists counter that realists are too fatalistic and that war is not inevitable but rather a self-fulfilling prophecy. Still others say the distinction between the national interest and international morality is exaggerated; that democracies, for example, derive mutual benefit from protecting each other and that in so doing they also promote world peace.*
The Power of Ideas
In politics, money talks—or so people say. Listening to the news, it’s easy to get the impression that Congress is up for sale. As a summer intern in the United States Senate many years ago, one of the first things I was told is, “Son, in Washington it isn’t what you know but who you know.”
Often we start out life being idealistic and then quickly run up against reality. For young students of politics, it is easy to fall into a trap, to lurch from one extreme to the other. If money is all that matters, justice is an illusion, ideas are irrelevant, and things can never change. But is it true? Are the cynics the smart ones?
One view, recently showcased in The Economist, holds that intelligence, not money, is what really matters: smart people are the inventors, innovators, and entrepreneurs who make things happen: “The strongest force shaping politics is not blood or money but ideas.”* Big movements in world history are propelled by big ideas, and “the people who influence government the most are often those who generate compelling ideas.” If true, ideas do matter and justice is possible.
According to this argument, intelligence is the great equalizer in a globalized and competitive world operating on market principles. The children of the poor can—and often do—have greater native intelligence than rich kids. Thus, a college dropout (Mark Zuckerberg) can have a bright idea, launch a social network called Facebook on the Internet, and become a billionaire in his mid-20s. Years earlier, another college dropout with an idea (Bill Gates III) started a computer software company called Microsoft and soon reached the top of the Forbes 400 list of the richest Americans. Gates remained at the top of that list in 2013.
But entrepreneurs who control billions of dollars in assets (Rupert Murdoch and the Koch brothers are a few other well-known examples) do not operate only in the business world and economy—they also invest heavily in politics and government. Do ideas still have a chance in today’s political marketplace? Do smart people get elected to high office in the same way as they climb the corporate ladder to become CEOs and join the ranks of the super-rich? This book will challenge you to think about such questions.
And one word of caution: Don’t expect to find easy answers. And don’t expect the answers to be revealed suddenly in a burst of divine light. The role of education is to ask the right questions. The key to a life well lived is to search for the right answers—wherever that might take you.
Understanding politics is a matter of self-interest. By exploring politics, we gain a better appreciation of what is—and what is not—in the public interest.
This chapter focuses on three fundamental concepts: power, order, and justice. It also explores the interrelationships between power and order, order and justice, and justice and power.
Political power can be defined as the capacity to maintain order in society. Whenever governments promulgate new laws or sign treaties or go to war, they are exercising political power. Whenever we pay our taxes, put money in a parking meter, or remove our shoes prior to boarding an airplane, we, in effect, bow to the power of government.
When governments exercise power, they often do it in the name of order. Power and authority are closely related: authority is the official exercise of power. If we accept the rules and the rulers who make and enforce them, then government also enjoys legitimacy.
Questions of justice are often embedded in political disputes. If the public interest is not advanced by a given policy or if society no longer accepts the authority of the government as legitimate, the resulting discontent can lead to political instability and even rebellion or revolution.
Political science seeks to discover the basic principles and processes at work in political life. Classical political theory points to moral and philosophical truths, political realism stresses the role of self-interest and rational action, and behaviorism attempts to find scientific answers through empirical research and data analysis. Most political scientists specialize in one or more subfields such as political theory, U.S. government and politics, comparative politics, international relations, political economy, public administration, or public policy.
Politics matters. This simple truth was tragically illustrated by the rise of Nazism in Germany. The bad news is that sometimes war is necessary to defeat a monstrous threat to world order and humanity. The good news is that there are often political or diplomatic solutions to conflict and injustice in human affairs. It is this fact that makes the study of politics forever obligatory and essential.
Chapter 2. The Idea of the Public Good: Ideologies and Isms
· 1Define the public good.
· 2Identify the three kinds of political ideologies.
· 3Identify the five core values.
· 4Describe the difference between a liberal and a conservative, as well as how these terms have changed over time.
· 5Determine whether or not one ideology or political persuasion better guarantees freedom, justice, and democracy.
In Lewis Carroll’s classic tale Alice in Wonderland, Alice loses her way in a dense forest and encounters the Cheshire Cat who is sitting on a tree branch. “Would you tell me, please, which way I ought to go from here?” asks Alice. “That depends a good deal on where you want to get to,” replies the Cat. “I don’t much care where,” says Alice. “Then it doesn’t matter which way you go,” muses the Cat.
Like Alice lost in the forest, we too occasionally find ourselves adrift when trying to make sense of complex issues, controversies, and crises. Governments and societies are no different. Political leadership can be woefully deficient or hopelessly divided over the economy or the environment or health care or a new threat to national security. Intelligent decisions, as Alice’s encounter with the Cheshire Cat illustrates, can take place only after we have set clear aims and goals. Before politics can effectively convert mass energy (society) into collective effort (government), which is the essence of public policy, we need a consensus on where we want to go or what we want to be as a society a year from now or perhaps ten years down the road. Otherwise, our leaders, like the rest of us, cannot possibly know how to get there. There are plenty of people eager to tell us what to think. Our purpose is to learn how to think about politics.
Political Ends and Means
In politics, ends and means are inextricably intertwined. Implicit in debates over public policy is a belief in the idea of the public good, that it is the government’s role to identify and pursue aims of benefit to society as a whole rather than to favored individuals. But the focus of policy debates is often explicitly about means rather than ends. For example, politicians may disagree over whether a tax cut at a particular time will help promote the common good (prosperity) by encouraging saving and investment, balancing the national budget, reducing the rate of inflation, and so on. Although they disagree about the best monetary and fiscal strategies, both sides would agree that economic growth and stability are proper aims of government.
In political systems with no curbs on executive authority, where the leader has unlimited power, government may have little to do with the public interest.* In constitutional democracies, by contrast, the public good is associated with core values such as security, prosperity, equality, liberty, and justice (see Chapter 13). These goals are the navigational guides for keeping the ship of state on course. Arguments about whether to tack this way or that, given the prevailing political currents and crosswinds, are the essence of public policy debates.
Ideologies and the Public Good
The concept of Left and Right originated in the European parliamentary practice of seating parties that favor social and political change to the left of the presiding officer; those opposing change (or favoring a return to a previous form of government), to the right. “You are where you sit,” in other words.
Today, people may have only vague ideas about government or how it works or what it is actually doing at any given time.* Even so, many lean one way or another, toward conservative or liberal views. When people go beyond merely leaning and adopt a rigid, closed system of political ideas, however, they cross a line and enter the realm of ideology. Ideologies act as filters that true believers (or adherents) use to interpret events, explain human behavior, and justify political action.
The use of labels—or “isms” as they are often called—is a kind of shorthand that, ideally, facilitates political thought and debate rather than becoming a way to discredit one’s political opponents. One note of caution: these labels do not have precisely the same meaning everywhere. Thus, what is considered “liberal” in the United Kingdom might be considered “conservative” in the United States (see Figure 2.1).
Figure 2.1 U.S. Conservatives and British Liberals
Focus: Conservative or Liberal?
Conservatives in the United States traditionally favor a strong national defense, deregulation of business and industry, and tax cuts on capital gains (income from stocks, real estate, and other investments) and inheritances. They often staunchly oppose social spending (“welfare”) on the grounds that giveaway programs reward sloth and indolence.
By contrast, liberals tend to favor public assistance programs, cuts in military spending, a progressive tax system (one that levies higher taxes on higher incomes), and governmental regulation in such areas as the food and drug industry, occupational safety and health, housing, transportation, and energy.
Prior to the Reagan Revolution of the 1980s, Republicans championed balanced budgets and limited government. In 2012–2014, Tea Party Republicans in Congress led the fight for deficit reduction, but insisted that it be done without raising taxes on the wealthy. Democrats countered by demanding that budget cuts be offset with targeted tax increases, in particular on individual earnings in excess of $400,000 and capital gains.
Compared to Europe’s parliamentary democracies, the political spectrum in the United States is shifted to the right. Ideas and policies widely viewed as “socialist” in the United States—national health care, for example—are mainstream in European countries. Any attempts to tamper with social programs in so-called welfare states are likely to provoke a public outcry, as the anti-austerity protests that have swept across Europe (notably, Greece, Spain, Ireland, Italy, and France) in recent years attest.
The word “liberal” is frequently associated with leftist views in the United States, where talking heads on FOX News routinely call President Obama’s policy initiatives “socialist.” But “liberal” has always meant something quite different in the United Kingdom, where it originated. There the term still denotes a desire to maximize individual liberty as the first principle of good government.
Leftists in Europe often belong to socialist parties, but there is no viable Socialist Party in the United States and never has been. In France, for example, voters elected Socialist Party leader François Hollande president in May 2012 and, a month later, handed the Socialists and their allies an absolute majority in the National Assembly. (What if that were to happen in the U.S.? You can’t imagine it? That’s because it can’t happen here, which raises the question: Why?)
We’ll learn more about European politics in general, and France in particular, in Chapter 7; for now, think of Socialists winning elections as a measure of the vast differences in the way Americans and Europeans define the public good.
In this chapter, we group ideologies under three headings: antigovernment ideologies, right-wing ideologies, and left-wing ideologies. Left and Right are very broad categories, however, and there are many shades of gray. Only when the political system becomes severely polarized, as it did, for example, in Germany between the two world wars, are people forced to choose between black and white.
In the two-party system of the United States, the choice is limited to red (Republican) and blue (Democrat). After September 11, 2001, the political climate became more polarized and partisan, as reflected in the charged rhetoric of media figures like Rush Limbaugh, Glenn Beck, and Sean Hannity on the right and of Keith Olbermann and Rachel Maddow on the left. Between 2010 and 2014, the most extreme partisanship since World War II gave rise to governmental paralysis and gridlock in Congress, which in turn exacerbated political polarization and gave birth to the Tea Party movement. History provides sobering examples of what can happen when a government is dysfunctional, wealth is extremely concentrated, and people become disillusioned.
Antigovernment Ideologies
Opposition to government in principle is anarchism. The Russian revolutionary Mikhail Bakunin (1814–1876), who reveled in the “joy of destruction” and called for violent uprisings by society’s beggars and criminals, is often considered the father of modern anarchism. A close relative of anarchism is nihilism, which glorifies destruction as an end in itself rather than as a means of overthrowing the existing system or rebuilding society. In Russia during the last half of the nineteenth century, anarchists helped to precipitate the discontent that led to the 1905 Revolution, sometimes called a dress rehearsal for the 1917 October Revolution.
Ideologies of the Right
Monarchism is at the opposite end of the political spectrum. Until the twentieth century, monarchy was the prevalent form of government throughout the world. Whether they were called kings or emperors, czars or sultans, or sheiks or shahs, monarchs once ruled the world. Aristotle regarded monarchy—rule by a wise king—as the best form of government (although he recognized that wise kings, as opposed to tyrants, were very rare).
However archaic it may look to modern eyes, monarchism is not dead. Jordan, Kuwait, Morocco, Saudi Arabia, and the oil-rich Persian Gulf ministates, as well as Bhutan, Brunei, and Swaziland, are still monarchies. Jordan and Morocco are limited monarchies; in both countries, the chief executive rules for life by virtue of royal birth rather than by merit, mandate, or popular election. Most other countries that still pay lip service to monarchism are, in fact, constitutional monarchies in which the king or queen is a figurehead. The United Kingdom is the example we know best in the United States, but Belgium, the Netherlands, Spain, Denmark, Norway, and Sweden all have monarchs as titular rulers.
After World War I, fascism supplanted monarchism as the principal ideology of the extreme Right. In Germany, National Socialism—more commonly known as Nazism—was a particularly virulent form of this ideology (see Chapter 6). Predicated on the “superiority” of one race or nation and demanding abject obedience to authority, fascism exerted a powerful influence in Europe and South America from the 1920s to the 1940s. The prime examples in history are the Axis powers (Germany, Italy, and Japan) in World War II, but other instances of authoritarian regimes bearing a close resemblance to fascism—including Spain (Francisco Franco), Portugal (Oliveira Salazar), and Hungary (Miklos Horthy)—existed in this period as well.
Argentina under Juan Perón (1946–1955) closely resembled the fascist model after World War II, as did military dictatorships in Brazil, Paraguay, and several other Latin American countries. However, Perón never engaged in the kind of violence and mass repression associated with General Augusto Pinochet in Chile (1974–1990) or General Jorge Rafael Videla in Argentina (1976–1981). More recent examples include Kim Jong-Il of North Korea (who died in December 2011), Hosni Mubarak of Egypt (overthrown in the 2011 Egyptian Revolution), Muammar Qaddafi of Libya (also overthrown in 2011 as part of the wider “Arab Spring”), Omar al-Bashir of Sudan, and Bashar al-Assad of Syria (see Chapter 5).
Fascism enjoyed mass support in many countries largely because of its appeal to nationalism, ethnicity, and (in the case of Nazi Germany) race. Other ideological roots of fascism can be found in romanticism, xenophobia, populism, and even a form of hierarchical socialism (discussed below).
One of the distinguishing features of many extreme right-wing ideologies is a blatant appeal to popular prejudices and hatred.* Such an appeal often strikes a responsive chord when large numbers of people who are part of the racial or ethnic majority have either not shared fully in the benefits of society or have individually and collectively suffered severe financial reversals. In turbulent times, people are prone to follow a demagogue, to believe in conspiracy theories, and to seek scapegoats, such as a racial, ethnic, or religious minority group; an opposing political party; a foreign country; and the like. Xenophobia, an antipathy to foreigners, immigrants, and even tourists, has been on the rise in many European countries (including France, Germany, the Netherlands, and the United Kingdom) and in the United States since the 1990s. Remnants of the American Nazi Party and the Ku Klux Klan (KKK) have lingered as well.
Belief in racial superiority supplies an underlying rationale for a whole range of radical policies dealing with immigration (foreigners must be kept out), civil rights (African Americans, Jews, and other minorities are genetically inferior and do not deserve the same constitutional protections as whites), and foreign policy (threats to white America must be met with deadly force). At the far-right extreme, these groups are organized along paramilitary lines, engage in various survivalist practices, and preach violence. Although the KKK has largely faded from view, it still has die-hard followers, including some in law enforcement. In February 2009, the Nebraska Supreme Court upheld the firing of State Highway Patrol trooper Robert Henderson for his ties to the KKK. The KKK’s long history of violence toward African Americans—symbolized by the white sheets worn by its members and the crosses set ablaze at rallies—has made it synonymous with bigotry and racial intolerance.
The Religious Right
The religious right in the United States emerged as an important nationwide political force in 1980. The election as president of a conservative Republican, Ronald Reagan, both coincided with and accelerated efforts to create a new right-wing political coalition in the United States. The coalition that emerged combined the modern political techniques of mass mailings, extensive political fundraising, and the repeated use of the mass media (especially television) with a call for the restoration of traditional values, including an end to abortion, the reinstatement of prayer in public schools, a campaign against pornography, the recognition of the family as the basis of U.S. life, and a drive to oppose communism relentlessly on every front.
This movement contained a core of fundamentalist or evangelical Christians, called the New Right, who saw politics as an outgrowth of their core religious values. Beginning in the 1980s, television evangelists such as the late Jerry Falwell (who spearheaded a movement called the Moral Majority) and Pat Robertson (who ran unsuccessfully for president in 1988) gained a mass following. The far right suffered a setback in 1992 when Pat Buchanan’s presidential bid also fizzled.
The election of George W. Bush, who openly courted the fundamentalist Christian vote, was widely viewed as a victory for the religious right. Roman Catholics and Southern Baptists, along with other evangelical groups, joined forces in a new kind of coalition against what many regular churchgoers saw as an alarming upsurge in immorality and sinful behavior, including abortion, gay marriage, and the teaching of evolution in public schools. The last issue, along with stem cell research, pitted religion against science.
The Christian Coalition, another conservative group, has roots in the Pentecostal church. Boasting as many as one million members, the Christian Coalition produces and distributes a kind of morality scorecard, evaluating political candidates’ positions on key issues from the perspective of religious dogma. Its members focus on getting elected to local school boards in order to advocate for patriotism (as opposed to multiculturalism), religion, and a return to the basics in education.
The Christian Coalition’s success raised two serious questions. First, was the Christian Coalition best understood as a well-meaning effort by decent citizens to participate in the political arena or as a dangerously divisive blurring by religious bigots of the separation between church and state? Second, was it an interest group or a political party?
The 2008 and 2012 elections both pointed to the possibility that the political potency of the religious right in U.S. politics was declining. Some critics have suggested revoking the tax-exempt status of religious establishments that cross the line and transform themselves into political movements.* But the strength of religious fundamentalists, particularly in the South and Midwest, combined with an antiquated scheme of representation that gives small, sparsely populated states disproportionate voting power in Congress, makes it likely that religion will continue to play a huge role in American politics at all levels in the years to come.
The dominant ideology in the United States, Europe, and Asia today is capitalism. Even in Communist China, where Maoism remains the official ideology, capitalism is the engine driving the amazing revitalization of the economy since the death of Mao Zedong in 1976. The collapse of communism and its explicit rejection of private property, the profit motive, and social inequality was a triumphant moment for proponents of free enterprise and the free-market economy. Indeed, the Cold War was in no small measure an ideological contest between the United States and the Soviet Union over this very issue.
In the contemporary world, capitalism is the ideology of mainline conservatives; at the same time, however, it is a basic feature of classical liberalism. In the United States, it is the Republican Party that most enthusiastically embraces capitalism, although few Democrats in Congress ever dare to denounce big business. However disappointing or frustrating this fact may be to some rank-and-file voters, it is not difficult to discern the reasons for it. Capitalism is the ideology of big business, as well as of powerful Washington lobbies, including the U.S. Chamber of Commerce and the National Association of Manufacturers. It also provides the moral and philosophical justification for the often ruthless practices of multinational corporations (MNCs) such as Walmart, Microsoft, McDonald’s, and Wall Street financiers—practices that would otherwise appear to be based on nothing more high-minded than the idea that “greed is good.” (The U.S. actor Michael Douglas won an Academy Award for his performance in Wall Street, a 1987 film in which his character, the corporate raider Gordon Gekko, spoke those very words.)
What is capitalism? It means different things to different people. It can refer to an economic theory based on the principles found in Adam Smith’s Wealth of Nations (discussed later in this chapter). Or it can mean an ideology that elevates the virtues of freedom and independence, individualism and initiative, invention and innovation, risk-taking and reward for success. We can also view it as an elaborate myth system used to justify the class privileges of a wealthy elite and the exploitation of the workers who produce society’s wealth. The latter interpretation, of course, derives most notably from the writings of Karl Marx (see “Ideologies of the Left” section in this chapter).
As an economic theory, capitalism stresses the role of market forces—mainly supply and demand—in regulating economic activity; determining prices, values, and costs; and allocating scarce resources. In theory it opposes government interference, and in practice it opposes government regulation. It applauds the notion that, in the words of President Calvin Coolidge, “the business of America is business.” Capitalism’s proponents, however, often assume we have a free market operating solely on the principles of supply and demand; they seldom consider whether it really exists. In fact, the free market is a myth, useful for public relations or propaganda but not for understanding how modern economies actually work. No modern economy can function without all sorts of rules and regulations. The question is not whether rules are necessary but rather who makes the rules and in whose interests. The key to the success of a market economy is competition, not deregulation.
As an ideology, capitalism opposes high taxes (especially on business), social welfare, and government giveaways. Conservatives tend to believe wealth is a sign of success and a reward for virtue. Rich people deserve to be rich. Poverty is the fault of poor people themselves, who are lazy, indolent, and irresponsible. Relieving poverty is the job of charity and the church, not government. Capitalists also tend (or pretend) to believe in the trickle-down theory: if the most enterprising members of society are permitted to succeed and to reinvest wealth, rather than handing it all over to the tax collector, the economy will grow, prosperity will trickle down to the lower levels, and everybody will be better off.
Critics of capitalism argue that the free market is a fiction and that big business only pretends to support deregulation and the increased competition it fosters. Meanwhile, giant corporations routinely seek tax favors, subsidies, and regulatory concessions and fight antitrust legislation at every turn. Revelations of large-scale fraud and corruption in recent years, symbolized by such fallen corporate outlaws as Enron and WorldCom, had badly stained the image of U.S. business even before the financial meltdown in the fall of 2008. But then came the failure of major investment firms like Lehman Brothers, Wachovia, and Merrill Lynch in 2008, followed by bailouts of banks, automakers, and insurance giant AIG teetering on the brink of bankruptcy.
One measure of how far the corporate sector had fallen in public esteem: the insurance giant AIG (American International Group) saw its stock plunge from a 52-week high of $52.25 to a low of $0.38 in February 2009. Sensational front-page stories of fraud, misfeasance, and self-aggrandizement associated with such prominent financiers as Bernie Madoff, Robert Allen Stanford, and Jamie Dimon—the CEO of JPMorgan Chase & Company blamed for a $2 billion trading loss in 2012—further undermined public trust in business, banks, and Wall Street. Prominent pundits with national audiences—writers like Bill Moyers, Matt Taibbi, Paul Krugman, Joseph Stiglitz, and William Black—kept these stories from fading and furnished a steady stream of evidence about collusion between Washington and Wall Street, corruption in Congress, and crime in the suites.
Proponents of libertarianism generally agree with the axiom, “That government is best, which governs least.” Like classical liberals, libertarians stress the value of individual liberty; but an obsession with fighting all forms of government regulation—even measures aimed at public safety or income security for the elderly—often leads them to embrace policies at odds with logic or common sense. Thus, for example, to quote libertarian Senator Rand Paul (son of 2012 presidential candidate Ron Paul) in a 2002 letter to the Bowling Green Daily News, “a free society will abide unofficial, private discrimination, even when that means allowing hate-filled groups to exclude people based on the color of their skin.” Paul also opposed gun control, called for the United States to withdraw from the United Nations Human Rights Commission, and advocated abolishing the Federal Reserve System.
Ideologies of the Left
Left-wing ideologies propose a view of human beings living together harmoniously, without great disparities in wealth or social classes. In contrast to capitalism, these ideologies give public goods priority over private possessions. If equality is the end, state control is the means—control of everything from banking, transportation, and heavy industry to the mass media, education, and health care.
Socialism is fundamentally opposed to capitalism, which contends that private ownership and enterprise in the context of a competitive free-market economy is the best and only way to bring about prosperity. Socialism is “an ideology that rejects individualism, private ownership, and private profits in favor of a system based on economic collectivism, governmental, societal, or industrial-group ownership of the means of production and distribution of goods, and social responsibility.”*
Politics and Pop Culture: The Russians Are Coming! “007” to the Rescue
During the Cold War (1945–1991), Hollywood produced dozens of films aimed at addressing and sometimes exploiting movie-goers’ fear of Communism. Among the most famous films of this genre are Conspirator (1949) starring Elizabeth Taylor; Trial (1955); Rio Bravo (1956), starring John Wayne; The Manchurian Candidate (1962); Dr. Strangelove (1964); Seven Days in May (1964); The Spy Who Came in from the Cold (1965), based on John le Carré’s eponymous best-seller; The Russians Are Coming, the Russians Are Coming (1966); and Three Days of the Condor (1975).
Perhaps the most famous hero of the genre is “007.” Ian Fleming created the fictional character of James Bond (aka 007) in 1953 as the Cold War was building and a hot war (in Korea) was raging. The character has since been adapted for all manner of popular culture uses, especially film. Starting with Dr. No in 1962, the James Bond movies (24 so far!) now constitute the longest-running and second-highest-grossing film series in history. Skyfall (2012) and Spectre (2015) are the most recent installments.
The University of Washington Library has compiled a selective list of Cold War films (“The Red Scare: A Filmography”), which can be viewed online. Here is their brief introduction to this list:
The films produced in Hollywood before, during and after the Cold War Red Scare make for an interesting study in the response of a popular medium caught in a political firestorm…. [Some] motion pictures played a role in fueling the Red Scare, in propagandizing the threat of Communism and in a few rare and rather veiled cases, in standing up to the charges of the House Committee on Un-American Activities (HUAC)….
HUAC interrogated many film industry people. In the end, countless careers were destroyed but only ten individuals actually went to jail. This group came to be known as “The Hollywood Ten.”… An exhaustive analysis … indicated that none of the 159 films credited … to The Hollywood Ten contained Communist propaganda.
Similarly, since 9/11, countless films and TV series pander to the American public’s fascination with terrorism—and seek to profit from it (see Chapter 16).
· Spy-thrillers in book form and in films are extremely popular and often hugely profitable. They have the power to entertain us and also to shape our views of the world. Do films like the ones about Communism and the Cold War serve a useful purpose in society beyond entertainment? Is there a sharp distinction between art and propaganda? Think about it.
(Hint: It has been said that “propaganda is direct while art is reflective.” In this view, art doesn’t change us but rather makes us more aware of what we already know or think we know, and it can either intensify or challenge our preconceptions.)
Communism is sometimes used interchangeably with Marxism, named after its founder Karl Marx (1818–1883). Marx and his associate Friedrich Engels (1820–1895) envisioned a radical transformation of society attainable only by open class conflict aimed at the overthrow of “monopoly capitalism.”
Marx and Engels opened the famous Communist Manifesto (1848) with the bold assertion that “the history of all hitherto existing society is the history of class struggles.” All societies, Marx contended, evolve through the same historical stages, each of which represents a dominant economic pattern (the thesis) that contains the seeds of a new and conflicting pattern (the antithesis). Out of the inexorable clash between thesis and antithesis—a process Marx called dialectical materialism—comes a synthesis, or a new stage in socioeconomic development. Thus, the Industrial Revolution ushered in the capitalist stage of history, which succeeded the feudal stage when the bourgeoisie (urban artisans and merchants) wrested political and economic power from the feudal landlords. The laws of history—or dialectic—which made the rise of capitalism inevitable, also make “class struggle” between capitalists (the owning class) and the proletariat (the working class) inevitable—and guarantee the outcome.
Marxist theory holds that the main feature of the modern industrial era is the emergence of two antagonistic classes—wealthy capitalists, who own the means of production, and impoverished workers, the proletariat, who are paid subsistence wages. The difference between those wages and the value of the products created through the workers’ labor is surplus value, or excessive profits, which the capitalists pocket. In this way, owners systematically exploit the workers and unwittingly lay the groundwork for a proletarian revolution.
How? According to Marx’s law of capitalist accumulation, the rule is get big or get out. Bigger is always better. Small companies lose out or are gobbled up by big ones. In today’s world of mergers and hostile takeovers, Marx appears nothing less than prescient here. Eventually, the most successful competitors in this dog-eat-dog contest force all the others out, thus ushering in the era of monopoly capitalism, the last stage before the downfall of the whole capitalist system.
The widening gap between rich and poor is the capitalist system’s undoing. As human labor is replaced by more cost-effective machine labor, unemployment grows, purchasing power dwindles, and domestic markets shrink. The result is a built-in tendency toward business recession and depression. It all sounded eerily familiar in the midst of the 2008–2009 global recession.
Countless human beings become surplus labor—jobless, penniless, and hopeless. According to the law of pauperization, this result is inescapable. For orthodox Marxists, the “crisis of capitalism” and the resulting proletarian revolution are equally inevitable. Because capitalists will not relinquish their power, privilege, or property without a struggle, the overthrow of capitalism can occur only through violent revolution.
The belief that violent mass action is necessary to bring about radical change was central to the theories of Marx’s follower Vladimir Lenin (1870–1924), the founder of the Communist Party of the Soviet Union and the foremost leader of the Russian Revolution of 1917. Lenin argued that parliamentary democracy and “bourgeois legality” were mere superstructures designed to mask the underlying reality of capitalist exploitation. As a result, Lenin and his followers disdained the kind of representative institutions prevalent in the United States and Western Europe.
With the fall of communism in the Soviet Union and Eastern Europe, Marxism-Leninism has lost a great deal of its luster. Even so, the doctrine retains some appeal among the poor and downtrodden, primarily because of its crusading spirit and its promise of deliverance from the injustices of “monopoly capitalism.”* (See Ideas and Politics, Figure 2.2.) After World War II, communism spearheaded or sponsored “national wars of liberation” aimed at the overthrow of existing governments, especially in the Third World. Since the collapse of Communism in Europe, however, the revolutionary role played by the Soviet state and Marxist ideology on the world stage has given way to Islamism—not Islam, the religion, but Islamism, an anti-Western ideological offshoot that seeks to restore the moral purity of Islamic societies (see Chapter 15).
Ideas and Politics: Marx Is Dead; Marxism—Not So Much
In contrast to what happened in many European democracies, Marxism has never gained a toehold in the United States. Yet, in many other parts of the world, Marxist parties have flourished at one time or another. In Castro’s Cuba, most of Asia, and parts of sub-Saharan Africa, communist or socialist parties long dominated the political scene, and “national wars of liberation” were often spearheaded by self-avowed Marxists.
In many other countries, most notably in Western Europe, nonruling communist parties achieved democratic respectability. The communist parties of France and Italy, to cite two important examples, are legally recognized parties that regularly participate in national elections. Socialist parties are mainstream political parties throughout Europe. In the 1970s, communist party leaders in Italy and Spain led a movement called Eurocommunism. They renounced violent revolution and sought to change society from within by winning elections.
With the collapse of the Soviet Union in 1991, Marxist parties have declined but by no means disappeared. After the “Plural Left” coalition won the French parliamentary elections in May 1997, three communists were appointed to the cabinet of Socialist Prime Minister Lionel Jospin. In recent years, the elected leaders of Venezuela, Bolivia, and Nicaragua have all expressed sympathy with Marxist ideas and have embraced socialist policies. And China, now boasting the world’s second largest economy, is still a communist one-party state. In addition, four other countries continue to be communist-ruled: Cuba, Laos, North Korea, and Vietnam.
· Between 1928 and 1944 Norman Thomas was the Socialist Party’s perennial candidate for president of the United States. In 1932, in the throes of the Great Depression, he garnered 884,885 votes. Thomas famously predicted: “The American people will never knowingly adopt socialism. But under the name of Liberalism, they will adopt every fragment of the socialist program until one day America will be a socialist nation without knowing how it happened.” To what extent—if any—has this prediction come true? Why do conservatives and liberals often answer this question very differently? Think about it.
(Hint: Often the best way to win a televised debate or political argument, unfortunately, is to be highly selective—and inventive—in citing “facts” to underscore whatever strengthens the case you are making while ignoring or discrediting what doesn’t.)
Democratic Socialism, the other main branch of socialist ideology, embraces collectivist ends but is committed to democratic means. Unlike orthodox Marxists, democratic socialists believe in gradualism, or reform, rather than revolution, but they hold to the view that social justice cannot be achieved without substantial economic equality. They also tend to favor a greatly expanded role for government and a tightly regulated economy. Socialist parties typically advocate nationalization of key parts of the economy—transportation, communications, public utilities, banking and finance, insurance, and such basic industries as automobile manufacturing, iron and steel processing, mining, and energy. The modern-day welfare state, wherein government assumes broad responsibility for the health, education, and welfare of its citizens, is the brainchild of European social democracy.
The goal of the welfare state is to alleviate poverty and inequality through large-scale income redistribution. Essentially a cradle-to-grave system, the welfare state model features free or subsidized university education and medical care, generous public assistance (family allowances), pension plans, and a variety of other social services. To finance these programs and services, socialists advocate high taxes on corporations and the wealthy, including steeply progressive income taxes and stiff inheritance taxes designed to close the gap between rich and poor.
Democratic Socialism has had a major impact in Western Europe. The United Kingdom and the Scandinavian countries provide the classic examples. The welfare state became the norm in Europe after World War II, but the aftermath of the 2008 global financial meltdown put its viability to a severe test. Many European Union (EU) governments, including Greece, Ireland, Portugal, and Spain, were running huge budget deficits in 2010. Greece, in particular, teetered on the brink of bankruptcy. Indeed, the EU was forced to bail out the governments of Greece, Ireland, and Portugal with massive infusions of euros to prevent them from defaulting and potentially causing the collapse of the euro zone itself.
The chronic deficits that led to the euro crisis were—and are—in no small measure a result of the generous welfare-state benefits, including health care and pensions, in place in these countries. Attempts to economize through austerity measures (spending cuts and tax increases) met with mass protests in Greece, Spain, Italy, and France, among others. Nor has the public mood in Europe greatly improved: in 2014, thousands of anti-austerity protesters took to the streets in Paris and Rome. In Italy, for example, youth unemployment had risen well above 40%—a figure that represents both a big drain on public spending and a big loss of labor productivity. Unlike France, Spain, and other European countries, the United States has never had a strong Socialist party, nor has “socialism” ever shed the negative stigma most Americans attach to it. Even when Socialist Norman Thomas polled nearly 900,000 votes in the 1932 presidential election, that result amounted to only about 2% of the total votes cast.
Nonetheless, many entrenched social programs in the United States resemble measures associated with the welfare state. Examples include Social Security, Medicare, farm subsidies, family assistance, unemployment compensation, and federally subsidized housing. Compared with most Europeans, U.S. citizens pay less in taxes but also get far less in social benefits—except for high-level government employees, the professional military class, and, of course, members of Congress, who enjoy cradle-to-grave benefits that would make even the most ardent socialist blush.
Ideologies and Politics in the United States
U.S. politics is essentially a tug-of-war between liberals and conservatives. These two terms often generate more heat than light, but it is difficult to sort out the central issues in U.S. politics without reference to “liberals” and “conservatives.”
The Uses and Abuses of Labels
The political ideas virtually everyone in the United States embraces evolved from a 300-year-old liberal tradition in Western political thought that sees the safeguarding of individual rights as the central aim and purpose of government. Liberals and conservatives alike champion freedom and human rights, but they argue about which rights are fundamental. Liberals tend to favor narrowing the gap between rich and poor, whereas conservatives stress the virtues of free enterprise and tend toward a minimalist definition of equality (for example, equal rights = the right to vote; equal opportunity = the right to basic education and the like). In general, liberals typically define equality broadly in social, political, and economic terms; conservatives tend to confine equality to the political realm.
Several factors blur the distinction between liberalism and conservatism in the United States. First, although there are always plenty of impassioned liberals and conservatives eager to sound off on television talk shows, in practice voters tend to be more pragmatic than dogmatic. Voters want results, not rhetoric. Second, although politicians often make bold campaign promises, few are willing to go out on a limb by proposing any change that’s likely to be controversial (for example, campaign finance reform, health care reform, or bank nationalization). Third, liberals and conservatives sometimes come down on the same side, but for different reasons. For example, conservatives oppose pornography on religious and moral grounds; many liberals likewise favor restricting “dirty” books because pornography, they say, exploits and degrades women.
Liberalism and conservatism are both rooted in principles found in the political philosophy of John Locke* and enshrined in the Declaration of Independence—that all human beings are created equal; that they are endowed with certain unalienable rights, including the rights of life, liberty, and the pursuit of happiness (Jefferson’s expansion of Locke’s “right to property”); that government exists to protect these rights; and that governmental legitimacy derives from consent of the governed rather than from royal birth or divine right.
When government becomes alienated from the society it exists to serve, the people have the right to alter or abolish it. Indeed, the purposes of government are clearly spelled out in the preamble to the U.S. Constitution: to “establish Justice, insure domestic Tranquility, provide for the common defence, promote the general Welfare, and secure the Blessings of Liberty.” As we are about to discover, however, these stirring words are also an invitation to debate.
Conservatives: Economic Rights and Free Enterprise
In stressing economic rights and private property, modern-day conservatives echo and expand on arguments first propounded by political philosophers in the seventeenth and eighteenth centuries. The dawning of the Age of Democracy brought doom to Europe’s monarchies and unleashed the economic potential of a nascent middle class, thereby setting the stage for the Industrial Revolution.
John Locke (1632–1704)
Locke contributed greatly to the idea of the commercial republic, a concept that forms the core of modern conservatism. Locke is famous as an early champion of property rights. For Locke, protecting private property is one of the main purposes of government. Locke thus helped lay the foundations for free enterprise and the modern market economy, including such basic concepts as legal liability and contractual obligation.
Many earlier philosophers, from Aristotle to Thomas Aquinas (1224–1274), cautioned against excessive concern for worldly possessions. Locke, in contrast, imagined a society in which invention and innovation are rewarded, the instinct to acquire goods is encouraged, and money serves as the universal medium of exchange. Where wealth can be accumulated, reinvested, and expanded, Locke reasoned, society will prosper; and a prosperous society is a happy one.
Baron de Montesquieu (1689–1755)
Although Locke developed the general theory of the commercial republic, the French political philosopher Baron de Montesquieu, in his famous The Spirit of the Laws (1748), identified a number of specific advantages of business and commerce. In Montesquieu’s view, nations that trade extensively with other nations are likely to be predisposed toward peace because war disrupts international commerce. Montesquieu asserted that commerce would open new avenues for individual self-advancement; that focusing on wealth creation would combat religious fanaticism; and that a culture of commerce would elevate individual morality. A commercial democracy, Montesquieu believed, would foster certain modest bourgeois virtues, including “frugality, economy, moderation, labor, prudence, tranquility, order, and rule.”*
Adam Smith (1723–1790)
Following in the footsteps of Locke and Montesquieu, Adam Smith set forth the operating principles of the market economy. Known to many as the “worldly philosopher,” Smith is the preeminent theorist of modern capitalism. In his famous treatise, An Inquiry into the Nature and Causes of the Wealth of Nations (1776), Smith explored the dynamics of a commercial society free of regulations or interference from the state. Like Locke, Smith observed that self-interest plays a pivotal role in human relations:
It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own self-interest. We address ourselves, not to their humanity but to their self-love, and never talk to them of our own necessities but of their advantages.*
Smith famously theorized about the “invisible hand” of the marketplace, expressed in the law of supply and demand. This law, he argued, determines market value. Where supply is large and demand is small, the market value (or price) of the item in question will be driven down until only the most efficient producers remain. Conversely, where demand is great and supply is low, the market value of a given item will be driven up. Eventually, prices will decline as competition intensifies, again leaving only the most efficient producers in a position to retain or expand their share of the market. In this way, the market automatically seeks supply-and-demand equilibrium.
Smith believed self-interest and market forces would combine to sustain economic competition, which in turn would keep prices close to the actual cost of production. If prices did rise too much, producers would be undercut by eager competitors. In this view, self-interest and market conditions make prices self-adjusting: high prices provide an incentive for increased competition, and low prices lead to increased demand and hence increased production. Finally, Smith’s free-enterprise theory holds that individuals voluntarily enter precisely those professions and occupations that society considers most valuable because the monetary rewards are irresistible, even if the work itself is not particularly glamorous.*
Taken as a whole, these concepts define what has come to be known as laissez-faire capitalism, or the idea that the marketplace, unfettered by central state planning, is the best regulator of the economy.* Smith argued for the existence of a natural harmony of interests: what is good for the happiness of the individual is also good for society, and vice versa, because people will unintentionally serve society’s needs as they pursue their own self-interests without government intervention.
Conservatives are generally opposed to big government and heavy taxes, especially on business and wealth. Conservative political parties and politicians typically appeal to commercial interests and corporate industry, as well as to voters for whom traditional family and religious values are paramount. Critics fault conservatives for opposing state regulation even at the cost of consumer safety, environmental protection, or minority rights.
Conservatives argue that the quest for individual affluence brings with it certain collective benefits, including a shared belief in the work ethic, a love of order and stability, and a healthy self-restraint on the part of government. These collective “goods” are most likely to result, they argue, from a political system that ensures the best possible conditions for the pursuit of personal gain.
Two of the most prominent conservative thinkers in the post–World War II period were Friedrich Hayek (1899–1992) and Milton Friedman (1912–2006). Hayek, a leading member of the Austrian School of Economics, won the Nobel Prize in Economics (with Gunnar Myrdal) in 1974. His book, The Road to Serfdom (1944), inspired a generation of Western free-market economists, and Hayek became an iconic figure for libertarians in the United States.
Friedman was the main architect behind the restoration of classical liberalism as the official economic orthodoxy in the United States (under Ronald Reagan), the United Kingdom (under Margaret Thatcher), Germany (under Helmut Kohl), and beyond. According to The Economist, Friedman “was the most influential economist of the second half of the twentieth century … possibly of all of it.”* In his most famous work, Capitalism and Freedom (1962), Friedman argued forcefully that the secret to political and social freedom is to place strict limits on the role of government in the economy. In other words, capitalism is the key to democracy. In this view, it is desirable to minimize government by assigning to the public sector only those few functions that the private sector cannot do on its own—namely, to enforce contracts, spur competition, regulate interest rates and the money supply, and protect “the irresponsible, whether madman or child.”
Liberals: Civil Rights and Social Justice
Liberals tend to hold civil rights most dear. They are often vigorous defenders of individuals or groups they see as victims of past discrimination, including racial minorities, women, and the poor. Rightly or wrongly, liberals are often associated with certain social groups and occupations such as blue-collar workers, minorities, gays and lesbians, feminists, intellectuals, and college professors. In general, liberals favor governmental action to promote greater equality in society. At the same time, however, they oppose curbs on freedom of expression, as well as efforts to “legislate morality.”
In the classical liberal view, respect for the dignity of the individual is a seminal value. In his treatise On Liberty (1859), John Stuart Mill eloquently stated the case for individualism:
He who lets the world, or his own portion of it, choose his plan of life for him, has no need of any other faculty than the ape-like one of imitation. He who chooses his plan for himself, employs all his faculties. He must use observation to see, reasoning and judgment to foresee, activity to gather materials for decision, discrimination to decide, and when he has decided, firmness and self-control to hold to his deliberate decision…. Human nature is not a machine to be built after a model, and set to do exactly the work prescribed for it, but a tree, which requires to grow and develop on all sides, according to the tendency of the inward forces which make it a living thing.*
Mill was at pains to protect individuality from the stifling conformity of mass opinion. Democracy by its very nature, Mill argued, is ill-equipped to protect individuality, as it is based on the principle of majority rule. Thus, following Mill, liberals point out that defenders of majority rule often confuse quantity (the number of people holding a particular view) with quality (the logic and evidence for or against it) and equate numerical superiority with political truth. In a political culture that idealizes the majority, dissenters are often frowned on or even persecuted.
Liberals value individualism as the wellspring of creativity, dynamism, and invention in society, the source of social progress. Protecting dissent and minority rights allows a broad range of ideas to be disseminated; keeps government honest; and sets up a symbiotic relationship between the individual and society, one that benefits both.
Differences Essential and Exaggerated
Liberals and conservatives often hold contrasting views on human nature. Liberals typically accent the goodness in human beings. Even though they do not deny human vices or the presence of crime in society, they tend to view antisocial behavior as society’s fault. Thus, liberals believe that to reduce crime society must alleviate the conditions of poverty, racism, and despair. Human beings are innocent at birth and “go bad” in response to circumstances over which they have no control. If you are raised in a violent, drug-infested, inner-city neighborhood with inadequate police protection, you are far more likely to turn to a life of crime than if you are raised in a comfortable and safe middle-class neighborhood in the suburbs.
Conservatives take a dimmer view of human nature. They argue that human beings are not naturally virtuous; that coercion, deterrence, and punishment are necessary to keep people in line; that individuals differ in motivation, ability, moral character, and luck; and that it is not the role of government to minimize or moderate these differences. Consequently, conservatives are seldom troubled by great disparities in wealth or privilege. By the same token, they are generally less inclined to attribute antisocial behavior to poverty or social injustice. There will always be some “bad apples” in society, conservatives argue, and the only solution to crime is punishment. Liberals, on the other hand, maintain that alleviating poverty and injustice is the best way to reduce crime and that punishment without rehabilitation is a dead end.
Is change good or bad? Liberals generally take a progressive view of history, believing the average person is better off now than a generation ago or a century or two ago. They adopt a forward-looking optimism about the long-term possibilities for peace and harmony. As they see it, change is often a good thing.
Conservatives, by contrast, look to the past for guidance in meeting the challenges of the present. They are far less inclined than liberals to equate change with progress. They view society as a fragile organism held together by shared beliefs and common values. Custom and convention, established institutions (family, church, and state), and deeply ingrained moral reflexes are the keys to a steady state and stable social order. Like society itself, traditions should never be changed (or exchanged) too rapidly. As Edmund Burke put it, “change in order to conserve.”
The differences between liberals and conservatives, though not insignificant, can be (and often are) exaggerated. Liberals and conservatives share a fundamental belief in the dignity of the individual; freedom of speech, the press, and religion; equality of opportunity; and other important values. In recent years, however, conservatism in U.S. politics has become associated with extreme right-wing groups and voting blocs such as the Tea Party movement, religious fundamentalists, and anti-immigration “nativists.”
In the summer of 2011, political commentator Fareed Zakaria lamented:
From Aristotle to Edmund Burke, the greatest conservative thinkers have said that to change societies, one must understand them, accept them as they are and help them evolve.
Watching this election campaign, one wonders what has happened to that tradition. Conservatives now espouse ideas drawn from abstract principles with little regard to the realities of America’s present or past.*
The “Values Divide” and the War on Terror
The tension between liberals and conservatives escalated into what came to be called a “culture war” or “values divide” in the 1980s.* In 1994, Newt Gingrich, soon to become Speaker of the House, launched the “Contract with America”—a conservative agenda aimed at preventing tax increases and balancing the federal budget, as well as a series of congressional reforms. In 2001, a new divide was opened after the September 11 attacks. The ensuing “war on terror” was framed within a neoconservative worldview and carried out by a president intent upon making homeland security and a crusade against international terrorism the twin pillars of U.S. policy.
Deep divisions over emotionally charged social issues of a moral nature, such as abortion, gay marriage, and stem cell research, also contributed to the polarization of the U.S. body politic in the first decades of the new century. For many conservatives, morality is unambiguous and grounded in religion, whereas liberals tend to believe that morality is personal. Thus, many liberals oppose prayer in schools, favor broad legal and social rights for gays and lesbians, and are pro-choice on abortion. Most conservatives, on the other hand, argue that banning school prayer, allowing gay marriage, and legalizing abortion are morally wrong. Liberals counter that policies denying individual choice violate what is morally right.* For liberals, tolerance of diversity is a moral imperative; for conservatives, giving legal sanction to abortion and LGBT (Lesbian, Gay, Bisexual, and Transgender) marriage is itself immoral.
Drowned out in the din of angry voices were the voice of reason and the spirit of mutual tolerance. Lost, too, was a willingness to ask fundamental questions: What does the First Amendment separation of church and state mean? Does it work only in one direction—to protect religion from interference by the state? Or does it also protect the state and its political processes—above all, elections—from interference by tax-exempt (and thus, in effect, subsidized) religious organizations?
Conservatives have traditionally placed little trust in government, believing that less is more. Where tax-funded public programs are necessary, conservatives argue, state and local governments are closer to the people and therefore better suited than the federal government to administer them. However, both major political parties have had a hand in the ever-expanding role of the federal government.
A somewhat harsh view of human nature predisposes conservatives to be tougher than liberals in dealing with perceived threats to personal safety, public order, and homeland security. Bush’s military response to the 9/11 attacks, which at first greatly boosted his popularity ratings, was in keeping with this stance.
Liberals insist that the rights of the accused be protected even if it means some criminals will escape punishment. The war on terror gave rise to a new controversy over these rights when the Bush administration refused to classify captured alleged terrorists as criminals or prisoners of war, preferring instead to create a new category of detainee—unlawful enemy combatants. As such, the government said, these people were entitled to none of the legal protections provided in the U.S. Constitution or under international law.
Conservatives typically do not share liberals’ concern for protecting provocative speech, especially when they perceive the speakers as “radicals.” Thus, in the war on terror, liberals expressed alarm at provisions of the Patriot Act that allow for increased surveillance powers, warrantless searches and seizures, and, in general, invasions of personal privacy long held to be barred by the Fourth Amendment. Section 215 of the act gives FBI agents pursuing an antiterrorism investigation broad power to demand personal information and private records from citizens. The law also contains a gag rule prohibiting public comment on Section 215 orders. Not surprisingly, libertarians have joined liberals in objecting to what they see as a blatant violation of the Bill of Rights (especially the First and Fourth Amendments).
Although Section 215 was set to expire in December 2009, President Obama backed its reauthorization. The Obama administration caused a furor in November 2010 when it authorized the Transportation Security Administration (TSA) to conduct full body scans of airline passengers—including pat downs.
President Obama vowed to pursue a very different (and more liberal) view in both foreign and domestic policy than his predecessor. But Obama’s critics lament that he has failed to keep his promises—to close the Guantanamo prison (the notorious “Gitmo”), for example, or to renounce the practice of “extraordinary rendition,” whereby suspected terrorists were grabbed anywhere in the world and taken to secret detention centers to be harshly interrogated and even waterboarded (a form of torture).
In foreign affairs, liberals tend to favor reduced defense spending, whereas conservatives are more apt to follow the adage, “Fear God and keep your powder dry.” But, again, this generalization breaks down on close examination. Indeed, the so-called Blue Dog Democrats in Congress (Democrats who identify themselves as either moderate or conservative) are no less opposed to cuts in defense spending than most Republicans.
Choosing Sides versus Making Choices
Politics is often called a game, as in the “game of politics” or the “political game.” In games, we typically choose sides; we cheer for our team and celebrate when we win. Most games have clear winners and losers. Ties are possible in European football (soccer) but not in most other sports.
Politics also has winners and losers—for example, in elections. But outcomes are not so simple or clear-cut. Winning an election means bearing the burdens of government as well as gaining power. When the winners abuse that power or use it for personal gain or make bad decisions with disastrous consequences, the whole country is the loser; and trust in government is damaged. Thus, choosing sides is a necessity, but it is not the same thing as making wise choices. One can side with conservatives and still vote for a Democrat, or vice versa. The character and qualifications of candidates are, arguably, more important than whether the candidate is a Republican or a Democrat.
Many voters in the United States prefer not to be identified as members of either major party. Independents tend to choose sides one election at a time. Whereas 30% of the U.S. electorate classified themselves as independent only a decade or so ago,* today that number has climbed to over 40%. For this reason, getting the “swing vote” has become crucial to winning elections in recent times. Independents tend to choose which candidate or party to support on the basis of issues, such as the state of the economy, taxes, or health care reform, rather than on ideology. In politics, unfortunately, common sense is often uncommon—the exception rather than the rule.
We can easily fall into the trap of believing there are two (and only two) sides to every argument—one right and the other wrong. But the more adamant or partisan each side becomes, the more likely it is that the truth will elude both—that it will be found somewhere in the gulf between the two extremes. Why? Because politics, like life itself, is too complicated to be reduced to pat answers, populist slogans, or simple solutions.
Governments seek to attain certain social and economic goals in accordance with some concept of the public good. How vigorously, diligently, or honestly they pursue these goals depends on a number of variables, including the ideology they claim to embrace. An ideology is a logically consistent set of propositions about the public good.
We can classify ideologies as antigovernment (anarchism, libertarianism), right-wing (monarchism, fascism), or left-wing (revolutionary communism, Democratic Socialism, radical egalitarianism). U.S. politics is dominated by two relatively moderate tendencies that are both offshoots of classical liberalism, which stresses individual rights and limited government. It is surprisingly difficult to differentiate clearly between these two viewpoints, principally because so-called liberals and conservatives in the United States often share fundamental values and assumptions. Conservatives stress economic rights; liberals emphasize civil rights. Conservatives are often associated with money and business, on the one hand, and religious fundamentalism, on the other; liberals are often associated with labor, minorities, gays and lesbians, feminists, intellectuals, and college professors. However, these stereotypes can be misleading: not everybody in the business world is conservative, and not all college professors are liberal.
Liberals look to the future, believing progress will ensure a better life for all; conservatives look to the past for guidance in dealing with problems. Liberals believe in the essential decency and potential goodness of human beings; conservatives take a less charitable view. These differences are reflected in the divergent public policy aims of the two ideological groups.