Facing the New Millennium
by Michael Flamm

In 1941, on the eve of Pearl Harbor, Time magazine publisher Henry Luce predicted that the twentieth century would become known as the "American Century." By many measures he was correct. During the next sixty years, the United States rose to a position of global military, economic, and cultural preeminence. Along the way it waged and won—with the assistance of allies—two great conflicts against opposing ideologies, first fascism and then communism. As the millennium approached, the nation enjoyed relative peace and prosperity even as it experienced considerable social and technological change. The road ahead appeared clear, with democracy and capitalism poised to spread across the world.
In 1991, five decades after Luce offered his prediction, the Soviet Union was dissolved. The formal end to the Cold War removed the old danger of nuclear destruction. But the United States and the international community now faced new threats from the resurgence of ethnic nationalism, the revival of religious fundamentalism, and the rise of global terrorism. In Central Africa and the Balkans, "ethnic cleansing" claimed the lives of millions, while in the Middle East tensions remained high between Israelis and Palestinians as well as among the different sects and branches of Islam. During the 1990s, the US also bombed Saddam Hussein's Iraq and launched missile strikes against Osama bin Laden, a wealthy Saudi and Islamic extremist who had formed a terrorist organization known as al-Qaeda. In short, the world remained a dangerous place, but most Americans paid little attention, in part because they were insulated from the threat, in part because they were distracted by the economic growth and social change that surrounded them.
The future seemed bright at the start of the new century. Then three events made it seem far less promising and far more threatening. In 2000, the presidential contest between Vice President Al Gore and Texas Governor George W. Bush ended in a virtual deadlock, with the Supreme Court determining the winner in a disputed ruling. In 2001, terrorist attacks on the World Trade Center and the Pentagon shocked the nation. Then, while the US was still fighting the Taliban and al-Qaeda in Afghanistan, President Bush made the controversial decision to go to war against Iraq in 2003. At first most Americans welcomed the opportunity to remove the brutal dictator Saddam Hussein from power, and Bush was narrowly reelected in 2004. But when American troops found themselves embroiled in an ambiguous conflict with no end in sight, public opinion shifted against the war and the President, who was widely and deeply unpopular by the end of his second term.
By 2008 the collapse of the housing market had triggered the worst financial crisis in the United States since the Great Depression. The domestic downturn had also ignited a worldwide recession that served as a reminder of how interconnected nations and peoples had become in the era of globalization. And it had made possible the election of the first black president in American history, Illinois Senator Barack Obama, a liberal Democrat who promised "change we can believe in." Nevertheless, the new president faced a daunting set of economic and strategic challenges as he sought to extend the American Century into the twenty-first century.
The Clinton Presidency
Bill Clinton, a Democrat from Arkansas, entered the White House with the economy still struggling to recover from recession, but it soon rebounded and raced through the remainder of the decade. From 1994 to 2000 it grew—sometimes substantially—in every quarter of every year. It was, ultimately, the greatest peacetime economic expansion in American history. Inflation was minimal even as wages and corporate profits rose. In 1998, the federal government achieved its first budget surplus in nearly thirty years, and unemployment fell to its lowest level in almost three decades. Meanwhile, the stock market reached new highs and served as a powerful engine of economic growth.
The main cause of the economic boom remains a matter of debate. Republicans contend that it was the tax cuts of the Reagan years that laid the foundation for the prosperity. Democrats assert that it was Clinton’s commitment to reduced deficits and free trade, symbolized by the North American Free Trade Agreement (NAFTA). Others point to the role of the Federal Reserve, which under the leadership of Chairman Alan Greenspan kept interest rates low, and the impact of personal computers, which spurred productivity in the workplace and kept inflation in check even as wages rose. And still others note that energy costs remained low and stable, as they had in the 1980s. Finally, with the spread of globalization, many US companies were able to reduce labor costs by lowering wages or shifting plants and production overseas.
What is clear, however, is that the gains of the 1990s, like those of the 1980s, were unevenly distributed. While great fortunes were made by wealthy individuals, great struggles were experienced by the bottom 20 percent of Americans, whose income fell in the 1980s and 1990s. With the loss of manufacturing jobs and the weakness of labor unions, education became a critical factor in how individuals fared in the "new economy": the income of high school graduates fell while the income of college graduates, especially those with advanced degrees or specialized skills, rose. By 1999 the average chief executive earned roughly 420 times as much as the average employee, whereas in 1980 the ratio was only about 42 to 1.
Social, Cultural, and Technological Trends
On the eve of the millennium, the United States was changing in other profound ways. Science was providing breakthroughs in genetics, which offered immense possibilities for medical treatments and incredible opportunities for the biotechnology industry. But the genetic engineering of plants, animals, and humans also raised troubling moral and ethical questions, which remained largely unresolved. Meanwhile, technology was transforming how information and ideas were transmitted and consumed. Americans now had more entertainment options than ever before, but some worried that this abundance of choices was making politics more divisive and families less cohesive. Perhaps most fundamental of all, the population was growing, aging, and becoming more diverse.
In the last two decades of the twentieth century, the population of the US increased significantly. But it was not the result of another baby boom similar to the one of the 1950s and 1960s. In fact, the birth rate declined substantially after the 1970s. This trend, combined with better medical treatment and improved life expectancy, led to a large expansion in the number of elderly, whose political influence grew correspondingly. Yet the major source of population growth was immigration, both legal and illegal. Between 1970 and 2000 an estimated 28 million immigrants—21 million legal and 7 million illegal—arrived in the US, more than doubling the percentage of foreign born from 4.7 percent to 10.4 percent. The wave of immigration after 1970 was the largest of the twentieth century, and it altered the face of America. Indeed, if current trends continue it is probable that by 2050 whites of European ancestry will constitute less than 50 percent of the total population.
Two groups—Latinos and Asians—were most responsible for this development. Immigrants from Latin America—and Mexico in particular—constituted a disproportionate share of both legal and illegal immigrants. By 2000 Latinos were, by some measures, the largest single ethnic group in the US, although they were also a diverse and artificial "group" since Cubans, Puerto Ricans, Guatemalans, Nicaraguans, Dominicans, and Mexicans often had little in common except for their language. The same was true of Asian immigrants, who in the 1980s and 1990s comprised almost 50 percent of the legal newcomers. Again, Chinese, Japanese, Korean, Thai, Vietnamese, and Filipino immigrants had little in common except for their region of origin. But by 2000 there were twice as many Asians in the US as in 1985. Like Latinos, they were concentrated in states on the coasts and in the Southwest—California’s population, for example, was 27 percent foreign born by 2000. Not surprisingly, immigration was increasingly a hot political issue in those parts of the United States.
While the impact of immigration was concentrated, the impact of technology was diffuse. By 2000 the world in which most Americans lived had undergone a dramatic change from earlier decades. The rise of the personal computer in the 1980s transformed the workplace. But what made computers ubiquitous and indispensable in the 1990s was the emergence of the Internet, a global network of computer networks. Originally developed by the Advanced Research Projects Agency (ARPA) in the Department of Defense, the ARPANET connected a total of twenty-three host computers by 1971. Thirty years later, after the federal government had withdrawn from the project, the ARPANET had evolved into the Internet, with an estimated 625 million users, including more than 180 million in the US alone. It is difficult to overstate the impact of this development—in less than ten years, the Internet revolutionized communication and commerce as email and online browsing displaced letter writing and window-shopping.
At the same time, the Internet was and is part of a larger shift from a culture of consolidation to a culture of fragmentation. In the world of media, the trend was from broadcasting to narrowcasting, symbolized by the proliferation of websites that catered to almost every interest and ideology. For much of the twentieth century, mass-circulation magazines like Time and Life had dominated the print market, while mass-audience radio and television networks like NBC and CBS had dominated the airwaves. But from 1970 to 2000, the mass audience splintered due to the development of niche publications and cable television, which offered families hundreds of channels so that every member could watch the program of their choice by themselves. At the same time, new inventions like the Walkman and the iPod permitted individuals to customize their music and listen to it on their own without having to accommodate the desires of others.
No longer did individuals or families have to gather at particular times to watch particular programs. Electronic devices like video cassette recorders (VCRs) and digital video recorders (DVRs) enabled audiences to "time shift"—to record programs for later viewing—which altered the ratings calculations of television executives and alarmed advertisers, who rightly suspected that fewer viewers were paying close attention to their commercials. Where once the final episodes of popular series such as M*A*S*H, Cheers, and Friends had attracted huge audiences, it now seemed as though the only truly shared national experience was the Super Bowl, which continued to command a vast viewership. Some predicted that it was only a matter of time before television networks like NBC and CBS and daily newspapers across the country either disappeared or shifted to some form of on-demand system in which viewers or "subscribers" paid only for what they watched or read.
Political Developments
These cultural and technological trends had political implications. By the 1990s it was becoming increasingly common for individuals to listen only to radio stations, watch only cable channels, and read only newspapers or websites that confirmed their existing beliefs. Known as the "echo chamber effect," this development was blamed by some commentators for the increasing polarization of American politics, because studies showed that when like-minded individuals gathered in real or virtual groups they tended to develop more extreme beliefs and to become less tolerant of divergent ideas.
This ideological polarization was matched by growing partisanship between Democrats and Republicans. By the end of the twentieth century, the willingness and ability of the two parties to cooperate and compromise had eroded. One cause was a political shift in the Republican Party, which had moved to the right. A second and related cause was a structural shift in Congress, where the vast majority of House members now occupied safe seats that were ideologically homogeneous (either overwhelmingly liberal or conservative). Consequently, there was little reason or incentive for representatives of either party to practice bipartisanship or take moderate positions on controversial issues like health care—on the contrary, it could lead to a challenge in the primary from the left (for a Democrat) or the right (for a Republican).
Political events were a third cause (and effect) of increasing partisanship. Clinton and the Democrats in Congress enjoyed a fair number of legislative successes during his first term. In addition to reaching a deal to raise taxes and reduce spending, which shrank the budget deficit, they were able to impose a ban on the sale of assault weapons (a significant victory for gun-control advocates) and enact the Family and Medical Leave Act, which entitled some workers to unpaid leave for childbirth, adoption, or family medical emergencies. The White House also eased abortion restrictions and protected wilderness areas by executive order. Together, Clinton and Congress raised the minimum wage, expanded the student-loan program, and created AmeriCorps, a program modeled on the Peace Corps and intended to give students of all ages a chance to earn money for their college or graduate education through community service. Finally, the administration expanded the Earned Income Tax Credit for the working poor, which eventually benefited tens of millions of families. It was, perhaps, the most important anti-poverty measure since the 1960s.
These substantial achievements were, however, overshadowed by the health care fiasco, the great failure of Clinton's first term. Faced with rising numbers of uninsured Americans and rapidly escalating medical costs, the President asked First Lady Hillary Rodham Clinton to chair a task force that would design a plan to provide affordable health care to all Americans. Meeting in private and without input from Congress, she put together a complex and ambitious proposal that conservative critics and insurance industry advocates charged would lead to higher taxes, rationed care, and less choice of treatments or doctors for consumers. Facing a barrage of opposition, the measure was dead on arrival in Congress, and the dream of universal health care remained unrealized. The Clinton plan also enabled conservative Republicans to paint the President as a liberal Democrat who favored traditional tax-and-spend measures.
In the 1994 midterm elections, Republicans gained control of Congress for the first time in forty years. The budget showdown with President Clinton that followed led to a government shutdown in late 1995, a major miscalculation by conservatives in Congress, who soon found that the public blamed them for the stalemate more than the President. Earlier that year, a terrible tragedy had also proved a political blessing for Clinton. Back in April 1993, federal agents had stormed the compound of an armed religious sect in Waco, Texas, and more than eighty members of the group had died. On the second anniversary of the raid, anti-government extremists detonated a truck bomb next to a federal building in Oklahoma City. The explosion killed 168 people, including nineteen children and babies in a day-care center near the loading zone where the rental truck was parked. In a powerful and moving speech, Clinton declared that the bombing was "an act of cowardice and it was evil." Americans overwhelmingly agreed—and many began to recoil from the more extreme anti-government rhetoric employed by some conservatives, which helped Clinton regain his political footing.
In the fall of 1996, the President rolled to an easy and substantial victory over Kansas Senator Bob Dole, although the Republicans retained control of Congress. Two years later, independent counsel Kenneth Starr, a former official in the Justice Department during the Reagan years, released a salacious report detailing Clinton's affair with White House intern Monica Lewinsky. House Republicans then voted to approve two articles of impeachment (for perjury and obstruction of justice) related to Clinton's effort to conceal the relationship. The matter next moved to the Senate, where the first trial of a president since Andrew Johnson in 1868 took place. The trial ended in acquittal: on neither article were the Republicans able to muster even a simple majority, let alone the two-thirds margin needed for conviction, in large part because the First Lady opted to stand by her husband and the President's approval ratings remained high thanks to the strong economy. Nevertheless, the scandal distracted Clinton and consumed the last two years of his term.
The impeachment controversy also had a lingering impact on the 2000 election. When it began, the stakes seemed low. Most Americans were content with their lives and confident about the future. On election night, Democratic Vice President Al Gore of Tennessee won the popular vote over Republican Governor George W. Bush of Texas, but the Electoral College was undecided because Florida was too close to call. After weeks of political charges and legal challenges, the US Supreme Court overruled the Florida Supreme Court, halted the recount, and awarded the election to Bush by a margin of 5-4. For the first time since 1888, a candidate who had lost the popular vote assumed the presidency.
The Bush Presidency and 9/11
Many voters, especially Democrats, were bitter and angry at the outcome, but most Americans accepted it. After all, the nation was at peace and the economy remained buoyant. Taking advantage of the opportunity, Bush immediately proposed a massive tax cut ($1.35 trillion over ten years), which would, he asserted, stimulate investment and promote growth. Critics countered that it would reward the wealthy, erase the surplus, and restore the deficits (by 2006 the national debt had grown from $5.7 trillion to $8.3 trillion, due in part to the Iraq War as well). Despite opposition in Congress, the measure passed, and Bush signed into law one of the largest tax cuts in American history. Yet he remained unpopular with almost half of all Americans, and the nation remained divided, until a fateful day in September 2001.
On the bright and clear morning of September 11, nineteen terrorists affiliated with Osama bin Laden's al-Qaeda network hijacked four commercial airliners. In the deadliest attack ever launched on American soil, two of the planes crashed into New York City's World Trade Center, whose twin towers were a powerful symbol of the nation's economic might. The third plane struck the Pentagon, and the fourth, apparently headed for the Capitol or the White House, crashed in a field in Pennsylvania after the passengers staged a heroic rebellion. In all, nearly 3,000 people died in a tragedy that shocked and stunned the nation and the world, even though global terrorism was hardly a new phenomenon (the World Trade Center had, for example, also been bombed by Islamic extremists in 1993).
Some of the hijackers had received training in Afghanistan, where bin Laden had a base camp and where the fundamentalist Taliban had established an Islamic state in the mid-1990s, several years after the withdrawal of the Soviet Union in 1989. Most of the terrorists were from Saudi Arabia, an ally of America but also the home of bin Laden and a source of funding for his cause. All were Islamic extremists who believed in the need for jihad (holy war) against the infidels, whether Jews, Christians, or more moderate Muslims. Their grievances included the presence of US troops in Saudi Arabia (a vestige of the 1991 Gulf War), US support for Israel, US aid to oppressive Arab regimes in Egypt and Saudi Arabia, and the pervasive influence of American culture and values throughout the Middle East and the Islamic world.
The terrorist attacks of 9/11 shattered the sense of insulation and invulnerability that most Americans had felt since the demise of the Soviet Union a decade earlier. At first they rallied around Bush and demonstrated a rare sense of unity and purpose. In October 2001, the United States and Great Britain began to bomb Afghanistan as part of Operation Enduring Freedom. American Special Forces also aided Muslim fighters of the Northern Alliance, a coalition of Afghans that in December drove the Taliban from power. But bin Laden avoided capture, and many al-Qaeda supporters remained at large in neighboring Pakistan (where he was finally killed in 2011 by a team of Navy SEALs and CIA operatives). In 2002 the Afghans installed a new government, but by 2008 the Taliban were again a serious threat and the country as a whole remained a powder keg of poverty, violence, and extremism despite the continued presence of more than 25,000 American troops (who remained there as of 2012).
Domestic Matters
At home, the Bush administration moved swiftly to strengthen executive authority—a long-time goal of Vice President Dick Cheney—and to curtail civil liberties in the name of national security. In October 2001, Congress overwhelmingly approved the USA Patriot Act, which permitted the government to monitor suspected terrorists and gather personal information from a range of sources, including public libraries and private communication via email or telephone. In November 2002, Congress also authorized the creation of the Department of Homeland Security, which combined employees from more than twenty federal agencies and represented the greatest reorganization of the executive branch since the early years of the Cold War.
Although the war on terror was the top priority of the Bush administration, it had a domestic agenda beyond tax cuts. At times, the White House acted with partisan disregard for Democratic opposition. For example, Bush used his executive authority to weaken a wide range of environmental regulations in an effort to increase energy production and promote economic growth. He also appointed two staunch conservatives to the Supreme Court. The first, in 2005, was John Roberts, a member of the Reagan Justice Department who replaced William Rehnquist as chief justice. The second, confirmed in early 2006, was Samuel Alito, a federal appeals court judge and graduate of Yale Law School who replaced Sandra Day O'Connor. Their nominations left the liberal minority on the Court even more embattled.
At other times, Bush was willing to cross the aisle and work with Democrats. In 2002, he signed into law the No Child Left Behind (NCLB) Act, which significantly expanded the federal role in public education. In return for federal aid, which never reached the levels promised, states had to require annual testing in schools, penalize those schools where students scored poorly, and permit parents to transfer their children out of schools identified as "failing." Advocates hailed the greater accountability the new law promised to bring to public education. Critics complained that constant testing hindered creative teaching and called NCLB yet another unfunded mandate, noting that poor schools rarely received the assistance they needed, especially when it came to the stated objective of placing a qualified teacher in every classroom. In 2003, the President also received bipartisan support when he added a prescription drug benefit to Medicare. The measure was generally popular, especially with the elderly, but liberals complained that it failed to provide comprehensive coverage, while conservatives noted that it expanded an already underfunded entitlement program and would add at least $400 billion to the federal debt over the next ten years.
The Second Gulf War
In January 2002, President Bush delivered his first State of the Union message and declared—in rhetoric that echoed the language of World War II—that Iraq, Iran, and North Korea formed an "axis of evil." The White House also implied that Saddam Hussein had close ties to al-Qaeda and was complicit in the attacks of 9/11 (the case was at best circumstantial and is widely disputed). And the administration charged that the Iraqi leader possessed "weapons of mass destruction"—biological, chemical, or nuclear weapons that he might use or place in the hands of terrorists. As of 2012, there is no compelling evidence that those weapons ever existed. Nevertheless, in October 2002 Congress authorized the use of force against Iraq, and in November the United States persuaded the UN Security Council to issue a resolution demanding that Hussein disarm or face "serious consequences." When the administration concluded that Hussein was not complying fully with the renewed UN inspections, Bush prepared to invade despite opposition from the Arab world and from Russia, China, Germany, and France.
In March 2003, the United States and Great Britain invaded Iraq along with thirty or so other nations, few of which were traditional allies or major powers. The coalition forces rapidly routed Hussein's army and deposed the Iraqi dictator, who was captured nine months later. In May, Bush declared an end to major combat operations from the deck of the carrier USS Abraham Lincoln. Draped behind him was a banner that read "Mission Accomplished." But in fact the war was not over—as of 2011, more than 99 percent of the military and civilian casualties in Iraq had occurred after the banner appeared, and it remained unclear whether the country would ever become the model of stability and democracy in the Middle East that many in the administration had predicted.
By 2006 polls showed that a majority of Americans believed the war in Iraq was a mistake. It had cost thousands of American lives and hundreds of billions of American dollars (experts predicted that the war would eventually cost trillions of dollars). It had diverted attention from the war against terror in Afghanistan and damaged relations with allies, which complicated efforts to deal with nuclear threats from Iran and North Korea—the other members of the "axis of evil." And it had drained the reservoir of goodwill that the United States had enjoyed in most of the world since World War II, and especially since 9/11. Anti-Americanism now flourished around the globe, particularly when graphic photos of prisoner abuse at the Abu Ghraib prison in Iraq found their way to the Internet and Arab cable television channels like al-Jazeera.
The Conservative Ascendancy
Fortunately for Bush, his reelection campaign in 2004 took place at a time when the war and occupation in Iraq retained some support. As an incumbent wartime president, he also had an inherent advantage on the critical issue of national security, which he sought to maximize through relentless partisanship. In an effort to shield themselves on that issue, the Democrats nominated Massachusetts Senator John Kerry, a highly decorated Vietnam veteran as well as a liberal critic of the President's interventionist foreign policies. By contrast, Bush had spent the Vietnam War in the Texas Air National Guard. Nevertheless, the Republicans played to their strength and charged that Kerry and the Democrats were un-American "Defeaticrats" who would "cut and run" in the face of the terrorists. In an election with the highest turnout since 1968, when anxiety about the Vietnam War was high, Bush won the popular vote by mobilizing evangelical Christian voters in record numbers. And he secured an electoral majority by carrying Ohio, where a constitutional ban on gay marriage on the ballot attracted strong support from conservative residents. Once again, a single state had doomed Democratic hopes.
After 2004, many Democrats were in despair. Bush, who few experts had predicted would win in 2000, had again defied the odds by surviving an unpopular war, a weak economy, and soaring deficits. The Republicans had also extended their control of Congress with a well-financed party structure and a well-disciplined political machine. In the aftermath, White House political adviser Karl Rove boasted that the United States was in the midst of a "rolling realignment." Eventually, he predicted, the Republicans would become the dominant party, as they had in 1896 and as the Democrats had in 1932. It was merely a matter of time, especially if the terrorist threat continued to make national security a national priority, as most observers assumed it would.
But in August 2005 Hurricane Katrina struck the Gulf Coast, causing severe destruction from Florida to Texas. The costliest hurricane in American history and the deadliest in decades, it triggered a storm surge that devastated New Orleans, where the levees failed to hold back the water. More than 80 percent of the city was flooded and hundreds of residents died. Tens of thousands suffered extreme hardship due to inadequate and ineffective evacuation plans. The failure of government was bipartisan, profound, and evident at every level—local, state, and federal. It was also extremely visible, with cable channels like CNN repeatedly airing footage of poor residents, overwhelmingly black, sweltering and suffering in the Superdome because of a lack of assistance. But when Bush briefly and belatedly arrived to tour the city and view the devastation, he praised the head of the Federal Emergency Management Agency (FEMA) for doing "a heckuva job." In an instant, Bush washed away much of what remained of his reputation for competence and compassion.
By 2006 the President’s approval ratings had fallen drastically. To make matters worse for the administration, a series of scandals in Congress made the Republicans appear corrupt to many voters, who opted to punish the party in power. In the mid-term elections, the Democrats surprised themselves and the pundits by retaking control of the House and Senate. Now it was the White House that was on the defensive. And suddenly conservative hopes for a major realignment in American politics, which had appeared so promising two years earlier, seemed wishful at best.
The Obama Ascendancy
In the fall of 2008, as Bush completed his second term, the nation experienced the worst financial crisis since the Great Depression. Across the country falling home prices triggered mortgage defaults and a credit crunch that brought waves of selling on Wall Street, where stock prices plummeted by almost 40 percent. The lack of capital also drove powerful and prominent investment banks like Lehman Brothers, which had operated for more than a century, into bankruptcy. Many other banks, large and small, were on the brink of failure. Unemployment began to climb toward heights not seen since the early 1980s. Consumer confidence collapsed, reducing consumer spending and further weakening the fragile economy.
In fairness, Bush was not solely or primarily responsible for the financial crisis, which was years in the making and the result of many causes either beyond his control or due in large part to the actions of others, such as Federal Reserve chairman Alan Greenspan. But the crisis erupted near the end of his term, when he had no real power or popularity left, which presented him with few options other than to watch from the sidelines. Like Clinton in 1994, Bush now seemed irrelevant as the American people selected a leader whom they hoped would restore optimism and faith in the future. In a sense, the 2008 election was a rerun of the 1980 election, when the nation faced hard times, came to a crossroads, and had to decide which direction to go.
In the Democratic primaries, the main contenders were Senator Hillary Clinton of New York, who was first elected in 2000, and Senator Barack Obama of Illinois, who was first elected in 2004. Obama emphasized that he had opposed the war in Iraq from the outset, unlike Clinton, who had voted for the resolution authorizing the use of force in 2002. Both candidates advocated the creation of affordable national health care, although their plans differed in the details. But Obama also presented himself as part of a new generation that would change the style and tenor of politics in Washington. For her part, Clinton countered that Obama was inexperienced and unprepared for the difficult foreign and domestic challenges that lay ahead. After a bruising and tight battle, Obama claimed the nomination.
In the Republican primaries, a host of candidates jockeyed for the nomination, with none generating great enthusiasm among Republican voters. There was Mitt Romney, a Mormon who had served as governor of Massachusetts and subsequently shifted many of his views from moderate to conservative. There was Mike Huckabee, an evangelical Christian who had served as governor of Arkansas and was a favorite of many religious conservatives. And there was Senator John McCain of Arizona, a Vietnam veteran who had spent years in Hanoi as a prisoner of war and had challenged Bush for the nomination in 2000. Although he had a reputation as a "maverick" who at times was willing to challenge his party on matters of principle (such as campaign finance reform), in 2008 he accepted Republican orthodoxy in a bid to win conservative support. He endorsed more tax cuts and promised to send more troops to Iraq if elected. After a rough start, he eventually cruised to the nomination.
In the general election, McCain had little chance, although he made matters worse by running a poor campaign. He was tied to one of the most unpopular presidents in history. He was a strong supporter of an unpopular war. He had less money to spend than Obama, who raised such enormous sums via the Internet that he opted to decline public funds. And when the financial crisis metastasized in the fall, McCain had no plan to offer save for more tax cuts. Obama’s youth and inexperience—not to mention his multiracial background and middle name (Hussein)—were issues to some, but in the end he won a clear and decisive victory, with almost 53 percent of the popular vote and 365 electoral votes. He even carried Republican states like Virginia and Indiana—not to mention the battleground states of Florida and Ohio.
The political success of Barack Obama was extraordinary. The son of a white woman from Kansas and an African man from Kenya, he was born in Hawaii—the first president to hail from outside the continental United States. A graduate of Columbia University and Harvard Law School (where he was the first African American to serve as president of the prestigious Harvard Law Review), he worked as a community organizer and law professor in Chicago before entering electoral politics and serving as a state senator from 1997 to 2004. At the age of forty-seven, less than fifty years after the Civil Rights Act of 1964 removed the legal foundations of racial segregation, he became the first black president. Whether or not his election signaled the emergence of a post-racial America, as some commentators suggested, Obama’s place in history was assured.
The American Century
But whether the election of 2008 was historic is a matter of debate. At the time, some scholars argued that it was and compared it to the elections of 1860, 1896, and 1932, when the party system was transformed. Obama, they contended, was now in a position to accomplish what Franklin Roosevelt had achieved during the Great Depression—the creation of a durable liberal coalition that would generate popular liberal policies. Other scholars were, however, more cautious. They noted that despite a historically unpopular war, a historically bad economy, and a historically unpopular incumbent, almost 60 million Americans cast their ballots for McCain and his running mate, Alaska Governor Sarah Palin, the first woman to run on a Republican national ticket. They added that Obama faced a number of structural obstacles, such as the partisan climate in Washington and the enormous deficits he inherited from Bush. Finally, these skeptics stressed—correctly in hindsight—that it was simply too soon to know if the 2008 election would realign American politics.
By 2010 it was clear that Obama would have difficulty satisfying the enormous expectations of his supporters. Despite significant if controversial achievements such as national health insurance, the economy remained stagnant, with high unemployment and low growth. The partisan political battles in Washington between Democrats and Republicans continued to rage over the federal budget, tax policy, and other issues. And the emergence of a new conservative phenomenon, the Tea Party Movement, seemed to bode ill for liberal hopes. Meanwhile, as China, India, and Brazil made impressive economic progress and gained increasing political influence, the US seemed to lose power, at least in relative terms. Increasingly, it appeared as though the tragic events of September 2001 had marked an end to the "American Century," six decades after Luce had anointed it. But what the new millennium might bring was uncertain and unpredictable. Only time would tell whether the US would remain the dominant power during the twenty-first century that it had become during the twentieth century.
Michael Flamm is Professor of History at Ohio Wesleyan University. His publications include Law and Order: Street Crime, Civil Unrest, and the Crisis of Liberalism in the 1960s (2005), Debating the 1960s: Liberal, Conservative, and Radical Perspectives (2007), Debating the Reagan Presidency (2009), and The Chicago Handbook for Teachers: A Practical Guide to the College Classroom (2011). He is currently writing a book entitled In the Heat of the Summer: Racial Unrest in New York, 1964.