The Internet Age (1980–Present)


How They Were Governed

The Filibuster

A filibuster is a parliamentary tactic used in the U.S. Senate to delay a vote. It typically consists of a long speech-making session that brings other activity to a halt. Most often used by the minority party to thwart a measure with majority-party support, a filibuster can cause hard feelings between senators. The tactic has become more common in recent decades because of increased political partisanship.

The Filibuster’s History

The filibuster has a colorful history in U.S. politics. Derived from the sixteenth-century Spanish term filibustero (pirate), the word was first associated with U.S. legislators who obstructed the normal workings of the Senate in the 1800s. Filibustering is possible because Senate tradition allows unlimited debate on most matters—a vote cannot be taken until all debate is finished. The House followed the same tradition for a time, but abandoned it as too time-consuming.

By 1917 senators had wearied of limitless debates and adopted a rule—called cloture—that allowed a filibuster to be stopped if a two-thirds majority voted to end it. Because a two-thirds majority was difficult to obtain, the rule was changed in 1975 to allow a three-fifths majority—sixty votes—to stop a filibuster. However, since then the Senate has been fairly evenly split between the two major parties, making even a three-fifths majority hard to achieve.

Filibusters can be effective for a number of reasons. They hold up action on all other Senate matters, which can be very frustrating, particularly in the final days of the session when a backlog of legislation must be considered quickly. Unless cloture is invoked a filibuster can theoretically proceed forever. Because it is so troublesome to overcome, a filibuster often results in the bill under discussion being pulled from consideration.

Filibustering Becomes More Common

Until the 1970s filibusters were infrequent, occurring on average less than once per year. They became much more common and polarizing during the following decades and came to represent the contentious relationship between the parties. In 1991, when George H. W. Bush (1924–) was president, the Democratic-controlled Senate wanted to investigate the role of Republicans—including Bush—in a decade-old controversy relating to Americans held hostage in Iran. Republicans filibustered the proposal, arguing that it was merely an attempt to embarrass Bush as he campaigned for re-election. The filibuster was successful, and the Democrats’ planned investigation was not funded.

During the 1990s filibusters were employed frequently against presidential nominees and appointees. In 1993 George Mitchell (1933–), a Democratic senator from Maine and majority leader, complained that the filibuster had become a “political party device.” During the administration of Democratic President Bill Clinton (1946–) Republican senators effectively waged a filibuster against Dr. Henry Foster Jr., who was Clinton’s nominee for U.S. surgeon general. Other filibusters were launched against Clinton’s judicial nominees. Democrats responded during the first term of President George W. Bush (1946–) with filibusters of their own against judicial nominees they deemed unsuitable. By that time the simple threat of a filibuster had become an often-used tactic.

Senator Zell Miller (1932–), a Democrat from Georgia, commented on the use of filibusters in a 2003 Wall Street Journal article, noting that “the Senate is the only place I know where 59 votes out of 100 cannot pass anything because 41 votes out of 100 can defeat it.” In 2005 Republican Senator Bill Frist (1952–) of Tennessee claimed that frequent Democratic filibustering against Bush’s judicial nominees amounted to “tyranny by the minority.” Although both parties have complained that the filibuster allows a minority to thwart the will of the majority, neither has made a serious effort to change Senate rules to end the practice.

Famous Filibusters

In 1917 Republican Senator Robert La Follette Sr. (1855–1925) of Wisconsin led a filibuster against a popular bill championed by President Woodrow Wilson (1856–1924) to allow U.S. merchant ships to be armed. Although the United States had not yet entered World War I, its merchant ships crossing the Atlantic Ocean were frequent targets of the German military. La Follette, a pacifist, and several other like-minded senators kept the bill from passing. An angry Wilson complained about “a little group of willful men, representing no opinion but their own,” who were able to thwart the desire of the majority. Wilson got his way in the end, using his executive powers to arm the ships.

Louisiana Senator Huey Long Jr. (1893–1935), a Democrat, became famous for his creative filibustering during the Depression. He once spoke for more than fifteen hours, reading aloud the entire U.S. Constitution; answering questions from reporters; and reciting his favorite recipes. The filibuster finally ended at 4 a.m. when Long had to go to the bathroom.

During the 1920s and 1930s southern senators used the filibuster on many occasions to prevent passage of federal antilynching laws. In the 1950s and 1960s similar tactics were tried against civil-rights legislation. Strom Thurmond (1902–2003), a senator from South Carolina, spoke for twenty-four hours and eighteen minutes against the Civil Rights Act of 1957—the longest filibuster ever by a single senator. The bill passed anyway. Debate on the Civil Rights Act of 1964 was held up by a filibuster that lasted fifty-seven days. Cloture was finally invoked, and the bill passed.

Filibusters have also been used to thwart Senate confirmations of presidential nominees. In 1968 Republicans and conservative southern Democrats cooperated to filibuster against Abe Fortas (1910–1982), a Supreme Court justice who was nominated by President Lyndon Johnson (1908–1973) to be chief justice. Johnson bowed to the pressure and removed Fortas’s name from consideration. It was the first filibuster in history regarding a Supreme Court nominee.

The Fairness Doctrine

The Fairness Doctrine, adopted by the Federal Communications Commission (FCC) in 1949, required broadcasters to present contrasting points of view on issues of concern to the public. Following a court case in 1987, the FCC stopped enforcing the doctrine. Attempts to revive it have become controversial because of the effects it could have on broadcasts that present only one point of view, such as conservative talk-radio shows.

Development and History

In the 1920s the government considered the electromagnetic frequencies over which radio signals are transmitted to be a limited national resource, because only a finite number of frequencies are available for broadcasting. A similar scarcity argument was used when commercial television first became available. The Radio Act of 1927 required broadcasters to provide equal opportunity for airtime to politicians seeking office. This provision was also included in the Communications Act of 1934.

In 1949 the FCC adopted the Fairness Doctrine, which required broadcasters to provide alternative viewpoints on issues of public importance. Although the rule was intended to ensure that broadcasts were balanced, it actually discouraged some stations from addressing controversial issues at all. Journalists began to believe that the rule was an unconstitutional infringement on their right to free speech. However, in Red Lion Broadcasting v. FCC (1969) the U.S. Supreme Court ruled unanimously that the Fairness Doctrine served to “enhance rather than abridge the freedoms of speech and press protected by the First Amendment.”

Political and Technological Change

In 1981 President Ronald Reagan (1911–2004) appointed a new FCC chairman, Mark Fowler, who—like the president—favored industry deregulation and decreased government oversight. By that time cable and satellite television had revolutionized the broadcasting industry and made the government’s argument about the scarcity of broadcast airwaves seem outdated. In 1985 the FCC released the Fairness Report, which declared that the Fairness Doctrine was obsolete and no longer served the public interest. However, the agency did not stop enforcing the rule because it believed the doctrine could only be repealed by Congress. This notion was dismissed by the District of Columbia Circuit Court, which ruled in Meredith Corporation v. FCC (1987) that the agency did not have to continue to enforce the Fairness Doctrine.

Calls for Reinstatement

Even as the Fairness Doctrine was being eliminated, Congress sought to keep it alive through statutory action. A bill was passed by both houses of Congress in 1987; however, it was vetoed by Reagan. Since that time Congress has made several unsuccessful attempts to resurrect the measure.

Many industry analysts attribute the subsequent proliferation of conservative talk-radio programs, which became very popular in some markets, to the elimination of the Fairness Doctrine in the 1980s. Absent the doctrine, the shows do not have to provide airtime to opposing viewpoints. Conservatives have complained that liberals opposed to the shows are behind the efforts to reinstate the Fairness Doctrine.

Computerized Voting Machines

The federal government began encouraging the use of computerized voting machines after the 2000 presidential election. That race revealed the shortcomings of paper ballots, especially during the vote recounts conducted in Florida. Advocates tout electronic voting as reliable and secure; opponents claim the machines are vulnerable to electronic attack and subject to hardware and software problems. Both camps also debate whether the machines should be required to produce paper records so that voting results can be easily verified.

Voting Methods

Voting by paper ballot, marked by hand, was the norm in the first two hundred years of U.S. elections. In the twentieth century the standard became lever-operated voting machines—voters designate their choices by moving levers next to the names of their preferred candidates—and punch-card machines—voters punch out prescored circles in paper cards using a stylus or other pointed instrument. In the 1980s these machines were replaced in some areas by optical-scanning systems, which use paper ballots on which voters mark their choices by filling in circles or arrows with a marking device (often a simple pencil). The votes are “read” by an optical-scanning device. The technology is similar to that used by school systems to check the answers on standardized tests. In the 2000 presidential election optical systems were used by nearly a third of registered voters. The remainder used other voting systems, primarily punch cards.

The closeness of the 2000 presidential election prompted a recount by hand of votes cast by punch card in four Florida counties. The recount revealed many problems with punch-card ballots; for example, incompletely punched holes made it unclear whether voters had tried to punch through them and failed or had simply changed their minds. Human counters had to examine each of the ballots rejected by the counting machines to determine the voter’s intent. In one county the layout of the ballot was considered confusing by many voters, causing them to punch the hole next to the name of the wrong candidate. The controversy spurred calls for modernization of the U.S. voting system.

In 2002 Congress passed the Help America Vote Act, which provided nearly $4 billion for states to expand use of computerized voting machines. It also established the Election Assistance Commission (EAC) to oversee disbursement of the funds and development of quality standards for the machines. The EAC, which began operating in January 2004, had little influence over the types of voting systems used in the November 2004 elections.

Direct Recording Electronic Systems

Direct Recording Electronic Systems (DREs) are wholly computerized voting machines—similar to automated teller machines (ATMs)—that allow voters to push buttons or use touch-screen controls to cast their votes. The votes are stored on computer chips within the machines and on removable disks that elections officials use to tabulate voting results. Although DREs were introduced in the 1970s, only 12 percent of voters used them in the 2000 election.

The government’s push to expand the use of DREs spurred intense scrutiny of the technology. Reports about problems multiplied, many of them involving DREs manufactured by Diebold Corporation, an Ohio-based company that entered the voting-machine business in 2001. DRE software code was reportedly discovered on unsecured Diebold Web pages. Computer specialists demonstrated how easy it was to manipulate the software to ignore votes cast for a specific candidate or to change votes. The controversy magnified when it was learned that in 2003 Diebold’s CEO had sent a fund-raising letter to fellow Republicans in which he pledged “to help Ohio deliver its electoral votes” to President George W. Bush (1946–) in the upcoming 2004 election. Diebold subsequently changed its ethics policy to prevent top executives from engaging in political activities, but the damage to the company’s reputation was severe. In 2004 the state of California settled a lawsuit against Diebold for $2.6 million. California counties had purchased Diebold voting machines, many of which failed during the 2004 primary election. The state alleged that the company had made false statements about the effectiveness and security of its DREs.

In the 2004 presidential election approximately 29 percent of voters used DREs. Reports of problems with the machines surfaced immediately. In North Carolina a DRE with a full memory card failed to record more than four thousand votes. Pennsylvania officials reported that DRE screens froze. One Florida county had difficulties with touch screens—it took up to an hour to activate them. More than two dozen DREs in Ohio transferred an unknown number of votes from one candidate to another. Although none of the problems reported were believed significant enough to have changed the outcome of the presidential election, activists worried about possible software glitches that went unnoticed.

A 2005 report from the Government Accountability Office (GAO) listed many concerns about the performance of DREs, including “weak security controls, system design flaws, inadequate system version control, inadequate security testing, incorrect system configuration, poor security management, and vague or incomplete voting system standards.” Although the report acknowledged that many of the problems were specific to particular DRE models, the GAO noted that the overall security and reliability of electronic voting systems needed to be addressed by the EAC.

DREs and a Paper Trail

Some DREs produce a paper copy of the completed ballot, which allows voters to verify that their selections were properly recorded. The paper copies are collected at the polls and provide a verifiable source if a vote recount is needed. Many election activists believe that all DREs should be equipped with paper-verification systems. Opponents claim that such systems make DREs more expensive and cause problems when the paper jams or poll workers fail to reload it.
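
The auditing idea behind a paper trail can be illustrated with a brief sketch. The example below is purely hypothetical—the class name, data structures, and tallying functions are invented for illustration and do not represent the software of any actual voting machine—but it shows, under those assumptions, how an electronic tally could be checked against the voter-verified paper copies.

```python
from collections import Counter

# Hypothetical illustration of a DRE with a voter-verified paper record (VVPR).
# Each cast vote is written to the machine's electronic store and, in parallel,
# to a paper log that the voter can inspect before leaving the booth.

class PaperTrailDRE:
    def __init__(self):
        self.electronic_records = []   # stands in for the memory card / removable disk
        self.paper_records = []        # stands in for the printed ballot copies

    def cast_vote(self, candidate):
        self.electronic_records.append(candidate)
        self.paper_records.append(candidate)  # printed copy the voter verifies

    def electronic_tally(self):
        return Counter(self.electronic_records)

    def paper_tally(self):
        # In a real audit this count would be done by hand from the paper copies.
        return Counter(self.paper_records)

def audit(machine):
    """Compare the electronic tally against the paper tally; flag any mismatch."""
    electronic, paper = machine.electronic_tally(), machine.paper_tally()
    return electronic == paper, electronic, paper

if __name__ == "__main__":
    dre = PaperTrailDRE()
    for choice in ["Candidate A", "Candidate B", "Candidate A"]:
        dre.cast_vote(choice)
    ok, electronic, paper = audit(dre)
    print("Tallies match:", ok, electronic, paper)
```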

In 2003 David Dill, a computer science professor at Stanford University, began a movement calling for voter-verified paper records (VVPRs) for all DREs. Dill argued that paperless electronic voting poses too many security risks and possibilities for vote tampering and computer glitches that could go undetected. In 2004 Nevada and California became the first states to require VVPR capability for all DREs. California replaced forty thousand paperless machines in time for its primary elections in June 2006. Dill and his organization, Verified Voting Foundation, have called for federal legislation to mandate VVPRs nationwide. In April 2007 a lawsuit successfully challenged Pennsylvania’s certification of DREs that do not include methods for voter verification or independent audit.

The Department of Homeland Security

The U.S. Department of Homeland Security (DHS), a federal agency established in response to the terrorist attacks of September 11, 2001, has three primary responsibilities: to prevent terrorist attacks inside the United States; to reduce the nation’s vulnerability to terrorist attacks and the damage caused by them; and to coordinate quick and effective recovery if terrorist attacks occur.

DHS coordinates its activities with other federal entities devoted to national security, particularly intelligence agencies, such as the Central Intelligence Agency (CIA); the U.S. military; and the Federal Bureau of Investigation (FBI). In addition DHS works with state and local agencies that would provide the first government assistance following a terrorist attack.

The Components of DHS

DHS, as established by President George W. Bush (1946–), combined dozens of agencies and offices that had previously operated separately. They were devoted to emergency preparedness; risk reduction; scientific and technological responses to terrorist attacks; assessment of threats and vulnerabilities; and intelligence gathering and analysis.

Specific agencies within the DHS include the Federal Emergency Management Agency (FEMA), which manages the federal response to disasters and emergencies; the Transportation Security Administration (TSA), which protects the nation’s transportation systems; U.S. Customs and Border Protection (CBP), which protects the borders and points of entry; U.S. Immigration and Customs Enforcement (ICE), which enforces immigration and customs laws; the U.S. Citizenship and Immigration Services (USCIS), which adjudicates immigration and naturalization applications and petitions and establishes policies and priorities for immigration services; the U.S. Coast Guard (USCG), which patrols U.S. waters and provides rescue assistance; and the U.S. Secret Service, which protects high-ranking government officials and their families and investigates financial crimes.

DHS and National Security

In July 2002 Bush issued The National Strategy for Homeland Security, which laid out the details of DHS. For example, DHS has its own intelligence agents who work with the CIA, FBI, and other organizations to collect and analyze information about terrorist plots and plans. DHS agents conduct detailed analyses of terrorist groups to learn about their motivations, structures, and fund-raising activities. DHS also assesses the vulnerability of likely targets of terrorist attacks and the consequences such attacks would have.

In addition DHS operates a national warning system that uses color codes to indicate the risk of terrorist attacks—green indicates low risk; blue, guarded risk; yellow, elevated risk; orange, high risk; and red, severe risk. For each risk level the DHS recommends actions for the public to take. Since the inception of the warning system, the risk level has been set at yellow most of the time, although orange alerts have been issued on several occasions. Red alerts have been issued for specific sectors, such as air transportation.

Four DHS agencies ensure that the nation’s borders and transportation systems are safeguarded. The CBP assesses and processes travelers coming into the United States and inspects their belongings. It also screens incoming and outgoing cargo. ICE enforces immigration and customs laws and investigates the people who support terrorism and other criminal activities. The TSA is responsible for preflight screening of passengers and baggage at airports; provides air marshals on certain flights; and coordinates some security activities of the nation’s air, rail, and mass transportation systems. The Coast Guard patrols the waters around the United States to prevent illegal activities, such as smuggling and unauthorized entry.

The DHS Infrastructure Protection Program provides funds to secure a variety of facilities and systems that are considered critical to national security, including transit programs, intercity bus systems, and trucking lines. It also provides money to establish buffer zones around chemical facilities, financial institutions, nuclear and electric power plants, dams, stadiums, and other facilities deemed to be at high risk of terrorist attack. DHS grants also go to such preparedness activities as strengthening materials against explosives.

DHS works with federal, state, and local agencies to minimize the potential damage from terrorist attacks and to ensure they have the capabilities to respond when attacks occur. For example, it can provide funds to have professionals draw up building-evacuation plans. DHS also operates a public awareness program for nonterrorist emergencies, such as natural disasters.

FEMA dispatches personnel and provides financial aid to help people displaced by natural disasters and terrorist attacks. It also provides training programs for first responders, such as police officers, firefighters, and emergency medical technicians, and assists local and state emergency management agencies in preparing their disaster and response plans.

Controversy and Criticism

Since its creation DHS has been heavily criticized for its performance—or lack of performance. Some of its difficulties stemmed from the massive reshuffling and blending of existing agencies that took place to form the agency. More than one hundred seventy-five thousand employees from diverse organizations were gathered under the DHS umbrella, causing considerable management challenges.

The FEMA arm of the agency came under intense fire for its emergency response in the aftermath of hurricanes Katrina and Rita, which hit the Gulf Coast in the summer of 2005. An independent audit later that year by the DHS inspector general revealed a host of management and financial problems that hampered the agency’s relief efforts, including poor oversight of grant and contractor programs that resulted in overspending, fraud, and waste of funds intended to clean up and rebuild devastated areas.

Two other reports critical of DHS were released in December 2005. A report from House Democrats accused the agency of failing to deliver dozens of promised improvements for protecting the nation’s borders and critical infrastructure. A “report card” issued by the independent commission that investigated the September 11, 2001, terrorist attacks was equally critical. The so-called 9/11 Commission handed out C, D, and F grades to the government for its failure to implement specific recommendations on tasks related to vulnerability assessment, emergency preparedness, and transportation security—all areas under DHS oversight.

The Federal Budget Deficit

A federal budget deficit occurs when the government spends more money than it takes in during a fiscal year. This has been a frequent occurrence throughout U.S. history, but starting in 1970 it became the norm: the United States ran a deficit each year for nearly three decades. Budget surpluses were achieved from fiscal year 1998 through fiscal year 2001—the federal government took in more money than it spent each year—but the budget deficit returned in fiscal year 2002.

Budget Background

Each year the president must submit a proposed budget to Congress by the first Monday in February prior to the start of the next fiscal year. (The federal fiscal year begins on October 1 and runs through September 30.) Congress considers the president’s proposed budget and passes a series of appropriation acts, each of which authorizes funding for one or more federal agencies. This funding is known as discretionary spending, because the amounts can be changed at the discretion of the government. Mandatory spending, on the other hand, refers to money allocated to entitlement programs, such as Social Security, Medicare, Medicaid, and veterans’ benefits. Mandatory spending is not covered by annual appropriations acts because entitlement programs are considered “permanently” funded.

Record Deficits Trigger Action

Deficits have occurred most often because of wartime spending or economic disruptions, such as the Great Depression. During the 1970s the economy struggled because of energy problems and rising interest rates. Those challenges, which continued into the early 1980s, were made worse by a crisis in the savings and loan industry, which required a government bailout costing billions of dollars. At the same time rapidly rising expenditures on health-care programs—Medicare serves the elderly and Medicaid the very poor—strained the federal budget.

In fiscal year 1982 the federal deficit exceeded $100 billion—the largest in history up to that time. The following year it soared above $200 billion. In response, a coalition of senators—Phil Gramm (1942–), a Republican from Texas; Warren Rudman (1930–), a Republican from New Hampshire; and Ernest Hollings (1922–), a Democrat from South Carolina—drafted the Balanced Budget and Emergency Deficit Control Act of 1985.

The act, which was signed into law by President Ronald Reagan (1911–2004) in December 1985, set deficit targets for the next five years and mandated automatic spending cuts across a variety of programs if the targets were exceeded. Any new spending programs had to be financed by either reducing existing expenditures or raising new revenues—by raising taxes, for example. In 1986 the Supreme Court, ruling in Bowsher v. Synar, declared that the law was unconstitutional because it delegated major decisions about spending cuts to the comptroller general, who is the director of the Government Accountability Office, an agency of the legislative branch. The act was amended in 1987 to give that responsibility to the Office of Management and Budget, an agency of the executive branch.

Although budget deficits decreased following passage of the law, the targets were not met. In 1990 the Budget Enforcement Act (BEA) put annual limits on appropriation acts and included a pay-as-you-go (PAYGO) restriction to prevent new mandatory spending or new tax laws from increasing the deficit. Violations of the spending caps triggered automatic cuts in discretionary programs. Violations of the PAYGO restriction triggered automatic cuts in certain mandatory programs. The BEA provisions were extended several times, primarily in the Balanced Budget Act (BBA) of 1997. The goal of that law was to balance the federal budget by 2002—the year the law was to expire—primarily by reforming the Medicare program to reduce spending.

During the mid 1990s budget deficits began to decline. In 1998 there was a budget surplus for the first time in nearly thirty years. Annual surpluses occurred through fiscal year 2001. The turnaround has been attributed to a number of factors. For example, the nation spent much less on defense in the early 1990s because the Cold War ended when the Soviet Union dissolved. In addition the economy thrived throughout much of the 1990s, which led to more tax money for government coffers.

The Early Twenty-First Century

Budget deficits returned in fiscal year 2002 and grew quickly during the ensuing years. Spending on national defense and homeland security increased dramatically following the September 11, 2001, terrorist attacks. Additional funds went to the wars and rebuilding in Afghanistan and Iraq. In fiscal year 2004 the deficit reached $413 billion, a new record. The government responded with the Deficit Reduction Act of 2005, which called for major cuts in spending on Medicare, Medicaid, and student loan programs. The act was signed into law only months after hurricanes Katrina and Rita devastated the Gulf Coast region, spurring an unexpected spike in federal spending.

In February 2007 President George W. Bush (1946–) proposed a long-term budget plan to reduce the deficit each successive year and achieve a balanced budget by fiscal year 2012. The plan assumed that Congress would slow spending on both discretionary and mandatory programs and that U.S. expenses in Iraq would decline dramatically by 2010. Critics claimed that neither of those assumptions was likely to become fact.

Why Deficits Matter

Deficits have important economic and political implications. Whenever the federal government has a budget deficit, the Treasury Department must borrow money. The total amount of money that the Treasury has borrowed over the years is called the national debt. On the first day of 2007 the national debt stood at $8.68 trillion. This is money that must be paid back in the future, so it represents a burden on future taxpayers. In addition, like any other debtor, the U.S. government pays interest on the money it borrows; interest payments consume billions of dollars each year. Finally, when the government borrows money it competes with the private sector—both businesses and individuals—for investment funds. That competition leaves less money for businesses to spend on, for example, new factories that could provide new jobs and benefit the economy overall.
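
In rough terms, the arithmetic works as follows; the 5 percent interest rate in the last line is a hypothetical figure chosen only to illustrate the scale of interest costs, not a rate reported here.

```latex
\[
\text{deficit}_t = \text{outlays}_t - \text{revenues}_t,
\qquad
\text{national debt} \approx \sum_t \text{deficit}_t
\]
\[
\text{annual interest} \approx r \times \text{debt}
\approx 0.05 \times \$8.68\ \text{trillion}
\approx \$434\ \text{billion}
\]
```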

Politicians who engage in deficit spending risk incurring the wrath of the public. Some voters consider it a matter of principle, believing that balanced budgets demonstrate restraint and common-sense money management. Others worry about the economic consequences of the national debt, particularly for future generations of Americans. However, reducing deficits, either through increased taxes or reductions in government programs, can anger voters as well. As the population ages, payouts will increase dramatically for programs such as Social Security and Medicare. That will put even greater stress on the federal budget and make the politicians’ job even more difficult.

The National Debt

The national debt—the total amount owed by the U.S. government—was relatively low until the early 1940s, when it rose in response to deficit spending for World War II. During the next three decades the debt increased at a slow pace. During the late 1970s the national debt began a steep climb that continued into the 2000s. The budget surpluses from 1998 through 2002 had a slight dampening effect on the growth of the debt, but did not actually decrease it. The budget deficits that have occurred since 2003 have increased the debt rapidly.

The national debt has two components—money that the federal government has borrowed from investors, which is called “debt held by the public,” and money that the federal government has loaned itself, which is called “intragovernmental holdings.”

The public loans money to the federal government by buying bonds and other securities. The government borrows the money with a promise to pay it back with interest after a set term.

When the federal government borrows from itself, the debt is owed by one Treasury Department account to another. Most of this so-called internal debt involves federal trust funds, such as the one that supports Social Security. If a trust fund takes in more revenue in a year than it pays out, it loans the extra money to another federal account. In exchange, the loaning trust fund receives an interest-bearing security—basically an IOU—that is redeemable from the Treasury in the future. In other words, the government makes a promise to itself to pay itself back in the future.

As of January 1, 2007, the total national debt of $8.68 trillion was made up of $4.90 trillion in debt held by the public and $3.78 trillion in intragovernmental holdings.

U.S. government securities can be purchased by both domestic and foreign investors. Foreign investors may include individuals and businesses, such as banks, but also national governments. In January 2007 the United States owed foreigners $2.1 trillion, or 43 percent of the total amount of debt held by the public. The largest single foreign holder was Japan with $627 billion, followed by China with $400 billion. Collectively the major oil-producing countries—most of which are in the Middle East—held more than $100 billion in U.S. debt.
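
The composition figures above can be verified with a few lines of arithmetic. The numbers in this sketch are taken directly from the text; only the totals and percentages are computed.

```python
# Debt figures from the text, in trillions of dollars (January 2007).
public_debt = 4.90      # debt held by the public
intragov = 3.78         # intragovernmental holdings
total = public_debt + intragov
print(f"Total national debt: ${total:.2f} trillion")        # -> $8.68 trillion

foreign_held = 2.1      # debt held by foreign investors
share = foreign_held / public_debt
print(f"Foreign share of public debt: {share:.0%}")         # -> 43%

japan, china = 0.627, 0.400   # largest single foreign holders
print(f"Japan and China together: ${japan + china:.2f} trillion")
```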

Important Figures of the Day

Ronald Reagan

When Ronald Reagan (1911–2004) became the fortieth president of the United States, the nation’s economy was ailing, and its foreign policy was consumed by the Cold War with the Soviet Union. He responded with an economic policy so distinctive that it was dubbed Reaganomics. Its chief components were large tax cuts, deregulation of industry, and the removal of barriers to foreign trade. Critics claimed that Reaganomics benefited the wealthy, burdened the poor, and saddled the nation with record-setting federal deficits. Much of the deficit spending was caused by his main foreign-policy initiative: a more aggressive stance toward the Soviet Union, bolstered by a massive buildup of U.S. military forces. In the early 1990s, when the Soviet empire splintered into smaller, less powerful republics, Reagan received much of the political credit. He was hugely popular while in office and afterward, despite a major arms scandal that plagued his administration.

Early Life and Public Service

Reagan, who got the nickname Dutch from his father, a shoe salesman, excelled at sports, performed in plays, and worked part time while attending tiny Eureka College, in Eureka, Illinois. He studied economics and sociology and was student body president. After graduating in 1932, in the midst of the Great Depression, he worked as a sports radio announcer. A Hollywood screen test in 1937 earned him a contract with the motion-picture company Warner Brothers. During the following two decades he appeared in more than fifty movies, but never attained the fame of a Hollywood star. Reagan did not go into combat during World War II because of nearsightedness, but did produce military training films.

Following the war Reagan served as president of the Screen Actors Guild and became an outspoken anticommunist and political activist. Although he was a Democrat in his early years, he shifted to the Republican Party in the 1960s and grew increasingly conservative in his views. By that time he had become a prolific public speaker and had made important political connections in California. He ran for governor in 1966 and won easily, partly because of his folksy manner and emphasis on traditional values.

The Presidency

In 1980 Reagan ran for president with his running mate George H. W. Bush (1924–). Reagan faced incumbent President Jimmy Carter (1924–), a Democrat, whose popularity had plunged because of a weak economy and because he had failed to achieve the release of Americans held hostage in Iran for more than a year. The Reagan-Bush team won 51 percent of the popular vote, capturing 489 out of 538 electoral votes. Republican senators swept into office on Reagan’s coattails, giving the party majority control of the Senate for the first time in decades. Democrats maintained majority control of the House in 1980, but by a smaller margin than before.

Just two months after taking office Reagan was wounded in an assassination attempt. His good humor and perseverance during his recovery earned him public praise. That kind of reaction to adversity—a down-to-earth manner and wit—served him well throughout his presidency.

When Reagan and Bush ran for re-election in 1984, their Democratic challengers were former vice president Walter Mondale (1928–) and Geraldine Ferraro (1935–), a representative from New York. Reagan and Bush won more decisively than before, capturing 59 percent of the popular vote and 525 of the 538 electoral votes. The Republican Party held onto majority control of the Senate but, despite significant gains, remained the minority party in the House.

Reaganomics

The most pressing issue when Reagan took office was the economy. The 1970s had seen unemployment and inflation rates at historic highs and energy shortages caused by oil embargoes in the Middle East. Oil producers refused to ship petroleum to countries that supported Israel in its war with Egypt and Syria. In addition they curtailed production, which raised oil prices and affected the economies of all nations. Reagan embraced a remedy called supply-side economics, which targeted the supply, or production, side of the supply-and-demand equation. It focused on easing the tax and regulatory burden on entrepreneurs and businesses—generally the wealthiest sector of the populace—so they could invest and produce more. The benefits of that investment were supposed to “trickle down” to the lower-income sectors. Opponents branded it a Republican ploy to help the rich get richer while the poor got poorer. The Economic Recovery Tax Act of 1981 reduced tax rates across the board. The Tax Reform Act of 1986 lowered the top rate, but increased the bottom rate.

Reagan embraced two other components of supply-side economics. Deregulation, which had started under Carter, is the removal of governmental restraints on private industries. Reagan expanded deregulation to cover more businesses, particularly in the financial and telecommunications industries. In addition trade barriers, such as high tariffs on imported goods, were reduced or eliminated to encourage business with other countries.

By the end of Reagan’s second term the economy was in much better shape. Although the administration got most of the credit, some economists argued that the improvement was due largely to the efforts of the Federal Reserve, the nation’s central bank. The Fed, as it is called, had a new chairman, Paul Volcker (1927–)—he had been appointed by Carter in 1979—who implemented monetary policy that tightened the nation’s money supply, making credit more difficult to obtain and eventually lowering inflation rates.

The Deficits

In fiscal year 1980, when Reagan ran for president, the federal budget deficit totaled $59 billion. By 1982 it exceeded $100 billion—the largest in history to that time. The next year it soared above $200 billion. A group of senators—Phil Gramm (1942–), a Republican from Texas; Warren Rudman (1930–), a Republican from New Hampshire; and Ernest Hollings (1922–), a Democrat from South Carolina—drafted the Balanced Budget and Emergency Deficit Control Act of 1985. Signed into law by Reagan, it set deficit targets for upcoming budgets and required automatic spending cuts if the targets were exceeded. In addition new spending programs had to be financed by reducing existing expenditures or raising new revenues—by raising taxes, for example. Despite these controls, deficits continued to occur.

The deficits of the mid 1980s were more than twice what they had been during the mid 1970s for several reasons. Reagan pushed for greater spending on national defense as a tactic in the Cold War with the Soviet Union. Although his administration made selective cuts in spending on social services, the largest and most expensive of these programs—such as Medicare, which provides assistance to the elderly—were not substantially decreased. At the same time tax cuts meant that less revenue went into government coffers. The federal deficit in Reagan’s last year in office exceeded $150 billion.

Reagan’s Cold War

For years the cornerstone of U.S. policy in the Cold War with the Soviet Union was “containment” of communism—to keep it from spreading into noncommunist countries. Reagan quickly took a more aggressive stance, referring to the Soviet Union as “the evil empire.” He pushed a massive—and expensive—buildup of the military, which Reagan called a “peace through strength” approach. By the end of the decade the Soviet Union, partly because it had followed suit and built up its military forces, developed serious economic problems. Political opposition and unrest precipitated a breakup of the Soviet Union into individually governed republics. Reagan was championed by many in the West as the president who “stared down the Soviet Union” and ended the Cold War.

The Iran-Contra Affair

Ironically it was Reagan’s fierce anticommunism that led to his administration’s greatest challenge—the Iran-Contra scandal. In 1986 the public learned that high-ranking U.S. officials had been engaged in two secret and illegal programs: one provided money and military aid to counterrevolutionaries, known as Contras, seeking to overthrow Nicaragua’s leftist government, and the other sold arms to Iran. Some money from the arms sales was funneled to the Contra operation. An independent counsel investigated and issued a scathing report critical of the Reagan administration, noting that Reagan had shown “disregard for civil laws enacted to limit presidential actions abroad” and created a climate in which some of his officials felt emboldened to break those laws to implement his policies.

Several of the officials were indicted by a grand jury. Two convictions were overturned on appeal, and many of the charges against others were dismissed because the administration refused to turn over classified documents considered crucial to the case. Other indicted officials were pardoned in 1992 by George H. W. Bush as his term as president was about to end.

Throughout the scandal Reagan denied knowing that the operations were illegal and attributed them to rogue elements within his administration. The independent counsel was skeptical of his claims, noting “the governmental problems presented by Iran-Contra are not those of rogue operations, but rather those of Executive Branch efforts to evade congressional oversight.”

Lebanon and Libya

Reagan faced two other major foreign-policy challenges. In 1983 terrorists detonated a bomb at a barracks in Lebanon that housed U.S. Marines serving in a multinational peacekeeping force. More than two hundred U.S. Marines were killed. Responsibility for the bombing could not be determined conclusively; therefore, no U.S. retaliatory action was taken. Several months later U.S. troops were removed from Lebanon. During the 1980s Libya was linked to terrorist acts in which Americans were killed, particularly a bombing at a disco in Germany in 1986. Reagan ordered military strikes on Libyan targets in response, destroying several facilities that were believed to be used for training terrorists. In addition Libyan leader Muammar al-Gadhafi (1942–) was reportedly wounded and his infant daughter killed.

Post-Presidential Life

The Ronald Reagan Presidential Library opened in Simi Valley, California, in 1991. Three years later his family released a letter from the former president announcing that he had been diagnosed with Alzheimer’s disease, a degenerative neurological disorder. He rarely appeared in public after that time. Reagan died on June 5, 2004, at age 93.

Star Wars

In 1983 President Ronald Reagan (1911–2004) proposed the Strategic Defense Initiative (SDI) to develop space-based weapons to protect the United States from incoming Soviet nuclear missiles. The proposal increased tensions between the United States and the Soviet Union, which Reagan had already described as the “focus of evil in the modern world.” Soviet officials responded that Reagan was full of “bellicose lunatic anticommunism.”

The media dubbed SDI the “Star Wars” program after the 1977 hit movie. SDI had many detractors, including scientists who doubted that it was technically feasible and politicians who feared it would be extremely expensive and start an arms race in space. Despite these criticisms, Reagan continued to push the SDI program. In 1984 the U.S. Army successfully tested an interceptor missile that flew above the atmosphere, located, tracked, and destroyed a missile launched from another location. More than $1 billion was spent on SDI in 1985.

SDI became a major point of contention between Reagan and Soviet leader Mikhail Gorbachev (1931–). During arms talks in 1986 Gorbachev offered to make huge cuts in Soviet stockpiles of nuclear missiles if the U.S. would abandon SDI. Reagan refused, unwilling to give up a project he deemed crucial to the nation’s future security. In 1989 Congress made drastic cuts in the SDI budget. By that time the Soviet Union had started to break up into smaller republics. The Cold War soon ended.

SDI, however, lingered on. Largely ignored for more than a decade, the program found renewed interest after the terrorist attacks on the United States in 2001. Advocates saw it as a defense against ballistic missiles launched by rogue governments or terrorists.

See also Supply-Side Economics

Sandra Day O’Connor

Sandra Day O’Connor (1930–) was the first woman to serve on the U.S. Supreme Court. Nominated in 1981 by President Ronald Reagan (1911–2004), O’Connor at first took a conservative stance on many issues—as Reagan had hoped—but gradually adopted a centrist position and was often the swing vote on closely contested cases. She retired from the court in 2005.

Background

Sandra Day, born into a Texas ranching family, earned a law degree from Stanford University. She married John Jay O’Connor, who was also a student at Stanford, in 1953. They had three sons. O’Connor worked as a deputy county attorney and in private practice before being appointed an Arizona state senator in 1969. She was reelected to that position several times, finishing her last term in 1975. She then served as a county judge and as a member of the Arizona Court of Appeals.

During the 1980 presidential campaign Reagan promised that, if elected, he would name a woman to the Supreme Court. In 1981 the new president nominated O’Connor to take the place of Justice Potter Stewart (1915–1985), who had retired. Neither conservatives nor liberals were particularly pleased by O’Connor’s credentials. Conservatives were dismayed to find out that, as an Arizona state senator, she had voted for a bill to decriminalize abortion in that state. Liberals were hoping for a female jurist more supportive of feminist causes. Nevertheless, O’Connor was confirmed unanimously by the U.S. Senate.

O’Connor’s Decisions

During her twenty-four years on the court O’Connor gradually earned a reputation as a moderate coalition-builder whose decision on a particular issue was difficult to predict. For example, in 1992 she coauthored the decision in Planned Parenthood v. Casey, which reaffirmed a woman’s right to an abortion as guaranteed by the court’s decision in Roe v. Wade (1973). However, the 1992 ruling also upheld as constitutional the right of a state to regulate abortion as long as the regulations do not place an “undue burden” on women seeking abortions. The case involved a Pennsylvania law that required, among other conditions, a waiting period of twenty-four hours before an abortion could be obtained. The court upheld that particular provision as constitutional because it found that the waiting period did not impose an “undue burden.”

During her tenure O’Connor considered a number of cases that dealt with affirmative action. In the 1980s she typically dissented from court rulings that upheld remedial hiring preferences for women and minorities, such as cases brought under Title VII of the Civil Rights Act of 1964. Gradually she came to rule that affirmative action was acceptable when “narrowly tailored” to a “compelling interest.” In Grutter v. Bollinger (2003) the court considered the case of a white woman who alleged she had been denied admittance to the University of Michigan Law School because school officials, to produce a diverse student body, had used race as a factor in making admissions decisions. The court ruled in favor of the school 5 to 4. Writing for the majority, O’Connor said the university’s actions were constitutional because they fit the “narrowly tailored” and “compelling interest” criteria.

In a pair of notable cases O’Connor showed a change of opinion regarding the death penalty for mentally retarded defendants. In her opinion for the majority in Penry v. Lynaugh (1989), she wrote that the court upheld such executions as constitutional because a “national consensus” had not developed regarding the practice. However, in Atkins v. Virginia (2002) she joined the majority in ruling that execution of mentally retarded criminals was “cruel and unusual punishment” and prohibited by the Eighth Amendment to the U.S. Constitution. By that time more than fifteen states had outlawed the execution of the mentally retarded. O’Connor considered that to be adequate to determine “national consensus” on the issue.

Soft Money

One of the most partisan issues in politics is campaign finance reform. In 2002 John McCain (1936–), a Republican senator from Arizona, and Russ Feingold (1953–), a Democratic senator from Wisconsin, sponsored the Bipartisan Campaign Reform Act (BCRA). The act banned “soft money” donations to the political parties—monetary contributions not regulated by the federal election laws, which were passed in the 1970s and are enforced by the Federal Election Commission.

So-called hard-money contributions are regulated and go directly to the campaign funds of individual candidates. Soft-money donations go to the political parties, which are supposed to spend them on generic purposes, such as get-out-the-vote drives or advertisements about party platforms and issues. During the 1990s both of the major parties began stretching the rules, targeting the soft money to help individual candidates in races. The BCRA put an end to that practice.

In 2003 the act was challenged in court by Mitch McConnell (1942–), a Republican senator from Kentucky, and dozens of special interest groups. Historically, campaign donations have been considered expressions of free speech guaranteed by the U.S. Constitution because they reflect political values. By a vote of 5 to 4 the U.S. Supreme Court upheld the law as constitutional. Writing for the court, Justices Sandra Day O’Connor (1930–) and John Paul Stevens (1920–) reasoned that the soft-money restrictions resulted in a minimal restraint on free speech because soft money was supposed to be devoted to generic purposes, not specific candidates’ campaigns. They noted, “We are under no illusion that BCRA will be the last congressional statement on the matter. Money, like water, will always find an outlet. What problems will arise, and how Congress will respond, are concerns for another day.”

As the justices predicted, during the 2004 presidential election tax-exempt political organizations—named “527s” after a section of the tax code—took in large contributions of soft money and used it in highly targeted campaign advertisements. Critics complained loudly that 527s, which legally cannot endorse specific candidates, had exploited a loophole that allowed them to come very close to doing so.

Alan Greenspan

Alan Greenspan (1926–) was chairman of the nation’s central banking system, the Federal Reserve—known as the “Fed”—from 1987 to 2006. He is widely credited with implementing monetary policy that helped the U.S. economy to prosper during the 1990s and early 2000s. Greenspan wielded so much influence that he was often referred to as the second most powerful man in the United States. His high-profile tenure brought great public attention to the Fed and the important role it plays in the economy.

Early Life and Career

Greenspan, who was born into a middle-class family in New York, became an accomplished musician at an early age, playing both the clarinet and the saxophone. He alternately toured with a jazz band and attended New York University, where he earned a bachelor’s, a master’s, and a doctorate in economics. He eventually became chairman and president of an economic consulting firm.

In the mid 1970s President Gerald Ford (1913–2006) chose Greenspan to chair the President’s Council of Economic Advisers. In the early 1980s he chaired the National Commission on Social Security Reform. President Ronald Reagan (1911–2004) appointed Greenspan to a number of advisory boards.

Greenspan’s first marriage in 1952 to painter Joan Mitchell lasted less than a year. In 1997 he married Andrea Mitchell (1946–), a correspondent for NBC news.

Chairman Greenspan

In 1987 Greenspan was appointed chairman of the Board of Governors of the Federal Reserve System. The Fed influences the overall economy by controlling both the amount of money in circulation and the interest rates that banks charge their customers. During his long tenure as chairman Greenspan was reappointed by presidents Reagan, George H. W. Bush (1924–), Bill Clinton (1946–), and George W. Bush (1946–).

Under Greenspan’s leadership the Fed made decisions that helped to keep unemployment low and production high during the 1990s and to keep inflation under control. Inflation occurs when prices of goods and services increase over time. A small amount of inflation is considered acceptable, but excessive inflation discourages people and businesses from spending and investing.

During the late 1980s the Fed began making interest-rate cuts that prompted a boom in the real-estate market that lasted into the early 2000s. Although some economists argued that it was a good move for the economy, others said Greenspan and the Fed made it too easy for people to take on more debt than they could afford. In 2004 the Fed began raising interest rates in response to signs that inflation was rising.

The nation prospered during the 1990s and early 2000s, and Greenspan received much of the credit. In 2005 he was awarded the Presidential Medal of Freedom, the highest award given to a civilian by the U.S. government. The following year he retired from the Fed and became a prolific public speaker.

The Fed

The Federal Reserve, which is usually called the Fed, was created in 1913 to be the nation’s central bank, to furnish currency, and to supervise banking in the United States. Eventually it assumed additional authority, influencing the amount of U.S. currency in circulation and the interest rates charged by banks to their customers. The Federal Reserve System is made up of twelve banks across the country and is overseen by a seven-member board headquartered in Washington, D.C.

The Fed was designed to be an independent entity, immune to pressure from the president and Congress to impose politically popular but short-term solutions to economic problems. It was supposed to take a big-picture, long-range view of the nation’s economic policy.

The Fed affects the state and growth of the economy by influencing interest rates on loans. When interest rates go down, people and businesses are more likely to borrow money and spend it or invest it. That can provide a boost to a lackluster economy. However, the Fed’s work is a delicate balancing act. If it puts too much money into circulation, demand for goods and services can grow faster than the supply of goods and services, which can lead to excessive price increases, or inflation. To curb inflation the Fed raises interest rates to make borrowing less appealing.

The Fed also plays a major role in the nation’s money supply—the total amount of coins and paper currency in circulation and held by financial institutions. When the money supply increases, banks make more loans, which puts more money into the hands of consumers and pushes up demand for goods and services. If that practice becomes too inflationary, the Fed may decide to decrease the amount of money that banks can loan to the public. When borrowing decreases, demand for goods and services slows down and is better matched with supply.
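
The cause-and-effect chain described here can be made concrete with a deliberately simplified toy model. Everything in the sketch below—the linear demand function, the 3 percent supply-growth figure, and the sample interest rates—is an invented illustration of the general relationship, not an actual Federal Reserve model or forecast.

```python
# Toy model of the mechanism described above: lower interest rates encourage
# more borrowing and spending (demand), and demand growing faster than the
# supply of goods and services shows up as inflationary pressure. All numbers
# and functional forms are hypothetical, chosen only for illustration.

def demand_growth(interest_rate):
    """Hypothetical: each percentage point of interest trims demand growth."""
    return 6.0 - 1.0 * interest_rate   # percent per year

SUPPLY_GROWTH = 3.0  # hypothetical growth of goods and services, percent per year

for rate in (2.0, 4.0, 6.0):
    demand = demand_growth(rate)
    inflationary_gap = max(demand - SUPPLY_GROWTH, 0.0)
    print(f"rate {rate:.0f}%  demand growth {demand:.1f}%  "
          f"inflationary gap {inflationary_gap:.1f}%")
```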

George H. W. Bush

George H. W. Bush (1924–), the forty-first president of the United States, faced two major issues when he was in office: the Persian Gulf War, in which a U.S.-led military coalition ousted Iraqi forces from Kuwait, and the state of the U.S. economy. Despite achieving an overwhelming military victory against Iraq, Bush lost favor at home because of an economic slowdown, large federal budget deficits, and his decision to raise taxes after promising not to do so.

Early Life

Born into a well-to-do New England political family, Bush joined the U.S. Navy immediately after graduating from high school. He became a pilot and was on active duty in the Pacific during the latter years of World War II. He was awarded the Distinguished Flying Cross for his service. In 1945 he married Barbara Pierce (1925–) of New York. They had six children, one of whom died of leukemia at a young age. Their two oldest sons went on to careers in public service: George W. (1946–) as governor of Texas and president of the United States and Jeb (1953–) as governor of Florida.

The senior Bush graduated from Yale University in 1948 with a degree in economics. The family then moved to Texas, where he worked in the oil business and eventually founded his own oil company.

Public Service

In 1964 Bush, a Republican, ran unsuccessfully for a Texas seat in the U.S. Senate. Two years later he was elected to the U.S. House of Representatives. Bush was reelected to the House in 1968, then was defeated in a second run for the U.S. Senate in 1970. In 1971 President Richard Nixon (1913–1994) appointed Bush U.S. ambassador to the United Nations, a post he held for two years. He then became chairman of the Republican National Committee. In 1976 President Gerald Ford (1913–2006) named Bush director of the Central Intelligence Agency. Bush was replaced a year later when Democrat Jimmy Carter (1924–) became president.

During the 1980 presidential primaries Bush’s main rival for the Republican nomination was the former California governor Ronald Reagan (1911–2004). Reagan advocated supply-side economics, a policy that favored incentives, such as tax cuts, to spur entrepreneurs and investors. Bush famously denounced Reagan’s plan as “voodoo economics.” Although Bush performed well in some of the early primaries, he was soon eclipsed by the charismatic Reagan, who became the party’s nominee. Reagan at first sought former president Ford as his running mate. When a political deal between the two could not be reached, Reagan chose Bush instead. Bush was considered a moderate Republican who brought foreign-affairs experience to the ticket.

The Reagan-Bush team won the election over Carter, the incumbent, in a landslide, garnering 489 out of 538 electoral votes. In 1984 Reagan and Bush had an even more impressive win—525 electoral votes—against their Democratic challengers, former vice president Walter Mondale (1928–) and Geraldine Ferraro (1935–), a representative from New York. During his years as vice president Bush focused on such administration priorities as deregulation and the war on illegal drugs.

The 1988 Election

As Reagan approached the end of his second term, Bush became the Republican front-runner for president. He benefited from the enormous popularity that Reagan had enjoyed, but presented a quieter and more deliberate personality. When he accepted his party’s nomination, he said, “I want a kinder and gentler nation.” It was an attempt to quell criticism that the Reagan administration had been insensitive to the needs of the poor and disadvantaged. Bush also spoke of the importance of the nation’s many social and religious organizations and compared them to “a thousand points of light in a broad and peaceful sky.” But perhaps his most famous remark (and one that came back to haunt him) was a promise: “Congress will push me to raise taxes, and I’ll say no, and they’ll push, and I’ll say no, and they’ll push again, and I’ll say to them, ‘Read my lips: no new taxes.’”

As his running mate Bush chose Senator Dan Quayle (1947–) of Indiana. The choice drew some criticism because Quayle was a relatively unknown and inexperienced legislator. However, the Bush-Quayle ticket easily won the election with 53 percent of the popular vote and 426 electoral votes. Their main opponents were Democrats Michael Dukakis (1933–), the governor of Massachusetts, and Lloyd Bentsen (1921–2006), a senator from Texas.

Domestic Policy Issues

Bush worked with Congress to enact legislation on a variety of domestic issues. The Whistleblower Protection Act (1989) encouraged and protected government employees who disclosed instances of waste, corruption, or illegal behavior by federal agencies. The Americans with Disabilities Act (1990), a civil-rights law, provided legal protections for disabled persons against discrimination in employment; state and local government programs and services; public accommodations; commercial facilities; transportation; and telecommunications. Bush supported an amendment to the U.S. Constitution to ban desecration of the American flag, but a joint resolution to that effect was voted down in Congress. Instead, Congress passed the Flag Protection Act of 1989, which Bush did not sign, but allowed to become law. It was subsequently ruled unconstitutional by the U.S. Supreme Court in United States v. Eichman (1990).

Among the most important pieces of legislation passed during the Bush administration were the Clean Air Act Amendments (CAAA) of 1990, which addressed three issues of growing concern: acid rain, urban air pollution—particularly smog—and emissions of toxic air pollutants. They strengthened enforcement and compliance procedures and established emissions allowances that could be bought and sold at auction—a market-based way for businesses to meet their pollution-reduction goals.

Foreign Affairs

Bush’s foreign policy was dominated by two events. U.S. forces invaded Panama in December 1989 to oust that nation’s military leader, Manuel Noriega (1938–), who had invalidated the results of recent elections and was considered a threat to U.S. interests, particularly the Panama Canal, a major shipping channel connecting the Atlantic and Pacific oceans. The United States also accused Noriega of engaging in drug trafficking and money laundering. The U.S. military captured Noriega within days and took him to the United States, where he was convicted and sentenced to forty years (later reduced to thirty years) in prison.

Another event had a more profound effect on Bush’s legacy: in August 1990 Iraqi military forces invaded neighboring Kuwait, ostensibly because of a dispute over oil rights. Iraq’s leader, Saddam Hussein (1937–2006), ignored demands from the international community to leave Kuwait. The United Nations imposed strict economic sanctions on Iraq, and the Security Council set a deadline of January 15, 1991, for Iraq’s withdrawal from Kuwait. Bush assembled an international coalition to use military force to enforce the resolution. On January 12 the U.S. House and Senate authorized the use of U.S. troops. Four days later the coalition began bombing targets within Iraq. In late February coalition ground forces swept into Kuwait and pushed the Iraqis back across the border within days. Bush declared a cease-fire and chose not to expand the war against Iraq. More than 2 million U.S. military personnel served in the Persian Gulf War; 382 died, 147 of them in combat.

After the war the Security Council imposed additional sanctions on Iraq, restricting its development of nuclear and other weapons. Iraq was ordered to work with inspectors from the International Atomic Energy Agency to ensure that all materials related to nuclear weapons were destroyed. During the following decade Hussein allowed inspections at some times but not at others.

The nation’s initial reaction to the Persian Gulf War was relief and a surge of patriotic pride. Bush’s approval ratings soared.

Bush and the Economy

Although Reagan’s tax cuts had pleased many voters, they had meant reduced revenue for the federal government. He had also pushed for greater national defense spending as part of his “peace through strength” policy toward the Soviet Union. The combination of these factors resulted in high federal deficits. In 1981 the national debt passed $1 trillion.

When Bush entered the White House he faced political pressure to reduce the federal deficit without raising taxes, a difficult task. The prospect became even bleaker when the savings and loan industry, which had recently been deregulated, nearly collapsed. A series of unwise loans and poor business decisions left most of the industry in shambles, and the subsequent government bailout exceeded $500 billion. At the same time the administration faced rapidly growing costs for Medicare, the health-care program for the elderly, and Medicaid, which serves the very poor. Despite his “read my lips” pledge, Bush reluctantly agreed with Congress that taxes had to be raised. The Omnibus Budget Reconciliation Act of 1990 included spending caps on government programs, but also enacted a significant tax increase. Raising taxes proved politically damaging for Bush.

In late 1990 and early 1991—at the time of the Persian Gulf War—the U.S. economy went into recession, with unemployment increasing and production decreasing. The economy was slow to recover, which became an issue in the 1992 presidential election. Bush and Quayle were defeated by Democrats Bill Clinton (1946–), the governor of Arkansas, and Al Gore Jr. (1948–), a senator from Tennessee.

Post-Presidential Life

After leaving the White House Bush maintained a relatively low public profile for several years, concentrating on speaking engagements and business ventures. His foundation built the George Bush Presidential Library Center on the campus of Texas A&M University. It includes the George Bush Presidential Library and Museum and the George Bush School of Government and Public Service. In 2004 Bush teamed with Clinton to lead an international fund-raising drive for victims of a tsunami that devastated Indonesia and other countries along the Indian Ocean. The next year the two men collaborated again to raise money for those who lost their homes and businesses when Hurricane Katrina hit the U.S. Gulf Coast.

Bill Clinton

Bill Clinton (1946–), the forty-second president of the United States, led the nation during a time of relative peace and economic prosperity. The federal government achieved a budget surplus for the first time in decades. However, Clinton’s presidency was plagued by scandals, including his extramarital affair with a White House intern. Clinton lied about the affair to the public; he also lied during court testimony, which led to impeachment (indictment) by the House of Representatives. He was acquitted of the charges by the Senate. Clinton was a popular president despite the scandals and maintained a high-profile role in public life after leaving the White House in January 2001.

Early Life and Public Service

Three months before William (Bill) Jefferson Blythe III was born, his father died in a traffic accident. His mother later remarried, and he began using his stepfather’s last name. He was an excellent student and became interested in politics after meeting President John F. Kennedy (1917–1963) during a White House visit. After graduating from Georgetown University, Clinton won a prestigious Rhodes Scholarship to Oxford University in England. While other young men were being drafted for service during the Vietnam War, Clinton obtained a draft deferment by making a verbal commitment to join the ROTC program at the University of Arkansas. He later changed his mind; by that time the draft rules had changed, and he was not called for military service.

During the early 1970s he attended Yale University, where he met his future wife—Hillary Rodham (1947–). Both graduated with law degrees. They married in 1975.

In 1974 Clinton ran for Congress as a Democrat in his home state of Arkansas, but was defeated. Two years later he was elected the state’s attorney general. In 1978 Clinton ran for governor and, after promising to improve the state’s schools and road system, won by a wide margin. In 1980 he was defeated in his bid for re-election, but tried again two years later and was elected. His second stint as governor lasted from 1983 until his successful bid for the presidency in 1992.

Clinton Wins the White House

Clinton’s primary competitors for the presidential election of 1992 were the incumbent president, George H. W. Bush (1924–), a Republican, and Ross Perot (1930–), a Texas businessman running as an independent. A recession during the Bush administration and the high federal budget deficit made the economy a key issue during the campaign. Clinton’s opponents tried to use his past against him—chiefly his draft deferment during the Vietnam War; his admission that he smoked marijuana as a young man; and allegations of sexual affairs. However, Clinton’s personal charm and promises of economic reform resonated with the voters. He and his running mate, Al Gore Jr. (1948–), a senator from Tennessee, were relatively young candidates and capitalized on the nation’s yearning for change. Clinton won the election easily, capturing 370 of the 538 electoral votes. He won 43 percent of the popular vote, while Bush won 37 percent and Perot won 19 percent.

Clinton’s Rocky Start

During his first years in office Clinton had some successes—notably the Family and Medical Leave Act (1993); the Brady Handgun Violence Prevention Act (1993); and the Violent Crime Control and Law Enforcement Act (1994). However, most other domestic legislation he championed struggled for survival. His foreign-policy efforts included implementation of the North American Free Trade Agreement (1993), but he was hurt politically by a failed humanitarian mission in Somalia in which several U.S. soldiers were killed.

When Clinton took office Congress was controlled by his party, the Democrats, so the public expected that he would have little trouble gaining passage of economic and social reforms. That expectation proved to be overly optimistic. The president’s proposed economic stimulus and jobs program prompted a Republican filibuster and complaints from centrist members of his own party. Critics grumbled that Clinton had campaigned as a populist reformer, but behaved like a “tax-and-spend liberal” once he was in office. Bickering over the program went on for months, which damaged Clinton’s standing in public-opinion polls. Ultimately Congress passed the Omnibus Budget Reconciliation Act of 1993, which combined tax increases and spending cuts in an effort to end a string of high federal budget deficits.

The president also had trouble implementing a campaign promise to allow gays and lesbians to serve in the military. After it became clear that Congress would not support such a measure, Clinton backed down. A policy called “don’t ask, don’t tell” was instituted: commanders are not to ask service members about their sexual orientation, and gay service members are not to volunteer information about their sexual orientation. Those service members who divulge their homosexuality can be discharged from the military.

Clinton had also promised reforms of the country’s welfare and health systems. Welfare reform had broad public appeal, but Clinton was slow to develop a policy. In mid 1994 he proposed a plan that satisfied neither Democrats nor Republicans in Congress and was soon abandoned. Early in his administration he placed health-care reform under the direction of his wife, a radical break with tradition regarding the role of a first lady. She chaired the President’s Task Force on National Health Care Reform, which several months later released a health-care plan that was more than a thousand pages long. Although it was supposed to be secret, portions were leaked to the press. Republicans, moderate Democrats, and key interest groups harshly criticized the plan for its complexity, cost, and bureaucratic nature. Congressional leaders of both parties proposed alternative plans, but no consensus was ever reached.

In the 1994 midterm elections Republican candidates took advantage of public discontent over Clinton’s perceived shortcomings. The so-called Republican Revolution gave the party majority control over the House and Senate.

Congressional Conflict and Economic Prosperity

The remaining years of Clinton’s presidency were characterized by conflict with Congress, particularly over financial matters. On two occasions in 1995 and 1996 many operations of federal agencies were forced to cease when he and Congress could not agree on spending priorities. The Republicans wanted to balance the federal budget by cutting expenditures for Medicare, Medicaid, and welfare programs and by implementing tax cuts. The president opposed the plan, arguing that their proposed cuts in social programs were too deep. The two sides waged a very public feud in the media. In general, Clinton won the public-opinion battle, as congressional Republicans got most of the blame for government shutdowns.

Despite all of the bickering, the resulting economic policy proved highly effective. During the mid 1990s federal budget deficits began to decline. In 1998 there was a budget surplus for the first time in nearly thirty years. Annual surpluses occurred throughout the remainder of Clinton’s presidency. The economy flourished, with both low unemployment and low inflation rates. The prosperous economy was a major factor in Clinton’s re-election in 1996. He won easily, garnering 49 percent of the popular vote and 379 out of 538 electoral votes. His major competitors were Bob Dole (1923–), a Republican senator from Kansas, and Perot, running on the ticket of the newly formed Reform Party.

In 1996 long-awaited welfare reform was finally achieved in the Personal Responsibility and Work Opportunity Reconciliation Act. This act, which was drafted largely by congressional Republicans, established new time limits for federal assistance and included work requirements for recipients. That same year Clinton persuaded Congress to increase the minimum wage and to pass a law granting the president the authority to veto specific items in spending bills. The law, the Line-Item Veto Act, was later ruled unconstitutional by the U.S. Supreme Court. In 1997 the booming economy prompted passage of the Taxpayer Relief Act, which included incentives for Americans to save and invest more and pay for college education. However, the tax cuts were relatively small compared with the tax increases that had been implemented in 1993.

Scandal and Impeachment

During his second year in office a special prosecutor began investigating the Clintons’ involvement in a failed real-estate venture dating back to the 1970s. In 1978 the Clintons, along with their business partners, Jim (1940–1998) and Susan McDougal (1955–), invested in an Arkansas real-estate development called Whitewater. The McDougals later operated a savings and loan institution while Bill Clinton was governor and hired Hillary Clinton to do legal work. During the 1980s and 1990s the McDougals’ business deals became the subject of federal investigations. In 1994 Attorney General Janet Reno (1938–) appointed the first of several independent counsels who investigated the Clintons’ involvement in the Whitewater venture. The investigation, which lasted until 2000, found no evidence of wrongdoing by the Clintons. The McDougals and a dozen of their business associates were convicted of federal crimes.

In 1994 Paula Jones (1966–) filed a civil lawsuit against the president, alleging that he had sexually harassed and assaulted her when he was governor of Arkansas. The president’s lawyers tried unsuccessfully to get the case delayed until he was out of office. Meanwhile, in the ongoing Whitewater probe, independent counsel Kenneth Starr (1946–) began investigating allegations that Clinton had engaged in extramarital affairs. Jones’s lawyers received anonymous tips that Clinton had been having an affair with Monica Lewinsky (1973–), a White House intern. Starr also learned about Lewinsky and suspected that the president had asked her to lie when she was subpoenaed to testify in the Jones case. In January 1998 during a pretrial deposition Jones’s lawyers questioned Clinton about Lewinsky; he denied under oath having had an affair with her.

The Lewinsky allegations soon became international news. On January 26, 1998, Clinton appeared on television and stated “I did not have sexual relations with that woman, Miss Lewinsky. I never told anybody to lie, not a single time, never. These allegations are false.” Several months later the Jones case was dismissed without going to trial. In July 1998 Starr obtained DNA evidence that proved a sexual encounter had occurred between Lewinsky and Clinton. In exchange for immunity from prosecution Lewinsky testified before a federal grand jury and revealed details about her affair with the president. However, she denied that he had asked her to lie under oath in the Jones case. In August Clinton testified before the same jury via videotape, but refused to answer many of the questions. He appeared again on television and this time admitted that he had had a relationship with Lewinsky that was “not appropriate.”

In September 1998 Starr presented his report to the House of Representatives, which released it to the public. The report included graphic details about the Clinton-Lewinsky affair. In December the House began considering articles of impeachment, accusing Clinton of lying under oath; obstruction of justice by encouraging witnesses to give false testimony; and abuse of power. On December 19, 1998, the president was impeached for perjury and obstruction of justice. Clinton became only the second president in history, after Andrew Johnson (1808–1875), to be impeached by the House. Meanwhile, Clinton continued to enjoy high approval ratings from the public.

The impeachment trial in the Senate began on January 7, 1999, with Chief Justice William Rehnquist (1924–2005) presiding. The trial lasted for more than a month and included videotaped testimony from Lewinsky. On February 12, 1999, the final vote was taken. The president was found not guilty of perjury by a vote of 55 to 45; the vote on the obstruction charge was 50 to 50. Because neither charge received the two-thirds majority (sixty-seven votes) required for conviction, the president was acquitted.

After the Presidency

Clinton left office in January 2001 with relatively high job-approval ratings from the public. He established the William J. Clinton Foundation at the Clinton Presidential Center in Little Rock, Arkansas. The foundation focuses on health issues, particularly HIV and AIDS; economic empowerment; leadership development and citizen service; and reconciliation of racial, ethnic, and religious differences. In 2004, following the tsunami that devastated Indonesia and other countries along the Indian Ocean, Clinton teamed with former president George H. W. Bush in an international fund-raising drive. In 2005 Clinton established the Clinton Global Initiative, which brings together governments, private corporations, and nonprofit organizations to address such problems as poverty and pollution. That same year he and Bush teamed up again to raise money for the victims of Hurricane Katrina, which ravaged the Gulf Coast of the United States.

How Impeachment Works

Impeachment is a procedure for removing a public official from office because of misconduct. The U.S. Constitution provides for impeachment in Article 2, Section 4, which reads, “The President, Vice President and all civil Officers of the United States, shall be removed from Office on Impeachment for, and Conviction of, Treason, Bribery, or other high Crimes and Misdemeanors.” Although impeachment is commonly associated with presidents, it also applies to other officials, such as federal judges and cabinet members. The procedure is not applicable to members of Congress; the House and Senate have their own rules for dealing with misconduct by their members.

Impeachment begins in the House of Representatives. According to Article 1, Section 2 of the Constitution, “The House of Representatives shall choose their Speaker and other Officers; and shall have the sole Power of Impeachment.” It is considered the power to indict, similar to what a grand jury does in criminal law. The House Judiciary Committee examines the charges and evidence and draws up specific indictments called Articles of Impeachment. All House members then vote on each individual article; a simple majority—more than 50 percent—is required to obtain impeachment.

If the House votes for impeachment, the U.S. Senate then becomes the courtroom. According to Article 1, Section 3, Clause 6 of the U.S. Constitution, “The Senate shall have the sole Power to try all Impeachments. When sitting for that Purpose, they shall be on Oath or Affirmation. When the President of the United States is tried, the Chief Justice shall preside: And no Person shall be convicted without the Concurrence of two thirds of the Members present.”

The penalty for conviction is prescribed in Article 1, Section 3, Clause 7 of the Constitution: “Judgment in Cases of Impeachment shall not extend further than to removal from Office, and disqualification to hold and enjoy any Office of honor, Trust or Profit under the United States: but the Party convicted shall nevertheless be liable and subject to Indictment, Trial, Judgment and Punishment, according to Law.” In other words, an official who is convicted and removed from office can still face criminal prosecution for the same conduct.

Ross Perot

Ross Perot (1930–), a Texas businessman and billionaire, ran for president in 1992 and 1996 as an independent, third-party candidate. His image as a straight-talking, no-nonsense, political outsider proved popular with voters who were disenchanted with the two major parties. He garnered 19 percent of the popular vote in 1992 and about 9 percent in 1996, but received no electoral-college votes in either election. He founded a new party—the Reform Party—for his second campaign.

Early Life and Career

Henry Ross Perot, born into a middle-class family in Texarkana, Texas, graduated from the Naval Academy in 1953 and served in the Navy for four years. He married Margot Birmingham, and they had five children. After leaving the Navy, Perot had a successful career as a salesman for IBM Corporation. In 1962 he started his own company, Electronic Data Systems (EDS), which he built into a highly profitable technology-services firm. Perot sold the company in 1984 for $2.5 billion. Several years later he founded a similar—and successful—company called Perot Systems Corporation.

During the early 1970s EDS acquired a multimillion-dollar contract in Iran. By the end of the decade that country was torn by revolution, and most EDS personnel had been evacuated. Two EDS executives who remained behind were seized and imprisoned by the Iranian government. When diplomatic efforts failed to free the men, Perot secretly financed a multimillion-dollar rescue attempt led by a retired military officer and staffed by EDS volunteers. The team entered Iran illegally and joined with revolutionaries to storm the jail and release all the prisoners. The U.S. contingent managed to get out of the country undetected. The event was dramatized in the 1983 book “On Wings of Eagles,” which was made into a television miniseries in 1986. Both helped make Perot a household name and a popular hero.

Political Experience

Although he would later claim to be a political outsider, Perot met with President Richard Nixon (1913–1994) on several occasions and enjoyed close ties with his administration. Depending on the situation, Nixon’s aides viewed Perot as a useful and wealthy ally or an unpredictable pest. He funded some projects deemed helpful to Nixon’s image, but often defied directives from his White House contacts and demanded favors from them to further his business interests.

In 1969 Perot formed a citizens’ action group called United We Stand, which was devoted to soldiers who were prisoners of war (POWs) in Vietnam or considered missing in action (MIAs). With the secret blessing of the White House, he funded and led a humanitarian mission in which several planes were loaded with food, medicines, and presents for POWs held in North Vietnam. The North Vietnamese refused to let the planes land in their country or accept the shipments. Perot hopscotched around Asia, using the trip to get international press coverage for the POWs. On several occasions he paid the expenses of wives of POWs who traveled to Paris, the site of ongoing peace talks. After the war ended, POWs maintained that Perot’s widely publicized efforts helped achieve better treatment for them while they were in captivity. Perot became very popular with military and veterans’ groups. In 1979 Perot privately funded several unsuccessful covert operations to rescue Americans held hostage in Iran.

During the 1980s Perot earned praise for heading Texas committees that overhauled the state’s drug-abuse laws and school system. He was also a trusted adviser to President Ronald Reagan (1911–2004), who appointed Perot to the Foreign Intelligence Advisory Board. In 1988 Vice President George H. W. Bush (1924–) was elected president. Perot had known Bush for many years and reportedly disliked him very much. He often criticized the new president’s policies, particularly his decision to wage the Persian Gulf War in 1991. Although Perot had been a political insider during the Republican administrations of Nixon and Reagan, he turned against the party while Bush was president.

Perot’s Presidential Bids

In 1991 Perot began quietly preparing to run for president as an independent, finally announcing his intentions in February 1992 on the CNN program “Larry King Live.” It was the first of many appearances on the show, which Perot used to share his platform with the public. He cast himself as a political outsider with a common-sense plan for reform. He focused primarily on economic and foreign-trade issues, avoiding sensitive social issues that tended to polarize voters. Perot spent more than $50 million of his own money on his campaign, using such unconventional means as televised infomercials to spread his message. He got high marks in public-opinion polls—sometimes higher than his two main challengers, Bush, the Republican, and Arkansas Governor Bill Clinton (1946–), the Democrat.

During the summer his campaign stumbled. Remarks he made about gays and African-Americans caused controversy. Stories in the press painted him as a difficult and paranoid egomaniac. He accused the Republican Party of waging a “dirty tricks” campaign against him and his family. In July 1992 Perot suddenly quit the race. His stunned followers continued to work on his behalf, however, conducting petition drives that succeeded in placing his name on ballots in all fifty states.

In October 1992 Perot reentered the presidential race and performed well in televised debates with the other candidates. He garnered more than 19 million votes in the 1992 election, capturing 18.9 percent of the popular vote. It was the best showing by an independent candidate in nearly eighty years. Because he got more than 5 percent of the popular vote, Perot became eligible for matching federal campaign funds for the next presidential election. He adopted a new issue—opposition to passage of the North American Free Trade Agreement (NAFTA). NAFTA was passed by a Democratic-controlled Congress in November 1993, prompting Perot to encourage Americans to vote Republican in the 1994 midterm elections.

In 1995 Perot created the Reform Party and, in August 1996, accepted its nomination for president. His core issues were reducing the federal deficit; reforming campaign-finance laws; and opposing NAFTA. The economy was booming, and Perot’s popularity had fallen since 1992. He got approximately 9 percent of the popular vote in 1996. Perot won no electoral college votes in either presidential election.

Fading from the Political Stage

Perot’s performance in the 1996 election weakened his position in the Reform Party. Internal conflicts arose, and many of his followers left the party. In June 2000 he announced that he did not intend to run for president that year. Shortly before the election he endorsed George W. Bush (1946–), the son of his former campaign opponent. Perot then faded from the political stage, turning his attention to his business ventures and continuing his public campaign on behalf of veterans.

Third-Party Politics

Since the mid-1800s American politics have been dominated by two parties, the Republicans and the Democrats. However, many other parties—all of which are called “third” parties—have appeared from time to time.

States make it difficult for third parties to compete, imposing stringent requirements for being listed on ballots. Typically candidates must get a certain number of verifiable signatures on petitions within a certain amount of time. For example, Georgia’s procedure is considered one of the most restrictive in the nation. It requires that a petition be signed by at least 5 percent of the number of people who were eligible to vote in the previous election, and all signatures must be obtained in a 180-day period. Georgia’s requirements were upheld as constitutional by the U.S. Supreme Court in Jenness v. Fortson (1971).

Another roadblock for third-party presidential candidates involves “federal matching funds”—money provided from public funds that matches a certain amount of the money raised by candidates from private donors. The program, which is administered by the Federal Election Commission (FEC), is funded on a voluntary basis by U.S. taxpayers. Federal income-tax returns include a box that taxpayers can check to indicate they want $3 of their federal tax payment to go to the program.

The FEC designates funds for the Republican and Democratic national conventions and for the nominees of those two parties in the general presidential election. All candidates in the primary elections can request public funds to match the contributions they have received from individuals (up to $250 from each donor). However, the FEC imposes strict eligibility requirements. To qualify, a candidate must first demonstrate broad support by raising more than $5,000 in each of at least twenty states, counting only the first $250 from each donor, for a total of more than $100,000.

Third parties that received at least 5 percent of the popular vote in the previous general election are eligible for public funds for the next general election. New third parties—those participating for the first time in a general election—are not eligible. However, the FEC ruled in 1996 that even though the Reform Party did not exist in 1992, the performance by independent candidate Ross Perot (1930–) in that election qualified him for matching federal funds in 1996. The FEC did not address the issue of whether a different Reform Party candidate in 1996 would have been eligible.

Another major obstacle for third-party candidates is participation in the presidential debates held several weeks before the general election. The private Commission on Presidential Debates determines which candidates are eligible, using fairly rigorous criteria. For example, a candidate must be listed on the ballot in enough states to have a mathematically possible chance of winning the electoral-college vote. A candidate must also register the support of at least 15 percent of respondents in five national public-opinion polls selected by the commission. The commission’s selection criteria, which in 1996 turned on whether a candidate had a realistic chance of winning, kept Perot out of that year’s presidential debates.

See also The Reform Party

Hillary Clinton

Hillary Rodham Clinton (1947–) was first lady from 1993 to 2001 and was elected U.S. senator from New York in 2000 and 2006. In 2007 she announced she was a candidate for president in the 2008 election. As first lady she had unprecedented political duties. Almost immediately after he was inaugurated, her husband, President Bill Clinton (1946–), put her in charge of a task force that developed a plan for universal health insurance for the country. Although that particular plan did not pass Congress, the goal of health-care reform remained a cornerstone of her political agenda.

Early Life and Career

Hillary Rodham, the oldest child of a middle-class Chicago family, was active in student government during her high school and college years. After graduating from Wellesley College in Massachusetts she earned a law degree from Yale University, where she met her future husband. After graduation they moved to her husband’s native Arkansas. They had a daughter, Chelsea, in 1980.

Clinton worked for a law firm while her husband began his political career. In 1976 he was elected attorney general; two years later he ran for governor and won. Although defeated for re-election in 1980, he ran again in 1982 and was elected. His tenure as Arkansas governor ultimately lasted from 1983 until his successful run for president in 1992. While she was first lady of Arkansas, she focused on issues important to women and children, primarily in education and health care.

First Lady of the United States

As first lady of the United States, Clinton played an unprecedented role in her husband’s administration. Days after the president took office he placed health-care reform under her direction. She chaired the President’s Task Force on National Health Care Reform, which developed a plan for universal government-sponsored health insurance. The effort took several months and was largely conducted in secret, although many people contributed to the project. The resulting plan was more than a thousand pages long. After portions of the plan were leaked to the press, it was roundly condemned by Republicans, moderate Democrats, and key interest groups for its complexity, cost, and bureaucratic nature. Hillary Clinton was blamed for its failure to garner public or political support. Although congressional leaders of both parties proposed alternative health-insurance plans, no consensus was ever reached. For her role in the debacle, she got poor marks in public-opinion polls. Some of her critics, particularly conservative Republicans, branded her an unrealistic far-left radical.

During their second year in the White House the Clintons came under investigation for their involvement in the so-called Whitewater affair—a failed real-estate investment in Arkansas dating back to the 1970s. The Clintons had entered the business venture with Jim (1940–1998) and Susan McDougal (1955–), a couple who became the focus of federal investigations into financial crimes. The McDougals had operated a savings and loan institution during Bill Clinton’s years as governor. They were legal clients of Hillary Clinton. In 1994 Attorney General Janet Reno (1938–) appointed the first of several independent counsels to examine the Clintons’ involvement in the Whitewater venture. The investigation lasted until 2000, but found no evidence of wrongdoing by the Clintons. The McDougals and a dozen of their business associates were convicted of federal crimes.

Before and during his presidency, Bill Clinton was accused by several women of having made unwanted sexual advances toward them when he was governor of Arkansas. The Whitewater investigation inadvertently turned up evidence that the president had engaged in an extramarital affair with Monica Lewinsky (1973–), a White House intern. The first lady defended her husband against the accusations and blamed a “vast right-wing conspiracy” for plotting against him. Eventually the president admitted that he had had an “inappropriate” relationship with Lewinsky. In December 1998 he was impeached (indicted) by the U.S. House of Representatives for lying under oath and obstruction of justice. However, he was acquitted by the U.S. Senate. Hillary Clinton publicly stood by her husband throughout these scandals, which raised her stature in public-opinion polls.

Senator Clinton

In 2000 Hillary Clinton ran for the U.S. Senate from New York, winning 55 percent of the vote. During her first term she surprised political observers by taking a centrist stand on many issues. She even collaborated on legislation with Republican senators who had been among her husband’s most vocal critics when he was in the White House. In 2002 she voted in support of the proposal by President George W. Bush (1946–) to use military force against Iraq. Her vote was criticized by many liberal Democrats. In 2005 she introduced legislation known as the Count Every Vote Act, which called for several election reforms, such as requiring electronic voting machines to create verifiable paper records and making election day a national holiday. The bill died in committee; she reintroduced it two years later.

In 2006 Clinton was reelected easily, capturing 67 percent of the vote. She served on the powerful Senate Armed Services Committee, the Senate Committee on Environment and Public Works, the Senate Committee on Health, Education, Labor and Pensions, and the Senate Special Committee on Aging. She championed expansion of the Children’s Health Insurance Program (CHIP), a joint federal-state initiative that provides health insurance to children in families that make too much money to qualify for Medicaid, but not enough money to afford private health insurance. She has described CHIP as “a step” toward universal health care.

In January 2007 she announced her candidacy for president in 2008. Universal health care, removing U.S. troops from Iraq, and energy independence were among her major campaign themes.

George W. Bush

George W. Bush (1946–), the forty-third president of the United States, faced one overriding issue after his inauguration: terrorism. The terrorist attacks of September 11, 2001, set off an unprecedented chain of events, including wars in which the United States invaded Afghanistan and Iraq. Although the country rallied around Bush at first, his popularity plummeted as the death toll of U.S. troops serving in Iraq increased. Americans were shocked by military and political scandals during his administration, including the mistreatment and torture of prisoners captured during the wars.

Early Life and Public Service

Bush was born into a well-to-do family with a distinguished record in public service and the Republican Party. Bush’s grandfather represented Connecticut in the U.S. Senate during the 1950s and early 1960s. His father, George H. W. Bush (1924–), was the forty-first president of the United States.

George W. Bush attended Yale University, graduating in 1968 with a degree in history. He was a pilot in the Texas Air National Guard and then earned a master’s degree in business administration from Harvard Business School in 1975. After graduating he returned to Texas to work in the energy industry and was part owner of a professional baseball team. In 1994 he was elected governor of Texas; he was reelected four years later. As governor Bush developed a reputation as a consensus-builder, able to work with Democrats in a bipartisan fashion to achieve desired legislation.

The 2000 Election

In the 2000 presidential campaign Bush espoused a political philosophy called “compassionate conservatism” and—because he thought the presidency had been tainted by scandals during the Clinton administration—promised to bring dignity and respect back to the White House. Bush defined compassionate conservatism as follows: “It is compassionate to actively help our fellow citizens in need. It is conservative to insist on responsibility and results.” He chose as his running mate former secretary of defense Dick Cheney (1941–). Their main opponents were Democratic Vice President Al Gore Jr. (1948–) and Senator Joseph Lieberman of Connecticut (1942–).

The election proved to be extremely close and contentious. Bush was eventually declared the winner, but only after a month of legal wrangling over a vote recount in Florida. Bush received less of the popular vote (47.9 percent) than did Gore (48.4 percent), but won the election by capturing 271 electoral votes—only one more than was required. Congressional outcomes in that election were close as well. U.S. Senate elections resulted in a 50-50 split between Democrats and Republicans. The Republican Party lost seats in the U.S. House, but maintained a slim majority.

Some Democrats were bitter about the outcome of the presidential election—they believed that Gore had been denied victory unfairly—so Bush entered office in a highly partisan political climate.

9/11 and Afghanistan

Bush’s agenda changed on September 11, 2001, when terrorists commandeered four commercial airliners in the United States. Two of the planes were crashed into the twin towers of the World Trade Center in New York, leading to their collapse. A third plane struck the Pentagon. The fourth hijacked plane crashed in rural Pennsylvania after passengers stormed the cockpit. More than two thousand nine hundred people were killed. In a televised address that evening Bush promised to bring the terrorists to justice, noting “we will make no distinction between the terrorists who committed these acts and those who harbor them.”

Intelligence revealed that the hijackers were associated with the terrorist organization al-Qaeda, led by Osama bin Laden (1957–), the son of a wealthy Saudi family, and aided by the Taliban government of Afghanistan. In an address to Congress on September 20 Bush publicly demanded that the Taliban hand over bin Laden and his top lieutenants or the United States would strike. “Every nation, in every region now has a decision to make,” he said. “Either you are with us, or you are with the terrorists.”

The Taliban did not comply with Bush’s demands, so the U.S. military invaded Afghanistan to oust its government and capture bin Laden. U.S. forces encountered little organized resistance. A multinational military force authorized by the United Nations and including a significant number of U.S. troops eventually took over security responsibilities for Afghanistan. They found themselves in a lingering guerrilla-type conflict with former Taliban supporters and al-Qaeda fighters. U.S. forces did not find bin Laden.

The Case against Iraq

After the 9/11 attacks Bush called for a worldwide war against terrorism. His firm stance and quick action against Afghanistan earned the president high ratings in public-opinion polls—his approval rating soared to nearly 90 percent. However, support began to erode as bin Laden remained elusive, and the Bush administration turned its attention to Iraq.

Iraq and its leader Saddam Hussein (1937–2006) had worried the U.S. government for more than a decade. Following the Persian Gulf War in 1991 the UN Security Council called on Iraq not to acquire or develop nuclear weapons and to turn over any nuclear weapons or related equipment to the International Atomic Energy Agency (IAEA). Hussein alternately denied IAEA inspectors access to the country and then let them in; he regularly refused to abide by UN resolutions. Bush and many of his inner circle—Cheney, Secretary of Defense Donald Rumsfeld (1932–), and Deputy Secretary of Defense Paul Wolfowitz (1943–)—became convinced that Iraq was developing weapons of mass destruction. During Bush’s State of the Union address in 2002, he described Iraq as a member of an “axis of evil” in the world. In October 2002 Congress approved a resolution authorizing the use of force if Iraq failed to comply with UN Security Council resolutions regarding weapons inspections.

The Bush administration was spurred by intelligence reports (later found to be in error) that Iraq was developing nuclear weapons. Bush highlighted this intelligence in his 2003 State of the Union address. Soon afterward Secretary of State Colin Powell (1937–) tried unsuccessfully to persuade the United Nations that military action should be taken against Iraq. Only Britain pledged its full support for the U.S. war plan. In March 2003 Bush addressed the nation and presented his case for war against Iraq, noting that the intelligence gathered “leaves no doubt that the Iraq regime continues to possess and conceal some of the most lethal weapons ever devised.” He gave Hussein and his sons forty-eight hours to leave Iraq or face military action. The president promised the Iraqi people that the United States would help them build “a new Iraq that is prosperous and free.”

The War in Iraq

On March 20, 2003, U.S.-led forces launched a massive bombing campaign against Iraq, followed by a ground invasion. By early April U.S. troops had captured Baghdad, Iraq’s capital, and British troops occupied much of southern Iraq. Initial jubilation by some Iraqi citizens turned to dismay when U.S. troops could not maintain order, and the massive looting and lawlessness that followed were downplayed by Bush administration officials. On May 1, 2003, Bush announced that major combat operations in Iraq were over. Before the end of the year most former members of Iraq’s regime had been captured or killed. Hussein was found hiding in a pit and turned over to Iraq’s new government to stand trial for war crimes. He was eventually convicted and hanged.

Bush’s declaration of an official end of combat operations did not bring peace and security to Iraq. U.S. and British forces soon faced deadly attacks from insurgents determined to drive them from their country. Meanwhile, after a careful search of the country, no weapons of mass destruction were discovered.

In early 2004 reports began appearing in the media about the abuse of prisoners at the Abu Ghraib detention facility operated by the U.S. military in Iraq. These reports included digital photos taken by U.S. soldiers that showed prisoners enduring humiliating and abusive treatment. The news shocked Americans and resulted in apologies from Bush and Rumsfeld.

The 2004 Elections

Bush began his 2004 election campaign in a relatively strong political position. The economy had weathered dips caused by the collapse of Internet stocks and by the 9/11 terrorist attacks. The Economic Growth and Tax Relief Reconciliation Act (2001) implemented the tax cuts he had promised. Education reform was also under way, led by the No Child Left Behind Act (2002). The Medicare Prescription Drug, Improvement and Modernization Act (2003) created a new prescription-drug plan for seniors. Foreign affairs and the war on terror dominated the 2004 campaign. Bush was subject to some criticism for his handling of the war as commander in chief, particularly regarding the growing insurgency in Iraq and the Abu Ghraib scandal, but he was able to counter with the fact that no new terrorist acts had occurred on U.S. soil.

Bush and Cheney were reelected with 51 percent of the popular vote, capturing 286 of the 538 electoral votes. Their Democratic opponents were Massachusetts Senator John Kerry (1943–) and North Carolina Senator John Edwards (1963–). Congressional Republicans built on gains they had made during the 2002 midterm elections. In 2004 they maintained their majorities in both houses of Congress.

The Second Term

Bush’s political fortunes fell quickly in his second term. In 2005 Tom DeLay (1947–), a Republican representative from Texas and a close Bush ally, was indicted by a grand jury for his role in funneling corporate contributions to Republican candidates for state office. One of DeLay’s associates, lobbyist Jack Abramoff (1959–), was indicted for fraud in a separate investigation, which grew to expose a massive scandal involving influence peddling. Then in August Hurricane Katrina hit the Gulf Coast. Floodwater breached weakened levees, and the city of New Orleans was flooded. The federal government’s response to the disaster was widely criticized as slow and inadequate.

In Iraq insurgent attacks increased dramatically between 2004 and 2006. Violence between Iraq’s two major religious sects—Sunni Muslims and Shiite Muslims—threatened to escalate into civil war. U.S. military forces suffered increasing numbers of casualties from insurgent attacks as they trained Iraqi army and police forces. By mid 2007 more than three thousand U.S. military personnel had died in Iraq, and thousands more had been injured. Meanwhile there was an upswing in militant violence in Afghanistan, particularly in the southern part of the country.

In 2005 a commission created by Bush to investigate America’s prewar intelligence about Iraq issued its final report, and it was extremely critical. The commission said that U.S. intelligence agents had relied on false information from unreliable informants and poor data sources, such as forged documents. As important, the commission said, was a reluctance among intelligence analysts to accept any evidence that challenged their preconceived notions about Iraq. The commission found that Iraq had had no active program to develop or acquire weapons of mass destruction since the Persian Gulf War. The findings embarrassed the administration, for it had used the intelligence extensively as it pressed for the invasion of Iraq in 2003. Bush’s critics began to ask if the administration had knowingly used faulty intelligence to make its case for the war.

The 2006 Midterm Elections

All of these issues had a dramatic effect on the 2006 midterm elections. Democratic candidates received widespread support from voters dissatisfied with the war in Iraq and government scandals. Democrats seized majority control in both houses of Congress (two of the newly elected senators were independents who aligned with the Democratic Party). California Representative Nancy Pelosi (1940–) became the first woman in history to be speaker of the House.

The day after the election Bush accepted Rumsfeld’s resignation and appointed Robert Gates (1943–), who had earlier run the Central Intelligence Agency, to take his place as secretary of defense. Bush stated, “I recognize that many Americans voted last night to register their displeasure with the lack of progress being made in Iraq.” Democratic leaders began calling for a timetable for withdrawal of U.S. troops from Iraq.

The Bush Presidency

As of mid 2007, the Bush presidency had been defined by lingering problems in Iraq and relatively good economic conditions at home. Deficit spending had soared, however—the war in Iraq had been particularly expensive, as had the new Medicare prescription-drug plan and emergency spending because of Hurricane Katrina. When Bush entered office the federal budget was in surplus; then, in each fiscal year from 2003 through 2005, the federal deficit exceeded $300 billion. It dipped below $250 billion for fiscal year 2006. (The fiscal year runs from October 1 through September 30.) In February 2007 Bush proposed a long-term plan to reduce the federal deficit gradually each year and achieve a balanced budget by fiscal year 2012. The plan assumed that Congress would reduce spending on domestic programs and that U.S. expenses in Iraq would decline dramatically by 2010. Critics doubted that those two goals would be realized. Bush’s approval rating dropped to near 30 percent, one of the lowest levels ever recorded for a second-term president.

Dick Cheney

Dick Cheney (1941–) was elected vice president on the 2000 Republican ticket with President George W. Bush (1946–). He played a prominent role in the administration, particularly in making the case for war with Iraq.

Early Life and Career

Cheney, who was raised in Casper, Wyoming, graduated from the University of Wyoming and began his political career with various positions in the administration of Republican President Richard Nixon (1913–1994). During the mid 1970s Cheney served as White House chief of staff for Republican President Gerald Ford (1913–2006). In 1978 he was elected to the U.S. House of Representatives from Wyoming; he was reelected five times. In 1988 he became minority whip, a House leadership position that made him responsible for mobilizing votes within the Republican Party on key issues.

Secretary of Defense

When President George H. W. Bush (1924–) named him secretary of defense in 1989, Cheney began streamlining the U.S. military into a smaller, more mobile force suitable for regional conflicts; the effort was precipitated partly by budget deficits but mostly by the breakup of the Soviet Union, which effectively ended the Cold War. Cheney recommended General Colin Powell (1937–) to be chairman of the Joint Chiefs of Staff, the first African-American to hold that position.

In 1990, when the Iraqi military invaded Kuwait, Cheney obtained permission from King Fahd (1921–2005) of Saudi Arabia to amass U.S. troops on Saudi soil—an operation named Desert Shield. The United Nations Security Council subsequently set a deadline of January 15, 1991, for Iraq to withdraw from Kuwait or be forcibly expelled. Bush formed an international coalition of military forces to impose the Security Council resolution. Operation Desert Storm began two days after the deadline passed with air strikes on Iraqi forces in Kuwait and Iraq. A ground offensive followed and within days had pushed Iraqi troops out of Kuwait. Cheney and Powell were widely credited as the architects of the successful Persian Gulf War. In 1991 Cheney was awarded the Presidential Medal of Freedom for his role.

Vice President

In 1995 Cheney became president and chief executive officer of Halliburton, a Texas-based company that provides products and services for oil and gas exploration, development, and production. In 2000 George W. Bush, then governor of Texas, chose Cheney as his vice presidential running mate. The two narrowly won the election against their Democratic challengers, Al Gore Jr. (1948–), a senator from Tennessee, and Joseph Lieberman (1942–), a senator from Connecticut.

The terrorist attacks of September 11, 2001, occurred less than a year later and thrust Cheney into a major role in administration strategy. According to some observers, he became obsessed with the idea that Iraq was involved in the attacks and was amassing weapons of mass destruction. Like Secretary of Defense Donald Rumsfeld (1932–) and Deputy Secretary of Defense Paul Wolfowitz (1943–)—the three had known each other for years—Cheney urged an aggressive military approach toward Iraq. The nation’s first response to the terrorist attacks was a military operation in Afghanistan that ousted the Taliban government. That effort was followed by a push for an invasion of Iraq. Using reports by intelligence agencies, the administration claimed that Iraq was in collusion with terrorists and was amassing weapons of mass destruction. However, the head of the Central Intelligence Agency at the time, George Tenet (1953–), later claimed that the decision to invade Iraq was made without “serious debate” by the Bush administration. The allegation that Iraq colluded with terrorists was later found to be inaccurate, and the weapons of mass destruction were never found.

In 2003 the United States and a handful of allies—chiefly Britain—began an aerial and ground assault on Iraq. The operation was successful at first. Iraqi leader Saddam Hussein (1937–2006) was ousted from power and a new government was installed. However, as allied forces tried to rebuild the country a fierce insurgency erupted and widened into a deadly struggle between Iraqi factions divided by religious and political differences.

Americans who became disillusioned with the war placed much of the blame on Cheney for advocating the invasion so forcefully. In 2005 his chief of staff, I. Lewis “Scooter” Libby (1950–), was indicted by a federal grand jury in connection with the leak of a CIA agent’s identity to the press. The leak was considered an attempt to undermine reports that the administration knowingly used faulty intelligence about Iraq’s efforts to develop nuclear weapons. In 2007 Libby was convicted of perjury, making false statements, and obstruction of justice for his conduct during the leak investigation. Cheney steadfastly denied knowing about the leak. At the same time questions of conflict of interest arose when Cheney’s former employer, Halliburton, was awarded a multibillion-dollar government contract to participate in the rebuilding of Iraq.

Cheney, who has had four heart attacks, said early in the Bush administration that he had no interest in running for president when Bush left office.

Tom DeLay

Tom DeLay (1947–) was a Republican representative from Texas from 1985 until 2006, when he resigned from Congress after being indicted on state charges related to campaign contributions; the indictment had already forced him to give up his post as majority leader.

Background

DeLay had a long career in public service, starting in 1979 when he was elected to the Texas House of Representatives. In 1984 he was elected to the U.S. House and reelected ten times. DeLay and Newt Gingrich (1943–), a Republican representative from Georgia, were the principal architects behind the so-called Republican Revolution of 1994, which gave the party majority control of the House for the first time in four decades. DeLay, who was nicknamed “the Hammer,” wielded great power in his party, serving as majority whip (1995–2003) and majority leader (2003–2005).

The Charges

In 2005 DeLay was indicted by a Texas grand jury for his role in a scheme to funnel corporate contributions to Republican candidates in his state. According to the indictment, an organization called Texans for a Republican Majority Political Action Committee (TRMPAC) accepted nearly $200,000 in donations from corporate donors in 2002 and transferred it to the Republican National State Elections Committee—an arm of the Republican National Committee—which, in turn, distributed it among Republican candidates running for the Texas House. It is illegal in Texas to use corporate funds in state election campaigns.

DeLay helped organize and fund TRMPAC, served on its advisory board in 2001 and 2002, and participated in its fund-raisers. However, he denied being involved in TRMPAC’s day-to-day operations and knowing about the transactions outlined in the indictment. As required by House rules, DeLay stepped down from his position as majority leader after he was indicted.

DeLay’s Resignation

DeLay ran for re-election in 2006 and easily won the primary against his Republican competitors. By that time a number of his associates, including lobbyist Jack Abramoff (1959–), had been indicted in a separate corruption investigation. Only months before the general election, DeLay resigned from the House and abandoned his re-election campaign. However, DeLay’s name remained on the ballot for the general election because state officials ruled it was too late to change the ballot. The Texas Republican Party challenged that decision in state and federal courts, but was unsuccessful. The party called upon Republicans in DeLay’s congressional district to vote for a write-in Republican candidate, but she was defeated by a Democratic challenger, Nick Lampson (1945–).

After he resigned, DeLay wrote No Retreat, No Surrender: One American’s Fight, which was released in early 2007. In the book he criticized many of his former allies in the Republican Party and continued to declare his innocence. As of mid 2007 he was still awaiting trial on charges of money laundering.

Gerrymandering?

At the heart of the indictments that led to the resignation of Republican representative Tom DeLay (1947–) of Texas lies a battle over congressional redistricting for partisan political gain—a practice known as gerrymandering.

As directed by the U.S. Constitution, the nation is divided into districts, each of which has one representative in the U.S. House. The number of congressional districts in each state is based on population (determined by the U.S. Census, which is conducted every ten years). As the nation’s population grew, so did the number of congressional districts. In 1911 the U.S. Congress limited the number of districts to 435. That required apportionment—the distribution of those 435 seats among the fifty states according to population. States with smaller populations, such as Alaska and Delaware, have fewer House seats than states with large populations, such as California and Texas. Within each state the congressional districts are also apportioned according to population and are typically redrawn after each national census. Following the 2000 census each congressional district represented, on average, 646,952 people.
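The article does not spell out how those 435 seats are actually divided among the states, but since 1941 Congress has used the “method of equal proportions” (the Huntington-Hill method), and the 646,952 figure above is simply the total apportionment population divided by 435. The sketch below, using invented populations for three hypothetical states, illustrates the idea; it is an illustration only, not official census code.

```python
import heapq
from math import sqrt

def apportion(populations, total_seats=435):
    """Distribute House seats by the method of equal proportions
    (Huntington-Hill), used by Congress since 1941.  `populations`
    maps a state's name to its apportionment population."""
    # The Constitution guarantees every state at least one seat.
    seats = {state: 1 for state in populations}
    # Priority for a state's second seat: population / sqrt(1 * 2).
    heap = [(-pop / sqrt(1 * 2), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    # Hand out the remaining seats one at a time to the highest priority.
    for _ in range(total_seats - len(populations)):
        _, state = heapq.heappop(heap)
        seats[state] += 1
        n = seats[state]
        heapq.heappush(heap, (-populations[state] / sqrt(n * (n + 1)), state))
    return seats

# Invented populations for a three-state example, not census figures.
print(apportion({"A": 9_000_000, "B": 3_000_000, "C": 1_000_000}, total_seats=13))
# -> {'A': 9, 'B': 3, 'C': 1}
```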

Most states allow their state legislatures to conduct redistricting; many observers say that practice encourages gerrymandering to benefit the party that controls the legislature. For example, politicians may choose to redraw districts so that most voters of the opposite party live in just a few of the state’s districts. Alternatively they may spread their opponents across many districts to dilute their voting power. Yet another possibility is racial gerrymandering, in which congressional districts are redrawn to affect the voting power of minority populations within a state.
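As a purely hypothetical illustration of why line-drawing matters (none of the numbers below come from any real map), the following sketch shows how the same voters can yield very different seat counts depending on how precincts are grouped into districts.

```python
def district_totals(district):
    """Sum Party A and Party B votes across the precincts in a district."""
    a = sum(p[0] for p in district)
    b = sum(p[1] for p in district)
    return a, b

def seats_for_a(plan):
    """Count the districts in a plan that Party A carries outright."""
    wins = 0
    for district in plan:
        a, b = district_totals(district)
        wins += a > b
    return wins

# Ten invented precincts of 100 voters each: four are heavily Party A,
# six lean Party B.  Statewide, Party A holds 63 percent of the vote.
heavy_a = [(90, 10)] * 4
lean_b = [(45, 55)] * 6

# Plan 1 spreads Party A's strongholds across districts.
plan_1 = [[heavy_a[i], lean_b[i]] for i in range(4)] + [[lean_b[4], lean_b[5]]]

# Plan 2 "packs" Party A's voters into two overwhelming districts,
# leaving Party B a majority in the other three.
plan_2 = [heavy_a[0:2], heavy_a[2:4], lean_b[0:2], lean_b[2:4], lean_b[4:6]]

print("Plan 1:", seats_for_a(plan_1), "of 5 seats for Party A")   # 4 of 5
print("Plan 2:", seats_for_a(plan_2), "of 5 seats for Party A")   # 2 of 5
```

In the second plan the majority party’s supporters are concentrated into two lopsided districts, leaving it short of a majority everywhere else even though the statewide vote totals are identical.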

Redistricting to dilute or concentrate votes is unconstitutional because it violates the Equal Protection Clause of the Fourteenth Amendment. However, it is difficult to prove in court.

While the indictment gave few details, analysts believe that the purpose of the money-laundering scheme linked to DeLay was to elect enough Republicans to the Texas House in 2002 so the party would have majority control. It could then conduct a congressional redistricting favorable to the Republican Party. In 2002 the Republicans did win majority control of the Texas House—for the first time in 130 years—and did perform a congressional redistricting. In the 2004 election the Republican Party gained six U.S. House seats from Texas.

The redistricting was challenged in court. However, in League of United Latin American Citizens v. Perry (2006) the U.S. Supreme Court upheld the redistricting, except for one district in which the court believed racial gerrymandering had taken place. Approximately one hundred thousand Hispanic voters had been moved from a district with a Republican representative to a new district. The court also ruled that states can legally redistrict between national censuses, meaning that state legislatures could redistrict every time the majority party shifts.

Political Parties, Platforms, and Key Issues

Supply-Side Economics

Supply-side economics is a theory that advocates stimulating the efforts of businesses and entrepreneurs in order to achieve overall growth in the nation’s economy. Embraced by the Reagan administration (1981–1989), supply-side policies contrasted with traditional economic practices, which tended to encourage consumer spending—the demand side. The methods used to implement the policy were reductions in tax rates, deregulation of industries, and lowered trade barriers.

Stagflation and “Reaganomics”

When President Ronald Reagan (1911–2004) took office in January 1981 the economy was in a recession, or slowdown, and unemployment was high. Inflation was also very high, meaning that prices were rising quickly. This unusual combination of economic troubles is called stagflation. Reagan asserted that the cure for stagflation was an economic policy that directly helped entrepreneurs and producers of goods and services. In a 1988 speech he said, “God did give mankind virtually unlimited gifts to invent, produce, and create. And for that reason alone, it would be wrong for governments to devise a tax structure or economic system that suppresses and denies those gifts.” Reagan’s economic philosophy—it became known as “Reaganomics”—was influenced and aided by several of his contemporaries, including Jack Kemp (1935–), a Republican representative from New York; Bill Roth (1921–2003), a Republican senator from Delaware; economists Arthur Laffer (1940–) and Robert Mundell (1932–); and Jude Wanniski (1936–2005), a Wall Street Journal writer who is credited with coining the phrase “supply-side economics.”

Reagan and his advisers argued that if entrepreneurs and producers—generally the wealthiest segment of the populace—were less burdened by taxes and regulations, they would produce and invest more. The benefits, they claimed, would “trickle down” to individuals in the lower income levels. Critics derided this theory as a political ploy to help the rich get richer while the poor got poorer.

Marginal Tax Rates

To implement the policy Reagan’s team advocated a radical change in income tax brackets. Income was—and is—taxed at different rates depending on its amount. The tax rate paid on the highest dollar earned is called the marginal tax rate. Supply-side economists of the 1980s believed that marginal tax rates should be lowered to encourage people to earn more. For a simplified example, consider a person making $50,000 per year in taxable income. Imagine the first $10,000 is taxed at 15 percent; income from $10,001 to $50,000 is taxed at 30 percent; and any income over $50,000 is taxed at 70 percent. A supply-side advocate would argue that this person has little incentive to earn more than $50,000 because the government will take seventy cents of every additional dollar earned.
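A minimal sketch of this arithmetic, using the hypothetical brackets from the example above (not the actual tax law of the period):

```python
# Hypothetical brackets from the example above: (upper limit, rate).
BRACKETS = [(10_000, 0.15), (50_000, 0.30), (float("inf"), 0.70)]

def income_tax(income, brackets=BRACKETS):
    """Tax owed under a progressive schedule: each slice of income
    is taxed at the rate of the bracket it falls in."""
    tax, lower = 0.0, 0
    for upper, rate in brackets:
        if income <= lower:
            break
        tax += (min(income, upper) - lower) * rate
        lower = upper
    return tax

print(income_tax(50_000))                                  # 1,500 + 12,000 = 13,500
# The marginal rate is the tax owed on one additional dollar of income:
print(round(income_tax(50_001) - income_tax(50_000), 2))   # 0.7 -- seventy cents
```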

In 1977 Kemp and Roth sponsored a bill to cut federal tax rates. It was rejected as inflationary by the Carter administration, but resurrected with Reagan’s blessing in the Economic Recovery Tax Act (ERTA) of 1981. ERTA reduced the highest tax rate from 70 percent to 50 percent and lowered the other rates by varying percentage points. The lowest tax rate, 14 percent, was decreased to 11 percent.

ERTA was supposed to reduce the marginal tax rates without substantially reducing the amount of tax revenue taken in by the federal government. Supply-side proponents believed that people—particularly those in the highest tax brackets—would be inspired by cuts in the marginal tax rate to earn more income. In the most optimistic scenario enough “new” income would be earned at the lower tax rates to offset the loss to the government of income taxes that would have been paid at the higher tax rates.
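The break-even arithmetic behind that optimistic scenario is simple: revenue from a bracket equals the rate times the income reported in it, so cutting a rate from 70 percent to 50 percent requires reported income in that bracket to grow by 40 percent just to keep revenue unchanged. The single-bracket sketch below uses an invented income figure purely for illustration.

```python
# Single-bracket break-even check: revenue = rate * reported income.
# The rates mirror the top-bracket change described above; the income
# figure is invented for illustration.
old_rate, new_rate = 0.70, 0.50
reported_income = 1_000_000
old_revenue = old_rate * reported_income
# Income taxed at the new rate must reach this level to match old revenue.
required_income = old_revenue / new_rate
print(f"{required_income / reported_income - 1:.0%}")   # 40% growth needed
```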

Another consideration in the debate over tax brackets was the amount of money that people declared as income. Supply-side theorists asserted that the nation’s wealthiest people were avoiding high marginal tax rates by putting some of their income into tax havens in other countries or by using tax loopholes. Lowering the marginal tax rates, they predicted, would encourage the wealthy to report—and pay taxes on—more of their income.

The capital-gains tax is a tax paid on the profit made from selling an investment, such as stocks, bonds, or real estate. At the federal level capital-gains taxes are tied to marginal tax rates, so lowering the marginal tax rates serves to lower capital-gains tax rates. Supply-side supporters knew that it was primarily the wealthy who incurred capital gains and paid taxes on them. Therefore, the economists argued that lowering the capital-gains tax rates would encourage more investing, which would provide a boost to the economy.

Another benefit of reducing the marginal tax rates, they said, was the elimination of “bracket creep.” Wages and salaries had been increasing in response to rising prices during the 1970s, pushing many taxpayers into higher tax brackets. While their buying power had stayed about the same, taxpayers had less to spend because their income was taxed at a higher rate. A reduction of the marginal tax rates was expected to help taxpayers across the spectrum by correcting their shift into higher brackets.
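To make the mechanism concrete, here is a stylized calculation (all figures invented, reusing the hypothetical brackets from the earlier example): a raise that merely keeps pace with 10 percent inflation pushes part of a salary into a higher bracket, so real after-tax income falls even though real pre-tax income is unchanged.

```python
# Stylized bracket-creep calculation with invented numbers.  The bracket
# thresholds stay fixed while both wages and prices rise 10 percent.
def tax(income):
    # Hypothetical rates: 15% to $10,000, 30% to $50,000, 70% above.
    brackets = [(10_000, 0.15), (50_000, 0.30), (float("inf"), 0.70)]
    owed, lower = 0.0, 0
    for upper, rate in brackets:
        if income <= lower:
            break
        owed += (min(income, upper) - lower) * rate
        lower = upper
    return owed

salary, raise_factor, inflation = 48_000, 1.10, 1.10
before = salary - tax(salary)                                     # after-tax income now
after = (salary * raise_factor - tax(salary * raise_factor)) / inflation
print(round(before), round(after))   # real after-tax income falls: 35100 -> 33945
```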

ERTA phased in tax-rate changes gradually and included provisions that lowered corporate taxes and estate and gift taxes. The Tax Reform Act of 1986 lowered marginal tax rates even further—for example, the highest rate was decreased from 50 percent to 28 percent.

Deregulation and Free Trade

Two other components of supply-side economics, deregulation and free trade, were also championed by the Reagan administration. President Jimmy Carter (1924–) had started deregulation, which removes or reduces governmental restraints on industry sectors. In the Reagan years it was expanded to cover more businesses, particularly in the financial and telecommunications industries. Supply-side theorists believed that reducing or eliminating government regulations, which often required considerable expenditures by industries, would allow them to spend more on expanding their businesses. That, in turn, would help the economy overall. Likewise, trade barriers, such as high tariffs on imported goods, were viewed as excessive burdens on private industry. During his 1980 presidential campaign Reagan promoted free trade between the United States and Mexico. He signed the U.S.-Canada Free Trade Agreement in 1988 and supported the international talks that eventually produced the World Trade Organization, which has overseen multinational trade agreements since 1995. He also vetoed bills that would have impeded textile imports to the United States.

Praise and Criticism

By 1989 the nation’s economy was doing much better; unemployment and inflation were down substantially. Economists had mixed opinions on the role supply-side economics had played in this turnaround. Some believed that the cuts in marginal tax rates inspired investment by the wealthiest sectors, which helped jumpstart the economy. Others believed that a recovery was going to happen anyway and that the changes in the tax code unfairly burdened the poorest taxpayers. They pointed to data indicating that people in the lowest tax brackets wound up paying a higher, rather than a lower, percentage of their income in taxes after the tax code was changed. The new tax laws, they noted, not only changed the tax brackets but also reduced the tax exemptions and deductions that were most utilized by lower-income Americans. Other analysts pointed to actions by the Federal Reserve, the nation’s central bank, which operates independently of congressional and presidential control. In the 1980s it had a new chairman, Paul Volcker (1927–), who had been appointed by Carter. Some economists thought his actions, which affected both the amount of money circulating in the United States and interest rates, may have been crucial factors in the economic upturn.

When the tax changes were first proposed, critics of supply-side economics claimed that the loss of tax revenue would force the government to borrow a lot of money, which would increase the budget deficit and the national debt. Deficit spending was high during the Reagan administration, which seemed to validate their opinion. However, other factors can be seen at work: for example, the government maintained a high defense budget as a tactic in the Cold War with the Soviet Union, which increased federal spending. Reagan also refused to make major cuts in some of the most expensive social programs, such as Social Security.

See also Ronald Reagan

The Contract with America

The Contract with America was a set of political promises made by the Republican Party six weeks before the 1994 elections. The party pledged to make specific economic and social reforms if Republican candidates were elected in sufficient numbers to control Congress. The voters responded positively, and the Republican-dominated House followed through on nearly all of the contract’s promises. Many of the reforms did not survive Senate scrutiny or presidential veto; nevertheless, the contract is remembered as a bold political move and a symbol of the Republican Party’s resurgence during the 1990s.

Political Background

In 1994 the Democratic Party had controlled the House of Representatives for forty years. A Democratic president—Bill Clinton (1946–)—had been in office for two years. Congressional scandals and public disenchantment with Clinton’s early political agenda presented a positive climate for Republican candidates as the November 1994 elections approached. With great fanfare the party unveiled the Contract with America on the steps of the U.S. Capitol on September 28, 1994. Signed by 350 Republican candidates, the contract promised a balanced federal budget, numerous tax cuts, a crackdown on crime, greater government accountability, and many other reforms.

Republicans did well in the election, seizing majority control of both houses of Congress. Newt Gingrich (1943–), a Republican representative from Georgia, was elected speaker of the House. One of the coauthors of the contract, he was considered its chief champion.

The Terms of the Contract

The Contract with America had two major components—eight reforms the Republican majority promised to launch on its first day in session and ten bills to be introduced during the first one hundred days.

The specific promises were:

  1. Members of Congress would be subject to all laws applicable to the rest of the people.
  2. A comprehensive audit of Congress would be conducted to detect “waste, fraud or abuse.”
  3. The number of House committees would be reduced, as would the number of people staffing the committees.
  4. The terms of all committee chairs would be limited.
  5. Committee members would be banned from casting proxy votes.
  6. All committee meetings would be open to the public.
  7. A three-fifths majority vote would be needed to pass a tax increase in the House.
  8. The House would implement zero base-line budgeting to provide “an honest accounting” of the nation’s finances.

The ten pieces of legislation in the Contract with America covered a variety of economic, social, and legal issues. The Fiscal Responsibility Act called for a balanced federal budget, tax limits, and a presidential line-item veto. The Taking Back Our Streets Act was a comprehensive crime-and-punishment package. The Personal Responsibility Act overhauled the welfare system. The Family Reinforcement Act covered child support, adoption, education, child pornography, and elderly dependents. The American Dream Restoration Act promised tax cuts and reforms. The National Security Restoration Act pledged that U.S. troops would not serve under the command of the United Nations and called for greater spending on national security. The Senior Citizens Fairness Act made changes in the Social Security program. The Job Creation and Wage Enhancement Act contained a variety of incentives for small businesses. The Common Sense Legal Reform Act put limits on lawsuits. The Citizen Legislature Act implemented term limits on certain “career politicians.”

The Contract with America also included a promise that the budget would be cut sufficiently to ensure that implementation of these laws would not increase the federal deficit.

The Outcome

The House delivered on all but one of the promises contained in the Contract with America. A vote to amend the U.S. Constitution to implement term limits on certain politicians did not pass. However, many of the specific measures passed by the House did not make it through the Senate unscathed or were vetoed by Clinton. The measures that eventually became law in some form included a tax credit for families that adopt children; tax cuts for businesses; a line-item veto (later ruled unconstitutional by the U.S. Supreme Court); congressional accountability; health insurance reforms; tax deductions for long-term care insurance; restrictions on lobbyists; and welfare reform.

Government Shutdowns

Many operations of federal agencies were forced to cease on two occasions in 1995 and 1996 because the Republican-controlled Congress and Democratic President Bill Clinton (1946–) failed to agree on spending legislation. These “government shutdowns” highlighted the deep division between the legislative majority and the executive branch on the financial priorities of the country—in particular, how best to deal with federal deficits, which occur when the government spends more money than it takes in.

Appropriations Acts

By federal law the president must submit a proposed budget to Congress by the first Monday in February prior to the start of the next fiscal year. (The federal fiscal year begins on October 1 and runs through September 30.) Congress considers the president’s proposed budget and passes a series of appropriations acts, each of which authorizes funding for one or more federal agencies. These acts must be passed by October 1 so the agencies can continue to operate. Since the late 1800s passage of all appropriations acts by the deadline has rarely been achieved. To provide funding past October 1, “continuing resolutions” have been passed by both houses of Congress and signed by the president.

A “funding gap” occurs if neither an appropriations act nor a continuing resolution is in place. A funding gap can also occur if a continuing resolution expires and is not replaced with a new one.

The History of Funding Gaps

Funding gaps were first reported in the 1880s. For nearly a century they were very short affairs that were resolved quickly. This began to change during the 1970s as disagreements about spending priorities grew sharper. Between 1976 and 1979 there were several funding gaps, one of which exceeded two weeks. At that time funding gaps did not have a major effect on government operations: the affected federal agencies continued to operate—they minimized their spending activities—despite the presence in federal code of the Antideficiency Act. It began as a simple statute in 1870 and evolved into a complex law covering the government’s actions during a funding gap.

In an opinion released in 1980 Benjamin Civiletti (1935–), who was attorney general in the Carter administration, said that the Antideficiency Act required “nonessential” government operations to cease completely when a funding gap occurs. This opinion achieved dramatic results: there was no funding gap that year. During the following decade the funding gaps that did occur lasted only a day or two. From 1991 through 1994 there were no funding gaps at all.

The 1995 and 1996 Funding Gaps

When the elections of 1994 gave the Republican Party majority control of Congress, a contentious relationship quickly developed between powerful Republican lawmakers and the Democratic president, which hindered agreement on appropriations acts. In late September 1995 a continuing resolution was passed to fund the government from October 1 through November 13. A second continuing resolution passed Congress, but was vetoed by the president, triggering a funding gap that began on Tuesday, November 14, 1995, and lasted through the weekend. On November 20 a continuing resolution was enacted to fund the government through December 15. It expired and was not replaced. The funding gap that began on December 15 lasted for more than three weeks—the longest in history. It was resolved on January 6, 1996, by a new continuing resolution—the first of several passed in 1996 until a final budget agreement was reached.

During the two funding gaps, parts of the federal government closed down. Approximately eight hundred thousand employees were put on temporary furlough during the first shutdown. Far fewer people—around two hundred eighty thousand—were furloughed during the second shutdown because some funding bills did get passed. National parks, museums, and monuments around the country closed during the shutdowns, greatly affecting tourism. Thousands of applications for passports and visas were not processed. Many other government services were delayed or slowed. Operations deemed “essential” did not cease, however, including the military and federal law enforcement agencies, mail delivery, and the processing of payments to existing recipients of Social Security and Medicare.

The Budget Battle

The Republican budget plan called for a balanced federal budget within seven years through cuts in the Medicare, Medicaid, and welfare programs. It also called for tax cuts. This approach was championed by Kansas Senator Bob Dole (1924–), the majority leader of the Senate, and Georgia Representative Newt Gingrich (1943–), the speaker of the House. The president refused to support the plan, claiming that it cut too much funding from social, educational, and environmental programs. A very public feud was conducted in the media, with each side blaming the other for the impasse.

The battle took an interesting twist after Clinton, Dole, and Gingrich traveled with many other U.S. politicians to Israel on November 6, 1995, for the funeral of Israeli Prime Minister Yitzhak Rabin (1922–1995). Afterward Gingrich complained that he and Dole had been snubbed by Clinton during the trip—he said they had had to exit the plane by the rear door instead of the front door—and that Clinton could have engaged them in dialogue on the budget during the long flights to and from Israel, but chose not to do so. The incident, Gingrich told the press, caused him to take a harder line in his budget negotiations with the White House. Critics lambasted Gingrich, calling him a “crybaby.” The event helped turn public opinion against the Republican position. Congressional phone lines were flooded with complaints. The budget impasse was resolved, and the government resumed normal operations.

The Reform Party

The Reform Party is a political organization created in 1995 by Texas billionaire Ross Perot (1930–) to further his campaign for president. Perot was unsuccessful in his bid in 1996, and the party was subsequently torn apart by conflicts between his followers and those of two party newcomers—Jesse Ventura (1951–), who won the Minnesota governorship in 1998, and Pat Buchanan (1938–), who ran unsuccessfully for president in 2000.

The Perot Years

Perot made a surprisingly strong showing as an independent candidate in the 1992 presidential election, garnering about 19 percent of the popular vote to finish in third place. Three years later he inspired his followers to found a new political party. Hurried petition drives got the party certified and on the presidential ballot in as many states as possible. In November 1995 the Reform Party received its first official recognition from the state of California. The following summer it held its first national convention.

On August 18, 1996, Perot accepted the party’s nomination for president. Because of his performance in the 1992 race, he was eligible for millions of dollars in matching campaign funds from the federal government. Perot’s 1996 platform called for reducing the federal deficit, reforming campaign financing, and opposing the North American Free Trade Agreement (NAFTA). Perot captured about 9 percent of the vote in 1996—enough to secure matching campaign funds for the party for the next presidential election, but not enough to get on the ballot in some states (many states have a 10 percent threshold).

Deep Divisions Arise

In 1998 the party’s Minnesota chapter helped Ventura, a former professional wrestler and mayor of Brooklyn Park, Minnesota, win the governor’s office. Ventura resented attempts by the national leadership to take credit for his victory and distanced himself from the party. He became more disillusioned in 1999 when Buchanan left the Republican Party for the Reform Party and made clear his intentions to run for president in 2000. Ventura and his followers left the Reform Party soon afterward. A court battle for party leadership ensued between the Perot and Buchanan factions. When Buchanan’s camp emerged victorious, Perot and his followers also abandoned the Reform Party.

In the 2000 presidential election Buchanan, as the Reform Party candidate, garnered 0.4 percent of the popular vote. His showing resulted in the loss of ballot access in many states and no chance for matching federal campaign funds for the 2004 election. In 2002 many of Buchanan’s followers left the Reform Party for the Constitution Party and the newly formed America First Party. The Reform Party was severely crippled by these defections. In 2004 it did not field a candidate for president; instead, many of its remaining chapters endorsed independent candidate Ralph Nader (1934–).

See also Ross Perot

The Green Party

The Green Party is a political party that rose to national prominence by fielding consumer advocate Ralph Nader (1934–) as a candidate in the 1996 and 2000 presidential elections. The Green Party began in the United States as an informal offshoot of the European Greens, a federation of European political parties devoted to environmental issues and social justice. The Green Party of the United States made a respectable showing in the 2000 presidential election, but has concentrated since that time on local and state political contests.

A “Grassroots” Party

The Green Party began in the United States with organizations at the local and state level. This “grassroots” origin became one of its greatest recruiting points. Gradually two factions emerged within the party—one moderate and one more leftist in its political philosophy. The latter faction began calling itself The Greens/Green Party USA and splintered away from the main group.

In 1996 the Association of State Green Parties (ASGP) formed to consolidate the power of the moderate state and local chapters and recruited Nader to run for president. The ASGP managed to get Nader on the ballot in twenty-two states, despite its raising only a few thousand dollars in campaign financing. Nader garnered slightly more than seven hundred thousand votes, representing less than 1 percent of the popular vote. The ASGP capitalized on its experiences in the first election and conducted a much more sophisticated campaign for Nader in 2000. He received nearly 2.9 million votes (2.7 percent of the popular vote) to come in third, after the Republican candidate, Texas Governor George W. Bush (1946–), and the Democratic candidate, Vice President Al Gore (1948–). Gore’s camp complained bitterly that Nader’s entry into the race diverted votes from their candidate and gave the win to Bush.

In 2001 the ASGP became the Green Party of the United States and was officially recognized by the Federal Election Commission. That recognition allowed the party to accept much higher campaign contributions from individual donors. After a split with Nader, the party fielded a virtually unknown candidate—David Cobb (1962–), a Texas lawyer—in the 2004 presidential election. Cobb received less than 0.1 percent of the popular vote.

The Green Party Platform

At its convention in 2000 the Green Party ratified “ten key values” that represented its priorities and goals: grassroots democracy; social justice and equal opportunity; ecological wisdom; nonviolence; decentralization of wealth and power; community-based economics and economic justice; feminism and gender equity; respect for diversity; personal and global responsibility; and future focus and sustainability, especially in regard to natural resources, economic development, and fiscal policies.

The Post-Nader Party

Following its split with Nader, the Green Party faded from the national political scene. In 2007 the party reported that more than two hundred members held elected local offices in more than two dozen states, mostly in California, Pennsylvania, and Wisconsin. Because the Green Party failed to garner at least 5 percent of the popular vote in the 2004 presidential election, it was rendered ineligible for matching campaign funds from the federal government for the 2008 presidential election.

Ralph Nader—Shades of Green

Early in his career Ralph Nader (1934–), a graduate of Princeton University and Harvard Law School, became active in politics and developed an interest in public safety and health issues. In 1965 he wrote Unsafe at Any Speed: The Designed-In Dangers of the American Automobile, a book that catapulted him to fame and triggered major changes in federal oversight of the automobile industry. During the following decades Nader’s outspoken activism earned him a reputation as a consumer advocate fighting corporate interests. In 1992 he ran for president as an independent write-in candidate. Nader was critical of both major political parties, claiming there was virtually no difference between them.

In early 1996 Green Party leaders persuaded Nader to run for president on their ticket in California. Greens in other states seized this opportunity to gain national recognition for the party. Although Nader never officially joined the Green Party, he identified with many of its political goals. Despite his late entry into the race and lack of campaigning, Nader garnered enough votes to encourage a second run on the Green ticket. His 2000 presidential campaign raised more than $4 million and received endorsements from Hollywood celebrities and other public figures. However, a furor arose among his liberal supporters because the final election result was so close: he was accused of siphoning votes away from Democrat Al Gore (1948–) and helping propel Republican George W. Bush (1946–) into the White House. Nader adamantly disagreed, saying that the Gore campaign had been lackluster and failed of its own accord.

In 2004 Nader chose Green activist Peter Camejo (1939–) as his vice presidential running mate and hoped to win the endorsement of the Green Party; however, he was disappointed at its national convention. The Green Party chose two of its members—attorney David Cobb (1962–) for president and businesswoman Pat LaMarche (1960–) for vice president. Nader ran as an independent, garnering slightly more than four hundred thousand votes—less than 0.4 percent of the popular vote. Nader did not receive any electoral votes in any of the presidential elections in which he participated.

See also Ralph Nader

Current Events and Social Movements

The AIDS Crisis

The AIDS (Acquired Immunodeficiency Syndrome) crisis began in the United States in 1981 when an unknown, potentially fatal disease was diagnosed in previously healthy people. Initially, AIDS was found mostly in gay men and those who injected illegal drugs, giving it a social and political stigma. The government’s response in the early years was lackluster, reflecting indifference or ignorance about the seriousness of the threat. As the death rate from AIDS increased, a kind of public hysteria developed, particularly as the disease began to spread into more mainstream populations. By the mid 1990s new medications had turned AIDS into a chronic—and in many cases manageable—disease in the United States and other wealthy nations. Public concern shifted to the toll that AIDS was taking in Africa and other parts of the developing world. The U.S. government pledged billions of dollars worldwide in the fight against AIDS.

AIDS in the 1980s

In 1981 doctors began reporting that they had diagnosed dozens of young, previously healthy, gay men with rare illnesses that usually afflict only people with severely weakened immune systems. Public health officials and medical specialists suspected that an unidentified infectious agent was to blame. They initially called it gay-related immune deficiency syndrome (GRIDS); the name was changed in 1982 to acquired immunodeficiency syndrome (AIDS) to reflect occurrences outside the gay community, primarily in people who injected drugs and recent Haitian immigrants to the United States. More than four hundred cases were reported in 1982, and more than one hundred fifty people died from the disease. Although media coverage made the public aware of the disease, its association with homosexuality and drug use limited concern about its spread in the general population. Some commentators claimed those infected with AIDS were to blame because they engaged in what was seen as risky and immoral behavior.

Public concern rose dramatically when medical authorities began to report cases of AIDS in heterosexual women and people who had received blood transfusions. Fears grew about the safety of the nation’s blood supply and about contracting the disease through casual contact. Although government researchers assured the public that the disease was spread through sexual activity, the sharing of needles, and blood transfers, public anxiety about AIDS and discrimination against those who had been diagnosed with AIDS became serious problems. By mid 1984 nearly five thousand Americans had been diagnosed with the disease, and more than 75 percent of them had died. That year brought the first major breakthrough when scientists identified the infectious agent that causes AIDS: the human immunodeficiency virus (HIV). The first drug to treat AIDS, AZT (zidovudine), was approved a few years later, in 1987; it did not provide a cure, but it did slow the progress of the disease in those who were infected.

When actor Rock Hudson (1925–1985) died from AIDS in 1985—he was the first major public figure whose death was attributed to the disease—the news shocked the public and increased its anxiety. Then in Russiaville, Indiana, Ryan White (1971–1990), a thirteen-year-old boy, was barred from his school after he contracted AIDS from a tainted blood product. White suffered from hemophilia, a blood disease that is treated with frequent blood transfusions. The boy took classes via a special telephone hookup while his family fought the school’s decision for more than a year. After officials relented and readmitted the boy, he attended school for only one day before being barred again because of a lawsuit filed by a group of parents. The suit was later dropped. In 1987 White’s family relocated to another town where the boy attended school without incident. However, his cause had attracted widespread attention, and he became a symbol of the nation’s fear and hostility toward people with AIDS. He was befriended by major celebrities, featured on the cover of national magazines, and spoke frequently on television news shows. White died from AIDS in 1990 at age eighteen. More than fifteen hundred people attended his funeral, including first lady Barbara Bush (1925–).

In 1989 the government reported that more than one hundred fourteen thousand Americans had been diagnosed with AIDS since the disease was first discovered. Nearly fifteen hundred of the cases were in infants younger than age five who had been infected while still in the womb or while being breastfed.

The Government’s Initial Response

The first medical reports about the mysterious new illness stirred action by the nation’s public health professionals. At the federal level this effort was waged primarily by the National Institutes of Health (NIH) and the Centers for Disease Control and Prevention (CDC). Both are agencies of the Department of Health and Human Services (HHS). The NIH immediately accepted AIDS patients into its clinical center and began processing requests from researchers for public grant money to study the disease. The CDC took primary responsibility for investigating AIDS outbreaks and identifying epidemiological factors, that is, the factors that influence the spread and control of the disease. In 1981 HHS spent $200,000 on AIDS research; in 1989 expenditures exceeded $1 billion.

The AIDS crisis began during the first year of the presidency of Ronald Reagan (1911–2004). He had been elected in 1980 on a popular mandate of deregulation, lowered taxes, and less government intervention in the lives of Americans. Reagan was a conservative Republican, and many of his supporters were members of the Moral Majority, a religious/political movement that considered homosexuality and drug use to be sinful. The more vocal of its members publicly hinted that AIDS was God’s punishment for immoral behavior. AIDS received little attention from the White House during the early years of the epidemic. The president did not speak publicly about AIDS during this time, and an official policy was not developed for dealing with the health crisis.

In 1981 Reagan appointed C. Everett Koop (1916–) to be the nation’s surgeon general. Koop watched uneasily as the epidemic worsened with no reaction from the White House. He later claimed that he was prevented access to the president by Reagan’s inner circle. Finally in 1986 Reagan asked Koop to write a report for the American people about AIDS. Three million copies of the sixteen-page report were distributed. The report was so frank in its discussion of sexuality and human anatomy that it caused outrage in conservative circles.

At a meeting of the president’s cabinet in 1987 Koop played a major role in developing an official government policy on the AIDS crisis. He fought against mandatory testing, fearing that it would drive possible AIDS patients underground and out of reach of the public health system. In 1988 he convinced the Senate to fund the mailing of an AIDS brochure to more than one hundred million American households—the largest government mailing ever conducted. By that time it was estimated that more than one million Americans were infected with HIV. In 1989 Congress created the National Commission on AIDS to advise legislators and the president on national AIDS policy. The statute authorizing the commission expired in 1993.

The AIDS Crisis in the 1990s

Republican President George H. W. Bush (1924–) did not speak publicly about AIDS until early 1990, nearly two years into his term. In 1991 the National Commission on AIDS issued a report, “America Living with AIDS,” that was sharply critical of the response of government and society to the health crisis. The commission complained that “the country has responded with indifference” and warned that “soon everyone will know someone who has died of AIDS.” Only months after the report was issued, professional basketball player Earvin “Magic” Johnson (1959–) announced that he was infected with HIV. The revelation from a popular sports hero and self-described heterosexual stunned the country. Johnson quit basketball and was named by Bush to the National Commission on AIDS. He resigned less than a year later, complaining that Bush had “dropped the ball” in the nation’s fight against AIDS.

When Democrat Bill Clinton (1946–) became president in 1993, he promised to increase funding for AIDS research and make the epidemic a top priority of his administration. He held the first White House Conference on HIV and AIDS and created the White House Office of National AIDS Policy and the Presidential Advisory Council on HIV and AIDS. Government spending on AIDS research, prevention, and treatment exceeded $4 billion per year by the end of the decade.

Through the early 1990s the annual death toll from AIDS continued to rise. There were more than fifty thousand deaths per year by 1995. That same year the Food and Drug Administration approved newly developed drugs called protease inhibitors that had a dramatic effect on the AIDS epidemic. In 1996 the death rate dropped for the first time since the epidemic began and continued to decline as the decade progressed. A new AIDS drug regimen—nicknamed a “cocktail” because it mixes several potent drugs—slowed or even stopped multiplication of the virus within the body and strengthened the immune system so it could better fight off infection. The advent of the drug cocktails turned AIDS from a disease that was nearly always fatal into one that was survivable in many cases.

At the end of 1999 the CDC reported that more than four hundred thousand Americans were living with HIV or AIDS. During the 1990s the epidemic began to affect increasingly larger percentages of African-Americans, Hispanics, and women.

The AIDS Crisis in the 2000s

During the early 2000s the number of people living with HIV or AIDS continued to increase, but the number of new cases reported each year decreased slightly. By 2005 the total number of Americans who had ever had HIV/AIDS was approaching one million. Approximately half of them had died. The number of HIV/AIDS deaths in 2005 (the most recent year for which data are available) was about seventeen thousand—approximately one-third of what it was a decade earlier.

The epidemic continues to spread in many developing nations, particularly in Africa, where life-saving drugs are not as widely available. In 2006 the United Nations estimated that as many as forty-seven million people worldwide were afflicted with AIDS and that between three million and six million more were becoming infected every year.

In 2003 President George W. Bush (1946–) pledged $15 billion over five years for the care and treatment of people with HIV/AIDS and prevention of the spread of the disease in developing countries. In 2007 he doubled the pledge. The initiative has been criticized because the money comes with strings attached: the prevention program relies on an “ABC strategy,” in which A, B, and C stand for “abstinence,” “be faithful,” and “condom use.” Strict requirements were established concerning the proportion of funds that can be allocated to each element of the program. Too much focus is placed on the A and B elements, critics say, when widespread and correct use of condoms could be much more effective in stopping the spread of HIV/AIDS.

The CDC

The Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia, plays a major role in the U.S. public health and security system. The CDC evolved from a World War II agency, Malaria Control in War Areas, which had been set up because malaria posed a considerable threat at military training bases in the southern United States and U.S. territories in tropical regions. On July 1, 1946, the Communicable Disease Center was created to continue and expand upon the work of the wartime agency. Its focus became the study of many diseases to determine their incidence and distribution and to find ways to control and prevent those diseases.

The center made a name for itself during the 1950s by training “disease detectives” who investigated mysterious illnesses and tracked down their causes. They also collected data leading to the development of national flu vaccines. Throughout the 1960s the center’s responsibilities were expanded to cover chronic diseases; venereal diseases; tuberculosis; nutrition; occupational safety and health; immunizations; and family planning. It also recorded data on diseases that originated outside the United States. In 1970 its name was changed to the Center for Disease Control. Over the following two decades the center played a key role in determining the causes of emerging health threats, including Legionnaires’ disease, a severe bacterial infection that usually leads to pneumonia—it got its name after an outbreak at an American Legion convention in 1976.

The CDC’s image was severely damaged in the early 1970s when the media reported on the agency’s role in a public health study, begun forty years earlier in Tuskegee, Alabama, that traced the progression of syphilis in hundreds of poor, uneducated African-American men. Government doctors did not treat the men with penicillin, even though it had become the drug of choice for curing the disease in the 1940s. Americans were outraged when the details came to light. The victims and their families received a $10 million settlement from the government. In 1997 President Bill Clinton (1946–) issued an official apology, noting, “I am sorry that your federal government orchestrated a study so clearly racist.”

The center’s image was further tarnished in the 1970s when it orchestrated a nationwide effort to vaccinate Americans against swine flu. The agency acted after a soldier in New Jersey died from influenza. Government researchers believed the strain of influenza was very similar to the one that caused the flu epidemic of 1918 in which millions of people died worldwide. The vaccination effort proved counterproductive, however, when hundreds of people became ill and twenty-five died from a side-effect of the vaccine. The vaccination program was ended prematurely, and the swine flu epidemic never materialized.

Despite these lapses, the CDC remains a leading force in disease identification and containment. The CDC’s techniques using public health networks and data to track and monitor the progress of diseases and epidemics—known as surveillance—are world-renowned, and played a key role in the eradication of smallpox in the 1960s and 1970s and the identification of AIDS in the early 1980s. Thanks to the CDC’s efforts, many outbreaks of food-borne illness are traced back to their source within days of the initial infection. In 2006 the CDC celebrated its sixtieth anniversary in government service.

The Iran-Contra Scandal

The Iran-Contra scandal, which erupted during the administration of Republican President Ronald Reagan (1911–2004), involved an elaborate, secret scheme in which administration officials sold weapons to Iran and used some of the profits to fund a counterrevolution in Nicaragua. The arms sales to Iran were intended to curry favor with moderate elements within the Iranian government in hopes that they could help secure the release of Americans held hostage in Lebanon. The counterrevolution in Nicaragua—led by forces known as Contras—worked to oust the communist-leaning government controlled by the Sandinistas, a leftist political party. The Sandinistas had earlier overthrown the government of Anastasio Somoza Debayle (1925–1980), whose family had ruled Nicaragua since the 1930s. Freeing American hostages and halting the spread of communism were both high priorities for Reagan; however, he denied having direct involvement in any illegal activities undertaken by high-ranking members of his government. Although criminal charges were lodged against more than a dozen officials, some were later dropped for technical reasons and others led to pardons granted by President George H. W. Bush (1924–), who was Reagan’s successor.

The Scandal Unfolds

The scandal first came to light in the fall of 1986 when two seemingly unrelated events occurred. The Nicaraguan government shot down a U.S. cargo plane filled with military supplies. The lone survivor of the crash admitted that he worked for the Central Intelligence Agency (CIA). Less than a month later a Lebanese publication claimed that the United States had been selling arms to Iran. U.S. Department of Justice officials learned that some of the money obtained from the Iranian arms sales had been diverted to support the Contras in Nicaragua. Attorney General Edwin Meese (1931–) called for an independent counsel to be appointed to investigate. Hearings into the matter, which lasted for nearly a year, revealed that the operation had been carried out by National Security Council (NSC) staff members who believed they had the blessing of the president. The NSC is part of the Executive Office of the President and provides advice on national security issues.

The independent counsel concluded that the arms sales to Iran certainly violated U.S. policy and possibly violated the Arms Export Control Act of 1976. U.S.-Iranian relations had been tense since the late 1970s, when U.S. embassy personnel in Iran were held hostage for more than a year. Many top officials, including Reagan, admitted knowing about the arms sales; however, legal experts disagreed about whether the sales violated U.S. law. Providing U.S. financial support to the Contras was expressly forbidden by the Boland Amendment, which Congress attached to a defense-appropriations bill. Reagan signed the law in December 1982.

The Convictions and Pardons

The independent counsel brought charges against more than a dozen people and obtained eleven convictions; the charges included conspiracy, obstruction of justice, perjury, defrauding the government, and altering or destroying evidence. Among those charged for their roles were NSC adviser John Poindexter (1936–); NSC staff member Oliver North (1943–), a lieutenant colonel in the U.S. Marine Corps; Richard Secord (1932–), a retired U.S. Air Force major general; Albert Hakim (1937–2003), an Iranian-born American citizen and business partner of Secord; NSC adviser Robert McFarlane (1937–); and Secretary of Defense Caspar Weinberger (1917–2006). The convictions of North and Poindexter were overturned on appeal because of technicalities. The charges against one defendant were dropped after the Reagan administration refused to release classified information relevant to the case.

On December 24, 1992, Bush—who had been vice president under Reagan—issued pardons to six people charged or convicted in the affair, among them Weinberger, McFarlane, and three CIA officials. The pardons were controversial because Bush was in the last weeks of his presidency, having been defeated for re-election the month before. Only five of the original defendants were sentenced for their crimes. Four of them (including Secord and Hakim) received probation and fines; the fifth served sixteen months in prison for income-tax fraud.

The Political Effect

When it first became public the Iran-Contra scandal seemed to pose a serious threat to Reagan’s presidency. However, Reagan weathered the controversy because he was not directly implicated in any criminal activity. The independent counsel did complain that Reagan had shown “disregard” for laws intended to curb his presidential powers and had given his advisers the impression that he tacitly approved of their actions in the affair.

The Savings and Loan Crisis

The savings and loan (S&L) crisis occurred during the late 1980s, shortly after the S&Ls were deregulated by the government. The companies made so many unwise loans and poor business decisions that the government had to supply billions of dollars to prevent the industry from collapsing.

The Crisis Develops

S&Ls first appeared in the United States in the 1800s. Their original purpose was to provide mortgage loans to working-class people not typically served by conventional banks. U.S. government officials saw them as useful to the economy because S&Ls helped the housing industry and promoted home ownership. During the early 1980s the government began loosening regulations on the institutions so they could provide more services and expand their customer base. The S&Ls had been weakened by competition from banks and by the poor economy during the 1970s.

The Depository Institutions Deregulation and Monetary Control Act of 1980 and the Garn-St. Germain Act of 1982 were parts of an overall government move during the late 1970s and early 1980s to lessen regulatory oversight on major industries. Deregulation was championed by the administrations of Jimmy Carter (1924–), a Democrat, and Ronald Reagan (1911–2004), a Republican.

The Failures Begin

In 1984 a large S&L in Texas failed. Authorities discovered that the institution had been making high-risk loans and engaging in criminal activities. The following year Ohio temporarily closed its privately insured S&Ls after deposit runs threatened to exhaust the private fund that insured them. The state allowed a few to reopen after they obtained deposit insurance from the federal government. Months later a similar situation occurred with S&Ls in Maryland. During the late 1980s a number of financially troubled S&Ls around the country were allowed to continue operating despite reporting major losses. This problem was particularly acute in Texas, which was suffering a statewide recession because of low prices for crude oil.

In January 1987 a government report showed that the Federal Savings and Loan Insurance Corporation (FSLIC), which insured deposits in S&Ls, was in severe financial stress. Months later the failure of an S&L in California resulted in losses of more than $2 billion. The institution was operated by Charles Keating (1923–), who had made substantial campaign contributions to five senators—Alan Cranston (1914–2000), a Democrat from California; Dennis DeConcini (1937—), a Democrat from Arizona; John Glenn (1921–), a Democrat from Ohio; John McCain (1936–), a Republican from Arizona; and Donald Riegle (1938–), a Democrat from Michigan. Congress investigated the senators—the so-called Keating Five—after it was learned that they had questioned the chairman of the Federal Home Loan Bank Board about the appropriateness of investigating Keating’s S&L. Although the Keating Five were chastised by Congress for their actions, no criminal activity was uncovered.

When George H. W. Bush (1924–) took office in 1989, he crafted a bailout plan for the S&L industry. The Financial Institutions Reform, Recovery and Enforcement Act of 1989 abolished the Federal Home Loan Bank Board and the FSLIC and gave regulatory authority for S&Ls to a newly created Office of Thrift Supervision. In addition, deposit insurance responsibility was shifted to the Federal Deposit Insurance Corporation, which provides deposit insurance for conventional banks.

More than one thousand S&Ls closed during the crisis. Because deposit-insurance reserves were insufficient to cover their losses, approximately $124 billion of taxpayer money was required to back up the commitments of the failed institutions.

The Persian Gulf War

The Persian Gulf War began in August 1990 with the invasion of Kuwait by Iraq and ended in February 1991 after a U.S.-led coalition sanctioned by the United Nations drove Iraqi forces out of Kuwait. As the first major war waged by the United States since the Vietnam era, the Persian Gulf War highlighted U.S. military strengths and the use of modern technology in warfare.

The March to War

In August 1990 Iraqi military forces invaded Kuwait after a dispute arose over an oil field near the border between the two countries. Iraq’s action was protested by other countries in the region and around the world. The UN Security Council responded with resolutions that condemned the invasion and imposed economic sanctions on Iraq. In November 1990 the council adopted Resolution 678, which authorized nations to “use all necessary means” after January 15, 1991, if Iraq did not withdraw from Kuwait.

Meanwhile the administration of Republican President George H. W. Bush (1924–) assembled a coalition of dozens of nations willing to cooperate with the United States in the use of military force against Iraq. Some, like Britain and France, committed troops to the effort, while most others pledged money or equipment. The United States was concerned that Middle Eastern oil supplies would be disrupted and that Iraq might invade Saudi Arabia. The U.S. government obtained permission from King Fahd (1921–2005) of Saudi Arabia to amass U.S. troops on Saudi soil to prevent such an incursion.

On January 12, 1991, Congress authorized the use of U.S. military force against Iraq to enforce UN Resolution 678. The vote was not overwhelming: in the House, 250 to 183; in the Senate, 52 to 47.

The Air and Ground Offensive

On January 17, 1991, Operation Desert Storm began with a massive bombing campaign by the coalition against Iraqi targets and forces. Iraq responded by launching Scud missiles into Israel and Saudi Arabia. Israel was eager to strike back at Iraq for the Scud attacks, but Bush persuaded it not to retaliate because the coalition included many Arab countries whose leaders would have balked at Israeli participation.

On February 24, 1991, the ground offensive began. Coalition troops swept into Kuwait and within days had routed the Iraqi military and driven it back across the border into Iraq. As they retreated the Iraqis set fire to many of Kuwait’s oil wells, creating huge clouds of noxious smoke, which became an environmental disaster. On February 28, 1991, Bush declared an end to Desert Storm. The United States lost 382 troops in the Persian Gulf War, of which 147 were killed in action.

Reflections on the War

The Persian Gulf War was the largest operation conducted by U.S. troops since the early 1970s. The nation’s military showed off its new capabilities with so-called “smart” weapons, such as laser-guided missiles and bombs. The war was followed closely by television viewers around the world, who watched much of the action unfold live before their eyes. The success of the military operation temporarily enhanced U.S. public opinion of Bush; however, questions would linger about the appropriateness of leaving Iraqi dictator Saddam Hussein (1937–2006) in power.

UN Security Council Resolution 687 set forth the terms of the cease-fire that ended the war. It required Iraq to destroy any chemical or biological weapons in its possession and prohibited Iraq from acquiring or developing nuclear weapons. It also required Iraq to admit international inspection teams to verify compliance with the resolution. Over the following decade those inspections were at times allowed and at other times abruptly stopped by Hussein. Experts repeatedly charged that he was attempting to build nuclear weapons and refused inspections so he could hide his clandestine efforts; ultimately those charges led to war with the United States in 2003.

The Internet

The Internet is a worldwide computer network infrastructure that has provided arguably the greatest advance in communications technology since the invention of the radio. The U.S. government spearheaded the creation of the Internet in the late 1960s, and by the mid-1990s it had made the transition from obscure government and academic research tool to full-fledged consumer utility. The Internet has changed the way that personal communications and commerce are transacted throughout the world. In the areas of government and politics it has facilitated grassroots organization of political causes, enhanced fundraising efforts, and expanded the dissemination of campaign literature and propaganda. In the United States, there has also been substantial debate about regulating the Internet, primarily focused on the balance between free speech and “indecent” words and images that are transmitted over this medium.

The History of the Internet

In 1957 the Soviet Union launched the unmanned satellite Sputnik into the Earth’s orbit, bringing the Cold War to a new level. Many of the same technologies that were used to put Sputnik in orbit could be used to create an intercontinental ballistic missile, which would allow the Soviets to launch a nuclear attack against any point on the globe—including the United States. Sputnik convinced the U.S. government that there was an urgent need to close the “science gap” between the two superpowers, and as a result the Advanced Research Projects Agency (ARPA) was formed within the Department of Defense.

J. C. R. Licklider (1915–1990), the first head of ARPA’s Information Processing Techniques Office, proposed interconnecting the computer networks of various ARPA projects throughout the United States. He envisioned a massive network of computers that would put vast amounts of information at the fingertips of anyone connected to it. In the 1960s it was determined that this project, called ARPAnet, would use digital packet-switching technology, in which data are broken into small, individually addressed packets that are routed independently and reassembled at their destination, as opposed to the circuit-switching technology then used for telephonic communication. In 1974 Vinton Cerf (1943–) and Robert E. Kahn (1938–) developed a new method for computers to connect to the packet-switched network, coining the term “Internet” in the process. Although commercial use of packet-switched network technology started that year, it would take almost twenty years—and the widespread prevalence of the personal computer—for the Internet to reach the mainstream.
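
As a purely illustrative sketch of the packet-switching idea (invented example code, not anything drawn from ARPAnet or modern Internet software), a message can be broken into numbered packets that may travel separately and then be reassembled in order at the destination:

    # Illustrative only: real packet-switched networks add addressing, routing,
    # error checking, and retransmission on top of this basic idea.

    def split_into_packets(message, size=8):
        """Break a message into (sequence number, chunk) packets of at most `size` bytes."""
        data = message.encode("utf-8")
        return [(seq, data[start:start + size])
                for seq, start in enumerate(range(0, len(data), size))]

    def reassemble(packets):
        """Sort packets by sequence number and rebuild the original message."""
        ordered = sorted(packets)  # packets may arrive out of order
        return b"".join(chunk for _, chunk in ordered).decode("utf-8")

    packets = split_into_packets("Packets may take different routes to the same place.")
    shuffled = list(reversed(packets))  # simulate out-of-order arrival
    assert reassemble(shuffled) == "Packets may take different routes to the same place."

Circuit switching, by contrast, reserves a single dedicated path for the entire conversation, the way a traditional telephone call does.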

The Role of the Internet in Political Campaigns and Grassroots Organization

While certain uses of the Internet, such as electronic mail and commerce, were adopted quickly by the population at large in the 1990s, political parties were slow to embrace the medium. The November 1999 World Trade Organization (WTO) conference in Seattle, Washington, offered a glimpse of the Internet’s potential as a tool in grassroots organization, as protesters from different interest groups and countries coordinated the efforts of large groups of people in unexpectedly aggressive demonstrations against the WTO and globalization policies.

Full utilization of the Internet in major U.S. political campaigns did not occur until the 2004 presidential election, spurred by a new awareness of the Internet’s capabilities as a fundraising tool. The prodigious fundraising of former Vermont governor Howard Dean (1948–) in preparation for the 2004 Democratic Party primaries was ascribed to his campaign’s efficient use of technology, including a personal blog—an Internet journal—maintained by the candidate himself. Dean’s experience showed that the Internet was a good tool for collecting small donations from a large number of Americans. Other candidates, including John Kerry (1943–), the candidate who eventually beat Dean for the Democratic nomination, have since emulated Dean’s example.

The emerging role of the Internet in politics has not been limited to official campaign activities. In September 2004 a network television report accusing President George W. Bush (1946–) of being absent without leave from his service with the Texas Air National Guard during the Vietnam War was retracted after multiple political bloggers contested the authenticity of documents presented in the course of the report. This was hailed as an example of new media, amateur reporters on the Internet, trumping the power of the established media—the news division of one of America’s oldest and largest television networks.

The Internet has also decreased the cost of creating and disseminating political propaganda and advertising, often through the “viral video” phenomenon, in which a video clip hosted by a free public Web site is disseminated by the recommendations of viewers to their friends and contacts. In a nation where millions of people have the ability to record photos or video using their mobile phones, any embarrassing moment in public for a person in office or seeking office can easily be recorded and broadcast around the world. For example, Senator George Allen (1952–) of Virginia lost a closely contested election in 2006 after a video of him using a racial slur against an opposition campaign worker was widely viewed on the Internet. In addition to capturing spontaneous blunders such as Allen’s for posterity, viral video is also being used to disseminate low-cost political attack ads, sometimes produced by private citizens, often without the “official” approval of a candidate or political campaign.

Regulating the Internet

Although the Internet started as a U.S. government project, there has long been debate over the extent, if any, to which the federal government should regulate expression and transactions online. Much of the legislation aimed at regulating the Internet has focused on security concerns, such as identity theft, fraud, and electronic terrorism (or “cyberterrorism”). A 2004 survey of 269 companies found that they had lost $144 million to computer security failures, the worst of which were computer viruses and worms, followed by “denial of service” attacks. One difficulty in enforcing the law online is the Internet’s international nature—many computer crimes are carried out from overseas, particularly from Asia and Eastern Europe, outside the jurisdiction of American law enforcement.

A secondary focus of American attempts to regulate the Internet has been the attempt to curtail vice, particularly pornography and online gambling—two industries that quickly and vigorously established their presences on the Internet. Jurisdiction has played a role in attempts to regulate these industries, as Internet gambling operations, in particular, have taken care to establish themselves outside of U.S. jurisdiction. Nonetheless, the government has attempted to reduce the populace’s access to indecent or pornographic material, through the use of blocking software in schools and libraries. While there is a general consensus in favor of protecting children from the online activities of sexual predators, there is considerable controversy about the use of blocking software, since many of the same programs which block access to pornographic Web sites also block students’ access to medical information about birth control and abortion.

Global Warming

Global warming is the accelerated rise in the Earth’s average temperature over the past few decades. Most scientists agree that this warming trend is the result of human activities, such as the burning of coal and oil, that load the atmosphere with carbon dioxide and other heat-trapping gases. The temperature increase has caused changes in some ecosystems around the world and, if it is not reversed, is expected to lead to major climatic changes in the future. In response to global warming many of the world’s nations have pledged to abide by the Kyoto Protocol—an international agreement that set target levels for emissions of heat-trapping gases. The U.S. government did not accept the Kyoto Protocol, arguing that it unfairly excuses developing countries, such as China and India, from meeting emissions limits and places an undue economic burden on the United States.

The Scientific Background

Radiation from the sun passes through Earth’s atmosphere and warms the planet. The planet then re-radiates some of that energy as infrared radiation; part of it escapes into outer space, but part is trapped in the atmosphere. Atmospheric composition plays a major role in this phenomenon. Some gases, such as water vapor, carbon dioxide, and methane, are naturally found in the atmosphere and act to trap heat in the same way that glass panels trap heat in a greenhouse. The glass panels allow sunlight into the greenhouse, but prevent heat from escaping.

The naturally occurring greenhouse effect is necessary to provide a warm atmosphere conducive to life on Earth. Many scientists believe that the natural effect has been, and is being, augmented by the release of large amounts of “greenhouse gases” from human activities, such as burning fossil fuels (mostly coal and oil).
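
The size of that natural effect can be illustrated with a simple planetary energy balance (a rough, zero-dimensional textbook approximation, not a figure from any assessment report), which assumes Earth reflects about 30 percent of incoming sunlight and receives roughly 1,361 watts of sunlight per square meter:

    \frac{(1-\alpha)\,S}{4} = \sigma T_e^{4}
    \quad\Longrightarrow\quad
    T_e = \left[\frac{0.7 \times 1361\ \mathrm{W\,m^{-2}}}{4 \times \left(5.67\times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}\right)}\right]^{1/4}
    \approx 255\ \mathrm{K} \approx -18\ ^{\circ}\mathrm{C}

Earth’s observed average surface temperature is instead about 288 K (roughly 15 °C); the difference of about 33 °C is the warming supplied by the naturally occurring greenhouse gases described above.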

Two U.S. government agencies—the National Climatic Data Center and the Goddard Institute for Space Studies, which is part of the National Aeronautics and Space Administration (NASA)—track temperature records. According to their data the 1990s was the warmest decade of the twentieth century and the warmest decade since humans began measuring temperatures in the mid nineteenth century. The 2000s are likely to break that record—2005 was the hottest year ever reported. The next four hottest years on record have been 1998, 2002, 2003, and 2006.

Nations Respond

The first major warning about global warming came in 1979 from the World Meteorological Organization (WMO), a specialized agency of the United Nations. The WMO warned that human activities “may” cause global climate changes. In 1988 the WMO and the United Nations Environment Programme (UNEP) established the Intergovernmental Panel on Climate Change (IPCC) to assess available scientific information on climate change, estimate the expected impact of climate change, and formulate strategies for responding to the problem. The first IPCC assessment report, issued in 1990, noted several alarming trends, including rising Earth temperatures and faster melting of glaciers and sea ice. A second IPCC report, issued in 1995, said the scientific evidence suggested a human influence on global climate. Additional IPCC reports released in 2001 and 2007 reaffirmed the previous findings and predicted worldwide climate disruptions because of continued global warming.

The UN Framework Convention on Climate Change

In 1992 the United Nations crafted the United Nations Framework Convention on Climate Change—an international agreement in which countries agreed to voluntary, nonbinding reductions of greenhouse gases. The agreement was signed by more than one hundred countries, including the United States. Republican President George H. W. Bush (1924–) signed the document at the United Nations Conference on Environment and Development in Rio de Janeiro, Brazil. (The conference is often referred to as the Earth Summit or the Rio Summit.) The U.S. Senate gave its advice and consent to the agreement, and Bush ratified it in October 1992. The agreement became effective internationally two years later.

In 1997 delegates from 166 countries met in Kyoto, Japan, to negotiate specific binding targets for greenhouse-gas emissions. Officials of some developed nations, including the United States, argued that all countries should abide by emissions limits. Representatives from developing countries said industrialized nations were responsible for most global warming and therefore should bear the brunt of economic sacrifices to control it.

The delegates developed an agreement known as the Kyoto Protocol to the United Nations Framework Convention on Climate Change (or Kyoto Protocol, for short). Different targets were set for different countries, depending on their economic and social circumstances. Developing countries, such as China and India, were not required to commit to limits, but did have to develop national programs for dealing with climate change. Overall, the Kyoto Protocol was intended to reduce total greenhouse-gas emissions by at least 5 percent by 2012 compared with 1990 levels.

The Kyoto Protocol also established an emissions trading system, which allowed countries that exceeded their limits to purchase credits from countries that emitted less than they were allowed. This provision was added to satisfy members of the U.S. delegation.
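
A simplified, hypothetical illustration of that trading mechanism, using invented country names and figures rather than actual Kyoto targets, shows how a country that overshoots its cap buys the shortfall from one that undershoots:

    # Hypothetical caps and emissions, in millions of tons; not actual Kyoto data.
    caps      = {"Country A": 100, "Country B": 100}
    emissions = {"Country A": 110, "Country B": 85}

    for country, cap in caps.items():
        balance = cap - emissions[country]
        if balance < 0:
            print(country, "must buy", -balance, "million tons of emissions credits")
        else:
            print(country, "can sell up to", balance, "million tons of emissions credits")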

In 2005 the Kyoto Protocol went into effect after being ratified by the required number of countries. Ratifying entities included Canada, China, the European Union, India, Japan, and Russia. Democratic Vice President Al Gore Jr. (1948–) had signed the Kyoto Protocol on behalf of the United States in 1998; however, because the administration believed the Republican Senate would not ratify the protocol, it never submitted the treaty for a vote. As of mid 2007 the United States had still not ratified the Kyoto Protocol.

The U.S. Viewpoint

Republican President George H. W. Bush opposed precise deadlines for carbon-dioxide limits, arguing that the extent of global warming was too uncertain to justify painful economic measures. He did sign the Global Change Research Act of 1990, which authorized formation of the U.S. Global Change Research Program (USGCRP).

After Democratic President Bill Clinton (1946–) took office in 1993, the government issued The Climate Change Action Plan, which included measures to reduce emissions for all greenhouse gases to 1990 levels by 2000. However, the U.S. economy grew much faster than anticipated during the 1990s, so emissions levels increased instead of decreased. In addition Congress did not provide full funding for the plan. The Clinton administration implemented some policies that did not require congressional approval. It focused on energy efficiency and renewable energy technologies and required all federal government agencies to reduce their greenhouse-gas emissions below 1990 levels by 2010. Clinton also established the U.S. Climate Change Research Initiative (USCCRI) to study global climate change and to identify priorities for public funding.

In 1997 Robert Byrd (1917–), a Democratic senator from West Virginia, and Chuck Hagel (1946–), a Republican senator from Nebraska, sponsored a nonbinding resolution that stated the U.S. Senate would not ratify any environmental treaty that did not include all nations or that damaged U.S. economic interests. The resolution passed unanimously and effectively blocked Senate consideration of the Kyoto Protocol.

When Republican President George W. Bush (1946–) took office in 2001, he established a new cabinet-level structure to oversee government investments in climate-change science and technology. The USCCRI and USGCRP were placed under the oversight of the Interagency Climate Change Science Program (CCSP), which reports integrated research sponsored by thirteen federal agencies. The CCSP is overseen by the Office of Science and Technology Policy, the Council on Environmental Quality, and the Office of Management and Budget.

In 2002 the Bush administration released U.S. Climate Action Report–2002, which acknowledged that greenhouse gases resulting from human activities were accumulating in the atmosphere and that they were causing air and ocean temperatures to rise. However, it did not rule out the role of natural factors in global warming. Bush announced that the United States planned to reduce the greenhouse-gas intensity of its economy (emissions per unit of economic output) by 18 percent by 2012 through a combination of existing regulations and voluntary, incentive-based measures. Bush repeatedly said that he did not support U.S. ratification of the Kyoto Protocol because it does not require developing countries, mainly China and India, to commit to emissions reductions even though China and India are major emitters of greenhouse gases.

How Treaties Are Made

A treaty is an official agreement between two or more nations. In the United States treaties are negotiated by the executive branch—by the president, the vice president, or their designees, such as State Department officials. The U.S. Constitution describes the role of the president in regard to treaties in Article 2, Section 2: “He shall have Power, by and with the Advice and Consent of the Senate, to make Treaties, provided two thirds of the Senators present concur.”

The requirement for a two-thirds majority in the Senate, instead of a simple majority (more than 50 percent of those voting), makes it less likely that one political party can push a treaty through the Senate. Historically the Senate has been fairly evenly split between Democratic and Republican senators, so a bipartisan effort has been necessary to gain concurrence on treaties.
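
A small worked example (hypothetical attendance figures, for illustration only) shows how much higher the treaty threshold sits than a simple majority when all one hundred senators, or only ninety, are present and voting:

    \left\lceil \tfrac{2}{3} \times 100 \right\rceil = 67
    \qquad\text{versus a simple majority of}\qquad
    \left\lfloor \tfrac{100}{2} \right\rfloor + 1 = 51;
    \qquad\text{and}\qquad
    \left\lceil \tfrac{2}{3} \times 90 \right\rceil = 60 .

Because only senators present and voting are counted, the threshold shifts with attendance, but it always sits well above a bare majority.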

After the Senate receives a treaty from the president, the treaty is referred to the Committee on Foreign Relations for review. The committee can make recommendations about the treaty to the Senate at large. Technically the Senate does not vote to approve a treaty; instead it votes on a “resolution of ratification,” in which it formally gives its advice and consent on the treaty and empowers the president to proceed with ratification.

Most legislation dies if it does not pass by the end of a two-year Congress. Treaties, however, can carry over from one Congress to the next. If a treaty gets held up in committee, it can still be considered in a later Congress.

The Internet Stock Bubble

The Internet stock “bubble” refers to a phenomenon of the late 1990s and early 2000s when stocks of Internet-related businesses increased dramatically in price. For many investors, excitement about possible financial gains overruled sober analysis of the stocks’ real value. Those who got into the buying frenzy early profited handsomely if they sold their stocks while the bubble was still growing. However, predicting if a bubble exists and, if it does, when it will “burst” is extremely difficult. Many investors waited too long and lost much of their money when Internet stock prices collapsed.

The Bubble Grows

During the 1990s access to the Internet became widespread, creating new market opportunities for entrepreneurs. Analysts called it the “new economy.” Investors enthusiastically poured money into the stocks of “dot.com” businesses, such as online retailers and auction houses, travel services, and Internet search engines. All of these companies were relatively new and unproven, but many people were convinced that they would be extremely profitable. Investors relied on perceived potential, rather than business history, to make investment decisions. As investors bought the stock of an Internet company, its price would rise, which encouraged other investors to buy in anticipation of further increases. The boom in Internet stocks also boosted such related sectors as computers, microchips, and information technology.

The Fed and the Bubble

In a capitalist nation, such as the United States, the government does not manage the overall economy. However, the government does have some influence on the financial decisions made in the private sector. This is particularly true of the Federal Reserve (the Fed), which is the nation’s central bank. Its decisions affect the amount of money circulating in the United States and the interest rates charged by banks to their customers.

In 1999 Alan Greenspan (1926–), the chairman of the Federal Reserve’s Board of Governors, acknowledged that many goods and services were moving from traditional markets onto the Internet. “[U]ndoubtedly some of these small companies whose stock prices are going through the roof will succeed,” Greenspan said. “They may well justify even higher prices. The vast majority, however, are almost certain to fail. That is the way the markets tend to work in this regard.” Greenspan’s words failed to change investors’ minds.

The Bubble Bursts

Very few of the new Internet businesses managed to turn much of a profit. Many lost money because of poor business planning, lack of experience, failure to build a customer base, or widespread competition. When enough investors realized their stocks had become overvalued—meaning that the stocks’ increased prices could not be sustained by the companies’ actual financial performance—a selling frenzy began. Stock prices for Internet companies plummeted. Affiliated businesses were hurt as well.

NASDAQ is a U.S.-based stock market on which the stocks of many technology companies are traded. The NASDAQ composite index is a measure of the performance of many of the stocks listed on NASDAQ. In 1990 the index was below 500. In early 2000 the index peaked above 5,000—the height of the Internet stock bubble. By late 2002 the NASDAQ composite index had dropped to about 1,200.
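
Expressed as a percentage decline from those approximate index levels, the collapse erased roughly three-quarters of the peak value:

    \frac{5{,}000 - 1{,}200}{5{,}000} \times 100\% \approx 76\%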

The 2000 Presidential Election

The 2000 presidential election was one of the closest and most contentious races in U.S. history. The two major-party candidates were Vice President Albert Gore Jr. (1948–), a Democrat from Tennessee, and Texas Governor George W. Bush (1946–), a Republican. The final outcome was delayed for more than a month after the election because of legal wrangling over a vote recount in Florida, where only a few hundred votes separated the two candidates.

The Race

Polling throughout most of the campaign indicated that the race was very close. One candidate would pull ahead temporarily and then lose the lead. Gallup polls conducted a month before the election indicated that prospective voters saw little difference between the two candidates in terms of their policies and leadership abilities. However, Bush outpolled Gore when respondents were asked to rate the two men on their honesty and trustworthiness. In the days leading up to the election, most polls gave Bush a very slight lead.

How the Outcome Is Determined

The president and vice president are actually elected indirectly: electors in each state who are pledged to specific candidates cast votes in state capitals forty-one days after the election. The candidate who wins the most popular votes in a state usually wins the electoral vote in that state. Since 1964 the number of electors has been set at 538, with the number in each state equal to the number of seats allocated to that state in the House of Representatives plus its two Senate seats (the District of Columbia is allotted three electors). To win an election, a candidate must get at least 270 electoral votes. Candidates often win many more popular votes in some states than in others, so the final outcome is determined by which candidate carries which states. On election day interest is focused on the states with many electoral votes. In the 2000 presidential election, the states with the largest number of electoral college votes were California (fifty-four), New York (thirty-three), Texas (thirty-two), Florida (twenty-five), Pennsylvania (twenty-three), and Illinois (twenty-two).
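
A minimal sketch of how such a tally works, using the state totals just listed but a purely hypothetical, invented assignment of which candidate carries which state:

    # Electoral votes of the largest states in the 2000 election (from the text above).
    electoral_votes = {"California": 54, "New York": 33, "Texas": 32,
                       "Florida": 25, "Pennsylvania": 23, "Illinois": 22}

    # Hypothetical winner-take-all results, for illustration only.
    states_won = {"Candidate X": ["California", "New York", "Illinois"],
                  "Candidate Y": ["Texas", "Florida", "Pennsylvania"]}

    for candidate, states in states_won.items():
        total = sum(electoral_votes[state] for state in states)
        status = "reaches" if total >= 270 else "is still short of"
        print(candidate, "has", total, "electoral votes and", status, "the 270 needed")

Even all six of these large states together supply only 189 electoral votes, which is why a winning coalition must also include many smaller states.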

As the votes were counted on November 7, 2000, it soon became apparent that Florida was going to be a “swing” state—its electoral votes could go to either candidate—and play a major role in the election’s outcome. By the time the polls closed it was clear Gore had won in Illinois, New York and Pennsylvania. Shortly before 8 p.m. Eastern Standard Time all of the television networks projected a win for Gore in Florida. Two hours later the networks issued a retraction and declared the state for Bush. This gave Bush a sizable lead in electoral votes, as he had already won most of the Southeast and Midwest and his home state of Texas. Although Gore got a sizable jump late in the evening by winning California, most other western states went to Bush. Shortly after 2 a.m. the networks declared Bush the nationwide winner. Gore called Bush and conceded the election.

Less than an hour later vote counts still coming in from Florida revealed that Bush’s lead there had shrunk dramatically. Gore retracted his concession. At 4:15 a.m. the networks retracted their projection of a Bush win, admitting that the Florida vote made the race too close to call. As Americans woke up on November 8 they discovered that neither candidate had captured the 270 electoral votes needed to win the election. The final tally in Florida was going to decide the presidency.

The Florida Recount

Because the vote was so close in Florida—less than half of one percent of votes separated the two candidates—a mechanical recount was automatically triggered. It was completed on November 10 and gave Bush a win by only a few hundred votes. However, the state was still waiting for absentee ballots to come in, so the final tally was not certain.

Soon after the mechanical recount began, Gore’s lawyers requested a recount by hand of ballots cast in four hotly contested counties—Broward, Miami-Dade, Palm Beach, and Volusia. Bush’s lawyers went to federal court to halt the manual recounts. Over the following days numerous lawsuits were filed by both parties. Florida was supposed to certify its final election results by November 14. It soon became obvious that the recounts would not be completed by that date. Katherine Harris (1957–), a Republican who was then Florida’s secretary of state, extended the deadline by one day. On November 15 she ordered the manual recounts to cease. The following day the Florida Supreme Court ordered the manual recounts to proceed. On November 18 a count of the absentee ballots revealed that Bush had won the state overall by 930 votes. The legal battle continued. On November 21 the Florida Supreme Court again ordered manual recounts to continue and set a deadline of November 26 for certification of election results. On that date Harris certified the state’s results minus the manual recount from Palm Beach County, because it was not completed by the deadline. She declared Bush the winner in Florida by 537 votes.

Two days before the certification date Bush’s legal team was granted an appeal before the U.S. Supreme Court. The court decided on December 4 in Bush v. Palm Beach County Canvassing Board that the Florida Supreme Court should explain its decision to extend the deadline for the manual recounts. The legal battle returned to the Florida courts. On December 8 the Florida Supreme Court ordered recounts of contested ballots in every county. The Bush team appealed that decision to the U.S. Supreme Court. On December 12 the court ruled in Bush v. Gore, by a vote of 7 to 2, that the Florida Supreme Court’s recount scheme violated the Constitution’s equal-protection guarantee and, by a vote of 5 to 4, that no acceptable recount could be completed in time. The following day Gore conceded the election to Bush.

The final, official vote count in Florida was the one certified by Harris on November 26, which gave Bush the win by only 537 votes. Florida’s 25 electoral votes gave Bush a total of 271 electoral votes, enough to clinch the presidency. Nationwide he received 50.5 million popular votes compared with 51 million cast for Gore. It was only the fourth time in U.S. history that the winner of the popular vote did not win the presidency. The previous occurrences were in 1824, 1876, and 1888.

The Disputed Florida Ballots

The Gore campaign chose the four Florida counties for manual recounts largely because the counties used punch-card voting. In that system voters receive paper ballots printed with the names of the candidates. Next to each name is a prescored square or circle that the voters punch out with a stylus—a pointed instrument—to indicate their choices. The tiny pieces of punched-out paper are called “chads.” Completed punch cards are fed through a machine that detects the punches and tallies the votes. Ballots with incompletely punched holes are rejected by the machines. Ballots that do not clearly indicate a selection are called “undervotes.” Undervotes may or may not be counted manually at the county’s discretion. Immediately after the polls closed thousands of undervotes were reported in the four counties that Gore targeted. He hoped that manual analysis of the punch cards in those counties would change many of the undervotes to votes for him.
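
A minimal sketch of the sorting logic described above (illustrative Python, not the software used by any Florida county) classifies each ballot’s presidential race by how many chads were cleanly punched:

    def classify_presidential_race(cleanly_punched):
        """Classify one ballot from the list of candidates whose chads were fully punched out."""
        if len(cleanly_punched) == 1:
            return "counted vote for " + cleanly_punched[0]
        if len(cleanly_punched) == 0:
            return "undervote (no clear selection; hanging or dimpled chads fall here)"
        return "overvote (more than one candidate selected; ballot rejected)"

    print(classify_presidential_race(["Candidate X"]))
    print(classify_presidential_race([]))
    print(classify_presidential_race(["Candidate X", "Candidate Y"]))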

The manual recount was covered extensively by the media. It soon became apparent that counting punch-card undervotes was fraught with problems. Television viewers watched as local elections officials with enormous magnifying glasses examined individual punch cards looking for hanging chads—chads that had not been completely dislodged from the punch cards—or even indentations in still intact chads. The latter were called “dimpled chads” or “pregnant chads.” Officials assumed that voters had attempted to punch out the chads, but had been unsuccessful. Debate soon arose over the practices used by elections officials to interpret voter intent on contested punch cards.

An additional controversy erupted over the layout of the punch-card ballot used in Palm Beach County. Typically a punch card has two columns of candidate names with prescored chads to the right of the names. Palm Beach County officials had laid out their ballot as a “butterfly” with all the prescored chads in a single column down the middle of the card. The Republican and Democratic candidates for president were listed on the left side of the card and third-party candidates were listed on the right side. Voters had to be careful to match names to chads to prevent errors.

The day after the election several private citizens sued the Palm Beach County Canvassing Board, claiming that the butterfly design had confused many voters, particularly elderly ones, into voting for the wrong candidate. This allegation seemed to be supported by data showing that third-party presidential candidate Patrick Buchanan (1938–) received more than three thousand four hundred votes in the county—more than three times the number reported in any other Florida county. Buchanan’s name was on the right-hand side of the butterfly ballot; the prescored chad for Buchanan was positioned between those for Bush and Gore. Lawyers collected affidavits from local residents who claimed that the layout of the ballot had caused them to vote for Buchanan when they had meant to vote for Gore. They asked for a countywide revote. The Florida Supreme Court ultimately ruled out a revote for constitutional reasons.

The Consequences

The disputed 2000 presidential election had two major consequences in American politics. First, it aggravated partisan divisions. Many Gore supporters adamantly believed that the election had been “stolen” from their candidate. They blamed elections officials in Florida—a state whose governor was Republican Jeb Bush (1953–), the brother of the president-elect; the Supreme Court, which was dominated at the time by conservative justices who had been appointed by Republican presidents; and even Ralph Nader (1934–), who ran for president in 2000 on the Green Party ticket and captured nearly three million popular votes. Second, the election spurred widespread calls for reform and modernization of election methods, particularly development of computerized voting machines.

Disenfranchised Voters

Disenfranchised voters are those who have been denied their legal right to vote. In common usage, the word disenfranchisement is also used to describe any circumstance that discourages voters from voting. The 2000 presidential election in Florida was noteworthy not only for the closeness of the race and for disputed ballots, but also for generating thousands of complaints (mostly from minorities) about obstacles that kept people from voting. The Voting Rights Act of 1965 and other federal laws forbid discrimination against voters on account of race, color, religion, sex, age, disability, or national origin.

The U.S. Commission on Civil Rights (USCCR) is an independent bipartisan agency that investigates voting irregularities. In early 2001 the commission heard testimony from hundreds of witnesses regarding their Florida election experiences. The most common complaints came from voters who had been turned away from the polls because their names were not on precinct voter lists. Several poll workers testified that they tried to verify registration status in such cases, but the phone lines to their supervisors’ offices were constantly busy or not answered. Most poll workers said they were unaware that Florida law allowed these people to vote as long as they signed an affidavit swearing that they were registered voters.

The USCCR also heard many complaints about polling places that closed early, moved without providing advance notice to voters, or turned away voters who were waiting in line at closing time. Some African-American witnesses testified that they were turned away by poll workers who allowed white voters to enter. Others complained about a Florida Highway Patrol roadblock set up on a major road near a polling place that served neighborhoods with large minority populations. Although the highway-patrol officials denied that the roadblock was intended to disenfranchise minority voters, the commission noted that its presence was perceived by some local residents as voter intimidation.

Florida’s Voter Registration Act allows residents to register to vote in several places; for example, they can register at the Department of Highway Safety and Motor Vehicles when they apply for or renew a driver’s license. The USCCR documented many instances in which the information for these “motor voters” was not filed with local elections offices.

The commission concluded that there had been “widespread denial of voting rights” in Florida because of “injustice, ineptitude, and inefficiency.” It further stated that “disenfranchisement of Florida’s voters fell most harshly on the shoulders of black voters.” This statement was based in part on the high level of rejected ballots reported for African-American voters. The ballots were rejected primarily for overvoting (more than one candidate selected for a particular office). Overvoting can be caught by sophisticated voting equipment that allows voters to correct their mistakes. The USCCR found that minority communities were less likely to have this sophisticated equipment than white communities. As a result African-American voters were nearly ten times more likely than nonblack voters to have their ballots rejected.

The Terrorist Attacks of September 11, 2001

On the morning of September 11, 2001, hijackers commandeered two U.S. commercial airliners—American Airlines Flight 11 and United Airlines Flight 175—and crashed them into the twin towers of the World Trade Center in New York. A short while later a third hijacked plane—American Airlines Flight 77—was crashed into the Pentagon outside Washington, D.C. A fourth plane—United Airlines Flight 93—crashed in a field in Pennsylvania after passengers decided to resist the hijackers who had taken control. Its intended target is unknown. Because the planes were laden with jet fuel, the crashes ignited massive fires. Both World Trade Center towers collapsed after burning for more than an hour. The attacks killed more than two thousand nine hundred people.

Within hours intelligence agencies had learned that the hijackers were associated with the militant Islamist group al-Qaeda, led by Osama bin Laden (1957–), the son of a wealthy Saudi family, and aided by the Taliban government of Afghanistan. The attacks—the most extensive and orchestrated terrorist attacks in U.S. history—led to an invasion of Afghanistan by U.S. and allied forces.

The Government Reacts

President George W. Bush (1946–) was visiting a Florida school at the time the planes crashed. He spoke briefly on television, indicating that the country had experienced “an apparent terrorist attack,” and then disappeared from public view for hours as he was flown from one military installation to another for security reasons. Meanwhile the White House, Capitol, and other government buildings were evacuated. The first lady, the vice president, and other top officials were taken to secure locations. The Federal Aviation Administration grounded all domestic air flights and diverted U.S.-bound transatlantic flights to Canada.

Bush returned to Washington, D.C., late in the afternoon. That evening, in a nationally televised address, he promised to bring the terrorists to justice and noted “we will make no distinction between the terrorists who committed these acts and those who harbor them.”

Three days later Congress authorized the president to use “all necessary and appropriate force” against those found to have been involved in the terrorist attacks. On September 20, 2001, before a joint session of Congress, Bush issued an ultimatum to Afghanistan’s Taliban government. He wanted all al-Qaeda leaders turned over to U.S. authorities; all terrorist training camps closed; and every terrorist handed over to the appropriate authorities. The demands were not open to negotiation: “The Taliban must act, and act immediately,” Bush warned. “They will hand over the terrorists, or they will share in their fate.” The Taliban rejected the ultimatum.

The United States and allied forces launched a war against Afghanistan and drove the Taliban from power in less than three months. A multinational peacekeeping force was assembled to help U.S. forces secure the country, while a new government was installed. Although the military operation was deemed a success, security was difficult to maintain in Afghanistan because of continued rebel uprisings. In addition the U.S. failed to locate bin Laden.

The Victims of September 11, 2001

The vast majority of casualties from the attacks were at the World Trade Center towers. More than two thousand seven hundred people died there; about two hundred died at the Pentagon; and forty at the Pennsylvania crash site. The area of wreckage and debris at the World Trade Center site became known as “ground zero.” Rescue, recovery, and cleanup operations lasted for months. More than $1 billion in charitable contributions was raised for the victims and their families.

Within two weeks of the attacks Congress passed the Air Transportation Safety and System Stabilization Act to provide financial assistance to the airlines—lawmakers feared that lawsuits might bankrupt the industry—and to protect the U.S. economy. Congress also created the September 11th Victim Compensation Fund of 2001 to provide public compensation to victims (or their relatives) who had been killed or physically injured in the attacks. The fund, which was available to claimants who agreed not to sue the airline companies, paid out about $7 billion to the survivors of 2,880, or 97 percent, of the individuals killed in the attacks. Some 2,680 injured persons were also compensated by the fund. The average death award was in excess of $2 million per claim; the average injury award was approximately $400,000.
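
As a rough arithmetic check, the reported averages multiplied by the number of claims do account for approximately the $7 billion total paid out:

    2{,}880 \times \$2\ \text{million} \approx \$5.8\ \text{billion},
    \qquad
    2{,}680 \times \$0.4\ \text{million} \approx \$1.1\ \text{billion},
    \qquad
    \$5.8\ \text{billion} + \$1.1\ \text{billion} \approx \$6.9\ \text{billion}.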

The 9/11 Commission Report

In 2002 the president and Congress created the National Commission on Terrorist Attacks upon the United States to investigate all of the circumstances relating to the terrorist attacks. For nearly two years the commission reviewed relevant documents and interviewed more than one thousand people, including captured al-Qaeda operatives, to re-create the events leading up to and occurring on and after September 11, 2001. Its report, which became known as The 9/11 Commission Report, traces the plotting, execution, and aftermath of the attacks. The commission learned that all nineteen hijackers were from nations in the Middle East; fifteen of them were Saudis. Six of the men were “lead operatives”—the best trained of the team. Four of these men, who piloted the hijacked planes, had studied for months at U.S. flight schools. The lead operatives lived in the United States for up to a year before the day of the attacks. Thirteen were “muscle hijackers” selected to assist in overpowering the flight crews and passengers. They came to the United States only months before the attacks after undergoing extensive training at al-Qaeda camps in Afghanistan.

According to the report, all of the hijackers were selected by al-Qaeda because of their willingness to martyr themselves for the Islamist cause espoused by bin Laden. However, the commission also discovered that very few people within al-Qaeda knew the details and scope of the hijack plan before it was carried out.

The War in Afghanistan

On October 7, 2001, the United States and its coalition partners, including Britain, began Operation Enduring Freedom with an aerial bombardment of Afghanistan. The war plan—tailored to avoid a lengthy ground campaign against guerrilla-type fighters in the mountainous terrain—relied on covert intelligence agents; special forces trained in counterterrorist actions; air power; and opposition forces within Afghanistan, such as the Northern Alliance, a coalition of Afghan factions that had previously fought one another but united to rout the Taliban from the country. Only a relatively small number of U.S. ground troops were required. By the end of December 2001 the Taliban government had been removed from power.

Afghan opposition leaders met with officials of the United Nations in Bonn, Germany, to work out a plan for a new permanent government for Afghanistan and to ensure security, reconstruction, and political stability. UN officials called for a multinational military force, the International Security Assistance Force (ISAF), to secure Kabul, Afghanistan’s capital. In 2003 the North Atlantic Treaty Organization (NATO) assumed control of ISAF and expanded its area of responsibility. In 2006 ISAF relieved U.S. and allied troops of security details in southern Afghanistan and assumed responsibility for the entire country. Nearly twelve thousand U.S. troops became part of ISAF, and thousands more continued to train and equip Afghan police and army forces.

As of mid 2007 more than two hundred U.S. military personnel had died in combat in Operation Enduring Freedom and more than one thousand two hundred had been wounded.

The War in Iraq

The war in Iraq is an extension of the “war on terror” that was initiated by the United States in the aftermath of the September 11, 2001, terrorist attacks on New York City and Washington, D.C. In March 2003 troops from the United States, Britain, and a handful of other countries attacked Iraq in an effort to oust its dictator, Saddam Hussein (1937–2006). The military action was precipitated by claims from the intelligence community that Iraq was amassing weapons of mass destruction (WMD). These claims would later prove to be mistaken.

Military operations in Iraq were initially successful. Within weeks Hussein had been removed from power, and before long a new government was installed. Reconstruction of Iraq and training of Iraqi forces to take over security duties got under way. However, a fierce insurgency erupted, driven by militant elements opposed to the occupation. Eventually the violence widened into a deadly civil struggle between Iraqi factions divided by religious and political differences. As of mid 2007 U.S. troops still occupied Iraq, and the effort to stabilize and rebuild Iraq continued. The rising toll of U.S. military casualties made the American public dissatisfied with the progress of the war and spurred calls for a timetable for withdrawal of U.S. troops.

The History of U.S.-Iraq Relations

By the early 2000s the United States and Iraq had had a rocky relationship for decades. In 1967 Iraq severed diplomatic ties to protest U.S. support of Israel during the Arab-Israeli Six-Day War. In 1984 diplomatic ties were restored when the U.S. government favored Iraq in its war with neighboring Iran. However, the relationship cooled after U.S. officials condemned Hussein for using chemical weapons in that war and against the Kurds, a minority people in northern Iraq. In 1990 Iraq invaded neighboring Kuwait, ostensibly in a dispute over oil rights. Republican President George H. W. Bush (1924–) assembled an international military coalition to force Iraq to retreat. The Persian Gulf War, as it became known, quickly liberated Kuwait. It ended with UN resolutions that required Iraq to destroy any WMD that it possessed and to refrain from development of such weapons. In addition inspectors from the International Atomic Energy Agency (IAEA) were to be allowed into Iraq to verify that the government was abiding by the resolutions. During the next decade, however, Hussein behaved belligerently toward the United Nations, although at first he allowed inspectors into the country. In 1998 he expelled the inspectors. His continuing defiance convinced some observers that he was hiding a weapons program.

After the Persian Gulf War, U.S. and British military personnel began enforcing a “no-fly zone” over northern Iraq to prevent Hussein from attacking the Kurds from the air. Another no-fly zone was added in southern Iraq to protect the largely Shiite population from a crackdown by Hussein’s government, which was predominantly Sunni. (Shiite and Sunni are two distinct sects within the religion of Islam.) Iraqi military forces often fired at U.S. and British warplanes enforcing the no-fly zones. In response, the coalition bombed Iraqi air-defense systems on the ground.

The Intelligence against Iraq

The terrorist attacks in the United States on September 11, 2001, drastically changed the government’s policy toward Iraq. President George W. Bush (1946–) disclosed later that he “wondered immediately after the attack whether Hussein’s regime might have had a hand in it.” Although U.S. intelligence agencies found no conclusive evidence that Iraq had been involved, suspicions lingered. Some members of the administration, including Secretary of Defense Donald Rumsfeld (1932–) and Deputy Secretary of Defense Paul Wolfowitz (1943–), advocated striking Iraq as part of Bush’s “war on terror.”

In early 2002 Bush described Iraq as a member of an “axis of evil” in the world. Later that year intelligence officials told the president they were certain Iraq had restarted its nuclear-weapons program and likely had new chemical and biological weapons as well. In October Congress voted to authorize the use of U.S. military force in Iraq if diplomatic efforts failed to get Iraq to submit to inspections by the IAEA.

The administration sought worldwide support for military action, but found very little. In his January 2003 State of the Union address Bush cited intelligence suggesting that Iraq had tried to purchase uranium yellowcake, a nuclear material, from the African nation of Niger. The following month Secretary of State Colin Powell (1937–), in a presentation before the United Nations Security Council, laid out what he described as evidence that Iraq was concealing chemical and biological weapons and rebuilding its nuclear program. The administration pressed for a UN resolution that would allow the use of military force against Iraq. Although the United Nations was frustrated by Iraq’s behavior toward international inspectors, it did not authorize military action.

On March 17, 2003, Bush presented his argument for war in a nationally televised address. He highlighted Iraq’s failure to abide by UN resolutions regarding weapons inspections and noted that U.S. intelligence “leaves no doubt that the Iraq regime continues to possess and conceal some of the most lethal weapons ever devised.” He expressed his fear that those weapons could be transferred to terrorists, who could use them against the United States. In the address he told Hussein and his sons they had forty-eight hours to leave Iraq or face military action. He promised the Iraqi people that the United States would help them build a prosperous and free country.

The War Unfolds

The coalition that formed to fight the war was much smaller than the group that had joined together for the Persian Gulf War in 1991. Besides the United States, only Britain pledged large numbers of troops; some other countries contributed smaller numbers of troops—primarily Australia, Denmark, and Poland. Many traditional allies of the United States, especially France and Germany, were opposed to military action against Iraq and offered no assistance.

On March 20, 2003, the war effort—known as Operation Iraqi Freedom—began with a massive aerial bombardment followed by a ground invasion. Within three weeks U.S. troops had captured Baghdad, the capital, and British troops occupied much of southern Iraq. The nation’s oil-field infrastructure had been secured with little damage, which was considered important: oil income was expected to pay for reconstruction of the country. Initially some Iraqis were jubilant about the toppling of Hussein’s repressive government. However, the mood quickly evaporated as lawlessness and massive looting erupted. Coalition troops were unable to restore order in many areas. On May 1, 2003, Bush announced that major combat operations in Iraq were over. He promised that the coalition would begin securing and reconstructing Iraq.

During the summer L. Paul Bremer (1941–), whom Bush appointed the U.S. administrator of Iraq, headed the Coalition Provisional Authority, a transitional government. Among its major tasks were establishment of an Iraqi Governing Council and oversight of the training of the Iraqi Security Forces (ISF), both army and police. Bringing the ISF to full capacity and strength was considered essential to ending the coalition’s occupation of Iraq.

As the summer progressed, however, insurgents began to disrupt all aspects of the reconstruction effort, including the training of Iraqi military and police officers; the rebuilding of damaged infrastructure; and production from Iraq’s oil fields.

By the end of 2003 U.S. troops had killed or captured dozens of the most wanted members of Iraq’s former regime, including Hussein. He was later tried for war crimes by the new Iraqi government, and was convicted and executed in 2006.

A Constitution and Elections

In June 2004 UN Resolution 1546 transferred the sovereignty of Iraq from the Coalition Provisional Authority to the Iraqi Interim Government. Over the next year Iraqi leaders hammered out a new constitution, which was approved by Iraqi voters in October 2005. Two months later Iraq held national elections.

The insurgents continued their efforts to disrupt reconstruction, however, using sniper fire, suicide bombings, and improvised explosive devices (IEDs), usually planted along roads and detonated as military vehicles passed, to kill and wound Iraqi troops and civilians, as well as U.S. and British troops. The number of attacks increased dramatically in 2004, largely because of the leadership of Abu Musab al-Zarqawi (1966–2006), a Jordanian terrorist, who spearheaded an ultraviolent group, Tawhid and Jihad, that captured worldwide media attention by videotaping the beheadings of kidnapped Western civilians, primarily contractors working for the coalition. In 2005 the United States put a $25 million bounty on al-Zarqawi and accused him of waging attacks against Iraqi Shiites to spark a sectarian war. In June 2006 al-Zarqawi was killed in a U.S. bombing raid. However, sectarian violence continued and threatened to escalate into full-blown civil war.

No WMD

After the invasion, U.S. forces searched thoroughly for weapons of mass destruction, but none were found. Bush established an investigatory commission to examine prewar intelligence about Iraq. After months of hearings, the Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction issued a report in 2005. It concluded that U.S. intelligence agencies were “dead wrong” in almost all of their prewar assessments regarding Iraq’s weapons program. The commission noted that much of the U.S. intelligence was based on false information from unreliable informants and poor data sources. For example, the much-touted evidence that Iraq had purchased uranium yellowcake from Niger turned out to be based on forged documents. The commission also reported that intelligence agents had ignored information that did not fit their preconceived notions about Iraq. The report prompted Bush’s critics to charge that he and his administration had used that intelligence to sustain their own preconceived notions. Some said the administration misled or lied to the American public—that it had used intelligence it knew was false—to make a convincing case for war.

The Repercussions

Growing public discontent with the war caused Bush’s poll ratings to plunge. In the 2006 congressional elections that dissatisfaction brought widespread success for Democratic candidates: Democrats won majority control of the House, and with the support of two independents they gained control of the Senate as well. The day after the election Bush accepted the resignation of his secretary of defense, Donald Rumsfeld, one of the chief architects of the war. Bush acknowledged that voters had shown their displeasure with the lack of progress in Iraq.

As of mid 2007 more than three thousand U.S. military personnel had died in Iraq, and thousands more had been wounded. Many were severely injured, with a considerable number returning to the United States as amputees. Many soldiers experienced brain trauma from the explosion of IEDs as well.

In Iraq reconstruction continued, although it was repeatedly interrupted by sectarian warfare. Thousands of Iraqis had fled to safety in neighboring countries. More than one hundred fifty thousand U.S. troops remained in Iraq.

The Abramoff Scandal

Jack Abramoff (1959–), a highly paid lobbyist with political connections, pleaded guilty in 2006 to several criminal charges, including conspiracy to bribe public officials. The investigation of Abramoff resulted in the indictments of one former congressman and several staff members in other congressional offices. Because Abramoff agreed to cooperate with authorities as part of his plea arrangement, the scandal was still unfolding in mid 2007.

Background

A professional lobbyist is a person who is paid to influence legislation on behalf of a special-interest group, industry, or particular cause. In the early 1980s Abramoff made important political connections as a young man through his activities in the College Republicans, a student organization for supporters of the Republican Party. In 1994—the year he became a lobbyist for a Washington, D.C., firm—the so-called Republican Revolution swept many Republican candidates into national office and gave the party majority control in the U.S. House of Representatives for the first time in decades. Abramoff lobbied his Republican friends and associates in government on behalf of a number of high-paying clients, including several Native American tribes that operated gaming casinos.

In February 2004 the Washington Post published an article describing Abramoff’s dealings with the tribes and the millions of dollars they paid him for his efforts on their behalf. The article noted Abramoff’s ties to Michael Scanlon, who had been an aide to Tom DeLay (1947–), a Republican representative from Texas who was a powerful leader of the U.S. House. Scanlon, who operated a public relations firm, was also paid large sums of money by the tribes. Federal officials began investigating Abramoff and Scanlon because federal law places tight restrictions on how tribes that are engaged in gaming can spend their revenues.

The Scandal Erupts

In January 2006 Abramoff pleaded guilty to fraud, tax evasion, and conspiracy to bribe public officials. In exchange for his plea, he agreed to provide investigators with information about his dealings with high-ranking officials in Congress and the administration of President George W. Bush (1946–). Abramoff admitted that he had defrauded some of his tribal clients out of millions of dollars by persuading them to hire Scanlon’s firm for their public-relations needs; Abramoff split the fees paid to Scanlon in a kickback scheme. Abramoff also provided investigators with details about trips, campaign contributions, and favors that he had lavished on politicians to influence their activities.

A 2006 investigation by the U.S. House Government Reform Committee showed that Abramoff had made hundreds of contacts with Bush administration officials, including Karl Rove (1950–), Bush’s deputy chief of staff. As of mid 2007 the Abramoff scandal had resulted in criminal charges against several figures, including Bob Ney, a former Republican representative from Ohio, who was sentenced to more than two years in prison for conspiring to commit fraud, making false statements, and violating lobbying laws. As part of his deal with prosecutors, Scanlon pleaded guilty to conspiring to bribe a member of Congress and other public officials and agreed to pay back $19.6 million to Indian tribes that had been his clients. He also agreed to assist in the investigation and testify against Abramoff.

Hurricane Katrina

Hurricane Katrina was one of the most devastating natural disasters in the history of the United States. Hitting the Gulf Coast in August 2005, Katrina caused an estimated $81 billion in property damage and killed more than fifteen hundred people. The storm surge from the hurricane ravaged coastal areas in Mississippi and Louisiana and overwhelmed the levees protecting the city of New Orleans. Low-lying areas of the city were flooded up to the rooftops of houses, stranding tens of thousands of people. A massive rescue effort was carried out by the U.S. Coast Guard. The Federal Emergency Management Agency (FEMA)—a federal agency responsible for disaster relief—was harshly criticized for its response to the disaster.

The Storm

As Hurricane Katrina approached the United States, forecasters warned that it could be one of the most powerful storms ever to hit the country. Hurricanes are rated on the Saffir-Simpson scale of intensity from Category 1 (the weakest) to Category 5 (the strongest). Katrina first crossed the southern tip of Florida as a Category 1 storm, killing seven people. Over the warm waters of the Gulf of Mexico it quickly gained strength to a Category 5 storm.

Warnings were issued across the Gulf Coast, and more than one million people evacuated the area. More than seventy-five thousand people gathered at shelters. In New Orleans approximately ten thousand people sought shelter in the Superdome.

Hurricane Katrina was a strong Category 3 storm when it hit land again on the morning of August 29, 2005, just east of New Orleans near the border between Louisiana and Mississippi. The storm covered a wide area, and its winds pushed a powerful storm surge ashore. The Mississippi coastline was flooded for more than five miles inland, encompassing the cities of Gulfport and Biloxi. Structures were damaged or demolished by the combination of high winds and storm surge.

At first it appeared that New Orleans had been spared the worst of the hurricane. Damage was minimal, particularly in the downtown district. However, the Superdome lost power, leaving the people there with no air-conditioning in the sweltering summer heat and no working bathrooms. Around midday on August 29, some of the levees protecting New Orleans were breached by the enormous pressure of the water backed up behind them. More than half of New Orleans lies below sea level, so when the levees failed, water poured into the city, flooding to depths of twenty feet in some places. Desperate people climbed onto rooftops to escape rising water. Coast Guard helicopters rescued more than a thousand people. Many of them went to the Superdome, which was already very crowded. Meanwhile, looting became a problem in downtown areas.

The FEMA Controversy

Local officials claimed that FEMA acted too slowly to provide supplies and evacuate people. At first President George W. Bush (1946–) praised FEMA and its director, Michael Brown (1954–). As criticism grew Brown was relieved of his duties and returned to Washington, D.C. Days later he resigned his position. He defended his agency’s actions, noting that local and state officials, not the federal government, are supposed to provide “first response” in emergencies.

In the following months more than a dozen congressional hearings investigated the government’s response to the disaster. Brown complained that he received lackluster support from his superiors in the Department of Homeland Security. In February 2006 the U.S. Government Accountability Office published a report extremely critical of FEMA’s handling of the Individuals and Households Program (IHP), which provides money directly to victims of natural disasters. The GAO complained that the agency’s poor oversight of IHP payments, which had totaled more than $5 billion, resulted in “substantial fraud and abuse.”

Legislation, Court Cases, and Trials

Nixon v. Fitzgerald

In Nixon v. Fitzgerald (1982) the U.S. Supreme Court ruled that former president Richard Nixon (1913–1994) was immune from a civil lawsuit filed by a former Department of Defense employee who believed he had been wrongly dismissed from his job at Nixon’s direction. The court found that absolute immunity is appropriate for presidents so they can act on behalf of the entire country without fear of incurring personal-damages claims from individuals aggrieved by public policy.

Background

In 1968 A. Ernest Fitzgerald, a civilian analyst with the U.S. Air Force, testified before a congressional committee that cost overruns on a new C-5A transport plane could reach $2 billion, partly because of technical difficulties during development. The revelations were widely covered in the press and were considered embarrassing to Fitzgerald’s superiors at the Department of Defense. The testimony occurred during the waning months of the presidency of Lyndon Johnson (1908–1973), who was succeeded by Nixon.

In January 1970 Fitzgerald lost his job, purportedly because of reorganization within the air force to cut costs and improve efficiency. His dismissal attracted press coverage and spurred calls for an official investigation. When asked publicly about the incident, Nixon promised to look into the matter. He subsequently made inquiries within his administration about transferring Fitzgerald to another position, but that idea was vetoed in writing by some of his aides who questioned Fitzgerald’s loyalty. Fitzgerald protested to the Civil Service Commission (CSC) that he had been terminated unlawfully in retaliation for his 1968 testimony before the congressional committee. After a lengthy investigation a CSC examiner concluded in 1973 that no evidence supported Fitzgerald’s claim that he had been dismissed as a retaliatory measure. However, the examiner did find that the dismissal violated CSC regulations because it was motivated by “reasons purely personal” and recommended that Fitzgerald be reinstated.

Fitzgerald promptly filed a civil lawsuit against several officials of the Defense Department and two White House aides. In 1978 he amended his lawsuit to include charges against Nixon. (Nixon had resigned the presidency in 1974.) Nixon’s lawyers claimed their client could not be sued in a civil matter because historically immunity had been granted to high-ranking government officials in such cases. The U.S. Supreme Court agreed to consider the scope of immunity available to a president.

The Decision

The court ruled 5 to 4 that a president is entitled to absolute immunity from liability for civil damages related to his official acts while in office. Justice Lewis Powell (1907–1998), writing for the majority, noted that the court had long recognized that certain government officials are entitled to some form of immunity from suits for civil damages. The president, Powell concluded, occupies a “unique position in the constitutional scheme” and should be protected from private lawsuits that could risk “the effective functioning of government.”

In a related and simultaneous case, Harlow v. Fitzgerald, the court ruled that Nixon’s aides were not entitled to absolute immunity, but to qualified immunity. Powell’s opinion describes qualified immunity as a shield from liability for civil damages so long as the conduct does not violate “clearly established statutory or constitutional rights of which a reasonable person would have known.” In other words, conduct that does violate such rights is subject to civil litigation.

Sexual Harassment and the President

During the mid 1990s the lawyers for President Bill Clinton (1946–) relied heavily on the court’s decision in Nixon v. Fitzgerald to argue that a sexual harassment lawsuit against Clinton should be dismissed. The suit was filed by Paula Jones (1966–), a former Arkansas state employee who claimed that Clinton made unwanted sexual advances toward her when he was governor of Arkansas. She alleged that turning down his advances led to poor treatment of her by supervisors in state government.

In Clinton v. Jones (1997) the U.S. Supreme Court considered Clinton’s claim to absolute immunity. In a unanimous decision the court ruled that absolute immunity did not apply in this case because the alleged misdeeds occurred before Clinton was president. Justice John Paul Stevens (1920–), in his opinion for the court, wrote that “immunities for acts clearly within official capacity are grounded in the nature of the function performed, not the identity of the actor who performed it.” In addition the court noted that presidential immunity does not apply to “unofficial conduct.”

In April 1998 a federal judge dismissed Jones’s case against Clinton, ruling that it failed to meet the legal standards for proving sexual harassment and other charges made in the suit.

The Balanced-Budget Amendment

During the 1980s and 1990s several vigorous but unsuccessful attempts were made in Congress to pass an amendment to the U.S. Constitution that would force the U.S. government to balance the nation’s budget each year. Republican lawmakers, in particular, made it a high political priority. When budget surpluses were achieved in the late 1990s and early 2000s the amendment movement faded.

Budget Deficits and Legislation

A balanced budget occurs when the federal government spends the same amount of money that it takes in during a fiscal year. Balanced budgets have not been the norm in U.S. history. Budget deficits—when spending is greater than revenues—have occurred frequently, most often because of wartime spending or economic disruptions, such as the Great Depression. Beginning in 1970 a budget deficit occurred every year for nearly three decades.

The economy of the 1970s was plagued by energy problems, inflation, and high unemployment. These problems persisted into the early 1980s and were aggravated by growing expenses for national defense and domestic programs, such as Medicare, a health-care program for the elderly. During the 1980s the federal deficit regularly exceeded $100 billion each year. Lawmakers responded with legislation, including the Balanced Budget and Emergency Deficit Control Act of 1985 and the Budget Enforcement Act of 1990, but deficits continued to occur.

Each time a deficit occurs the Treasury Department must borrow money to cover the shortfall. Over time the government must pay back the borrowed money plus interest, diverting money from other uses. The borrowed money becomes part of the nation’s debt, which represents a burden upon future taxpayers.
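
In simplified form—a rough accounting sketch that ignores off-budget items, trust funds, and other complications—the relationship can be written as:

\[
\text{deficit}_t = \text{outlays}_t - \text{revenues}_t, \qquad
\text{debt}_t \approx \text{debt}_{t-1} + \text{deficit}_t .
\]

Interest on the accumulated debt then appears as part of the following year’s outlays, which is why persistent deficits compound the burden over time.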

Pushing a Constitutional Amendment

Some lawmakers have considered budget deficits so threatening to the nation’s well-being that they advocated amending the U.S. Constitution to require Congress to balance the budget each year. Amending the Constitution is not easy: the measure must pass both the House of Representatives and the Senate with two-thirds majorities and then be ratified by three-fourths of the states. Only proposals with broad support from both major political parties and the public have a chance of succeeding.

Since the 1930s there have been many attempts to pass a balanced-budget amendment, but most attempts failed to make serious progress. Although the measures often found sufficient support in the House, they were stymied in the Senate. During the 1980s the political climate shifted when the newly dominant Republican Party, as part of its national platform, focused on elimination of federal deficits. Republican presidents Ronald Reagan (1911–2004) and George H. W. Bush (1924–) made a balanced-budget amendment a high priority. However, proposals were voted down in Congress in 1982, 1986, 1990, and 1992. In 1986 a balanced-budget amendment failed by only one vote in the Senate.

Democratic President Bill Clinton (1946–), who opposed a balanced-budget amendment, faced stiff opposition on the issue from the Republicans who took control of Congress in 1994. The so-called Republican Revolution was based on a platform called the Contract with America; among its prominent features was a balanced-budget amendment. Congressional votes between 1994 and 1997 did not obtain the majorities needed for passage. In 1997 a proposal failed by only one vote in the Senate.

By that time a booming U.S. economy had caused budget deficits to decline. In 1998 there was a budget surplus for the first time in nearly thirty years. Annual surpluses continued each year through fiscal year 2001. The debate about a constitutional amendment for a balanced budget faded from the political landscape.

The Return of Deficits

Budget deficits returned in fiscal year 2002 during the presidency of George W. Bush (1946–). The terrorist attacks of September 11, 2001, and subsequent wars in Afghanistan and Iraq dramatically increased government spending on national defense and homeland security. In fiscal year 2004 the deficit reached $413 billion, an all-time record. Congress passed the Deficit Reduction Act of 2005, which called for major cuts in spending on Medicare; Medicaid, the health-care program for the very poor; and student loan programs. However, hurricanes Katrina and Rita devastated the Gulf Coast region that year, causing a spike in federal spending.

In 2007 Bush proposed a long-term budget plan to achieve a balanced federal budget by fiscal year 2012. The plan assumed that Congress would slow spending on domestic programs and that U.S. expenses on the war and rebuilding in Iraq would decline dramatically. Bush’s critics were doubtful that those reductions would occur.

Amending the Constitution

As of mid 2007 there were twenty-seven amendments to the U.S. Constitution. Amendments 1 through 10—known as the Bill of Rights—were ratified by 1791. Five more amendments were added by the end of the 1800s, and the remaining dozen were ratified during the twentieth century.

The original framers of the Constitution purposely made it difficult to amend the document. Article V describes the two alternative processes that must be followed. The first requires a two-thirds majority vote in both the House of Representatives and the Senate. A two-thirds vote, rather than a simple majority (more than 50 percent), helps to ensure that an amendment enjoys broad support from the major political parties. Following congressional approval an amendment must then be ratified by at least three-fourths of the states. A second method for amending the Constitution requires the calling of a Constitutional Convention by at least two-thirds of the state legislatures. Any amendment resulting from the convention still must be ratified by three-fourths of the states. This method has never been used.
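
As a quick arithmetic sketch—assuming the current 435-seat House, 100-seat Senate, and 50 states, with every member voting—those fractions work out to:

\[
\tfrac{2}{3}\times 435 = 290 \text{ House votes}, \qquad
\left\lceil \tfrac{2}{3}\times 100 \right\rceil = 67 \text{ Senate votes}, \qquad
\left\lceil \tfrac{3}{4}\times 50 \right\rceil = 38 \text{ states}.
\]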

James Madison (1751–1836), one of the founding fathers and the fourth president of the United States, once wrote that the Constitution should only be amended on “certain great and extraordinary occasions.” Despite this admonition more than eleven thousand amendments have been proposed in Congress over the years. In recent decades amendments have been offered on such topics as balancing the federal budget, criminalizing flag burning, imposing term limits on politicians, and protecting the rights of crime victims.

Advocates believe that a constitutional amendment, rather than conventional legislation, is the preferred way to commit the nation to a bedrock principle (such as the right of people of all races to vote, which was granted by the Fifteenth Amendment). However, many historians have argued that lawmakers often espouse amendments to gain political attention and are too quick to use them for fashionable causes that may not stand the test of time. The prime historical example is the prohibition of alcohol by the Eighteenth Amendment, which was ratified in 1919 and went into effect in 1920. It was repealed only fourteen years later by the Twenty-first Amendment.

Bob Jones University v. United States

In Bob Jones University v. United States (1983) the U.S. Supreme Court ruled that private schools with racially discriminatory practices are not eligible for tax-exempt status. The case was decided along with Goldsboro Christian Schools, Inc. v. United States. In both cases the private schools operated as nonprofit corporations and espoused fundamentalist Christian beliefs. Bob Jones University was opposed to interracial dating and marriage. Goldsboro Christian Schools, Inc., refused admittance to students who were not wholly or partially Caucasian. Both schools asserted that their school policies on these issues were biblically based and constitutionally protected. However, the court decided that the policies violated the public goal of eliminating racial discrimination in the schools and therefore invalidated the schools’ claims to tax-exempt status.

Background

In 1970 the Internal Revenue Service (IRS) ruled that it would no longer grant tax-exempt status under Section 501(c)(3) of the Internal Revenue Code to private schools that practiced racial discrimination. The ruling was prompted by a lawsuit filed earlier that year by parents of African-American children who attended public schools in Mississippi. The parents argued that the IRS should not grant tax-exempt status under 501(c)(3) to private schools in the state that discriminated against African-Americans.

Bob Jones University is a private Christian college and seminary located in Greenville, South Carolina. The university had a policy, which it believed to be Bible-based, that forbade interracial dating and marriage by its students. Students who belonged to organizations that advocated interracial dating or marriage could be expelled as well. Goldsboro Christian Schools, Inc., operated a private Christian school in Goldsboro, North Carolina. Based on its interpretation of the Bible, the school refused admission to students who were not Caucasians, although it had made a few exceptions for racially mixed students with at least one Caucasian parent. Both schools fought the new IRS ruling in court, arguing that the IRS had exceeded its delegated powers and that the ruling infringed upon their rights under the religion clause of the First Amendment. That clause says “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.”

The Supreme Court Decision

By a vote of 8 to 1 the court ruled that the IRS had acted legally in denying tax-exempt status to the two schools. Chief Justice Warren Burger (1907–1995), in the majority opinion, wrote that the original purpose of Congress, when it created the tax benefits in Section 501(c)(3), was to encourage the development of private institutions “that serve a useful public purpose.” He noted that racial discrimination in education had been found by the court in recent decades to violate national public policy and individual rights. Therefore, the discriminatory policies of Bob Jones University and Goldsboro Christian Schools, Inc., were deemed “contrary to public policy.” Burger concluded that “the Government’s fundamental overriding interest in eradicating racial discrimination in education substantially outweighs whatever burden denial of tax benefits places on petitioners’ exercise of their religious belief.”

The dissenting vote was cast by Justice William Rehnquist (1924–2005), who argued that the IRS had exceeded its authority because it interpreted Section 501(c)(3) in a manner that went beyond the original language used by Congress. Rehnquist agreed that the United States did have a “strong national policy” against racial discrimination, but noted that Congress had failed to specify this policy in Section 501(c)(3), and therefore “this Court should not legislate for Congress.”

Texas v. Johnson

In Texas v. Johnson (1989) the U.S. Supreme Court ruled that burning the American flag is protected as free speech under the First Amendment to the U.S. Constitution. The case involved a Texas man who burned a flag during a political protest.

Background

In 1984 Gregory Johnson participated in a protest march during the Republican National Convention in Dallas, Texas. He and approximately one hundred other participants were protesting the policies of President Ronald Reagan (1911–2004). The march culminated outside the city hall, where Johnson doused an American flag with kerosene and set it afire. The protestors gathered around the burning flag and chanted, “America, the red, white, and blue, we spit on you.” No one was physically injured during the incident, and the demonstrators dispersed.

Johnson was subsequently convicted of violating a Texas law that made “desecration of a venerated object” a misdemeanor offense. In particular that law defined desecrate to mean “deface, damage or otherwise physically mistreat in a way that the actor knows will seriously offend one or more persons likely to observe or discover his action.” Several witnesses testified that Johnson’s burning of the flag offended them. He was sentenced to a year in jail and fined $2,000. His conviction was first affirmed on appeal and then reversed by a higher Texas court. The U.S. Supreme Court agreed to hear the case and decide whether the Texas law was constitutional.

The Supreme Court Decision

The court ruled 5 to 4 that Johnson’s act of flag burning constituted expressive conduct protected as free speech under the First Amendment to the U.S. Constitution. Justice William Brennan (1906–1997), writing for the majority, noted that although the amendment specifically refers to “speech,” the Supreme Court had long considered the protection to extend beyond the spoken and written word to conduct “with elements of communication.”

The majority rejected claims from Texas authorities that Johnson’s conviction was justified by the state’s interest in preserving the peace. No evidence was offered that the flag burning incited a riot or would incite a riot. Furthermore, the majority said, Johnson’s expressive conduct did not include “fighting words” that could have been interpreted by an onlooker as a direct personal insult or a dare to start a fight. Brennan noted that a separate Texas law prohibits disturbing the peace, but Johnson was not charged under that law. The court also rejected the state’s argument that it had an interest in protecting the flag as “a symbol of nationhood and national unity.” Brennan wrote that “if there is a bedrock principle underlying the First Amendment, it is that the government may not prohibit the expression of an idea simply because society finds the idea itself offensive or disagreeable.”

The court’s decision caused a storm of controversy. Congress responded with the Flag Protection Act of 1989, which the Supreme Court later struck down as unconstitutional. Efforts to obtain a constitutional amendment to prohibit flag burning also failed.

Protecting the Flag

Many politicians expressed outrage at the U.S. Supreme Court decision in Texas v. Johnson that flag burning is protected free speech under the First Amendment to the U.S. Constitution. Congress passed a number of resolutions condemning the decision. President George H. W. Bush (1924–) called for passage of a constitutional amendment to make flag desecration a federal crime. At first that idea had widespread political backing, particularly with Republicans, but it quickly lost steam. Instead, legislators enacted an amendment to existing U.S. Code. It was called the Flag Protection Act of 1989 and went into effect on October 30, 1989. That very day two people were charged under the new law—Shawn Eichman in Washington, D.C., and Mark Haggerty in Seattle, Washington.

Their cases were combined and argued before the U.S. Supreme Court, which ruled 5 to 4 in United States v. Eichman (1990) that the new law was unconstitutional because it violated the First Amendment protections of freedom of speech. Justice William Brennan (1906–1997) wrote the majority decision, as he had in Texas v. Johnson. “Punishing desecration of the flag dilutes the very freedom that makes this emblem so revered, and worth revering,” he wrote.

The decision spurred renewed efforts in the Bush administration to amend the Constitution to ban flag desecration. In the following years such an amendment was approved several times by the House of Representatives, but was always voted down by the Senate. Obtaining passage of a constitutional amendment is difficult: it requires a two-thirds majority vote in both houses of Congress and then ratification by at least thirty-eight of the fifty states, typically within a seven-year deadline set by Congress.

Cruzan v. Director, Missouri Department of Health

In Cruzan v. Director, Missouri Department of Health (1990) the U.S. Supreme Court upheld as constitutional the decision of Missouri courts to maintain life support for a woman in a persistent vegetative state. The woman’s parents wanted the life support removed, arguing that she would not have wanted to continue living in such a condition. The court upheld the state’s policy of insisting on “clear and convincing evidence” that a patient would have wanted life support removed. The parents subsequently obtained that evidence, and the feeding tube was removed. Cruzan died shortly thereafter.

Background

In 1983 Nancy Beth Cruzan, who was twenty-five years old, sustained serious injuries in an automobile accident. She was not breathing when paramedics reached her. Medical experts estimated she had been deprived of oxygen for at least twelve minutes; permanent brain damage is presumed to occur after approximately six minutes of oxygen deprivation. Cruzan was resuscitated and taken to a hospital, where doctors found she had suffered cerebral contusions. After being in a coma for several weeks, she progressed to a persistent vegetative state in which she showed some motor reflexes, but no indication of significant brain function. A feeding and hydration tube was implanted to provide nutrition and water. After it became apparent that Cruzan had virtually no chance of regaining her mental faculties, her parents asked the hospital to remove the tube and allow her to die. The hospital refused, and the parents obtained a state court order authorizing removal of the tube. That decision was overturned by the Supreme Court of Missouri.

The U.S. Supreme Court Decision

The U.S. Supreme Court, voting 5 to 4, upheld the decision of the Missouri Supreme Court. At issue was the Fourteenth Amendment to the U.S. Constitution, which says that no state shall “deprive any person of life, liberty, or property, without due process of law.” Justice William Rehnquist (1924–2005), writing for the majority, noted that the due process clause protects an individual’s interest in life and in refusing life-sustaining medical treatment. However, the right is not extended to “incompetent” people because they are “unable to make an informed and voluntary choice.”

The state of Missouri did have a procedure in place that allowed a surrogate—a person acting on behalf of another—in certain cases to make life-or-death decisions for a patient deemed incompetent. The procedure required that the surrogate’s actions meet as best as possible the wishes expressed by the patient while still competent. Cruzan’s parents had presented the testimony of one of their daughter’s roommates, who recalled that Cruzan had once said she would not wish to be kept alive in such circumstances. The Missouri Supreme Court had ruled that this was not “clear and convincing evidence” that Cruzan would want the feeding and hydration tube removed. The U.S. Supreme Court agreed. Rehnquist concluded that “there is no automatic assurance that the view of close family members will necessarily be the same as the patient’s would have been had she been confronted with the prospect of her situation while competent.”

After the Supreme Court decision made national news, three people who had known Cruzan contacted her parents and told them about conversations in which she had said she would not want to be kept alive in a vegetative state. This evidence was presented to a Missouri court and deemed to be “clear and convincing.” On December 14, 1990, the feeding and hydration tube was removed. Cruzan died on December 26, 1990, at age thirty-three.

Cipollone v. Liggett Group, Inc.

In Cipollone v. Liggett Group, Inc. (1992) the U.S. Supreme Court ruled that the federally mandated health warnings that appear on cigarette packages do not protect cigarette manufacturers from being sued under state personal-injury laws. The case involved a smoker who sued three cigarette manufacturers after she contracted lung cancer from smoking for more than forty years. The court ruled on the specific types of lawsuits that can be filed—mainly those involving claims of fraudulent advertising or conspiracy to mislead the public about the adverse health effects of cigarette smoking.

Background

In 1983 New Jersey residents Rose Cipollone and her husband filed a lawsuit against three cigarette companies—Liggett Group, Philip Morris, and Lorillard—alleging that she had contracted lung cancer because of the harmful effects of smoking. She died in 1984. Her son, acting on behalf of her estate, filed an amended lawsuit. It alleged that the cigarettes were defective; that the manufacturers had failed to provide adequate warnings on the packages; and that the cigarette companies had been negligent in the way they researched, advertised, and promoted their product. In addition the lawsuit alleged that the companies had warranted that smoking did not have significant health consequences; had used advertising to neutralize the federally required warning labels; had failed to act upon data in their possession indicating that cigarette smoking was hazardous; and had conspired to withhold that data from the public.

The manufacturers contended that the federally mandated warning labels—required by the Federal Cigarette Labeling and Advertising Act of 1965 and strengthened in 1969 to read “Warning: The Surgeon General Has Determined That Cigarette Smoking Is Dangerous to Your Health”—protected them from liability incurred after 1965. A jury rejected most of Cipollone’s claims but found that Liggett Group had “breached its duty to warn and its express warranties” prior to 1966. However, the jury noted that Cipollone had voluntarily incurred “a known danger” by smoking cigarettes and was 80 percent responsible for her injuries. Her husband was awarded $400,000 as compensation for the breach-of-warranty claim. He died after the trial. The case was appealed to the U.S. Supreme Court.

The Supreme Court Decision

The Supreme Court delivered a complicated decision, in which it reversed parts of the previous judgment and affirmed other parts. Justice John Paul Stevens (1920–) wrote that the federal government’s 1965 law did not preempt lawsuits seeking damages at the state level. The court said that such lawsuits could not claim that cigarette manufacturers had failed to warn about the dangers of cigarette smoking; they could, however, claim that the manufacturers used fraudulent advertising or conspired to mislead people about those dangers.

The Supreme Court decision prompted many lawsuits against major tobacco companies. Although juries awarded a few settlements, most legal actions against “Big Tobacco” were unsuccessful because they were filed as class-action suits, in which large numbers of complainants unite to sue. Court rulings found that class-action status was not appropriate because individual issues predominated over common issues in these cases.

The Government and Big Tobacco

In the mid 1990s a number of state governments filed lawsuits against tobacco manufacturers to recoup Medicaid funds spent on tobacco-related illnesses. Medicaid is a health program for the poor that is funded by taxpayer money. In 1998 a “master settlement agreement” was reached between the major tobacco manufacturers and forty-six state attorneys general (Texas, Florida, Minnesota, and Mississippi settled independently). The tobacco companies accepted limitations on how they market and sell their products, including ending youth-targeted advertising, marketing, and promotion; limiting brand-name sponsorship of events with significant youth audiences; terminating outdoor advertising; banning youth access to free samples; and setting the minimum package size at twenty cigarettes. (The last requirement expired at the end of 2001.) In addition, the tobacco companies agreed to pay more than $200 billion to the states.

In his 1999 State of the Union address President Bill Clinton (1946–) promised to sue the tobacco industry to recover money spent by the federal government to treat illnesses caused by smoking. The U.S. Department of Justice filed suit, alleging that tobacco companies had misled and defrauded the public regarding the dangers of cigarette smoking. The federal government hoped to recover more than $200 billion from the companies under a federal racketeering law and force cigarette manufacturers to abide by new sales and marketing restrictions.

In 2005 a federal appeals court blocked the government’s claim for monetary damages. In August 2006 a federal judge ruled that the tobacco companies had engaged in a conspiracy for decades to deceive the public about the health risks of cigarette smoking. However, she refused to impose the multibillion-dollar damages that the government had requested. Instead, she ordered the cigarette companies to cease using labels such as “light” or “low tar” on certain brands, arguing that these labels are deceptive. She also ordered the companies to conduct an advertising campaign to warn people about the adverse health effects of smoking.

The Brady Handgun Violence Prevention Act of 1993

The Brady Handgun Violence Prevention Act of 1993 created a national system to check the backgrounds of people who want to purchase handguns. At the time it was anticipated that a computerized system, which would provide almost instantaneous background checks, would become available within five years. In the meantime, an interim system was set up that required local law-enforcement agencies to perform background checks on prospective handgun buyers. The interim system was subsequently struck down as unconstitutional by the U.S. Supreme Court. In 1998 a federally operated computerized system for background checks went into effect to satisfy the intent of the original legislation.

Background

The act is named for James Brady (1940–), who was the White House press secretary under President Ronald Reagan (1911–2004). In 1981 Reagan and Brady were both shot during an assassination attempt. Reagan fully recovered from his injuries, but Brady was left permanently disabled from a brain injury. In the years following the shooting, he and his wife devoted themselves to the cause of gun control. This was not a popular cause with Republican leaders. Despite a groundswell of public support for gun control, legislative efforts waged during the administrations of Reagan and his successor, George H. W. Bush (1924–), were not successful. Much of the pressure against such legislation was exerted by the National Rifle Association, which promotes the rights of gun owners and provides gun-safety education.

In 1993 the political climate changed with the inauguration of Democratic President Bill Clinton (1946–), who made passage of a new gun-control bill a prominent goal of his administration. The result was the Brady Act, which passed the House of Representatives 238 to 189 and the Senate 63 to 36. The bill was essentially an amendment to the Gun Control Act of 1968, which prohibited dealers from selling handguns to certain categories of buyers, including convicted felons, the mentally impaired, and people under age twenty-one. However, that legislation relied on the honesty of the buyers in providing accurate information at the time of purchase. Critics called it the “lie and buy” system.

The Brady Act required the federal government to establish a national instant background-check system by November 30, 1998. In the interim it established a temporary system that required firearms dealers to submit applications from prospective gun buyers to their local “chief law-enforcement officer” for a background check to be completed within five business days. Sheriffs in Arizona and Montana challenged the law in court, arguing that it was unconstitutional for the federal government to impose federal duties on local law-enforcement officers. A U.S. District Court agreed, but its finding was reversed by a federal appeals court. The case then went before the U.S. Supreme Court.

The Supreme Court Decision

In Printz v. United States the Supreme Court ruled 5 to 4 that the interim background-check system was unconstitutional. The majority decision, written by Justice Antonin Scalia (1936–), noted that the “federal government’s power would be augmented immeasurably and impermissibly if it were able to impress into its service—and at no cost to itself—the police officers of the 50 states.” The ruling ended mandatory background checks; however, local law-enforcement officers were free to conduct background checks voluntarily.

A Brady Act Update

In 1998 the Federal Bureau of Investigation’s computerized National Instant Criminal Background Check System went into effect, fulfilling the original intent of the Brady Act. Meanwhile, Brady and his wife formed the Brady Center to Prevent Gun Violence and its affiliate, the Brady Campaign, to promote stronger gun-control laws.

Universal Health Insurance

Universal health insurance is a system in which everyone is covered by some type of health insurance, whether private or public. Such plans became a high priority in the late twentieth and early twenty-first centuries: as the cost of medical care skyrocketed and fewer workers were covered by employer-funded health plans, millions of people could not afford to buy private medical insurance. Most of them did not qualify for the government-funded insurance plans that already existed—Medicare for the elderly and Medicaid for the very poor. When uninsured people require treatment for illness or injury and cannot pay their bills, the costs are ultimately passed on to other consumers in the health-care system. Most advocates for universal health insurance see it as a matter of social justice—they believe that affordable health care should be a right guaranteed to all Americans.

Background

Several systems have been proposed for implementing universal health insurance. In the most centralized system—often called “socialized medicine”—the doctors and other medical workers are government employees and the hospitals are owned by the government. That type of system is used in some European countries, notably the United Kingdom, and by the U.S. Veterans Administration. In another model, known as single-payer national health care, doctors have their own private practices but their fees are paid directly from a single government fund. Health-care providers and hospitals negotiate with the government to determine fees. Medicare is an example of a single-payer system. The least-centralized type of universal health care is market-based: everyone is covered, by either public or private insurance, and the costs are divided up among the consumers of health care, the government, and employers.

Two States Try It

In 2006 Massachusetts lawmakers approved a comprehensive market-based system for universal health coverage that was endorsed by Republican Governor Mitt Romney (1947–). The plan required all state residents to be covered by a public or private health insurance plan by July 2007. Businesses with at least ten employees had to pay for health insurance for their employees or pay a fee to the government. The state subsidized—on a sliding scale—the cost of insurance for those residents who did not get it from their employers and could not afford it on their own. Massachusetts officials expected that the plan would reduce some costs. For example, uninsured people often go to hospital emergency rooms when they are sick because they cannot be refused treatment there; providing health insurance for those people would give them an incentive to visit doctors’ offices or clinics instead, which would be much cheaper.

In January 2007 Arnold Schwarzenegger (1947–), the Republican governor of California, announced a similar proposal for insuring all of that state’s thirty-six million residents.

The National Level

Universal health insurance is not a new political issue at the national level. It was advocated by Democratic presidents Franklin Roosevelt (1882–1945) and Harry Truman (1884–1972) in the 1940s. Neither could muster enough public and congressional support for a program, however. In the 1990s it played a prominent role in the political agenda of Democratic President Bill Clinton (1946–). He placed health-care reform under the direction of his wife, Hillary Clinton (1947–). She chaired the President’s Task Force on National Health Care Reform, which developed a plan for universal government-sponsored health-care insurance. The plan was roundly condemned for its complexity, cost, and bureaucratic nature and did not proceed further. Other proposals were offered in the following years, but, as of mid 2007, none had made headway on the national level.

The North American Free Trade Agreement

The North American Free Trade Agreement (NAFTA) is an agreement implemented in 1994 between the United States, Canada, and Mexico to lower trade barriers between the three nations. NAFTA was politically controversial when it was first introduced and passed. Criticism faded, but did not disappear, in the following years. Historically, free-trade agreements have been embraced by Republican leaders because of their commitment to lowering government impediments to business; NAFTA was politically unique, however, because it also enjoyed strong support from a Democratic president—Bill Clinton (1946–), who was instrumental in securing its passage.

Trade Barriers

A tariff is a duty, or tax, imposed on imported goods by a government. Tariffs have economic, foreign-relations, and political effects. High tariffs imposed by the U.S. government on incoming products raise government revenues and give U.S. producers of the same goods a price advantage in the United States. However, the tariffs typically increase prices for the imported products, which displeases American consumers and can prompt foreign nations to retaliate by imposing high tariffs on American goods. Maintaining an appropriate balance between these competing interests is a challenge; the goal of the U.S. government, starting in the 1980s, was to reduce or eliminate tariffs and other trade barriers to allow easier flow of goods, services, and money between nations. This effort was called trade liberalization or free trade.
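
As a simple illustration with hypothetical numbers, an ad valorem tariff acts as a percentage surcharge on an import’s declared value:

\[
\text{duty} = \text{tariff rate} \times \text{import value} = 0.10 \times \$100 = \$10,
\]

so a good that would have entered duty-free at $100 reaches the U.S. market at roughly $110, before the importer’s other costs and markups.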

Modern Trade Agreements

Following World War II the United States and the United Kingdom pushed for creation of an international trade organization to negotiate trade rules and tariff reductions between countries. The result was the General Agreement on Tariffs and Trade (GATT), which was established in 1947. In 1995 the World Trade Organization (WTO) was formed to oversee GATT and monitor other trade rules. The WTO also provides a framework for trade negotiations.

The United States forged a free-trade agreement with Israel in 1985 during the presidency of Republican Ronald Reagan (1911–2004), a staunch advocate of free trade as part of his overall economic philosophy of reducing government regulation of business. The U.S.-Israel Free Trade Area agreement gradually eliminated duties on Israeli merchandise entering the United States. It did not prevent the two countries from using other trade barriers, such as quotas, to protect certain agricultural commodities.

In 1988 the Reagan administration negotiated the U.S.-Canada Free Trade Agreement (FTA), which targeted trade in such sectors as agriculture, automobiles, energy, and financial services. It also established procedures for resolving trade disputes. The FTA received little attention in the United States, but became a major political issue in Canada. Its opponents feared the agreement would weaken Canada’s economy—and even undermine the nation’s sovereignty—in the face of the much larger and stronger U.S. economy. FTA advocates prevailed, however, and the agreement went into effect in 1989. Plans for a larger free-trade area were already under way.

NAFTA Politics

In 1990 President George H. W. Bush (1924–), a Republican, informed Congress that his administration intended to negotiate a free-trade agreement with Mexico. Canada soon joined the negotiations, which proceeded through 1992. When the details of NAFTA were first revealed, they drew congressional opposition, mostly from Democratic lawmakers who expressed concerns about the impact of the agreement on American workers and the environment. Bush signed NAFTA on December 17, 1992, shortly before he left office; however, the agreement still had to receive congressional approval to go into effect. Under federal law NAFTA was a congressional-executive agreement, rather than a treaty. A treaty requires a two-thirds majority in the Senate for passage; NAFTA required only a simple majority (more than 50 percent) in both the House and the Senate.

When he became president in 1993, Clinton pledged to continue U.S. support for NAFTA. He faced an uphill battle with members of his own party who had majority control of the U.S. House and U.S. Senate for the first two years of his presidency. Democrats were under intense pressure from their historical allies—labor unions—not to support NAFTA for fear it would drive wages down in the United States and encourage industries to move manufacturing jobs to Mexico to take advantage of cheaper, nonunionized labor. Some environmental groups also opposed NAFTA, fearing that industrial expansion in Mexico would be loosely regulated, leading to pollution problems.

Politicians at the left end of the political spectrum, including Ralph Nader (1935–) and foes of globalization, believed NAFTA would benefit only large multinational corporations and hurt middle- and working-class people. On the right, politicians such as Pat Buchanan (1938–) argued that NAFTA would weaken the nation’s sovereignty and increase foreign influence over the U.S. government. Ross Perot (1930–), a Texas billionaire who had made a respectable showing as an independent candidate for president against Clinton in the 1992 presidential election, famously said that Americans would “hear a giant sucking sound” as their jobs went south into Mexico.

Despite well-funded opposition to NAFTA, Clinton managed to gain its passage with strong support from Republican lawmakers. On November 17, 1993, the North American Free Trade Agreement Implementation Act passed the House 234 to 200. Three days later it passed the Senate 61 to 38.

NAFTA Is Implemented

NAFTA, which went into effect on January 1, 1994, gradually eliminated nearly all tariffs between the United States and Canada by 1998 and between the U.S. and Mexico by 2008. NAFTA also removed many nontariff barriers that helped to exclude U.S. goods from the other two markets. The agreement ended restrictive government policies on investors; included provisions to protect intellectual property rights, such as trademarks and patents; and ensured that industries and businesses in all three countries would have access to government procurement contracts.

Despite its tumultuous beginnings, NAFTA became largely a political nonissue during the prosperous decade that followed its passage. Perot tried, but failed, to make it a major concern during the 1996 presidential campaign. Opposition to NAFTA did not fade completely—it was regularly criticized as detrimental to labor standards and workers’ rights and blamed for the transfer of jobs from the United States to Mexico, particularly in the manufacturing sector. This movement of jobs was part of a broader economic phenomenon known as offshoring, in which U.S. businesses relocate all or portions of their work to foreign countries, primarily developing countries where labor costs are much cheaper. Although offshoring is decried by politicians, some economists argue that it ultimately leads to lower prices for U.S. consumers, which, they say, benefits the overall economy.

Advocates of NAFTA point to data indicating increases in trade among the three nations and growth in the U.S. economy since the agreement was passed. Opponents believe the gains were the result of many factors and might have occurred without NAFTA. The administration of Republican President George W. Bush (1946–) continued the nation’s strong support of NAFTA and secured passage of free-trade agreements between the United States and other countries.

On the Fast Track

NAFTA, like many other trade agreements adopted by the United States, came about through a legislative procedure known as “expedited consideration” or “fast track.” The president or other members of the executive branch negotiate foreign trade agreements. However, the U.S. Constitution requires all bills raising revenue to originate in the House of Representatives and gives the Senate the opportunity to propose or concur with amendments. Because tariffs are revenues, free-trade agreements require congressional approval. That requirement makes it difficult for a president to negotiate in good faith, because he cannot know whether the resulting agreement will win subsequent congressional approval.

The Tariff Act of 1890 delegated tariff-bargaining authority to the president and allowed him to change or remove existing duties on particular items. This law was challenged in the U.S. Supreme Court, but upheld as constitutional. The Reciprocal Trade Agreements Act of 1934 granted the president temporary authority to enter into tariff agreements and, within certain limits, to set tariffs without obtaining subsequent congressional approval. That authority was used when the United States joined the General Agreement on Tariffs and Trade (GATT) in 1947.

The Trade Act of 1974 allowed the president to negotiate multiparty trade agreements during a set period of time and permitted him to make certain tariff reductions and modifications. It also included procedures for expedited consideration of bills resulting from trade negotiations. The fast-track provision meant that Congress could only vote yes or no on the bills—no revisions were allowed—and had to conduct the vote within a specific time period—typically ninety days.

The Trade and Tariff Act of 1984 gave the president temporary statutory authority to make bilateral (two-country) free-trade agreements. The Omnibus Trade and Competitiveness Act of 1988 extended presidential authority to enter free-trade agreements until 1993. (It was later extended to 1994.) That authority and the fast track were used to negotiate NAFTA and get it passed into law.

The presidential fast-track authority expired in 1994 and was not reinstated until passage of the Trade Act of 2002, during the administration of George W. Bush (1946–). That law set a new expiration date of June 2007.

United States v. Lopez

In United States v. Lopez (1995) the U.S. Supreme Court ruled that a federal law criminalizing the possession of a firearm in a school zone was unconstitutional because Congress had overstepped its power under the U.S. Constitution. A student in Texas, who had been arrested at school for carrying a firearm, was charged under the law. The court found that the law did not fall under the powers granted by the U.S. Constitution to the legislative branch of the federal government to regulate interstate commerce.

Background

In 1992 Alfonzo Lopez, a twelfth-grader at Edison High School in San Antonio, Texas, was arrested at school for carrying a concealed .38-caliber handgun and five bullets. He was charged with violating a state law that prohibits the carrying of firearms on school grounds. The following day the state charges were dropped, and Lopez was charged under the federal Gun-Free School Zones Act of 1990. That law made it a federal offense “for any individual knowingly to possess a firearm at a place that the individual knows, or has reasonable cause to believe, is a school zone.” A U.S. District Court found Lopez guilty and sentenced him to six months in prison and two years of supervision upon release. His lawyers appealed the conviction, arguing that Congress had exceeded its power in passing the federal law. The U.S. Court of Appeals agreed and reversed the conviction. The case was then brought before the U.S. Supreme Court.

The Court’s Decision

The court decided 5 to 4 that the reversal by the court of appeals was correct because the Gun-Free School Zones Act of 1990 was unconstitutional. Chief Justice William Rehnquist (1924–2005), who wrote for the majority, noted that Article I, Section 8 of the U.S. Constitution says that Congress shall have power “to regulate commerce with foreign nations, and among the several states, and with the Indian Tribes.” The Supreme Court has historically identified three broad categories of activities that Congress may regulate under the commerce clause: the use of the channels of interstate commerce; the instrumentalities of interstate commerce or persons or things in interstate commerce; and activities that substantially affect interstate commerce. While the states have primary authority for defining and enforcing criminal law, Congress can pass criminal laws as long as they fall into at least one of the three categories.

The federal government argued that the law being considered fell under the last category—activities that substantially affect interstate commerce—for two reasons: first, because possession of a firearm in a school zone can result in violent crime, which has detrimental effects on the nation’s economy; and second, because firearms in the schools impair the educational process, which ultimately is bad for the economy. The court rejected the validity of both claims and affirmed the decision of the court of appeals to reverse Lopez’s conviction.

Democratic President Bill Clinton (1946–) publicly expressed his disappointment at the court’s decision and urged lawmakers to amend the law to make it constitutionally acceptable. In the amended law, Congress used new language: “it shall be unlawful for any individual knowingly to possess a firearm that has moved in or that otherwise affects interstate or foreign commerce at a place that the individual knows, or has reasonable cause to believe, is a school zone.” In other words, the new law focuses on the firearm itself and its relationship to commerce to establish Congress’s authority over the crime. The amendment was attached to a budget bill, the Omnibus Consolidated Appropriations Act of 1997, which was enacted as Public Law 104-208. As of mid-2007 the constitutionality of the amendment had not been challenged.

Communications Decency Act of 1996

The Communications Decency Act of 1996 prohibited the use of telecommunications equipment for obscene or harassing purposes. Its primary target was the Internet. In 1997 two provisions of the law were declared unconstitutional by the U.S. Supreme Court. Both provisions criminalized the knowing transmission of obscene or indecent materials to people younger than age eighteen. The court ruled that these provisions were overly broad and suppressed materials that adults have a constitutional right to send and receive under the First Amendment of the U.S. Constitution. In addition it ruled that some of the language used in the provisions was vague and undefined. In 1998 Congress tried again with the Child Online Protection Act. It too was declared unconstitutional by the Supreme Court.

Background

The Telecommunications Act of 1996 was the first major overhaul of U.S. communications law since the Communications Act of 1934. The new law dealt primarily with economic and market issues related to the telecommunications industry. However, legislators added a section, Title V: Obscenity and Violence, that was called the Communications Decency Act (CDA) of 1996. It prohibited a variety of actions, including the transmission of obscene or indecent materials to persons known to be younger than age eighteen.

The act was immediately challenged in court by a large group of petitioners, including online service providers; library and media associations; civil-liberty groups, such as the American Civil Liberties Union (ACLU); and thousands of individual Internet users. After a U.S. District Court in Pennsylvania ruled in favor of the petitioners, the federal government appealed the case to the Supreme Court.

The Supreme Court Decision

The Supreme Court upheld the lower court’s decision in Reno v. American Civil Liberties Union. Specifically, the court considered two disputed provisions of the law: Section 223(a)(1) prohibited the use of telecommunications devices to knowingly transmit any language or image communication “which is obscene or indecent” to a recipient younger than age eighteen. Section 223(d) prohibited the use of any interactive computer service to display to a person younger than age eighteen any language or image communication that “depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs.”

The court ruled that the provisions violated the right of freedom of speech guaranteed by the First Amendment. In particular, the court observed that “the CDA lacks the precision that the First Amendment requires when a statute regulates the content of speech.” Although the court acknowledged that the law had “legitimate purposes,” it found that the provisions placed an unacceptable burden on adult speech. It pointed to less restrictive alternatives that could be at least as effective, such as computer software that parents could use to prevent their children from accessing material the parents considered inappropriate. In addition the court noted that the terms “indecent” and “patently offensive” were undefined in federal law. (The term “obscene” is already defined in federal law.)

Congress Tries Again

In 1998 Congress passed the Child Online Protection Act in an attempt to remedy the constitutional problems with the CDA. The new law prohibited “communication for commercial purposes that is available to any minor and that includes any material that is harmful to minors.” The law was immediately challenged in federal court, and a temporary injunction was issued to prevent its enforcement. Eventually, in Ashcroft v. American Civil Liberties Union (2004), the Supreme Court upheld the injunction and returned the case to the lower courts. In 2007 a federal court in Pennsylvania issued a final ruling that struck down the law for interfering with First Amendment rights.

The Library Wars

In 2000 Congress passed the Children’s Internet Protection Act (CIPA), which requires public schools and libraries that receive certain types of federal funding to implement measures to keep minors from viewing obscene or otherwise harmful content on the Internet. According to Federal Communications Commission rules implemented in 2001, schools subject to CIPA must adopt and enforce a policy to monitor the online activities of minors. Schools and libraries subject to CIPA must ensure that minors cannot access inappropriate materials, and they must protect the safety of minors using e-mail, chat rooms, and other forms of electronic communications. CIPA includes a provision that allows blocking or filtering measures to be disabled when adults want to use the computers for legitimate research or other lawful purposes.

In 2001 the American Library Association and the American Civil Liberties Union challenged the new law in federal court. The court ruled unanimously that CIPA violated the First Amendment; that Congress had exceeded its authority; and that the use of software filters was not an action sufficiently tailored to the government’s interest in preventing the dissemination of harmful materials to minors.

The government appealed the ruling to the U.S. Supreme Court. In U.S. v. American Library Association (2003) the court, by 6 to 3, overturned the lower court’s ruling. It found CIPA to be constitutional because the blocking and filtering mechanisms can be easily disabled to allow adult access to online content. Chief Justice William Rehnquist (1924–2005), writing for the court, noted, “Because public libraries’ use of Internet filtering software does not violate their patrons’ First Amendment rights, CIPA does not induce libraries to violate the Constitution, and is a valid exercise of Congress’ spending power. Congress has wide latitude to attach conditions to the receipt of federal assistance to further its policy objectives.”

Clinton v. City of New York

In Clinton v. City of New York (1998) the U.S. Supreme Court ruled that the Line Item Veto Act of 1996 was unconstitutional. The line-item veto allows a president to veto particular items in the federal budget bills passed by Congress and submitted for the president’s signature. While advocates tout the presidential tool as a way to cut out wasteful federal spending, critics believe it grants too much power to the president.

Background

The line-item veto, which has been sought by presidents since the Civil War, would allow a president to veto particular items in spending bills, which often contain provisions inserted by lawmakers to fund “pet projects,” such as roads and bridges, in their home districts. The projects are also called “pork” because legislators can take credit for “bringing home the bacon.” Advocates of the line-item veto believe that presidents should be able to eliminate expensive pet projects and pork from budget bills to save taxpayers money.

The push for the line-item veto gained momentum during the administrations of Republican presidents Richard Nixon (1913–1994), Gerald Ford (1913–2006), Ronald Reagan (1911–2004), and George H. W. Bush (1924–). It was also a prominent feature of the Contract with America, the Republican Party platform for the midterm election campaign of 1994.

In 1996 Congress passed the Line Item Veto Act. Clinton, the first president to have the authority, used the line-item veto on the Balanced Budget Act of 1997 and the Taxpayer Relief Act of 1997. The items he vetoed, which affected a variety of jurisdictions, including New York City, were then returned to Congress for reconsideration. After the law was ruled unconstitutional by a U.S. District Court, the case ended up before the U.S. Supreme Court.

The Supreme Court Ruling

The court, by 6 to 3, upheld the lower court’s ruling that the line-item veto law was unconstitutional. Justice John Paul Stevens (1920–), writing for the court, noted that the law violated the presentment clauses, set out in Article I, Section 7 of the U.S. Constitution, which prescribe how a bill becomes a law. The presentment clauses say that every bill passed by Congress “shall, before it become a law, be presented to the President of the United States; if he approve he shall sign it, but if not he shall return it…” and that all orders, resolutions, or votes in which the concurrence of Congress is necessary (excluding adjournments) “shall be presented to the President of the United States; and before the same shall take effect, shall be approved by him, or being disapproved by him, shall be repassed by two thirds of the Senate and House of Representatives.…” In other words, Stevens said, the president has only two choices—to approve or to disapprove of a bill in its entirety.

Stevens noted that if Congress wanted to implement a new procedure for presentments, such a change would have to come through an amendment to the Constitution.

The court’s ruling was hailed by strict constitutionalists, but disappointed the Clinton administration as well as Republican leaders. In February 2006 Clinton’s successor, Republican President George W. Bush (1946–), asked Congress for line-item veto authority in a new bill that he believed would pass constitutional muster. The bill passed the Republican-controlled House several months later but never made it to the Senate floor for a vote.

Pork

Bills crafted by Congress generally target matters of broad concern to the American people. However, members of Congress often slip in provisions, particularly in spending bills, that are beneficial only to their own constituents. Examples include funding for bridges, roads, and parks. Over the years these provisions have earned the nickname “pork.” In general, a specific piece of pork is considered wasteful by everyone except those people it benefits directly. Legislators are highly motivated to get pork projects passed because the projects can earn them votes. This is particularly true for members of the House of Representatives, who face re-election every two years. The most successful purveyors of pork are said to “bring home the bacon” for their constituents.

Some projects are loudly—and quickly—branded as pork. In 2005, for example, Congress passed a $286 billion federal highway and mass transportation bill. It included about $200 million for construction of a bridge to connect the small town of Ketchikan, Alaska, to nearby Gravina Island. Ketchikan has a population of around nine thousand and the island has fewer than one hundred residents. The bridge was intended to replace the ferry to the island—the ferry trip takes about ten minutes. Critics called it “the bridge to nowhere.” The bridge was championed by two powerful Alaskan lawmakers—Representative Don Young (1933–) and Senator Ted Stevens (1923–). Both are senior Republican legislators with decades of experience and positions on key congressional committees that make major funding decisions. The bridge sparked widespread criticism in the media and won the Golden Fleece Award from Taxpayers for Common Sense, an organization that calls itself a “nonpartisan budget watchdog.” While Republican leaders agreed to take the bridge out of the spending bill, they provided the same amount of money for Alaska, which the state government could spend as it wished—even on the bridge. Young was reelected easily in 2006, suggesting that bringing home the bacon has its political rewards.

Nixon v. Shrink Missouri Government PAC

In Nixon v. Shrink Missouri Government PAC (2000) the U.S. Supreme Court declared constitutional a Missouri law that placed limits on the amount of money that individuals and groups could contribute to candidates running for state office. The case was initiated by a candidate for Missouri office and a political action committee (PAC) that supported him. They claimed that contribution limits violated constitutional guarantees of free speech, free association, and equal protection. The court rejected these claims and found Missouri’s contribution limits to be justifiable to prevent corruption and the appearance of corruption in government.

Background

In 1994 the Missouri legislature passed a law that imposed limits on campaign contributions to candidates running for state offices. The amount that each contributor was allowed to give to each candidate ranged from $250 to $1,000, depending on the size of the population represented by the office. The law allowed the limits to be raised slightly every two years to compensate for inflation. The contribution limit in 1998 for the office of Missouri state auditor was $1,075.

In 1998 Zev David Feldman, a Republican candidate for that office, received the maximum donation allowed from a PAC named Shrink Missouri Government. PACs are private organizations that raise money to support candidates who share their political interests. Feldman and the PAC alleged in court that the contribution limits violated their rights under the First and Fourteenth Amendments to the U.S. Constitution. Campaign contributions have historically been considered a form of free speech. In addition Feldman argued that the limits severely impeded his campaign efforts and that the effects of inflation were not properly accounted for in the state’s limits. After a U.S. District Court ruled against Feldman and the PAC, the case was appealed and the judgment reversed.

The U.S. Court of Appeals found that Missouri had improperly based its law on the U.S. Supreme Court’s decision in Buckley v. Valeo (1976). That case set the legal precedent for limiting campaign contributions to candidates for national office and declared the limits necessary for “avoiding the corruption or the perception of corruption caused by candidates’ acceptance of large campaign contributions.” The appeals court ruled that Missouri had to show “demonstrable evidence” that “genuine problems” resulted from allowing contributions greater than the limits.

The Court’s Decision

The Supreme Court reversed the appeals court decision, ruling 6 to 3 that Buckley v. Valeo was sufficient authority for imposing contribution limits on candidates for state office, and for the same reasons. Justice David Souter (1939–), writing for the majority, noted that “the cynical assumption that large donors call the tune could jeopardize the willingness of voters to take part in democratic governance.” The court found that a showing of “demonstrable evidence” by the state was not necessary to prove a correlation between large campaign contributions and corruption or the appearance of corruption. The court also ruled that the state law did not have to match the contribution limit specified in Buckley v. Valeo ($1,000 per donor).

The Patriot Act

The Patriot Act, passed only weeks after the September 11, 2001, terrorist attacks on the United States, included a number of provisions intended to improve the country’s security, strengthen and coordinate intelligence and law-enforcement actions against terrorists, and provide aid for people victimized by terrorism. Although considered relatively benign at first, the law soon became controversial for its implications for civil liberties, particularly with respect to surveillance procedures. The Patriot Act was renewed in 2006, but with new safeguards designed to better protect the civil rights of Americans.

Overwhelming Support

The Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act of 2001 is known simply as the Patriot Act. It passed the House of Representatives by a vote of 357 to 66. In the Senate, only one senator voted against it—Russ Feingold (1953–), a Democrat from Wisconsin. He stated that he supported most of the law, but was deeply troubled by the civil-liberty implications of a handful of provisions.

The U.S. Department of Justice claimed that the Patriot Act made only “modest, incremental changes” to existing law related to criminal activities and made tools available in the “war on terror” that had been used for decades against other kinds of crimes, particularly organized crime and drug trafficking. These tools include surveillance techniques, such as wiretapping, and legal maneuvers, such as streamlining procedures to obtain search warrants. The law also allows federal agents to obtain business records relevant to national-security investigations without obtaining a subpoena from a grand jury. Agents make such requests to a special federal court, the Foreign Intelligence Surveillance Court (FISC), which can grant permission if the government meets certain criteria.

Criticism of the Act

The Patriot Act was quickly criticized by private organizations concerned with protecting civil liberties, including the American Civil Liberties Union (ACLU) and the Electronic Privacy Information Center (EPIC). Civil libertarians worried that the law had been passed so quickly in the aftermath of the September 11, 2001, terrorist attacks that it lacked many of the safeguards afforded by the U.S. Constitution.

Controversy over the FISC

The FISC is made up of eleven federal district court judges. The court convenes only when needed. Much of its work is conducted in secret because of the sensitive nature of national-security matters and because it relies on classified information. According to public records, the FISC has approved thousands of applications from the government to conduct electronic surveillance as part of national-security investigations.

In 2005 the New York Times reported that the administration of President George W. Bush (1946–) had been conducting wiretap operations without FISC approval. The newspaper alleged that in early 2002 Bush issued a secret executive order authorizing the National Security Agency (NSA) to bypass FISC procedures for conducting domestic surveillance. The NSA oversees signals intelligence operations for the U.S. intelligence community.

The New York Times estimated that the international phone calls and e-mails of “hundreds, and perhaps thousands” of Americans had been monitored to search for links to international terrorism. It also acknowledged that the surveillance program helped uncover several terrorist plots against targets in the United States and Britain. According to the newspaper’s account, the president based his order on his belief that a September 2001 congressional resolution granted him “broad powers” in the war on terror to protect U.S. interests. The program was reportedly suspended in mid-2004 because of a complaint from the federal judge overseeing the FISC. The program was “revamped” and continued to operate.

Repercussions

The ACLU filed a lawsuit against the NSA, claiming that the surveillance program violated the First and Fourth Amendments to the U.S. Constitution and that Bush had exceeded his authority under the separation of powers outlined in the Constitution. The ACLU asked for the program to be dismantled. In August 2006 a federal judge ruled in the group’s favor on grounds that the surveillance program violated the Fourth Amendment. She also found that Bush had exceeded his authority under the Constitution. Two months later a Court of Appeals panel stayed the ruling while the government appealed the decision. Top government officials continued to defend the program as necessary to combat terrorism.

Renewing the Patriot Act

The original Patriot Act called for sixteen of its sections to sunset (automatically expire) after four years unless they were renewed by Congress. In 2005 intense debate began about whether those sections should be renewed. Ultimately the House voted 251 to 174 to renew the provisions with some modifications. In March 2006 the USA PATRIOT Act Improvement and Reauthorization Act of 2005 passed the Senate by a vote of 89 to 10. The modified Patriot Act makes permanent fourteen of the original sixteen sunset provisions and places new four-year sunset periods on the other two provisions, which concern surveillance techniques and the acquisition of business records. The Department of Justice stated that “dozens of additional safeguards to protect Americans’ privacy and civil liberties” were included as part of the reauthorization.

Bibliography

Books

Binder, Sarah A., and Steven S. Smith. Politics or Principle?: Filibustering in the United States Senate. Washington, DC: The Brookings Institution, 1997.

Periodicals

Bartlett, Bruce. “How Supply-Side Economics Trickled Down.” New York Times, April 6, 2007. http://select.nytimes.com/search/restricted/article?res=F60C14FC3C5B0C758CDDAD0894DF404482 (accessed April 20, 2007).

Curry, Timothy, and Lynn Shibut. “The Cost of the Savings and Loan Crisis: Truth and Consequences.” FDIC Banking Review 13, no. 2 (2000). http://www.fdic.gov/bank/analytical/banking/2000dec/brv13n2_2.pdf (accessed May 3, 2007).

Farrell, Maureen. “The Future of Universal Health Care.” Forbes, March 28, 2007. http://www.forbes.com/2007/03/28/unitedhealth-walmart-medicaid-ent-hr-cx_mf_0328outlookuniversal.html (accessed May 24, 2007).

Kosar, Kevin R. “Shutdown of the Federal Government: Causes, Effects, and Process.” CRS Report for Congress (Congressional Research Service), September 20, 2004. http://www.rules.house.gov/archives/98-844.pdf (accessed May 2, 2007).

San Francisco Chronicle. “Special Section: U.S. vs. Iraq; The 1991 Gulf War” (September 24, 2002). http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2002/09/24/MN168392.DTL&hw=The+1991+Gulf+War&sn=002&sc=713 (accessed May 2, 2007).

Schmidt, Susan, and James V. Grimaldi. “Abramoff Pleads Guilty to 3 Counts.” Washington Post, January 4, 2006. http://www.washingtonpost.com/wp-dyn/content/article/2006/01/03/AR2006010300474_pf.html (accessed May 19, 2007).

Stanford Report. “Computerized Voting Lacks Paper Trail, Scholar Warns” (February 4, 2003). http://news-service.stanford.edu/news/2003/february5/dillsr-25.html (accessed April 28, 2007).

Wallis, Claudia. “AIDS: A Growing Threat.” Time, April 18, 2005. http://www.time.com/time/magazine/article/0,9171,1050441,00.html (accessed May 3, 2007).

Web sites

CNN. “A Chronology: Key Moments in the Clinton-Lewinsky Saga.” http://www.cnn.com/ALLPOLITICS/1998/resources/lewinsky/timeline/ (accessed May 4, 2007).

CNN. “September 11: Chronology of Terror.” http://archives.cnn.com/2001/US/09/11/chronology.attack/ (accessed May 19, 2007).

Government Printing Office. “Citizen’s Guide to the Federal Budget: Fiscal Year 1999.” http://www.gpoaccess.gov/usbudget/fy99/guide/guide.html (accessed April 23, 2007).

Library of Congress. “H.R. 1025 (The Brady Handgun Bill).” http://thomas.loc.gov/cgi-bin/bdquery/z?d103:HR01025:@@@D&summ2=m& (accessed May 24, 2007).

Limburg, Val E. “U.S. Broadcasting Policy.” The Museum of Broadcast Communications. http://www.museum.tv/archives/etv/F/htmlF/fairnessdoct/fairnessdoct.htm (accessed April 20, 2007).

McCarthy, Leslie. “2006 Was Earth’s Fifth Warmest Year.” Goddard Institute for Space Studies, National Aeronautics and Space Administration. February 8, 2007. http://www.nasa.gov/centers/goddard/news/topstory/2006/2006_warm_prt.htm (accessed May 17, 2007).

National Commission on Terrorist Attacks Upon the United States. “The 9/11 Commission Report.” July 22, 2004. http://www.9-11commission.gov/report/911Report.pdf (accessed May 24, 2007).

Office of the United States Trade Representative. “NAFTA: A Strong Record of Success.” Trade Facts, 2006. http://www.ustr.gov/assets/Document_Library/Fact_Sheets/2006/asset_upload_file242_9156.pdf (accessed May 21, 2007).

The Oyez Project. Reno v. ACLU, 521 U.S. 844 (1997). http://www.oyez.org/cases/1990-1999/1996/1996_96_511/ (accessed May 21, 2007).

Public Broadcasting Service. “Once upon a Time in Arkansas.” Frontline, October 7, 1997. http://www.pbs.org/wgbh/pages/frontline/shows/arkansas/ (accessed May 4, 2007).

Public Broadcasting Service. “The Presidents: Ronald Reagan.” American Experience. http://www.pbs.org/wgbh/amex/presidents/40_reagan/index.html (accessed May 4, 2007).

Public Broadcasting Service. “Reforming the Reform Party.” Online NewsHour, July 26, 1999. http://www.pbs.org/newshour/bb/election/july-dec99/reform_7-26.html (accessed April 17, 2007).

U.S. Commission on Civil Rights. “Voting Irregularities in Florida during the 2000 Presidential Election.” The 2000 Vote and Election Reform, June 2001. http://www.usccr.gov/pubs/vote2000/report/main.htm (accessed May 1, 2007).

U.S. Supreme Court Center. Texas v. Johnson, 491 U.S. 397 (1989). http://supreme.justia.com/us/491/397/case.html (accessed May 21, 2007).

The White House. “Presidents of the United States.” http://www.whitehouse.gov/history/presidents/chronological.html (accessed April 9, 2007).
