This is a revised text of the sixth John Whitehead Lecture delivered by Strobe Talbott at the Royal Institute of International Affairs, Chatham House on 9 October 2003. As always with writings and commentary of Brookings scholars, the views expressed here are personal and do not reflect institutional positions or policy.
I am honoured to give a lecture established in honour of John Whitehead. He was a predecessor of mine at the State Department and an active trustee and chairman of the board of Brookings. He remains a friend and mentor. Two weeks ago, I visited Chatham House in cyberspace in order to read the inaugural address of its new Chairman, DeAnne Julius. Like her, I feel I should address the war in Iraq, where 138,000 American and 11,000 British troops are stationed and where my president and your prime minister have bet their political futures. Indeed, the stakes are even higher - I daresay much higher - than that. The war and its aftermath will have much to do with determining the direction of American and British foreign policy for decades to come. Before offering my personal concerns and hopes about what may lie ahead, let me begin by offering a few reflections on the past.
At the beginning of the twentieth century, the international system was based largely on two epochal events in European history: the Peace of Westphalia in 1648 and the Congress of Vienna of 1814-15. My country missed out on both those grand and consequential assemblies - Westphalia for the simple reason that the US did not exist, and the Congress of Vienna because the US, then not even 40 years old, was not invited. President Madison did not even have a representative at the Habsburg court to sit in as an observer. Besides, we Americans had our hands full negotiating an end to the War of 1812 with George III’s envoys at Ghent - and, I might add, cleaning up the mess the Redcoats made of my hometown when they sacked it and set fire to the White House.
Westphalia and the Concert of Europe put in place an international status quo that prevailed until the US became strong enough to shake it up. Westphalia established the nation-state as the polity of choice for the next three and a half centuries. A nation-state is a territory controlled by a single government and inhabited by a distinct population with a common culture that commands the loyalty and shapes the identity of its citizens: France for the French, Sweden for the Swedes, England for the English.
By that definition, America is not a nation-state in the Westphalian sense - it never has been, and never will be. It was conceived by its founders as a new kind of nation and, indeed, a new kind of state - one based not on the combined accidents of demography and geography, but on the combined exertion of political will and championship of political ideas. In 1776, Thomas Jefferson and his colleagues summarized the main ideas in a document that would have made them liable to be hanged, had the opportunity presented itself to the authorities of the British Crown.
Now, I hasten to add that these ideas, radical as they were, owed a lot to Europe - and to the Enlightenment. Moreover, they were developed and promulgated by transplanted Europeans - not just ones with English names (like Jefferson), but also with names like von Steuben, Kosciuszko, Lafayette, and Rochambeau. I pick those four examples because their statues have pride of place in the square just across from the White House. They stand as a constant reminder of America’s debt to what might be called the original ‘old’ Europe. But as applied to the American experiment in statehood, these were universal ideas - that is, they were believed to be applicable to all humanity, the basis for what might be called (in a phrase used by a former president named Bush) a New World Order.
I stress this bit of American history because it helps to account for a strain in US foreign policy of exemplary exceptionalism - that is, the notion that the US is exceptional in ways that should serve as an example for others. The image of Uncle Sam as a wise, stern authority figure who believes he knows what’s best for the whole family of humankind may be particularly evident today, but it is by no means new.
I now turn to that meeting in Vienna in 1814 at which the Viscount Castlereagh and the Duke of Wellington helped redraw the map of the continent. The Congress of Vienna and the treaty that emerged from it sanctified balance of power as the dynamic of choice for international relations. That was, and remains, a very European idea.
But it is emphatically not an American idea. Balance of power was the nineteenth-century version of what today is commonly - and, on this side of the Atlantic, approvingly - known as ‘multipolarity’.
A recurring and animating premise of US foreign policy has always been the righteous imbalance of power; that is, an imbalance in favour of the US, its friends, its allies, its protégés and, crucially, its fellow democracies. In that sense, the intellectual - I would even say ideological - justification for America as a superpower predated both the phenomenon and the terminology.
A century ago Theodore Roosevelt and Woodrow Wilson took the US onto the world stage. Those two presidents were different in many ways, not just in their party affiliation. They also detested each other. But they both saw themselves, when they acted overseas, as motivated by something nobler than the cold-blooded calculus of raison d’état or realpolitik - those specialities of European statecraft that Americans have never deigned even to translate from French and German. Roosevelt and Wilson both believed that American foreign policy must combine power and principle, realism and idealism, national self-interest and an altruistic international mission.
However, these two distinctly American themes of exemplary exceptionalism and the righteous imbalance of power did not automatically translate into what, in the parlance of today’s debate, we call unilateralism. Quite the contrary: yet another theme of twentieth-century US foreign policy held that, precisely because American values were universal, they had a natural constituency in other countries. From that conviction, it followed logically that American goals could be - and whenever possible should be - achieved in concert with others; through international structures, international institutions, international compacts, and international rules that apply to everyone, including the rule maker-in-chief. Collaborative, consensual arrangements were seen as an appropriate and effective means of advancing American interests and values, and of leveraging American power.
The US’s two most ambitious diplomatic undertakings of the twentieth century were structural; they were construction projects - and they were joint ventures, involving many partners, with the US in the self-assigned (and generally welcome) role of master architect and general contractor. The first of these projects, after the First World War, was the League of Nations. It was Woodrow Wilson’s dream and his débâcle. The League failed largely because Wilson failed to build support at home for what he was trying to build abroad. The spasm of isolationism and protectionism that ensued contributed to the rise of European fascism and the outbreak of the Second World War.
When that war came to an end, there was another opportunity for institution building on a global scale. Once again - as at Westphalia, Vienna and Versailles - the victors gathered not just to divide the spoils of war but to build the structures of peace. This time, they got it right, largely because the US stayed involved in the design and the management of the institutions that emerged. Bretton Woods led to the establishment of the World Bank and IMF, Dumbarton Oaks and San Francisco to the creation of the United Nations, and the Washington Treaty to the founding of NATO. Under the protection of that US-conceived, US-led alliance, another great project - European integration - came into being.
Much has changed, of course, in the nearly 60 years since those constructs were put in place. A cold war has come and gone and is now itself a decade and a half in the past. A Russian diplomat sits as an equal with American, British and other western colleagues around a large table in Brussels at something Winston Churchill and Harry Truman would have had difficulty imagining - a NATO-Russia Council. In a development that would surely have astonished Jean Monnet (not to mention Joseph Stalin), the European Union will next year admit four former Warsaw Pact allies and three former Soviet republics. During the 1990s, a number of new structures and arrangements sprang up. Some, like the G8, are global. Others, like the ASEAN Regional Forum and APEC, are regional. Most have depended on the active involvement, if not the instigation, of the US.
Along with the expansion and adaptation of international institutions in the 1990s came the establishment of a new principle. It was agreed that there are limits on the sovereignty of the nation-state: national governments are subject to international sanction if they violate certain basic norms within their own borders. Our own governments along with others enforced that principle by stepping in and reversing a military coup and restoring a democratically elected president in Haiti in 1994; by ending genocide in Bosnia in 1995 and in Kosovo in 1999; and by overseeing a peaceful transition from annexation to independence in East Timor. Taken together, those exertions of collective will on behalf of shared values and interests constituted a landmark accomplishment of the 1990s. The international community lived up to its name. It did so by relying on international institutions and agreements.
With all that as background, let me turn to the foreign policy of the current President Bush. In one respect, he is very much in an American tradition going back one hundred years. ‘Moral clarity’ is a phrase right out of the vocabulary of both Teddy Roosevelt and Woodrow Wilson. If you want to see a synthesis of the ‘Rough Rider’ spirit and Wilsonianism, read President Bush’s National Security Strategy released a year ago, with its vow to make the world, starting with the Greater Middle East, safe for democracy - and to do so with a very big stick. But in another respect Mr Bush, as the first president to take office in the twenty-first century, has broken with his ten predecessors, Republican and Democrat, from the end of the Second World War. By and large, those earlier occupants of the Oval Office - from Truman to Eisenhower to Nixon to George Herbert Walker Bush to Clinton - believed in a foreign policy that combined American leadership with institutionalized, codified cooperation with other countries.
Mr Bush came into the presidency with reservations on this score and with an inclination to experiment with a new concept: that the sheer preeminence of American power could, in itself, be the ordering and taming principle of a disorderly and dangerous world - and that the confident assertion of that power made it less necessary for the US to rely on structural arrangements, especially ones that limited America’s freedom of action. That was the subtext of the administration’s rejection of the Kyoto Protocol on climate change, the International Criminal Court, the Treaty on Anti-Ballistic Missile Systems, the land mine ban, and an array of conventions designed to protect the rights of children, stop torture, curb discrimination by race and gender, end the production of biological weapons, prevent money laundering, and limit trafficking in small arms.
Earlier American administrations objected to some features of those accords, but in most cases they sought to improve them rather than discard them. By contrast, intellectually formidable and politically powerful figures in the Bush administration seemed to be calling into question the very idea of binding agreements - and the very idea of international structures.
That included, by the way, the structure that is taking form on this side of the Atlantic: the European Union. For the first time in 50 years, starting in January 2001, there was, in official Washington, a qualitatively new scepticism about European integration. It wasn’t just scepticism about whether integration would succeed, but scepticism about whether Americans should want it to succeed - about whether progress toward a United Europe is in the interests of a United States.
In a development that was both telling and peculiar, the word ‘imperial’ came into fashion among some surrogates, and even some spokesmen, for the Bush administration. Virtually all previous watersheds in the evolution of the international system had entailed the repudiation of specific empires. Cumulatively, they amounted to a repudiation of imperialism in general. Westphalia marked the end of the Holy Roman Empire; the Congress of Vienna carved up the Napoleonic one; Versailles did the same to the Habsburgs’ and Ottomans’; the Romanovs’ was by then already on the ash heap of history. The allies in the Second World War defeated the Third Reich and the Empire of the Rising Sun, and the sun began to set on the British Empire not long after. Throughout the twentieth century the US was an opponent of empire and a champion of decolonization. The fading of the Cold War was made possible by the collapse of what was often called the world’s last empire, with Moscow as its metropole. Yet a decade later, in 2001, theoreticians for a new administration in Washington toyed with the idea of imperialism as a model for American foreign and defense policy.
Then came September 11. Around the world, the effect was to galvanize sympathy and support for the US-including support for the remarkably swift and totally justified American military action against the Taleban and Al-Qaeda in Afghanistan. As evidence of that solidarity, NATO, for the first time in its history, invoked Article V of its charter, proclaiming that the assaults against the World Trade Center and the Pentagon constituted an attack on all member states. Yet in the way it waged the war in Afghanistan, the administration sidelined the alliance. Only when the mission of regime-change was accomplished and the job became one of nation-building did the US welcome international participation.
September 11 had another effect inside the US that must be understood outside the US. It made Americans more supportive of the new, unilateralist premise of their government’s foreign policy. In the course of that single brilliant, blue-sky morning two years and one month ago, Americans suddenly saw the world as a more perilous place, inhabited by bad people who wanted to kill us indiscriminately and in large numbers on our own territory. That new fear made the body politic more receptive to the administration’s doctrine of preemption and prevention.
All that is backdrop to the war in Iraq. In the year and a half after 9/11, the Bush administration set about persuading the American people that Operation Iraqi Freedom would be the next necessary battle in the ongoing war against terrorism. To make that case, the administration - in a phrase that became common in Washington post-9/11 - ‘connected the dots’ between Saddam on the one hand and, on the other, the ultimate NGO, Al-Qaeda, and the ultimate instrument of terror, nuclear weaponry. Of course, as we now know, the administration over-connected the dots: it exaggerated Saddam’s ties to Al-Qaeda and the extent of his nuclear programmes. Nevertheless, the argument worked domestically.
It did not, however, work internationally. Hence the collapse, in February and March 2003, of the administration’s effort to get the UN to accept the American timetable, the American rationale for the war and the American willingness to fight the war without the backing of the United Nations. Some in the administration, particularly (though not exclusively) among high-level Pentagon civilians, were relieved when the Security Council went into deadlock. They had regarded the president’s decision to go to the UN in the first place, a year ago, as a mistake. He had, in their view, fallen into what they called ‘the UN trap’, from which the obstreperous French provided us with a welcome escape. For those with that view, the war was not just a successful military operation that liberated Iraq - it was a political breakthrough that liberated American foreign policy from the encumbrance of multilateralism. Much of the world, of course, was anxious and even appalled. There was a lot of worry that Iraq, as a sequel to Afghanistan, had created a precedent for further sequels elsewhere. As the US Third Infantry Division rolled past Basra on its way to capture Baghdad in late March, many watching the spectacle in real time on television feared that those armoured columns would, in effect, just keep rolling - all the way to Tehran and Pyongyang, taking care of the entire axis of evil in one giant Operation Global Freedom.
On Wednesday 29 October, my colleague from Brookings, Ivo Daalder, will come to this podium to talk about a new book he has written with Jim Lindsay, America Unbound: The Bush Revolution in Foreign Policy. I agree with Ivo and Jim that the administration’s approach to the world has been sufficiently radical to qualify as revolutionary. But in my view (which, like much of what I am saying, is vigorously debated within Brookings, not to mention elsewhere), it is at least possible that we are now seeing the Thermidor of the Bush revolution - that is, a period comparable to the one in the French Revolution when radicalism began to ebb and more moderate political figures came to dominate the First Republic. That may be happening within the Bush administration itself. The radical preferences favoured by some in its ranks have, in the last several months, collided with reality. Actually, they have collided with several realities: on the ground in Iraq, in the corridors and hearing rooms of Congress, in the public opinion polls and in the balance sheets of the federal budget.