
Should advertising be abolished?

Context

We live in a media-saturated and ad-saturated world where advertisements engulf our every activity, catapulting themselves onto buildings and public transport, into our Google search attempts, and onto our YouTube videos as overlays.

- Sao Paulo mayor Gilberto Kassab passed a Clean City law in 2005 that banned all forms of outdoor advertising.
- Excessive, intrusive use of advertising within the media: the prevalence of product placement in television shows and movies as well as on radio (eg. Class 95 DJs Glenn Ong and the Flying Dutchman frequently promote products on-air).

Of course, most developed nations already have advertising laws in place that impose restrictions which, if contravened, will lead to fines and bans. The Ribena ad controversy in 2007, where two schoolgirls discovered that there was little to none of the Vitamin C promised in GlaxoSmithKline's Ribena ads, is perhaps the most famous example. The same applies to the Coca-Cola brand VitaminWater, which (truth be told) has more sugar and calories than actual vitamin content.

Question Analysis

Advertising can and should be interpreted as widely as possible, to include internet / social media advertising, outdoor advertising, television commercials, public announcement / awareness campaigns, coupons and promotions, and product placement in films and TV.

"Abolished" is certainly an extreme term that can and should be challenged. We might consider "reduced" as an alternative. The key idea is, as in the 2011 JC2 Mid-Year Exam question, the need for or role of advertising in the world today.

Possible Arguments For

Critics of advertising like the mayor of Sao Paulo believe that it should be abolished on the grounds that advertising is, in essence, visual pollution. Advertising has always been regarded as a nuisance, an unnecessary distraction in our world, from its infancy in the late 19th century to today's advertisement-saturated media landscape. The beauty of New York's Times Square is nothing to behold. Product placement in movies, video games and mobile phone apps takes away from the user's experience. The list of complaints goes on and on, as does the list of examples.

Some may feel that advertising should be abolished because it serves no vital purpose that cannot be fulfilled by other methods in today's media-rich world. Online reviews, social media, enthusiast websites / forums and other new media platforms are adequate for the average consumer; they arguably provide more comprehensive, accurate and authoritative information than advertising can.

To add to this chorus of disapproval, a professor from Harvard even proclaimed that advertising tempts people to squander money on unneeded possessions. Advertising misleads consumers into buying products through exaggeration and inaccurate claims.

Possible Arguments Against

However, such claims are themselves misplaced and suggest some level of ignorance. As David Ogilvy pointed out in his 1963 book Confessions of an Advertising Man, the consumer is better protected than he thinks he is. The Federal Trade Commission in the USA regulates the advertising industry and can prosecute firms that deliberately seek to mislead the consumer. Ogilvy even notes that lawyers from a barbecue sauce company in the 1960s demanded that his copywriters prove to them the sauce truly had a smoky flavour before allowing their advertising copy to pass. There is certainly no need to abolish advertising when such restrictions are already in place.

To believe that advertising should be abolished is also to miss the point of advertising. Its role in society is not just to persuade (or mislead, as critics may decry) but to inform us of new products or new brands on the market. It gives us some level of awareness and may influence us subtly as we choose one brand over another at the supermarket. We should not, however, allow our own intelligence to be insulted; even if advertisements are inaccurate, do not for a second think that the consumer cannot see through them. Again, as Ogilvy quips, advertisers must treat the consumer like their wife (with attention and respect), not some dumb person off the street they can swindle.

Indeed, the general raising of standards of modern civilization among all groups of people in the past half century would be impossible without advertising. Advertising spreads the knowledge of higher standards, making each of us first see and then demand similar or better quality in everything we interact with. (Franklin Roosevelt)

Advertising nourishes the consuming power of men. It sets up before a man the goal of a better home, better clothing, better food for himself and his family. (Winston Churchill) Advertising spurs the individual to work harder and produce more in order to attain his desired lifestyle. We may deride advertising because of the consumption culture it seems to perpetuate, but it is probably unfair to point fingers at advertising when it really is consumerism that is the larger evil.

In addition to the above social functions, advertising is also a mark of culture: it reflects our desires, our tastes and preferences. At some level, advertising can be construed as art, with its myriad colours, illustrations, photos and clever slogans representing our history and our values through the ages.

The chief argument against the abolishment of advertising is perhaps the simplest. Any attempt to ban advertising in a world oversaturated with it is neither feasible nor productive. What is advertising today? Yes, we can ban one-way billboard advertisements as Sao Paulo has ably demonstrated, but advertising is no longer defined in straightforward terms. Advertising can come in the guise of a song, a Facebook fan page, an online Amazon review, a widget, a game, a parody, a music video. Advertising is everywhere. We can seek to ban one mode of advertising, but the consumerist, free-market capitalist world we live in renders any attempt to ban or restrict advertising pointless. Advertisers and corporations will simply find another means of advertising. The question is not, should advertising be abolished? The question is, can it be abolished or restricted?

To what extent do advertisements reflect what society desires?


Context

- Media-saturated society with the increasing pervasiveness of advertising.
- Commercialisation of culture and a high degree of consumerism.
- Ads today are less focused on product details and focus more on highlighting the product within relevant social contexts.

Question Analysis

- ADVERTISEMENTS: their purpose is to promote awareness / demand or increase corporate profile.
- REFLECT DESIRES: To merely mirror societal wants. Assumption: its role is not (or less) in creating or shaping desire but in showcasing what society views as important.
- TO WHAT EXTENT: Assert how far one agrees with the statement and the reasons why. Explain the limits of the stand.

Possible arguments (to a large extent)

- Primarily, ads have to reflect societal wants in order to CREATE A REINFORCING EFFECT on viewers and sell products.
- Adverts will show how products FIT into society through familiar images.
- (Root cause argument) It is the changes in society, which include changes in tastes and desires / demands, that direct the creation of new products. Advertising carried out to promote products is thus a reflection of emerging desires / needs.

Advertising is an indicator of taste, as seen in the constantly evolving messages and images being created for the same product. Changes in beliefs / norms in society translate into changes in the way they are targeted.

Possible arguments (limited extent)

- Ads, however, show a BLINKERED / BIASED VIEW OF SOCIETY in order to maximise selling points.
- Yet this does not mean that it is not a relevant reflection of societal needs, merely one that is amplified or exaggerated.
- For NEW / CUTTING-EDGE INNOVATIONS and emerging corporations, ads primarily create or amplify new desires because society has not explicitly expressed them.

- Yet, without any hint of desire / demand from society, there would be no such creation of products and hence no need to spend on advertising.
- A selection of ads deliberately shape societal desires even when there is hardly any connection with everyday life.
- Eg: Presenting the lifestyles of the urban middle class to rural communities is common in advertising in India.

Yet it is arguable that there is already an awareness or a latent desire for such lifestyles and their accompanying products, even if there is no evidence of them being lived out yet.

Modern technology has not made us wiser. What is your view?


Context / Issues

- The promises of technology do not seem to be fulfilled as technology increasingly dominates modern life and influences developing nations.
- Everything that we do is inherently technological, from academic instruction (Voila! This very blog is evidence of that!) to cooking to global financial markets.

The reliance argument recalls the Arthur C Clarke (2001: A Space Odyssey) thesis: as our tools become more complex, Man gradually relinquishes control and allows himself to be subject to these tools.

On a separate note, the large sums poured into research and development of technology, whether into consumer products or into solving larger scientific quandaries (the cure for cancer, the quest to uncover the secrets of the universe, quantum mechanics etc), may or may not have enlightened us, depending on how you see it. We do not see the latter having an immediate effect on our daily lives, so we can consider the difference between the layman and Mankind as a whole.

Question Analysis

We must first be concerned with modern technology, which has to be interpreted as the development and application of technology, from simple electrical appliances (at the individual level) to, say, nuclear technology (at the societal / international level).

"Not" is an absolute term that should be challenged. We are probably not looking at the other extreme of "technology has made us wiser", because we must recognise how technology has failed to make us wiser.

"Wiser" is really the key term that will determine the strength of our essay. Wiser can be interpreted as (i) having access to and acquiring more knowledge; (ii) being more discerning in our daily lives; (iii) being equipped with more ways of thinking and more skills (eg. smartphones make us instant route planners and financial planners); (iv) having upped our intellectual capabilities (not sure how).

Some unwise interpretations to avoid: wiser = not childish; wiser = have more choice when it comes to buying TVs.

Possible arguments

Critics assert that modern technological devices, from the simple calculator to the now-ubiquitous smartphone, have led to the deterioration of our thinking capabilities because we allow these devices to do the thinking and remembering for us.

We are fallible human beings and so, when equipped with the latest tools, may not use them as wisely or maturely as we should. Technology has not made us more discerning, simply because that is not its role. We have to decide for ourselves how to use it. - While the Internet and Web 2.0 services have given us limitless opportunities for self-expression, we can just as easily abuse them, and indeed have.

The opportunities afforded to us by modern technology may even have the opposite effect. The unwise may use technology to push an agenda that has a detrimental effect on mankind; we can point the finger at modern technology for encouraging BOTH destructiveness and constructiveness. - Rogue states like North Korea and Iran continue their uranium enrichment programmes and develop nuclear arsenals. Unwise men make unwise use of technology.

Yet, we cannot deny the avenues technology has opened for us in terms of knowledge. Never before have we reached such levels of transparency and information density as with the World Wide Web and social media platforms. - MIT online lectures have arguably democratised the education landscape: anyone and everyone with an Internet connection (and a smartphone, laptop or desktop computer) can become wiser if they so choose. - Wikileaks and Wikipedia, the former one-way and the latter a community website, are key platforms in their respective domains.

The impact of large-scale inventions and technological processes may not be immediately evident now, but we can argue that we are in the process of becoming wiser. - The Large Hadron Collider may be dismissed by the common man, but it is emblematic / symbolic of our quest to become wiser, to understand ourselves and our universe better. - The same can be said of innumerable biomedical projects, themselves enabled and driven by technology, that seek to give us more control of our bodies and make us wiser denizens.

Unprepared for the modern world. Is this an accurate assessment of the youth in your society?
Question Analysis and Context

"Unprepared for the modern world" requires a consideration of the traits of the current world that are relevant to the question, such as: the competitive global environment, which involves the threat of foreign talent and more companies going global and requiring workers to leave their comfort zone to work overseas; a globalized world where countries (and Singapore) are more vulnerable to international events and trends; the requirements of knowledge-based economies, etc.

Outline

1. Born with a silver spoon in their mouths, many Singaporean youths have been overly sheltered and are so dependent on others that they may be ill-prepared to compete with their foreign counterparts, who possess a greater drive to achieve success.

- Many youths grew up with domestic workers who took care of their every need.
- Many working youths in Singapore live with their parents and remain financially dependent on them even after they start working. As more Singaporean youths tend to delay marriage in favour of a career, they live with their parents for a longer period. In contrast, youths in other countries tend to leave the nest at the age of 18.

2. Aptly termed the strawberry generation, Singaporean youths are soft and lack the resilience to withstand disappointments and challenges in life.

- Singaporean youths of today have generally been well provided for and have not really experienced much hardship.
- Rising rates of teen suicide suggest that youths are having difficulty coping with increasing modern stresses.

3. Due to the fear of failure, Singaporean youths of today are often too afraid to take risks, and this makes them unprepared for an economy that competes on entrepreneurship and ideas.

- This is often attributed to an educational system that prioritizes grades and rote learning rather than initiative and inventiveness.
- The pragmatic culture is also seen as another reason for this outcome. Given that the primary objective of most youths is to attain a high-paying job in the future, they are likely to follow the conventional routes to success rather than take the road less travelled.

4. However, there is a growing emergence of youths who are investing their time and money in various enterprises and have started their own businesses.

- Teenage entrepreneurs turn to the internet, creating blog shops and eBay stores, or set up stalls at flea markets and bazaars.
- The amended Civil Law and Bankruptcy Bills allow anyone aged 18 or older to start and run a business.
- More schools are starting to train their students to think entrepreneurially: even primary school students are involved in mini-business startups, and tertiary institutions have sent their students to be immersed in overseas entrepreneurial hubs.

5. In addition, in recognition of the flaws of the current educational system, there is a growing attempt by MOE and schools to nurture students to develop 21st century competencies.

- Attachment programmes (to various sectors of industry) are often included in tertiary institutions to provide early exposure to working life.

- To cultivate students who are prepared for the globalized world, more schools are involving their students in exchange programmes to allow them to experience other cultures. There is also greater participation in international competitions so that students are better aware of international benchmarks and are exposed to, and hence better prepared to handle, the rigours of competing with foreign talent.

6. It is also arguable that Singaporean youths are academically equipped and possess the relevant IT skills necessary for the world ahead.

- Singaporean students are still among the best worldwide in Science and Mathematics.
- Most Singaporean students are bilingual.
- Singapore is the 2nd most technologically savvy country, and youths are well equipped with such technological skills.

How far can our leaders be trusted to do what is right?


Possible arguments

- Arguably, we have to trust the leaders we elect into office to some degree for them to execute their duties and fulfil their promises.
- We tend to trust politicians with a good track record and who are seen to be of good moral character more; doing what is right is highly subjective but is largely interpreted as goodwill and a genuine desire to serve society.

Yet, it is essential that the element of distrust remains. This puts pressure on elected officials to maintain clean records (free from corrupt acts) and, hopefully, to deliver on what they said they would in the first place.

Even so, the best-intentioned leader can be trusted but may not necessarily be able to do what is right. He or she may be found to lack the experience and capability to satisfy the public's demands. Also, extenuating circumstances such as political opposition or a financial crisis may prevent him or her from doing what we perceive as right.

We must also remember that this notion of political trust is in fact a privilege in the developed and developing world. The tyrants and despots of authoritarian regimes, or even communist regimes, are not elected through a fair process in the first place, and the issue of trust is quite irrelevant.

It is not so much whether we can trust them or not (since we, depending on your definition of we, elect them) but that we are in some way bound to their decisions. We HAVE TO trust them to some degree; such is the premise of leadership. Otherwise, electing them is pointless and we will never be able to judge them fairly. We can of course look at the resistance, not only from Congress, towards Obamacare and, to a lesser extent, the American Jobs Act. One perhaps does not even have to go that far, and can examine, say, school leadership or the general obstinacy of the Singaporean state. I'm not quite sure if this is the rebuttal you are looking for, but the idea is that perspectives / views are never just yes-no, agree-disagree. Altering the modal verb is sometimes far more pertinent (as it is, I feel, in this case) and urgent.

One of my students suggested that the real issue is whether we DO trust them, given (a) the electoral process or, in many cases of distrust / resistance, the lack thereof, and/or (b) their track record. She was of course thinking of despotic regimes when she came up with this point and did not hesitate to throw up Mubarak and Gaddafi as examples of leaders who cannot be trusted insofar as they are (or rather, was, in the case of the former) no longer trusted by their people. As you note in a separate thread, it is the value judgement should that is worth debating. Can they be trusted? is quite colloquial in expression, in the sense that the accuracy of expression isn't really there. Of course, they can be trusted. Everyone, technically, can be trusted, whether insane, malicious or, in the case of North Korea's supreme leadership, dead.

For what it is worth, I think the real issue here comes with the word should. Coming from a country with a very old parliamentary democracy, I have a particular view of the sentiment implied in the question. To be honest, the concept of modern democracy would actually demand that we should NOT trust our leaders, because this suggests blind faith and a level of ignorance that should not be expressed in a modern, civilised and educated context. This goes right back to the Greeks (no, not the ones who are sinking the Euro!). In ancient Greece, the concept of leadership incorporated numerous differing views as an essential component. The idea was to create checks and balances against any one particular vested interest. So, in a nutshell, the system actually requires us not to trust in the simplistic sense. Instead, we are encouraged to trust the system.

This idea translates fairly well to the context of the essay. We can trust our leaders only in as far as the system incorporates such checks and balances. No controls means no trust. This kind of approach would allow students to differentiate between different forms of government (for example, nepotistic and cronyist regimes would inspire very little trust) and at the same time allow scope depending on context. A very good example is the current debate in the UK about the Defence Secretary Liam Fox. He has been in the post for a little over a year and in that time has allowed his good friend (best man at his wedding) Mr Werrity to benefit financially from their friendship; Mr Werrity runs a firm that offers consultancy on defence issues. They even held talks with Mindef in Singapore last year (Mr Fox in his official capacity and Mr Werrity as unofficial advisor). Thankfully, being a democracy, Britain's press (the fourth estate) has exposed this issue. We can therefore trust in a system that exposes poor judgement in our leaders even if we cannot trust the leaders themselves.

The person who dies rich dies disgraced. Discuss.


Question Analysis

Andrew Carnegie's original ("The man who dies rich dies disgraced") has as its premise the obligation or responsibility of the wealthy to give their riches away before they die. Is it fair to expect them to be philanthropic to such a large extent? Is it really a moral obligation to do so? Essentially, you are called to evaluate the relationship between wealth and charity; given the good that Warren Buffett's and Bill Gates's billions have arguably done, is it legitimate to ask for more? While the prompt quotation is idiomatic, one can also interpret the question, as the Broader Perspectives writers have, to equate being rich with being disgraced. With resentment against investment bankers and the rich rekindled by the 2008 crisis and the double dip in 2011, one could perhaps make the case that one has no right to be rich and that wealth surfaces in tandem with disgrace. As the complexities and conditions of the first interpretation are, I would consider, inviting enough, I have decided to ignore the above reading.

Essay Outline

The following has been adapted from Dennis Lim's essay (CJC 2T17).

Thesis: While it is certainly unpardonable to hoard one's wealth at the expense of others, it is unnecessary to demean the work and status of those who wish to die rich.

TS1: It is perhaps easy to see how the rich should donate their assets to those who need them more. Even if naive in outlook, a rudimentary cost-benefit analysis tells us that apportioning the wealth at the top can help maximise the well-being of the maximum number of people. It is therefore, as liberals might argue, imperative that the rich follow in Andrew Carnegie's footsteps.

TS2: Such an assertion would however prove hasty, if not fallacious. We need to appreciate that the rich have the ultimate right to deal with their wealth in whichever way they choose. Under a free-market capitalist economy, the principle of ownership must hold true.

TS3: To demonise the person who dies rich, as Carnegie polemically does, is unfair and unhelpful. A more reasoned approach is therefore necessary. We expect the rich to give away some of their monies, but a call to give away all their wealth whilst sparing none for their own children and friends simply cannot be justified.

TS4: It is precisely because the wealthy are not obliged to donate their riches that society can come to regard these acts as charitable. In order to encourage the virtue of philanthropy, we cannot censure or shame those who choose not to be charitable, or even as charitable as the Carnegies and Buffetts of the world.

TS5: It is also worth pointing out that the motion is itself highly subjective. Who can determine who is rich and who is not? The lines are hard to draw. The mega-rich whose estates are worth billions upon billions may be charged, but can we say that all in the upper-income bracket are disgraced for not donating the little surplus they receive each year? Then, how much must one give to earn the respect, or not earn the ire, of the people? Must one build libraries, universities and trust funds? Again, it is easy to demand that the rich exercise some degree of philanthropy, but the extent is extremely hard to determine.

TS6: Cultural differences also mean that each society will have its own views on dying rich. Passing on a large inheritance to one's young is arguably the Chinese Dream. In liberal Western European countries, such a thought should not even be entertained, much less forgiven.

Conclusion: Is the person who dies rich disgraced? If we were to ignore the complexities highlighted above, we can at best conclude that it is the person who wants to die rich who dies disgraced. Charity may not be a moral obligation, but it is certainly a social responsibility or virtue.

Rage against the elites, around the world


What's intriguing about the eruption of Occupy Wall Street is that it's so similar to other populist movements that are demanding change in nearly every major region of the world. You can't help but wonder if we aren't seeing, as a delayed reaction to the financial crisis of 2008, a kind of global spring of discontent. Obviously, circumstances differ: The anti-corporate activists gathered in Manhattan's Zuccotti Park have a different agenda than the demonstrators in Cairo's Tahrir Square, or this past summer's rioting street protesters in Britain and Greece, or the anti-corruption marchers in New Delhi. These movements mostly lack leaders or clear ideologies, so they're hard to categorize. But the protesters do share some basics: rejection of traditional political elites; a belief that globalization benefits the rich more than the masses; anger about intertwined business and political corruption; and the connectedness and empowerment fostered by Facebook and other social media.

This neo-populism is all the more striking because it seems to transcend traditional political boundaries. The Tea Party movement may wear conservative colors, but it arose as a protest against elites in Washington and on Wall Street who were seen to be profiting at the expense of everyday people. Occupy Wall Street comes at these same issues from the left, but the two movements have much in common.

The Arab Spring is the world's most potent populist movement, sweeping away governments in Tunisia, Egypt and Libya. These uprisings began as leaderless explosions of indignation, blurring the usual lines of capitalist and socialist, Muslim and Christian. These cleavages have returned, especially in Egypt. But the core of the revolution there remains a rage against traditional elites.

Protests in Europe have the same note of mass indignation. In Greece, Italy and even France, you see the anger of the middle class that their debt-enfeebled governments can't deliver on welfare-state promises. In some countries, such as Britain and Germany, there is unrest, too, among growing immigrant populations that are not tethered to national cultural or political norms.

Even in the boom countries, such as China and India, there is the turmoil that comes with rising expectations. According to China's Ministry of Public Security, the country experienced 87,000 incidents of popular unrest in 2005. That's 238 protests a day! The Chinese stopped publishing the number after that, but it surely hasn't gone down. India, too, has seen a rising tide of protest, symbolized by the mass street marches in the summer that surrounded Anna Hazare's hunger strike to protest corruption.

It's a stretch, perhaps, to look for shared themes in such disparate countries. But these movements seem to have a common indignation toward leaders who are failing to maintain social justice along with global economic change. That's certainly true in America, where the Tea Party and Occupy Wall Street both rage against a financial elite that stumbled into a ruinous recession and then got bailed out by a Washington elite that's in hock to special interests. The Tea Party, especially, tapped the bedrock American mistrust of big banks, which dates to Thomas Jefferson and Andrew Jackson. Growth and prosperity would restore public confidence, as in the past. But this time, the anticipated recovery and deflation of popular anger still seems a few years away.

Europe's neo-populism will surely increase, as countries struggle with painful economic adjustments. Population is declining in most of Europe, which means there will be fewer young workers to pay for the pensions of retirees. To regain competitiveness and solvency, wages and the quality of life will have to decline in many European countries. Meanwhile, according to a recent study by the National Intelligence Council, by 2025 Western Europe's Muslim population could increase to 25 million to 30 million from the current 15 million to 18 million, causing additional strains. There's no sign yet of a new European political leadership that can accomplish the necessary rewrite of the social contract.

Much of the world's neo-populist anger is justified, given the greed and folly of recent years. What worries me is the echo of the 1930s, a similar period of economic change and dislocation. When the traditional business and political leaders seemed to have failed during the downturn of the '30s, populist indignation veered sharply right and left, toward dangerous movements that expressed national indignation at the point of a gun.

America was lucky then to have had, in President Franklin D. Roosevelt, a charismatic politician who could rehabilitate the center. And now? Not so lucky.

Why scientists are smarter than politicians


One of the best things about being an artist is that nobody can tell you you're doing things wrong. There's no true or false in a Picasso painting, no yes or no in a Mahler composition. That, of course, is how it should be. The opposite is true for science, and that's how it should be too. The scientific method is defined by the search for the irreducible truth. The riddle of a disease isn't solved till you've isolated the virus; no particle is fully understood till it's been successfully smashed. It's not for nothing that recent news of a neutrino that may have traveled 0.0025% faster than light is causing such a stir. If that vanishingly tiny anomaly can't be resolved and disproven, a century of physics could collapse.

But the stone walls between art and science aren't nearly as thick as they seem; indeed, in some ways they're entirely permeable. That's a lesson we badly need to learn if we're going to make sound policy decisions in an era in which science and politics seem increasingly at odds. In the Oct. 3 issue of TIME, theoretical physicist Lisa Randall of Harvard University made a plea for greater deference to reason in the still-young but already-ugly 2012 presidential campaign. Randall lamented the fundamental disregard for rational and scientific thinking in a political culture in which Texas governor Rick Perry can dismiss evolution as merely "a theory that's out there," and Minnesota Congresswoman Michele Bachmann can traffic in poppycock about the HPV vaccine causing mental retardation.

Randall's new book, Knocking on Heaven's Door, takes the case one intriguing step further. The book explores some of the biggest ideas in contemporary physics and how they undergird such everyday matters as risk assessment, logic and even our understanding of beauty. But it's in her chapter on creativity, not a quality always associated with the data-crunching business of science, that she makes her most compelling case against the willful know-nothingism that plagues public debate.

It takes a certain kind of hubris to be a pundit or politician and tell scientists, often many, many scientists, that they're wrong about what their studies have shown them. One of the things that makes it easy to make such counterfactual arguments is that there are often studies to back them up. The nonsense about vaccines causing autism began with a now-discredited 1998 paper by British physician Andrew Wakefield that linked the disorder to the measles-mumps-rubella vaccine. A far greater number of studies have shown that climate change is by no means fully understood. Anyone, scientist or not, can read papers on both sides and seem to come to a well-reasoned conclusion either way.

What distinguishes scientists from the rest of us is their ability not just to understand the data but to derive the data, which is a bit like the difference between being able to graph a 95-yd. touchdown run and being able to execute one, cutting across the seam and exploiting the gaps in coverage that the average person would never see. That's what good scientists do every day. The cracks and discrepancies that might seem too small or obscure for some, Randall writes, can be the portal to new concepts and ideas for those who look at the problem the right way.

That's not easy, and not even all scientists do it artfully or well. Randall cites autistics and, not entirely in jest, bureaucrats and academics as good examples of how simply having extraordinary technical skills can be meaningless without the creativity to exploit them. She quotes Pushkin, who once said that "Inspiration is needed in geometry, just as much as in poetry." Similarly, some of the most touching scenes in the movie Rain Man are those in which the autistic lead character recites Abbott and Costello's brilliant "Who's on First?" sketch, hitting all of the words but understanding none of the wit.

For any highly accomplished person, creativity begins with the least creative mindset possible: a near-obsessive ability to think endlessly about a problem, and indeed an inability not to think about it. Even if golf pros perfect their swing over countless repeated attempts, Randall writes, I don't believe everyone can hit a ball a thousand times without becoming exceedingly bored or frustrated. Tiger Woods could do that, and, at least before his current woes on the links, the results showed not just in championship play, but in flat-out inspirational play. Something similar is true of science too. Once skills become second nature, you can call them up much more easily when you need them, Randall writes. Such embedded skills often continue operating in the background even before they push good ideas into your conscious mind. Larry Page once told Randall that the seed idea for Google came to him in a dream, but that was only after he had been absorbed by the problem for months. We never questioned Woods's swing, and we certainly don't question the brilliance of what Page helped invent. But we feel free to sneer at what scientists tell us when it serves our political ends.

None of this means we should defer to scientists simply because they have the degrees to back up their claims. That kind of blind belief in the well-lettered has led to everything from the disgrace that was the eugenics movement to the nincompoopery of the vaccine scare. What's more, Randall herself is a scientist and not above a little inside-the-clubhouse bias. Still, history has tended to prove the points she makes.

Several years ago, when I was writing a book about the polio vaccine, I had the opportunity to spend months wading through the personal papers of Jonas Salk. It was only when I had gone through the first few thousand letters, memos, notebooks and even scrawled phone messages that it occurred to me that I hadn't stumbled on a single doodle, not one. It became something of a game to look for one and finally, deep in a notebook in which Salk was recording data from a mouse study, there it was: a tiny triangular design made of perhaps six or seven pen strokes. That was it, the entire body of Jonas Salk's artwork. And yet the inspiration to create a vaccine that hundreds of other scientists had sought, and the millions of lives that were saved as a result of it, is surely artistry of a far higher kind.

Scientists aren't always right, but when they talk, they deserve at least the initial presumption of wisdom. All of us, especially the people who seek to lead us, could well learn something from listening to what they have to say.

Leaders, we want the whole truth and nothing but the truth
Kishore Mahbubani, a retired Singaporean diplomat, published a provocative essay in The Financial Times on Monday that began like this: Dictators are falling. Democracies are failing. A curious coincidence? Or is it, perhaps, a sign that something fundamental has changed in the grain of human history? I believe so. How do dictators survive? They tell lies. Muammar Gaddafi was one of the biggest liars of all time. He claimed that his people loved him. He also controlled the flow of information to his people to prevent any alternative narrative taking hold. Then the simple cellphone enabled people to connect. The truth spread widely to drown out all the lies that the colonel broadcast over the airwaves. So why are democracies failing at the same time? The simple answer: democracies have also been telling lies.

Mahbubani noted that the eurozone project was created on a big lie: that countries could have monetary union and fiscal independence without pain. Meanwhile, in America, added Mahbubani, now the dean of the Lee Kuan Yew School of Public Policy at the National University of Singapore, no U.S. leaders dare to tell the truth to the people. All their pronouncements rest on a mythical assumption that recovery is around the corner. Implicitly, they say this is a normal recession. But this is no normal recession. There will be no painless solution. Sacrifice will be needed, and the American people know this. But no American politician dares utter the word sacrifice. Painful truths cannot be told.

Of course, there is a big difference between America and Libya. We can vote out our liars, unlike certain Arab and Asian countries. Still, Mahbubani's comparison warrants some reflection this week, which coincides with the 10th anniversary of 9/11 and the president's jobs speech. It is a great week for truth-telling. Can you remember the last time you felt a national leader looked us in the eye and told us there is no easy solution to our major problems, that we've gotten into this mess by being self-indulgent or ideologically fixated over two decades and that now we need to spend the next five years rolling up our sleeves, possibly accepting a lower living standard and making up for our excesses?

For me, this is the most important thing to say both on the anniversary of 9/11 and on the eve of President Obama's jobs speech. After all, they are intertwined. Why has this been a lost decade? An answer can be found in one simple comparison: how Dwight Eisenhower and his successors used the cold war, and how George W. Bush used 9/11. America had to face down the Russians in the cold war. America had to respond to 9/11 and the threat of Al Qaeda. But the critical difference between the two was this: Beginning with Eisenhower and continuing to some degree with every cold war president, we used the cold war and the Russian threat as a reason and motivator to do big, hard things together at home, to do nation-building in America. We used it to build the interstate highway system, put a man on the moon, push out the boundaries of science, teach new languages, maintain fiscal discipline and, when needed, raise taxes. We won the cold war with collective action.

George W. Bush did the opposite. He used 9/11 as an excuse to lower taxes, to start two wars that for the first time in our history were not paid for by tax increases, and to create a costly new entitlement in Medicare prescription drugs. Imagine where we'd be today if, on the morning of 9/12, Bush had announced (as some of us advocated) a Patriot Tax of $1 per gallon of gas to pay for education, infrastructure and government research, to help finance our wars and to slash our dependence on Middle East oil. Gasoline in the U.S. on Sept. 11, 2001, averaged $1.66 a gallon. But rather than use 9/11 to summon us to nation-building at home, Bush used it as an excuse to party, to double down on a radical tax-cutting agenda for the rich that not only did not spur rising living standards for most Americans but has now left us with a huge ball and chain around our ankle. And later, rather than asking each of us to contribute something to the war, he outsourced it to one-half of one percent of the American people. Everyone else, y'all have fun.

We used the cold war to reach the moon and spawn new industries. We used 9/11 to create better body scanners and more T.S.A. agents. It will be remembered as one of the greatest lost opportunities of any presidency ever.

My fervent hope is that on Thursday Mr. Obama will set an example and tell the cold, hard truth to parents and kids. I know. Honesty, we are told, is suicidal in politics. But as long as every solution that is hard is off the table, our slow national decline will remain on the table. The public is ready for more than Michele Bachmann's fairy-dust promise that she can restore $2-a-gallon gasoline. For once, Mr. President, let's start a debate with the truth. Tell us what you really think will be required to get us out of this stagnation, what kind of collective action and shared sacrifice will be needed and why that can lead not just to muddling through, not just to being O.K., but to restoring American greatness.

Is non-violent resistance the most effective way to win?


Nonviolent Resistance Is Admirable but Ineffective. Hardly. In the current geopolitical moment, it may seem hard to argue that a nonviolent uprising is a better tool for uprooting a dictator than the violent kind. Armed rebels, backed by NATO air power, are on the verge of ending four decades of despotic rule by Muammar al-Qaddafi in Libya. Meanwhile to

the east, Syrias Bashar al-Assad has killed with impunity more than 2,200 members of a mostly nonviolent resistance to his familys long-lived rule. Arguing in favor of the Syrians tactics, and against the Libyans, would seem counterintuitive but for the evidence. The truth is that, from 1900 to 2006, major nonviolent resistance campaigns seeking to overthrow dictatorships, throw out foreign occupations, or achieve self-determination were more than twice as successful as violent insurgencies seeking the same goals. The recent past alone suggests as much; even before the Arab Spring, nonviolent campaigns in Serbia (2000), Madagascar (2002), Ukraine (2004), Lebanon (2005), and Nepal (2006) succeeded in ousting regimes from power. The reason for this is that nonviolent campaigns typically appeal to a much broader and diverse constituency than violent insurgencies. For one thing, the bar to action is lower: Potential recruits to the resistance need to overcome fear, but not their moral qualms about using violence against others. Civil resistance offers a variety of lower-risk tactics stay-aways (where people vacate typically populated areas), boycotts, and go-slows (where people move at half-pace at work and in the streets) that encourage people to participate without making enormous personal sacrifices. This years peaceful uprising in Egypt saw the mobilization of men, women, children, the elderly, students, laborers, Islamists, Christians, rich, and poor a level of participation that none of Egypts armed militant organizations in recent memory could claim. Nonviolent Resistance and Pacifism Are the Same Thing. Not at all. When people hear the word nonviolent, they often think of peaceful or passive resistance. For some, the word brings to mind pacifist groups or individuals, like Buddhist monks in Burma, who may prefer death to using violence to defend themselves against injustice. As such, they conflate nonviolent or civil resistance with the doctrine of nonviolence or pacifism, which is a philosophical position that rejects the use of

violence on moral grounds. But in civil resistance campaigns like those occurring in the Arab Spring, very few participants are pacifists. Rather, they are ordinary civilians confronting intolerable circumstances by refusing to obey a method available to anyone, pacifist or not. Even Mahatma Gandhi, the iconic pacifist, was a highly strategic thinker, recognizing that nonviolence would work not because it seized the moral high ground, but because massive noncooperation would ultimately make the British quit India: We should meet abuse by forbearance, he said. Human nature is so constituted that if we take absolutely no notice of anger or abuse, the person indulging in it will soon weary of it and stop. Nonviolent Resistance Works Better in Some Cultures Than Others. Wrong. Nonviolent movements have emerged and succeeded all over the world. In fact, the Middle East routinely written off by people elsewhere as a hopeless cauldron of violence can boast some of the biggest successes, even before the Arab Spring. The Iranian Revolution that took down Shah Mohammed Reza Pahlavis dictatorial regime and brought Ayatollah Ruhollah Khomeini to power was a nonviolent mass movement involving more than 2 million members of Iranian society (though also a useful reminder that nonviolent uprisings, like the violent kind, dont always produce the results one might hope for). Palestinians have made the most progress toward selfdetermination and lasting peace with Israel when they have relied on mass nonviolent civil disobedience, as they did in the demonstrations, strikes, boycotts, and protests that dominated the First Intifada from 1987 to 1992 a campaign that forced Israel to hold talks with Palestinian leaders that led to the Oslo Accords, and convinced much of the world that Palestinians had the right to self-rule. In the Americas, Venezuela, Chile, Argentina, and Brazil have all experienced nonviolent uprisings, ousting military juntas and at times replacing them with democratically elected leaders. South Africas nonviolent anti-apartheid

campaign fundamentally altered the political, social, and economic landscape there, while the African National Congresss forays into revolutionary violence yielded little. Europe, of course, can claim some of the most iconic examples: the 1989 Eastern European revolutions, for instance, and the Danish resistance to the Nazi occupation during World War II. And in Asia, successful nonviolent resistance has succeeded in casting off oppressive regimes in places as diverse as India, the Maldives, Thailand, Nepal, and Pakistan. Nonviolent Movements Succeed by Persuasion. Not always. The moral high ground is necessary, but hardly sufficient. Campaigns need to be extremely disruptive and strategically so to coerce entrenched dictators to abandon their posts. Nonviolent resistance does not necessarily succeed because the movement convinces or converts the opponent. It succeeds when the regimes major sources of power such as civilian bureaucrats, economic elites, and above all the security forces stop obeying regime orders. The literary scholar Robert Inchausti put it well when he said, Nonviolence is a wager not so much on the goodness of humanity, as on its infinite complexity. As in war, the key for a nonviolent campaign is to find and exploit the opponents weaknesses. Take the recent uprising in Egypt. In the first days of the uprising, military and security forces cracked down heavily on protests. But the demonstrators were prepared: Activists influenced by recent nonviolent revolutions elsewhere circulated instructions to protesters detailing how to respond to the crackdown and began placing women, children, and the elderly on the front lines against the security forces. The handouts encouraged protesters to welcome the soldiers into the ranks of the movement and strongly forbade any violence against them. Movement leaders also made sure that repressive acts against peaceful protesters were caught on video and publicized.

Ultimately, the Egyptian Army refused orders to suppress the campaign and Hosni Mubaraks regime lost one of its key centers of power. Here again is an advantage that nonviolent groups have over armed guerrillas: Loyalty shifts among the security forces are difficult for small, clandestine, violent groups to achieve. Violent threats typically unite the security forces, who join together to defend against them (which is precisely why the Syrian regime insists it is fighting armed groups rather than unarmed civilians). Only Weak or Weak-Willed Regimes Fall to Nonviolent Uprisings. Not true. Many nonviolent campaigns have succeeded against some of the bloodiest regimes on Earth, at the height of their power. In fact, a vast majority of the major nonviolent campaigns in the 20th century were facing down regimes such as Gen. Muhammad Zia ul-Haqs in Pakistan, Slobodan Milosevics in Serbia, Augusto Pinochets in Chile, Suhartos in Indonesia, and various imperial rulers who were clearly invested in maintaining power over their colonies. During the famed Rosenstrasse incident in Berlin in 1943, for example, even the Nazis showed their vulnerability to nonviolent protests, when German women organized protests and faced down SS machine guns to demand the release of their Jewish husbands a small victory against one of historys most genocidal regimes, and an unthinkable one had the protesters taken up arms. In fact, almost all major nonviolent campaigns of the 20th and early 21st centuries have faced massive and violent repression. In Pinochets Chile, for instance, the regime often used torture and disappearances to terrorize political opposition. In such circumstances, engaging in visible mass protest would have been highly risky for those opposing the government. So in 1983, civilians began to signal their discontent by coordinating the banging of pots and pans a simple act that demonstrated the widespread support for the civilians demands and showed that Pinochet would not be able to suppress the movement with the tools at his disposal. People also walked through the

streets singing songs about Pinochets impending demise a practice that so irked the general that he banned singing. But such desperate measures demonstrated his weakness, not his strength. Ultimately, Pinochet caved and agreed to hold a 1988 referendum on the question of whether he would serve an additional eight years as president. Opposition leaders took the opportunity to organize nonviolent direct actions that focused on coordinating no votes, obtaining an independently verifiable vote count, and holding Pinochet accountable to the results. When it was clear that Pinochet had lost, the military ultimately sided with the Chilean people, and Pinochet stepped aside. Sometimes Rebels Have No Choice but to Take Up Arms. Not true. The current civil conflict in Libya, its easy to forget now, began with nonviolent protests in Benghazi around Feb. 15. The demonstrations were summarily crushed, and by Feb. 19, oppositionists had responded by taking up arms, killing or capturing hundreds of Qaddafis mercenaries and regime loyalists. In his infamous Feb. 22 speech, Qaddafi said, Peaceful protest is one thing, but armed rebellion is another, and threatened to go house by house in search of the rebel rats. Few civilians would be willing to participate in unarmed resistance after such threats, and what had begun as a peaceful movement unequivocally became an exclusively violent rebellion. It appears now to have been a success, but one that came at an enormous cost: Although an accurate death toll for the conflict is thus far impossible to come by, some counts midway through the war put the casualties as high as 13,000 deaths. Could it have been otherwise? Hindsight is 20/20, of course, but if Libyas activists had a chance to evaluate their experience, they may have recognized a few mistakes. First, the movement appeared to have been fairly spontaneous, unlike the well-planned, highly coordinated campaign in Egypt. Second, the nonviolent movement may have focused too much on a single

tactic protests to pursue its aims. When movements rely exclusively on rallies or protests, they become extremely predictable: sitting ducks for regime repression. Successful movements will combine protests and demonstrations with well-timed strikes, boycotts, go-slows, stay-aways, and other actions that force the regime to disperse its repression in unsustainable ways. For example, during the Iranian Revolution, oil workers went on strike, threatening to cripple the Iranian economy. The shahs security forces went to the oil workers homes and dragged them back to the refineries at which point the workers worked at half-pace before staging another walkout. This level of repression required to force the masses to work against their will is untenable because it requires a massive coordination of regime resources and effort. In fact, what we know from previous cases, such as Iran, is that the kind of violent reprisal Qaddafi used against the nonviolent uprising at the outset is often unsustainable against coordinated nonviolent movements over time. Moreover, the rebels nearly immediate turn to violent resistance evoked the strongest reaction from Qaddafi, and it immediately excluded large numbers of people who might have been willing to regroup and brave the streets against Qaddafi but who had no interest in joining what was sure to become a nasty fight. Before NATO lent its support, the largest gains the Libyan opposition made were during the nonviolent phase of the uprising, which involved massive protests that shut down the country, elicited numerous defections from key regime functionaries, and even led to the taking of Benghazi without significant bloodshed. But once the rebels reacted to Qaddafis repression by taking up arms, they required NATO intervention to stand a chance. Or consider Syria, where the decision to use violence or not is similarly wrenching. In August, following months of peaceful mass protests, Assad ordered a full-scale military bombardment of Hama, a largely Sunni city known for an armed Islamist uprising that was even more brutally crushed in

the 1980s, and other opposition strongholds across the country. Time to grab your gun, right? Even in such cases, nonviolent movements have choices. They could respond to regime violence by switching tactics. In fact, Syrian activists have been doing this well, avoiding regime repression by using flash mobs and nighttime protests, which are more difficult to repress. Daytime protests are now well-planned, with multiple escape routes and mirrors to blind snipers trying to shoot protesters. Syrian activists have also so far largely avoided the temptation to respond to regime provocations with violence a critical decision, not only because taking up arms may undermine their domestic bases of participation and support, but also because it makes security forces more likely to obey orders to repress the movement. Because the regime has expelled journalists and cut off electricity in cities under siege, Syrian activists charge their laptops using car batteries and make fake IDs to get close to security forces so they can document human rights abuses and share them online. The continued mobilization resulting from these acts may help the opposition forge indispensable links with regime elites. Nonviolent resistance is, in effect, a form of asymmetric warfare. Dictators predictably rely on their perceived advantages in brute force to defeat challengers. Its best to fight the enemy where you have an advantage in this case, people power, unpredictability, adaptability, and creativity rather than where he does. Nonviolent Uprisings Lead to Democracy. Not necessarily. There is a strong empirical association between nonviolent campaigns and subsequent democratization, which shouldnt be terribly surprising: Higher levels of political participation and civil society factors that make a nonviolent uprising more likely to take root tend to lead to higher levels of democracy. But there are important exceptions. The Iranian Revolution one of the worlds largest and most participatory nonviolent
uprisings -- eventually ushered in a theocratic and repressive regime. The Philippines has endured several major nonviolent revolutions and continues to struggle with democratic consolidation and corruption. The largely successful Orange Revolution in Ukraine seemingly heralded a new era of political liberalization, but recent setbacks suggest the country is reversing course. But none of these outcomes would likely have improved if the revolutions had been violent. In fact, in most countries where violent revolution has succeeded, the new regimes have been at least as brutal as their predecessors, as anyone who has lived in the aftermath of the Russian Revolution, the French Revolution, the Afghan civil war, or the Cuban Revolution could tell you. As Nobel laureate Aung San Suu Kyi, the leader of the Burmese pro-democracy movement, put it, "It is never easy to convince those who have acquired power forcibly of the wisdom of peaceful change." The bottom line is that while nonviolent resistance doesn't guarantee democracy, it does at least more or less guarantee the lesser of the various potential evils. The nature of the struggle can often give us a good idea of what the country will be like after the new regime takes shape. And few people want to live in a country where power is seized and maintained by force alone.

The world is adrift without a leader


THE demand for global leadership has never been greater. The world is truly lost in trying to find a way out of the current crisis. America is imploding. Europe is crumbling. London is burning. The Arab Spring has lost direction.

China and India remain internally preoccupied. If ever there were a moment for a global leader to step up, this is it. So why is no leader emerging?

First, the world has changed structurally, yet our systems for managing global affairs have not adapted. In the past, when the billions of citizens of Planet Earth lived in separate countries, it was like having an ocean of separate boats. Hence, the postwar order created rules to ensure that the boats did not collide; it created rules for cooperation.

Up until now, this arrangement has worked well. World War III did not follow World Wars I and II. But today the world's seven billion citizens no longer live in separate boats. They live in more than 190 cabins on the same boat. Each cabin has a government to manage its affairs. And the boat as a whole moves along without a captain or a crew. The world is adrift. The G-20 was set up to provide global leadership at the height of the latest financial crisis. The group came together in London in early 2009 to save the global economy. However, as soon as the crisis receded, the G-20 leaders retreated into their cabins again. To make matters worse, some nations have become unmanageable. Just look at the United States. The best candidate for global leader is, of course, President Barack Obama. No leader gets as much global press coverage as Mr Obama does. But he has no time to save the world. This summer, a tiny group of crazy Tea Party congressmen held him, the United States, and the world hostage.

In the next 14 months, Mr Obama will focus only on his re-election. The world will not matter. Sadly, no European leader seems ready to fill this vacuum. Nor is there a Chinese or Indian leader willing to step up. Our global boat will continue to drift in the coming months.

The second reason no global leader has emerged: the geopolitics of the world are running at cross purposes with the geoeconomics of the world. Geoeconomics requires consensus -- countries coming together. In geopolitics, we are experiencing the greatest power shifts we have seen in centuries. Power is shifting from West to East. All this creates deep insecurity in the established powers. They want to cling on to privileges acquired from previous days of glory.

Only this can explain the rush by Europe to reclaim the headship of the International Monetary Fund when Dominique Strauss-Kahn stepped down. No one doubts that Ms Christine Lagarde is a competent administrator. But is it wise for Europe to cling on to old privileges when power is shifting? And is it wise to choose a non-economist to run the most important economics organisation at a time of economic turmoil? A secure Europe would have ceded power graciously. An insecure Europe clings to privileges. Third, political leadership is always preceded by intellectual leadership.

For several decades, the Western intelligentsia provided this intellectual leadership. Indeed, they used to happily lecture the world on what should be done. Today, they are clearly lost.

As an Asian, I used to be regularly lectured by Westerners on the inability of Asians to slay their sacred cows. Today, the Western intelligentsia seems equally afraid to attack its own sacred cows. Surely, after the damage done by the Tea Party episode, an obvious question to ask is: Have democracies become dysfunctional? Have special interest groups distorted the global agenda? Should some of them be disbanded? Sadly, the parameters of intellectual discourse in the West have become narrower and narrower. Short-term political fights take precedence over long-term strategic decisions. Only one phrase captures the current Asian perception of the West: sheer incredulity. How could the best preachers on political courage and economic discipline in the world display none of it when the hour came? In short, we are not going to get any great global leadership soon. And if we continue to drift, we will at least know why.

"Failed States Are a Threat to U.S. National Security."


Only some of them. It has been a truism of U.S. foreign
policy since the 9/11 terrorist attacks that the United States is, in the words of President George W. Bush's 2002 National Security Strategy, "threatened less by conquering states than we are by failing ones." Defense Secretary Robert Gates has said that over the next 20 years, the gravest threats to America will come from failing states "that cannot meet the basic needs -- much less the aspirations -- of their people." Both as candidate and as president, Barack Obama has repeated this claim and has sought to reorient policy toward the prevention of state failure.

But the truth is that some state failure poses a real danger to the United States and the West, and some does not. Consider the Democratic Republic of the Congo, where some 5 million or more people have died in the wars that have convulsed the country since the mid-1990s -- the single most horrific consequence of state failure in modern times. What has been the consequence to Americans? The cost of coltan, a material mined in Congo and used in cell phones, has been extremely volatile. It's hard to think of anything else. Even the role of failed states in global terrorism may have been overstated. To start, terrorism is only a problem in failed states with significant Muslim populations -- admittedly, 13 of the top 20 in this year's Failed States Index. But the correlation between failure and global menace is weaker than we think. Islamist militants in unequivocally failed Muslim states such as Somalia, or profoundly weak ones such as Chad, have thus far mostly posed a threat to their own societies. They are surely less of a danger to the West than Pakistan or Yemen, both at least somewhat functional countries where state ideology and state institutions abet terrorists. In his new book, Weak Links, scholar Stewart Patrick concludes that "a middle-ranking group of weak -- but not yet failing -- states (e.g., Pakistan, Kenya) may offer more long-term advantages to terrorists than either anarchic zones or strong states." (See "The Brutal Truth.") Terrorists need infrastructure, too. The 9/11 attacks, after all, were directed from Afghanistan, but were financed and coordinated in Europe and more stable parts of the Muslim world, and were carried out mostly by citizens of Saudi Arabia. Al Qaeda is a largely middle-class organization. A similar pattern plays out in the world of transnational crime. Take the three-cornered drug market that links cocaine growers in
Latin America, traffickers in West Africa, and users in Europe. The narcotraffickers have found the failed states of West Africa, with their unpatrolled ports and corrupt and undermanned security forces, to be perfect transshipment points for their product. Drugs are dumped out of propeller planes or unloaded from ships just off the coast of Guinea, Guinea-Bissau, or Sierra Leone, and then broken into smaller parcels to be shipped north. But the criminal gangs operate not out of these Hobbesian spaces but from Ghana and Senegal -- countries with reliable banking systems, excellent air connections, pleasant hotels, and innumerable opportunities for money laundering. The relationship is analogous to that between Afghanistan, whose wild spaces offer al Qaeda a theater of operations, and Pakistan, whose freewheeling urban centers provide jihadists with a home base.

"Failed States Are Ungoverned Spaces."


Not necessarily. Somalia, the land of the perpetual war of all
against all, is our beau ideal, so to speak, of the failed state, and for the fourth year running it is No. 1 on the Failed States Index. Nobody can match Somalia for anarchy, but elsewhere in the world, government, rather than its absence, is chiefly to blame for state failure. Consider Sudan, where the state, deploying its national army as well as paramilitaries, fomented the violence that has dominated Sudanese life for decades and placed it near the very top of the index. Somali violence is a symptom of state failure; Sudanese violence is a consequence of state policy. Gérard Prunier, a prominent Africa scholar, has written that since coming to power in 1989, Sudanese President Omar Hassan al-
Bashir has adopted a policy toward restive ethnic groups that is "verging on genocide." The same was true in Burundi in the 1990s, where Hutu governments massacred Tutsis, after which the Tutsis turned around and did the same to Hutus. In these and other failed states, mass atrocity has almost become an accepted form of politics. A categorical divide, albeit a sometimes blurry one, separates two classes of failed states. A country like Somalia is incapable of forming and executing state policy; it is a hapless state. States like Sudan, by contrast, are precarious by design. Or take Pakistan, which has followed clear and consistent policies, laid down by the military, since its inception in 1947. Unlike Somalia, or, for that matter, its neighbor Afghanistan, Pakistan is an intentional state. But just as Sudanese policy has provoked decades of violence by pitting the state against the periphery, so the cultivation of jihadi groups by the Pakistani military and intelligence services -- as a counterweight to India and a source of "strategic depth" in Afghanistan -- has turned Pakistan into a cockpit of terrorist violence. Pakistan does, of course, have ungoverned spaces, in the Pashtun-dominated badlands along the border with Afghanistan. But the country's military leaders have made a strategic choice to allow the Pashtuns to govern themselves there, the better to be able to use them against their alleged adversaries. Intentional states, in short, often pose far greater threats to the world than hapless ones do.

"Failed States Are the West's Fault."


If only. The colonial powers, especially the more heedless ones,
undoubtedly dumped their former possessions on the threshold of independence with little if any preparation for statehood. Think of
Congo, which Belgium's King Leopold II ruled as the chief executive of a private company dedicated to the extraction of raw materials under conditions of virtual enslavement, and whose entire population at independence in 1960 included not a single person with a graduate degree in any subject. Others, like never-colonized Afghanistan, were shredded in the savage crossfire of the Cold War. But how can you hold the West responsible for states like Iraq (at least before 2003), Ivory Coast, Kenya, and Zimbabwe, all of which enjoyed relative prosperity and stability in the first decades after emerging from rule by a Western power? Or what about Haiti, which threw off the yoke of French colonialism in the time of Napoleon, but never acquired more than the trappings of statehood in the two centuries since? Less than half of the dozen most-failed countries can reasonably blame their Western parents for their plight. Why, after all, is Pakistan No. 12 on the list and India No. 76, despite sharing the same history of British colonization? Why is Ivory Coast 10 and Senegal 85, when both were under French rule? Same colonial upbringing, very different outcomes.

"Some States Were Born to Fail."


Unfortunately true. Although some failed states have no
one but themselves -- or rather, their corrupt or brutal political elites -- to blame, others never had a chance to start with. Here we face a problem of nomenclature. The very expression "failed" falsely implies a prior state of success. In fact, many countries in the upper tiers of the Failed States Index never emerged into full statehood. Fourteen of the 20 highest-scoring states are African, and many of them, including Nigeria, Guinea, and, of course,
Congo, consisted at birth of tribes or ethnic groups with little sense of common identity and absolutely no experience of modern government. (Perhaps in this more limited sense one can blame colonialism, because it was the European powers that drew the dubious borders.) They are, in novelist V.S. Naipaul's expression, "half-made societies," trapped between a no-longer-usable past and a not-yet-accessible future. They "failed" when modernity awakened new hopes and appetites (and rivalries) that overwhelmed the state's feeble institutions or that leaders sought to master and exploit. What is the world to do about such misbegotten states? One answer is that you seek to minimize the harm that comes from them, or to them -- by stemming the flow of drugs into and out of Guinea, say, or by using peacekeeping troops to prevent the spillover of violence from Darfur and Chad into the Central African Republic. You bolster the regional and subregional organizations in their neighborhoods (the African Union, or ECOWAS). And you acknowledge that even in places that pose no meaningful threat to the West, a moral obligation to relieve suffering requires that those who can help do so.

"The United States Needs a FailedStates Policy."


Maybe not. One of the standing critiques of the Obama
administration's foreign policy is that, though the president has spoken frequently of the danger posed by state failure, he has never formulated a coherent policy to prevent or cure it. The administration has been sensitive on this score; during her recent tenure as head of policy planning at the State Department, Anne-
Marie Slaughter suggested that the U.S. civilian-military counterinsurgency strategy in Afghanistan could be viewed as a "petri dish" for such a policy and that the post-earthquake state-building effort in Haiti, with its high level of collaboration with international partners, could serve as an alternative model. But today, even advocates of the administration's large-scale effort in Afghanistan acknowledge that the attempt to spread good governance there has largely failed, while even a year after the Haiti quake the state-building effort there has barely even begun. Perhaps the problem lies with our habit of thinking of failed states monolithically. What can it mean to have a policy that covers both Haiti and Afghanistan? What template could dictate a useful set of choices for U.S. officials in both Yemen, where state failure poses a direct threat to U.S. interests, and the Central African Republic, which has no strategic significance? And what policy would supply any useful options at all for Somalia, a wasteland that appears to be impervious to all forms of outside meddling, benevolent or malign? In this case, policy coherence may be overrated. The Obama administration is certainly seeking such coherence. The State Department's Quadrennial Diplomacy and Development Review, a novel effort to marshal the tools of "soft power," repeated the criticism about the absence of an overarching policy, but also placed a welcome emphasis on the need to develop civilian capacity to actually do whatever it is policymakers decide needs to be done. At present, meaningful U.S. policy options are undermined by the absence, at least outside the armed forces, of operational or "expeditionary" capacity: police trainers, sanitation experts, public-health officials, forensic accountants, and lawyers (yes, lawyers) who can be deployed to fragile states or postconflict settings. You need people to do things. Unfortunately, congressional Republicans seem determined to gut any and all
increases in nonmilitary capacity. Conservatives seem more comfortable with old-fashioned threats from powerful countries like China, Iran, and Russia. Perhaps they're not troubled by the absence of a failed-states strategy because they don't worry about failed states.

"Military Intervention Never Works."


Wrong. The fixity of the failed-states rankings from year to year
reminds us that the multiple diseases that plague these places are very resistant to being cured, whether by domestic actors or outsiders. Certainly the examples of Afghanistan and Haiti, the petri dishes of 2010, are not encouraging. But there are a few rays of light -- all of which, oddly enough, have involved military intervention. Liberia and Sierra Leone have been pulled back from the brink of utter chaos in recent years, and both are now at peace. The same may be true of Ivory Coast in future years; it's still too early to tell after this year's brief and bloody post-election civil war. Iraq, a country whose descent seemed to have no bottom five years ago, has improved its standing on the index as sectarian violence has diminished over the last year, from No. 7 to No. 9. The inference to be drawn is not that the solution to failed states is to send in the Marines, but rather that, at moments of supreme crisis, outsiders can bend the trajectory of failed states by using force to topple monstrous leaders or prevent them from gaining power. But intervention is itself a sign of failure, a failure to anticipate the moment of crisis. Any new policy toward failed states needs to focus on prevention rather than reaction, not only to avoid the need for military force, but also because in many places intervention simply will not be possible. You want to know
now that, say, Thailand is at risk of political crisis, because while neighboring countries and Western powers have diplomatic tools they can use to avert calamity, there may be little they can do once violence breaks out. The supreme example of the dire consequences of ignoring early warnings is, of course, Rwanda, where U.N. officials and the Security Council ignored repeated warnings of an impending genocide and reacted only when it was too late to stop the killing.

"Failed States Can't Be Helped."


Some of them can. What can outsiders do when this
moment of leverage has passed? What can they do to promote reconciliation among tribes in Kenya, to bolster civilian rule in Pakistan, to help create an economic base to replace dwindling supplies of oil in Yemen? These are, of course, profoundly different questions, but they do have one common answer: It depends on the willingness of the state to be helped. Outsiders can do little in Zimbabwe so long as Robert Mugabe remains in power, for Mugabe is prepared to wreck his country in order to preserve his rule over it. The best thing outsiders can do is pressure or bribe him and his immediate circle into leaving. On the other hand, outsiders may be able to accomplish a great deal in Liberia, where President Ellen Johnson Sirleaf has invited U.N. officials to operate from inside the country's ministries in order to provide expertise and prevent abuse. The same contrast may apply between Sudan, an autocracy afloat on oil wealth, and Southern Sudan, a new country born naked and helpless, but with a legitimate political leadership (though there is a real danger that Sudan's abrupt seizure of the border territory of Abyei could plunge both countries into a spiral of violence).

It is tempting to view the problem of failed states in technocratic terms. In Fixing Failed States, Ashraf Ghani and Clare Lockhart argue that failed states need to be connected to global markets and have their innovative energies unshackled. They do -- but ruthless dictators view economic and political freedom as a threat to their rule. The generals who run Burma will make sure that no one save themselves and their friends benefits from global markets. There's no escaping politics, and political will. The hapless states, like Liberia, want help, and sometimes they can be helped. The intentional states, like Burma or Sudan, will exploit outside help for their own purposes. Unfortunately, it's the intentional states, by and large, that pose the greatest threat to the United States and the West. So here's a proposal: Maybe we can formulate a new kind of failing-states policy, one to help the deserving states, those that can be helped, and minimize the harm from the others.

"The Berlin Wall Has Fallen in the Arab World."


Yes and no. It's tempting to compare the astonishing wave of
political upheaval in the Arab world to the equally dramatic wave of political change that swept Central and Eastern Europe in 1989. In the Middle East today, as in 1989, extraordinary numbers of ordinary people are courageously and for the most part nonviolently demanding a better future for themselves and their children. The wave broke just as suddenly and was almost entirely unpredicted by experts both inside and outside the region. And the process of cross-country contagion -- the political sparks
jumping across borders almost instantaneously -- has also been strikingly reminiscent of Central and Eastern Europe in that fateful year. Yet the 1989 analogy is misleading in at least two major ways. First, the communist governments of Central and Eastern Europe had been imposed from the outside and maintained in place by the Soviet Union's guarantee -- the very real threat of tanks arriving to put down any serious insurrection. When Soviet power began to crumble in the late 1980s, this guarantee turned paper thin and the regimes were suddenly deeply vulnerable to any hard push from inside. The situation in the Arab world is very different. For all you hear about America's support for dictators, the Arab autocracies were not, and are not, held in place by any external power framework. Rather, they survived these many decades largely by their own means -- in some cases thanks to a certain amount of monarchical legitimacy well watered with oil and in all cases by the heavy hand of deeply entrenched national military and police forces. True, the United States has supplied military and economic assistance in generous quantities to some of the governments and did save Kuwait's ruling Sabah family from Iraqi takeover in 1991. Generally, however, the U.S. role is far less invasive than that of the Soviets in Central and Eastern Europe -- just ask American diplomats if they feel like Saudi Arabia's independent-minded King Abdullah is a U.S. puppet. What's more, though citizens in some Arab countries have given their governments a hard push, the underlying regimes themselves -- the interlocking systems of political patronage, security forces, and raw physical coercion that political scientists call the "deep state" -- are not giving up the ghost but are hunkering down and trying to hold on. Shedding presidents, as in
Tunisia and Egypt, is a startling and significant development, but only partial regime collapse. The entrenched security establishments in those countries are bargaining with the forces of popular discontent, trying to hold on to at least some parts of their privileged role. If the protesters are able to stay mobilized and focus their demands, they may be able to force a step-by-step dismantling of the old order. Elsewhere in the region, however, the changes so far are less fundamental. Second, Middle East regimes are much more diverse than was the case in Central and Eastern Europe. The Arab world contains reformist monarchs, conservative monarchs, autocratic presidents, tribal states, failing states, oil-rich states, and water-poor states -- none of which much resemble the sagging bureaucratic communist regimes of Central and Eastern Europe in the late 1980s. Thus, even if the transition process in one or two Arab states does end up bearing some resemblance to what occurred in Central and Eastern Europe, what takes place elsewhere in the region is likely to differ from it fundamentally. Anyone trying to predict the political future of Libya or Yemen, for example, will not get very far by drawing comparisons to the Poland or Hungary of 20 years ago.

"Middle East Transitions Are More Like Sub-Saharan Africa."


Worth considering. Central and Eastern Europe in 1989 is
not the only historical analogy political observers are proffering. Some are citing Iran in 1979; others Europe in 1848. A more telling analogy has not been receiving sufficient attention -- the wave of authoritarian collapse and democratic transition that swept through sub-Saharan Africa in the early 1990s. After more
than two decades of stultifying strongman rule in sub-Saharan Africa following decolonization, broad-based but loosely organized popular protests spread across the continent, demanding economic and political reforms. Some autocrats, such as Mathieu Kerekou in Benin, fell relatively quickly. Others, like Daniel arap Moi in Kenya, hung on, promising political liberalization and then reconsolidating their positions. Western governments that had long indulged African dictators suddenly found religion on democracy there, threatening to withhold aid from countries whose leaders refused to hold elections. The number of electoral democracies grew rapidly from just three in 1989 to 18 in 1995, and a continent that most political analysts had assumed would indefinitely be autocratic was replete with democratic experiments by the end of the decade. Of course, like all such analogies, the comparison to today's Middle East is far from exact. The oil-rich Arab states are much wealthier than any African states were in 1990. The significant presence of monarchical systems in the Arab world has no equivalent in sub-Saharan Africa -- as much as delusional "kings" like Swaziland's Mswati III would have it otherwise. The sociopolitical role of Islam in Arab societies is quite different from the role of Islam or other religions in many sub-Saharan African countries. Yet enough political and economic similarities exist to give the analogy as much or more utility than those of Central and Eastern Europe in 1989, Europe in the mid-19th century, and Iran facing Ayatollah Khomeini and his followers. It would behoove policymakers to quickly study up on the sub-Saharan African experience to understand how and why the different outcomes evident on the subcontinent -- shaky democratic success in some cases (Ghana and Benin), authoritarian reconsolidation in others
(Cameroon and Togo), and civil war in still others (Democratic Republic of the Congo) -- took place.

"Democracy's Long-Term Chances Are Slim."


Don't give up hope. During the heyday of democracy's
global spread in the 1980s and 1990s, democracy enthusiasts tended not to pay much attention to the underlying social, economic, and historical conditions in countries attempting democratic transitions. Democracy appeared to be breaking out in the unlikeliest of places, whether Mongolia, Malawi, or Moldova. The burgeoning community of international democracy activists thought that as long as a critical mass of people within a country believed in and pushed for democracy, unfavorable underlying conditions could be overcome. Two decades later, democracy enthusiasts are chastened. Democracy's "third wave" produced a very mixed set of outcomes around the world. Many once hopeful transitions, from Russia to Rwanda, have fallen badly short. Given these different results, it has become clear that underlying conditions do have a big impact on democratic success. Five are of special importance: 1) the level of economic development; 2) the degree of concentration of sources of national wealth; 3) the coherence and capability of the state; 4) the presence of identity-based divisions, such as along ethnic, religious, tribal, or clan lines; and 5) the amount of historical experience with political pluralism. Seen in this light, the Arab world presents a daunting picture. Poverty is widespread; where it is not present, oil dominates. Sunni-Shiite divisions are serious in some countries; tribal tensions haunt others. In a few countries, like Libya, the coherence of basic state institutions has long been shockingly low.

In much of the region, there is little historical experience with pluralism. A hard road ahead for democracy is almost certain. Yet within the political and economic diversity of the Arab world lie some grounds for hope. Tunisia's population is well educated and a real middle class exists. Egypt's protests have shown the potential for cross-sectarian cooperation. Bahrain, Jordan, Kuwait, and Morocco have parliamentary institutions with significant experience in multiparty competition, however attenuated. Additionally, the five factors mentioned above are indicators of likelihood, not preconditions. Their absence only indicates a difficult path, not an impossible one. After all, India failed this five-part test almost completely when it became independent, but has made a good go of democracy. Returning to the analogy of sub-Saharan Africa in the 1990s, at least one-third of African states have made genuine democratic progress despite facing far more daunting underlying conditions.

"Islamists Will Win Big in Free and Fair Elections."


Not necessarily. Many observers watching the events in the
Arab world worry that expanding the political choices of Arab citizens will open the floodgates to a cascade of Islamist electoral landslides. They invoke the experience of Islamist victories in Algeria in 1991 and Palestine in 2006 as evidence for their concern. This fear is overblown. Elections in which Islamists do well grab international attention, but do not represent the norm. Islamist parties have a long history of electoral participation in Muslim countries but usually only gain a small fraction of the vote. In their extensive study of Islamist political participation,
published in the April 2010 Journal of Democracy, Charles Kurzman and Ijlal Naqvi find that most Islamist parties win less than 10 percent of the vote in the elections in which they participate. It is true that after decades of autocracy, secular opposition parties in most Arab societies are weak and Islamists are sometimes the most organized alternative. Yet organization itself does not automatically guarantee electoral success. Given the powerful role of television and the Internet, electoral campaigns have often become as much about mass-media appeal as grassroots work. Especially in new democracies, charismatic candidates leading personalistic organizations and offering vague promises of change sometimes win out over better-organized groups (think President Susilo Bambang Yudhoyono of Indonesia). In addition, Islamists may inspire strong loyalty among their core supporters, but winning elections requires appealing to the moderate majority. The protests sweeping the Arab world have so far been notable for their lack of Islamist or sectarian sentiment, and nowhere among the countries in flux is there a charismatic religious leader such as Ayatollah Khomeini ready to seize power. An Islamist victory somewhere in the Middle East can't be ruled out, but that does not mean we will see a replay of Iran circa 1979. Never in the Arab world have any Islamist election gains resulted in a theocracy, and established Islamist parties across the region have proved willing to work within multiparty systems. Moreover, newly elected Islamists would not have free rein to impose theocracy. Whoever is elected president in Tunisia or Egypt will face mobilized populations with little patience for fresh dictatorial methods as well as secular militaries likely to resist any theocratic impulses.

"Democracy Experts Know How to Help."


Yes, but humility is needed. With all the experience of
attempted democratic transitions around the world over the past 25 years, is a clear "transition tool kit" now available to would-be Arab democrats? Western democracy promoters are indeed hurrying to organize seminars and conferences in the region to present lessons learned from other parts of the world. Former activists from Chile, the Philippines, Poland, South Africa, and other new democracies are in demand. Arab activists are eager to learn from the experience of others, and in many cases their hunger for knowledge is intense. The traveling transition veterans tend to showcase a fairly common set of lessons: Opposition unity is crucial. Constitutional reform should be inclusive. Elections should not be hurried, but also not put off indefinitely. Banning large swaths of the former ruling elite from political life is a mistake. Putting a politically grasping military back in the box should be approached step by step, rather than in one fell swoop. Given likely public disenchantment with the fruits of democracy, finding rapid ways to deliver tangible economic benefits is critical. Such lessons are valid, but the notion of a workable tool kit is dubious. Every Arab country, be it Morocco, Bahrain, or Yemen, has such particular local sociopolitical conditions that supposedly universal lessons will be only suggestive at best. Nostrums about the importance of opposition unity, for example, often derive from countries such as Chile, Poland, and South Africa, where political opposition movements enjoyed a much higher level of organization and concentration than what exists in the Arab states. Forging unity in a situation like Egypt or Tunisia, where
mass protest movements draw from dozens of smaller, often informal opposition groups and lack any strong leaders is a very different matter. A six-month wait for elections may be reasonable in some cases, but much too short in others. Moreover, these lessons are often more useful as descriptions of endpoints than of processes. The hard part -- how to achieve them -- has to be figured out case by case.

"Europe Has a Major Role to Play."


It could. Given its proximity to the Arab world, its welter of
diplomatic ties, the weight of its commercial connections, its ample aid budgets, and much else, Europe should be a major player in the region's political evolution. During the last two decades, Europe set up gleaming cooperative frameworks to intensify economic and political ties with the region, including firmly stated principles of democracy and human rights. Yet the first of these, the so-called Barcelona Process, proved to be toothless on democracy and rights. Its successor, French President Nicolas Sarkozy's Union for the Mediterranean, was even weaker. A whole series of obstacles has inhibited Europe's efforts to turn its value-based intentions into real support for Arab political reform. European leaders fear that political change might produce refugee flows heading north, disruption in oil supplies, and a rise in radical Islamist activity. Traditional French and British ties with nondemocratic Arab elites only add to the mix. The consensus-oriented European Union's tendency toward lowest-common-denominator policies has further undercut efforts to take democracy seriously.

The new fervor for democracy in the Middle East represents an enormous opportunity for Europe to regain credibility in this domain. The European Union played an invaluable role in helping influence the political direction of Central and Eastern Europe after 1989 thanks to the promise of membership for those states meeting certain political and economic standards. The same offer is clearly not possible now, but the core idea should be preserved. Numerous Arab societies badly need reference points and defined trajectories as they try to move away from decades of stagnant autocratic rule. If Europe could reinvigorate its tired and overused concept of neighborhood partnership by offering real incentives on trade, aid, and other fronts to Arab states that respect democratic principles and human rights, it could redeem its past passivity. Brussels appears to be starting to move in this direction -- but follow-through will be the rub.

"The United States Is Now on Democracy's Side."


Not so fast. In his most recent State of the Union address,
U.S. President Barack Obama highlighted the success of protests in Tunisia, declaring that "the United States of America stands with the people of Tunisia and supports the democratic aspirations of all people." Then, shortly after Hosni Mubarak stepped down as president of Egypt, Obama pledged that the United States stands "ready to provide whatever assistance is necessary -- and asked for -- to pursue a credible transition to a democracy." On Libya, the United States has joined the many international calls for Muammar al-Qaddafi to leave. Has the United States finally turned the page on its long-standing support for autocratic stability in the Arab world? George W. Bush started to turn that page in 2003 by eloquently declaring that the United States was
moving away from its old ways and taking the cause of Arab democracy seriously. But various unsettling events, especially Hamas's coming to power in Palestine in 2006, caused the Bush administration to let the page fall back to its old place. It is as yet uncertain whether a fundamental change in U.S. policy will occur. The dream of Arab democracy appears to resonate with Obama, and numerous U.S. officials and aid practitioners are burning the candle at both ends to find ways to support the emerging democratic transitions. Yet enduring U.S. interests in the region continue to incline important parts of the Washington policy establishment to hope for stability more than democracy. Concerns over oil supplies undergird a continuing strong attachment to the Persian Gulf monarchies. The need for close cooperation on counterterrorism with many of the region's military and intelligence services fuels enduring ties. Washington's special relationship with Israel prompts fears of democratic openings that could result in populist governments that aggressively play the anti-Israel card. Given the complex mix of U.S. interests and the probable variety of political outcomes in the region, U.S. policy is unlikely to coalesce around any unified line. A shift in rhetoric in favor of democracy will undoubtedly emerge, but policy on the ground will vary greatly from country to country, embodying inconsistencies that reflect clashing imperatives. Comparing U.S. policies with regard to democracy in the former Soviet Union is instructive: sanctions against and condemnations of the dictator in Belarus, accommodation and even praise for the strongman in Kazakhstan, strenuous efforts at constructive partnership with undemocratic Russia, active engagement in democracy support in Moldova, a live-and-let-live attitude toward autocratic Azerbaijan, and genuine concern over democratic backsliding in Ukraine. A similar
salad bar of policy lines toward a changed Middle East is easy to imagine.

"The World Is a More Violent Place Than It Used to Be."


No way. The early 21st century seems awash in wars: the
conflicts in Afghanistan and Iraq, street battles in Somalia, Islamist insurgencies in Pakistan, massacres in the Congo, genocidal campaigns in Sudan. All in all, regular fighting is taking place in 18 wars around the globe today. Public opinion reflects this sense of an ever more dangerous world: One survey a few years ago found that 60 percent of Americans considered a third world war likely. Expectations for the new century were bleak even before the attacks of Sept. 11, 2001, and their bloody aftermath: Political scientist James G. Blight and former U.S. Defense Secretary Robert McNamara suggested earlier that year that we could look forward to an average of 3 million war deaths per year worldwide in the 21st century. So far they haven't even been close. In fact, the last decade has seen fewer war deaths than any decade in the past 100 years, based on data compiled by researchers Bethany Lacina and Nils Petter Gleditsch of the Peace Research Institute Oslo. Worldwide, deaths caused directly by war-related violence in the new century have averaged about 55,000 per year, just over half of what they were in the 1990s (100,000 a year), a third of what they were during the Cold War (180,000 a year from 1950 to 1989), and a hundredth of what they were in World War II. If you factor in the growing global population, which has nearly quadrupled in the
last century, the decrease is even sharper. Far from being an age of killer anarchy, the 20 years since the Cold War ended have been an era of rapid progress toward peace. Armed conflict has declined in large part because armed conflict has fundamentally changed. Wars between big national armies all but disappeared along with the Cold War, taking with them the most horrific kinds of mass destruction. Today's asymmetrical guerrilla wars may be intractable and nasty, but they will never produce anything like the siege of Leningrad. The last conflict between two great powers, the Korean War, effectively ended nearly 60 years ago. The last sustained territorial war between two regular armies, Ethiopia and Eritrea, ended a decade ago. Even civil wars, though a persistent evil, are less common than in the past; there were about a quarter fewer in 2007 than in 1990. If the world feels like a more violent place than it actually is, that's because there's more information about wars -- not more wars themselves. Once-remote battles and war crimes now regularly make it onto our TV and computer screens, and in more or less real time. Cell-phone cameras have turned citizens into reporters in many war zones. Societal norms about what to make of this information have also changed. As Harvard University psychologist Steven Pinker has noted, "The decline of violent behavior has been paralleled by a decline in attitudes that tolerate or glorify violence," so that we see today's atrocities -- though mild by historical standards -- as "signs of how low our behavior can sink, not of how high our standards have risen."

"America Is Fighting More Wars Than Ever."

Yes and no. Clearly, the United States has been on a war
footing ever since 9/11, with a still-ongoing war in Afghanistan that has surpassed the Vietnam War as the longest conflict in American history and a pre-emptive war in Iraq that proved to be longer, bloodier, and more expensive than anyone expected. Add the current NATO intervention in Libya and drone campaigns in Pakistan, Somalia, and Yemen, and it's no wonder that U.S. military spending has grown more than 80 percent in real terms over the last decade. At $675 billion this year, it's now 30 percent higher than what it was at the end of the Cold War. But though the conflicts of the post-9/11 era may be longer than those of past generations, they are also far smaller and less lethal. America's decade of war since 2001 has killed about 6,000 U.S. service members, compared with 58,000 in Vietnam and 300,000 in World War II. Every life lost to war is one too many, but these deaths have to be seen in context: Last year more Americans died from falling out of bed than in all U.S. wars combined. And the fighting in Iraq and Afghanistan has taken place against a backdrop of base closures and personnel drawdowns elsewhere in the world. The temporary rise in U.S. troop numbers in South Asia and the Middle East, from 18,000 to 212,000 since 2000, contrasts with the permanent withdrawal of almost 40,000 troops from Europe, 34,000 from Japan and South Korea, and 10,000 from Latin America in that period. When U.S. forces come home from the current wars -- and they will in large numbers in the near future, starting with 40,000 troops from Iraq and 33,000 from Afghanistan by 2012 -- there will be fewer U.S. troops deployed around the world than at any time since the 1930s. President Barack Obama was telling the truth in June when he said, "The tide of war is receding."

"War Has Gotten More Brutal for Civilians."


Hardly. In February 2010, a NATO airstrike hit a house in
Afghanistan's Marja district, killing at least nine civilians inside. The tragedy drew condemnation and made the news, leading the top NATO commander in the country to apologize to Afghan President Hamid Karzai. The response underscored just how much has changed in war. During World War II, Allied bombers killed hundreds of thousands of civilians in Dresden and Tokyo not by accident, but as a matter of tactics; Germany, of course, murdered civilians by the millions. And when today's civilians do end up in harm's way, more people are looking out for them. The humanitarian dollars spent per displaced person rose in real terms from $150 in the early 1990s to $300 in 2006. Total international humanitarian assistance has grown from $2 billion in 1990 to $6 billion in 2000 and (according to donor countries' claims) $18 billion in 2008. For those caught in the crossfire, war has actually gotten more humane. Yet many people insist that the situation is otherwise. For example, authoritative works on peacekeeping in civil wars (Roland Paris's award-winning At War's End and Michael Doyle and Nicholas Sambanis's Making War and Building Peace), as well as gold-standard reports on conflict from the World Bank and the Carnegie Commission on Preventing Deadly Conflict, tell us that 90 percent of today's war deaths are civilian while just 10 percent are military -- the reverse of a century ago and "a grim indicator of the transformation of armed conflict" in the late 20th century, as political scientist Kalevi Holsti put it.

Grim indeed -- but, fortunately, untrue. The myth originates with the 1994 U.N. Human Development Report, which misread work that Swedish researcher Christer Ahlström had done in 1991 and accidentally conflated war fatalities in the early 20th century with the much larger number of dead, wounded, and displaced people in the late 20th century. A more careful analysis done in 1989 by peace researcher William Eckhardt shows that the ratio of military to civilian war deaths remains about 50-50, as it has for centuries (though it varies considerably from one war to the next). If you are unlucky enough to be a civilian in a war zone, of course, these statistics are little comfort. But on a worldwide scale, we are making progress in helping civilians afflicted by war.

"Wars Will Get Worse in the Future."


Probably not. Anything is possible, of course: A full-blown war
between India and Pakistan, for instance, could potentially kill millions of people. But so could an asteroid or -- perhaps the safest bet -- massive storms triggered by climate change. The big forces that push civilization in the direction of cataclysmic conflict, however, are mostly ebbing. Recent technological changes are making war less brutal, not more so. Armed drones now attack targets that in the past would have required an invasion with thousands of heavily armed troops, displacing huge numbers of civilians and destroying valuable property along the way. And improvements in battlefield medicine have made combat less lethal for participants. In the U.S. Army, the chances of dying from a combat injury fell from 30 percent in World War II to 10 percent in Iraq and Afghanistan -- though this also means the United States is now seeing a higher proportion of injured veterans who need continuing support and care.

Nor do shifts in the global balance of power doom us to a future of perpetual war. While some political scientists argue that an increasingly multipolar world is an increasingly volatile one -- that peace is best assured by the predominance of a single hegemonic power, namely the United States -- recent geopolitical history suggests otherwise. Relative U.S. power and worldwide conflict have waned in tandem over the past decade. The exceptions to the trend, Iraq and Afghanistan, have been lopsided wars waged by the hegemon, not challenges by up-and-coming new powers. The best precedent for today's emerging world order may be the 19th-century Concert of Europe, a collaboration of great powers that largely maintained the peace for a century until its breakdown and the bloodbath of World War I. What about China, the most ballyhooed rising military threat of the current era? Beijing is indeed modernizing its armed forces, racking up double-digit rates of growth in military spending, now about $100 billion a year. That is second only to the United States, but it is a distant second: The Pentagon spends nearly $700 billion. Not only is China a very long way from being able to go toe-to-toe with the United States; it's not clear why it would want to. A military conflict (particularly with its biggest customer and debtor) would impede China's global trading posture and endanger its prosperity. Since Chairman Mao's death, China has been hands down the most peaceful great power of its time. For all the recent concern about a newly assertive Chinese navy in disputed international waters, China's military hasn't fired a single shot in battle in 25 years.

"A More Democratic World Will Be a More Peaceful One."


Not necessarily. The well-worn observation that real
democracies almost never fight each other is historically correct, but it's also true that democracies have always been perfectly willing to fight non-democracies. In fact, democracy can heighten conflict by amplifying ethnic and nationalist forces, pushing leaders to appease belligerent sentiment in order to stay in power. Thomas Paine and Immanuel Kant both believed that selfish autocrats caused wars, whereas the common people, who bear the costs, would be loath to fight. But try telling that to the leaders of authoritarian China, who are struggling to hold in check, not inflame, a popular undercurrent of nationalism against Japanese and American historical enemies. Public opinion in tentatively democratic Egypt is far more hostile toward Israel than the authoritarian government of Hosni Mubarak ever was (though being hostile and actually going to war are quite different things). Why then do democracies limit their wars to non-democracies rather than fight each other? Nobody really knows. As the University of Chicago's Charles Lipson once quipped about the notion of a democratic peace, "We know it works in practice. Now we have to see if it works in theory!" The best explanation is that of political scientists Bruce Russett and John Oneal, who argue that three elements -- democracy, economic interdependence (especially trade), and the growth of international organizations -- are mutually supportive of each other and of peace within the community of democratic countries. Democratic leaders, then, see themselves as having less to lose in going to war with autocracies.

"Peacekeeping Doesn't Work."


It does now. The early 1990s were boom years for the blue
helmets, with 15 new U.N. peacekeeping missions launched from 1991 to 1993 -- as many as in the U.N.'s entire history up to that point. The period was also host to peacekeeping's most spectacular failures. In Somalia, the U.N. arrived on a mission to alleviate starvation only to become embroiled in a civil war, and it quickly pulled out after 18 American soldiers died in a 1993 raid. In Rwanda in 1994, a weak U.N. force with no support from the Security Council completely failed to stop a genocide that killed more than half a million people. In Bosnia, the U.N. declared "safe areas" for civilians, but then stood by when Serbian forces overran one such area, Srebrenica, and executed more than 7,000 men and boys. (There were peacekeeping successes, too, such as in Namibia and Mozambique, but people tend to forget about them.) In response, the United Nations commissioned a report in 2000, overseen by veteran diplomat Lakhdar Brahimi, examining how the organization's efforts had gone wrong. By then the U.N. had scaled back peacekeeping personnel by 80 percent worldwide, but as it expanded again the U.N. adapted to lessons learned. It strengthened planning and logistics capabilities and began deploying more heavily armed forces able to wade into battle if necessary. As a result, the 15 missions and 100,000 U.N. peacekeepers deployed worldwide today are meeting with far greater success than their predecessors. Overall, the presence of peacekeepers has been shown to significantly reduce the likelihood of a war's reigniting after a cease-fire agreement. In the 1990s, about half of all cease-fires
broke down, but in the past decade the figure has dropped to 12 percent. And though the U.N.'s status as a perennial punching bag in American politics suggests otherwise, these efforts are quite popular: In a 2007 survey, 79 percent of Americans favored strengthening the U.N. That's not to say there isn't room for improvement -- there's plenty. But the U.N. has done a lot of good around the world in containing war.

"Some Conflicts Will Never End."


Never say never. In 2005, researchers at the U.S. Institute of
Peace characterized 14 wars, from Northern Ireland to Kashmir, as "intractable," in that they "resist any kind of settlement or resolution." Six years later, however, a funny thing has happened: All but a few of these wars (Israel-Palestine, Somalia, and Sudan) have either ended or made substantial progress toward doing so. In Sri Lanka, military victory ended the war, though only after a brutal endgame in which both sides are widely believed to have committed war crimes. Kashmir has a fairly stable cease-fire. In Colombia, the war sputters on, financed by drug revenue, but with little fighting left. In the Balkans and Northern Ireland, shaky peace arrangements have become less shaky; it's hard to imagine either sliding back into full-scale hostilities. In most of the African cases -- Burundi, Rwanda, Sierra Leone, Uganda, the Democratic Republic of the Congo, and Ivory Coast (notwithstanding the violent flare-up after elections there in late 2010, now resolved) -- U.N. missions have brought stability and made a return to war less likely (or, in the case of Congo and Uganda, have at least limited the area of fighting).

Could we do even better? The late peace researcher Randall Forsberg in 1997 foresaw "a world largely without war," one in which "the vanishing risk of great-power war has opened the door to a previously unimaginable future -- a future in which war is no longer socially-sanctioned and is rare, brief, and small in scale." Clearly, we are not there yet. But over the decades -- and indeed, even since Forsberg wrote those words -- norms about wars, and especially about the protection of civilians caught up in them, have evolved rapidly, far more so than anyone would have guessed even half a century ago. Similarly rapid shifts in norms preceded the ends of slavery and colonialism, two other scourges that were once also considered permanent features of civilization. So don't be surprised if the end of war, too, becomes downright thinkable.

"The Answer to the Israeli-Palestinian Conflict Is a Two-State Solution."


In an ideal world, yes, but that doesn't mean it's going to
happen. In the 18 years since the signing on the White House lawn of the Oslo Accords, which laid the groundwork for a negotiated end to the Israeli-Palestinian conflict, the idea of a two-state solution has gained wide acceptance. According to a joint Israeli-Palestinian poll from March 2010, 57 percent of Palestinians support it; among Israelis the percentage is even higher -- 71 percent. In both Europe and the United States, it's seen as the natural end point of this six-decade conflict. As U.S. President Barack Obama said in May, the "United States believes that negotiations should
result in two states -- with permanent Palestinian borders with Israel, Jordan, and Egypt, and permanent Israeli borders with Palestine." Nonetheless, we have reached the point where the ideas of two independent states and a negotiated resolution to the conflict reside on life support. The short explanation for this conundrum is that for much of the past 18 years, the momentum of obstructionism has been far more powerful than the momentum of progress. This has been consistently true since the earliest days of the Oslo process, as the forces that oppose peace have demonstrated a deadly effectiveness at thwarting it. From Baruch Goldstein's horrific massacre of Palestinians at Hebron's Cave of the Patriarchs in 1994 to the subsequent Palestinian suicide bombing attacks of 1994-1995 and the assassination of Israeli Prime Minister Yitzhak Rabin; from the targeted killing of Hamas leaders to the terrible violence of the Second Intifada against Israeli citizens, bloodshed has been a constant tool utilized by both sides to erode trust and strengthen the forces of irredentism. Beyond the use of violence, the lack of political will on both sides has been most catastrophic. As former Israeli Foreign Minister Shlomo Ben-Ami recounts in Scars of War, Wounds of Peace, even the architects of the Oslo peace process -- Israeli leaders Shimon Peres and Yitzhak Rabin -- initially rejected the idea of a Palestinian state, believing that some middle ground between statehood and the status quo was possible. Even as the path to statehood seemed clear, the country's leading doves were unwilling to reconcile themselves either publicly or privately with such a potentiality. In addition, the growth of Israeli settlements, in violation of the spirit if not the letter of Oslo, and the unwillingness of the Israeli government to halt them, have
become an almost insurmountable barrier to a workable two-state solution. On the other side, Yasir Arafat, the head of the Palestine Liberation Organization and Palestinian Authority, never publicly accepted the idea of peaceful reconciliation with Israel. He refused to countenance painful concessions on Jerusalem and the right of return, continued to view political violence as a tool for wrangling concessions out of Israel, and offered far too many public hints that a Palestinian state in the West Bank and Gaza was only the first step in a two-stage process of Palestinian liberation. The continued acceptance of violence as a viable means for achieving political goals, particularly by Hamas, has not surprisingly undermined Israeli enthusiasm for territorial concessions. Finally, each public has demonstrated an unwillingness to fully recognize and integrate the attitudes and fears of the other. Israelis are either blissfully unaware of, or not bothered by, the humiliation that is the hallmark of Israeli occupation. Hours spent at checkpoints, searches by Israeli soldiers, and transit roads that restrict movement and turn what should be quick trips into daylong excursions are just a few examples of the minor degradations that are a daily part of Palestinian life. At the end of Ramadan, last month, I attended a nonviolent demonstration at an Israeli checkpoint at Qalandia, where Palestinians were seeking to pass so that they could worship at the al-Aqsa mosque. As it was, such access was restricted to men over 50 and women over 40. For many Israelis, such indignities could be happening on the other side of the globe. Conversely, Palestinians have limited sympathy or appreciation for the trauma created in Israel by living in a state of constant siege and fear of terrorist attacks. Add all these various factors
together and the result is that while most Israelis and Palestinians believe a two-state solution is in the best interests of both peoples, the region is likely further away from that reality than at any point since Oslo. The Palestinian Authority's preparations to go to the United Nations and seek recognition as an independent state are compelling evidence that at least one side in the dispute sees no hope for a negotiated resolution.

"Israel Is a Beacon of Democracy in the Middle East."


Perhaps, but not forever. While supporters of Israel
commonly tout the Jewish state as the only true democracy in the Middle East, the trend lines are moving in the wrong direction. With emerging pro-democracy movements across the region, not to mention democracies in Iraq, Lebanon, and Turkey, it's getting harder to argue that Israel resides in an exclusive club. Recent developments in the Israeli Knesset portend a more ominous future. Lawmakers recently passed legislation that would threaten civil lawsuits against any Israeli who endorses boycott and divestiture campaigns against Israel. Other laws are being considered that would set up McCarthy-style committees to investigate left-leaning groups or even cancel Arabic as an official Israeli language, despite the fact that around 20 percent of Israeli citizens are Arabs. There have been growing calls for Jews not to rent apartments to Arabs; and according to peace activists with whom I spoke, harassment of human rights groups and NGOs is on the rise. Arab citizens of Israel already face serious discrimination on issues such as land ownership, employment, and resource allocation -- problems that are only increasing.

But do Israelis value democracy more than they do security? Israeli public opinion expert Dahlia Scheindlin told me in an email exchange, "There's a standard question (in Israeli public opinion polling) that asks (roughly): 'Sometimes security needs may conflict with democratic principles (or rule of law). When that happens which should come first -- security or democracy needs?' The response is always quite overwhelmingly in favor of security." Indeed, a June 2010 study done by the Friedrich Ebert Foundation, said Scheindlin, suggested that nearly three-quarters of Israeli youth (between the ages of 15 and 25), when given the option, chose security over democracy. According to the final version of the report, "With regard to Israel's future as a democratic and pluralistic society, the attitudes described [in the report] represent a major challenge to those social and political agents who are committed to the values and goals of the founding fathers of the State of Israel." Scheindlin suggested that these results could probably be replicated in a host of Western countries, but in few places is the choice as stark as in Israel. Continuing the status quo of military occupation in the West Bank has the strong likelihood of leading to an undemocratic future for Israel (a view endorsed by even right-wing Israelis). If current demographic trends continue, the Jewish state may reach a point in the not-too-distant future when Israel and the occupied territories will feature a minority of Jews -- and a majority of Arabs without full political rights. It's bad enough that Arabs living in Israel and the territories do not have such full economic, social, or legal rights today -- but if these second-class citizens become a full majority of the population between the Mediterranean Sea and the Jordan River, it will be hard to escape the conclusion that Israel is on the road to becoming an apartheid state.

"Israel Dismantled Settlements in Sinai and Gaza; It Can Do So in the West Bank Too."
Don't bet on it. Beyond the obvious explanation that the
biblical connection for Jews to the lands of Judea and Samaria, as many Israelis refer to the West Bank, is stronger than those to Gaza and Sinai, the larger problem is that the settlements have become so enmeshed with Palestinian communities that disentangling them is practically impossible. There are today more than 300,000 settlers in the West Bank. This doesn't even include the 190,000 Jews living in East Jerusalem, which was annexed by Israel after the 1967 Six-Day War. This figure is nearly triple the West Bank settler population when Oslo was signed. Although it is true that most of these settlements are in and around the 1967 borders, the reality on the ground is extraordinarily more complicated. For example, approximately 30 percent of the settler population resides outside the separation barrier, many in some of the most radicalized settler communities like Itamar and Kiryat Arba. In the unlikely event that Israel and the Palestinians did agree to a two-state solution and were to use the separation fence as a final border, there would still be more than 70,000 settlers and dozens of settlements on Palestinian land. These settlers would either have to accept living in a Palestinian state, which is unlikely, or have to be evacuated by the Israeli government. Not only would settlers almost certainly resist such a move, but already today settlers have set up dozens of illegal outposts in the West Bank and the Israeli government has made no effort to dislodge them. According to Hagit Ofran, director of Settlement Watch for Peace
Now, while Israel has removed the stray trailer or small shacks of settler youth, it has never evicted a single "real outpost" or taken down "infrastructure and removed families" from the larger outposts that are in clear violation of Israeli law. Removing tens, possibly hundreds, of thousands of Israeli citizens from the settlements would require the sort of political will from the Israeli government that it has never shown toward the powerful settler movement. And the fear that taking on the settler movement could lead to civil war or political violence hangs over Israelis' heads. While hawkish Likud Prime Minister Ariel Sharon was able to unilaterally evict around 9,000 settlers from Gaza in 2005, this move was met by widespread protests from the settler community. It could be a harbinger of things to come if a similar eviction were attempted in the West Bank. Considering that Prime Minister Benjamin Netanyahu's government has, according to a recent report by Peace Now, doubled construction in West Bank settlements since last year's building moratorium expired, it's hard to imagine that the world is likely to witness a fit of courage from Israeli leaders regarding settlements anytime soon (or that the current Israeli government has any inclination to evict settlers at all). What makes the settlement question even more difficult is that to prevent contact between Israelis and Palestinians -- and thus potential violence -- Israel has constructed a host of checkpoints, security fences, and transit roads that crisscross the West Bank and significantly inhibit the movement and daily life of Palestinians. South of Ramallah, for instance, the Palestinian villages of al-Jib, Bir Nabala, and Beit Hanina al-Balad are literally surrounded on all sides by Israeli settlements and the separation barrier; in Hebron, 30,000 Palestinians have their mobility and economic activity severely restricted by the Israeli army's need to
protect fewer than 1,000 settlers. These types of situations are replicated across the territories and are bolstered by provocative actions and even violence from settlers, which are rarely punished as severely as Palestinian violence. As Israel continues to build more settlements, expand the barrier fence, and enforce the geographical divisions between Israelis and Palestinians, undoing these realities on the ground and the entanglement of both populations will become virtually impossible. Indeed, it likely already has.

"Israeli Security Concerns About a Palestinian State Are Ill-Founded."


Only if you ignore psychology. On the one hand, Israel
has the most powerful military in the Middle East; it possesses scores of nuclear weapons and it has the backing of the world's sole superpower, the United States. A few terrorist attacks, however deadly, won't change those facts. So on one level, the existential dangers of returning land to the Palestinians are overstated. On the other hand, that doesn't mean Israeli fears aren't real or legitimate. Many ordinary Israelis will tell you that the Palestinian response to Israeli concessions has been one of unremitting violence. After the signing of the Oslo Accords, Hamas (which opposed the accords) responded with suicide attacks that over the next several years killed more than 160 Israelis, wounded hundreds of others, and terrorized the population. After the so-called Camp David II negotiations between Israeli Prime Minister Ehud Barak and Yasir Arafat, brokered by U.S. President Bill Clinton, in which Israel made historic concessions to the Palestinians, the response was
even more violence. In 2002 alone, 220 Israelis were killed in suicide attacks. Israeli society shifted in response. At the height of the Second Intifada, from 2001 to 2003, life in Israel had become virtually intolerable. Citizens lived in a constant state of fear. The Israeli government responded with a series of steps that nominally improved security, including the military offensive against Hamas's and Fatah's military brigades, the construction of the separation barrier walling off Palestinians, and Sharon's unilateral withdrawal from Gaza. (Yet rocket attacks and terrorism from Gaza continued.) Still, Israelis were able to return to some sense of normalcy in their daily lives. As a result, many Israelis believe the status quo, no matter how untenable, is preferable to the alternative. The risk that turning the West Bank over to the Palestinians will result in a Hamas-led state, allied with Iran, bent on taking back all Palestinian land (the latter view is held by a majority of Israelis) is one that many Israelis are not prepared to take.

"The Palestinian Authority Has the Legitimacy to Negotiate a Peace Deal."


Perhaps, but it's a quickly diminishing attribute. Israeli leaders often say they lack a partner for
peace. This is true, but for reasons that might not be immediately clear. In the 18 years since Arafat's PLO returned from exile in Tunisia to run the Palestinian Authority (PA), its credibility -- not just among Israelis, but also Palestinians -- has declined. The reasons are many, from endemic corruption and cronyism to human rights abuses by the PA police, to a lack of faith in PA
President Mahmoud Abbas. Yet it's the PA's cooperation with Israel that has perhaps had the greatest effect on the erosion in confidence among individual Palestinians. Consider Al Jazeera's publication in January of the so-called Palestine Papers, which revealed the specifics of Israeli-Palestinian negotiations. The most politically damaging impact of the leaks was not necessarily that the PA had conceded the annexation of key Israeli settlements in East Jerusalem while receiving nothing in return. Rather, the worst part was the revelation that the PA had basically been coordinating on security issues with Israel during the 2008-2009 Gaza war to undermine Hamas, its main political rival. In addition, under the precepts of Oslo one of the key responsibilities of the PA police in the West Bank is to protect Israelis from terrorist infiltration. So it's not surprising that Palestinians ask: What have you done for us? All this has contributed to the view that the PA is a handmaiden of the Israeli occupation. The PA's loss of authority among its own people has no doubt contributed to its decision to declare a Palestinian state and seek recognition at the United Nations later this month. But this strategy carries existential risks for Abbas's government. If there is no progress on the peace process or Palestinians move no closer to an independent state after whatever happens at Turtle Bay, a third intifada is a distinct possibility. If violence increases, this would likely strengthen the hand of Hamas and weaken the PA further. In addition, even pursuing a unilateral declaration of statehood at the United Nations could lead to a GOP-led U.S. Congress cutting off funding to the PA, a move that would not only devastate the Palestinian economy, but could lead to the
dissolution of the PA altogether. This has not exactly left the Palestinian Authority in a position to negotiate from strength.

"A One-State Solution Is the Alternative."


We may be about to find out. At this point only the most
optimistic -- or perhaps deluded -- observers can imagine a near-term scenario under which Israelis and Palestinians sit down and negotiate a final two-state settlement to their conflict. The upcoming U.N. vote might have an impact on diplomatic relations between the two sides -- and may further isolate Israel internationally -- but it won't necessarily do much to change the realities on the ground. The result is that, largely by the force of inertia, Israelis and Palestinians are moving closer to a one-state solution. Some Palestinians and Israelis talk about a binational confederation in which each group has the same political rights. But this is highly unlikely to occur because it would almost certainly mean the end of Zionism and the dream of a Jewish state. On the other side of the spectrum, Israel could simply annex large swaths of the West Bank and leave the Palestinian Authority in a stateless limbo -- but at risk of significant international opprobrium. Then there is the most likely option: the maintenance of the status quo and a Zionist, Jewish state in which Israeli soldiers continue a military occupation of millions of Arabs with no political rights, but perhaps certain economic and social rights. As the two-state option slowly fades into oblivion, both sides will have to seriously contemplate an Israeli-Palestinian arrangement
that looks very similar to this. Indeed, as Daniel Levy, a senior research fellow at the New America Foundation, noted, emphasizing this uncomfortable reality might be the most useful role the United States can play right now -- namely, beginning a conversation with Israelis that makes clear that unless there is significant movement toward a Palestinian state, and soon, a one-state military occupation and increasing international isolation will be Israel's long-term future. Any other scenario, unfortunately, is increasingly difficult to envisage.

"American Kids Are Falling Behind."


Not really. Anybody seeking signs of American decline in the early 21st century
need look no further, it would seem, than the latest international educational testing results. The Program for International Student Assessment (PISA) -- the most-watched international measure in the field -- found that American high school students ranked 31st out of 65 economic regions in mathematics, 23rd in science, and 17th in reading. Students from the Chinese city of Shanghai, meanwhile, shot to the top of the ranking in all three categories -- and this was the first time they had taken the test. "For me, it's a massive wake-up call," Education Secretary Arne Duncan told the
Washington Post when the results were released in December. "Have we
ever been satisfied as Americans being average in anything? Is that our aspiration? Our goal should be absolutely to lead the world in education." The findings drove home the sense that the United States faced, as President Barack Obama put it in his State of the Union address, a "Sputnik moment." In fact, the U.S. education system has been having this sort of Sputnik moment since -- well, Sputnik. Six months after the 1957 Soviet satellite launch that shook the world, a
Life magazine cover story warned Americans of a "crisis in
education." An accompanying photo essay showed a 16-year-old boy in Chicago
sitting through undemanding classes, hanging out with his girlfriend, and attending swim-team practices, while his Moscow counterpart -- an aspiring physicist -- spent six days a week conducting advanced chemistry and physics experiments and studying English and Russian literature. The lesson was clear: Education was an international competition and one in which losing carried real consequences. The fear that American kids are falling behind the competition has persisted even as the competitors have changed, the budding Muscovite rocket scientist replaced with a would-be engineer in Shanghai. This latest showing of American 15-year-olds certainly isn't anything to brag about. But American students' performance is only cause for outright panic if you buy into the assumption that scholastic achievement is a zero-sum competition between nations, an intellectual arms race in which other countries' gain is necessarily the United States' loss. American competitive instincts notwithstanding, there is no reason for the United States to judge itself so harshly based purely on its position in the global pecking order. So long as American schoolchildren are not moving backward in absolute terms, America's relative place in global testing tables is less important than whether the country is improving teaching and learning enough to build the human capital it needs. And by this measure, the U.S. education system, while certainly in need of significant progress, doesn't look to be failing so spectacularly. The performance of American students in science and math has actually improved modestly since the last round of this international test in 2006, rising to the developed-country average in science while remaining only slightly below average in math. U.S. reading scores, in the middle of the pack for developed countries, are more or less unchanged since the most recent comparable tests in 2003. It would probably be unrealistic to expect much speedier progress. As Stuart Kerachsky, deputy commissioner of the National Center for Education Statistics, put it, "The needle doesn't move very far very fast in education."

"The United States Used to Have the World's Smartest Schoolchildren."

No, it didn't. Even at the height of U.S. geopolitical dominance and economic
strength, American students were never anywhere near the head of the class. In 1958, Congress responded to the Sputnik launch by passing the National Defense Education Act, which provided financial support for college students to study math, science, and foreign languages, and was accompanied by intense attention to raising standards in those subjects in American schools. But when the results from the first major international math test came out in 1967, the effort did not seem to have made much of a difference. Japan took first place out of 12 countries, while the United States finished near the bottom. By the early 1970s, American students were ranking last among industrialized countries in seven of 19 tests of academic achievement and never made it to first or even second place in any of them. A decade later, "A Nation at Risk," the
landmark 1983 report by the National Commission on Excellence in Education, cited these and other academic failings to buttress its stark claim that "if an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war." Each new cycle of panic and self-flagellation has brought with it a fresh crop of reformers touting a new solution to U.S. scholastic woes. A 1961 book by Arthur S. Trace Jr. called
What Ivan Knows That Johnny Doesn't, for instance,
suggested that American students were falling behind their Soviet peers because they weren't learning enough phonics and vocabulary. Today's anxieties are no different, with education wonks from across the policy spectrum enlisting the U.S. education system's sorry global ranking to make the case for their pet ideas. J. Michael Shaughnessy, president of the National Council of Teachers of Mathematics, argues that the latest PISA test "underscores the need for integrating reasoning and sense making in our teaching of mathematics." Randi Weingarten, head of the American Federation of Teachers, claims that the same results "tell us that if you don't make smart investments in teachers, respect them, or involve them in decision-making, as the top-performing countries do, students pay a price."

If Americans' ahistorical sense of their global decline prompts educators to come up with innovative new ideas, that's all to the good. But don't expect any of them to bring the country back to its educational golden age -- there wasn't one.

"Chinese Students Are Eating America's Lunch."


Only partly true. The biggest headline from the recent PISA results concerned
the first-place performance of students from Shanghai, and the inevitable "the Chinese are eating our lunch" meme was hard for American commentators and policymakers to resist. "While Shanghai's appearance at the top might have been a stunner, America's mediocre showing was no surprise," declared a
USA Today editorial.
China's educational prowess is real. Tiger moms are no myth -- Chinese students focus intensely on their schoolwork, with strong family support -- but these particular results don't necessarily provide compelling evidence of U.S. inferiority. Shanghai is a special case and hardly representative of China as a whole; it's a talent magnet that draws from all over China and benefits from extensive government investment in education. Scores for the United States and other countries, by contrast, reflect the performance of a geographic cross-section of teenagers. China -- a vast country whose hinterlands are poorer and less-educated than its coastal cities -- would likely see its numbers drop if it attempted a similar assessment. What about perennial front-runners like Finland and South Korea, whose students were again top scorers? These countries undoubtedly deserve credit for high educational accomplishment. In some areas -- the importance of carefully selected, high-quality teachers, for example -- they might well provide useful lessons for the United States. But they have nothing like the steady influx of immigrants, mostly Latinos, whose children attend American public schools. And unfortunately, the racial, ethnic, and socioeconomic demographics of the United States -- none of which have analogues in Finland or South Korea -- correlate closely with yawning achievement gaps in education. Non-Hispanic white and Asian pupils in the United
States do about as well on these international tests as students from high-scoring countries like Canada and Japan, while Latino and black teens -- collectively more than a third of the American students tested -- score only about as well as those from Turkey and Bulgaria, respectively. To explain is not to excuse, of course. The United States has an obligation to give all its citizens a high-quality education; tackling the U.S. achievement gap should be a moral imperative. But alarmist comparisons with other countries whose challenges are quite different from those of the United States don't help. Americans should be less worried about how their own kids compare with kids in Helsinki than how students in the Bronx measure up to their peers in Westchester County.

"The U.S. No Longer Attracts the Best and Brightest."


Wrong. While Americans have worried about their elementary and high school
performance for decades, they could reliably comfort themselves with the knowledge that at least their college education system was second to none. But today, American university leaders fret that other countries are catching up in, among other things, the market for international students, for whom the United States has long been the world's largest magnet. The numbers seem to bear this out. According to the most recent statistics, the U.S. share of foreign students fell from 24 percent in 2000 to just below 19 percent in 2008. Meanwhile, countries like Australia, Canada, and Japan saw increased market shares from their 2000 levels, though they are still far below the American numbers. The international distribution of mobile students is clearly changing, reflecting an ever more competitive global higher-education market. But there are many more foreign students in the United States than there were a decade ago -- 149,000 more in 2008 than in 2000, a 31 percent increase. What has happened is that there are simply many more of them overall studying outside their home countries. Some 800,000 students ventured abroad in 1975; that number reached 2 million in 2000 and ballooned to 3.3 million in 2008. In other words, the United States has a smaller piece of the pie, but the pie has gotten much, much larger.

And even with its declining share, the United States still commands 9 percentage points more of the market than its nearest competitor, Britain. For international graduate study, American universities are a particularly powerful draw in fields that may directly affect the future competitiveness of a country's economy: science, technology, engineering, and mathematics. In disciplines such as computer science and engineering, more than six in 10 doctoral students in American programs come from foreign countries. But that doesn't mean there's nothing to worry about. Although applications from international students to American graduate schools have recovered from their steep post-9/11 decline, the number of foreigners earning science and engineering doctorates at U.S. universities recently dropped for the first time in five years. American schools face mounting competition from universities in other countries, and the United States' less-than-welcoming visa policies may give students from overseas more incentive to go elsewhere. That's a loss for the United States, given the benefits to both its universities and its economy of attracting the best and brightest from around the world.

"American Universities Are Being Overtaken."


Not so fast. There's no question that the growing research aspirations of
emerging countries have eroded the long-standing dominance of North America, the European Union, and Japan. Asia's share of the world's research and development spending grew from 27 to 32 percent from 2002 to 2007, led mostly by China, India, and South Korea, according to a
2010 UNESCO report. The traditional
research leaders saw decreases during the same period. From 2002 to 2008, the U.S. proportion of articles in the Thomson Reuters Science Citation Index, the authoritative database of research publications, fell further than any other country's, from 30.9 to 27.7 percent. Meanwhile, the number of Chinese publications recorded in the same index more than doubled, as did the volume of scientific papers from Brazil, a country whose research institutions wouldn't have been on anyone's radar 20 years ago.

This shift in the geography of knowledge production is certainly noteworthy, but as with the international study market, the United States simply represents a proportionally smaller piece of a greatly expanded pie. R&D spending worldwide massively surged in the last decade, from $790 billion to $1.1 trillion, up 45 percent. And the declining U.S. share of global research spending still represented a healthy increase in constant dollars, from $277 billion in 2002 to $373 billion in 2007. U.S. research spending as a percentage of GDP over the same period was consistent and very high by global standards. The country's R&D investments still totaled more than all Asian countries' combined. Similarly, a declining U.S. share of the world's scientific publications may sound bad from an American point of view. But the total number of publications listed in the Thomson Reuters index surged by more than a third from 2002 to 2008. Even with a shrinking global lead, U.S. researchers published 46,000 more scientific articles in 2008 than they did six years earlier. And in any case, research discoveries don't remain within the borders of the countries where they occur -- knowledge is a public good, with little regard for national boundaries. Discoveries in one country's research institutions can be capitalized on by innovators elsewhere. Countries shouldn't be indifferent to the rise in their share of the research -- big breakthroughs can have positive economic and academic spillover effects -- but they also shouldn't fear the increase of cutting-edge discoveries elsewhere.

"The World Will Catch Up."


Maybe, but don't count on it anytime soon. And don't count on it
mattering. The global academic marketplace is without doubt growing more competitive than ever. Countries from China and South Korea to Saudi Arabia have made an urgent priority of creating world-class universities or restoring the lost luster of once-great institutions. And they're putting serious money into it: China is spending billions on expanding enrollment and improving its elite research institutions, while Saudi King Abdullah has funneled $10 billion into the brand-new
King Abdullah University of Science and Technology.
But the United States doesn't have just a few elite schools, like most of its ostensible competitors; it has a deep bench of outstanding institutions. A
Rand Corp. report found that nearly two-thirds of the most highly cited
articles in science and technology come from the United States, and seven in 10 Nobel Prize winners are employed by American universities. And the United States spends about 2.9 percent of its GDP on postsecondary education, about twice the percentage spent by China, the European Union, and Japan in 2006. But while the old U.S.-centric order of elite institutions is unlikely to be wholly overturned, it will gradually be shaken up in the coming decades. Asian countries in particular are making significant progress and may well produce some great universities within the next half-century, if not sooner. In China, for instance, institutions such as Tsinghua and Peking universities in Beijing and Fudan and Shanghai Jiao Tong universities in Shanghai could achieve real prominence on the world stage. But over the long term, exactly where countries sit in the university hierarchy will be less and less relevant, as Americans' understanding of who is "us" and who is "them" gradually changes. Already, a historically unprecedented level of student and faculty mobility has become a defining characteristic of global higher education. Cross-border scientific collaboration, as measured by the volume of publications by co-authors from different countries, has more than doubled in two decades. Countries like Singapore and Saudi Arabia are jump-starting a culture of academic excellence at their universities by forging partnerships with elite Western institutions such as Duke, MIT, Stanford, and Yale. The notion of just how much a university really has to be connected to a particular location is being rethought, too. Western universities, from Texas A&M to the Sorbonne, have garnered much attention by creating, admittedly with mixed results, some 160 branch campuses in Asia and the Middle East, many launched in the last decade. New York University recently went one step further by opening a full-fledged liberal arts campus in Abu Dhabi, part of what NYU President John Sexton envisions as a "global network university." One day, as University of Warwick Vice Chancellor Nigel Thrift suggests, we may see outright mergers between institutions -- and perhaps ultimately the university equivalent of multinational corporations.

In this coming era of globalized education, there is little place for the Sputnik alarms of the Cold War, the Shanghai panic of today, and the inevitable sequels lurking on the horizon. The international education race worth winning is the one to develop the intellectual capacity the United States and everyone else needs to meet the formidable challenges of the 21st century -- and who gets there first won't matter as much as we once feared.
