Monday, August 29, 2005

NY Times: Beijing's Quest for 2008: To Become Simply Livable

Beijing's Air Index was at 136 today. The blue skies of 2 weeks ago are a distant memory.

* * * * *
August 28, 2005
Beijing's Quest for 2008: To Become Simply Livable

By JIM YARDLEY
BEIJING, Aug. 27 - There is a placard beside Tiananmen Square that counts the days until the 2008 Summer Olympics, and every one of them would seem precious: Beijing must build or renovate 72 sports stadiums and training facilities, lay asphalt for 59 new roads and complete three new bridges by the opening ceremony.

It is a task that would overwhelm most cities, but Beijing is so efficient at pouring concrete that the International Olympic Committee has asked it to slow down rather than finish construction too soon. Far more difficult will be fulfilling Beijing's promise of playing host to a "green" Olympics as well as meeting a new goal in the city's revised master plan - to become "a city suitable for living."

"It's kind of a new concept for us," said Huang Yan, the well-regarded deputy director of the planning commission, when she announced the master plan in April. "We've never thought about this before."

For Beijing's 15.2 million inhabitants, that comment does not amount to much of a revelation. Beijing is clotted with gridlocked traffic as the number of cars has more than doubled in just six years. Air quality, after years of steady improvement, has leveled off recently in some categories and even worsened in others as Beijing continues to rank among the worst cities in the world for clean air. The city's water supply is so stressed that some experts have called for rationing.

Even with the daunting task of Olympic construction under way in the northern tier of the city, Beijing's troubleshooting mayor, Wang Qishan, has said his time is often dominated by non-Olympic concerns. In a speech early this year, Mr. Wang said he was besieged with public complaints, and "the hot topics are always rubbish, sewage, public toilets and traffic." The city has thousands of old and fetid public toilets that it is hurriedly trying to replace.

"Wherever I look," he said, according to the government's official English-language newspaper, China Daily, "there seem to be problems." He said the only person who did not complain to him was his wife.

It is uncertain whether Beijing's theoretical embrace of "livability" can be translated into real improvements in quality of life in a city that often feels like one enormous construction zone. (The city has roughly 8,000 construction sites.)

Critics are skeptical. They attribute Beijing's current predicament to previous failed planning policies and blame the government for the rampant development that has destroyed much of the historic old city while making a mess of the emerging new one.

"Bad planning over the past decades has already become a point of embarrassment for the city," said Wang Jun, whose best-selling book, "The Story of a City," documented the demolition of many of the city's old "hutong" neighborhoods, the ancient, densely populated enclaves of narrow, winding streets and crumbling courtyard homes.

Mr. Wang said Beijing never recovered from the 1950's, when Liang Sicheng, the country's pre-eminent architectural historian, warned that destroying the hutongs would lead to traffic and pollution and urged Mao to preserve Beijing's ancient city walls. Instead, Mao demolished them as a symbol of Chinese feudalism.

More recently, the hutongs have been steadily demolished, dislocating untold thousands of people, to make room for the thousands of development projects swallowing the city.

"Now, his predictions have come true," Mr. Wang said of the pollution and traffic.

The unrelenting pressure bearing down on Beijing and other Chinese cities is the influx of people. China is in the midst of one of the fastest periods of urbanization in history, with 300 million people expected to migrate to cities in the next 15 years.

The population of Beijing alone could surpass 21 million by 2020 if its growth continues at today's rate.

Ms. Huang said planners had been forced to rethink their priorities. Beijing actually encompasses a vast geographic area, much of it mountainous and dotted with rural villages.

The new master plan calls for creating suburban satellite towns to ease the population pressures on the city's center. Manufacturing, for example, would be clustered in the east, while high technology would be in the west. Ms. Huang said the city's limited access to water and the nationwide shortage of energy meant that smarter planning was now essential.

"In the past, we never thought of the capacity of resources," she said. "We only focused on development."

The ruling Communist Party considers the Olympics to be modern China's coming-out party to the rest of the world, and all of Beijing is looking toward 2008. The government has stipulated that major construction projects in the city be completed several months before the opening ceremony. The Olympic venues will be finished by the end of 2007.

But it is not clear whether Beijing will be able to meet the goals attached to its "green" Olympics promise. Already officials have committed to moving out some factories and closing others. Thousands of heavy polluting trucks and taxis have been replaced with vehicles that meet tougher fuel restrictions.

Last year, city officials rejoiced when Beijing met, albeit barely, its goal of 227 so-called Blue Sky days based on levels of three primary pollutants in the air.

But some residents were so skeptical that they accused officials of manipulating data. A recent report by the environmental office of the United States Embassy in Beijing acknowledged the increase in Blue Sky days but noted that the standard used is less stringent than in the United States. The report found that the number of days with "extremely unhealthy pollution levels" had jumped to 17 from 5 and that the overall pollution index had risen for the year.

The report also found that levels of particulate matter in the air were several times that of major American cities and cautioned that Beijing might not meet its goal of complying with World Health Organization air quality standards by 2008.

The spike in private automobiles - the number is now approaching three million - detracts from gains made by reining in polluting trucks and taxis. Private cars increasingly seem to be overwhelming the city, and officials are responding with a flurry of road building, even as subway lines and light rail are also being expanded. "It's basically using the Los Angeles model to solve the problems of New York," said Wang Jun, the author.

Los Angeles, of course, might provide a bit of inspiration for Beijing, having markedly reduced its air pollution levels.

Even now, there are moments when the pollution abates and Beijing is revealed for what it could be. In August, after a stretch of heavy rain, the sky was blue by any measure and the jagged mountains circling the city were on clear display.

But those days are rare. Not far from Tiananmen Square, the city's planning department offers a glimpse of what it hopes Beijing will look like by 2008, with a dazzling scale model: the business district is a sleek cluster of futuristic towers; the Olympic complex rises elegantly in the north, surrounded by green space; the ancient Forbidden City lies at the center.

It all seems orderly, even manageable, but perhaps that is because of a notable omission: the model has almost no people or cars.


Copyright 2005 The New York Times Company

Saturday, August 27, 2005

The New Yorker - The Climate of Man - III

THE CLIMATE OF MAN—III
by ELIZABETH KOLBERT
What can be done?
Issue of 2005-05-09
Posted 2005-05-02

In February, 2003, a series of ads on the theme of inundation began appearing on Dutch TV. The ads were sponsored by the Netherlands’ Ministry of Transport, Public Works, and Water Management, and they featured a celebrity weatherman named Peter Timofeeff. In one commercial, Timofeeff, who looks a bit like Albert Brooks and a bit like Gene Shalit, sat relaxing on the shore in a folding chair. “Sea level is rising,” he announced, as waves started creeping up the beach. He continued to sit and talk even as a boy who had been building a sandcastle abandoned it in panic. At the end of the ad, Timofeeff, still seated, was immersed in water up to his waist.
In another commercial, Timofeeff was shown wearing a business suit and standing by a bathtub. “These are our rivers,” he explained, climbing into the tub and turning on the shower full blast. “The climate is changing. It will rain more often, and more heavily.” Water filled the tub and spilled over the sides. It dripped through the floorboards, onto the head of his screeching wife, below. “We should give the water more space and widen the rivers,” he advised, reaching for a towel.
Both the beach-chair and the shower ads were part of a public-service campaign that also included radio spots, newspaper announcements, and free tote bags. Notwithstanding their comic tone—other commercials showed Timofeeff trying to start a motorboat in a cow pasture and digging a duck pond in his back yard—their message was sombre.
A quarter of the Netherlands lies below sea level, much of it on land wrested from either the North Sea or the Rhine or the River Meuse. Another quarter, while slightly higher, is still low enough that, in the natural course of events, it would regularly be flooded. What makes the country habitable is the world’s most sophisticated water-management system, which comprises more than ten thousand miles of dikes, dams, weirs, flood barriers, and artificial dunes, not to mention countless pumps, holding ponds, and windmills. (People in Holland like to joke, “God made the world, but the Dutch made the Netherlands.”)
Until recently, it was assumed that any threat to low-lying areas would be dealt with the same way such threats always had been: by raising the dikes, or by adding new ones. (The latest addition, the Maeslant barrier, which is supposed to protect Rotterdam from storm surges with the aid of two movable arms, each the size of a skyscraper, was completed in 1997.) But this is no longer the case. The very engineers who perfected the system have become convinced that it is unsustainable. After centuries of successfully manipulating nature, the Dutch, the ads warn, will have to switch course.
Eelke Turkstra runs a water-ministry program called Room for the River, which is just the sort of enterprise that Timofeeff was advocating when he climbed into the bathtub. A few months ago, I arranged to speak with Turkstra, and he suggested that we meet at a nature center along a branch of the Rhine known as the Nieuwe Merwede. The center featured an exhibit about the effects of climate change. One kid-friendly display allowed visitors to turn a crank and, in effect, drown the countryside. By 2100, the display showed, the Nieuwe Merwede could be running several feet above the local dikes.
From the nature center, Turkstra took me by car ferry across the river. On the other side, we drove through an area that was made up entirely of “polders”—land that has been laboriously reclaimed from the water. The polders were shaped like ice trays, with sloping sides and perfectly flat fields along the bottom. Every once in a while, there was a sturdy-looking farmhouse. The whole scene—the level fields, the thatched barns, even the gray clouds sitting on the horizon—could have been borrowed from a painting by Hobbema. Turkstra explained that the plan of Room for the River was to buy out the farmers who were living in the polders, then lower the dikes and let the Nieuwe Merwede flood when necessary. It was expected that the project would cost three hundred and ninety million dollars. Similar projects are under way in other parts of the Netherlands, and it is likely that in the future even more drastic measures will be necessary, including, some experts argue, the construction of a whole new outlet channel for the Rhine.
“Some people don’t get it,” Turkstra told me as we zipped along. “They think this project is stupid. But I think it’s stupid to continue in the old way.”

A few years ago, in an article in Nature, the Dutch chemist Paul Crutzen coined a term. No longer, he wrote, should we think of ourselves as living in the Holocene, as the period since the last glaciation is known. Instead, an epoch unlike any of those which preceded it had begun. This new age was defined by one creature—man—who had become so dominant that he was capable of altering the planet on a geological scale. Crutzen, a Nobel Prize winner, dubbed this age the Anthropocene. He proposed as its starting date the seventeen-eighties, the decade in which James Watt perfected his steam engine and, inadvertently, changed the history of the earth.
In the seventeen-eighties, ice-core records show, carbon-dioxide levels stood at about two hundred and eighty parts per million. Give or take ten parts per million, this was the same level that they had been at two thousand years earlier, in the era of Julius Caesar, and two thousand years before that, at the time of Stonehenge, and two thousand years before that, at the founding of the first cities. When, subsequently, industrialization began to drive up CO2 levels, they rose gradually at first—it took more than a hundred and fifty years to get to three hundred and fifteen parts per million—and then much more rapidly. By the mid-nineteen-seventies, they had reached three hundred and thirty parts per million, and, by the mid-nineteen-nineties, three hundred and sixty parts per million. Just in the past decade, they have risen by as much—twenty parts per million—as they did during the previous ten thousand years of the Holocene.
For every added increment of carbon dioxide, the earth will eventually experience a corresponding temperature rise, known as the equilibrium warming. If current trends continue, atmospheric CO2 will reach five hundred parts per million—nearly double pre-industrial levels—around the middle of the century. It is believed that the last time CO2 concentrations were that high was during the period known as the Eocene, some fifty million years ago. In the Eocene, crocodiles roamed Colorado and sea levels were nearly three hundred feet higher than they are today.
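For the numerically inclined, here is a rough sketch of what "equilibrium warming" means in back-of-the-envelope terms. The simplified forcing formula and the sensitivity value below are standard approximations, not figures from the article, and real estimates carry large uncertainty:

    # Rough sketch, in Python, of equilibrium warming at the 500 ppm level the
    # article cites. The forcing formula (5.35 * ln(C/C0) watts per square metre)
    # and the sensitivity value are common approximations, not article figures.
    import math

    C0 = 280.0          # pre-industrial CO2, parts per million
    C = 500.0           # mid-century concentration cited in the article
    sensitivity = 0.8   # assumed warming, deg C, per W/m^2 of forcing

    forcing = 5.35 * math.log(C / C0)
    print(f"extra forcing ~{forcing:.1f} W/m^2, equilibrium warming ~{sensitivity * forcing:.1f} C")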
For all practical purposes, the recent “carbonation” of the atmosphere is irreversible. Carbon dioxide is a persistent gas; it lasts for about a century. Thus, while it is possible to increase CO2 concentrations relatively quickly, by, say, burning fossil fuels or levelling forests, the opposite is not the case. The effect might be compared to driving a car equipped with an accelerator but no brakes.
The long-term risks of this path are well known. Barely a month passes without a new finding on the dangers posed by rising CO2 levels—to the polar ice cap, to the survival of the world’s coral reefs, to the continued existence of low-lying nations. Yet the world has barely even begun to take action. This is particularly true of the United States, which is the largest emitter of carbon dioxide by far. (The average American produces some twelve thousand pounds of carbon emissions annually.) As we delay, the opportunity to change course is slipping away. “We have only a few years, and not ten years but less, to do something,” the Dutch state secretary for the environment, Pieter van Geel, told me when I went to visit him in The Hague.

In climate-science circles, a future in which current emissions trends continue, unchecked, is known as “business as usual,” or B.A.U. A few years ago, Robert Socolow, a professor of engineering at Princeton, began to think about B.A.U. and what it implied for the fate of mankind. Socolow had recently become co-director of the Carbon Mitigation Initiative, a project funded by BP and Ford, but he still considered himself an outsider to the field of climate science. Talking to insiders, he was struck by the degree of their alarm. “I’ve been involved in a number of fields where there’s a lay opinion and a scientific opinion,” he told me when I went to talk to him shortly after returning from the Netherlands. “And, in most of the cases, it’s the lay community that is more exercised, more anxious. If you take an extreme example, it would be nuclear power, where most of the people who work in nuclear science are relatively relaxed about very low levels of radiation. But, in the climate case, the experts—the people who work with the climate models every day, the people who do ice cores—they are more concerned. They’re going out of their way to say, ‘Wake up! This is not a good thing to be doing.’ ”
Socolow, who is sixty-seven, is a trim man with wire-rimmed glasses and gray, vaguely Einsteinian hair. Although by training he is a theoretical physicist—he did his doctoral research on quarks—he has spent most of his career working on problems of a more human scale, like how to prevent nuclear proliferation or construct buildings that don’t leak heat. In the nineteen-seventies, Socolow helped design an energy-efficient housing development, in Twin Rivers, New Jersey. At another point, he developed a system—never commercially viable—to provide air-conditioning in the summer using ice created in the winter. When Socolow became co-director of the Carbon Mitigation Initiative, he decided that the first thing he needed to do was get a handle on the scale of the problem. He found that the existing literature on the subject offered almost too much information. In addition to B.A.U., a dozen or so alternative scenarios, known by code names like A1 and B1, had been devised; these all tended to jumble together in his mind, like so many Scrabble tiles. “I’m pretty quantitative, but I could not remember these graphs from one day to the next,” he recalled. He decided to try to streamline the problem, mainly so that he could understand it.
There are two ways to measure carbon-dioxide emissions. One is to count the full weight of the CO2; the other, favored by the scientific community, is to count just the weight of the carbon. Using the latter measure, global emissions last year amounted to seven billion metric tons. (The United States contributed more than twenty per cent of the total, or 1.6 billion metric tons of carbon.) “Business as usual” yields several different estimates of future emissions: a mid-range projection is that carbon emissions will reach 10.5 billion metric tons a year by 2029, and fourteen billion tons a year by 2054. Holding emissions constant at today’s levels means altering this trajectory so that fifty years from now seven billion of those fourteen billion tons of carbon aren’t being poured into the atmosphere.
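The two accounting conventions differ only by the ratio of molecular weights (44 for CO2, 12 for carbon), so converting between them is a one-line calculation. A small sketch, restating the article's own figures in both units:

    # Convert between "tons of carbon" and "tons of CO2" using the ratio of
    # molecular weights, 44/12 (about 3.67). Emissions figures are the article's.
    CO2_PER_TON_CARBON = 44.0 / 12.0

    global_carbon_gt = 7.0                                # billion metric tons of carbon per year
    global_co2_gt = global_carbon_gt * CO2_PER_TON_CARBON # same emissions counted as CO2
    us_share = 1.6 / global_carbon_gt                     # U.S. contribution cited in the article

    print(f"{global_carbon_gt} GtC/yr = {global_co2_gt:.1f} Gt CO2/yr; U.S. share ~{us_share:.0%}")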
Stabilizing CO2 emissions, Socolow realized, would be a monumental undertaking, so he decided to break the problem down into more manageable blocks, which he called “stabilization wedges.” For simplicity’s sake, he defined a stabilization wedge as a step that would be sufficient to prevent a billion metric tons of carbon per year from being emitted by 2054. Along with a Princeton colleague, Stephen Pacala, he eventually came up with fifteen different wedges—theoretically, at least eight more than would be necessary to stabilize emissions. These fall, very roughly, into three categories—wedges that deal with energy demand, wedges that deal with energy supply, and wedges that deal with “capturing” CO2 and storing it somewhere other than the atmosphere. Last year, the two men published their findings in a paper in Science which received a great deal of attention. The paper was at once upbeat—“Humanity already possesses the fundamental scientific, technical, and industrial know-how to solve the carbon and climate problem for the next half-century,” it declared—and deeply sobering. “There is no easy wedge” is how Socolow put it to me.
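Restated numerically, the wedge arithmetic is simple: business as usual roughly doubles annual emissions from seven to fourteen billion tons of carbon over fifty years, each wedge is worth a billion tons a year by 2054, so holding emissions flat takes seven wedges, which is why fifteen candidate wedges leave eight to spare. A minimal sketch using only the article's numbers:

    # Wedge arithmetic as described in the article.
    bau_2054 = 14.0     # GtC/yr under business as usual in 2054
    flat_path = 7.0     # GtC/yr if emissions are held at today's level
    wedge = 1.0         # GtC/yr avoided per wedge by 2054
    candidates = 15     # wedges identified by Socolow and Pacala

    wedges_needed = (bau_2054 - flat_path) / wedge
    print(f"wedges needed: {wedges_needed:.0f}; spare candidates: {candidates - wedges_needed:.0f}")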

Consider wedge No. 11. This is the photovoltaic, or solar-power, wedge—probably the most appealing of all the alternatives, at least in the abstract. Photovoltaic cells, which have been around for more than fifty years, are already in use in all sorts of small-scale applications and in some larger ones where the cost of connecting to the electrical grid is prohibitively high. The technology, once installed, is completely emissions-free, producing no waste products, not even water. Assuming that a thousand-megawatt coal-fired power plant produces about 1.5 million tons of carbon a year—in the future, coal plants are expected to become more efficient—to get a wedge out of photovoltaics would require enough cells to produce seven hundred thousand megawatts. Since sunshine is intermittent, two million megawatts of capacity is needed to produce that much power. This, it turns out, would require PV arrays covering a surface area of five million acres—approximately the size of Connecticut.
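The photovoltaic numbers can be reconstructed roughly as follows. The capacity factor of about one third is an assumption standing in for "sunshine is intermittent," and the acreage figure is simply the one the article gives:

    # Rough reconstruction of the photovoltaic-wedge arithmetic.
    carbon_per_1000mw_coal = 1.5e6    # tons of carbon per year from a 1,000 MW coal plant
    wedge_tons = 1.0e9                # tons of carbon per year one wedge must displace
    capacity_factor = 0.35            # assumed fraction of nameplate PV output actually delivered

    coal_output_mw = wedge_tons / carbon_per_1000mw_coal * 1000   # roughly 700,000 MW of coal output
    pv_capacity_mw = coal_output_mw / capacity_factor             # roughly 2,000,000 MW of PV capacity

    print(f"coal output to displace: ~{coal_output_mw:,.0f} MW")
    print(f"PV capacity required:    ~{pv_capacity_mw:,.0f} MW (about five million acres, per the article)")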
Wedge No. 10 is wind electricity. The standard output of a wind turbine is two megawatts, so to get a wedge out of wind power would require at least a million turbines. Other wedges present different challenges, some technical, some social. Nuclear power produces no carbon dioxide; instead, it generates radioactive waste, with all the attendant problems of storage, disposal, and international policing. Currently, there are four hundred and forty-one nuclear power plants in the world; one wedge would require doubling their capacity. There are also two automobile wedges. The first requires that every car in the world be driven half as much as it is today. The second requires that it be twice as efficient. (Since 1987, the fuel efficiency of passenger vehicles in the U.S. has actually declined, by more than five per cent.)
Three of the possible options are based on a technology known as “carbon capture and storage,” or C.C.S. As the name suggests, with C.C.S. carbon dioxide is “captured” at the source—presumably a power plant or other large emitter. Then it is injected at very high pressure into geological formations, such as depleted oil fields, underground. No power plants actually use C.C.S. at this point, nor is it certain that CO2 injected underground will remain there permanently; the world’s longest-running C.C.S. effort, maintained by the Norwegian oil company Statoil at a natural-gas field in the North Sea, has been operational for only eight years. One wedge of C.C.S. would require thirty-five hundred projects on the scale of Statoil’s.
In a world like today’s, where there is, for the most part, no direct cost to emitting CO2, none of Socolow’s wedges are apt to be implemented; this is, of course, why they represent a departure from “business as usual.” To alter the economics against carbon requires government intervention. Countries could set a strict limit on CO2, and then let emitters buy and sell carbon “credits.” (In the United States, this same basic strategy has been used successfully with sulfur dioxide in order to curb acid rain.) Another alternative is to levy a tax on carbon. Both of these options have been extensively studied by economists; using their work, Socolow estimates that the cost of emitting carbon would have to rise to around a hundred dollars a ton to provide a sufficient incentive to adopt many of the options he has proposed. Assuming that the cost were passed on to consumers, a hundred dollars a ton would raise the price of a kilowatt-hour of coal-generated electricity by about two cents, which would add roughly fifteen dollars a month to the average American family’s electricity bill. (In the U.S., more than fifty per cent of electricity is generated by coal.)
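The pass-through arithmetic works out roughly as follows. The carbon content per kilowatt-hour and the household's monthly coal-generated usage below are assumptions chosen to be consistent with the article's two-cent and fifteen-dollar figures, not numbers from the text:

    # Sketch of the $100-per-ton carbon price passed through to an electricity bill.
    carbon_price = 100.0                 # dollars per metric ton of carbon
    carbon_per_kwh = 0.2 / 1000.0        # assumed metric tons of carbon per kWh of coal power
    coal_kwh_per_month = 750.0           # assumed coal-generated usage per household, kWh

    surcharge_per_kwh = carbon_price * carbon_per_kwh
    monthly_increase = surcharge_per_kwh * coal_kwh_per_month
    print(f"surcharge ~{surcharge_per_kwh*100:.1f} cents/kWh, ~${monthly_increase:.0f} extra per month")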
All of Socolow’s calculations are based on the notion—clearly hypothetical—that steps to stabilize emissions will be taken immediately, or at least within the next few years. This assumption is key not only because we are constantly pumping more CO2 into the atmosphere but also because we are constantly building infrastructure that, in effect, guarantees that that much additional CO2 will be released in the future. In the U.S., the average new car gets about twenty miles to the gallon; if it is driven a hundred thousand miles, it will produce almost forty-three metric tons of carbon during its lifetime. A thousand-megawatt coal plant built today, meanwhile, is likely to last fifty years; if it is constructed without C.C.S. capability, it will emit some hundred million tons of carbon during its life. The overriding message of Socolow’s wedges is that the longer we wait—and the more infrastructure we build without regard to its impact on emissions—the more daunting the task of keeping CO2 levels below five hundred parts per million will become. Indeed, even if we were to hold emissions steady for the next half century, Socolow’s graphs show that much steeper cuts would be needed in the following half century to keep CO2 concentrations from exceeding that level. After a while, I asked Socolow whether he thought that stabilizing emissions was a politically feasible goal. He frowned.
“I’m always being asked, ‘What can you say about the practicability of various targets?’ ” he told me. “I really think that’s the wrong question. These things can all be done.
“What kind of issue is like this that we faced in the past?” he continued. “I think it’s the kind of issue where something looked extremely difficult, and not worth it, and then people changed their minds. Take child labor. We decided we would not have child labor and goods would become more expensive. It’s a changed preference system. Slavery also had some of those characteristics a hundred and fifty years ago. Some people thought it was wrong, and they made their arguments, and they didn’t carry the day. And then something happened and all of a sudden it was wrong and we didn’t do it anymore. And there were social costs to that. I suppose cotton was more expensive. We said, ‘That’s the trade-off; we don’t want to do this anymore.’ So we may look at this and say, ‘We are tampering with the earth.’ The earth is a twitchy system. It’s clear from the record that it does things that we don’t fully understand. And we’re not going to understand them in the time period we have to make these decisions. We just know they’re there. We may say, ‘We just don’t want to do this to ourselves.’ If it’s a problem like that, then asking whether it’s practical or not is really not going to help very much. Whether it’s practical depends on how much we give a damn.”

Marty Hoffert is a professor of physics at New York University. He is big and bearish, with a wide face and silvery hair. Hoffert got his undergraduate degree in aeronautical engineering, and one of his first jobs, in the mid-nineteen-sixties, was helping to develop the U.S.’s antiballistic-missile system. Eventually, he decided that he wanted to work on something, in his words, “more productive.” In this way, he became involved in climate research. Hoffert is primarily interested in finding new, carbon-free ways to generate energy. He calls himself a “technological optimist,” and a lot of his ideas about electric power have a wouldn’t-it-be-cool, Buck Rogers sound to them. On other topics, though, Hoffert is a killjoy.
“We have to face the quantitative nature of the challenge,” he told me one day over lunch at the N.Y.U. faculty club. “Right now, we’re going to just burn everything up; we’re going to heat the atmosphere to the temperature it was in the Cretaceous, when there were crocodiles at the poles. And then everything will collapse.”
Currently, the new technology that Hoffert is pushing is space-based solar power, or S.S.P. In theory, at least, S.S.P. involves launching into space satellites equipped with massive photovoltaic arrays. Once a satellite is in orbit, the array would unfold or, according to some plans, inflate. S.S.P. has two important advantages over conventional, land-based solar power. In the first place, there is more sunlight in space—roughly eight times as much, per unit of area—and, in the second, this sunlight is constant: satellites are not affected by clouds or by nightfall. The obstacles, meanwhile, are several. No full-scale test of S.S.P. has ever been conducted. (In the nineteen-seventies, NASA studied the idea of sending a photovoltaic array the size of Manhattan into space, but the project never, as it were, got off the ground.) Then, there is the expense of launching satellites. Finally, once the satellites are up, there is the difficulty of getting the energy down. Hoffert imagines solving this last problem by using microwave beams of the sort used by cell-phone towers, only much more tightly focussed. He believes, as he put it to me, that S.S.P. has a great deal of “long-term promise”; however, he is quick to point out that he is open to other ideas, like putting solar collectors on the moon, or using superconducting wires to transmit electricity with minimal energy loss, or generating wind power using turbines suspended in the jet stream. The important thing, he argues, is not which new technology will work but simply that some new technology be found. A few years ago, Hoffert published an influential paper in Science in which he argued that holding CO2 levels below five hundred parts per million would require a “Herculean” effort and probably could be accomplished only through “revolutionary” changes in energy production.
“The idea that we already possess the ‘scientific, technical, and industrial know-how to solve the carbon problem’ is true in the sense that, in 1939, the technical and scientific expertise to build nuclear weapons existed,” he told me, quoting Socolow. “But it took the Manhattan Project to make it so.”
Hoffert’s primary disagreement with Socolow, which both men took pains to point out to me and also took pains to try to minimize, is over the future trajectory of CO2 emissions. For the past several decades, as the world has turned increasingly from coal to oil, natural gas, and nuclear power, emissions of CO2 per unit of energy have declined, a process known as “decarbonization.” In the “business as usual” scenario that Socolow uses, it is assumed that decarbonization will continue. To assume this, however, is to ignore several emerging trends. Most of the growth in energy usage in the next few decades is due to occur in places like China and India, where supplies of coal far exceed those of oil or natural gas. (China, which has plans to build five hundred and sixty-two coal-fired plants by 2012, is expected to overtake the U.S. as the world’s largest carbon emitter around 2025.) Meanwhile, global production of oil and gas is expected to start to decline—according to some experts, in twenty or thirty years, and to others by the end of this decade. Hoffert predicts that the world will start to “recarbonize,” a development that would make the task of stabilizing carbon dioxide that much more difficult. By his accounting, recarbonization will mean that as many as twelve wedges will be needed simply to keep CO2 emissions on the same upward trajectory they’re on now. (Socolow readily acknowledges that there are plausible scenarios that would push up the number of wedges needed.) Hoffert told me that he thought the federal government should be budgeting between ten and twenty billion dollars a year for primary research into new energy sources. For comparison’s sake, he pointed out that the “Star Wars” missile-defense program, which still hasn’t yielded a workable system, has already cost the government nearly a hundred billion dollars.
A commonly heard argument against acting to curb global warming is that the options now available are inadequate. To his dismay, Hoffert often finds his work being cited in support of this argument, with which, he says, he vigorously disagrees. “I want to make it very clear,” he told me at one point. “We have to start working immediately to implement those elements that we know how to implement and we need to start implementing these longer-term programs. Those are not opposing ideas.”
“Let me say this,” he said at another point. “I’m not sure we can solve the problem. I hope we can. I think we have a shot. I mean, it may be that we’re not going to solve global warming, the earth is going to become an ecological disaster, and, you know, somebody will visit in a few hundred million years and find there were some intelligent beings who lived here for a while, but they just couldn’t handle the transition from being hunter-gatherers to high technology. It’s certainly possible. Carl Sagan had an equation—the Drake equation—for how many intelligent species there are in the galaxy. He figured it out by saying, How many stars are there, how many planets are there around these stars, what’s the probability that life will evolve on a planet, what’s the probability if you have life evolve of having intelligent species evolve, and, once that happens, what’s the average lifetime of a technological civilization? And that last one is the most sensitive number. If the average lifetime is about a hundred years, then probably, in the whole galaxy of four hundred billion stars, there are only a few that have intelligent civilizations. If the lifetime is several million years, then the galaxy is teeming with intelligent life. It’s sort of interesting to look at it that way. And we don’t know. We could go either way.”
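For the curious, the equation Hoffert is paraphrasing is usually written as a product of a star-formation rate, a chain of fractions, and the average lifetime L of a technological civilization. Every parameter value in the sketch below is purely illustrative, and only L is varied, since that is the term he calls decisive:

    # Illustrative Drake-equation sketch; all parameter values are assumptions.
    def drake(R=7, f_p=0.5, n_e=2, f_l=0.33, f_i=0.01, f_c=0.01, L=100):
        # R: star-formation rate per year; f_p: fraction of stars with planets;
        # n_e: habitable planets per such star; f_l, f_i, f_c: fractions where life,
        # intelligence, and detectable technology arise; L: civilization lifetime in years.
        return R * f_p * n_e * f_l * f_i * f_c * L

    for L in (100, 1_000_000, 10_000_000):
        print(f"L = {L:>10,} years -> ~{drake(L=L):,.2f} civilizations")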

In theory, at least, the world has already committed itself to addressing global warming, a commitment that dates back more than a decade. In June of 1992, the United Nations held the so-called Earth Summit, in Rio de Janeiro. There, representatives from virtually every nation on earth met to discuss the U.N. Framework Convention on Climate Change, which had as its sweeping objective the “stabilization of greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous anthropogenic”—man-made—“interference with the climate system.” One of the early signatories was President George H. W. Bush, who, while in Rio, called on world leaders to translate “the words spoken here into concrete action to protect the planet.” Three months later, Bush submitted the Framework Convention to the U.S. Senate, which approved it by unanimous consent. Ultimately, the treaty was ratified by a hundred and sixty-five countries.
What “dangerous anthropogenic interference,” or D.A.I., consists of was not precisely defined in the Framework Convention, although there are, it is generally agreed, a number of scenarios that would fit the bill—climate change dramatic enough to destroy entire ecosystems, for instance, or severe enough to disrupt the world’s food supply. The disintegration of one of the planet’s remaining ice sheets is often held up as the exemplary climate disaster; were the Greenland or the West Antarctic Ice Sheet to be destroyed, sea levels around the world would rise by at least fifteen feet, inundating areas where today hundreds of millions of people live. (Were both ice sheets to disintegrate, global sea levels would rise by thirty-five feet.) It could take hundreds, perhaps even thousands, of years for either of the ice sheets to disappear entirely, but, once the disintegration was under way, it would start to feed on itself, most likely becoming irreversible. D.A.I. is understood, therefore, to refer not to the end of the process but to the very beginning, which is to say, to the point at which greenhouse-gas levels became high enough to set disaster in motion.
Among the stipulations of the Framework Convention was that the parties meet regularly to assess their progress. (These meetings became known as the Conference of the Parties, or C.O.P., sessions.) As it turned out, there was hardly any progress to assess. Article 4, paragraph 2, subparagraph b of the convention instructs industrialized nations to “aim” to reduce their greenhouse-gas emissions to 1990 levels. By 1995, the collective emissions from these nations were still rising. (Virtually the only countries that had succeeded in returning to 1990 levels were some former members of the Soviet bloc, and this was because their economies were in free fall.) Several rounds of often bitter negotiations followed, culminating in an eleven-day session at the Kyoto International Conference Hall in December, 1997.
Technically speaking, the agreement that emerged from that session is an addendum to the Framework Convention. (Its full title is the Kyoto Protocol to the United Nations Framework Convention on Climate Change.) For lofty exhortations, the Kyoto Protocol substitutes mandatory commitments. These commitments apply to industrialized, or so-called Annex 1, nations, a group that includes the United States, Canada, Japan, Europe, Australia, New Zealand, and several countries of the erstwhile Eastern bloc. Different Annex 1 nations have slightly different obligations, based on a combination of historical and political factors. The European Union nations, for example, are supposed to reduce their greenhouse-gas emissions eight per cent below 1990 levels. The U.S. has a target of seven per cent below 1990 levels, and Japan has a target of six per cent below. The treaty covers five greenhouse gases in addition to CO2—methane, nitrous oxide, hydrofluorocarbons, perfluorocarbons, and sulfur hexafluoride—which, for the purposes of accounting, are converted into units known as “carbon-dioxide equivalents.” Industrialized nations can meet their targets, in part, by buying and selling emissions credits and by investing in “clean development” projects in developing, or so-called non-Annex 1, nations. This second group includes emergent industrial powers like China and India, oil-producing states like Saudi Arabia and Kuwait, and nations with mostly subsistence economies, like Sudan. Non-Annex 1 nations have no obligation to reduce their emissions during the period covered by the protocol, which ends in 2012.

In political terms, global warming might be thought of as the tragedy of the commons writ very, very large. The goal of stabilizing CO2 concentrations effectively turns emissions into a limited resource, which nobody owns but everybody with a book of matches has access to.
Even as Kyoto was being negotiated, it was clear that the treaty was going to face stiff opposition in Washington. In July of 1997, Senator Chuck Hagel, Republican of Nebraska, and Senator Robert Byrd, Democrat of West Virginia, introduced a “sense of the Senate” resolution that, in effect, warned the Clinton Administration against the direction that the talks were taking. The so-called Byrd-Hagel Resolution stated that the U.S. should reject any agreement that committed it to reducing emissions unless concomitant obligations were imposed on developing countries as well. The Senate approved the resolution by a vote of 95-0, an outcome that reflected lobbying by both business and labor. Although the Clinton Administration eventually signed Kyoto, it never submitted the protocol to the Senate for ratification, citing the need for participation by “key developing nations.”
From a certain perspective, the logic behind the Byrd-Hagel Resolution is unimpeachable. Emissions controls cost money, and this cost has to be borne by somebody. If the U.S. were to agree to limit its greenhouse gases while economic competitors like China and India were not, then American companies would be put at a disadvantage. “A treaty that requires binding commitments for reduction of emissions of greenhouse gases for the industrial countries but not developing countries will create a very damaging situation for the American economy” is how Richard Trumka, the secretary-treasurer of the A.F.L.-C.I.O., put it when he travelled to Kyoto to lobby against the protocol. It is also true that an agreement that limits carbon emissions in some countries and not in others could result in a migration, rather than an actual reduction, of CO2 emissions. (Such a possibility is known in climate parlance as “leakage.”)
From another perspective, however, the logic of Byrd-Hagel is deeply, even obscenely, self-serving. Suppose for a moment that the total anthropogenic CO2 that can be emitted into the atmosphere were a big ice-cream cake. If the aim is to keep concentrations below five hundred parts per million, then roughly half that cake has already been consumed, and, of that half, the lion’s share has been polished off by the industrialized world. To insist now that all countries cut their emissions simultaneously amounts to advocating that industrialized nations be allocated most of the remaining slices, on the ground that they’ve already gobbled up so much. In a year, the average American produces the same greenhouse-gas emissions as four and a half Mexicans, or eighteen Indians, or ninety-nine Bangladeshis. If both the U.S. and India were to reduce their emissions proportionately, then the average Bostonian could continue indefinitely producing eighteen times as much greenhouse gases as the average Bangalorean. But why should anyone have the right to emit more than anyone else? At a climate meeting in New Delhi three years ago, Atal Bihari Vajpayee, then the Indian prime minister, told world leaders, “Our per capita greenhouse gas emissions are only a fraction of the world average and an order of magnitude below that of many developed countries. We do not believe that the ethos of democracy can support any norm other than equal per capita rights to global environmental resources.”
Outside the U.S., the decision to exempt developing nations from Kyoto’s mandates was generally regarded as an adequate—if imperfect—solution. The point was to get the process started, and to persuade countries like China and India to sign on later. This “two-world” approach had been employed—successfully—in the nineteen-eighties to phase out chlorofluorocarbons, the chemicals responsible for depleting atmospheric ozone. Pieter van Geel, the Dutch environment secretary, who is a member of the Netherlands’ center-right Christian Democratic Party, described the European outlook to me as follows: “We cannot say, ‘Well, we have our wealth, based on the use of fossil fuels for the last three hundred years, and, now that your countries are growing, you may not grow at this rate, because we have a climate-change problem.’ We should show moral leadership by giving the example. That’s the only way we can ask something of these other countries.”

The Kyoto Protocol finally went into effect on February 16th of this year. In many cities, the event was marked by celebration; the city of Bonn hosted a reception in the Rathaus, Oxford University held an “Entry Into Force” banquet, and in Hong Kong there was a Kyoto prayer meeting. As it happened, that day, an exceptionally warm one in Washington, D.C., I went to speak to the Under-Secretary of State for Global Affairs, Paula Dobriansky.
Dobriansky is a slight woman with shoulder-length brown hair and a vaguely anxious manner. Among her duties is explaining the Bush Administration’s position on global warming to the rest of the world; in December, for example, she led the U.S. delegation to the tenth Conference of the Parties, which was held in Buenos Aires. Dobriansky began by assuring me that the Administration took the issue of climate change “very seriously.” She went on, “Also let me just add, because in terms of taking it seriously, not only stating to you that we take it seriously, we have engaged many countries in initiatives and efforts, whether they are bilateral initiatives—we have some fourteen bilateral initiatives—and in addition we have put together some multilateral initiatives. So we view this as a serious issue.”
Besides the U.S., the only other major industrialized nation that has rejected Kyoto—and, with it, mandatory cuts in emissions—is Australia. I asked Dobriansky how she justified the U.S.’s stance to its allies. “We have a common goal and objective as parties to the U. N. Framework Convention on Climate Change,” she told me. “Where we differ is on what approach we believe is and can be the most effective.”
Running for President in 2000, George W. Bush called global warming “an issue that we need to take very seriously.” He promised, if elected, to impose federal limits on CO2. Soon after his inauguration, he sent the head of the Environmental Protection Agency, Christine Todd Whitman, to a meeting of environment ministers from the world’s leading industrialized nations, where she elaborated on his position. Whitman assured her colleagues that the new President believed global warming to be “one of the greatest environmental challenges that we face” and that he wanted to “take steps to move forward.” Ten days after her presentation, Bush announced that not only was he withdrawing the U.S. from the ongoing negotiations over Kyoto—the protocol had left several complex issues of implementation to be resolved later—he was now opposed to any mandatory curbs on carbon dioxide. Explaining his change of heart, Bush asserted that he no longer believed that CO2 limits were justified, owing to the “state of scientific knowledge of the causes of, and solutions to, global climate change,” which he labelled “incomplete.” (Former Treasury Secretary Paul O’Neill, who backed the President’s original position, has speculated publicly that the reversal was engineered by Vice-President Dick Cheney.)
The following year, President Bush came forward with the Administration’s current position on global warming. Central to this policy is a reworking of the key categories. Whereas Kyoto and the original Framework Convention aim at controlling greenhouse-gas emissions, the President’s policy targets greenhouse-gas “intensity.” Bush has declared his approach preferable because it recognizes “that a nation that grows its economy is a nation that can afford investments and new technology.”
Greenhouse-gas intensity is not a quantity that can be measured directly. Rather, it is a ratio that relates emissions to economic output. Say, for example, that one year a business produces a hundred pounds of carbon and a hundred dollars’ worth of goods. Its greenhouse-gas intensity in that case would be one pound per dollar. If the next year that company produces the same amount of carbon but an extra dollar’s worth of goods, its intensity will have fallen by one per cent. Even if it doubles its total emissions of carbon, a company—or a country—can still claim a reduced intensity provided that it more than doubles its output of goods. (Typically, a country’s greenhouse-gas intensity is measured in terms of tons of carbon per million dollars’ worth of gross domestic product.)
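The arithmetic of that example, and the way doubled emissions can still count as an intensity "reduction," looks like this in a quick sketch using the article's own numbers:

    # Greenhouse-gas intensity: emissions divided by economic output.
    def intensity(carbon_pounds, output_dollars):
        return carbon_pounds / output_dollars

    year_one = intensity(100, 100)   # 1.00 pound of carbon per dollar
    year_two = intensity(100, 101)   # same carbon, one more dollar of goods
    print(f"change in intensity: {(year_two - year_one) / year_one:.1%}")   # about -1%

    # Doubling emissions still "reduces intensity" if output more than doubles.
    print(intensity(200, 205) < year_one)   # True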
To focus on greenhouse-gas intensity is to give a peculiarly sunny account of the United States’ situation. Between 1990 and 2000, the U.S.’s greenhouse-gas intensity fell by some seventeen per cent, owing to several factors, including the shift toward a more service-based economy. Meanwhile, over-all emissions rose by some twelve per cent. (In terms of greenhouse-gas intensity, the U.S. actually performs better than many Third World nations, because even though we consume a lot more energy, we also have a much larger G.D.P.) In February of 2002, President Bush set the goal of reducing the country’s greenhouse-gas intensity by eighteen per cent over the following ten years. During that same decade, the Administration expects the American economy to grow by three per cent annually. If both expectations are met, over-all emissions of greenhouse gases will rise by about twelve per cent.
The Administration’s plan, which relies almost entirely on voluntary measures, has been characterized by critics as nothing more than a subterfuge—“a total charade” is how Philip Clapp, the president of the Washington-based National Environmental Trust, once put it. Certainly, if the goal is to prevent “dangerous anthropogenic interference,” then greenhouse-gas intensity is the wrong measure to use. (Essentially, the President’s approach amounts to following the path of “business as usual.”) The Administration’s response to such criticism is to attack its premise. “Science tells us that we cannot say with any certainty what constitutes a dangerous level of warming and therefore what level must be avoided,” Dobriansky declared recently. When I asked her how, in that case, the U.S. could support the U.N. Framework Convention’s aim of averting D.A.I., she answered by saying—twice—“We predicate our policies on sound science.”

Earlier this year, the chairman of the Senate Environment and Public Works Committee, James Inhofe, gave a speech on the Senate floor, which he entitled “An Update on the Science of Climate Change.” In the speech, Inhofe, an Oklahoma Republican, announced that “new evidence” had come to light that “makes a mockery” of the notion that human-induced warming is occurring. The Senator, who has called global warming “the greatest hoax ever perpetrated on the American people,” went on to argue that this important new evidence was being suppressed by “alarmists” who view anthropogenic warming as “an article of religious faith.” One of the authorities that Inhofe repeatedly cited in support of his claims was the fiction writer Michael Crichton.
It was an American scientist, Charles David Keeling, who, in the nineteen-fifties, developed the technology to measure CO2 levels precisely, and it was American researchers who, working out of Hawaii’s Mauna Loa Observatory, first showed that these levels were steadily rising. In the half century since then, the U.S. has contributed more than any other nation to the advancement of climate science, both theoretically, through the work of climate modellers, and experimentally, through field studies conducted on every continent.
At the same time, the U.S. is also the world’s chief purveyor of the work of so-called global-warming “skeptics.” The ideas of these skeptics are published in books with titles like “The Satanic Gases” and “Global Warming and Other Eco-Myths” and then circulated on the Web by groups like Tech Central Station, which is sponsored by, among others, ExxonMobil and General Motors. While some skeptics’ organizations argue that global warming isn’t real, or at least hasn’t been proved—“Predicting weather conditions a day or two in advance is hard enough, so just imagine how hard it is to forecast what our climate will be,” Americans for Balanced Energy Choices, a lobbying organization funded by mining and power companies, declares on its Web site—others maintain that rising CO2 levels are actually cause for celebration.
“Carbon dioxide emissions from fossil fuel combustion are beneficial to life on earth,” the Greening Earth Society, an organization created by the Western Fuels Association, a utility group, states. Atmospheric levels of seven hundred and fifty parts per million—nearly triple pre-industrial levels—are nothing to worry about, the society maintains, because plants like lots of CO2, which they need for photosynthesis. (Research on this topic, the group’s Web site acknowledges, has been “frequently denigrated,” but “it’s exciting stuff” and provides an “antidote to gloom-and-doom about potential changes in earth’s climate.”)
In legitimate scientific circles, it is virtually impossible to find evidence of disagreement over the fundamentals of global warming. This fact was neatly demonstrated last year by Naomi Oreskes, a professor of history and science studies at the University of California at San Diego. Oreskes conducted a study of the more than nine hundred articles on climate change published in refereed journals between 1993 and 2003 and subsequently made available on a leading research database. Of these, she found that seventy-five per cent endorsed the view that anthropogenic emissions were responsible for at least some of the observed warming of the past fifty years. The remaining twenty-five per cent, which dealt with questions of methodology or climate history, took no position on current conditions. Not a single article disputed the premise that anthropogenic warming is under way.
Still, pronouncements by groups like the Greening Earth Society and politicians like Senator Inhofe help to shape public discourse on climate change in this country. And this is clearly their point. A few years ago, the pollster Frank Luntz prepared a strategy memo for Republican members of Congress, coaching them on how to deal with a variety of environmental issues. (Luntz, who first made a name for himself by helping to craft Newt Gingrich’s “Contract with America,” has been described as “a political consultant viewed by Republicans as King Arthur viewed Merlin.”) Under the heading “Winning the Global Warming Debate,” Luntz wrote, “The scientific debate is closing (against us) but not yet closed. There is still a window of opportunity to challenge the science.” He warned, “Voters believe that there is no consensus about global warming in the scientific community. Should the public come to believe that the scientific issues are settled, their views about global warming will change accordingly.” Luntz also advised, “The most important principle in any discussion of global warming is your commitment to sound science.”

It is in this context, and really only in this context, that the Bush Administration’s conflicting claims about the science of global warming make any sense. Administration officials are quick to point to the scientific uncertainties that remain about global warming, of which there are many. But where there is broad scientific agreement they are reluctant to acknowledge it. “When we make decisions, we want to make sure we do so on sound science,” the President said, announcing his new approach to global warming in February, 2002. Just a few months later, the Environmental Protection Agency delivered a two-hundred-and-sixty-three-page report to the U.N. which stated that “continuing growth in greenhouse gas emissions is likely to lead to annual average warming over the United States that could be as much as several degrees Celsius (roughly 3 to 9 degrees Fahrenheit) during the 21st century.” The President dismissed the report—the product of years of work by federal researchers—as something “put out by the bureaucracy.” The following spring, the E.P.A. made another effort to give an objective summary of climate science, in a report on the state of the environment. The White House interfered so insistently in the writing of the global-warming section—at one point, it tried to insert excerpts from a study partly financed by the American Petroleum Institute—that, in an internal memo, agency staff members complained that the section “no longer accurately represents scientific consensus.” (When the E.P.A. finally published the report, the climate-science section was missing entirely.) Just two months ago, a top official with the federal Climate Change Science Program announced that he was resigning, owing to differences with the White House. The official, Rick Piltz, said that he was disturbed that the Administration insisted on vetting climate-science reports, “rather than asking independent scientists to write them and let the chips fall where they may.”

The day after the Kyoto Protocol took effect, I went to the United Nations to attend a conference entitled, appositely, “One Day After Kyoto.” The conference, whose subtitle was “Next Steps on Climate,” was held in a large room with banks of curved desks, each equipped with a little plastic earpiece. The speakers included scientists, insurance-industry executives, and diplomats from all over the world, among them the U.N. Ambassador from the tiny Pacific island nation of Tuvalu, who described how his country was in danger of simply disappearing. Britain’s permanent representative to the U.N., Sir Emyr Jones Parry, began his remarks to the crowd of two hundred or so by stating, “We can’t go on as we are.”
When the U.S. withdrew from negotiations over Kyoto, in 2001, the entire effort nearly collapsed. According to the protocol’s elaborate ratification mechanism, in order to take effect it had to be approved by countries responsible for at least fifty-five per cent of the industrialized world’s CO2 emissions. All on its own, America accounts for thirty-four per cent of those emissions. European leaders spent more than three years working behind the scenes, lining up support from the remaining industrialized nations. The crucial threshold was finally crossed this past October, when the Russian Duma voted in favor of ratification. The Duma’s vote was understood to be a condition of European backing for Russia’s bid to join the World Trade Organization. (“russia forced to ratify kyoto protocol to become w.t.o. member,” read the headline in Pravda.)
As speaker after speaker at the U.N. conference noted, Kyoto is only the first step in a long process. Even if every country—including the U.S.—were to fulfill its obligations under the protocol before it lapses in 2012, CO2 concentrations in the atmosphere would still reach dangerous levels. Kyoto merely delays this outcome. The “next step on climate” requires, among other things, substantive commitments from countries like China and India. So long as U.S. emissions continue to grow, essentially unchecked, obtaining these commitments seems next to impossible. In this way, the U.S., having failed to defeat Kyoto, may be in the process of doing something even more damaging: ruining the chances of reaching a post-Kyoto agreement. “The blunt reality is that, unless America comes back into some form of international consensus, it is very hard to make progress” is how Britain’s Prime Minister, Tony Blair, diplomatically put it at a recent press conference.
Astonishingly, standing in the way of progress seems to be Bush’s goal. Paula Dobriansky explained the Administration’s position to me as follows: While the rest of the industrialized world is pursuing one strategy (emissions limits), the U.S. is pursuing another (no emissions limits), and it is still too early to say which approach will work best. “It is essential to really implement these programs and approaches now and to take stock of their effectiveness,” she said, adding, “We think it is premature to talk about future arrangements.” At C.O.P.-10, in Buenos Aires, many delegations pressed for a preliminary round of meetings so that work could start on mapping out Kyoto’s successor. The U.S. delegation opposed these efforts so adamantly that finally the Americans were asked to describe, in writing, what sort of meeting they would find acceptable. They issued half a page of conditions, one of which was that the session “shall be a one-time event held during a single day.” Another condition was, paradoxically, that, if they were going to discuss the future, the future would have to be barred from discussion; presentations, they wrote, should be limited to “an information exchange” on “existing national policies.” Annie Petsonk, a lawyer with the advocacy group Environmental Defense, who previously worked for the Administration of George Bush, Sr., attended the talks in Buenos Aires. She recalled the effect that the memo had on the members of the other delegations: “They were ashen.”
European leaders have made no secret of their dismay at the Administration’s stance. “It’s absolutely obvious that global warming has started,” France’s President, Jacques Chirac, said after attending last year’s G-8 summit with Bush. “And so we have to act responsibly, and, if we do nothing, we would bear a heavy responsibility. I had the chance to talk to the United States President about this. To tell you that I convinced him would be a total exaggeration, as you can imagine.” Blair, who currently holds the presidency of the G-8, recently warned that only “timely action” on climate change will avert “disaster.” He has promised to make the issue one of the top items on the agenda of this year’s summit, to be held in Scotland in July, but no one seems to be expecting a great deal to come of it. While attending a meeting in London this spring, the head of the White House Council on Environmental Quality, James Connaughton, announced that he wasn’t yet convinced that anthropogenic warming was a problem. “We are still working on the issue of causation, the extent to which humans are a factor,” he said.

The town of Maasbommel, sixty miles southeast of Amsterdam, is a popular tourist destination along the banks of the River Meuse. Every summer, it is visited by thousands of people who come to go boating and camping. Thanks to the risk of flooding, building is restricted along the river, but a few years ago one of the Netherlands’ largest construction firms, Dura Vermeer, received permission to turn a former R.V. park into a development of “amphibious homes.” The first of these were completed last fall, and a few months later I went to see them.
The amphibious homes all look alike. They are tall and narrow, with flat sides and curved metal roofs, so that, standing next to one another, they resemble a row of toasters. Each one is moored to a metal pole and sits on a set of hollow concrete pontoons. Assuming that all goes according to plan, when the Meuse floods the homes will bob up and then, when the water recedes, they will gently be deposited back on land. Dura Vermeer is also working to construct buoyant roads and floating greenhouses. While each of these projects represents a somewhat different engineering challenge, they have a common goal, which is to allow people to continue to inhabit areas that, periodically at least, will be inundated. The Dutch, because of their peculiar vulnerability, can’t afford to misjudge climate change, or to pretend that by denying it they can make it go away. “There is a flood market emerging,” Chris Zevenbergen, Dura Vermeer’s environmental director, told me. Half a dozen families were already occupying their amphibious homes when I visited Maasbommel. Anna van der Molen, a nurse and mother of four, gave me a tour of hers. She said that she expected that in the future people all over the world would live in floating houses, since, as she put it, “the water is coming up.”
Resourcefulness and adaptability are, of course, essential human qualities. People are always imagining new ways to live, and then figuring out ways to remake the world to suit what they’ve imagined. This capacity has allowed us, collectively, to overcome any number of threats in the past, some imposed by nature, some by ourselves. It could be argued, taking this long view, that global warming is just one more test in a sequence that already stretches from plague and pestilence to the prospect of nuclear annihilation. If, at this moment, the bind that we’re in appears insoluble, once we’ve thought long and hard enough about it we’ll find—or maybe float—our way clear.
But it’s also possible to take an even longer view of the situation. We now have detailed climate records going back four full glacial cycles. What these records show, in addition to a clear correlation between CO2 levels and global temperatures, is that the last glaciation was a period of frequent and traumatic climate swings. During that period, which lasted nearly a hundred thousand years, humans who were, genetically speaking, just like ourselves wandered the globe, producing nothing more permanent than isolated cave paintings and large piles of mastodon bones. Then, ten thousand years ago, at the start of the Holocene, the climate changed. As the weather settled down, so did we. People built villages, towns, and, finally, cities, along the way inventing all the basic technologies—agriculture, metallurgy, writing—that future civilizations would rely upon. These developments would not have been possible without human ingenuity, but, until the climate coöperated, ingenuity, it seems, wasn’t enough.
Climate records also show that we are steadily drawing closer to the temperature peaks of the last interglacial, when sea levels were some fifteen feet higher than they are today. Just a few degrees more and the earth will be hotter than it has been at any time since our species evolved. Scientists have identified a number of important feedbacks in the climate system, many of which are not fully understood; in general, they tend to take small changes to the system and amplify them into much larger forces. Perhaps we are the most unpredictable feedback of all. No matter what we do at this point, global temperatures will continue to rise in the coming decades, owing to the gigatons of extra CO2 already circulating in the atmosphere. With more than six billion people on the planet, the risks of this are obvious. A disruption in monsoon patterns, a shift in ocean currents, a major drought—any one of these could easily produce streams of refugees numbering in the millions. As the effects of global warming become more and more apparent, will we react by finally fashioning a global response? Or will we retreat into ever narrower and more destructive forms of self-interest? It may seem impossible to imagine that a technologically advanced society could choose, in essence, to destroy itself, but that is what we are now in the process of doing.
(This is the third part of a three-part article.)

The New Yorker - The Climate of Man - II

THE CLIMATE OF MAN—II
by ELIZABETH KOLBERT
The curse of Akkad.
Issue of 2005-05-02
Posted 2005-04-25

The world’s first empire was established forty-three hundred years ago, between the Tigris and Euphrates Rivers. The details of its founding, by Sargon of Akkad, have come down to us in a form somewhere between history and myth. Sargon—Sharru-kin, in the language of Akkadian—means “true king”; almost certainly, though, he was a usurper. As a baby, Sargon was said to have been discovered, Moses-like, floating in a basket. Later, he became cupbearer to the ruler of Kish, one of ancient Babylonia’s most powerful cities. Sargon dreamed that his master, Ur-Zababa, was about to be drowned by the goddess Inanna in a river of blood. Hearing about the dream, Ur-Zababa decided to have Sargon eliminated. How this plan failed is unknown; no text relating the end of the story has ever been found.
Until Sargon’s reign, Babylonian cities like Kish, and also Ur and Uruk and Umma, functioned as independent city-states. Sometimes they formed brief alliances—cuneiform tablets attest to strategic marriages celebrated and diplomatic gifts exchanged—but mostly they seem to have been at war with one another. Sargon first subdued Babylonia’s fractious cities, then went on to conquer, or at least sack, lands like Elam, in present-day Iran. He presided over his empire from the city of Akkad, the ruins of which are believed to lie south of Baghdad. It was written that “daily five thousand four hundred men ate at his presence,” meaning, presumably, that he maintained a huge standing army. Eventually, Akkadian hegemony extended as far as the Khabur plains, in northeastern Syria, an area prized for its grain production. Sargon came to be known as “king of the world”; later, one of his descendants enlarged this title to “king of the four corners of the universe.”
Akkadian rule was highly centralized, and in this way anticipated the administrative logic of empires to come. The Akkadians levied taxes, then used the proceeds to support a vast network of local bureaucrats. They introduced standardized weights and measures—the gur equalled roughly three hundred litres—and imposed a uniform dating system, under which each year was assigned the name of a major event that had recently occurred: for instance, “the year that Sargon destroyed the city of Mari.” Such was the level of systematization that even the shape and the layout of accounting tablets were imperially prescribed. Akkad’s wealth was reflected in, among other things, its art work, the refinement and naturalism of which were unprecedented.
Sargon ruled, supposedly, for fifty-six years. He was succeeded by his two sons, who reigned for a total of twenty-four years, and then by a grandson, Naram-sin, who declared himself a god. Naram-sin was, in turn, succeeded by his son. Then, suddenly, Akkad collapsed. During one three-year period, four men each, briefly, claimed the throne. “Who was king? Who was not king?” the register known as the Sumerian King List asks, in what may be the first recorded instance of political irony.
The lamentation “The Curse of Akkad” was written within a century of the empire’s fall. It attributes Akkad’s demise to an outrage against the gods. Angered by a pair of inauspicious oracles, Naram-sin plunders the temple of Enlil, the god of wind and storms, who, in retaliation, decides to destroy both him and his people:
For the first time since cities were built and founded,
The great agricultural tracts produced no grain,
The inundated tracts produced no fish,
The irrigated orchards produced neither syrup nor wine,
The gathered clouds did not rain, the masgurum did not grow.
At that time, one shekel’s worth of oil was only one-half quart,
One shekel’s worth of grain was only one-half quart. . . .
These sold at such prices in the markets of all the cities!
He who slept on the roof, died on the roof,
He who slept in the house, had no burial,
People were flailing at themselves from hunger.
For many years, the events described in “The Curse of Akkad” were thought, like the details of Sargon’s birth, to be purely fictional.

In 1978, after scanning a set of maps at Yale’s Sterling Memorial Library, a university archeologist named Harvey Weiss spotted a promising-looking mound at the confluence of two dry riverbeds in the Khabur plains, near the Iraqi border. He approached the Syrian government for permission to excavate the mound, and, somewhat to his surprise, it was almost immediately granted. Soon, he had uncovered a lost city, which in ancient times was known as Shekhna and today is called Tell Leilan.
Over the next ten years, Weiss, working with a team of students and local laborers, proceeded to uncover an acropolis, a crowded residential neighborhood reached by a paved road, and a large block of grain-storage rooms. He found that the residents of Tell Leilan had raised barley and several varieties of wheat, that they had used carts to transport their crops, and that in their writing they had imitated the style of their more sophisticated neighbors to the south. Like most cities in the region at the time, Tell Leilan had a rigidly organized, state-run economy: people received rations—so many litres of barley and so many of oil—based on how old they were and what kind of work they performed. From the time of the Akkadian empire, thousands of similar potsherds were discovered, indicating that residents had received their rations in mass-produced, one-litre vessels. After examining these and other artifacts, Weiss constructed a time line of the city’s history, from its origins as a small farming village (around 5000 B.C.), to its growth into an independent city of some thirty thousand people (2600 B.C.), and on to its reorganization under imperial rule (2300 B.C.).
Wherever Weiss and his team dug, they also encountered a layer of dirt that contained no signs of human habitation. This layer, which was more than three feet deep, corresponded to the years 2200 to 1900 B.C., and it indicated that, around the time of Akkad’s fall, Tell Leilan had been completely abandoned. In 1991, Weiss sent soil samples from Tell Leilan to a lab for analysis. The results showed that, around the year 2200 B.C., even the city’s earthworms had died out. Eventually, Weiss came to believe that the lifeless soil of Tell Leilan and the end of the Akkadian empire were products of the same phenomenon—a drought so prolonged and so severe that, in his words, it represented an example of “climate change.”
Weiss first published his theory, in the journal Science, in August, 1993. Since then, the list of cultures whose demise has been linked to climate change has continued to grow. They include the Classic Mayan civilization, which collapsed at the height of its development, around 800 A.D.; the Tiwanaku civilization, which thrived near Lake Titicaca, in the Andes, for more than a millennium, then disintegrated around 1100 A.D.; and the Old Kingdom of Egypt, which collapsed around the same time as the Akkadian empire. (In an account eerily reminiscent of “The Curse of Akkad,” the Egyptian sage Ipuwer described the anguish of the period: “Lo, the desert claims the land. Towns are ravaged. . . . Food is lacking. . . . Ladies suffer like maidservants. Lo, those who were entombed are cast on high grounds.”) In each of these cases, what began as a provocative hypothesis has, as new information has emerged, come to seem more and more compelling. For example, the notion that Mayan civilization had been undermined by climate change was first proposed in the late nineteen-eighties, at which point there was little climatological evidence to support it. Then, in the mid-nineteen-nineties, American scientists studying sediment cores from Lake Chichancanab, in north-central Yucatán, reported that precipitation patterns in the region had indeed shifted during the ninth and tenth centuries, and that this shift had led to periods of prolonged drought. More recently, a group of researchers examining ocean-sediment cores collected off the coast of Venezuela produced an even more detailed record of rainfall in the area. They found that the region experienced a series of severe, “multiyear drought events” beginning around 750 A.D. The collapse of the Classic Mayan civilization, which has been described as “a demographic disaster as profound as any other in human history,” is thought to have cost millions of lives.
The climate shifts that affected past cultures predate industrialization by hundreds—or, in the case of the Akkadians, thousands—of years. They reflect the climate system’s innate variability and were caused by forces that, at this point, can only be guessed at. By contrast, the climate shifts predicted for the coming century are attributable to forces that are now well known. Exactly how big these shifts will be is a matter of both intense scientific interest and the greatest possible historical significance. In this context, the discovery that large and sophisticated cultures have already been undone by climate change presents what can only be called an uncomfortable precedent.

The Goddard Institute for Space Studies, or giss, is situated just south of Columbia University’s main campus, at the corner of Broadway and West 112th Street. The institute is not well marked, but most New Yorkers would probably recognize the building: its ground floor is home to Tom’s Restaurant, the coffee shop made famous by “Seinfeld.”
giss, an outpost of nasa, started out, forty-four years ago, as a planetary-research center; today, its major function is making forecasts about climate change. giss employs about a hundred and fifty people, many of whom spend their days working on calculations that may—or may not—end up being incorporated in the institute’s climate model. Some work on algorithms that describe the behavior of the atmosphere, some on the behavior of the oceans, some on vegetation, some on clouds, and some on making sure that all these algorithms, when they are combined, produce results that seem consistent with the real world. (Once, when some refinements were made to the model, rain nearly stopped falling over the rain forest.) The latest version of the giss model, called ModelE, consists of a hundred and twenty-five thousand lines of computer code.
giss’s director, James Hansen, occupies a spacious, almost comically cluttered office on the institute’s seventh floor. (I must have expressed some uneasiness the first time I visited him, because the following day I received an e-mail assuring me that the office was “a lot better organized than it used to be.”) Hansen, who is sixty-three, is a spare man with a lean face and a fringe of brown hair. Although he has probably done as much to publicize the dangers of global warming as any other scientist, in person he is reticent almost to the point of shyness. When I asked him how he had come to play such a prominent role, he just shrugged. “Circumstances,” he said.
Hansen first became interested in climate change in the mid-nineteen-seventies. Under the direction of James Van Allen (for whom the Van Allen radiation belts are named), he had written his doctoral dissertation on the climate of Venus. In it, he had proposed that the planet, which has an average surface temperature of eight hundred and sixty-seven degrees Fahrenheit, was kept warm by a smoggy haze; soon afterward, a space probe showed that Venus was actually insulated by an atmosphere that consists of ninety-six per cent carbon dioxide. When solid data began to show what was happening to greenhouse-gas levels on earth, Hansen became, in his words, “captivated.” He decided that a planet whose atmosphere could change in the course of a human lifetime was more interesting than one that was going to continue, for all intents and purposes, to broil away forever. A group of scientists at nasa had put together a computer program to try to improve weather forecasting using satellite data. Hansen and a team of half a dozen other researchers set out to modify it, in order to make longer-range forecasts about what would happen to global temperatures as greenhouse gases continued to accumulate. The project, which resulted in the first version of the giss climate model, took nearly seven years to complete.
At that time, there was little empirical evidence to support the notion that the earth was warming. Instrumental temperature records go back, in a consistent fashion, only to the mid-nineteenth century. They show that average global temperatures rose through the first half of the twentieth century, then dipped in the nineteen-fifties and sixties. Nevertheless, by the early nineteen-eighties Hansen had gained enough confidence in his model to begin to make a series of increasingly audacious predictions. In 1981, he forecast that “carbon dioxide warming should emerge from the noise of natural climate variability” around the year 2000. During the exceptionally hot summer of 1988, he appeared before a Senate subcommittee and announced that he was “ninety-nine per cent” sure that “global warming is affecting our planet now.” And in the summer of 1990 he offered to bet a roomful of fellow-scientists a hundred dollars that either that year or one of the following two years would be the warmest on record. To qualify, the year would have to set a record not only for land temperatures but also for sea-surface temperatures and for temperatures in the lower atmosphere. Hansen won the bet in six months.

Like all climate models, giss’s divides the world into a series of boxes. Thirty-three hundred and twelve boxes cover the earth’s surface, and this pattern is repeated twenty times moving up through the atmosphere, so that the whole arrangement might be thought of as a set of enormous checkerboards stacked on top of one another. Each box represents an area of four degrees latitude by five degrees longitude. (The height of the box varies depending on altitude.) In the real world, of course, such a large area would have an incalculable number of features; in the world of the model, features such as lakes and forests and, indeed, whole mountain ranges are reduced to a limited set of properties, which are then expressed as numerical approximations. Time in this grid world moves ahead for the most part in discrete, half-hour intervals, meaning that a new set of calculations is performed for each box for every thirty minutes that is supposed to have elapsed in actuality. Depending on what part of the globe a box represents, these calculations may involve dozens of different algorithms, so that a model run that is supposed to simulate climate conditions over the next hundred years involves more than a quadrillion separate operations. A single run of the giss model, done on a supercomputer, usually takes about a month.
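The arithmetic behind those figures is easy to check. What follows is a minimal sketch, not the institute’s code: the forty-six-by-seventy-two surface layout and the count of operations per box per time step are assumptions, chosen only so that the totals line up with the figures quoted above.
program grid_arithmetic
  implicit none
  ! illustrative numbers only: a 46-by-72 surface grid reproduces the 3,312
  ! surface boxes mentioned above; the 20 layers and half-hour steps follow
  ! the description in the text, and the operations-per-box figure is a guess
  integer, parameter :: nlat = 46, nlon = 72, nlayers = 20
  integer, parameter :: steps_per_day = 48
  real :: boxes, box_steps, total_ops

  boxes = real(nlat * nlon * nlayers)                       ! 66,240 boxes in all
  box_steps = boxes * real(steps_per_day) * 365.25 * 100.0  ! a century-long run
  total_ops = box_steps * 1.0e4                             ! assume ~10,000 operations per box per step

  print '(a,i6)',     'surface boxes per layer:          ', nlat * nlon
  print '(a,es10.2)', 'box-time-steps in a 100-year run: ', box_steps
  print '(a,es10.2)', 'rough total operations:           ', total_ops
end program grid_arithmetic
With those assumptions the total comes out to roughly ten to the fifteenth operations, which is the “more than a quadrillion” of the text.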
Very broadly speaking, there are two types of equations that go into a climate model. The first group expresses fundamental physical principles, like the conservation of energy and the law of gravity. The second group describes—the term of art is “parameterize”—patterns and interactions that have been observed in nature but may be only partly understood, or processes that occur on a small scale, and have to be averaged out over huge spaces. Here, for example, is a tiny piece of ModelE, written in the computer language fortran, which deals with the formation of clouds:
c**** compute the autoconversion rate of cloud water to precipitation
rho=1.e5*pl(l)/(rgas*tl(l))
tem=rho*wmx(l)/(wconst*fcld+ 1.e-20)
if(lhx.eq.lhs) tem=rho*wmx(l)/(wmui*fcld+1.e-20)
tem=tem*tem
if(tem.gt.10.) tem=10.
cm1=cm0
if(bandf) cm1=cm0*cbf
if(lhx.eq.lhs) cm1=cm0
cm=cm1*(1.-1./exp(tem*tem))+1.*100.*(prebar(l+1)+
* precnvl(l+1)*bydtsrc)
if(cm.gt.bydtsrc) cm=bydtsrc
prep(l)=wmx(l)*cm
end if
c**** form clouds only if rh gt rh00
219 if(rh1(l).lt.rh00(l)) go to 220
All climate models treat the laws of physics in the same way, but, since they parameterize phenomena like cloud formation differently, they come up with different results. (At this point, there are some fifteen major climate models in operation around the globe.) Also, because the real-world forces influencing the climate are so numerous, different models tend, like medical students, to specialize in different processes. giss’s model, for example, specializes in the behavior of the atmosphere, other models in the behavior of the oceans, and still others in the behavior of land surfaces and ice sheets.
Last fall, I attended a meeting at giss which brought together members of the institute’s modelling team. When I arrived, about twenty men and five women were sitting in battered chairs in a conference room across from Hansen’s office. At that particular moment, the institute was performing a series of runs for the U.N. Intergovernmental Panel on Climate Change. The runs were overdue, and apparently the I.P.C.C. was getting impatient. Hansen flashed a series of charts on a screen on the wall summarizing some of the results obtained so far.
The obvious difficulty in verifying any particular climate model or climate-model run is the prospective nature of the results. For this reason, models are often run into the past, to see how well they reproduce trends that have already been observed. Hansen told the group that he was pleased with how ModelE had reproduced the aftermath of the eruption of Mt. Pinatubo, in the Philippines, which took place in June of 1991. Volcanic eruptions release huge quantities of sulfur dioxide—Pinatubo produced some twenty million tons of the gas—which, once in the stratosphere, condenses into tiny sulfate droplets. These droplets, or aerosols, tend to cool the earth by reflecting sunlight back into space. (Man-made aerosols, produced by burning coal, oil, and biomass, also reflect sunlight and are a countervailing force to greenhouse warming, albeit one with serious health consequences of its own.) This cooling effect lasts as long as the aerosols remain suspended in the atmosphere. In 1992, global temperatures, which had been rising sharply, fell by half of a degree. Then they began to climb again. ModelE had succeeded in simulating this effect to within nine-hundredths of a degree. “That’s a pretty nice test,” Hansen observed laconically.

One day, when I was talking to Hansen in his office, he pulled a pair of photographs out of his briefcase. The first showed a chubby-faced five-year-old girl holding some miniature Christmas-tree lights in front of an even chubbier-faced five-month-old baby. The girl, Hansen told me, was his granddaughter Sophie and the boy was his new grandson, Connor. The caption on the first picture read, “Sophie explains greenhouse warming.” The caption on the second photograph, which showed the baby smiling gleefully, read, “Connor gets it.”
When modellers talk about what drives the climate, they focus on what they call “forcings.” A forcing is any ongoing process or discrete event that alters the energy of the system. Examples of natural forcings include, in addition to volcanic eruptions, periodic shifts in the earth’s orbit and changes in the sun’s output, like those linked to sunspots. Many climate shifts of the past have no known forcing associated with them; for instance, no one is certain what brought about the so-called Little Ice Age, which began in Europe some five hundred years ago. A very large forcing, meanwhile, should produce a commensurately large—and obvious—effect. One giss scientist put it to me this way: “If the sun went supernova, there’s no question that we could model what would happen.”
Adding carbon dioxide, or any other greenhouse gas, to the atmosphere by, say, burning fossil fuels or levelling forests is, in the language of climate science, an anthropogenic forcing. Since pre-industrial times, the concentration of CO2 in the earth’s atmosphere has risen by roughly a third, from 280 parts per million to 378 p.p.m. During the same period, concentrations of methane, an even more powerful (but more short-lived) greenhouse gas, have more than doubled, from .78 p.p.m. to 1.76 p.p.m. Scientists measure forcings in terms of watts per square metre, or w/m2, by which they mean that a certain number of watts of energy have been added (or, in the case of a negative forcing, subtracted) for every single square metre of the earth’s surface. The size of the greenhouse forcing is estimated, at this point, to be 2.5 w/m2. A miniature Christmas light gives off about four tenths of a watt of energy, mostly in the form of heat, so that, in effect (as Sophie supposedly explained to Connor), we have covered the earth with tiny bulbs, six for every square metre. These bulbs are burning twenty-four hours a day, seven days a week, year in and year out.
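The bulb analogy is just division. A minimal sketch, using nothing beyond the two figures quoted above, a 2.5 w/m2 forcing and roughly four tenths of a watt per bulb:
program bulbs_per_square_metre
  implicit none
  ! both figures are taken from the text above
  real, parameter :: greenhouse_forcing = 2.5   ! watts per square metre
  real, parameter :: bulb_output = 0.4          ! watts per miniature bulb
  real :: bulbs

  bulbs = greenhouse_forcing / bulb_output      ! comes out to about six
  print '(a,f5.2)', 'equivalent miniature bulbs per square metre: ', bulbs
end program bulbs_per_square_metre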
If greenhouse gases were held constant at today’s levels, it is estimated that it would take several decades for the full impact of the forcing that is already in place to be felt. This is because raising the earth’s temperature involves not only warming the air and the surface of the land but also melting sea ice, liquefying glaciers, and, most significant, heating the oceans—all processes that require tremendous amounts of energy. (Imagine trying to thaw a gallon of ice cream or warm a pot of water using an Easy-Bake oven.) It could be argued that the delay that is built into the system is socially useful, because it enables us—with the help of climate models—to prepare for what lies ahead, or that it is socially disastrous, because it allows us to keep adding CO2 to the atmosphere while fobbing the impacts off on our children and grandchildren. Either way, if current trends continue, which is to say, if steps are not taken to reduce emissions, carbon-dioxide levels will probably reach 500 parts per million—nearly double pre-industrial levels—sometime around the middle of the century. By that point, of course, the forcing associated with greenhouse gases will also have increased, to four watts per square metre and possibly more. For comparison’s sake, it is worth keeping in mind that the total forcing that ended the last ice age—a forcing that was eventually sufficient to melt mile-thick ice sheets and raise global sea levels by four hundred feet—is estimated to have been just six and a half watts per square metre.
There are two ways to operate a climate model. In the first, which is known as a transient run, greenhouse gases are slowly added to the simulated atmosphere—just as they would be to the real atmosphere—and the model forecasts what the effect of these additions will be at any given moment. In the second, greenhouse gases are added to the atmosphere all at once, and the model is run at these new levels until the climate has fully adjusted to the forcing by reaching a new equilibrium. Not surprisingly, this is known as an equilibrium run. For doubled CO2, equilibrium runs of the giss model predict that average global temperatures will rise by 4.9 degrees Fahrenheit. Only about a third of this increase is directly attributable to more greenhouse gases; the rest is a result of indirect effects, the most important among them being the so-called “water-vapor feedback.” (Since warmer air holds more moisture, higher temperatures are expected to produce an atmosphere containing more water vapor, which is itself a greenhouse gas.) giss’s forecast is on the low end of the most recent projections; the Hadley Centre model, which is run by the British Met Office, predicts that for doubled CO2 the eventual temperature rise will be 6.3 degrees Fahrenheit, while Japan’s National Institute for Environmental Studies predicts 7.7 degrees.
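The difference between the two kinds of runs can be made concrete with a zero-dimensional energy-balance toy. This is a minimal sketch, not ModelE or any other real climate model; the feedback parameter, the mixed-layer heat capacity, and the doubled-CO2 forcing below are illustrative assumptions rather than the institute’s values.
program transient_vs_equilibrium
  implicit none
  ! a toy energy balance: heatcap * dT/dt = forcing - lambda * T
  real, parameter :: lambda  = 1.5       ! feedback parameter, w/m2 per kelvin (assumed)
  real, parameter :: heatcap = 4.2e8     ! mixed-layer heat capacity, joules per m2 per kelvin (assumed)
  real, parameter :: f2x     = 4.0       ! forcing for doubled CO2, w/m2 (assumed)
  real, parameter :: spy     = 3.156e7   ! seconds per year
  integer, parameter :: ramp_years = 70  ! years over which the forcing ramps up to f2x
  real :: t_transient, t_equilibrium, forcing
  integer :: yr

  ! transient run: the forcing grows a little each year, and the answer is
  ! whatever the temperature happens to be at the moment of doubling
  t_transient = 0.0
  do yr = 1, ramp_years
     forcing = f2x * real(yr) / real(ramp_years)
     t_transient = t_transient + (forcing - lambda * t_transient) * spy / heatcap
  end do

  ! equilibrium run: apply the full forcing and let the system settle;
  ! for this toy the settled answer is simply forcing divided by feedback
  t_equilibrium = f2x / lambda

  print '(a,f5.2,a)', 'warming at the moment of doubling (transient): ', t_transient, ' kelvin'
  print '(a,f5.2,a)', 'warming once fully adjusted (equilibrium):     ', t_equilibrium, ' kelvin'
end program transient_vs_equilibrium
The gap between the two printed numbers is the warming still in the pipeline at the moment of doubling, which is the delay described above.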
In the context of ordinary life, a warming of 4.9, or even of 7.7, degrees may not seem like much to worry about; in the course of a normal summer’s day, after all, air temperatures routinely rise by twenty degrees or more. Average global temperatures, however, have practically nothing to do with ordinary life. In the middle of the last glaciation, Manhattan, Boston, and Chicago were deep under ice, and sea levels were so low that Siberia and Alaska were connected by a land bridge nearly a thousand miles wide. At that point, average global temperatures were roughly ten degrees colder than they are today. Conversely, since our species evolved, average temperatures have never been much more than two or three degrees higher than they are right now.
This last point is one that climatologists find particularly significant. By studying Antarctic ice cores, researchers have been able to piece together a record both of the earth’s temperature and of the composition of its atmosphere going back four full glacial cycles. (Temperature data can be extracted from the isotopic composition of the ice, and the makeup of the atmosphere can be reconstructed by analyzing tiny bubbles of trapped air.) What this record shows is that the planet is now nearly as warm as it has been at any point in the last four hundred and twenty thousand years. A possible consequence of even a four- or five-degree temperature rise—on the low end of projections for doubled CO2—is that the world will enter a completely new climate regime, one with which modern humans have no prior experience. Meanwhile, at 378 p.p.m., CO2 levels are significantly higher today than they have been at any other point in the Antarctic record. It is believed that the last time carbon-dioxide levels were in this range was three and a half million years ago, during what is known as the mid-Pliocene warm period, and they likely have not been much above it for tens of millions of years. A scientist with the National Oceanic and Atmospheric Administration (noaa) put it to me—only half-jokingly—this way: “It’s true that we’ve had higher CO2 levels before. But, then, of course, we also had dinosaurs.”
David Rind is a climate scientist who has worked at giss since 1978. Rind acts as a trouble-shooter for the institute’s model, scanning reams of numbers known as diagnostics, trying to catch problems, and he also works with giss’s Climate Impacts Group. (His office, like Hansen’s, is filled with dusty piles of computer printouts.) Although higher temperatures are the most obvious and predictable result of increased CO2, other, second-order consequences—rising sea levels, changes in vegetation, loss of snow cover—are likely to be just as significant. Rind’s particular interest is how CO2 levels will affect water supplies, because, as he put it to me, “you can’t have a plastic version of water.”
One afternoon, when I was talking to Rind in his office, he mentioned a visit that President Bush’s science adviser, John Marburger, had paid to giss a few years earlier. “He said, ‘We’re really interested in adaptation to climate change,’ ” Rind recalled. “Well, what does ‘adaptation’ mean?” He rummaged through one of his many file cabinets and finally pulled out a paper that he had published in the Journal of Geophysical Research entitled “Potential Evapotranspiration and the Likelihood of Future Drought.” In much the same way that wind velocity is measured using the Beaufort scale, water availability is measured using what’s known as the Palmer Drought Severity Index. Different climate models offer very different predictions about future water availability; in the paper, Rind applied the criteria used in the Palmer index to giss’s model and also to a model operated by noaa’s Geophysical Fluid Dynamics Laboratory. He found that as carbon-dioxide levels rose the world began to experience more and more serious water shortages, starting near the equator and then spreading toward the poles. When he applied the index to the giss model for doubled CO2, it showed most of the continental United States to be suffering under severe drought conditions. When he applied the index to the G.F.D.L. model, the results were even more dire. Rind created two maps to illustrate these findings. Yellow represented a forty-to-sixty-per-cent chance of summertime drought, ochre a sixty-to-eighty-per-cent chance, and brown an eighty-to-a-hundred-per-cent chance. In the first map, showing the giss results, the Northeast was yellow, the Midwest was ochre, and the Rocky Mountain states and California were brown. In the second, showing the G.F.D.L. results, brown covered practically the entire country.
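The color scheme of those maps amounts to a simple binning rule. Here is a minimal sketch of that rule; the sample probabilities are invented purely to exercise it.
program drought_map_legend
  implicit none
  ! bins a summertime-drought probability into the map colors described above
  real :: chance(4)
  integer :: i
  chance = (/ 0.45, 0.65, 0.85, 0.30 /)   ! hypothetical grid-cell values
  do i = 1, size(chance)
     print '(f4.2,2a)', chance(i), '  ->  ', trim(legend(chance(i)))
  end do
contains
  function legend(p) result(color)
    real, intent(in) :: p
    character(len=16) :: color
    if (p >= 0.8) then
       color = 'brown'
    else if (p >= 0.6) then
       color = 'ochre'
    else if (p >= 0.4) then
       color = 'yellow'
    else
       color = 'unshaded'
    end if
  end function legend
end program drought_map_legend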
“I gave a talk based on these drought indices out in California to water-resource managers,” Rind told me. “And they said, ‘Well, if that happens, forget it.’ There’s just no way they could deal with that.”
He went on, “Obviously, if you get drought indices like these, there’s no adaptation that’s possible. But let’s say it’s not that severe. What adaptation are we talking about? Adaptation in 2020? Adaptation in 2040? Adaptation in 2060? Because the way the models project this, as global warming gets going, once you’ve adapted to one decade you’re going to have to change everything the next decade.
“We may say that we’re more technologically able than earlier societies. But one thing about climate change is it’s potentially geopolitically destabilizing. And we’re not only more technologically able; we’re more technologically able destructively as well. I think it’s impossible to predict what will happen. I guess—though I won’t be around to see it—I wouldn’t be shocked to find out that by 2100 most things were destroyed.” He paused. “That’s sort of an extreme view.”

On the other side of the Hudson River and slightly to the north of giss, the Lamont-Doherty Earth Observatory occupies what was once a weekend estate in the town of Palisades, New York. The observatory is an outpost of Columbia University, and it houses, among its collections of natural artifacts, the world’s largest assembly of ocean-sediment cores—more than thirteen thousand in all. The cores are kept in steel compartments that look like drawers from a filing cabinet, only longer and much skinnier. Some of the cores are chalky, some are clayey, and some are made up almost entirely of gravel. All can be coaxed to yield up—in one way or another—information about past climates.
Peter deMenocal is a paleoclimatologist who has worked at Lamont-Doherty for fifteen years. He is an expert on ocean cores, and also on the climate of the Pliocene, which lasted from roughly five million to two million years ago. Around two and a half million years ago, the earth, which had been warm and relatively ice-free, started to cool down until it entered an era—the Pleistocene—of recurring glaciations. DeMenocal has argued that this transition was a key event in human evolution: right around the time that it occurred, at least two types of hominids—one of which would eventually give rise to us—branched off from a single ancestral line. Until quite recently, paleoclimatologists like deMenocal rarely bothered with anything much closer to the present day; the current interglacial—the Holocene—which began some ten thousand years ago, was believed to be, climatically speaking, too stable to warrant much study. In the mid-nineties, though, deMenocal, motivated by a growing concern over global warming—and a concomitant shift in government research funds—decided to look in detail at some Holocene cores. What he learned, as he put it to me when I visited him at Lamont-Doherty last fall, was “less boring than we had thought.”
One way to extract climate data from ocean sediments is to examine the remains of what lived or, perhaps more pertinently, what died and was buried there. The oceans are rich with microscopic creatures known as foraminifera. There are about thirty planktonic species in all, and each thrives at a different temperature, so that by counting a species’ prevalence in a given sample it is possible to estimate the ocean temperatures at the time the sediment was formed. When deMenocal used this technique to analyze cores that had been collected off the coast of Mauritania, he found that they contained evidence of recurring cool periods; every fifteen hundred years or so, water temperatures dropped for a few centuries before climbing back up again. (The most recent cool period corresponds to the Little Ice Age, which ended about a century and a half ago.) Also, perhaps even more significant, the cores showed profound changes in precipitation. Until about six thousand years ago, northern Africa was relatively wet—dotted with small lakes. Then it became dry, as it is today. DeMenocal traced the shift to periodic variations in the earth’s orbit, which, in a generic sense, are the same forces that trigger ice ages. But orbital changes occur gradually, over thousands of years, and northern Africa appears to have switched from wet to dry all of a sudden. Although no one knows exactly how this happened, it seems, like so many climate events, to have been a function of feedbacks—the less rain the continent got, the less vegetation there was to retain water, and so on until, finally, the system just flipped. The process provides yet more evidence of how a very small forcing sustained over time can produce dramatic results.
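The census technique can be sketched in a few lines. The version below is deliberately crude, an abundance-weighted average rather than the calibrated transfer functions actually used on such cores, and the species preferences and counts are invented for illustration.
program foram_census_sketch
  implicit none
  ! estimate a past sea-surface temperature from the relative abundance of
  ! planktonic foraminifera species in one sediment sample; all values are
  ! hypothetical
  integer, parameter :: nspecies = 4
  real :: preferred_temp(nspecies), counts(nspecies)
  real :: estimate

  preferred_temp = (/ 8.0, 14.0, 21.0, 27.0 /)   ! degrees C each species favors
  counts         = (/ 12.0, 40.0, 30.0, 18.0 /)  ! individuals counted per species

  estimate = sum(preferred_temp * counts) / sum(counts)
  print '(a,f5.1,a)', 'estimated sea-surface temperature: ', estimate, ' degrees C'
end program foram_census_sketch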
“We were kind of surprised by what we found,” deMenocal told me about his work on the supposedly stable Holocene. “Actually, more than surprised. It was one of these things where, you know, in life you take certain things for granted, like your neighbor’s not going to be an axe murderer. And then you discover your neighbor is an axe murderer.”

Not long after deMenocal began to think about the Holocene, a brief mention of his work on the climate of Africa appeared in a book produced by National Geographic. On the facing page, there was a piece on Harvey Weiss and his work at Tell Leilan. DeMenocal vividly remembers his reaction. “I thought, Holy cow, that’s just amazing!” he told me. “It was one of these cases where I lost sleep that night, I just thought it was such a cool idea.”
DeMenocal also recalls his subsequent dismay when he went to learn more. “It struck me that they were calling on this climate-change argument, and I wondered how come I didn’t know about it,” he said. He looked at the Science paper in which Weiss had originally laid out his theory. “First of all, I scanned the list of authors and there was no paleoclimatologist on there,” deMenocal said. “So then I started reading through the paper and there basically was no paleoclimatology in it.” (The main piece of evidence Weiss adduced for a drought was that Tell Leilan had filled with dust.) The more deMenocal thought about it, the more unconvincing he found the data, on the one hand, and the more compelling he found the underlying idea, on the other. “I just couldn’t leave it alone,” he told me. In the summer of 1995, he went with Weiss to Syria to visit Tell Leilan. Subsequently, he decided to do his own study to prove—or disprove—Weiss’s theory.
Instead of looking in, or even near, the ruined city, deMenocal focussed on the Gulf of Oman, nearly a thousand miles downwind. Dust from the Mesopotamian floodplains, just north of Tell Leilan, contains heavy concentrations of the mineral dolomite, and since arid soil produces more wind-borne dust, deMenocal figured that if there had been a drought of any magnitude it would show up in gulf sediments. “In a wet period, you’d be getting none or very, very low amounts of dolomite, and during a dry period you’d be getting a lot,” he explained. He and a graduate student named Heidi Cullen developed a highly sensitive test to detect dolomite, and then Cullen assayed, centimetre by centimetre, a sediment core that had been extracted near where the Gulf of Oman meets the Arabian Sea.
“She started going up through the core,” DeMenocal told me. “It was like nothing, nothing, nothing, nothing, nothing. Then one day, I think it was a Friday afternoon, she goes, ‘Oh, my God.’ It was really classic.” DeMenocal had thought that the dolomite level, if it were elevated at all, would be modestly higher; instead, it went up by four hundred per cent. Still, he wasn’t satisfied. He decided to have the core re-analyzed using a different marker: the ratio of strontium 86 and strontium 87 isotopes. The same spike showed up. When deMenocal had the core carbon-dated, it turned out that the spike lined up exactly with the period of Tell Leilan’s abandonment.
Tell Leilan was never an easy place to live. Much like, say, western Kansas today, the Khabur plains received enough annual rainfall—about seventeen inches—to support cereal crops, but not enough to grow much else. “Year-to-year variations were a real threat, and so they obviously needed to have grain storage and to have ways to buffer themselves,” deMenocal observed. “One generation would tell the next, ‘Look, there are these things that happen that you’ve got to be prepared for.’ And they were good at that. They could manage that. They were there for hundreds of years.”
He went on, “The thing they couldn’t prepare for was the same thing that we won’t prepare for, because in their case they didn’t know about it and because in our case the political system can’t listen to it. And that is that the climate system has much greater things in store for us than we think.”

Shortly before Christmas, Harvey Weiss gave a lunchtime lecture at Yale’s Institute for Biospheric Studies. The title was “What Happened in the Holocene,” which, as Weiss explained, was an allusion to a famous archeology text by V. Gordon Childe, entitled “What Happened in History.” The talk brought together archeological and paleoclimatic records from the Near East over the last ten thousand years.
Weiss, who is sixty years old, has thinning gray hair, wire-rimmed glasses, and an excitable manner. He had prepared for the audience—mostly Yale professors and graduate students—a handout with a time line of Mesopotamian history. Key cultural events appeared in black ink, key climatological ones in red. The two alternated in a rhythmic cycle of disaster and innovation. Around 6200 B.C., a severe global cold snap—red ink—produced aridity in the Near East. (The cause of the cold snap is believed to have been a catastrophic flood that emptied an enormous glacial lake—called Lake Agassiz—into the North Atlantic.) Right around the same time—black ink—farming villages in northern Mesopotamia were abandoned, while in central and southern Mesopotamia the art of irrigation was invented. Three thousand years later, there was another cold snap, after which settlements in northern Mesopotamia once again were deserted. The most recent red event, in 2200 B.C., was followed by the dissolution of the Old Kingdom in Egypt, the abandonment of villages in ancient Palestine, and the fall of Akkad. Toward the end of his talk, Weiss, using a PowerPoint program, displayed some photographs from the excavation at Tell Leilan. One showed the wall of a building—probably intended for administrative offices—that had been under construction when the rain stopped. The wall was made from blocks of basalt topped by rows of mud bricks. The bricks gave out abruptly, as if construction had ceased from one day to the next.
The monochromatic sort of history that most of us grew up with did not allow for events like the drought that destroyed Tell Leilan. Civilizations fell, we were taught, because of wars or barbarian invasions or political unrest. (Another famous text by Childe bears the exemplary title “Man Makes Himself.”) Adding red to the time line points up the deep contingency of the whole enterprise. Civilization goes back, at the most, ten thousand years, even though, evolutionarily speaking, modern man has been around for at least ten times that long. The climate of the Holocene was not boring, but at least it was dull enough to allow people to sit still. It is only after the immense climatic shifts of the glacial epoch had run their course that writing and agriculture finally emerged.
Nowhere else does the archeological record go back so far or in such detail as in the Near East. But similar red-and-black chronologies can now be drawn up for many other parts of the world: the Indus Valley, where, some four thousand years ago, the Harappan civilization suffered a decline after a change in monsoon patterns; the Andes, where, fourteen hundred years ago, the Moche abandoned their cities in a period of diminished rainfall; and even the United States, where the arrival of the English colonists on Roanoke Island, in 1587, coincided with a severe regional drought. (By the time English ships returned to resupply the colonists, three years later, no one was left.) At the height of the Mayan civilization, population density was five hundred per square mile, higher than it is in most parts of the U.S. today. Two hundred years later, much of the territory occupied by the Mayans had been completely depopulated. You can argue that man through culture creates stability, or you can argue, just as plausibly, that stability is for culture an essential precondition.
After the lecture, I walked with Weiss back to his office, which is near the center of the Yale campus, in the Hall of Graduate Studies. This past year, Weiss decided to suspend excavation at Tell Leilan. The site lies only fifty miles from the Iraqi border, and, owing to the uncertainties of the war, it seemed like the wrong sort of place to bring graduate students. When I visited, Weiss had just returned from a trip to Damascus, where he had gone to pay the guards who watch over the site when he isn’t there. While he was away from his office, its contents had been piled up in a corner by repairmen who had come to fix some pipes. Weiss considered the piles disconsolately, then unlocked a door at the back of the room.
The door led to a second room, much larger than the first. It was set up like a library, except that instead of books the shelves were stacked with hundreds of cardboard boxes. Each box contained fragments of broken pottery from Tell Leilan. Some were painted, others were incised with intricate designs, and still others were barely distinguishable from pebbles. Every fragment had been inscribed with a number, indicating its provenance.
I asked what he thought life in Tell Leilan had been like. Weiss told me that that was a “corny question,” so I asked him about the city’s abandonment. “Nothing allows you to go beyond the third or fourth year of a drought, and by the fifth or sixth year you’re probably gone,” he observed. “You’ve given up hope for the rain, which is exactly what they wrote in ‘The Curse of Akkad.’ ” I asked to see something that might have been used in Tell Leilan’s last days. Swearing softly, Weiss searched through the rows until he finally found one particular box. It held several potsherds that appeared to have come from identical bowls. They were made from a greenish-colored clay, had been thrown on a wheel, and had no decoration. Intact, the bowls had held about a litre, and Weiss explained that they had been used to mete out rations—probably wheat or barley—to the workers of Tell Leilan. He passed me one of the fragments. I held it in my hand for a moment and tried to imagine the last Akkadian who had touched it. Then I passed it back.
(This is the second part of a three-part article.)