  • Report: 243 million Americans affected by weather disasters since 2007

    Posted by Justin Grieser on April 9, 2013 at 10:30 am

    Drought, record heat and Hurricane Sandy were among the major weather-related disasters that affected the United States in 2012. But just how many Americans felt the impact of these events? A newly released report from the Environment America Research and Policy Center says 243 million people – nearly 80 percent of the U.S. population – live in counties that experienced at least one weather-related disaster since 2007.

    U.S. counties federally declared disaster areas due to extreme weather, 2007-2012. (Environment America)

    The report, titled “In the Path of the Storm,” is based on six years of county-level disaster declaration data gathered from the Federal Emergency Management Agency (FEMA). It offers a comprehensive overview of which regions in the U.S. are most vulnerable to various types of severe weather, including hurricanes, floods, drought, tornadoes and winter storms. An interactive online map allows the public to learn more about the types of disasters that have occurred in individual counties across the country.
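
    The report's headline numbers come down to a straightforward tally of FEMA's county-level declaration records against county population figures. A minimal sketch of that aggregation logic in Python, assuming a hypothetical CSV of declarations (one row per county per declared disaster; the filename and column names here are illustrative, not FEMA's actual schema):

        import csv
        from collections import defaultdict

        # Hypothetical input: one row per federally declared weather disaster,
        # carrying the affected county's FIPS code and population.
        # Assumed columns: fips, year, population
        declarations = defaultdict(int)
        population = {}

        with open("fema_declarations_2007_2012.csv") as f:  # assumed filename
            for row in csv.DictReader(f):
                declarations[row["fips"]] += 1
                population[row["fips"]] = int(row["population"])

        US_POPULATION = 313_900_000  # approximate 2012 U.S. population

        # People living in counties with at least one declaration, 2007-2012
        affected = sum(pop for fips, pop in population.items()
                       if declarations[fips] >= 1)
        print(f"Share of population affected: {affected / US_POPULATION:.0%}")

        # People in counties averaging one or more disasters per year (6+ in 6 years)
        chronic = sum(pop for fips, pop in population.items()
                      if declarations[fips] >= 6)
        print(f"Population in high-frequency counties: {chronic:,}")

    As a sanity check on the headline figure, 243 million of roughly 314 million Americans works out to about 77 percent, consistent with the report's "nearly 80 percent".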

    Among the report’s major findings:

    Number of Americans living in counties declared federal disaster areas due to severe weather-related events. (Data source: Environment America)
    •About 4 out of 5 Americans live in counties that have been declared federal disaster areas in the past six years.
    •In 2012, the U.S. recorded 11 weather disasters causing economic losses of $1 billion or more, including Hurricane Sandy.
    •Sandy claimed 72 lives in the Northeast and caused over $70 billion in damages, making it the costliest hurricane ever to hit the East Coast and the costliest weather disaster since Hurricane Katrina in 2005.
    •77 million Americans live in counties that experienced federally declared weather-related disasters in 2012 alone.
    •19 million Americans live in counties that have averaged one or more such weather-related disasters per year since 2007.

    While every state but South Carolina has had at least one county declared a federal disaster area, the Plains states are the most disaster-prone. In Oklahoma, for example, five of the state’s counties have each received 10 or more weather-related disaster declarations in the past six years.

    Interactive map showing the number of weather-related disasters by county during the 2007-2012 period. (Environment America)

    By shedding light on these numbers, Environment America hopes to raise public awareness of the nation’s vulnerability to climate change and the possibility of more frequent extreme weather events in the coming decades.

    The report reaffirms scientific research that shows the U.S. is already experiencing greater extremes in precipitation and will likely see an increase in the number of heat waves and severe storms as Earth’s climate warms. In 2012, the U.S. experienced its hottest year on record and worst drought in more than 50 years.

    “Extreme weather is happening, it is causing very serious problems, and global warming increases the likelihood that we’ll see even more extreme weather in the future,” says Nathan Willcox, co-author of the report and global warming program director at Environment America.

    Willcox is calling on local, state, and federal leaders to seek further reductions in CO2 emissions and to invest in more renewable energy projects to reduce the impact of severe weather events induced by global warming.

    In particular, the report denounces further development of carbon-rich fuels such as tar sands, and urges the Obama administration to reject construction of the Keystone XL pipeline through the central United States.

    Meanwhile, Environment America is lauding state-based and regional efforts to curb greenhouse gas emissions, such as the Regional Greenhouse Gas Initiative (RGGI), comprising nine northeastern states. Two months ago, all nine members pledged steeper cuts in carbon emissions from power plants, which would result in a 20 percent reduction over the next decade.

    “In the wake of Winter Storm Nemo [Feb. 2013 Nor’easter], Hurricane Sandy and Hurricane Irene, we need all of the Northeast to double down on its commitment to lead the nation in reducing the pollution that’s warming the planet and changing our climate,” said Willcox.

    Justin Grieser grew up in Northern Virginia and is a features writer for the Capital Weather Gang. He has a BA in linguistics from the University of Virginia and enjoys applying his language skills to researching international climate data.

  • World Bank ponders impact of sea level rises on developing countries

    Posted by Anthony Harrington, April 9, 2013

    Proponents of global warming theory have long warned of the potential for sea levels to rise one to three meters over the course of the present century. Now analysts from the World Bank have put together a working paper: The impact of sea level rise on developing countries: a comparative analysis. But first, to put the study in context, let us consider a report from The Independent, published on 7 January, which focused on a study of the impact of melting ice sheets on sea levels, carried out by Professor Jonathan Bamber of Bristol University and colleagues.

    The study was published in the journal Nature Climate Change. What the study highlights, it seems, is the difficulty of predicting quite how fast the world’s ice sheets are melting. The real concern, according to Professor Bamber, is that instead of ice melt raising sea levels by around 29 cm (not 1-3 meters, note) by 2100, accelerated ice sheet melt could add another 83 cm, making a total of just over one meter.
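
    The arithmetic behind that worst case is simple enough to check directly; a quick sketch of the figures quoted above:

        # Worst-case 2100 sea level rise from the figures cited by Bamber et al.
        baseline_cm = 29     # projected rise from ice sheet melt by 2100
        accelerated_cm = 83  # possible additional rise from accelerated melt
        total_m = (baseline_cm + accelerated_cm) / 100
        print(f"Worst case: {total_m:.2f} m")  # 1.12 m, "just over one meter"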

    As the Independent puts it, “… glaciologists believe that there is a one-in-20 chance of sea levels rising by a meter or more by 2100…”

    “The ice sheets of Greenland and Antarctica contain about 99.5% of the Earth’s glacier ice and could raise sea levels by 65 meters if they melted completely – although experts think this is highly unlikely in the foreseeable future. However, a survey of the world’s top 26 glaciologists found most believe melting of the ice sheets could be more rapid and severe than previously estimated…”

    With reports like this, which look factual but turn out to rest on opinion and finger-in-the-wind speculation, it is always hard to tell whether one is reading a ghost story or something that deserves serious attention, however fuzzy and indistinct the science that supposedly grounds it may be (a survey of 26 people is a media event, not science). What the World Bank report does, however, is say: OK, let’s suppose sea levels did rise between 1 and 5 meters. What would that do to developing countries?

    The results show that the consequences of sea level rise are extremely skewed, with severe impacts limited to a relatively small number of countries. For those countries, however, the impact will be potentially catastrophic. The countries affected in this way include Vietnam and The Bahamas. Some large countries, including China, will also face extremely severe consequences. East Asia and the Middle East and North Africa will be hit hardest. The World Bank also found that there is, as yet, almost no serious planning under way in the countries that will be affected to enable populations to adapt to the coming changes – assuming they are coming, that is.

    The World Bank study contains a couple of scary possibilities that would accelerate sea level rise fairly dramatically. One that they cite concerns the West Antarctic ice sheet (WAIS). At present the WAIS is anchored on bedrock, but it may not be that stable: human-induced global warming could cause the WAIS to slide off the bedrock and into the ocean. An ice sheet of that immensity winding up in the ocean would displace huge volumes of water without necessarily having to melt to any large degree. If this were to happen, sea levels could rise five to six meters in very short order; how short, no one quite knows.
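
    The displacement argument is easy to put on a back-of-envelope footing. A rough sketch, assuming a WAIS ice volume of about 2.2 million cubic kilometers (a commonly cited round figure, not a number from the World Bank paper): a grounded ice sheet that slides into the sea floats, displacing seawater equal to its own weight, so no melting is needed for most of the rise.

        # Back-of-envelope: sea level rise if the WAIS slid off its bedrock and floated.
        # The ice volume is an assumed round figure, not taken from the World Bank paper.
        ICE_VOLUME_M3 = 2.2e15   # ~2.2 million km^3 of West Antarctic ice (assumed)
        RHO_ICE = 917.0          # density of glacial ice, kg/m^3
        RHO_SEAWATER = 1025.0    # density of seawater, kg/m^3
        OCEAN_AREA_M2 = 3.61e14  # global ocean surface area

        # By Archimedes, floating ice displaces seawater equal to its own mass,
        # so the displaced volume is the ice volume scaled by the density ratio.
        displaced_m3 = ICE_VOLUME_M3 * (RHO_ICE / RHO_SEAWATER)
        rise_m = displaced_m3 / OCEAN_AREA_M2
        print(f"Sea level rise: {rise_m:.1f} m")

    On those assumptions the answer lands at roughly five and a half meters, consistent with the five-to-six-meter figure the study cites.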

    The problem this kind of study poses for businesses across all sectors and sizes is that it is impossible to respond in any meaningful way to the scenarios being sketched out. Pulling out useful strategic directives from browsing the World Bank study is extremely difficult.

    “Try to avoid having too much of your business tied up with clients or suppliers in very low lying coastal plains,” might be one gem to take away, but who, seriously, is going to forego a business opportunity today on a one-in-twenty chance of an adverse sea level rise sometime in the next 87 years?

    Further information on global warming and catastrophe insurance:

    •Building Potential Catastrophe Management into a Strategic Risk Framework, by Duncan Martin

    •Catastrophe bonds: what are they and how do they function?

    •Catastrophe bonds: bet on a hurricane? by Anthony Harrington

  • Analysis – Rethinking the lithium-ion battery revolution over cost, safety

    Analysis – Rethinking the lithium-ion battery revolution over cost, safety

    Reuters, April 10, 2013, 3:06 pm

    By Deepa Seetharaman

    TROY, Michigan (Reuters) – For nearly two years, a team of former Chevrolet Volt and Toyota Prius engineers has been working on the next big thing in electric cars: the latest version of the 154-year-old lead-acid battery.

    Their aim is to build a battery strong enough to power a wider range of vehicles, something they think the current cutting-edge technology – lithium ion – can’t do cheaply, particularly given recent safety scares.

    The focus of Energy Power Systems on a technology older than the automobile itself illustrates the difficulty with lithium-ion batteries. While widely used in everything from laptops to electric cars and satellites, a number of high-profile incidents involving smoke and fire have been a reminder of the risks and given them an image problem.

    The overheating of the batteries on two of Boeing Co’s high-tech 787 Dreamliners, which prompted regulators to ground the aircraft, served to underline the concerns and forced the plane maker to redesign the battery system.

    On Thursday, battery experts will gather in Washington, D.C., to discuss the technology in a forum organized by the National Transportation Safety Board, which is leading the investigation of one of the 787 incidents.

    Experts are certain to point out red flags. Indeed, a growing number of engineers now say the lithium-ion battery revolution has stalled, undercut by high costs, technical complexity and safety concerns.

    “Smart people have been working on this for 10 years already and no one is close to a new kind of battery,” said Fred Schlachter, a lithium-ion battery expert and retired physicist from the U.S.-funded Lawrence Berkeley National Laboratory.

    Many experts now believe it will take at least another decade for lithium-ion technology to be ready for widespread adoption in transportation. Others, including Toyota Motor Corp, believe the solution lies beyond lithium-ion.

    Interviews with two dozen battery executives, experts and researchers, including the founder of Securaplane, which made Boeing’s battery charger, reveal an industry in which some are having second thoughts about using lithium-ion, and are instead looking to enhance previous technologies or to leap ahead.

    These people say expectations were set too high, too fast. People projected that “clean technology” batteries would shrink in size and weight at the speed of the microchip revolution. That hasn’t happened, and Schlachter says it won’t any time soon. “We’re not going to see a different chemistry, unless we’re very lucky, for decades.”

    Just as recent developments in technology have allowed cars to improve their mileage using traditional engines, the lead-acid battery research is aiming for improved power in a smaller package.

    BEYOND LITHIUM-ION

    Lithium-ion supporters, including Boeing, Tesla Motors Co and General Motors Co, maker of the Volt, say they can make the batteries safe, and problems with new technologies are to be expected.

    GM overcame an early problem when a Volt caught fire during tests run by the U.S. National Highway Traffic Safety Administration, for instance, and after all, car and plane engineers successfully tamed gasoline and jet fuel.

    “GM is committed to lithium-ion technology for our vehicle electrification solutions,” the largest U.S. automaker said in a statement, adding that it has been seeing “improved economies” on the technology.

    Boeing said it has years of experience with the technology and is confident in its safety and reliability. “Nothing that we’ve learned as a result of the ongoing investigations has caused us to change the decision to use lithium-ion batteries,” the company said in a statement.

    Tom Gage, a battery expert whose company EV Grid works on ways to manage the charging of electric cars, says lithium-ion may be a little more “tempestuous” than other technologies, but is the best industry now has and he’s confident it will improve.

    “It’s just a technological challenge,” he said.

    But other companies are looking beyond the technology. Toyota, for example, has tasked one team of battery engineers to explore a range of alternatives to lithium-ion.

    “We don’t think that lithium-ion batteries are going to help us get to a point where we can dramatically increase volume and really call it a mass market,” Toyota spokesman John Hanson said. “We’re going to have a more significant breakthrough and probably go into some other area of battery chemistry.”

    Subhash Dhar, who founded Energy Power Systems, the advanced lead-acid battery company, said promised improvements with lithium ion were never met.

    Dhar, who helped develop the nickel-metal hydride battery for the Toyota Prius, described his own “disenchantment” with lithium-ion’s complexity and cost.

    “Before the technology was fully optimized and before the markets were ready, we just kind of threw billions of dollars in setting up these manufacturing plants,” Dhar said. “They’re all sitting idle right now.”

    In February 2010, the U.S. government gave more than $150 million to help build a lithium-ion battery factory owned by a unit of Korea’s LG Group, LG Chem Ltd. The plant was supposed to make enough battery cells for 60,000 electric vehicles by the end of 2013.

    But the demand never materialized. A U.S. Department of Energy report described the Holland, Michigan factory as a place where workers spent their days playing board games, watching movies or volunteering at a local animal shelter.

    COMPLEXITY

    The Volt battery makes the case for both critics and fans. After initial problems, it has worked without incident. But to keep it safe, the battery has more parts than the rest of the car combined, including 600 seals and cooling components.

    “That’s 600 seals that all have to stay for the entirety of its life otherwise you have catastrophic failures,” said Josh Payne, who worked on the first Volt battery and is now senior engineering manager at Troy, Michigan-based Energy Power Systems. GM has repeatedly said the Volt battery is safe.

    Lithium-ion batteries have been used to power electronics like camcorders, cell phones and laptops since 1992, and the Economist magazine in 2002 hailed them as the “foot soldiers of the digital revolution.”

    The batteries also were flammable. Some early laptops burned spectacularly, but the consumer electronics industry largely solved the problem. That set up expectations that bigger batteries would be tamed as well.

    Inside the aviation industry, however, experts are wary.

    “These things go into a flare type burn. I call it a whoosh. It is not an explosion per se,” said aviation consultant Richard Lukso. “I’ve heard that whoosh four times, so I know what I’m talking about.”

    Lukso founded Securaplane, the company that made the battery charger for the 787. He left to found a lithium-ion battery startup, but after spending $6 million, failed to build a lithium-ion battery safe enough for planes.

    Lukso was startled by how difficult it was to track what was happening inside the battery. New microchips do the job, but he did not have them when he closed his company several years ago.

    Meanwhile, Lukso faced the same problem as Boeing – once a fire got started in a big lithium-ion battery, it is tough to put out, since it creates its own oxygen and has its own fuel. The cost and weight of safeguards to stop heat spreading between cells, and to contain a fire, offset lithium-ion’s advantages.

    Boeing rival Airbus dropped lithium-ion batteries for its next passenger jet, the A350, to let the technology “mature.”

    “They are definitely powerful and they’re more advanced,” said Subhas Chalasani, an engineer who consulted for Boeing on the 787 battery and previously worked at GM. “But you’ve got to babysit them.

    “We need, at various levels, some kind of breakthrough so we can make this technology more robust and safe,” added Chalasani, who declined to specifically discuss his time at Boeing.

    Chalasani is not working on that lithium-ion breakthrough. He’s working at East Penn Manufacturing Co on advanced lead-acid batteries.
    (Additional reporting by Peter Henderson in San Francisco, Mari Saito and James Topham in Tokyo; Editing by Peter Henderson, Alwyn Scott, Martin Howell and Tim Dobbyn)

  • Coalition boosts commitment to broadband network

    By chief political correspondent Emma Griffiths, ABC. Updated April 9, 2013, 11:11 pm

    The Coalition has significantly boosted its commitment to a national broadband network, promising to spend nearly $30 billion to build a network within the next seven years.

    It represents a stark difference from the Opposition’s policy at the 2010 election, which vowed to scrap the Government’s National Broadband Network (NBN) and instead spend $6 billion relying on the private sector to expand internet services.

    At the time, Opposition Leader Tony Abbott came under fire for saying he was “no tech-head” when asked to explain the policy on ABC1’s 7.30 Report.

    But today Mr Abbott said he was “very proud” of the Coalition’s new broadband plan, which he launched as the Opposition’s first major policy foray into the 2013 campaign.

    “We believe in a national broadband network,” he told journalists in Sydney.

    “We will deliver a better national broadband network faster and more affordably than this Government possibly can.

    “Our modern lives are absolutely unimaginable without access to broadband technology.”

    The Opposition Leader was flanked by his communications spokesman, Malcolm Turnbull.

    “I’m confident in the years to come, Malcolm is going to be Mr Broadband, and an incoming Coalition government can finally bring Australia into the broadband world, into the digital world,” Mr Abbott said.

    The Coalition policy will cost $29.5 billion, with a goal of giving all Australians access to internet speeds of at least 25 megabits per second by the end of 2016.

    But it will not roll out the fibre cables to most premises as the NBN aims to – just to so-called nodes that will then rely on copper wire to connect homes and businesses.

    The policy document states the “Coalition’s NBN” would deliver fibre directly to 22 per cent of premises but 71 per cent of homes and businesses would have fibre to the node.

    Four per cent of premises will have to rely on fixed wireless and 3 per cent on satellite.

    The Government’s NBN will deliver fibre to 93 per cent of premises.

    Mr Turnbull said the Coalition was taking a “much smarter approach”.

    “We’ll be taking fibre out into the field but not all the way into the customers’ premises, and that saves about three quarters at least of the cost,” he said.

    “What this will deliver is speeds that are more than capable of delivering all of the services and all of the applications households need.”

    But in the policy document, the Opposition acknowledges that “some users may want higher speeds” before the broader market, but should be “prepared to pay for it”.

    It also admits that “needs will evolve over time, and eventually may require further upgrade of the network”.

    In a broad swipe at the Government’s NBN, Mr Turnbull said the Coalition’s plan would be rolled out faster, with less cost for taxpayers, and lower prices for consumers.

    He said the NBN was a “failing project”.

    ‘Ignorance’

    But Communications Minister Stephen Conroy rubbished the Coalition policy, saying it “fails miserably” and would deprive millions of Australians from having high-speed internet connections.

    Senator Conroy said Mr Turnbull had displayed “ignorance” of the broadband needs of the future.

    “Customers using the NBN are also connecting more devices to the NBN and this is where Malcolm Turnbull and their version of a broadband network fails miserably,” he said.

    “If you understand broadband, if you understand that it is being used for more applications that require more bandwidth every single day, then you know that Malcolm Turnbull’s network is a fail.

    “Malcolm Turnbull is going to build a one-lane Sydney Harbour Bridge because he says he can do it cheaper and faster.”

    Senator Conroy said the plan’s reliance on Telstra’s ageing copper network was a key fault.

    “I’ve got to say, I can’t find a dumber piece of public policy than buying the copper from Telstra – I mean come on down Alan Bond,” he said.

    “Kerry Packer would be laughing all the way to the bank if he found a mug willing to buy Telstra’s copper network.”

    Costs dispute

    He rejected Coalition claims reported in the Daily Telegraph that the NBN costs could double.

    The finance industry modelling is included in the Coalition’s broadband policy and warns the NBN could take four years longer to build and cost more than $90 billion.

    “Claims about cost blowouts have not been substantiated,” Senator Conroy said.

    “These are false and fanciful figures; they’re concocted figures.

    “Malcolm Turnbull is becoming the king of telling a lie using a fact.”

    Regional Development Minister Anthony Albanese said the Coalition’s plan was a “policy disaster for regional Australia”.

    He said it abandoned the NBN’s commitment to provide equal access across the country which would make “an enormous difference to people’s lives”.

    “This policy announcement today gets rid of the consistent wholesale price,” he said.

    “What that ensures is whether you live in a regional town or whether you live in the capital city CBD, you have access to the same services at the same fundamental price – that’s a foundation of (NBN) policy.

    “The great difficulty in a nation such as ours has been a relatively sparse population spread across a very vast land; the National Broadband Network is the transport mode of the 21st century.”
    In an apparent endorsement of earlier details carried in the Daily Telegraph, the Opposition Leader’s office had distributed copies of the story to Press Gallery journalists, though it had refused to release any official information prior to the launch.

  • Rapid Climate Change and the Role of the Southern Ocean

    Apr. 8, 2013 — Scientists from Cardiff University and the University of Barcelona have discovered new clues about past rapid climate change.

    The research, published this month in the journal Nature Geoscience, concludes that oceanographic reorganisations and biological processes are linked to the supply of airborne dust in the Southern Ocean, and that this connection played a key role in past rapid fluctuations of atmospheric carbon dioxide levels, an important component of the climate system.

    The scientists studied a marine sediment core from the Southern Ocean and reconstructed chemical signatures at different water depths using stable isotope ratios in the shells of foraminifera, single-celled marine organisms. They found that the chemical difference between intermediate-level and deep waters over the last 300,000 years closely resembled the changes in atmospheric carbon dioxide levels and the input of windblown dust.
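
    The comparison described here, between the intermediate-to-deep chemical gradient, atmospheric CO2 and windblown dust, is at heart a correlation of time series sampled at common ages. A minimal sketch of that style of analysis, with made-up arrays standing in for the actual core measurements (which only the paper itself provides):

        import numpy as np

        # Hypothetical stand-ins for the real records: each series is sampled
        # at the same age points over the last 300,000 years.
        rng = np.random.default_rng(0)
        ages_ka = np.linspace(0, 300, 61)            # age, in thousands of years
        gradient = np.sin(ages_ka / 15) + 0.1 * rng.standard_normal(61)  # isotope gradient
        co2_ppm = 230 + 40 * np.sin(ages_ka / 15)    # CO2 record (made up)
        dust_flux = 1.0 - 0.4 * np.sin(ages_ka / 15) # dust input (made up)

        # Pearson correlations: a strong positive or negative r is the kind of
        # "close resemblance" between records that the study reports.
        r_co2 = np.corrcoef(gradient, co2_ppm)[0, 1]
        r_dust = np.corrcoef(gradient, dust_flux)[0, 1]
        print(f"gradient vs CO2:  r = {r_co2:+.2f}")
        print(f"gradient vs dust: r = {r_dust:+.2f}")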

    Dr Martin Ziegler, School of Earth and Ocean Sciences, explained: “The deep ocean is by far the largest pool of available carbon on short timescales. In the Southern Ocean, water from the deep rises to the sea surface and comes in contact with the atmosphere. These waters will release their carbon to the atmosphere unless marine phytoplankton captures this carbon through photosynthesis and transports it back into the deep when it dies and sinks. The efficiency of this biological activity in the Southern Ocean is thought to depend on the input of nutrients, namely iron, contained in windblown dust. It is also this efficiency that determines the strength of chemical stratification in the Southern Ocean.”

    Professor Ian Hall, School of Earth and Ocean Sciences, added: “Our study finds large changes in chemical stratification of the Southern Ocean not only across the shifts from ice ages to warm interglacial conditions, but also on more rapid, millennial timescales. However, changes in dust flux on these short timescales are much smaller. This could suggest that the biological response to a change in dust input is much more sensitive when the dust flux is relatively low, as it is today. This iron fertilization process might therefore be more important than previously thought.”

    These findings provide an important benchmark for climate modeling studies and more research will be needed to determine the significance and impact of future changes in dust input into the Southern Ocean.

    The research was supported by the Natural Environment Research Council and is part of the international Gateways training network, funded by the 7th Framework Programme of the European Union.

  • Oceans turn up the AC on global warming

    The world’s oceans acted like giant air conditioners during the last decade, according to a new study. By absorbing heat from the air, the oceans played a major role in thwarting a projected rise in global temperatures from 2000 to 2010.

    With the high seas serving as reservoirs of warmth, though, it’s possible ocean currents could eventually return the heat to the atmosphere, researcher Virginie Guemas told Reuters. Guemas is the lead author of the study, published on the website of Nature Climate Change on Sunday.

    The researchers credit natural cycles of ocean currents beginning around the turn of the century, such as a La Niña event in the Pacific Ocean, with bringing cool seawater to the surface and absorbing excess heat from the atmosphere. Ocean heat uptake is one of several factors, along with sun cycles and water vapor in the stratosphere, that scientists cite for a recent slowdown in global warming.

    According to NASA data, 10 of the hottest years since global temperature records began in 1880 have occurred since 1998. Yet, the rise in temperature has been less extreme than climate models predicted: In spite of a 58 percent rise in greenhouse gas emissions around the world since 1990, the global average temperature has changed little in the past decade.

    According to an unpublished United Nations report leaked late last year, the climate forecast models the UN has relied upon since 1990 have consistently overestimated global warming by two to five times the actual observed rate.

    The UN uses climate predictions to press for an international treaty to slow the release of carbon dioxide into the atmosphere. The first such treaty, the Kyoto Protocol, expired in 2012 and failed to contain emissions because many developed nations broke their promises to cut back on carbon. Kyoto never placed limits on the emissions of developing nations, such as China and India, where industry—and pollution—has risen rapidly.

    At the most recent UN climate change conference, held in Qatar last December, a few developed nations (not including the United States, Japan, or Canada) agreed to extend their Kyoto carbon cuts for a few more years. The UN hopes to convince world powers to sign another emissions-reduction treaty by 2015. If past negotiations are any indication, it has little chance of success.

    The Nature Climate Change paper follows on the heels of another study, published last month in Geophysical Research Letters, concluding that about 30 percent of ocean heat uptake since 1998 has occurred in deeper waters, below about 2,300 feet. The authors said the recent rates of deep-water warming “appear to be unprecedented.”