Author: Neville

  • NSW Parliament set to debate voluntary euthanasia legislation

    AAP
    May 02, 2013 9:27AM

    Greens MP Cate Faehrmann at Hyde Park in the city. Picture: Rohan Kelly Source: NewsLocal

    HIGH-profile supporters of voluntary euthanasia are expected to attend NSW Parliament when the legislation is debated in the state’s Upper House.

    Greens MP Cate Faehrmann says she will introduce a bill on Thursday giving terminally ill people the right to die.

    She believes many members of pro-euthanasia lobby groups and some prominent individuals, including former NSW attorney general John Dowd, former NSW Director of Public Prosecutions Nick Cowdery and former NT chief minister Marshall Perron, the architect of Australia’s original right-to-die laws, will come to watch the debate.

    Members of the NSW government and opposition are expected to get a conscience vote on the issue, making it more likely the laws will pass.

    However, Ms Faehrmann said that while MPs were telling her they supported the right of the terminally ill to die with dignity, they were baulking at enshrining that right in law because of concerns the laws could be abused.

    She said she hoped to convince them that adequate safeguards existed in her bill, and if she succeeded she was hopeful the legislation would pass after “a couple of weeks’ worth of talking to people about it”.

    “It’s going to be a very tight fight, it’s going to be a very hard fight, but it’s time we did it and I think it’s time that MPs listen to the community about it,” she told the ABC.

    Eighty-five per cent of Australians supported terminally ill people having the right to ask for assistance to die, Ms Faehrmann said.

    Ms Faehrmann launched a photo book and online video in March promoting voluntary euthanasia as part of her campaign for the Rights of the Terminally Ill Bill.

  • Extinction: just how bad is it and why should we care?

    2 May 2013, 6.11am AEST

    Author

    Euan Ritchie

    Lecturer in Ecology at Deakin University

    Disclosure Statement

    Euan Ritchie does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.

    Deakin University provides funding as a member of The Conversation.
    deakin.edu.au

    The passing of Lonesome George, the last Pinta Island Tortoise, is emblematic of the mass extinction of species the Earth is currently experiencing. Flickr/A Davey

    “Dad, the world is missing amazing animals. I wish extinction wasn’t forever”.

    Despite my wife and I working as biologists, our five-year-old son came to make this statement independently.

    He is highlighting what I and many others consider to be society’s biggest challenge, and arguably failure: the continuing loss of species from Earth. The massive impact we are having on the planet has firmly entrenched us in a period of our history commonly called the Anthropocene.

    The environment was front and centre of public consciousness and a key election focus in Australia in 2007, but following the global financial crisis and continuing economic uncertainty, we seem to care less and less about the environment and more and more about budgets and surpluses.

    If the environment were a bank and species its money, it would need a rescue package that would make the recent European bail-outs look insignificant.

    The state of extinction

    We still have little idea of how many species exist on Earth. Only a fraction (~1.5 million of an estimated 5 million) have been formally described, and even fewer assessed for their conservation status. How do we conserve what we don’t know exists?

    If Earth were a house, it would be as though we had listed the contents of only one room, and even then were not aware of their true value, while simultaneously the house was being demolished.

    It is important to note that extinction – the permanent loss of species – is a natural process that is counterpoint to speciation, the creation of new species through evolution.

    Background or “normal” rates of extinction vary through time but are typically in the order of one to two species per year. Current rates of extinction, however, are estimated to have reached 1,000 to 10,000 times this rate. Put bluntly, the annual species body count is no longer a mere handful: it’s an avalanche.

    If you want to see a Japanese river otter, you’ll have to visit the museum. Hamura Municipal Zoo, Tokyo/Wikimedia Commons.

    There have been at least five episodes of mass extinction in the past, during which anywhere from 60 to 96% of existing species became extinct. Indeed, 99% of all species that have ever existed are now extinct.

    Volcanic eruptions and asteroid impacts are among the prime suspects as the cause of previous mass extinctions – including the oft-cited demise of the dinosaurs. Yes, extinctions, even mass extinctions, are not unprecedented. The difference this time is that humanity is the cause of Earth’s sixth mass extinction event, through such anthropogenic impacts as habitat loss and modification, the spread of invasive species and climate change.

    Farewelling species

    Some 875 species have been recorded as declining to extinction between 1500 and 2009 which, the observant will note, is entirely consistent with a background extinction rate of 1-2 species per year. What, then, are the grounds for supposing that the current rate of extinction actually exceeds this value by such a huge margin?

    The key phrase is “have been recorded”. As already discussed, the majority of species have not been identified or described. A reasonable supposition is that unrecognised species are lost at a rate comparable with that of known ones.

    We now also have reasonable estimates of species diversity in particular habitats, such as insects in tropical forests. Our measures of the proportion of such habitats that have been destroyed therefore provide a good basis for estimating species loss. If these estimates are right, we are now living through a period where the rate of extinction is 1,000 to 10,000 times the background rate.
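    To make that arithmetic concrete, here is a minimal sketch in Python using only the figures quoted above. The scaling step for undescribed species is an illustrative assumption drawn from the reasoning in the text, not a published calculation.

    ```python
    # Back-of-the-envelope extinction-rate arithmetic, using figures quoted
    # in the article. Purely illustrative; not a published methodology.

    RECORDED_EXTINCTIONS = 875        # species recorded extinct, 1500-2009
    YEARS = 2009 - 1500               # observation window

    DESCRIBED_SPECIES = 1.5e6         # species formally described (approx.)
    ESTIMATED_SPECIES = 5e6           # estimated total species (approx.)

    BACKGROUND_RATE = (1, 2)          # "normal" extinctions per year
    MULTIPLIER = (1_000, 10_000)      # estimated elevation over background

    # Recorded extinctions alone look deceptively close to background.
    recorded_rate = RECORDED_EXTINCTIONS / YEARS
    print(f"Recorded rate: {recorded_rate:.2f} species/year")

    # If unrecognised species vanish at a comparable per-species rate,
    # the true tally scales with the fraction of species we have described.
    scaled_rate = recorded_rate * ESTIMATED_SPECIES / DESCRIBED_SPECIES
    print(f"Scaled for undescribed species: {scaled_rate:.1f} species/year")

    # Habitat-loss (species-area) estimates imply far higher totals.
    low = BACKGROUND_RATE[0] * MULTIPLIER[0]
    high = BACKGROUND_RATE[1] * MULTIPLIER[1]
    print(f"Implied current losses: {low:,} to {high:,} species/year")
    ```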

    Delving deeper, the IUCN Red List of Threatened Species notes that 36% of the 47,677 species assessed are threatened with extinction, which represents 21% of mammals, 30% of amphibians, 12% of birds, 28% of reptiles, 37% of freshwater fishes, 70% of plants, and 35% of invertebrates.

    More recently we have bid farewell to species such as the Baiji Dolphin, the Alaotra Grebe and the Japanese River Otter. And who could forget the passing of “Lonesome George”, the last individual Pinta Island Tortoise, who died on 24 June 2012? Closer to home, our most recent casualty was a small bat, the Christmas Island Pipistrelle.

    There is one brighter note: a recent study by Fisher and Blomberg has shown that depending on species’ characteristics and other factors such as the places where they occur, remnant populations of some species may still turn up.

    But an exclusive focus on extinction is inappropriate anyway, given that many surviving species are hanging on only by the barest of threads. The dire situation of Australia’s marsupials is stark evidence of this. Even iconic and once abundant species such as the Tasmanian Devil are now on the brink of oblivion.

    Many species listed as critically endangered, like this leaf-scaled sea snake, are close to or already extinct. Hal Cogger.

    Deep in debt

    A further sobering thought is encompassed in the concept of “extinction debt”. Recent studies in Europe have demonstrated that the species currently at highest risk of extinction most likely got that way because of human actions 50 to 100 years ago.

    I’m sure many of us have driven on an Australian country road, admiring the grand old eucalypts that stand alone in the nearby paddocks – remnants of the pre-agricultural landscape. But you may also have noticed that under the big trees there are often no little trees. Hence, when the big trees die, as they inevitably will, there will be nothing to replace them.

    If we want to avert the extinctions our past actions have already set in train, we will need to direct conservation efforts towards the areas carrying the highest debts.

    At our own peril

    But why should it matter to us if we have a few less species? The simple answer is that we are connected to and deeply dependent on other species. From pollination of our crops by bees, to carbon storage by our forests, and even the bacteria in our mouths, we rely upon biodiversity for our very existence. We neglect this at our own peril. And of course there are equally justified arguments for keeping species based purely on their aesthetic and cultural importance, or for their own sake.

    Doom-and-gloom predictions tend to paralyse us, rather than jolting us into action. So what can be done? There are wonderful examples of individuals and organisations working at both small and large scales to tackle and even sometimes turn back the tide of extinctions.

    There are also some compelling personal approaches, such as that of Alejandro Frid who is writing a series of letters to his daughter as a way of confronting the issues of climate change and biodiversity loss. But what is urgently needed, of course, is radical change in society as a whole in the way it interacts with its environment.

    Until then, my fellow ecologists and I must continue to work hard to sell our message and spread awareness of society’s biggest challenge.

  • Finding a Sensible Balance for Natural Hazard Mitigation With Mathematical Models

    Apr. 30, 2013 — Uncertainty issues are paramount in the assessment of risks posed by natural hazards and in developing strategies to alleviate their consequences.

    In a paper published last month in the SIAM/ASA Journal on Uncertainty Quantification, the father-son team of Jerome and Seth Stein describe a model that estimates the balance between costs and benefits of mitigation — efforts to reduce losses by taking action now to reduce consequences later — following natural disasters, as well as rebuilding defenses in their aftermath. Using the 2011 Tohoku earthquake in Japan as an example, the authors help answer questions regarding the kinds of strategies to employ against such rare events.

    “Science tells us a lot about the natural processes that cause hazards, but not everything,” says Seth Stein. “Meteorologists are steadily improving forecasts of the tracks of hurricanes, but forecasting their strength is harder. We know a reasonable amount about why and where earthquakes will happen, some about how big they will be, but much less about when they will happen. This situation is like playing the card game ’21’, in which players see only some of the dealer’s cards. It is actually even harder, because we do not fully understand the rules of the game, and are trying to figure them out while playing it.”

    How much mitigation is needed? The bottom of a U-shaped curve is a “sweet spot” — a sensible balance. Photo Credit: Jerome Stein and Seth Stein

    Earthquake cycles — triggered by movement of the Earth’s tectonic plates and the resulting stress and strain at plate boundaries — are irregular in time and space, making it hard to predict the timing and magnitude of earthquakes and tsunamis. Hence, forecasting the probabilities of future rare events presents “deep uncertainty,” Stein says. “Deep uncertainties arise when the probabilities of outcomes are poorly known, unknown, or unknowable. In such situations, past events may give little insight into future ones.”

    Another conundrum for authorities in such crisis situations is the appropriate amount of resources to direct toward a disaster zone. “Much of the problem comes from the fact that formulating effective natural hazard policy involves using a complicated combination of geoscience, mathematics, and economics to analyze the problem and explore the costs and benefits of different options. In general, mitigation policies are chosen without this kind of analysis,” says Stein. “The challenge is deciding how much mitigation is enough. Although our first instinct might be to protect ourselves as well as possible, resources used for hazard mitigation are not available for other needs. For example, does it make sense to spend billions of dollars building buildings in the central U.S. to the same level of earthquake resistance as in California, or would these funds do more good if used otherwise?”

    The Japanese earthquake and tsunami in 2011 toppled seawalls 5-10 meters high. The seawalls being rebuilt are about 12 meters high and are intended to protect against the large tsunamis that recur every few hundred years. But critics argue that it would be more cost-effective to focus on relocation and evacuation strategies for populations that may be affected by such tsunamis rather than on building higher seawalls, especially in areas where the population is small and dwindling.

    In this paper, Stein says, the authors set out to “find the amount of mitigation — which could be the height of a seawall or the earthquake resistance of buildings — that is best for society.” The objective is to provide methods for authorities to use their limited resources in the best possible way in the face of uncertainty.

    Selecting an optimum strategy, however, depends on estimating the expected value of damage. This, in turn, requires prediction of the probability of disasters.

    It is still unknown whether the probability of a large earthquake on a fault is constant with time (as routinely assumed in hazard planning) or whether it drops after the last event and then increases with time. Hence, the authors incorporate both scenarios using the general probability model of drawing balls from an urn. If an urn contains balls labeled “E” for event and “N” for no event, each year is like drawing a ball. “If after drawing a ball, we replace it, the probability of an event stays constant. Thus an event is never ‘overdue’ because one has not happened recently, and the fact that one happened recently does not make another less likely,” explains Stein. “In contrast, we can add E-balls after a draw when an event does not occur, and remove E-balls when an event occurs. This makes the probability of an event increase with time until one happens, after which it decreases and then grows again.”
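    The urn scheme is easy to simulate. Below is a minimal sketch of the two cases Stein describes: sampling with replacement (a constant, memoryless hazard) versus adding E-balls in quiet years and removing them after an event (a hazard that builds and resets). The parameter values are arbitrary assumptions for illustration; the paper ties them to strain accumulation and release.

    ```python
    import random

    def simulate(years, e_balls, n_balls, add_on_quiet=0, remove_on_event=0):
        """Draw one 'ball' per year. With add_on_quiet == remove_on_event == 0
        this is sampling with replacement: a constant-probability process.
        Otherwise quiet years add E-balls (hazard grows with time) and an
        event removes E-balls (hazard drops, then grows again)."""
        events, e = [], e_balls
        for year in range(years):
            if random.random() < e / (e + n_balls):
                events.append(year)
                e = max(1, e - remove_on_event)  # event releases accumulated 'strain'
            else:
                e += add_on_quiet                # quiet year accumulates it
        return events

    random.seed(42)
    # Case 1: constant hazard -- event probability never changes.
    constant = simulate(years=1000, e_balls=2, n_balls=198)
    # Case 2: time-dependent hazard -- builds between events, resets after one.
    cyclic = simulate(years=1000, e_balls=2, n_balls=198,
                      add_on_quiet=1, remove_on_event=50)
    print(f"Constant hazard: {len(constant)} events in 1000 years")
    print(f"Time-dependent hazard: {len(cyclic)} events in 1000 years")
    ```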

    Since the likelihood of future earthquakes depends on strain accumulation at plate boundaries, the model incorporates parameters for how fast strain accumulates between earthquakes and how much strain is released when an earthquake occurs.

    The authors select the optimal mitigation strategy by using a general stochastic model, a method for estimating the probability of outcomes in different situations when data are limited. They minimize the expected present value of damage, the costs of mitigation, and the risk premium, which reflects the variance, or inconsistency, of the hazard. The optimal mitigation sits at the bottom of a U-shaped curve summing the cost of mitigation and the expected losses: a sensible balance.
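    As a rough illustration of that U-shaped trade-off, the sketch below sums a rising mitigation cost and a falling expected loss and finds the minimum. The two cost functions are invented stand-ins chosen only to produce the shape; the paper derives its own from the hazard model and risk premium.

    ```python
    import numpy as np

    # Mitigation effort, e.g. seawall height or building resistance (arbitrary units).
    effort = np.linspace(0.0, 10.0, 1001)

    # Invented stand-in cost functions, chosen only to produce the U-shape:
    mitigation_cost = 5.0 * effort                  # spending rises with effort
    expected_loss = 200.0 * np.exp(-0.5 * effort)   # expected damage falls with effort
    total_cost = mitigation_cost + expected_loss    # U-shaped sum

    # The "sweet spot" is the minimum of the total-cost curve.
    optimum = effort[np.argmin(total_cost)]
    print(f"Optimal mitigation effort: {optimum:.2f}")
    print(f"Total cost at optimum: {total_cost.min():.1f}")
    ```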

    To determine the advantages and pitfalls of rebuilding after such disasters, the authors present a deterministic model. Here, outcomes are precisely determined by taking into account relationships between states and events. The authors use this model to determine if Japan should invest in nuclear power plant construction given the Fukushima Daiichi nuclear reactor meltdown during the 2011 tsunami. Taking into account the financial and societal benefits of reactors, and balancing them against risks — both financial and natural — the model determines the preferred outcome.

    Such models can also be applied toward other disaster situations, such as hurricanes and floods, and toward policies to diminish the effects of climate change. Stein gives an example: “Given the damage to New York City by the storm surge from Hurricane Sandy, options under consideration range from doing nothing, using intermediate strategies like providing doors to keep water out of vulnerable tunnels, to building up coastlines or installing barriers to keep the storm surge out of rivers. In this case, a major uncertainty is the effect of climate change, which is expected to make flooding worse because of the rise of sea levels and higher ferocity and frequency of major storms. Although the magnitude of these effects is uncertain, this formulation can be used to develop strategies by exploring the range of possible effects.”

  • Exploring the Saltiness of the Ocean to Study Climate Change

    Apr. 30, 2013 — Details are emerging from a recent research expedition to the Sub-Tropical North Atlantic. The objective of the expedition was to study the salt concentration (salinity) of the upper ocean. Scientists aboard the Spanish research vessel Sarmiento de Gamboa, including National University of Ireland Galway’s Dr Brian Ward with two of his PhD students, Graig Sutherland and Anneke ten Doeschate, explored the essential role of the ocean in the global water cycle.

    This oceanographic research campaign is aimed at understanding the salinity of the upper ocean, which is a much more reliable indicator of the water cycle than any land-based measurement. How the water cycle evolves in response to global warming is one of the most important climate change issues.

    The experiment was located in the North Atlantic Salinity Maximum, which has the highest salt concentration of any of the world’s oceans. Dr Ward explains: “It is not the depths of the ocean which is its most important aspect, but its surface. Everything that gets exchanged between the ocean and atmosphere, such as water, must cross the air-sea interface. We are trying to better understand how small scale turbulence is responsible for the air-sea exchange of freshwater. What is surprising is that these small-scale processes can affect large-scale patterns over the North Atlantic, and we are trying to connect the dots.”

    The initial part of this ocean field campaign was to conduct a survey of the area to map out the horizontal and vertical distribution of salinity using an instrument towed behind the ship. “We found quite a lot of fresher water intermingled with the background salty water, but it is moving around quite a bit due to ocean currents, and when we returned to the fresh patch, it had moved. We are currently hunting for this freshwater, as one of the objectives is to understand the spatial inhomogeneity of the upper ocean salinity”, explains Dr Ward.

    Studying the processes at the ocean surface requires specialised instrumentation, as most measurements ‘miss’ the upper few metres. The National University of Ireland Galway’s AirSea Group are measuring the salinity, temperature, and turbulence of the upper 10 metres of the ocean in very fine detail using their Air-Sea Interaction Profiler (ASIP). The torpedo-shaped device, which is deployed into the water to gather data autonomously, is the only one of its kind.

    Dr Ward explains: “The ocean surface has been the focus of my research for several years, but there was no easy way to measure what is going on here as there were no instruments available, so we built our own.” The ability to make these unique measurements has resulted in international recognition for the research being conducted at National University of Ireland Galway.

    Dr Ward’s research group is the AirSea Laboratory, which is affiliated with the Ryan Institute and resides in the School of Physics at the National University of Ireland Galway. The main objective of the AirSea Laboratory is to study the upper-ocean and lower-atmosphere processes responsible for atmosphere-ocean exchange. This experiment is concerned with the air-sea exchange of water, but other studies the AirSea Laboratory has been involved with have looked at how carbon dioxide, a major greenhouse gas, is transported between the air and sea.

    Dr Ward explains: “The ocean and atmosphere are a coupled system and therefore need to be studied in unison. A major part of our research is to determine how this system affects and is affected by climate and environmental change.”

    This Irish and Spanish collaboration is part of a bigger international effort called SPURS – Salinity Processes in the Upper Ocean Regional Study. There was also an American research ship in the area participating in the SPURS study, and the Spanish ship was visited by Dr Ray Schmitt from the Woods Hole Oceanographic Institution (WHOI).

    Dr Ward collaborates extensively with the WHOI scientists: “The WHOI scientists have autonomous gliders with microsensors attached, similar to our ASIP. During our measurements, they directed their gliders to the same area as ASIP, and we provided them with data to ground-truth their measurements. This was an excellent opportunity to enhance our links with WHOI, who are the largest oceanographic research institution in the USA.”

    One of the biggest motivators for SPURS was the recent launch of two satellites for measuring ocean salinity: the European Space Agency’s Soil Moisture Ocean Salinity (SMOS), and NASA’s Aquarius mission. Dr Ward explains: “It is envisioned that with the combination of the in-situ measurements, satellites, and computer models, we can improve our estimates of global climate change and the water cycle. These data will also be used to improve weather forecasting, and we worked with the European Centre for Medium Range Weather Forecasting during this field experiment.”

    The research vessel left the Canary Islands on 16 March and completed its journey in the Azores on 13 April, during which time the vessel was home to 19 scientists, 6 technicians and 18 crew members.

  • Gillard backflip on disability levy would come at political cost

    Swan today will say the budget will put in place measures to permanently improve the bottom line. AAP/Lukas Coch.

    The levy being planned by the government to help pay for the national disability insurance scheme is not in itself a bad option.

    It wasn’t the preferred choice of the Productivity Commission, which thought it less efficient than funding out of consolidated revenue.

    But if the community wants desirable and costly plans – and this is both – they must be financed by spending cuts or higher taxes, and the government needs some of each in its difficult budget circumstances.

    There is, however, a political problem with this particular levy (which would take the form of an addition to the Medicare one).

    Gillard has previously flatly ruled it out.

    The idea came up at a Lodge dinner she had with premiers last July. Federal Labor’s bête noire, Queensland premier Campbell Newman, floated it, but Gillard was adamant.

    “We will make the appropriate [funding] arrangements out of the Commonwealth’s budget without a new income tax”, she said publicly.

    Newman tweeted at the time, “PM did fail to seize unique opportunity to fully fund NDIS on Tuesday night with support of ALL premiers. I am still asking why!!??”.

    In view of what is now being canvassed, why indeed? Presumably one reason was because the government still believed it could get back to surplus, difficult as that was already starting to seem.

    Perhaps Gillard should have been more circumspect in December, when she said, “I have in the past ruled out a levy and I will do it again now”. Later that month the surplus commitment was dumped.

    That was then. Yesterday Finance Minister Penny Wong said the government was considering “a number of funding options” and “obviously a levy is something stakeholders have been calling for in the last 24 hours”.

    Even if a levy won public support – the NDIS is very popular – the backflip would feed into the Opposition’s storyline that Gillard’s word can’t be trusted. A “broken promise” on another tax would be an unfortunate bookend to the broken promise on the carbon tax.

    The levy debate, however, is also raising doubts about how a Coalition government would handle the hugely expensive NDIS, to which it has also committed.

    Tony Abbott has always been a stronger supporter than shadow treasurer Joe Hockey, who has worried about the cost. Asked yesterday whether the scheme was still affordable, Hockey said, “Well, it depends on what the state of the budget is”.

    Abbott said: “We will fund the national disability insurance scheme over the long run by building a strong economy.” Asked whether the Coalition was still completely committed to delivering the NDIS in the same time frame the government had put forward, he parried the question. The Opposition appears to be leaving itself wriggle room on timing.

    Abbott will have the choice of accepting a levy to help the Opposition’s bottom line or opposing it and saying he’d make room for the NDIS by savings, or if necessary would string out its implementation.

    On the ABC last night former treasurer Peter Costello, with the luxury of not being a politician seeking votes, gave both sides a lecture. The proposed big spending on Gonski and the NDIS should not be undertaken in the present budgetary situation, he said, and also made it clear that neither should Abbott be pursuing his very generous paid parental leave scheme or the Coalition’s costly direct action climate plan.

    “The easiest cut you will make is the stuff you never go into,” Costello observed.

    Treasurer Wayne Swan today will continue to hammer home the pre-budget message that the government is on the fiscal warpath.

    He will tell a CEDA function: “In this budget, we’ll announce further measures to build on the very substantial structural savings we’ve already put in place to address long-term fiscal pressures by permanently improving the bottom line”.

    He will say the government’s expenditure as a percentage of the economy over the forward estimates is set to come in at less than the average of the 30 years prior to the Labor government.

    Swan says that Costello got a $334 billion revenue windfall but failed to invest in the nation’s future. “I’ve copped $160 billion in revenue downgrades”. The Coalition had left an “unsustainable” surplus.

    Swan will also put pressure on Abbott to spell out his plans in his budget reply, which he makes on May 16.

    “He will know the full extent of the revenue write downs which add to his $70 billion budget crater to give him his starting point.

    “Australians rightly expect Mr Abbott to outline the alternate choices he would make to return the budget to surplus while he funds his promises”.

    Swan admits he has “lost some political paint” from saying the budget would remain in deficit longer than previously forecast. “But I’m happy to wear it – because it’s the right decision to support jobs and growth”. He says that if the government had the same tax-to-GDP ratio in 2012-13 as the Howard government had in 2007-08 – equal to 23.7% – its revenues in that year alone would be about $34 billion higher.

    “So if our level of tax receipts was as high as Howard and Costello had, we would have a budget surplus in 2012-13”.
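    For readers who want to check those numbers, the sketch below backs out the implied 2012-13 tax-to-GDP ratio from Swan’s figures. The GDP value is an outside assumption (roughly $1.5 trillion in 2012-13), not a figure from the article.

    ```python
    # Back-of-the-envelope check of Swan's tax-to-GDP claim. The GDP figure
    # is an assumption, not from the article; the other figures are as quoted.

    HOWARD_RATIO = 0.237     # tax-to-GDP in 2007-08, as quoted
    REVENUE_GAP = 34e9       # claimed revenue shortfall, dollars
    GDP_2012_13 = 1.5e12     # assumed nominal GDP for 2012-13, dollars

    implied_ratio = HOWARD_RATIO - REVENUE_GAP / GDP_2012_13
    print(f"Implied 2012-13 tax-to-GDP ratio: {implied_ratio:.1%}")
    # -> roughly 21.4%, about 2.3 percentage points below the 2007-08 ratio
    ```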

  • Global carbon dioxide levels set to pass 400ppm milestone

    The concentration of carbon dioxide in the atmosphere over the next few days is expected to hit record levels

    John Vidal

    The Guardian, Monday 29 April 2013 20.32 BST

    Hawaii’s Mauna Loa observatory, where record CO2 increases are being documented. Photograph: Richard Vogel/AP

    The concentration of carbon dioxide in the atmosphere has reached 399.72 parts per million (ppm) and is likely to pass the symbolically important 400ppm level for the first time in the next few days.

    Readings at the US government’s Earth System Research Laboratory in Hawaii are not expected to reach their 2013 peak until mid-May, but a daily average of 399.72ppm was recorded on 25 April. The weekly average stood at 398.5ppm on Monday.

    Hourly readings above 400ppm have been recorded six times in the last week, and on occasion at observatories in the high Arctic. But the Mauna Loa station, sited at 3,400m in the Pacific Ocean and far from major pollution sources, has been monitoring levels for more than 50 years and is considered the gold standard.

    “I wish it weren’t true but it looks like the world is going to blow through the 400ppm level without losing a beat. At this pace we’ll hit 450ppm within a few decades,” said Ralph Keeling, a geologist with the Scripps Institution of Oceanography, which operates the Hawaiian observatory.

    “Each year, the concentration of CO2 at Mauna Loa rises and falls in a sawtooth fashion, with the next year higher than the year before. The peak of the sawtooth typically comes in May. If CO2 levels don’t top 400ppm in May 2013, they almost certainly will next year,” Keeling said.

    Atmospheric CO2 levels have been rising steadily for 200 years, registering around 280ppm at the start of the industrial revolution and 316ppm in 1958, when the Mauna Loa observatory started measurements. The growth in the global burning of fossil fuels is the primary cause of the increase.
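    The quoted figures are enough for a rough trend check. The sketch below computes the average growth rate since 1958 and, under a crude linear assumption of roughly 2ppm per year recently, projects when 450ppm might arrive. It is illustrative arithmetic only, not a climate model.

    ```python
    # Rough trend arithmetic using only the figures quoted in the article.
    # Linear extrapolation is a crude assumption; real CO2 growth is
    # accelerating, so 450ppm would likely arrive sooner.

    ppm_1958, ppm_2013 = 316.0, 399.72       # Mauna Loa record endpoints
    avg_growth = (ppm_2013 - ppm_1958) / (2013 - 1958)
    print(f"Average growth since 1958: {avg_growth:.2f} ppm/year")

    # Recent growth is faster (roughly 2 ppm/year, an assumed figure);
    # project the climb from 400ppm to 450ppm at that rate.
    recent_growth = 2.0
    years_to_450 = (450.0 - 400.0) / recent_growth
    print(f"At ~{recent_growth} ppm/year, 450ppm in about {years_to_450:.0f} years")
    ```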

    The approaching record level comes as countries resumed deadlocked UN climate talks in Bonn. No global agreement to reduce emissions is expected to be reached until 2015.

    “The 400ppm threshold is a sobering milestone, and should serve as a wake-up call for all of us to support clean energy technology and reduce emissions of greenhouse gases, before it’s too late for our children and grandchildren,” said Tim Lueker, an oceanographer and carbon cycle researcher with the Scripps CO2 Group.

    The last time CO2 levels were so high was probably in the Pliocene epoch, between 3.2m and 5m years ago, when Earth’s climate was much warmer than today.