
  • Ford jumps EV battery gap with $8m research lab

    17 October 2013

    CleanTechnica

    Ford is one of the main funders behind a new lab that will help EV battery developers leap over the dreaded “Valley of Death” that lies between basic research and full-fledged manufacturing. The new $8 million facility is the first to focus specifically on manufacturing batteries on a pilot scale for testing.

    Until now, developers had to wait until they had a production-ready model before testing and validation could begin; the new lab enables researchers to catch flaws and spot areas for improvement much earlier in the process.

    A Unique New EV Battery Lab

    The lab, which just opened on Monday, is located on the University of Michigan campus, adding another element to the 60-year relationship between Ford and the institution.

    Ford pitched in $2.1 million of the $8 million for the lab. That’s just a drop in the bucket relative to the company’s reported $135 million investment in EV battery projects over the past 20 years, but it forges a critical link in the lab-to-road chain. Ted Miller, manager for Ford’s battery research, explains:

    We have battery labs that test and validate production-ready batteries, but that is too late in the development process for us to get our first look. This lab will give us a stepping-stone between the research lab and the production environment, and a chance to have input much earlier in the development process. This is sorely needed, and no one else in the auto industry has anything like it.

    Specifically, Miller notes that the new lab will enable the more rapid development of new, better, and cheaper battery chemistries.

    According to Miller, the industry has already seen a short, 15-year turnaround from lead-acid to nickel-metal-hydride and finally to lithium-ion. Lithium-ion is still the gold standard, but that could change soon if some of the up-and-coming research pans out.

    Some of the emerging technologies we’ve covered here and on our sister site Gas2.org include lithium-air batteries, flow batteries, and zinc-air batteries.

    We Built This New EV Battery!

    Group hug, taxpayers: in addition to support from Ford and the University of Michigan, the new lab received funding from the U.S. Department of Energy.

    That’s just part of the “mine, yours and ours” investment in advanced EV battery research under the Obama Administration. Aside from millions of dollars in federal funding for dozens of individual projects, such as those grouped under RANGE (Robust Affordable Next Generation Energy Storage Systems), there are a couple of collaborative initiatives worth noting.

    New this year is AMPED, for Advanced Management and Protection of Energy Storage Devices. Ford is also a partner in this initiative, which focuses partly on improving energy storage in existing technology.

    Other partners include the National Renewable Energy Laboratory, Eaton Corporation, Washington University, and Utah State University.

    Another new collaboration is the massive JCESR (Joint Center for Energy Storage Research) project, which launched last year as part of a new national network of technology innovation hubs.

    Aside from federal and state funding, JCESR partners include numerous federal laboratories, Northwestern University, University of Chicago, University of Illinois-Chicago, University of Illinois-Urbana Champaign, and University of Michigan along with Dow Chemical Company, Applied Materials Inc., Johnson Controls Inc., and Clean Energy Trust.

    Read more at http://cleantechnica.com/2013/10/16/ford-jumps-ev-battery-gap-new-8-million-research-lab/

  • Sea Level Rise In The 5th IPCC Report


    By Stefan Rahmstorf

    15 October, 2013
    Realclimate.org

    What is happening to sea levels? That was perhaps the most controversial issue in the 4th IPCC report of 2007. The new report of the Intergovernmental Panel on Climate Change is out now, and here I will discuss what IPCC has to say about sea-level rise (as I did here after the 4th report).

    Let us jump straight in with the following graph, which nicely sums up the key findings about past and future sea-level rise: (1) global sea level is rising, (2) this rise has accelerated since pre-industrial times and (3) it will accelerate further in this century. The projections for the future are much higher and more credible than those in the 4th report but possibly still a bit conservative, as we will discuss in more detail below. For high emissions, IPCC now predicts a global rise of 52-98 cm by the year 2100, which would threaten the survival of coastal cities and entire island nations. But even with aggressive emissions reductions, a rise of 28-61 cm is predicted. Even under this highly optimistic scenario we might see over half a meter of sea-level rise, with serious impacts on many coastal areas, including coastal erosion and a greatly increased risk of flooding.

    Fig. 1. Past and future sea-level rise. For the past, proxy data are shown in light purple and tide gauge data in blue. For the future, the IPCC projections for very high emissions (red, RCP8.5 scenario) and very low emissions (blue, RCP2.6 scenario) are shown. Source: IPCC AR5 Fig. 13.27.

    In addition to the global rise, IPCC extensively discusses regional differences, as shown for one scenario below. For brevity I will not discuss these further in this post.

    Fig. 2. Map of sea-level changes up to the period 2081-2100 for the RCP4.5 scenario (which one could call late mitigation, with emissions starting to fall globally after 2040 AD). Top panel shows the model mean with 50 cm global rise, the following panels show the low and high end of the uncertainty range for this scenario. Note that even under this moderate climate scenario, the northern US east coast is risking a rise close to a meter, drastically increasing the storm surge hazard to cities like New York. Source: IPCC AR5 Fig. 13.19.

    I recommend that everyone with a deeper interest in sea level read the sea-level chapter of the new IPCC report (Chapter 13) – it is the result of a great effort by a group of leading experts and an excellent starting point for understanding the key issues involved. It will be a standard reference for years to come.

    Past sea-level rise

    Understanding of past sea-level changes has greatly improved since the 4th IPCC report. The IPCC writes:

    Proxy and instrumental sea level data indicate a transition in the late 19th to the early 20th century from relatively low mean rates of rise over the previous two millennia to higher rates of rise (high confidence). It is likely that the rate of global mean sea level rise has continued to increase since the early 20th century.

    The sum of the observed individual components of sea-level rise (thermal expansion of the ocean water, loss of continental ice from ice sheets and mountain glaciers, changes in terrestrial water storage) is now in reasonable agreement with the observed total sea-level rise.

    Models are also now able to reproduce global sea-level rise from 1900 AD better than in the 4th report, but still with a tendency to underestimation. The following IPCC graph shows a comparison of observed sea level rise (coloured lines) to modelled rise (black).

    Fig. 3. Modelled versus observed global sea-level rise. (a) Sea level relative to 1900 AD and (b) its rate of rise. Source: IPCC AR5 Fig. 13.7.

    Taken at face value the models (solid black) still underestimate past rise. To get to the dashed black line, which shows only a small underestimation, several adjustments are needed.

    (1) The mountain glacier model is driven by observed rather than modelled climate, so that two different climate histories go into producing the dashed black line: observed climate for glacier melt and modelled climate for ocean thermal expansion.

    (2) A steady ongoing ice loss from ice sheets is added in – this has nothing to do with modern warming but is a slow response to earlier climate changes. It is a plausible but highly uncertain contribution – the IPCC calls the value chosen “illustrative” because the true contribution is not known.

    (3) The model results are adjusted for having been spun up without volcanic forcing (hard to believe that this is still an issue – six years earlier we already supplied our model results spun up with volcanic forcing to the AR4). Again this is a plausible upward correction but of uncertain magnitude, since the climate response to volcanic eruptions is model-dependent.

    The dotted black line after 1990 makes a further adjustment, namely adding in the observed ice sheet loss which as such is not predicted by models. The ice sheet response remains a not yet well-understood part of the sea-level problem, and the IPCC has only “medium confidence” in the current ice sheet models.

    One statement that I do not find convincing is the IPCC’s claim that “it is likely that similarly high rates [as during the past two decades] occurred between 1920 and 1950.” I think this claim is not well supported by the evidence. In fact, a statement like “it is likely that recent high rates of SLR are unprecedented since instrumental measurements began” would be more justified.

    The lower panel of Fig. 3 (which shows the rates of SLR) shows that, based on the Church & White sea-level record, the modern rate measured by satellite altimeter is unprecedented – even the uncertainty ranges of the satellite data and those of the Church & White rate between 1920 and 1950 do not overlap. The modern rate is also unprecedented for the Ray and Douglas data, although there is some overlap of the uncertainty ranges (if you consider both ranges). There is a third data set (not shown in the above graph) by Wenzel and Schröter (2010) for which this is also true.

    The only outlier set which shows high early rates of SLR is the Jevrejeva et al. (2008) data – and this uses a bizarre weighting scheme, as we have discussed here at Realclimate. For example, the Northern Hemisphere ocean is weighted more strongly than the Southern Hemisphere ocean, although the latter has a much greater surface area. With such a weighting, movements of water within the ocean, which cannot change global-mean sea level, erroneously look like global sea level changes.

    As we have shown in Rahmstorf et al. (2012), much or most of the decadal variations in the rate of sea-level rise in tide gauge data are probably not real changes at all, but simply an artefact of inadequate spatial sampling of the tide gauges. (This sampling problem has now been overcome with the advent of satellite data from 1993 onwards.) But even if we had no good reason to distrust decadal variations in the Jevrejeva data and treated all data sets the same, three out of four global tide gauge compilations show recent rates of rise that are unprecedented – enough for a “likely” statement in IPCC terms.
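
The weighting point can be illustrated with a toy calculation. The sketch below is not the actual Jevrejeva et al. scheme – the hemispheric ocean areas are rough round numbers and the two-box setup is deliberately minimal – but it shows why over-weighting the smaller Northern Hemisphere ocean makes a mere redistribution of water look like a global sea-level change:

```python
# Toy illustration (NOT the actual Jevrejeva et al. method): why the
# weighting of regional sea-level records matters. Suppose water simply
# moves from the Southern Hemisphere ocean to the (smaller) Northern
# Hemisphere ocean, leaving true global-mean sea level unchanged.

# Approximate ocean surface areas in million km^2 (rough round values)
area_nh = 155.0  # Northern Hemisphere ocean
area_sh = 206.0  # Southern Hemisphere ocean

# A mass-conserving redistribution: NH sea level rises 10 mm, SH falls
# by just enough that the area-weighted global mean stays at zero.
rise_nh = 10.0                              # mm
rise_sh = -rise_nh * area_nh / area_sh      # mm, about -7.5

def global_mean(x_nh, x_sh, w_nh, w_sh):
    """Weighted mean of the two hemispheric sea-level anomalies."""
    return (w_nh * x_nh + w_sh * x_sh) / (w_nh + w_sh)

# Correct area weighting: the redistribution cancels out exactly.
print(global_mean(rise_nh, rise_sh, area_nh, area_sh))  # ~0.0 mm

# Equal weighting (over-weighting the smaller NH ocean): the same data
# produce a spurious global "rise" of more than a millimeter.
print(global_mean(rise_nh, rise_sh, 1.0, 1.0))
```

Under correct area weighting the internal redistribution contributes nothing to the global mean; any scheme that over-weights one basin turns such water movements into apparent global signals.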

    Future sea-level rise

    For an unmitigated future rise in emissions (RCP8.5), IPCC now expects between half a meter and a meter of sea-level rise by the end of this century. The best estimate here is 74 cm.

    On the low end, the range for the RCP2.6 scenario is 28-61 cm of rise by 2100, with a best estimate of 44 cm. Now that is very remarkable, given that this is a scenario with drastic emissions reductions starting just a few years from now, with the world reaching zero emissions by 2070 and after that succeeding in active carbon dioxide removal from the atmosphere. Even so, the expected sea-level rise will be almost three times as large as that experienced over the 20th century (17 cm). This reflects the large inertia in the sea-level response – it is very difficult to make sea-level rise slow down again once it has been initiated. This inertia is also the reason for the relatively small difference in sea-level rise by 2100 between the highest and lowest emissions scenarios (the ranges even overlap) – the major difference will only be seen in the 22nd century.

    There has been some confusion about those numbers: some media incorrectly reported a range of only 26-82 cm by 2100, instead of the correct 28-98 cm across all scenarios. I have to say that half of the blame here lies with the IPCC communication strategy. The SPM contains a table with those numbers – but they are not the rise up to 2100, but the rise up to the mean over 2081-2100, from a baseline of the mean over 1986-2005. It is self-evident that this is too clumsy to put in a newspaper or TV report, so journalists will say “up to 2100”. So in my view, IPCC would have done better to present the numbers up to 2100 in the table (as we do below), so that after all its efforts to get the numbers right, 16 cm are not suddenly lost in the reporting.

    Table 1: Global sea-level rise in cm by the year 2100 as projected by the IPCC AR5. The values are relative to the mean over 1986-2005, so subtract about a centimeter to get numbers relative to the year 2000.

    And then of course there are folks like the professional climate change down-player Björn Lomborg, who in an international newspaper commentary wrote that IPCC gives “a total estimate of 40-62 cm by century’s end” – and also fails to mention that the lower part of this range requires the kind of strong emissions reductions that Lomborg is so viciously fighting.

    The breakdown into individual components for an intermediate scenario of about half a meter of rise is shown in the following graph.

    Fig. 4. Global sea-level projection of IPCC for the RCP6.0 scenario, for the total rise and the individual contributions.

    Higher projections than in the past

    To those who remember the much-discussed sea-level range of 18-59 cm from the 4th IPCC report, it is clear that the new numbers are far higher, both at the low and at the high end. How much higher is not straightforward to compare, however, given that IPCC now uses different time intervals and different emissions scenarios. A direct comparison is made possible by table 13.6 of the report, which gives old and new projections for the same emissions scenario (the moderate A1B scenario) over the time interval 1990-2100(*). Here are the numbers:

    AR4: 37 cm (this is the standard case that belongs to the 18-59 cm range).
    AR4+suisd: 43 cm (this is the case with “scaled-up ice sheet discharge” – a questionable calculation that was never validated, emphasised or widely reported).
    AR5: 60 cm.

    We see that the new estimate is about 60% higher than the old standard estimate, and also a lot higher than the AR4 attempt at including rapid ice sheet discharge.
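
The comparison is simple arithmetic on the three numbers quoted above:

```python
# A1B-scenario projections over 1990-2100, in cm, as quoted from
# table 13.6 of the AR5 (see the list above).
ar4_standard = 37  # AR4 standard case (the 18-59 cm range)
ar4_suisd = 43     # AR4 with "scaled-up ice sheet discharge"
ar5 = 60           # AR5

# Relative increase of the new estimate over the old standard one.
increase = (ar5 - ar4_standard) / ar4_standard
print(f"{increase:.0%}")  # prints 62%, i.e. "about 60% higher"

# The AR5 value also clearly exceeds the AR4 scaled-up-discharge case.
print(ar5 - ar4_suisd)  # prints 17 (cm)
```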

    The low estimates of the 4th report were already at the time considered too low by many experts – there were many indications of that (which we discussed back then), including the fact that the process models used by IPCC greatly underestimated the past observed sea-level rise. It was clear that those process models were not mature, and that was the reason for the development of an alternative, semi-empirical approach to estimating future sea-level rise. The semi-empirical models invariably gave much higher future projections, since they were calibrated with the observed past rise.

    However, the higher projections of the new IPCC report do not result from including semi-empirical models. Remarkably, they have been obtained by the process models preferred by IPCC. Thus IPCC now confirms with its own methods that the projections of the 4th report were too low, which was my main concern at the time and the motivation for publishing my paper in Science in 2007. With this new generation of process models, the discrepancy to the semi-empirical models has narrowed considerably, but a difference still remains.

    Should the semi-empirical models have been included in the uncertainty range of the IPCC projections? A number of colleagues that I have spoken to think so, and at least one has said so in public. The IPCC argues that there is “no consensus” on the semi-empirical models – true, but is this a reason to exclude or include them in the overall uncertainty that we have in the scientific community? I think there is likewise no consensus on the studies that have recently argued for a lower climate sensitivity, yet the IPCC has widened the uncertainty range to encompass them. The New York Times concludes from this that the IPCC is “bending over backward to be scientifically conservative”. And indeed one wonders whether the semi-empirical models would have been also excluded had they resulted in lower estimates of sea-level rise, or whether we see “erring on the side of the least drama” at work here.

    What about the upper limit?

    Coastal protection professionals require a plausible upper limit for planning purposes, since coastal infrastructure needs to survive even in the worst-case situation. A dike that is only “likely” to be good enough is not the kind of safety level that coastal engineers want to provide; they want to be pretty damn certain that a dike will not break. Rightly so.

    The range up to 98 cm is the IPCC’s “likely” range, i.e. the risk of exceeding 98 cm is considered to be 17%, and IPCC adds in the SPM that “several tenths of a meter of sea level rise during the 21st century” could be added to this if a collapse of marine-based sectors of the Antarctic ice sheet is initiated. It is thus clear that a meter is not the upper limit.

    It is one of the fundamental philosophical problems with IPCC (causing much debate already in conjunction with the 4th report) that it refuses to provide an upper limit for sea-level rise, unlike other assessments (e.g. the sea-level rise scenarios of NOAA (which we discussed here) or the guidelines of the US Army Corps of Engineers). This would be an important part of assessing the risk of climate change, which is the IPCC’s role (**). Anders Levermann (one of the lead authors of the IPCC sea level chapter) describes it thus:

    In the latest assessment report of the IPCC we did not provide such an upper limit, but we allow the creative reader to construct it. The likely range of sea level rise in 2100 for the highest climate change scenario is 52 to 98 centimeters (20 to 38 inches). However, the report notes that should sectors of the marine-based ice sheets of Antarctica collapse, sea level could rise by an additional several tenths of a meter during the 21st century. Thus, looking at the upper value of the likely range, you end up with an estimate for the upper limit between 1.2 meters and, say, 1.5 meters. That is the upper limit of global mean sea-level that coastal protection might need for the coming century.

    Outlook

    For the past six years since publication of the AR4, the UN global climate negotiations were conducted on the basis that even without serious mitigation policies global sea level would rise only between 18 and 59 cm, with perhaps 10 or 20 cm more due to ice dynamics. Now they are being told that the best estimate for unmitigated emissions is 74 cm, and that even with the most stringent mitigation efforts, sea-level rise could exceed 60 cm by the end of the century. It is basically too late to implement measures that would very likely prevent half a meter of sea-level rise. Early mitigation is the key to avoiding higher sea-level rise, given the slow response time of sea level (Schaeffer et al. 2012). This is where the “conservative” estimates of IPCC, seen by some as a virtue, have lulled policy makers into a false sense of security, with the price having to be paid later by those living in vulnerable coastal areas.

    Is the IPCC AR5 now the final word on process-based sea-level modelling? I don’t think so. I see several reasons that suggest that process models are still not fully mature, and that in future they might continue to evolve towards higher sea-level projections.

    1. Although with some goodwill one can say the process models are now consistent with the past observed sea-level rise (the error margins overlap), the process models remain somewhat at the low end in comparison to observational data.

    2. Efforts to model sea-level changes in Earth history tend to underestimate past sea-level changes. For example, the sea-level highstand in the Pliocene is not captured by current ice sheet models. Evidence shows that even the East Antarctic Ice Sheet – which is very stable in models – lost significant amounts of ice in the Pliocene.

    3. Some of the most recent ice sheet modelling efforts that I have seen discussed at conferences – the kind of results that came too late for inclusion in the IPCC report – point to the possibility of larger sea-level rise in future. We should keep an eye out for the upcoming scientific papers on this.

    4. Greenland might melt faster than current models capture, due to the “dark snow” effect. Jason Box, a glaciologist who studies this issue, has said:

    There was controversy after AR4 that sea level rise estimates were too low. Now, we have the same problem for AR5 [that they are still too low].

    Thus, I would not be surprised if the process-based models will have closed in further on the semi-empirical models by the time the next IPCC report gets published. But whether this is true or not: in any case sea-level rise is going to be a very serious problem for the future, made worse by every ton of CO2 that we emit. And it is not going to stop in the year 2100 either. By 2300, for unmitigated emissions IPCC projects between 1 and more than 3 meters of rise.

    Weblink

    I’m usually suspicious of articles that promise to look “behind the scenes”, but this one by Paul Voosen is not sensationalist but gives a realistic and matter-of-fact insight into the inner workings of the IPCC, for the sea-level chapter. Recommended reading!


    (*) Note: For the AR5 models table 13.6 gives 58 cm from 1996; we made that 60 cm from 1990.

    (**) The Principles Governing IPCC Work explicitly state that its role is to “assess…risk”, albeit phrased in a rather convoluted sentence:

    The role of the IPCC is to assess on a comprehensive, objective, open and transparent basis the scientific, technical and socio-economic information relevant to understanding the scientific basis of risk of human-induced climate change, its potential impacts and options for adaptation and mitigation.

    References

    1. J.A. Church, and N.J. White, “Sea-Level Rise from the Late 19th to the Early 21st Century”, Surveys in Geophysics, vol. 32, pp. 585-602, 2011. http://dx.doi.org/10.1007/s10712-011-9119-1

    2. R.D. Ray, and B.C. Douglas, “Experiments in reconstructing twentieth-century sea levels”, Progress in Oceanography, vol. 91, pp. 496-515, 2011. http://dx.doi.org/10.1016/j.pocean.2011.07.021

    3. M. Wenzel, and J. Schröter, “Reconstruction of regional mean sea level anomalies from tide gauges using neural networks”, Journal of Geophysical Research, vol. 115, 2010. http://dx.doi.org/10.1029/2009JC005630

    4. S. Jevrejeva, J.C. Moore, A. Grinsted, and P.L. Woodworth, “Recent global sea level acceleration started over 200 years ago?”, Geophysical Research Letters, vol. 35, 2008. http://dx.doi.org/10.1029/2008GL033611

    5. S. Rahmstorf, M. Perrette, and M. Vermeer, “Testing the robustness of semi-empirical sea level projections”, Climate Dynamics, vol. 39, pp. 861-875, 2012. http://dx.doi.org/10.1007/s00382-011-1226-7

    6. S. Rahmstorf, “A Semi-Empirical Approach to Projecting Future Sea-Level Rise”, Science, vol. 315, pp. 368-370, 2007. http://dx.doi.org/10.1126/science.1135456

    7. M. Schaeffer, W. Hare, S. Rahmstorf, and M. Vermeer, “Long-term sea-level rise implied by 1.5 °C and 2 °C warming levels”, Nature Climate Change, vol. 2, pp. 867-870, 2012. http://dx.doi.org/10.1038/nclimate1584

    Stefan Rahmstorf: A physicist and oceanographer by training, Stefan Rahmstorf has moved from early work in general relativity theory to working on climate issues. He has done research at the New Zealand Oceanographic Institute, at the Institute of Marine Science in Kiel and, since 1996, at the Potsdam Institute for Climate Impact Research in Germany (in Potsdam near Berlin). His work focuses on the role of ocean currents in climate change, past and present. In 1999 Rahmstorf was awarded the $1 million Centennial Fellowship Award of the US-based James S. McDonnell Foundation. Since 2000 he has taught physics of the oceans as a professor at Potsdam University. Rahmstorf is a member of the Advisory Council on Global Change of the German government and of the Academia Europaea. He is a lead author of the paleoclimate chapter of the 4th assessment report of the IPCC.

  • “Very Generous Immigration Program” Pushing Out Would-Be Home Buyers

     From MP Kelvin Thompson
    Wednesday, October 16, 2013


    Australia used to be the country where everyone could afford to have a home of their own. But for far too many of today’s young Australians, that is no longer true. Housing affordability has declined.

    Treasurer Joe Hockey confirmed yesterday in New York, in an interview with CNBC, that our large migration program is one of the key drivers of housing unaffordability for young people.

    He told CNBC that “Australia is a long way from a housing bubble….The fact is we have a very generous immigration program and we have very slow supply coming in the market”.

    Mr Hockey is correct that the high migration program is a driver of rising house prices in Australia. Where I differ from Mr Hockey is that I don’t believe rising house prices are a good thing. The fact is that housing is a necessity, like food, water, electricity and petrol. No-one cheers when the price of food, water, electricity and petrol goes up – why should we cheer when the price of a house goes up?

    That cheering drowns out the quiet, sad shrug of a generation being locked out of the opportunities which my generation and the one before me had the good fortune to have.

  • Tell Whitehaven shareholders that Maules Creek’s a risky business

    Charlie Wood – 350.org Australia <charlie@350.org>

    Dear friend

    It’s time Whitehaven Coal shareholders understood that degrading ecosystems, dividing communities and damaging the climate is a risky business. On Monday the 4th of November, we’re heading to their AGM in Sydney, with our friends at Quit Coal, to explain exactly why. We hope you will join us.

    Right now, Whitehaven is progressing plans for its Maules Creek mine – a 2000 hectare open-cut coal mine in NSW’s beautiful Leard State Forest.

    The mine will see 1600 hectares of unique bushland and farmland cleared, 700 hectares of which is classified as critically endangered. When fully operational, Maules Creek and its neighbouring Boggabri coal mine will destroy habitat for 396 native species, drain the local water table by up to 6-7 metres, pump 18,000 tonnes of coal dust onto surrounding communities and release 60 megatons of carbon dioxide per year – more than the annual individual emissions of 165 countries, including Sweden, Hungary and our neighbours, New Zealand.

    It’s pretty clear that Whitehaven knows this isn’t right. As we speak, the company is under federal investigation over claims that it provided false and misleading information in its environmental approval application, which was rushed through earlier this year. And on the very same day as the AGM, two former Whitehaven directors will be in court over undeclared political donations.

    Shareholders deserve to know the truth about Maules Creek. Will you join us outside Whitehaven’s AGM on November the 4th to spread this truth?

    As if the environmental and social risks weren’t enough, Maules Creek is beset with financial risks. 400 km from the nearest port, it faces major infrastructure bottlenecks. Additionally, the project requires an enormous capital outlay and faces an uncertain market as coal prices continue to drop in the wake of a global coal glut. With Whitehaven’s share price now at its lowest level since 2009, these are financial risks that the company can’t afford.

    But as the risks are swelling, so too is community opposition. Maules Creek is fast becoming the eye of a brewing storm of sustained public action to protect Australia’s environment, communities and climate against the destructive effects of fossil fuel expansion and proposed environmental deregulation. Over the Summer, 350.org will be supporting efforts like this around the country as part of our Summer Heat campaign.

    The fight is heating up and it starts with Maules Creek. On November 4th, join us outside Whitehaven’s AGM to tell shareholders to protect their pockets and the planet by getting out of this risky coal investment while they still can.

    In solidarity,

    Charlie on behalf of 350.org Australia and Alex on behalf of Quit Coal

    P.S. Join and share our Whitehaven AGM facebook event.


    350.org is building a global movement to solve the climate crisis.

  • The real threat to the national interest comes from the rich and powerful

    Monbiot.com

    Elite Insurgency

    Posted: 14 Oct 2013 12:25 PM PDT

    The real threat to the national interest comes from the rich and powerful


    By George Monbiot, published in the Guardian 15th October 2013

    Subversion ain’t what it used to be. Today it scarcely figures as a significant force. Nation states are threatened by something else. Superversion: an attack from above.

    It takes several forms. One is familiar, but greatly enhanced by new technology: the tendency of spooks and politicians to use the instruments of state to amplify undemocratic powers. We’ve now learnt that even members of the Cabinet and the National Security Council had no idea what GCHQ was up to(1). No one told them that it was developing the capacity to watch, if it chooses, everything we do online. The real enemies of state (if by state we mean the compact between citizens and those they elect) are people like the head of MI5, and Theresa May, the Home Secretary, who appears to have failed to inform her Cabinet colleagues.

    Allied to the old abuses is a newer kind of superversion: the attempts by billionaires and their lieutenants to destroy the functions of the state. Note the current shutdown – and the debt ceiling confrontation scheduled for Thursday – in the United States. The Republicans, propelled by a Tea Party movement created by the Koch brothers and financed by a gruesome collection of multi-millionaires(2,3), have engineered what in other circumstances would be called a general strike. The difference is that the withdrawal of their labour has been imposed on the workers.

    The narrow purpose of the strike is to prevent the distribution of wealth to poorer people, through the Affordable Care Act. The wider purpose (aside from a refusal to accept the legitimacy of a black president) is to topple the state as an effective instrument of taxation, regulation and social protection. The Koch shock troops in the Republican party seem prepared to inflict almost any damage in pursuit of this insurgency, including – if they hold out on Thursday – a US government default, which could trigger a new global financial crisis(4).

    They do so on behalf of a class which has, in effect, seceded(5). It floats free of tax and the usual bonds of citizenship, jetting from one jurisdiction to another as it seeks the most favourable havens for its wealth. It removes itself so thoroughly from the life of the nation that it scarcely uses even the roads. Yet, through privatisation and outsourcing, it is capturing the public services on which the rest of us depend.

    Using an unreformed political funding system to devastating effect(6), this superversive class demands that the state stop regulating, stop protecting, stop intervening. When this abandonment causes financial crisis, the remaining taxpayers are forced to bail out the authors of the disaster, who then stash their bonuses offshore.

    One result is that those who call themselves conservatives and patriots appear to be deeply confused about what they are defending. In his article last week attacking the Guardian for revealing GCHQ’s secret surveillance programmes, Paul Dacre, the editor of the Daily Mail, characterised his readers as possessing an “over-riding suspicion of the state and the People Who Know Best.”(7) Strangely, this suspicion of the state and the People Who Know Best does not appear to extend to the security services, whose assault on our freedoms Dacre was defending.

    To the right-wing press and the Conservative party, patriotism means standing up to the European Union. But it also means capitulating to the United States. It’s an obvious and glaring contradiction, which is almost never acknowledged, let alone explained. In reality the EU and the US have become proxies for something which transcends national boundaries. The EU stands for state control and regulation while the US represents deregulation and atomisation.

    In reality, this distinction is outdated, as the handful of people who have heard of the Transatlantic Trade and Investment Partnership (TTIP) will appreciate. The European Commission calls it “the biggest trade deal in the world”(8). Its purpose is to create a single transatlantic market, in which all regulatory differences between the US and the EU are gradually removed.

    It has been negotiated largely in secret. This time, they’re not just trying to bring down international trade barriers, but, as the commission boasts, “to tackle barriers behind the customs border – such as differences in technical regulations, standards and approval procedures.”(9) In other words, our own laws, affecting our own people.

    A document published last year by two huge industrial lobby groups – the US Chamber of Commerce and BusinessEurope – explains the partnership’s aims(10). It will have a “proactive requirement”, directing governments to change their laws. The partnership should “put stakeholders at the table with regulators to essentially co-write regulation.” Stakeholder is a euphemism for corporation.

    They want it; they’re getting it. New intellectual property laws that they have long demanded, but which sovereign governments have so far resisted – not least because of the mass mobilisation against the Stop Online Piracy Act and Protect IP Act in the US(11) – are back on the table, but this time largely inaccessible to public protest. So are data protection, public procurement and financial services(12). You think that getting your own government to regulate bankers is hard enough? Try appealing to a transnational agreement brokered by corporations and justified by the deemed consent of citizens who have been neither informed nor consulted.

    This deal is a direct assault on sovereignty and democracy. So where are the Mail and the Telegraph and the other papers which have campaigned so hard against all transfers of power to the European Union? Where are the Conservative MPs who have fought for an EU referendum? Eerie silence descends. They do not oppose the TTIP because their allegiance lies not with the nation but with the offshored corporate elite.

    These fake patriots proclaim a love for their country, while ensuring that there is nothing left to love. They are loyal to the pageantry – the flags, the coinage, the military parades – but intensely disloyal to the nation these symbols are supposed to represent. The greater the dissonance becomes, the louder the national anthem plays.

    www.monbiot.com

    References:

    1. http://www.theguardian.com/commentisfree/2013/oct/06/prism-tempora-cabinet-surveillance-state

    2. http://www.theguardian.com/commentisfree/cifamerica/2010/oct/25/tea-party-koch-brothers

    3. http://www.youtube.com/user/astroturfwars

    4. http://www.nytimes.com/2013/09/30/opinion/krugman-rebels-without-a-clue.html

    5. http://www.theamericanconservative.com/articles/revolt-of-the-rich/

    6. http://www.theguardian.com/commentisfree/2012/oct/29/capitalism-bankrolls-politics-pay-price

    7. http://www.theguardian.com/commentisfree/2013/oct/12/left-daily-mail-paul-dacre

    8. http://ec.europa.eu/trade/policy/in-focus/ttip/

    9. http://ec.europa.eu/trade/policy/in-focus/ttip/

10. US Chamber of Commerce and BusinessEurope, October 2012. Regulatory Cooperation in the EU-US Economic Agreement. http://corporateeurope.org/sites/default/files/businesseurope-uschamber-paper.pdf

    11. http://www.pcworld.com/article/248298/sopa_and_pipa_just_the_facts.html

    12. https://www.gov.uk/government/speeches/setting-the-terms-for-global-trade-the-transatlantic-trade-and-investment-

  • Current-tracking drone subs to help improve climate predictions

    Current-tracking drone subs to help improve climate predictions

    14 October 2013 | By Stephen Harris


    Underwater drones that can navigate ocean currents are to help British scientists improve climate change predictions as part of a newly funded project.

The UK’s Natural Environment Research Council (NERC) has announced funding for two projects, in collaboration with the US, to study the circulation of water in the Atlantic that keeps Europe’s climate mild and how it could be affected by changing global temperatures.

    One of the projects, known as OSNAP, involves mooring monitoring arrays that reach from the bottom to the surface of the ocean at key points across the northern Atlantic, and sending autonomous underwater gliders to gather data from in between.


    Dr Sheldon Bacon of the National Oceanography Centre at Southampton University, who is leading the UK team for OSNAP, said this will help them better understand how geographical features of the seabed affect the currents that transfer heat across the Atlantic.

‘If we want models to represent these processes correctly so that we can have better projections of future climate, if we want to understand regional details like how Britain is likely to be affected in coming decades, we have to understand these features that affect the decadal variability of the ocean,’ he told The Engineer.

The project’s gliders will spend up to four months at a time navigating the ocean currents, following a pre-planned route and surfacing around once a week to check their position, correct their course and transmit data back to base.

Each 150cm-long glider uses an external bladder, filled and emptied from a reservoir of oil, to change the device’s density, allowing it to sink to around 1km and then rise back to the surface. As the glider moves up and down, its wings convert that vertical motion into forward motion.
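The buoyancy mechanism described above comes down to a fixed mass displacing a variable volume. The following is a toy sketch, not project code: all numbers (glider mass, bladder capacity, seawater density) are illustrative assumptions, chosen only to show how pumping oil into or out of the external bladder moves the glider’s overall density either side of the surrounding seawater.

```python
RHO_SEAWATER = 1027.0  # kg/m^3, typical upper-ocean value (assumption)

def glider_density(mass_kg, hull_vol_m3, bladder_vol_m3):
    # Total mass is fixed; inflating the external bladder only increases
    # displaced volume, which lowers the glider's overall density.
    return mass_kg / (hull_vol_m3 + bladder_vol_m3)

MASS = 60.0           # kg, illustrative
BLADDER_MAX = 0.0005  # m^3, illustrative pump capacity
# Choose the hull volume so the glider is neutrally buoyant
# with the bladder half full.
HULL_VOL = MASS / RHO_SEAWATER - BLADDER_MAX / 2

sink_density = glider_density(MASS, HULL_VOL, 0.0)          # bladder empty
rise_density = glider_density(MASS, HULL_VOL, BLADDER_MAX)  # bladder full

# Denser than seawater -> sinks; less dense -> rises.
assert sink_density > RHO_SEAWATER > rise_density
```

The wings then turn each dive-and-climb cycle into horizontal progress, which is why the vehicle needs no propeller and can stay out for months on a small battery.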

    The moored arrays will take measurements from the bottom of the ocean that can be used to calculate what happens along the flat part of the seabed. The gliders will navigate the warmer, upper currents in order to study how they are affected by topographical ridges and troughs.

    The instruments are due to be deployed next summer and will be changed every one to two years until 2018. ‘These gliders have been around for a number of years but they’re only now becoming mature and reliable enough for long-term missions,’ said Bacon.

    Western Europe has a relatively mild climate for a region so far north because of the so-called conveyor belt currents in the Atlantic that bring warm water at the top of the ocean from the tropics northwards.

    As the water transfers heat to the atmosphere, it cools, becoming denser and saltier and sinking to the bottom of the ocean – a process known as overturning – where it returns southwards.
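The overturning step described above can be made concrete with a simplified linear equation of state, in which seawater density rises as temperature falls and salinity climbs. This is a toy illustration: the reference values and the expansion/contraction coefficients are order-of-magnitude assumptions, not measured North Atlantic data.

```python
# Simplified linear equation of state (all coefficients illustrative):
# rho = rho0 * (1 - alpha*(T - T0) + beta*(S - S0))
RHO0, T0, S0 = 1027.0, 10.0, 35.0  # reference density, temp (degC), salinity (psu)
ALPHA = 2e-4  # thermal expansion coefficient, 1/degC (order of magnitude)
BETA = 8e-4   # haline contraction coefficient, 1/psu (order of magnitude)

def density(temp_c, salinity_psu):
    return RHO0 * (1 - ALPHA * (temp_c - T0) + BETA * (salinity_psu - S0))

# Warm tropical surface water arriving in the north...
warm = density(15.0, 35.5)
# ...after losing heat to the atmosphere and becoming slightly saltier:
cold = density(4.0, 35.8)

assert cold > warm  # the denser water sinks: the "overturning" step
```

Cooling and salinification both push density in the same direction, which is why the water that sinks in the subpolar North Atlantic can return southwards at depth rather than circulating at the surface.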

    Researchers in the other NERC-funded project, known as RAPID, have been studying the conveyor belt between Florida and the Canary Islands using a series of moored arrays for a decade and the programme is now due to run for a further six years.

    But in the part of the ocean between Britain and Canada, known as the North Atlantic Subpolar Gyre, the process is more complicated due to the wind-driven horizontal circulation of waters on the surface and the more complex seabed topography.

    OSNAP (Overturning in the Subpolar North Atlantic Program) will enable researchers to study the relationship between the horizontal and vertical currents to build a better understanding of how the water returns to the southern Atlantic rather than just circulating in the north.

    ‘It turns out you get eddies spinning off at great depths and forming a pathway,’ said Bacon. ‘If we want models to represent these processes correctly so we can have a better projection of future climate, we have to understand these features that affect the decadal variability of the ocean.’

Total funding for the projects is worth £44m, supplied by NERC and, in the US, by the National Science Foundation (NSF) and the National Oceanic and Atmospheric Administration (NOAA).


Read more: http://www.theengineer.co.uk/rail-and-marine/news/current-tracking-drone-subs-to-help-improve-climate-predictions/1017299.article