Posts Tagged ‘science’

On Climategate

Monday, November 30th, 2009

at The Register

This piece originally had a much longer section summing up the state of climate “science” – which the CRU leak has verified. The peculiar nature of the problem is why anecdote and modelling play such an important part in the persuasion business.

Scientific theories fall by the wayside when they fail to give us the most convincing explanation of the evidence. The onus is therefore on the supporters of a theory to make the demonstrations, not on opponents to ‘trump’ it by coming up with something better. Otherwise we’d still be discussing the distribution of phlogiston, or the particular qualities of ectoplasm.

Prior to 1980, the dominant factor influencing modulations in climate was thought to be the sun. This makes sense, since our primary energy source (unless you happen to live by a volcano vent) is the sun. If the current vogue for greenhouse gases loses favour, the result will not be a dangerous, unstable rip in the fabric of space-time. It’s simply likely that the consensus will, in the absence of a more compelling explanation, revert to solar influences.

(Ironically, Hubert Lamb, the father of climatology, who left the Met Office to found CRU in 1972, remained sceptical of the greenhouse gas theory until the end.)

Now every scientific challenge is unique, but the manmade global warming hypothesis poses several specific problems for even the most honest scientist. The real battleground is over aspects of the ‘energy budget’ model – and convincing people means overcoming a number of challenges. The theory posits that small increases in CO2 concentrations (advocates prefer the phrase ‘well-mixed greenhouse gases’) have significant amplification effects. It’s accepted that a doubling of CO2 introduces very little warmth into the system on its own – less than a degree centigrade, which is quite toasty but leaves us some way short of Thermageddon. And because absorption is logarithmic, each further increase in CO2 concentration makes less and less difference; after a certain point it barely matters at all.
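The logarithmic point can be made concrete with the simplified forcing approximation ΔF ≈ 5.35 ln(C/C₀) W/m² – a standard textbook formula, not one taken from this piece – under which every doubling of CO2 adds the same forcing, so each successive increment of the gas buys less warming than the last. A minimal sketch:

```python
import math

def co2_forcing(c_new, c_ref):
    # Widely quoted simplified approximation for CO2 radiative
    # forcing (W/m^2): dF = 5.35 * ln(C / C0).
    return 5.35 * math.log(c_new / c_ref)

# Every doubling contributes the same ~3.7 W/m^2, so the step from
# 280 to 560 ppm matters exactly as much as the step from 560 to 1120.
print(co2_forcing(560, 280))   # ~3.7
print(co2_forcing(1120, 560))  # ~3.7
```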

So positive feedbacks play a central role in the hypothesis, which suggests that with more clouds, more energy is ‘trapped’, permafrost melts, methane is released, and so on, all increasing temperatures further. Global Warming theory rests on these strong positive feedbacks. If the earth absorbs larger amounts of CO2 than predicted, then the theory fails. If the earth radiates more out to space, it fails. If the negative feedbacks outweigh the positive, it fails. As you may have gathered by now, demonstrating that greenhouse gases play some kind of role in the climate is not difficult. Demonstrating that they play the dominant role is.
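The feedback arithmetic can be illustrated with the textbook gain formula ΔT = ΔT₀ / (1 − f), where f is the fraction of any warming fed back into the system – my illustration, not the article’s – which shows why the sign and size of the net feedback decide everything:

```python
def total_warming(dT0, f):
    # Textbook feedback-gain formula: a net feedback that returns a
    # fraction f of any warming amplifies the direct warming dT0 to
    # dT0 / (1 - f). Valid only for f < 1 (f >= 1 means runaway).
    if f >= 1:
        raise ValueError("f >= 1 implies a runaway feedback")
    return dT0 / (1 - f)

# Start from roughly 1 C of direct warming for a CO2 doubling.
print(total_warming(1.0, 0.0))    # no net feedback: 1 C
print(total_warming(1.0, 0.67))   # strong positive feedback: ~3 C
print(total_warming(1.0, -0.5))   # net negative feedback: ~0.67 C
```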

Additionally, and to the perennial amazement of newcomers to the field, there is no ‘fingerprint’ or telltale signal that anthropogenically produced gases are the primary forcing factor. A few candidates have briefly starred in the role – C-14 isotopes, or signs of a ‘hotspot’ under the stratosphere – but these are rarely cited now. The ‘smoking pistols’ have proved to be ambiguous, or missing in action. With the human component just a small part (5 per cent) of CO2, and CO2 a small (5 per cent) part of the overall greenhouse gas mix, the challenge is clear.

Hence the increasing dependence, since 1980, on a range of anecdotal evidence and on computer modelling. In instances where simple empirical tests are sufficient to support a theory, neither is needed. But science has now moved into what critics call a ‘post-modern’ phase. In 2001, the IPCC published its Third Assessment Report and observed:

“Our knowledge about the processes, and feedback mechanisms determining them, must be significantly improved in order to extract early signs of such changes from model simulations and observations.”

So, while expressing quite frankly the state of the science, the IPCC was giving increasing weight to computer models relative to observations. Modelling was beginning to eclipse empirical evidence.

So reasonable doubt exists over whether something as significant as clouds is a positive or a negative feedback. The Fourth Assessment Report acknowledged that the “level of scientific understanding” of non-greenhouse forcings was low. That was charitable: the science hasn’t really been done yet.

Now it’s clear from the CRU exchanges – particularly the “Where did the warming go?” dialogue between Wigley and Trenberth – that the energy budget isn’t scientifically understood at all.

Not Proven is a reasonable verdict.

To the Moon – with extreme engineering

Wednesday, July 22nd, 2009

Lunar Orbiter

We remember the Apollo space program as a triumph of power and industrial might. The superpowers’ space programs were, of course, political and chauvinistic, designed to showcase national wealth. But there’s a better way of looking at the program, Dennis Wingo reminded me recently. Masses of money helped put man on the Moon, of course, but the Moon program is really a tale of engineering improvisation and human organisation.

Space expert and entrepreneur Dennis Wingo put the first webserver – an Apple Mac – in orbit, for just $7m, and has helped piece together a lot of historical material that NASA didn’t appreciate at the time – and forgot about, or wiped. There is one piece of kit in particular that encapsulates two stories: NASA’s negligence, and the quite amazing improvisation of the engineers. It’s the Lunar Orbiter, which mapped the moon’s surface prior to manned descent. Wingo painstakingly recovered and restored much of the imagery it took.

To give us an idea of how much Apollo owed to seat-of-the-pants ingenuity, it’s worth remembering that the story of the Orbiter begins in 1961 – the year of Yuri Gagarin’s first human orbit of the Earth. The space pioneers were seeing a high death rate among test subjects – dogs (the USSR) and chimps (the USA), the latter proving a duff move: the chimps panicked in the claustrophobic conditions.

The US program lagged far behind the Soviets’, and NASA’s early attempts to keep up had become a national joke. Ranger was the first project to photograph the Moon, with the modest ambition of crashing a probe onto the surface. But of the first six Rangers, two failed to leave Earth orbit, one failed en route, two missed the Moon completely, and although the sixth reached the target, its cameras failed.

Yet by 1964, much of the technology that eventually put man on the Moon had already been designed and built. The colossal Apollo expenditures were on the physical implementation of the program, including the many test flights. By 1965, the Apollo Lunar Excursion Module (LEM) was already being prepared as a long-term shelter and accommodation unit. And as Wingo points out, it was really down to 400 engineers – a fraction of what Google devotes to inserting advertisements into web pages – being given the freedom to put Heath Robinson designs into practice.

The Lunar Orbiter astonishes even today. It had to take pictures, develop and scan the film on board, and broadcast the results successfully back to Earth. Naturally, the orbiter also had to provide its own power, orient itself without intervention from ground control, maintain precise temperature and air-pressure conditions for the film processing, and protect itself from solar radiation and cosmic rays – all within severe size and weight constraints. This was far beyond the capabilities of the newest spy satellites, which back then returned their film to earth in a canister, retrieved by a specially kitted-out plane. The Orbiter challenge was the Apollo challenge in miniature.
…Read more at The Register

Breaking Bad: the joy of chemistry

Friday, May 29th, 2009

Breaking Bad

Here’s a show with the perfect profile to be a huge cult British hit – black humour, suspense, all the stuff we love. But what’s puzzling is how Britain’s public broadcasters – particularly the BBC – dropped the ball by failing to notice the show at all.

…Read more at The Register

BBC's science: 'Evangelical, shallow and sparse'

Friday, May 22nd, 2009

The BBC’s environmental coverage has come under fire from a former science correspondent. Award-winning author and journalist David Whitehouse says the corporation risks public ridicule – or worse – with what he calls “an evangelical, inconsistent climate change reporting and its narrow, shallow and sparse reporting on other scientific issues.”

Whitehouse relates how he was ticked off for taking a cautious approach to apocalyptic predictions when a link between BSE in cattle (“Mad Cow Disease”) and vCJD in humans was accepted by government officials in 1996. Those predictions “…rested on a cascade of debateable assumptions being fed into a computer model that had been tweaked to hindcast previous data,” he writes.

“My approach was not favoured by the BBC at the time and I was severely criticised in 1998 and told I was wrong and not reporting the BSE/vCJD story correctly.”

The Beeb wasn’t alone. With bloodthirsty glee, the Observer newspaper at the time predicted millions infected, crematoria full of smoking human remains – and the government handing out suicide pills to the public. Whitehouse feels his caution is now vindicated. The number of UK cases attributed to vCJD now stands at 163 – and the only suicides were farmers who had feared their livelihoods destroyed.

Writes Whitehouse:

“Reporting the consensus about climate change…is not synonymous with good science reporting. The BBC is at an important point. It has been narrow minded about climate change for many years and they have become at the very least a cliché and at worst lampooned as being predictable and biased by a public that doesn’t believe them anymore.”

(more…)

Junk science and booze tax – a study in spin

Thursday, December 11th, 2008

“Let’s find out what everybody is doing, and stop them doing it” – A P Herbert

Putting the price of alcohol up to a minimum of 40p a unit would keep 41,000 people a year out of hospital, save the NHS £116m a year, and avoid 12,400 cases of unemployment, a report from Sheffield University claimed last week. These appear to be remarkably precise predictions. The government used the report – widely quoted in the press – to justify higher duties and greater regulation of the sale of alcohol. Yet on close examination, the report appears to be a prime example of “policy-based evidence making”.

The blockbuster report, from Sheffield University’s Section of Public Health, is in two major parts: a review of evidence and a statistical model, totalling over 500 pages. Researchers examined the effects of alcohol pricing and alcohol promotion (and advertising) on three areas: consumption, public health and crime. I won’t cover crime, because those proposals were dropped before the Queen’s Speech, but the amount of time the Sheffield researchers devoted to it makes clear that it was a legislative priority. Academia marches in lockstep with its financial benefactor – in this case, of course, the Department of Health.

Read more at The Register

The Large Hadron Collider: Anton Wylie

Tuesday, September 9th, 2008

CERN's LHC

The LHC comes at a crucial time for particle or quantum physics. In particular, it comes at a crucial time for the dominant theory, known as the Standard Model.

The Standard Model has been to modern particle physics rather what the periodic table was to 19th century chemistry. It served both to organise the known entities systematically, and as an impetus to fill in the holes in our knowledge. The Standard Model can claim to have predicted the existence of several previously unexpected particles, which were subsequently discovered experimentally. Arguably, too, it has also seeded the separate field of quantum information theory, and quantum computing.

From the point of view of having things neat and tidy, there is just one hole in the jigsaw of the Standard Model. The missing piece is the (by now surely) world-famous Higgs boson – popularly known as the “God particle”. So named not because it could resolve the Augustinian Dilemma, but perhaps as in, “Oh God, when are we going to find it?”. More seriously, the Higgs boson could account for the mass properties of the other entities – why some have mass, and some don’t. So if particle physicists observe the Higgs boson, they can effectively draw a line under 50 years or so of research, slap themselves on the back, and move on.

Unfortunately, the Higgs boson has spent over 40 years hiding – ironically not because it is tiny. The Standard Model unfortunately does not predict its mass. As efforts have concentrated on manufacturing the boson in particle accelerators, its continuing elusiveness has been put down to it being big – a tad too big.

“Particle physics has other gaping explanatory holes to fill.”

Hence the LHC, which, crudely speaking, whizzes bits of matter up to as high a speed as possible. Experimenters let these crash into various targets to see what new and interesting bits emerge. The record-breaking energies of the LHC require similarly record-breaking electromagnets to achieve.

…Read more at The Register

How the middle classes' superstitions keep Africa poor and hungry

Monday, September 8th, 2008

The man dubbed the “King of Climate Porn” achieved notoriety at the turn of the decade as the architect of the Foot and Mouth holocaust – which unnecessarily slaughtered seven million animals, and cost the country billions of pounds. But King astonished observers by saying something sensible last week – and he promises to do so again tonight.

Speaking at the British Association’s Science Week, King will say that the Greenies’ anti-science superstitions are causing unnecessary suffering in Africa. King blames “anti-poverty” campaigners, aid agencies and environmental activists for keeping modern farming techniques and bio-technology out of Africa. King tells the Times today:

“The suffering within [Africa], I believe, is largely driven by attitudes developed in the West which are somewhat anti-science, anti-technology – attitudes that lead towards organic farming, for example, attitudes that lead against the use of genetic technology for crops that could deal with increased salinity in the water, that can deal with flooding for rice crops, that can deal with drought resistance.”

King wonders why recent productivity revolutions in agriculture, which have been such a success in Asia and India, have not been implemented in Africa on the same scale. He concludes that the blame lies not with Africans, but with Western “do-gooders” who prefer Africans to remain picturesque and dirt poor.

An example he cites is the attempts of eco-campaigners Friends of the Earth to keep drought-resistant crops out of Africa.

He has a point.

“Where once there were ambitions for people in the third world to enjoy Western standards of living, now the voice of the voiceless instead celebrates the primitive lifestyles that the world’s poorest people suffer,” wrote Ben Pile and Stuart Blackman recently in a scathing critique of the charity Oxfam, called Backwards to the future.

Indeed, and the same middle-class superstitions that endeavour to keep Wi-Fi out of schools are used to justify keeping biotechnology out of Africa.

For example, Friends of the Earth continues to argue that modern seed technologies should not be used to make agriculture easier and more productive for poor farmers – even when this causes more ecological damage than the new technology. FoE’s most recent campaign against biotech means that subsistence farmers must continue to use seeds that require more fertiliser than GM varieties, and which need environmentally-destructive tilling.

Whatever it is that motivates these self-styled “Greens”, it isn’t a concern for the environment. Nor, despite claims to the contrary, is there any valid concern over “over-population”. The UN estimates that global population will peak at 7.87bn in the 2040s and then decline, assuming modest development is permitted to continue. Not only does economic development mean fewer people, but it means less suffering: those fewer people are much happier.

Clearly, we can easily generate enough food to feed everyone on the planet, and we have the means to ensure there’s less human suffering. Some people want that to happen – and some don’t. You’ll find many nursing their Malthusian or Eugenics prejudices under the banner of Greenery in the latter camp – but it’s a refreshing surprise to find King in the former, or at least edging away from the Greens’ death cult.

©Situation Publishing 2008.

Freeman Dyson: climate models are rubbish

Thursday, August 14th, 2008

British-born physicist Freeman Dyson has revealed three “heresies”, two of which challenge the current scientific orthodoxy that anthropogenic carbon causes climate change.

“The fuss about global warming is grossly exaggerated,” writes Dyson in his new book A Many-Colored Glass: Reflections on the Place of Life in the Universe, published on Wednesday.

He pours scorn on “the holy brotherhood of climate model experts and the crowd of deluded citizens who believe the numbers predicted by the computer models”.

“I have studied the climate models and I know what they can do. The models solve the equations of fluid dynamics, and they do a very good job of describing the fluid motions of the atmosphere and the oceans. They do a very poor job of describing the clouds, the dust, the chemistry, and the biology of fields and farms and forests,” writes Dyson.

Biomass holds the key to carbon, he writes – leaving us to infer that he thinks the human contribution is negligible. Overall, Dyson issues a plea for more scientific research into the behaviour of the planet’s biomass.

“Many of the basic processes of planetary ecology are poorly understood. They must be better understood before we can reach an accurate diagnosis of the present condition of our planet,” he says.
(more…)

Physicists warned not to debate global warming

Monday, July 21st, 2008

Bureaucrats at the American Physical Society (APS) have issued a curious warning to their members about an article in one of their own publications. Don’t read this, they say – we don’t agree with it. But what is it about the piece that is so terrible that, like Medusa, it could turn men to stone?

It’s an article that examines the calculation central to climate models. As Jeffrey Marque, the editor of the APS newsletter Physics &amp; Society, explains, the global warming debate must be re-opened:

“There is a considerable presence within the scientific community of people who do not agree with the IPCC conclusion that anthropogenic CO2 emissions are very probably likely to be primarily responsible for the global warming that has occurred since the Industrial Revolution. Since the correctness or fallacy of that conclusion has immense implications for public policy and for the future of the biosphere, we thought it appropriate to present a debate within the pages of P&S concerning that conclusion.”

The newsletter invited both believers and sceptics to submit articles, and has published a submission by Viscount Monckton questioning the core calculation of the greenhouse gas theory: climate sensitivity. The believers are represented by two physicists from Cal Poly San Luis Obispo, who state that:

“Basic atmospheric models clearly predict that additional greenhouse gasses will raise the temperature of Earth. To argue otherwise, one must prove a physical mechanism that gives a reasonable alternative cause of warming. This has not been done. Sunspot and temperature correlations do not prove causality.”

But within a few days, Monckton’s piece carried a health warning, in bright red ink:

The following article has not undergone any scientific peer review. Its conclusions are in disagreement with the overwhelming opinion of the world scientific community. The Council of the American Physical Society disagrees with this article’s conclusions.

Not so much Medusa, then, as Nanny telling the children what not to think.

(more…)

Bringing it all back Hume: Anton Wylie

Wednesday, July 9th, 2008
A philosophy of science that may be the best thing we’ve ever run

WiReD magazine’s editor-in-chief Chris Anderson has just seen the end for scientific theories. And it is called Google.

The concept of the mind, and by extension that of a person, was also affected, with far-reaching implications.

In psychology, Behaviourism was one favoured development. Its ontology does not include people with minds, only biological entities with patterns of behaviour. The rise and rise of neuroscience is correlated with this. Another arena is politics. The New Labour government in the UK boasts almost daily that it is in the business of “modifying behaviour”.

Even when this type of thinking is felt to be repugnant, the tendency remains to treat people as parametrically determined objects. The phrase “hearts and minds” admits that people feel and think, but implies that what matters is to ascertain which feelings and thoughts affect them most strongly. Modern politics consists to a large extent of this type of appeal – and the part of it conducted through the media, almost exclusively so.

Read more at The Register