On Climategate

at The Register

This piece originally had a much longer section summing up the state of climate “science” – which the CRU leak has verified. The peculiar nature of the problem is why anecdote and modelling play such an important part in the persuasion business.

Scientific theories fall by the wayside when they fail to give us the most convincing explanation of the evidence. The onus is therefore on the supporters of a theory to make the demonstrations, not on opponents to ‘trump’ them by coming up with something better. Otherwise we’d still be discussing the distribution of phlogiston, or the particular qualities of ectoplasm.

Prior to 1980, the dominant factor influencing modulations in climate was thought to be the sun. This makes sense, since our primary energy source (unless you happen to live by a volcanic vent) is the sun. If the current vogue for greenhouse gases loses favour, the result will not be a dangerous, unstable rip in the fabric of space-time. It’s simply likely that the consensus will, in the absence of a more compelling explanation, revert to solar influences.

(Ironically, Hubert Lamb – the father of climatology, who left the Met Office to found CRU in 1972 – remained sceptical of the greenhouse gas theory until the end.)

Now every scientific challenge is unique, but the manmade global warming hypothesis poses several specific problems for even the most honest scientist. The real battleground is over aspects of the ‘energy budget’ model – and convincing people means overcoming a number of challenges. The theory posits that small increases in CO2 concentrations (advocates prefer the phrase ‘well-mixed greenhouse gases’) have significant amplification effects. It’s accepted that a doubling of CO2 introduces very little warmth into the system – less than a degree centigrade, which is hardly toasty and leaves us some way short of Thermageddon. Increasing the CO2 concentration further doesn’t make an appreciable difference: since absorption is logarithmic, each additional increment matters less and less.
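
To see why the effect tapers off, it helps to put rough numbers on the logarithm. The sketch below uses the simplified forcing expression commonly quoted in the literature (a change in forcing of roughly 5.35 × ln(C/C0) watts per square metre); the concentrations and the helper function are illustrative assumptions, not figures taken from this piece.

    import math

    def co2_forcing(c_new, c_old, alpha=5.35):
        # Simplified logarithmic forcing: dF = alpha * ln(C_new / C_old), in W/m^2.
        # Purely illustrative - alpha is the commonly quoted textbook coefficient.
        return alpha * math.log(c_new / c_old)

    print(co2_forcing(560, 280))   # first doubling (280 -> 560 ppm):   ~3.7 W/m^2
    print(co2_forcing(1120, 560))  # second doubling (560 -> 1120 ppm): ~3.7 W/m^2 again
    print(co2_forcing(400, 300))   # +100 ppm at a low concentration:   ~1.5 W/m^2
    print(co2_forcing(500, 400))   # the same +100 ppm added later:     ~1.2 W/m^2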

So positive feedbacks play a central role in the hypothesis, which suggests that with more clouds, more energy is ‘trapped’, permafrost melts, methane is released, and so on, all increasing temperatures further. Global Warming theory rests on these strong positive feedbacks. If the earth absorbs larger amounts of CO2 than predicted – then the theory fails. If the earth radiates more out to space, then it fails. If the negative feedbacks outweigh the positive feedbacks, then the theory fails. As you may tell by now, demonstrating that greenhouse gases play some kind of role in the climate is not difficult. Demonstrating that they play the dominant role is.
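
For a sense of the arithmetic, the textbook way of expressing this is a feedback factor f that scales the small direct warming: total warming = direct warming / (1 − f). A minimal sketch, with purely illustrative numbers that are not taken from the piece:

    def amplified_warming(direct_warming, feedback_factor):
        # Textbook feedback amplification: dT = dT0 / (1 - f).
        # f > 0 amplifies the direct warming, f < 0 damps it.
        # All values here are illustrative assumptions.
        return direct_warming / (1.0 - feedback_factor)

    direct = 1.0  # assume roughly a degree of direct warming per doubling (illustrative)
    print(amplified_warming(direct, 0.0))    # no net feedback:         1.0 C
    print(amplified_warming(direct, 0.5))    # strong positive:         2.0 C
    print(amplified_warming(direct, 0.67))   # stronger positive:      ~3.0 C
    print(amplified_warming(direct, -0.5))   # net negative feedback:  ~0.7 C

The whole argument over climate sensitivity is, in effect, an argument over the sign and size of f.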

Additionally, and to the perennial amazement of newcomers to the field, there is no ‘fingerprint’ or telltale signal that anthropogenically produced gases are the primary forcing factor. A few candidates have briefly starred in the role – C-14 isotopes, or signs of a ‘hotspot’ under the stratosphere – but these are rarely cited now. The ‘smoking pistols’ have proved to be ambiguous, or missing in action. With the human component just a small part (5 per cent) of CO2, and CO2 a small (5 per cent) part of the overall greenhouse gas mix, the challenge is clear.

Hence the increasing dependence, since 1980, on a range of anecdotal evidence and on computer modelling. In instances where simple empirical tests are sufficient to prove a theory, neither is needed. But science has now moved into what critics call a ‘post-modern’ phase. In 2001, the IPCC published its Third Assessment Report and observed:

“Our knowledge about the processes, and feedback mechanisms determining them, must be significantly improved in order to extract early signs of such changes from model simulations and observations.”

So, while expressing quite frankly the state of the science, the IPCC was giving increasing weight to computer models relative to observations. Modelling was beginning to eclipse empirical evidence.

So reasonable doubt exists over whether something as significant as clouds is a positive or a negative feedback. The Fourth Assessment Report acknowledged that the “Level of Scientific Understanding” of non-greenhouse forcings was low. That was charitable: the science hasn’t really been done yet.

Now it’s clear from the CRU exchanges – particularly the Wigley–Trenberth “Where did the Warming go?” dialogue – that the energy budget isn’t scientifically understood at all.

Not Proven is a reasonable verdict.

To the Moon – with extreme engineering

Lunar Orbiter

The Apollo space program is remembered as a triumph of power and industrial might. The superpowers’ space programs were, of course, political and chauvinistic, designed to showcase national wealth. But there’s a better way of looking at the program, Dennis Wingo reminded me recently. Masses of money helped put man on the Moon, of course, but the Moon program is really a tale of engineering improvisation and human organisation.

Space expert and entrepreneur Dennis Wingo put the first webserver – an Apple Mac – in orbit, for just $7m, and has helped piece together a lot of historical material that NASA didn’t appreciate at the time – and forgot about, or wiped. There is one piece of kit in particular that encapsulates two stories: NASA’s negligence, and the quite amazing improvisation of the engineers. It’s the Lunar Orbiter, which mapped the moon’s surface prior to manned descent. Wingo painstakingly recovered and restored much of the imagery it took.

To give us an idea of how much Apollo owed to seat-of-the-pants ingenuity, it’s worth remembering that the story of the Orbiter begins in 1961 – the year of the first human orbit of the Earth by Yuri Gagarin. The space pioneers were seeing a high death rate among their test subjects – dogs (the USSR) and chimps (the USA), the latter proving to be a duff move: the chimps panicked in the claustrophobic conditions.

The US program lagged far behind the Soviets’, and NASA’s early attempts to keep up had become a national joke. The Ranger had been the first project to photograph the moon, with the modest ambition of crashing a probe onto the surface. But of the first six Rangers, two failed to leave the Earth’s orbit, one failed en route, two missed the Moon completely, and although the sixth reached the target, its cameras failed.

Yet by 1964, much of the technology that eventually put man on the Moon had already been designed and built. The colossal Apollo expenditures were on the physical implementation of the program, including the many test flights. By 1965, the Apollo Lunar Excursion Module (LEM) was already being prepared as a long-term shelter and accommodation unit. And as Wingo points out, it was really down to 400 engineers – a fraction of what Google devotes to inserting advertisements into web pages – being given the freedom to put Heath Robinson designs into practice.

The Lunar Orbiter astonishes even today. It had to take pictures, develop and scan the film on board, and broadcast the images successfully back to earth. Naturally, the orbiter had to provide its own power, orient itself without intervention from ground control, maintain precise temperature and air pressure for the film processing, and protect itself from solar radiation and cosmic rays – all within severe size and weight constraints. This was far beyond the capabilities of the newest spy satellites, which back then returned the film to earth in a canister, retrieved by a specially kitted-out plane. The Orbiter challenge was the Apollo challenge in miniature.
…Read more at The Register

Breaking Bad: the joy of chemistry

Breaking Bad

Here’s a show with the perfect profile to be a huge cult British hit – black humour, suspense, all the stuff we love. But what’s puzzling is how the British public broadcasters dropped the ball by failing to notice the show – particularly the BBC.

…Read more at The Register

BBC's science: 'Evangelical, shallow and sparse'

The BBC’s environmental coverage has come under fire from a former science correspondent. Award-winning author and journalist David Whitehouse says the corporation risks public ridicule – or worse – with what he calls “an evangelical, inconsistent climate change reporting and its narrow, shallow and sparse reporting on other scientific issues.”

Whitehouse relates how he was ticked off for taking a cautious approach to apocalyptic predictions when a link between BSE in cattle (“Mad Cow Disease”) and vCJD in humans was accepted by government officials in 1996. Those predictions “…rested on a cascade of debateable assumptions being fed into a computer model that had been tweaked to hindcast previous data,” he writes.

“My approach was not favoured by the BBC at the time and I was severely criticised in 1998 and told I was wrong and not reporting the BSE/vCJD story correctly.”

The Beeb wasn’t alone. With bloodthirsty glee, the Observer newspaper at the time predicted millions infected, crematoria full of smoking human remains – and the government handing out suicide pills to the public. Whitehouse feels his caution is now vindicated. The number of cases of vCJD recorded in the UK now stands at 163 – and the only suicides were farmers who had feared their livelihoods destroyed.

Writes Whitehouse:

“Reporting the consensus about climate change…is not synonymous with good science reporting. The BBC is at an important point. It has been narrow minded about climate change for many years and they have become at the very least a cliché and at worst lampooned as being predictable and biased by a public that doesn’t believe them anymore.”

Continue reading “BBC's science: 'Evangelical, shallow and sparse'”

Junk science and booze tax – a study in spin

“Let’s find out what everybody is doing, and stop them doing it” – A P Herbert

Putting the price of alcohol up to a minimum of 40p a unit would keep 41,000 people a year out of hospital, save the NHS £116m a year, and avoid 12,400 cases of unemployment, a report from Sheffield University claimed last week. These appear to be remarkably precise predictions. The government used the report – widely quoted in the press – to justify higher duties and greater regulation of the sale of alcohol. Yet on close examination, the report appears to be a prime example of “policy-based evidence making”.

The blockbuster report, from Sheffield University’s Section of Public Health, is in two major parts: a review of evidence, and a statistical model, totalling over 500 pages. Researchers examined the effects of alcohol pricing and alcohol promotion (and advertising) on three areas: consumption, public health and crime. I won’t cover the latter, because these proposals were dropped before the Queen’s Speech, but it is evident from the amount of time the Sheffield researchers devoted to it that this was a legislative priority. Academia marches in lockstep with its financial benefactor – in this case, of course, the Department of Health.

Read more at The Register

The Large Hadron Collider: Anton Wylie

CERN's LHC

The LHC comes at a crucial time for particle or quantum physics. In particular, it comes at a crucial time for the dominant theory, known as the Standard Model.

The Standard Model has been to modern particle physics rather what the periodic table was to 19th century chemistry. It served both to organise the known entities systematically, and as an impetus to fill in the holes in our knowledge. The Standard Model can claim to have predicted the existence of several previously unexpected particles, which were subsequently discovered experimentally. Arguably, too, it has also seeded the separate field of quantum information theory, and quantum computing.

From the point of view of having things neat and tidy, there is just one hole in the jigsaw of the Standard Model. The missing piece is the (by now surely) world-famous Higgs boson – popularly known as the “God particle”. So named not because it could resolve the Augustinian Dilemma, but perhaps as in, “Oh God, when are we going to find it?”. More seriously, the Higgs boson could account for the mass properties of the other entities – why some have it, and some don’t. So if particle physicists observe the Higgs boson, they can effectively draw a line under 50 years or so of research, slap themselves on the back, and move on.

Unfortunately, the Higgs boson has spent over 40 years hiding – ironically, not because it is tiny. The Standard Model does not predict its mass. As efforts have concentrated on manufacturing the boson in particle accelerators, its continuing elusiveness has been put down to it being big – a tad too big.

“Particle physics has other gaping explanatory holes to fill.”

Hence the LHC, which crudely speaking whizzes bits of matter up to as high a speed as possible. Experimenters let these crash into various targets to see what interesting new bits emerge. The record-breaking energies of the LHC require similarly record-breaking electromagnets to achieve.

…Read more at The Register

How the middle classes' superstitions keep Africa poor and hungry

The man dubbed the “King of Climate Porn” – Professor Sir David King, the government’s former chief scientific adviser – achieved notoriety at the turn of the decade as the architect of the Foot and Mouth holocaust, which unnecessarily slaughtered seven million animals and cost the country billions of pounds. But King astonished observers by saying something sensible last week – and he promises to do so again tonight.

Speaking at the British Association’s Science Week, King will say that the Greenies’ anti-science superstitions are causing unnecessary suffering in Africa. King blames “anti-poverty” campaigners, aid agencies and environmental activists for keeping modern farming techniques and bio-technology out of Africa. King tells the Times today:

“The suffering within [Africa], I believe, is largely driven by attitudes developed in the West which are somewhat anti-science, anti-technology – attitudes that lead towards organic farming, for example, attitudes that lead against the use of genetic technology for crops that could deal with increased salinity in the water, that can deal with flooding for rice crops, that can deal with drought resistance.”

King wonders why recent productivity revolutions in agriculture, which have been such a success in Asia and India, have not been implemented in Africa on the same scale. He concludes that the blame lies not with Africans, but with Western “do-gooders” who prefer Africans to remain picturesque and dirt poor.

An example he cites is the attempts of eco-campaigners Friends of the Earth to keep drought-resistant crops out of Africa.

He has a point.

“Where once there were ambitions for people in the third world to enjoy Western standards of living, now the voice of the voiceless instead celebrates the primitive lifestyles that the world’s poorest people suffer,” wrote Ben Pile and Stuart Blackman recently in a scathing critique of the charity Oxfam, called Backwards to the future.

Indeed, and the same middle-class superstitions that endeavour to keep Wi-Fi out of schools are used to justify keeping biotechnology out of Africa.

For example, Friends of the Earth continues to argue that modern seed technologies should not be used to make agriculture easier and more productive for poor farmers – even when this causes more ecological damage than the new technology. FoE’s most recent campaign against biotech means that subsistence farmers must continue to use seeds that require more fertiliser than GM varieties, and which need environmentally-destructive tilling.

Whatever it is that motivates these self-styled “Greens”, it isn’t a concern for the environment. Nor, despite claims to the contrary, is there any valid concern about “over-population”. The UN estimates that global population will peak in the 2040s at 7.87bn, then decline, assuming modest development is permitted to continue. Not only does economic development mean fewer people, it also means less suffering: those fewer people are much happier.

Clearly, we can easily generate enough food to feed everyone on the planet and we have the means to ensure there’s less human suffering. Some people want that to happen – and some don’t. You’ll find many nursing their Malthusian or eugenicist prejudices under the banner of Greenery in the latter camp – but it’s a refreshing surprise to find King in the former, or at least edging away from the Greens’ death cult.

©Situation Publishing 2008.

Bringing it all back Hume: Anton Wylie

A philosophy of science that may be the best thing we’ve ever run

WiReD magazine’s editor-in-chief Chris Anderson has just seen the end for scientific theories. And it is called Google.

The concept of the mind, and by extension that of a person, was also affected, with far-reaching implications.

In psychology, Behaviourism was one favoured development. Its ontology does not include people with minds, only biological entities with patterns of behaviour. The rise and rise of neuroscience is correlated with this. Another favoured arena is politics. The New Labour government in the UK boasts almost daily that it is in the business of “modifying behaviour”.

Even when this type of thinking is felt to be repugnant, the tendency remains to treat people as parametrically determined objects. The phrase “hearts and minds” admits that people feel and think, but implies that what matters is to ascertain which feelings and thoughts affect them most strongly. Modern politics consists to a large extent of this type of appeal – and the part of it conducted through the media does so almost exclusively.

Read more at The Register

'Use me as a mouthpiece', pleads Guardian hack

Ben Goldacre, The Guardian’s Mr “Bad Science”, writes witheringly about sloppy science journalists. Many of them are simply “juggling words about on a page, without having the first clue what they mean, pretending they’ve got a proper job, their pens all lined up neatly on the desk,” he writes.

They trade on scare stories, and rely on “rejiggable press releases”. Dr Goldacre is a real scientist, you see.

But last week found Ben in a frantic rush, commissioned to write a feature about biometric technology. So he put in an email request to the Open Rights Group, the endearingly hopeless British EFF-clone.

(This isn’t surprising – we suspect that at El Graun, hacks are equipped with two office telephones: a normal one, and one with only one button, which dials the ORG directly.)

And as every journalist knows, desperate deadlines call for desperate measures. Here’s his request –

hi, my name’s ben and i write “badscience” in the guardian (and badscience.net )

i wanted to write something on the shitness of biometrics tomorrow for the col on sat, if anyone’s got a nice big bundle of stuff i need (a) people like, say, hang on, gordon brown in PMQ making grand claims about how they will cure all ills and (b) good evidence/arguments/rocksolidundeniablefacts on why these claims are nonsense.

So far, so standard – although eyebrows may be raised at the way that fact/assertion sort of run/into/each/other.

Then comes a bit where he slowly starts to sink into the merde.

incidentally, before you assume that i’m a lazy journo, i dont write like this with anyone else, but in fact i am offering ORG the chance to use me as a mouthpiece for your righteous rightness.

Er, a what?? Ben elaborates –

think of it as a “pull” model for lobbying, rather than the usual push.

Ah, perfectly clear.

essentially i have a bag of kittens and will drown one on the hour every hour until you give me a good biometrics story.

Presumably, the “rejiggable material” from the ORG arrived on time – for the mouthpiece duly opened on Saturday.

So this is how journalism really works: don’t bother yourself with any of that cool judgement and independent appraisal of facts business. Find the argument, then some facts to suit. And finally, ring up your favourite lobby group and demand to be used as a mouthpiece.

With the ORG, however – a sort of Dad’s Army in the War on Copyright – it’s a perilous approach.

Two years ago, the Group made a submission to the UK Parliament’s enquiry into DRM – something close to all our hearts. Only, the technical part of the argument was based on a ludicrous misunderstanding of the Church-Turing Thesis – one of the fundamentals of computer science – a mistake so great it would be enough to get a grad paper marked “FAIL”. Yet no one at the lobby group seems to have noticed – it’s still listed as one of the group’s finest achievements.

Even the most “righteously righteous” lobby group can get its rocksolidundeniablefacts/arguments wrong. Take note.

Smart radios are still pretty dumb

More than three years ago, your reporter got a good taste of how miserable technology utopians can be. It was at Intel’s Developer Forum in San Francisco, and the debate was about liberating analog TV spectrum for exciting new digital uses. The analog switchover is slated for February 2009.

On behalf of Microsoft, Google, and Intel, the technology evangelists argued that smart radios were here, but that the evil regulator, the FCC, wouldn’t permit them to deploy the technologies. Broadcasters countered that these experimental new technologies caused interference with their signals. [See Abolish Free TV – Intel].

In the hallways afterwards, one delegate and deregulation evangelist couldn’t understand why the FCC couldn’t just confiscate the spectrum from the TV broadcasters and be done with it.

“Why do the broadcasters need any spectrum at all?” she asked.

Because free TV is one of the few pleasures some Americans can afford, perhaps. A slightly less arrogant and more technically adept argument was advanced, which claimed that the space between allocated TV channels was “beachfront property”. Instead, the regulator copped it – it was all the fault of the FCC’s “command and control” outlook.

(The deregulation fanatics want a spectrum free-for-all and dream of the FCC being scrapped. The FCC is permitting fixed WSDs (white space devices) from 2009, but the industry wants mobile handheld WSDs to be permitted too.)

Now, agile radio has been tested and found to be not quite so agile as its proponents touted. At the end of July, the FCC’s engineering office published two sets of results from a four-month trial of agile radio equipment submitted by the “White Space Coalition”, which includes Microsoft, Google, Intel, Dell, and HP.

“Depending on the effectiveness of shielding of a TV receiver’s tuner, emissions within a broadcast white space (i.e., within an unused broadcast channel) could potentially cause co-channel interference to a TV receiver tuned to a digital cable channel that overlaps the spectrum of the white-space device emission,” the FCC noted.

The lab found that the spectrum sensing of the equipment it tested couldn’t detect the white space with sufficient accuracy.

For one prototype sensor, the FCC noted:

“the results of the bench test for determining the baseline minimum detection sensitivity demonstrates that the device will not meet the manufacturer-specified threshold of -114 dBm (or the IEEE 802.22 proposed threshold of -116 dBm for fixed devices) and in fact, fails to meet both of the thresholds by about 20 dB. The results of the field tests also demonstrate inconsistent performance”

The manufacturer may have misread the spec, it suggests. The sensor also failed to detect the presence of a wireless microphone at all.
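
To put that 20 dB shortfall in perspective: decibels are logarithmic, so a sensor that misses a -114 dBm threshold by 20 dB only notices signals roughly a hundred times stronger than the ones it is supposed to detect. A quick back-of-the-envelope sketch – the threshold figures come from the FCC quote above, while the helper function is our own illustration:

    def dbm_to_milliwatts(dbm):
        # Convert a power level in dBm to milliwatts: P(mW) = 10 ** (dBm / 10).
        return 10 ** (dbm / 10.0)

    required = -114.0  # detection threshold cited by the FCC, in dBm
    achieved = -94.0   # roughly 20 dB short of that, per the bench test
    print(dbm_to_milliwatts(achieved) / dbm_to_milliwatts(required))  # ~100x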

A second prototype sensor performed to the -114 dBm threshold, but got confused when a second DTV channel was turned on – the manufacturer asked that it be excluded from further real-world tests. This prototype also failed to pick up a wireless mic, except on the two lowest channels. Both sensors were also severely hampered by the microphones themselves.

Tests of a prototype transmitter also demonstrated interference, and generated some scepticism among engineers as to whether the filtering required to avoid knocking out TV signals can be implemented in a real product.

So smart radios have a long way to go, and this white space looks less like a “beachfront property” and more like a Cambodian minefield.

Microsoft told the Washington Post today that it had given the FCC a successful demonstration last week – and insisted it will all work out in the end.

As soon as it’s got the pesky physics sorted out.