Posts Tagged ‘junk science’

Doug Keenan on Open Data

Wednesday, June 29th, 2011

Doug Keenan, the statistician whose work highlighted severe flaws in the work of the Climatic Research Unit at East Anglia, has welcomed the Sunshine order to open up the station records.

Scientists need the raw data to replicate temperature records, but CRU refused to release the data requested – a subset of weather station records from around the world – to a top Oxford physicist, despite having already shared the data with Georgia Tech in the United States.

The ICO comprehensively demolished the reasons CRU offered – including intellectual property and fear of jeopardising international relations. In doing so, it’s raised the standard for academics working across all UK sciences.
(more…)

Captain Cyborg: Computers are alive, like bats or cows

Friday, June 17th, 2011

Self-harming attention-seeker Kevin Warwick has admitted to snooping on the public in a previous life. Warwick made the creepy confession on Radio 4, recalling an earlier job as a GPO engineer:

“I remember taking ten different calls and plugging them all together; one call would continue, the other nine would listen in. Then I’d patch everything back again.”

In a 30-minute interview with Michael Buerk, Warwick compared his cat-chipping operation of a decade ago to Yuri Gagarin’s first space flight. They were both scientific pioneers.
(more…)

Greatest Living Briton gets £30m for ‘web science’

Monday, March 22nd, 2010

As an alliance of the desperate, this one takes some beating. The Greatest Living Briton (Sir Tim Berners-Lee) has been thrown £30m of taxpayers’ money for a new institute to research “web science”.

Meanwhile the Prime Minister waxed lyrical today about the semantic web – how “data” would replace files, with machine speaking unto machine in a cybernetic paradise.

It’s really a confluence of two groups of people with a shared interest in bureaucracy.

Computer Science is no longer about creating graduates who can solve engineering challenges, but about generating work for the academics themselves. The core expertise of a CompSci department today is writing funding applications. And the Holy Grail for these paper chasers is a blank cheque for work which can be conducted without scrutiny for years to come. With its endless committees defining standards (e.g., “ontologies”, “folksonomies”) that no one will ever use, the “Semantic Web” fits the bill perfectly.

Of course, most web data is personal communication that happens to have been recorded. Most of the rest is spam, generated by robots, or cut-and-paste material ‘curated’ by the unemployed or poor graduates – another form of spam, really. The enterprise is doomed. But nobody’s told the political class.

(more…)

Mystic Met Office abandons long range forecasts

Friday, March 5th, 2010

Tea leaves

The Met Office has confirmed it is to abandon long range weather forecasts, finally acknowledging criticism. The most recent forecasts were so inaccurate that even the BBC is considering appointing an alternative supplier, such as AccuWeather, after 88 years of continuous service from the 1,700-strong MoD unit.

The Mystic Met predicted a “barbecue summer” for 2009; the third washout summer in a row, with the wettest July since 1914, duly followed. A mild winter was then given a high probability, only for the UK to suffer its coldest winter for 30 years. Yet Met Office staff received performance-related pay bonuses worth over £12m over five years, it was revealed last week in response to a Parliamentary question.
(more…)

UK Physicists on Climategate

Monday, March 1st, 2010

The body representing 36,000 UK physicists has called for a wider enquiry into the Climategate affair, saying it raises issues of scientific corruption. The Institute of Physics doesn’t pull any punches in the submission, one of around 50 presented to the Commons Select Committee enquiry into the Climategate archive. The committee holds its only oral hearing later today.

The IOP says the enquiry should be broadened to examine possible “departure from objective scientific practice, for example, manipulation of the publication and peer review system or allowing pre-formed conclusions to override scientific objectivity.”

It deplores the climate scientists’ “intolerance to challenge” and the “suppression of proxy results for recent decades that do not agree with contemporary instrumental temperature measurements.”
(more…)

Nu Lab’s favourite boffin

Monday, January 11th, 2010

New Labour’s favourite boffin has lost her job – for a very New Labour reason – and has responded with a classically New Labour riposte.

Oxford neuroscientist Susan Greenfield was made redundant from her post as Director of the Royal Institution after failing to balance the books. The full-time post itself is being abolished. In return, the Life Peer and Wired UK star is suing the science charity for sex discrimination.

Greenfield’s £22m refurbishment of the Institution’s HQ saw it go into the red by £3m, and it had to sell property to balance the books. The refurbishment saw a new cafe bar and restaurant open at Albemarle Street.

 

Read more at The Register…

Global Warming ate my data

Friday, August 14th, 2009

The dog did it

The world’s principal source for the global temperature record admits it has lost or destroyed all the original data that would allow a third party to construct a global temperature record. The destruction (or loss) of the data comes at a convenient time for the Climatic Research Unit (CRU) in East Anglia – permitting it to snub FoIA requests to see the data.

The CRU has refused to release the raw weather station data and its processing methods for inspection – except to hand-picked academics – for several years. Instead, it releases a processed version, in gridded form. NASA maintains its own record (GISTEMP), but the CRU Global Climate Dataset is the surface temperature record most cited by the UN IPCC. So any errors in CRU cascade around the world, and become part of “the science”.

Professor Phil Jones, the activist-scientist who maintains the data set, has cited various reasons for refusing to release the raw data. Most famously, Jones told an Australian climate scientist in 2004:

Even if WMO agrees, I will still not pass on the data. We have 25 or so years invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it.

In 2007, in response to Freedom of Information Act requests, CRU initially said it didn’t have to fulfil the requests because “Information accessible to applicant via other means Some information is publicly available on external websites”.

Now it’s citing confidentiality agreements with Denmark, Spain, Bahrain and our own Mystic Met Office. Others may exist, CRU says in a statement, but it might have lost them because it moved offices. Or they were made verbally, and nobody at CRU wrote them down.

Read more at The Register.

Japan's boffins: 'Global warming isn't man-made'

Wednesday, February 25th, 2009

Japanese scientists have made a dramatic break with the UN and Western-backed hypothesis of climate change in a new report from the country’s Energy Commission.

Three of the five researchers disagree with the UN’s IPCC view that recent warming is primarily the consequence of man-made industrial emissions of greenhouse gases. Remarkably, the subtle and nuanced language typical in such reports has been set aside.

One of the five contributors compares computer climate modelling to ancient astrology. Others castigate the paucity of the US ground temperature data set used to support the hypothesis, and declare that the unambiguous warming trend from the mid-part of the 20th Century has ceased.

The report by the Japan Society of Energy and Resources (JSER) is an astonishing rebuke to international pressure, and a vote of confidence in Japan’s native marine and astronomical research. Publicly-funded science in the West uniformly backs the hypothesis that industrial influence is primarily responsible for climate change, although fissures have appeared recently. Only one of the five top Japanese scientists commissioned here concurs with the man-made global warming hypothesis.

JSER is the academic society representing scientists from the energy and resource fields, and acts as a government advisory panel. The report appeared last month but has received curiously little attention. So The Register commissioned a translation of the document – the first to appear in the West in any form. Below you’ll find some of the key findings – but first, a summary.

See the translation at The Register.

The BBC, Thermageddon, and a Giant Snake

Sunday, February 15th, 2009

a giant snake

Listeners to BBC World Service’s Science in Action program got a nasty surprise last week. In the midst of a discussion about a giant snake fossil, a scientist dropped this bombshell:

“The Planet has heated and cooled repeatedly throughout its history. What we’re doing is the rate at which we’re heating the planet is many orders of magnitude faster than any natural process – and is moving too fast for natural systems to respond.”

Hearing this, I did what any normal person would do: grab all the bags of frozen peas I could find in the ice compartment of my refrigerator, and hunker down behind the sofa to wait for Thermageddon.

Hours passed. My life flashed before my eyes a few times, and a few times more. But then I noticed that the house was still there, and so was the neighbourhood. And so was I!

(more…)

Climate Models vs. Reality: Anton Wylie

Thursday, December 27th, 2007

Climate Models vs Reality

Climate models appear to be missing an atmospheric ingredient, a new study suggests.

December’s issue of the International Journal of Climatology from the Royal Meteorological Society contains a study of computer models used in climate forecasting. The study is by joint authors Douglass, Christy, Pearson, and Singer – of whom only the third mentioned is not entitled to the prefix Professor.

Their topic is the discrepancy between troposphere observations from 1979 to 2004, and what computer models have to say about the temperature trends over the same period. While focusing on tropical latitudes between 30 degrees north and south (mostly to 20 degrees N and S), because, they write, “much of the Earth’s global mean temperature variability originates in the tropics”, the authors nevertheless crunched through an unprecedented amount of historical and computational data in making their comparison.

For observational data they make use of ten different data sets, including ground and atmospheric readings at different heights.

On the modelling side, they use the 22 computer models which participated in the IPCC-sponsored Program for Climate Model Diagnosis and Intercomparison. Some models were run several times, to produce a total of 67 realisations of temperature trends. The IPCC is the United Nations’ Intergovernmental Panel on Climate Change, and published its Fourth Assessment Report [PDF, 7.8MB] earlier this year. Its model comparison program uses a common set of forcing factors.

Notable in the paper is a generosity when calculating a figure for statistical uncertainty for the data from the models. In aggregating the models, the uncertainty is derived by plugging the number 22 into the maths, rather than 67. Using 67 would narrow the error band around the average trend – with the implication of making it harder to reconcile any discrepancy with the observations. In addition, when they plot and compare the observational and computed data, they also double this error interval.
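To see why the choice of sample size matters, here is a minimal sketch of the standard error of the mean, using purely illustrative figures (the mean trend and standard deviation below are assumptions, not values from the paper):

```python
import math

# Hypothetical figures for illustration only; the paper's actual
# per-model tropical trends are not reproduced here.
mean_trend = 0.214   # assumed ensemble-mean trend, degC/decade
std_dev = 0.092      # assumed standard deviation across model trends

def standard_error(sd, n):
    """Standard error of the mean for n independent samples."""
    return sd / math.sqrt(n)

# Counting 22 (one per model) gives a wider, more forgiving
# interval than counting all 67 realisations.
se_22 = standard_error(std_dev, 22)
se_67 = standard_error(std_dev, 67)

print(f"SE with n=22: {se_22:.4f}")
print(f"SE with n=67: {se_67:.4f}")

# The wider the interval, the easier it is for the observations
# to fall inside it; doubling it, as the authors do, widens it further.
assert se_22 > se_67
```

Since the standard error shrinks with the square root of n, using 22 rather than 67 inflates the interval by a factor of roughly √(67/22) ≈ 1.75 – in the models’ favour.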

So to the burning question: on their analysis, does the uncertainty in the observations overlap with the results of the models? If yes, then the models are supported by the observations of the last 30 years, and they could be useful predictors of future temperature and climate trends.

…Read more at The Register.