Posts Tagged ‘Techno utopians’

Tim Kring

Monday, July 19th, 2010

The audience are the actors in writer Tim Kring’s latest adventure. In his famous creation, the TV show Heroes, people discover they have superhero powers, and go off and battle Evil. In his latest, people go and battle Evil, and discover they have been given Nokia smartphones.

The ambitious, Nokia-sponsored interactive extravaganza began this weekend, and it’s an interesting experiment. In Kring’s own words, this series of events, called Conspiracy For Good, is “not quite a drama, not quite a flashmob, not quite an ARG [alternate reality game]”.

What is it, then, and how did it come about?

(more…)

Web politics: The honeymoon is over

Wednesday, April 28th, 2010

Parallel moves in Canada and the US may signal the end of the honeymoon for web-based political campaigning – or change it beyond recognition.

Politicians are becoming increasingly familiar with sudden squalls of email filling up their inboxes, and policy makers with responses to public consultations arriving via a web intermediary. But, not surprisingly, many of these can be phoney, inflating the true size of what purports to be a "grassroots" campaign.

The shortcomings of the web-based approach were illustrated here recently. Photographers on a shoestring budget successfully mobilised against Clause 43 – but internet campaigners concerned about file-sharing who used a site to send 20,000 emails about the Digital Economy Act failed to make an impression, resulting in a triumph for the BPI.

Earlier this month Obama’s internet guru, Harvard academic Cass Sunstein, warned departments that internet opinion shouldn’t be used as an opinion poll or focus group. He advised that:

Agencies exercise good judgment and caution when using rankings, ratings, or tagging. Specifically, agency use of the information generated by these tools should be limited to organizing, ranking, and sorting comments. Because, in general, the results of online rankings, ratings, and tagging (e.g., number of votes or top rank) are not statistically generalizable, they should not be used as the basis for policy or planning.

That’s pretty conclusive. Four years ago, Sunstein published a love letter to Web 2.0 called Infotopia: How Many Minds Produce Knowledge that praised Wikipedia, blogs and prediction markets. But this is an altogether more sober assessment. He seems to have had second thoughts.

A vivid illustration of how a few single-issue fanatics can skew the results of an opinion poll is currently being digested in Canada. Lawyer Richard Owens has investigated the responses and found something quite interesting.

In response to a copyright consultation paper, over 8,000 responses were submitted – but 65 per cent of these were an identical form email sent from one IP address, that of the "Canadian Coalition for Electronic Rights". Owens notes that these:

…Included Submissions in which: no names were used; only first names were used (there were, for example, sixty-eight “Chris” and seventy-two “John” who made Submissions); and, suspect names, such as – “D Man”, “El Qwazo”, “pr0f1t”, “Cereal”, and “Eagle” – were used. Given the ability to submit anonymously or under false identification, it is highly probable that there are multiple Submissions from the same persons.


The CCER form letter had been circulated around BitTorrent P2P fan sites. But most of the visitors to these sites aren’t Canadian. Quantity overruled quality.
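The kind of analysis Owens performed is not hard to reproduce. A minimal sketch of it, using invented sample data (the names are drawn from the passage above; the IP addresses and message bodies are placeholders), might group responses by body text to find form letters, and by source IP to measure how much of the "grassroots" came from one place:

```python
# Sketch of a form-letter audit on consultation responses.
# Sample data is invented for illustration; only the suspect
# names come from the passage above.
from collections import Counter

submissions = [
    {"name": "Chris",    "ip": "192.0.2.1",    "body": "Repeal the bill."},
    {"name": "John",     "ip": "192.0.2.1",    "body": "Repeal the bill."},
    {"name": "El Qwazo", "ip": "192.0.2.1",    "body": "Repeal the bill."},
    {"name": "A. Reader", "ip": "198.51.100.7", "body": "I support fair dealing reform."},
]

# Identical bodies submitted more than once are candidate form letters.
by_body = Counter(s["body"] for s in submissions)
form_letters = {body: n for body, n in by_body.items() if n > 1}

# What share of all responses came from the single busiest IP?
by_ip = Counter(s["ip"] for s in submissions)
top_ip, top_count = by_ip.most_common(1)[0]
share = top_count / len(submissions)
```

On this toy data the busiest IP accounts for three-quarters of the responses – the same shape of result, in miniature, as the 65 per cent Owens found.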

Observers wondered whether something similar might have happened in the UK. When the Open Rights Group ventured into the real world, the numbers were small: it mustered just over 100 bodies for its main demo, and only single figures for its "flash mobs". The ORG’s "Your Message To Mandelson" campaign launched last year rapidly gathered 300 anonymous messages – but stalled at around the 500 mark.

Such disparities led people to question how representative the online activity really was. "Is this a particularly well-focussed campaign by a relatively small group of activists?", asked the BBC’s Rory Cellan-Jones.

Read more at The Register

How the photographers won, while digital rights failed

Monday, April 19th, 2010

How did the music business end up with a triumph in the new Digital Economy Act? How did photographers, whose resources were one laptop and some old-fashioned persuasion, carry an unlikely and famous victory? How did the digital rights campaigners fail so badly?

Back in January, a senior music business figure explained to me that Clause 17, which gave open-ended powers to the Secretary of State, was unlikely to survive the wash-up. But he didn’t much care; the other sections which compelled the ISPs to take action against infringers were good enough. Anything else was a bonus – possibly even a distraction. Yet to the amazement of the music business, web blocking is now legislation.

I think this is a watershed in internet campaigning. It’s not just a tactical defeat, it’s a full-on charge of the light brigade…

Read more at The Register

Obama’s got a Google problem

Monday, April 12th, 2010

Obama has created an exquisite problem by hiring so many senior executives from Google – some of the Oompa Loompas don’t seem to realise they no longer work for the company. Now a Congressman has called for an enquiry.

The issue was made apparent when a trail of correspondence by administration official Andrew McLaughlin was exposed recently. McLaughlin is Obama’s deputy CTO – a freshly minted post, with CTO meaning either Citizens Twitter Overlord, or Chief Technology Officer – we believe it’s the latter. He was previously Google’s chief lobbyist, or ‘Head of Global Public Policy and Government Affairs’.

McLaughlin’s contacts were also exposed. In an irony to savour, the exposure was by Google itself, as it introduced its privacy-busting Buzz feature in February. As our Cade pointed out, it would be hard to imagine a better Google story.
(more…)

Google abandons Search

Wednesday, December 9th, 2009

It’s hard to explain to people new to the web since 2004 – the Digg kids – the effect that Google had on the internet at the turn of the decade. They can’t conceive the Before and the After. Google was miraculous, and so much better than the competition that rivals effectively gave up trying to compete with it. But Google’s PageRank also unleashed social and political fads which reverberate right through to this day.

Much of the junk science of the web comes from Googlemania of this period. New institutes and venerable academic departments today all drink from the seemingly bottomless well. It permeates into Birtspeak 2.0, and you can see it in the Thumbs Up and Thumbs Down you see in Comments, for example. The mini-industry called “Social media marketing” wouldn’t really exist without it, either.

Google kindled the idea that the Web was a democracy, a great big voting machine. And only Google was qualified to divine its intentions – only Google had the capability and know-how to discern the ‘Hive Mind’. Google said so itself; its PR blurb explicitly made the connection between a New Form of Democracy and its own innovation, the “uniquely democratic nature of the web”.

For a couple of years, PageRank™ worked wonders. Then reality began to mess things up. What had worked well for conferring authority on peer-reviewed academic papers didn’t work quite so well in the wild. As Google grew, the importance of appearing in its rankings also grew. SEO and dirty tricks became big business. (See Meet the Jefferson of Web 2.0.)
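The citation analogy above is the whole trick: a link is treated as a vote, and a vote from a well-linked page counts for more. A minimal power-iteration sketch shows the idea (the three-page graph is invented for illustration; this is the textbook algorithm, not Google's production system):

```python
# Illustrative PageRank via power iteration. Links act like citations:
# each page splits its score equally among the pages it links to.
# The damping factor models a surfer who occasionally jumps at random.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Assumes every page has at least one outbound link."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Sum the shares flowing in from every page that links to p.
            inbound = sum(rank[q] / len(links[q])
                          for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * inbound
        rank = new
    return rank

# Made-up example graph: "c" is linked to by both "a" and "b".
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

The weakness the article describes falls straight out of the maths: a small group that densely interlinks its own pages manufactures exactly the inbound "votes" the formula rewards – which is why tightly interlinked blogs made such a perfect rigging machine.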

This was first pointed out by your reporter in 2003, and it was manifest in two ways. Firstly, via the ease with which a small group of motivated people could hijack search terms, thanks to the dense interlinking nature of blogs. (A more perfect machine for rigging PageRank has yet to be invented.) This was Googlewashing. And secondly, the ease with which spammers could clog the system with noise. The period also saw the migration of large amounts of information to the web in a searchable format. The real-time chatter from protocols that had previously been beyond the reach of search engines – such as AOL chatrooms – found its way into Google. The result, by mid-2003, was a system that was broken.

You may recall that it was heresy at the time to doubt the quite magical technical ability of Google to get it ‘right’. The bandwagon of Web 2.0 had barely started to roll – it wasn’t christened until the following year – but there was already serious money riding on it. But it was an even greater heresy to question the moral authority that the technology utopians had by then conferred on Google.

For Google wasn’t just ranking web pages, but adding to the human epistemological canon – it was telling us what was wrong and right – filtered and legitimised through the people-powered Hive Mind. Thanks to the now-burdensome “Don’t Be Evil”, it constantly reminded us of its impeccable moral credentials.

Well, as you may have seen, PageRank™ is now dead. Google has given up on the job of ranking pages – it can’t cope any more – and outsourced the task of evaluating the job to the user. Needs must, and so it will make a virtue of the very feature that helped destroy the index – real-time noise. As Danny Sullivan points out, this is very big news indeed. I think it’s even bigger than Danny thinks it is – with an extra penthouse layer of bigness on top – for all the social and political implications mentioned above.

By outsourcing the ranking of pages to the hoi polloi, Google is saying that it is no longer in the business of ‘arbitrating’ democracy. This is now the job of hordes of roaming single-issue fanatics, voting pages up and down. You could say the internet has returned to its primordial soup.

(more…)

Kick me again, RIAA!

Thursday, August 6th, 2009
“The anti-copyright gaggle has an insatiable need to feel victimized. Injustice burns deep, and is triggered by the merest hint that “The Man” might be tampering with one’s “bits”. Another example of technology utopians trying to bypass politics and claim victimhood – the “Net Neutrality” campaign – shows very similar characteristics.”

A while ago I joked that perhaps the RIAA had secretly recruited Charlie Nesson to be its court opponent. Everyone from Ray Beckerman at the “Recording Industry vs The People” blog to Nesson’s old pals at the Berkman Centre at Harvard had advised him to knock it off – or at least not pursue a crackpot defence. But when it comes to the technology utopians, all jokes come true eventually.

Nesson has achieved something I thought was completely impossible in 2009, and that’s to allow the US recording industry’s lobby group to paint itself in a sympathetic light. No longer must the RIAA explain why their biggest members are not using technology to make money for the people they represent. The Boston case allowed the four major labels to justify an enforcement policy against opponents who appeared compulsively dishonest, irrational, paranoid, and with an abnormal sense of entitlement.

Nice work, Charlie.

Ken Kesey's Merry Pranksters bus

Nesson failed in his avowed mission “to put the record industry on trial”. He failed to show why disproportionate statutory damages are harmful, which could have had a lasting constitutional effect. He failed to paint the defendant as sympathetic, or “one of us”. He failed to demonstrate why copyright holders make lousy cops. He even had a judge noted for her antipathy to the big record labels. In short, he ceded the moral high ground completely and utterly to the plaintiffs, the four major record labels. The labels’ five-year campaign against end users is finally at an end, but Nesson’s performance leaves it looking (undeservedly) quite fragrant.

Read more at The Register

The Tragedy of the Creative Commons

Thursday, July 16th, 2009

The Creative Commons initiative fulfilled a major ambition last week – but it’s taken only days for the dream to turn to crap.

Google granted the wish by integrating the ability to search images by rights licence into Google Image Search. Yahoo! has offered a separate licence-aware image search for years, but Google built the feature into its main index.

The idea of making the licences machine-readable was a long-standing desire of the project, and lauded as a clever one. It was intended to automate the business of negotiating permissions for using material, so that machine would negotiate with machine, in a kind of cybernetic utopia. Alas, it hasn’t quite worked out.
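"Machine-readable" here means the licence is declared in the page markup itself – Creative Commons' ccREL convention uses a `rel="license"` attribute on a link – so a crawler can pick it up without any human negotiation. A minimal sketch, using Python's standard HTML parser and an invented sample page:

```python
# Sketch: extract self-declared licence links from a page, the way a
# licence-aware image crawler might. The sample HTML is invented.
from html.parser import HTMLParser

class LicenseFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.licenses = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # ccREL convention: rel="license" marks the licence declaration.
        if tag in ("a", "link") and attrs.get("rel") == "license":
            self.licenses.append(attrs.get("href"))

page = '''<html><body>
<img src="photo.jpg">
<a rel="license" href="https://creativecommons.org/licenses/by/2.0/">CC BY</a>
</body></html>'''

finder = LicenseFinder()
finder.feed(page)
# finder.licenses now holds whatever the page *claims* about itself.
```

Note what the sketch makes obvious: the crawler can only read the claim. Nothing in the mechanism checks that the uploader had any right to make it – which is precisely where the scheme comes unstuck below.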

As Daryl Lang at professional photography website PDN writes, the search engine is now choked with copyright images that have been incorrectly labelled with Creative Commons licences. These include world-famous images by photographers including Bert Stern and Steve McCurry. As a result, the search feature is all but useless.

Since there’s no guarantee that the licence really allows you to use the photo as claimed, the publisher (amateur or professional) must still perform the due diligence they always had to. So it’s safer (and quicker) not to use it at all.

What’s gone wrong, as Lang explains, is the old engineering principle of GIGO, or Garbage In, Garbage Out:

“The system relies on Internet users to properly identify the status of the images they publish. Unfortunately, many don’t… Many Flickr users still don’t understand the concept of a Creative Commons licence, or don’t care.

“It’s time consuming to put a different label on every image [in their collection], and there are no checks in place [our emphasis] to hold users accountable for unauthorized copying or incorrect licensing labels.”

So Google won’t take responsibility for the accuracy of the licensing metadata, and Creative Commons, as a small private internet quango, says it can’t afford to. (The disclaimer on the website is simple: go find yourself a lawyer.)

Just as we predicted, in fact: the filtering is less than perfect, and it’s a lip-service to creators. Now, why did it have to fail?

(more…)

"A country bumpkin approach to slinging generalizations around"

Thursday, June 25th, 2009

Anderson plagiarism

WiReD magazine Editor-in-Chief Chris Anderson has copped to lifting chunks of material for his second book, Free, from Wikipedia and other sources without credit. But it could be about to get a lot worse.

In addition to the Wikipedia cut’n’pastes, Anderson appears to have lifted passages from several other texts too. And in a quite surreal twist, we discover that the Long Tail author had left a hard drive backup wide open and unsecured for Google to index, then accused one of his accusers of “hacking”.

Does the WiReD editor and New Economy guru need basic lessons in how to use a computer?

Waldo Jaquith of Virginia Quarterly Review unearthed a dozen suspect passages after what he called “a cursory investigation”, and posted his findings here on Tuesday. Wikipedia entries for ‘There Ain’t No Such Thing as a Free Lunch’, ‘Learning Curve’ and ‘Usury’ had been pasted into Anderson’s book.

In addition to Wikipedia citations, which Anderson reproduced with the errors intact (oops), Jaquith suggests he also lifted from an essay and a recent book. Presented with the evidence, Anderson blamed haste – and, curiously, his inability to decide on a presentation format for citations – for his decision to omit the citations altogether. Other examples were “writethroughs”, he said.

Then lit blogger Edward Champion documented several more examples which he says show

“a troubling habit of mentioning a book or an author and using this as an excuse to reproduce the content with very few changes — in some cases, nearly verbatim.”

Champion’s examples of churnalism include blog posts, corporate websites and (again) Wikipedia.
(more…)

Newspapers: David Simon vs Google

Thursday, May 7th, 2009

Google, the nemesis of newspapers, was in Congress yesterday to turn a blonde deaf ear to their troubles. The company’s pin-up VP of products Marissa Mayer described quite a bright future to the Senate’s commerce committee – but it’s a bright future for Google, and for people with a lot of time to fiddle with their computers. Also testifying was The Wire creator David Simon.

Let’s contrast how each of them addressed the crisis.

Mayer said Google’s policy “first and foremost” was to respect the wishes of content producers, but offered nothing in the way of new business partnerships. Instead, she gave them a short but haughty lecture on how they should present their stories – they should become more like Wikipedia:

“Consider instead how the authoritativeness of news articles might grow if an evolving story were published under a permanent, single URL as a living, changing, updating entity,” she said in her statement. “We see this practice today in Wikipedia’s entries and in the topic pages at NYTimes.com. The result is a single authoritative page with a consistent reference point that gains clout and a following of users over time.”

So instead of publishing 50 stories a day, the implication is that publications should only publish 50 a year – tweaking those 50 constantly, in the hope they wriggle up through the Google search results. Yes, that’ll fix things.

She also said they should offer more scope for mash-ups. At both ends of the news chain, then, you have people fiddling – instead of writing (at one end) and reading (at the other). That’s very Web 2.0, and you couldn’t get a clearer statement that Google doesn’t really understand what news is for. (It’s merely the stuff that goes between the BODY tags, silly.)

The creator of The Wire and former reporter David Simon said he found the phrase “citizen journalism” Orwellian. He added:

“A neighbor who is a good listener and cares about people is a good neighbor – he is not in any sense a citizen social worker. Just as a neighbor with a garden hose and good intentions is not a citizen firefighter. To say so is a heedless insult to social workers and firefighters.”

(more…)

Charlie Nesson's trip

Thursday, April 30th, 2009

L.S.and D.

Has Charlie Nesson been at the magic mushrooms again? The hippy head of the Berkman Center, the influential New Age techno-utopian think tank that’s attached to Harvard Law School, wants to enlist Radiohead in his fight against the Recording Industry Association of America (RIAA).

Nesson, a long-time opponent of creators’ digital rights, is contesting the statutory damages in infringement cases. A Boston graduate student called Joel Tenenbaum was ordered to reach a settlement with the record companies after being sued for copyright infringement, having shared files using the Kazaa P2P network back in 2003. Nesson’s strategy in Sony BMG Music vs Tenenbaum is to put the music business on trial. That’s fine – suing freetards isn’t going to stop P2P file sharing and it isn’t going to save the music business. It only adds to the anoraks’ persecution complex. Even the RIAA has now concluded it’s the wrong strategy.

But is Nesson the man to fight The Man? Nesson’s novel argument is that unlicensed P2P file sharing is “fair use”. Even his Harvard students, who are doing the work for him, think that’s a stretch. And maybe he doesn’t want to win, just preen about in front of a camera. He wants it televised, he told Arse Technica, because:

“It’s like a reality show that we can all be participants in as we go along… It’s an incredibly powerful expansion of the idea of teaching.”
(more…)