Submitted by Joseph Peart, June 18, 2012

Governments worry that the same World Wide Web that enables global social communication is also being used for spying and cyber-attacks.

Although earlier experimental cyber-attacks have been attributed to Russia, attention has lately shifted to China.  Australia and the United States don’t want anything to do with the Chinese telco, Huawei, which has already supplied technology to New Zealand and is ready to provide more.

You could say that other countries have more to lose, or you could say New Zealand is acting more like a grown-up.  It raises some questions:

1. Will excluding a company from doing business prevent malicious use of the internet by its home country? 

2. Why should we worry about business interactions with the legitimate government of the world’s most populous nation, even if we don’t agree with its politics?

3. Is this any worse than signing away our democratic rights to the world’s fourth most populous empire, ruled by the benign dictator, Mark Zuckerberg?

If Facebook’s users amount to the world’s fourth largest population, consider also that the company has been made even more powerful by a massive injection of equity.  Former chief executive of General Electric, Jack Welch, warns that this wealth could make senior managers complacent, like the rulers of Rome before its decline.

But it’s not so much Facebook’s financial rise or fall that should cause concern as the rise of its algorithms.  Why worry?

Most of us who use Amazon are aware that its algorithms are designed to recommend items likely to be of interest to each of us personally.  Apparently, around 60 per cent of movies hired through Netflix are the result of recommendations made by the site’s algorithms.  That seems harmless enough.
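The recommendation engines behind these sites are proprietary, but the basic idea can be illustrated with a toy sketch: score items bought by people whose purchase histories overlap with yours, and suggest the ones you don’t yet own. The names and data here are entirely hypothetical, and real systems are far more sophisticated.

```python
from collections import Counter

# Hypothetical purchase histories.
purchases = {
    "ann":   {"book_a", "book_b", "kettle"},
    "ben":   {"book_a", "book_b"},
    "carla": {"book_b", "kettle"},
}

def recommend(user, purchases):
    """Recommend items bought by users with overlapping purchase
    histories, excluding items the user already owns."""
    mine = purchases[user]
    scores = Counter()
    for other, items in purchases.items():
        if other == user:
            continue
        overlap = len(mine & items)       # shared purchases weight the vote
        if overlap:
            for item in items - mine:     # only suggest things not yet owned
                scores[item] += overlap
    return [item for item, _ in scores.most_common()]

print(recommend("ben", purchases))  # → ['kettle']
```

Both of Ben’s books overlap with Ann’s history, so her kettle tops his recommendations: nothing sinister, just counting.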

Similarly, in New Scientist (14 April, 2012) Helen Knight notes that more than a third of American shoppers in physical stores use the Internet via their smartphones to help them make a purchase decision.  That didn’t seem scary until, two weeks later, New Scientist informed me that my attitudes, beliefs and personality type (as well as a very good guess about where I live) can all be worked out by online algorithms.  In that issue, Jim Giles assured New Scientist readers that, even if all of your privacy settings are at their maximum, purchases and searches, cellphone records and email traffic provide plenty of data to match us with friends, ‘like-minds’ and products that we could find attractive, if not irresistible.

Lars Backstrom, a researcher at Facebook, showed in 2010 that he could locate two-thirds of the site’s users within 40 kilometres, by identifying where their friends live. (Proc. 19th Int. Conf. on World Wide Web, p.61).  A similar algorithm for guessing sexual orientation also boasted 80 per cent accuracy (First Monday, vol. 14, No. 10).
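Backstrom’s actual model is more elaborate, but the intuition behind a friend-based location guess can be sketched simply: most of your friends live near you, so a point central to their homes is a decent guess at yours. The coordinates below are invented for illustration, and the 40-kilometre check mirrors the figure quoted above.

```python
import math

def haversine_km(a, b):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

# Hypothetical friends' home coordinates: three around Auckland,
# one outlier in Wellington.
friend_homes = [
    (-36.85, 174.76),
    (-36.87, 174.77),
    (-36.84, 174.74),
    (-41.29, 174.78),
]

def guess_location(friend_homes):
    """Guess a user's home as the friend location with the smallest
    total distance to all the others (a crude medoid)."""
    return min(friend_homes,
               key=lambda p: sum(haversine_km(p, q) for q in friend_homes))

guess = guess_location(friend_homes)
actual = (-36.86, 174.75)            # pretend this is the user's real home
print(haversine_km(guess, actual))   # a few km: well inside 40 km
```

The single Wellington friend barely moves the answer; the Auckland cluster dominates, which is exactly why sparse, voluntarily shared data about *other people* can pinpoint you.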

It’s all about a balance between how much we voluntarily share with our social media providers and product suppliers (to help find the things and friends we want) and how much their algorithms deduce without our permission.

Seth Godin, the Internet entrepreneur and author, is not worried.  He says the power that consumers wield now is all about a rapidly expanding period of consumer choice.  "If you can make it clear to consumers that you have a better offer," says Godin, who has long studied how the Web affects marketing, "it's infinitely easier to acquire a million consumers than ever before."

At the same time, our consumption of ideas and culture, as much as economic products and services, moulds our thinking and exposes us to algorithmic selection or ranking.

Remember George Gerbner’s “mean world syndrome” which he identified in the 1970s to describe how violent television consumption can lead us to think the world is nastier than it actually is?

Well, Dean Eckles of Stanford University suggests that the filtering of content, such as giving priority to “likes” on Facebook, could fool us into thinking that everyone else is having more fun than we are.

It could be that Facebook, newsfeeds and algorithms select our Internet content with so much bias that we suffer from what Eckles calls “friendly world syndrome”.  To properly understand the implications of that state of mind, I recommend you read Aldous Huxley’s Brave New World, paying particular attention to his references to “Soma”.

Joseph Peart
