The PROJECT [r]evolution Conference will take place at AUT University on August 30 and 31 this year. 

The conference will examine how digital social media bring change (revolution) through massive upheaval (the Arab Spring, Occupy, etc.) and development (evolution) through innovation and survival of the fittest (Facebook versus Google, etc.).

World-renowned speakers include: Alec Ross (President Obama's social media election strategist, now working for Hillary Clinton), Emily Banks (associate editor at Mashable) and Google’s California-based global ambassador, Michael T. Jones.

Business, government and academic speakers include Prof Jim Macnamara (University of Sydney, former company founder/director), Prof Graham Murdock (Loughborough University, renowned media analyst), Christopher Barger, from Voce (formerly with GM and IBM) and Dan Neely (Wellington Emergency Management Office).

Conference partners AUT University, the US Embassy, Wellington, and Social Media NZ have launched the website where delegates can register for a limited number of places.

The PROJECT [r]evolution is “a collision of thought on social media and digital communication”. This Centre within the University has been a catalyst for the increased use of social media in our classrooms, both as a learning tool and as a topic for research and analysis. It makes sense to bring this critical view into a conference setting, bringing together the very different views of innovators, marketers, media analysts and those with a political perspective.

The event is underwritten and sponsored by AUT University with sponsoring partner, the US Embassy in NZ, and includes a third partner/advisor, John Lai of Social Media NZ.

Some of the other speakers are:
“The Future of the Social Web” – Thomas Scovell, Clemenger
“Revolution in Government” – Anthony Deos
“Copyright protection in our connected world” – Rick Shera, Lowndes Jordan  (What SOPA/PIPA mean)
“When disaster strikes: Digital in times of darkness” – Dan Neely
“Digital dilemmas: why we need an ethical [r]evolution” – Assoc Prof Martin Hirst 
“The Emotion – how digital shifts minds for market brands” – Julian Smith, BRR Ltd 
“The rise of mobile” – Paul Brislen, TUANZ
“The Revolution of Data - The Science of Data in Digital” – Hayden Raw, Common Room
“The tomorrow is now and it belongs to interactivity” – Richard McManus, and several more.

Topics and speaking panels are still evolving. Visit the website for the up-to-date programme and to register for this event.

Submitted by Joseph Peart, June 18, 2012

Governments worry that the same World Wide Web that allows global social communication is also being used for spying and cyber-attacks.

Although earlier experimental cyber-attacks have been attributed to Russia, attention has lately shifted to China.  Australia and the United States don’t want anything to do with the Chinese telco, Huawei, which has already supplied technology to New Zealand and is ready to provide more.

You could say that other countries have more to lose, or you could say New Zealand is acting more like a grown-up. It raises some questions:

1. Will excluding a company from doing business prevent malicious use of the internet by its home country? 

2. Why should we worry about business interactions with the legitimate government of the world’s most populous nation, even if we don’t agree with its politics?

3. Is this any worse than signing away our democratic rights to the world’s fourth most populous empire, ruled by the benign dictator, Mark Zuckerberg?

Facebook already commands the world’s fourth largest “population”; add to that the fact that it is now even more powerful, bloated by a massive injection of equity. Former chief executive of General Electric, Jack Welch, warns that this wealth could make senior managers complacent, like the rulers of Rome before its decline.

But it’s not so much the rise or fall of Facebook financially that should cause concern as the rise of its algorithms. Why worry?

Most of us who use Amazon are aware that its algorithms are designed to recommend items likely to interest each of us personally. Also, apparently, around 60 percent of movies hired through Netflix are a result of recommendations via the site’s algorithms. That seems harmless enough.
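Amazon’s and Netflix’s real systems are far more elaborate, but the core idea behind such recommendations — suggest things bought by people whose purchases overlap with yours — can be sketched in a few lines (all names and purchase data below are invented for illustration):

```python
from collections import Counter

# Toy purchase histories, invented for illustration.
purchases = {
    "alice": {"book", "kettle", "headphones"},
    "bob":   {"book", "headphones", "mouse"},
    "carol": {"kettle", "toaster"},
    "dave":  {"book", "mouse"},
}

def recommend(user, purchases, top_n=3):
    """Recommend items bought by users whose purchases overlap with `user`'s."""
    mine = purchases[user]
    scores = Counter()
    for other, items in purchases.items():
        if other == user:
            continue
        overlap = len(mine & items)   # shared purchases measure similarity
        for item in items - mine:     # score only items the user doesn't own
            scores[item] += overlap
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("alice", purchases))  # → ['mouse', 'touch'] is wrong; see note
```

With these data, alice is most similar to bob (two shared items), so bob’s mouse outranks carol’s toaster: the call returns `['mouse', 'toaster']`. Real recommenders add weighting, normalisation and far more signals, but the principle is the same.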

Similarly, in New Scientist (14 April, 2012) Helen Knight notes that more than a third of American shoppers in physical stores use the Internet via their smartphones to help them make a purchase decision. That didn’t seem scary until, two weeks later, New Scientist informed me that my attitudes, beliefs and personality type (as well as a very good guess about where I live) can all be worked out by online algorithms. In that issue, Jim Giles assured New Scientist readers that, even if all of our privacy settings are at their maximum, our purchases and searches, cellphone records and email traffic provide plenty of data to match us with friends, ‘like-minds’ and products that we could find attractive, if not irresistible.

Lars Backstrom, a researcher at Facebook, showed in 2010 that he could locate two-thirds of the site’s users within 40 kilometres, by identifying where their friends live. (Proc. 19th Int. Conf. on World Wide Web, p.61).  A similar algorithm for guessing sexual orientation also boasted 80 per cent accuracy (First Monday, vol. 14, No. 10).
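Backstrom’s published model is statistical, weighting each friendship by a distance-dependent probability; a much cruder version of the same underlying idea — guess where someone lives from where their friends live — can be sketched with a simple componentwise median (all coordinates below are invented for illustration):

```python
import statistics

def guess_location(friend_coords):
    """Naive location guess: the componentwise median of friends'
    (latitude, longitude) pairs. The median resists being dragged
    far off by a few distant friends."""
    lats = [lat for lat, lon in friend_coords]
    lons = [lon for lat, lon in friend_coords]
    return statistics.median(lats), statistics.median(lons)

# Invented example: four friends around Auckland, one in Sydney.
friends = [(-36.85, 174.76), (-36.90, 174.80), (-36.80, 174.70),
           (-36.87, 174.78), (-33.87, 151.21)]

print(guess_location(friends))  # → (-36.85, 174.76), i.e. Auckland
```

The lone Sydney friend barely moves the estimate, which is why even crude friend-based inference can place most users surprisingly close to home.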

It’s all about a balance between how much we voluntarily share with our social media providers and product suppliers (to help find the things and friends we want) and how much their algorithms deduce without our permission.

Seth Godin, the Internet entrepreneur and author, is not worried. He says the power that consumers wield now is all about a rapidly expanding period of consumer choice. "If you can make it clear to consumers that you have a better offer," says Godin, who has long studied how the Web affects marketing, "it's infinitely easier to acquire a million consumers than ever before."

At the same time, our consumption of ideas and culture, as much as economic products and services, moulds our thinking and exposes us to algorithmic selection or ranking.

Remember George Gerbner’s “mean world syndrome” which he identified in the 1970s to describe how violent television consumption can lead us to think the world is nastier than it actually is?

Well, Dean Eckles of Stanford University suggests that the filtering of content, such as giving “likes” on Facebook priority, could fool us into thinking that everyone else is having more fun than we are.

It could be that Facebook, newsfeeds and algorithms select our Internet content with so much bias that we suffer from what Eckles calls “friendly world syndrome”.   To properly understand the implications of that state of mind, I recommend you read Aldous Huxley’s Brave New World, paying particular attention to his references to “Soma”.

Joseph Peart