WHEN BROADCASTING was first introduced around the world – radio from the 1920s and television from the late 1940s – governments everywhere decided that such a powerful method of communication needed to be regulated to ensure that the information disseminated did not cause harm to society.
Here in the UK, Ofcom exists to protect the public from harmful or offensive material. It aims to ensure that reporting is honest, fair and balanced. And all around the world similar regulators try to ensure that reporting remains responsible.
However, such supervision of content dissemination has never applied to online social media platforms such as Facebook and YouTube, and this has become very significant: Facebook now has 2.2 billion users globally and YouTube 1.9 billion – many times more than any broadcaster anywhere.
In many social groups and geographies these platforms have become the major source of information for a large proportion of their populations.
In Myanmar, for example, Facebook has over 18 million users, and for many of them it is the main or only way of getting and sharing news. It has now become clear that Facebook was the method used by the Myanmar military, hiding behind fake identities, to spread hate against the Rohingya Muslims. As a result, the military was able to conduct its violent campaigns in which thousands died, and more than 700,000 Rohingya fled to Bangladesh amid reports of arbitrary killing, rape and burning of villages. The United Nations explicitly accused Facebook of being ‘slow and ineffective’ in its response to the spread of online hatred – at the time Facebook employed only two Burmese speakers to monitor content.
When Mark Zuckerberg coined his guiding motto for Facebook, ‘move fast and break things’, he probably wasn’t thinking about causing the genocide of minority communities in far-off lands, or indeed the harvesting of massive quantities of personal information by the likes of Cambridge Analytica.
Facebook has also been accused of allowing undercover Russian agents to spread ‘false news’ and propaganda, interfering in elections across Europe and the USA. A recent report in the New York Times described how Facebook’s then head of security, Alex Stamos, was prevented by the leadership team of Zuckerberg and Sheryl Sandberg from raising his concerns about Russian interference. Stamos left the company in March last year.
All this has even led to severe criticisms from other technology leaders, particularly Tim Cook of Apple, who has called for tougher regulation of the way that personal data is used by leading digital companies. Typically, Facebook has retaliated in a petty manner – Zuckerberg has banned Facebook executives from using iPhones and reportedly hired a PR firm to smear Cook.
Zuckerberg, who has demonstrated repeatedly that he doesn’t really ‘get’ these issues, is supremely powerful at Facebook, controlling 60 per cent of the voting shares, which is why the company has now hired former UK Deputy Prime Minister Nick Clegg to try to clean up its image.
Facebook has proposed to set up a ‘supreme court’ to supervise the type of content that can be posted on its platform. With 2.2 billion users, many posting in their own language, this will be a massive task.
Of course, Clegg is famous for his solemn pledge to scrap university tuition fees in 2010, quickly reversed when he became deputy PM. You might wonder: if he couldn’t stand up to David Cameron, what chance does he have with Zuckerberg?
UK students, among others, who voted Lib Dem in 2010 might wonder whether this is the right man to restore ‘trust’ in Facebook.