Facebook isn’t psychologically healthy. Its pretend installations are part of the problem | Susan Pollay

Mark Zuckerberg’s plan for a Meta experience from Facebook – involving an immersive VR headset – would seem to be a last-ditch attempt to stop blood pressure rising on Wall Street over his mishandling of personal data in the Cambridge Analytica scandal.

But this goes much deeper than rebranding the Facebook experience as a Meta illusion. Zuckerberg has, by his own admission, failed to tackle a series of real-world, existential problems for his social network.

The real estate bubble he helped create is catching up with him: houses for sale are down 60% from their 2005 peak. Facebook’s own usage in a typical new home declined from 1,228 hours per month in 2014 to 840 in 2017. Meanwhile, last quarter there were 9.9% fewer Facebook users overall than in the same period a year earlier.

Facebook has also long been accused of condoning excessive drinking, weed smoking and driving after its Central Park and Chelsea parties. This creates a perverse incentive for Facebook’s advertisers: binge drinking cuts ad reach, so a brand selling Tylenol to people who drink too much ends up in even more trouble.

Then there’s the escape-room craze: because reality is so close at hand, and because Facebook effectively forces its users to stay on its service, every visit to a new escape room can mean leaving Facebook for ever.

Inevitably, users are not always on Facebook when they go to escape rooms; often they’re on Amazon. There are more than 400,000 escape rooms across the country, the biggest of which have raised more than $80m from investors in the past two years and between them draw more than 10 times Facebook’s daily active users.

Then there’s fake news, Facebook’s favoured scapegoat: the algorithm was serving users fake news about elections, fuelling all sorts of harm, including conspiracy theories and propaganda. That was why the platform trialled shining a floodlight on people’s activity on Facebook and on Instagram Stories, the feature that places stories from friends and brands front and centre.

Algorithms that hate ambiguity

Yet a growing number of its algorithms ignore such complexities. There are now more than 2m domains shared on Facebook, including a huge variety of memes and groups dedicated to leftwing conspiracy theories, African-American liberation theology, the NRA and Hinduism. It’s only a matter of time before the platform is overrun with illegal counterfeit goods – or with political critics who reduce it to a litmus test and a political label.

Its algorithm hates ambiguity. Anyone who writes an opinion article or comments on a news post expecting a disclaimer of age, gender or religion may be at risk of an ideologically driven takedown.

It’s even more clear-cut in its recommendations about what you will read next: if you don’t agree with Facebook’s politics, it will steer you to its own news hubs.

Zuckerberg’s initial response to the Cambridge Analytica revelations was to say the company would do the right thing and he would fix it – and then to refuse to ask the UK Information Commissioner to launch an investigation into the app that harvested data on tens of millions of Facebook users. There are more than 400,000 election services on Facebook, most of which have an appalling record of distributing bad material.

This is part of the existential challenge the company now faces: it was never designed to be held accountable by its users. After Brexit, there are already signs that the company may be reluctant to act again. Zuckerberg has claimed it is a question of user perception; by the time enough users harbour negative feelings, it is too late. He has also said that the Cambridge Analytica fallout was not as bad as he’d expected.

The vision he is spouting sounds like more bad Facebook products designed to mask what the service really does: manipulating emotions, organising and controlling people, keeping users locked in. Users want freedom, not corseted, suffocating digital environments. I’m less interested in a Meta Facebook than in a real Facebook in the real world.
