Hunt for ratings: Lacie from an episode of Netflix's new series Black Mirror

When I book a room on Airbnb, I find myself adopting a weird persona. Airbnb Judith uses a lot of exclamation marks. She is jolly, friendly and very quick to respond to messages. She is so easy-going that she ignores building sites and weird smells to be kind about her hosts. Airbnb Judith is a dream!!!

I feel slightly ridiculous doing this, but the thing is, I can’t help but care about my rating. On Airbnb, hosts and guests give feedback on each other, and if I give a host a bad rating, I fear they will give me the same treatment. It’s the same with Uber: I constantly give cash tips to drivers to improve the chances of them giving me five stars, even if I think they only deserve two.

This is partly a pragmatic thing, to make sure I can always get Ubers and Airbnbs fast. But there’s also part of me that can’t help feeling these reciprocal ratings reflect something about my actual value as a person — the same part that is still smarting from a low mark for “dress” from my work experience employers, aged 15.

I thought of this recently when watching Nosedive, the first episode of Charlie Brooker’s new series of Black Mirror, the dystopian sci-fi show newly moved to Netflix. Spoiler alert: what I’m about to say won’t ruin the ending of the episode, but it will contain a few giveaways about the middle.

In Nosedive, people rate each other, as they do on Airbnb, Uber, Lyft and so on. But in Charlie Brooker’s alternative universe, they do so constantly. Every human interaction, whether with a friend, colleague or barista, triggers a rating, which in turn sends your publicly visible average up or down.

At the start of the show, the protagonist, Lacie, appears insane in her obsession with ratings. With the demeanour of a deranged cheerleader, she adopts flawless pastel dress and a hysterically upbeat manner; her day-to-day life is like Instagram in the real world.

But viewers soon realise she has sensible reasons for focusing much of her life on achieving a higher rating. Not only do ratings determine your access to shared cars and holiday homes, they are also a gateway — or a bar — to jobs, homes, bank accounts, flights and even healthcare.

Lacie’s weird world is not so far from the truth. In China, a planned “social credit” system aims to use big data and algorithms to assess the “honesty” and “trustworthiness” of citizens by 2020. According to a report in the Washington Post, this system envisages a single score for each person, which will be affected by factors as diverse as missed loan payments, failing to care for one’s parents or criticising the government. Such black marks might result in bans from upmarket hotels or the first-class sections of planes, or from travelling abroad — exactly the kinds of restrictions Lacie faces in Black Mirror when her rating plummets.

This system faces political barriers even in authoritarian China. But life in democratic countries is creeping in a similar direction.

Online reviews can make or break a business — the Competition and Markets Authority found that £23bn a year of consumer spending is influenced by them — yet the systems in place to ensure they are genuine are often weak or non-existent. Credit analytics companies are testing the use of social media profiles to determine whether banks should offer loans. Meanwhile, social media vilification, as Jon Ronson documents in his book So You’ve Been Publicly Shamed, can make a person unemployable, even when their original offence was relatively minor.

True, the old-school ways of determining creditworthiness, or whether a person deserves other services, can also be unaccountable. A phantom £20,000 debt invented by a credit-rating agency almost undermined my mortgage application; consumers have been driven to take cases to the Supreme Court and change their names in the effort to clear their credit files. But is a jury of our peers any better as a judge of how trustworthy we are? Social media witch-hunts like that of Lindsey Stone, who lost her job and suffered a year of deep depression because of online vilification over a goofy Facebook photo, suggest it may not be.

In the fictional world of Black Mirror, high ratings meant a better standard of living

Mercifully, there are barriers to the uncontrolled spread of peer ratings as a way to determine a person’s worth. One is growing public consciousness of digital privacy. Facebook in 2015 restricted external companies’ use of its data because of users’ unease over how their information was being deployed; moves like this suggest that other apps would not have an easy ride if they wanted to share ratings information, for example. We heard this week that the UK insurer Admiral was testing a system that analyses people’s Facebook posts to determine the level of their car insurance premiums — only for the scheme to be scuppered by privacy concerns on the part of the social network.

Another limiting factor is that peer-to-peer rating systems are subject to patterns that make them less useful than they might seem. According to an academic study released in 2015, entitled “A First Look at Online Reputation on Airbnb, Where Every Stay is Above Average”, almost 95 per cent of Airbnb properties have an average rating of either 4.5 or the maximum 5 stars, while almost none come in below 3.5 stars. The same was not true on TripAdvisor, where guests rate hosts but not vice versa, and the range of ratings is much broader.

The researchers pointed out that the high average ratings might be partly due to bad hosts improving their properties or quitting the platform. But they also noted the likelihood that “sociological factors” are at play, including the fear of revenge ratings and “herding behaviour” around existing scores. In other words, it is thanks to ratings obsessives like me that such systems may not be such a data gold mine after all.

Judith Evans is the FT’s property correspondent. Email: judith.evans@ft.com. Twitter: @JudithREvans
