
Earning Social Credit 

It's crazy to think Uber ratings have become such a hot topic. Comparing them is now standard cocktail conversation. It's even crazier that something as complicated as reputation can be reduced to a single score.

Today's newsletter shares two perspectives on the topic of social credit systems. In the first take below, my colleague Lydia Lee, President of Weber Shandwick China, explains why so many in China are in favor of the system, and why it's difficult for many of us in the West to see the benefits. The second is my own POV. This read is a little longer than most, but I hope you'll stick with it.

Social Credit in China

First piloted in 2014, China’s social credit system utilizes surveillance technology to assign numerical scores to individuals’ behavior. It encourages activities like paying taxes and bills on time, donating blood and volunteering, while punishing behaviors such as jaywalking, sorting waste incorrectly or failing to show up to a reservation. 

Many articles compare it to Black Mirror, 1984, Big Brother and other sci-fi fueled dystopian futures. But the all-important why behind the system is often unexplored. Despite misconceptions, the system is solving a problem that, if executed correctly, will bring the country needed stability, fairness and justice. These fundamental benefits explain why 80% of Chinese citizens either somewhat or strongly approve of the system:

1. Stability: With a population of 1.4 billion, China is massive, and if managed poorly, disruption there can have catastrophic effects far beyond its borders. While Western media often frame the social credit system as a control mechanism, Chinese citizens see it as a source of stability. If China is stable, that means 20% of the earth's population is stable. And this in itself is an achievement. 

2. Fairness: For most of its history, China was an agrarian society. Trade came later, as did the infrastructure to measure and track performance and reputation. So, it's only sensible that the government should leverage available technology to manage an objective system that benefits individual farmers, global businesses and everyone in between. 

3. Justice: Even in today’s society, the rich and powerful often have special means to avoid punishment. Now with impartial technology, everyone is given the same score for the same mistake, meaning everyone is equal in the eyes of the system.

The US and other Western countries also have social credit systems in place, though the Chinese government's role in creating rules, assigning scores and managing back-end technology causes outsiders to view China's system in a negative light. This is due, in part, to differences in how the West and East view the role of government. In China's eyes, the government is the torch bearer of Chinese civilization, the defender of culture and values that span thousands of years, and the embodiment of fairness. Given this special role, it is only natural that Chinese citizens expect the government to act upon and solve problems on behalf of society.

New use cases for emerging tech always come with controversy. But the social credit system is seen by most in China as another example of a solution, much like the Great Wall or the Grand Canal, to advance Chinese civilization. And with the current COVID-19 epidemic, it's worth paying attention to how this system will further the stability, fairness and justice of Chinese society.

Lydia Lee

Social Credit in the US

With headlines like "Why China's Social Credit System Is Worse Than You Think" and "China's Nightmarish Citizen Scores Are a Warning For Americans," we're quick to label China's system as "tech gone wrong." The program does have some genuinely unsettling elements, including harsh punishments for switching jobs too often, frequently playing video games or speaking negatively about the government.

Some experts believe we've got it wrong, or at least that we're oversimplifying. Perhaps the biggest challenge to our judgment is, as Lydia mentioned, how many similar systems are in place in the West. You're likely being rated on your behaviors without even realizing it. Sites like MyLife have caused a stir by assigning people overall "reputation" scores based on public records. Other examples of our own version of social behavior monitoring can be found in:

1. Finance and Insurance: We've long been given scores to reward and punish our financial behavior. But some states have taken it a step further. Last year, regulators in New York introduced guidelines allowing life insurance providers to base premium costs on what they find in customers' social media posts. (Thinking about posting that vacation bungee jump photo? Think again.)

2. Travel and Entertainment: Review culture has us constantly rating establishments on TripAdvisor or Yelp, and increasingly, we're scored on apps like Airbnb and OpenTable. Emerging tools like PatronScan can even identify a person in bars across multiple countries and ban them from entry based on past behavior. 

3. Communication: We like to think freedom of speech is inherently protected, but that isn't necessarily the case when private companies have the final say. For example, you can be banned from WhatsApp if too many other users block you, or prevented from speaking in online groups if a community moderator decides your behavior doesn't align with guidelines. 

The fundamental difference between these systems and China's, of course, is the enforcing body behind the rules. As opposed to the government, our social monitoring responsibilities are fragmented across various corporate overseers. But don't be too comforted by this distinction. As others have pointed out, our version of social "laws" has no legal backing: none of the rights we're promised, including presumption of innocence, legal representation or appeal. In other words, if the system in question says you're out, you're out. 

The effects of these practices have long been a source of trouble for many. And emerging avenues for behavior monitoring suggest things will only get more complicated. While some of the impact of these systems is undoubtedly valuable, the growing appetite for scoring behavior leaves much to question about the long-term consequences. 

Chris Perry


As always, if you find this newsletter valuable we would be grateful if you encouraged others to sign up by directing them here.
What We're Reading
Synthetic Content
Can Synthetic Media Drive New Content Experiences?
By Ahmed Razek, BBC

'Deepfakes' have rightfully grabbed negative media attention, but is there a creative and editorially solid opportunity to exploit the underlying technology? BBC Blue Room's Ahmed Razek has been experimenting with this controversial technology.

Media Forensics
Teens Have Figured Out How to Mess with Instagram's Tracking Algorithm
By Alfred Ng, CNET

These teenagers are relying on a sophisticated network of trusted Instagram users to post content from multiple different devices, from multiple different locations. If you wanted to confuse Instagram, here's how.

Platform Dominance
Are Mental Health Apps Totally Useless?

By Tom Chivers, UnHerd

If you go to the App Store on your iPhone and type “depression” into the search bar, you’ll find dozens of apps purporting to screen for or help alleviate depression. One study claims there are 10,000 such apps, another some 350 or so. The trouble is that there doesn’t seem to be a great deal of evidence that they actually work.

Content Experience
The Underground Economy of Influencer Fitness Plans
By Hussein Kesvani, Mel Magazine

Despite fitness plans’ being easy to produce, they continue to be popular among fitness influencers looking to make their mark. For them, it’s not about the correct combination of techniques, or getting the macros in a meal plan exactly right — it’s about selling hope to those without it, and structure to those who need it so much that they’d happily pay for it.

Media Forensics
Media Executives Pressure Advertisers on Quality News
By Sara Fischer, Axios

News content can be tricky for advertisers to navigate, but media executives argue that it's possible — and beneficial — for advertisers to figure out how to engage quality news with their dollars.

Media Forensics
Michael Bloomberg’s Campaign Suddenly Drops Memes Everywhere
By Taylor Lorenz, The New York Times

A campaign of sponsored content for the candidate flourished suddenly on Instagram. A new outfit, called Meme 2020, is behind it.

Platform Dominance
YouTube Continues to Mirror Twitch, Tests New Clap Feature That Lets Fans Donate to Creators
By Julia Alexander, The Verge

YouTube is trying to figure out how to help creators earn more revenue through alternative monetization methods, and its newest attempt reiterates that YouTube is still following in Twitch’s footsteps.

Emerging Technology
Tool to Help Journalists Spot Doctored Images Is Unveiled by Jigsaw
By Davey Alba, The New York Times

Jigsaw, a company that develops cutting-edge tech and is owned by Google's parent, unveiled a free tool that researchers said could help journalists spot doctored photographs — even ones created with the help of artificial intelligence.

Deep Take
Tech in 2020: Standing on the Shoulders of Giants
By Benedict Evans

'Standing on the shoulders of giants' looks at what it means that 4bn people have a smartphone; we connected everyone, and now we wonder what the Next Big Thing is, but meanwhile, connecting everyone means we connected all the problems. Tech is becoming a regulated industry, but we don't really know what that will mean.
Copyright © 2020 Weber Shandwick
