Uh-oh: Silicon Valley is building a Chinese-style social credit system

Source: Fast Company | 08-26-19 | Mike Elgan

Have you heard about China’s social credit system? It’s a technology-enabled, surveillance-based nationwide program designed to nudge citizens toward better behavior. The ultimate goal is to “allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step,” according to the Chinese government.

In place since 2014, the social credit system is a work in progress that could evolve by next year into a single, nationwide point system for all Chinese citizens, akin to a financial credit score. It aims to punish transgressions that can include membership in or support for Falun Gong or Tibetan Buddhism, failure to pay debts, excessive video gaming, criticizing the government, late payments, failing to sweep the sidewalk in front of your store or house, smoking or playing loud music on trains, jaywalking, and other actions deemed illegal or unacceptable by the Chinese government.

It can also award points for charitable donations or even taking one’s own parents to the doctor.

Punishments can be harsh, including bans on leaving the country, using public transportation, checking into hotels, being hired for high-visibility jobs, or enrolling children in private schools. A low score can also mean slower internet connections and social stigmatization in the form of registration on a public blacklist.

China’s social credit system has been characterized in one pithy tweet as “authoritarianism, gamified.”

It can happen here

Many Westerners are disturbed by what they read about China’s social credit system. But such systems, it turns out, are not unique to China. A parallel system is developing in the United States, in part as a result of Silicon Valley and technology-industry user policies, and in part through private companies’ surveillance of social media activity.

[The article discusses insurance companies, bar and restaurant ID-checking apps, Uber, Airbnb, and WhatsApp.]

What’s wrong with social credit, anyway?

Nobody likes antisocial, violent, rude, unhealthy, reckless, selfish, or deadbeat behavior. What’s wrong with using new technology to encourage everyone to behave?

The most disturbing attribute of a social credit system is not that it’s invasive, but that it’s extralegal. Crimes are punished outside the legal system, which means no presumption of innocence, no legal representation, no judge, no jury, and often no appeal. In other words, it’s an alternative legal system where the accused have fewer rights.

Social credit systems are an end-run around the pesky complications of the legal system. Unlike China’s system, which is official government policy, the social credit system emerging in the U.S. is enforced by private companies. If the public objects to how these rules are enforced, it can’t elect new rule-makers.

If current trends hold, it’s possible that in the future a majority of misdemeanors and even some felonies will be punished not by Washington, D.C., but by Silicon Valley. It’s a slippery slope away from democracy and toward corporatocracy.

In other words, in the future, law enforcement may be determined less by the Constitution and legal code, and more by end-user license agreements.

 

    Woodcutter #35137

    In China, scoring citizens’ behavior is official government policy. U.S. companies are increasingly doing something similar, outside the law.

    Woodcutter #35138

    Breitbart has recently written about this:

    More left-wing journalists are waking up to the fact that unaccountable Silicon Valley corporations are ranking Americans with a system that bears a growing resemblance to China’s totalitarian “social credit” system.

    https://www.breitbart.com/tech/2020/01/22/bokhari-leftists-are-learning-that-big-tech-already-has-a-social-credit-system/

    Woodcutter #35142

    Engadget recently had an article about Airbnb:

    The Evening Standard reported on Airbnb’s patent for AI that crawls and scrapes everything it can find on you, “including social media for traits such as ‘conscientiousness and openness’ against the usual credit and identity checks and what it describes as ‘secure third-party databases’.”

    They added, “Traits such as ‘neuroticism and involvement in crimes’ and ‘narcissism, Machiavellianism, or psychopathy’ are ‘perceived as untrustworthy’.”

    https://www.engadget.com/2020/01/17/your-online-activity-effectively-social-credit-score-airbnb/?guccounter=1

    Woodcutter #35143

    The UK Evening Standard article (“Booker beware: Airbnb can scan your online life to see if you’re a suitable guest”) says…

    Airbnb has developed technology that looks at guests’ online “personalities” when they book a break to calculate the risk of them trashing a host’s home.

    Details have emerged of its “trait analyser” software built to scour the web to assess users’ “trustworthiness and compatibility” as well as their “behavioural and personality traits” in a bid to forecast suitability to rent a property.

    It comes after complaints from London hosts that guests thought to be suitable have held rowdy parties at properties. They include a woman who said her £2.5 million flat was wrecked by hundreds of drug-fuelled ravers after renting it out for a “baby shower”.

    https://www.standard.co.uk/tech/airbnb-software-scan-online-life-suitable-guest-a4325551.html
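
    To make the reports above a bit more concrete, here is a minimal sketch (in Python) of what a trait-weighted screening score could look like in principle. Every detail in it (the trait names, the weights, the 0.5 threshold) is an assumption invented for illustration; none of it comes from Airbnb’s patent or its actual “trait analyser”.

    # Hypothetical illustration only: a toy trait-weighted risk score.
    # Trait names and weights are invented for this sketch and are not
    # taken from Airbnb's patent or software.
    from typing import Dict

    # Assumed weights: positive values push toward "risky",
    # negative values push toward "trustworthy".
    TRAIT_WEIGHTS: Dict[str, float] = {
        "conscientiousness": -0.4,
        "openness": -0.1,
        "neuroticism": 0.3,
        "narcissism": 0.5,
        "machiavellianism": 0.5,
        "psychopathy": 0.8,
        "involvement_in_crimes": 1.0,
    }

    def risk_score(traits: Dict[str, float]) -> float:
        """Combine trait estimates (each in [0, 1]) into one score;
        higher means the booking is treated as higher risk."""
        return sum(TRAIT_WEIGHTS[name] * value
                   for name, value in traits.items()
                   if name in TRAIT_WEIGHTS)

    def is_flagged(traits: Dict[str, float], threshold: float = 0.5) -> bool:
        """Flag a guest profile if the weighted score crosses the threshold."""
        return risk_score(traits) >= threshold

    guest = {"conscientiousness": 0.9, "neuroticism": 0.2, "narcissism": 0.1}
    print(risk_score(guest), is_flagged(guest))  # roughly -0.25, not flagged

    The point of the sketch is the structure, not the numbers: whoever picks the weights and the threshold decides, privately and without appeal, who gets flagged, which is exactly the concern the article above raises.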
