
Unscarcity Research

Social Credit: China's 60 Blacklists vs. America's FICO System

80.7 billion credit records, 200,000 blacklist additions in 2025, 28M flight bans. Both nations score citizens—China centralized, US fragmented. How algorithmic judgment works.


Note: This is a research note supplementing the book Unscarcity, now available for purchase. These notes expand on concepts from the main text. Start here or get the book.

Social Credit Systems: The Wrong Turn We Almost Took

Here’s a fun thought experiment: Imagine if your landlord, your employer, your bank, and your government all agreed to share their notes on you. Every late payment, every jaywalking ticket, every time you complained about the president, every “problematic” book you checked out—all collated into a single number that determines whether you can rent an apartment, get a job, or board a train.

Now imagine that system is invisible, uncontestable, and run by algorithms nobody fully understands.

Dystopian fiction? No. It’s what several governments and corporations have actually tried to build. And understanding why it’s terrifying—and how Unscarcity avoids falling into the same trap—is essential to building trust-based systems that don’t become Orwellian nightmares.


The Chinese Experiment: Less Dystopia, More Bureaucracy (Which Is Still Worrying)

Let’s start by demolishing some mythology. The Western media narrative about China’s social credit system has been… let’s say enthusiastic. Headlines screamed about a Black Mirror-style unified score tracking every citizen’s every move, automatically banning dissidents from flights while rewarding model communists with faster internet.

The reality, as of late 2025, is messier—and instructive.

China’s social credit system was never designed as a holistic citizen-ranking score. The original 2014-2020 planning outline didn’t even mention individual scores. What it described was closer to a coordinated database for tracking contract violations, regulatory compliance, and court-ordered obligations—essentially, a more aggressive version of what credit bureaus do in the West.

The pilot programs that did create individual scores have largely fizzled. Rongcheng, in Shandong Province—the poster child for dystopian social scoring—reformed its program in 2021. Participation is now strictly voluntary. The system can only issue rewards, not punishments. When German journalists visited in 2024, they found that nobody cared about collecting points anymore. The dystopia had become a bureaucratic afterthought.

But here’s the thing: the infrastructure never went away.

The March 2025 policy update—a 23-point directive from Communist Party leadership—standardized the system’s scope and made enforcement more rigorous. The National Credit Information Sharing Platform has aggregated over 80.7 billion credit records from 180 million business entities. Approximately 200,000 people were added to blacklists in 2025 alone, with 46% for contractual disputes. Over 60 nationwide blacklists now cover everything from tax evasion to food-safety violations to securities fraud.

As Lizzie Lee of the Asia Society Policy Institute put it: “The popular idea of a single, all-knowing big brother score that tracks every citizen’s behavior isn’t accurate. What China calls the social credit system is really a patchwork of administrative databases and sector-specific blacklists.”

But patchwork infrastructures have a way of becoming unified ones when the political moment arrives. The machinery exists. It’s just waiting for someone to flip the switch.


The Western Version: We Already Have It (We Just Don’t Call It That)

Here’s the uncomfortable truth: Americans clutching their pearls about China’s social credit system often ignore the fragmented version they already live under.

Consider what happens when you apply for an apartment in most U.S. cities:

  • Your FICO credit score determines whether you get approved
  • Tenant screening services check your rental history (and any evictions)
  • Background check companies search criminal databases
  • Some landlords check your social media
  • Income verification ensures you make 3x the rent

Each of these is a separate “score” run by a different company, using different data, with different (often opaque) algorithms. But the net effect? A composite judgment on your worthiness as a human being, with life-altering consequences.

And it gets worse. The tenant screening industry is now a billion-dollar business with little oversight. These algorithms enable what the Center for Democracy and Technology calls “racial and disability discrimination at scale”—flagging applicants with any criminal history (even arrests that never led to convictions), using credit scores that disproportionately harm people of color due to the economic legacy of redlining and lending discrimination.

In one landmark case, SafeRent—a major screening company—paid $2.3 million to settle a class action by Black and Latino tenants whose applications were automatically rejected based on subprime credit scores. The judge ruled the algorithm’s disparate impact was discriminatory, regardless of intent.

The difference from China isn’t the surveillance—it’s the fragmentation. America’s social credit system is run by hundreds of private companies instead of one state. This makes it harder to game, sure. But it also makes it nearly impossible to contest. Who do you sue when five different algorithms from five different companies, using data you didn’t know they had, all independently decide you’re a risk?

At least in China you know who to blame.


Why Behavioral Scoring Corrodes Everything It Touches

Let’s step back from the geopolitics and examine why comprehensive behavioral scoring is fundamentally toxic, regardless of who runs it.

1. It Rewards Performative Compliance, Not Genuine Virtue

When you know you’re being scored, you don’t become a better person—you become a better performer. You volunteer at the soup kitchen, but only for the photo op. You attend the party meeting, but your heart isn’t in it. You say the right things on social media while thinking the wrong ones in private.

This is the performative paradox: the act of measurement destroys the authenticity of what’s being measured. A genuine act of kindness done for intrinsic reasons becomes a cynical bid for points once a score enters the picture.

Campbell’s Law—named after social scientist Donald Campbell and often called Goodhart’s evil twin—predicts this exactly: “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.” In plain English: the moment you measure something to change behavior, people start gaming the measurement instead of doing the thing you actually wanted.

2. It Privileges the Measurable Over the Meaningful

What gets measured gets managed—and what doesn’t get measured gets ignored.

Social credit systems inevitably reward behaviors that are easy to quantify while ignoring those that actually matter. Paying your bills on time? Easy to track. Being a loving parent? Impossible to reduce to a number.

The result is a civilization optimized for legibility rather than flourishing. The grandmother who spends decades caring for a disabled grandchild—unmeasured, unrewarded. The influencer who posts propaganda for the state—swimming in points.

3. It Creates Irreversible Harm

Credit scores in the West already demonstrate this problem. A medical emergency in your twenties—a period when you were uninsured and made $23,000 a year—can haunt your credit report for seven years, locking you out of housing and employment at the exact moment you’re trying to rebuild your life.

Behavioral scoring amplifies this by orders of magnitude. Did you attend a protest in 2028 that the government later deemed “disruptive”? That marker stays on your file. Did you post something edgy on social media when you were 19? It’s in the training data now.

There’s no forgiveness in algorithmic judgment. No recognition that people change. No understanding that the person who defaulted on a loan during the 2008 recession is fundamentally different from a serial fraudster.

4. It Enables Political Weaponization

The Chinese blacklists—even in their current, limited form—can be applied to dissidents, journalists, and anyone who embarrasses the wrong official. Once you’re on the list, you can’t fly, can’t take high-speed trains, can’t send your kids to good schools. By 2023, over 28 million flight bans and 6 million high-speed rail bans had been imposed, according to data from China’s Supreme People’s Court.

But you don’t need authoritarian intent to see how this plays out. Imagine an American president who decides that anyone who’s ever been arrested at a climate protest should lose their federal student loans. The database exists. The legal framework could be constructed. The temptation is permanent.


How Unscarcity Does It Differently: The Architecture of Civic Standing

If behavioral scoring is so dangerous, how can Unscarcity have anything like Civic Standing or Impact Points?

The answer lies in architectural constraints that seem subtle but make all the difference.

Separation of Survival from Status

The most important safeguard is absolute: Your access to the Foundation—housing, food, healthcare, energy, education—is completely decoupled from any reputation metric.

You cannot lose your home because of a low Civic Standing score. You cannot be denied healthcare because you haven’t contributed enough Impact Points. The Foundation delivers to everyone who passes the Spark Threshold—a simple test that you’re a conscious being capable of experiencing suffering and flourishing—unconditionally.

This is not a minor detail. It’s the entire point.

In China’s system (and in Western credit scoring), the stick is survival itself. Pay your debts or lose access to housing. Follow the rules or lose the ability to travel. The punishment for nonconformity is material deprivation.

In Unscarcity, the worst that can happen to a low-reputation citizen is that they have less influence over optional frontier opportunities—things like who gets priority for life extension research or interstellar missions. They still eat. They still have healthcare. They still live with dignity.

This transforms the nature of the game entirely. You’re not playing to survive—you’re playing for the privilege of additional influence over genuinely scarce opportunities.

Narrow, Transparent, Contestable

Civic Standing measures exactly three things:

  1. Completion of Civic Service (did you contribute your time to community infrastructure?)
  2. Validated contributions (have you done things that others found valuable?)
  3. Adherence to Commons norms (did you keep your commitments within your community?)

That’s it. Not your political opinions. Not your social media posts. Not who your friends are. Not what books you read.

The metrics are:

  • Published and auditable: The algorithms are open-source, inspectable by anyone
  • Contestable: There’s a due process system for challenging assessments
  • Decaying: Per Axiom IV (Power Must Decay), old contributions fade over time, preventing permanent hierarchies
  • Locally interpretable: Different Commons can weight these factors differently according to their values
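The decay and local-weighting properties can be made concrete with a small sketch. This is an illustrative model, not the book’s specification: the `Contribution` record, the point values, and the two-year half-life are all assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Contribution:
    kind: str          # e.g. "civic_service", "validated_contribution", "norm_adherence"
    points: float      # value assigned when the Commons validated it
    age_days: float    # time since validation

def decayed_value(points: float, age_days: float, half_life_days: float = 730.0) -> float:
    """Axiom IV (Power Must Decay): old contributions fade exponentially,
    so standing must be continually re-earned."""
    return points * 0.5 ** (age_days / half_life_days)

def civic_standing(contributions: list[Contribution],
                   weights: dict[str, float]) -> float:
    """Each Commons supplies its own weights, so the same contribution
    history can yield different standing under different local values."""
    return sum(weights.get(c.kind, 0.0) * decayed_value(c.points, c.age_days)
               for c in contributions)

history = [
    Contribution("civic_service", 100.0, age_days=0.0),
    Contribution("civic_service", 100.0, age_days=730.0),  # one half-life old
]
print(civic_standing(history, {"civic_service": 1.0}))  # 150.0
```

Because the decay function and the weights are open parameters, the whole computation stays auditable: anyone can recompute their own score from the published contribution log.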

No Surveillance Infrastructure

Here’s what Unscarcity explicitly doesn’t build: mass behavioral surveillance.

Resource coordination uses anonymized, aggregated demand data—like how Google Maps knows traffic patterns without tracking individual cars. The system knows “this neighborhood needs more food deliveries on Tuesday” without knowing “Maria ordered extra groceries because her mother is visiting.”

Civic Standing accrues through voluntary registration of contributions, not passive observation. You log that you completed your Civic Service term. Your Commons validates that you fulfilled your mentorship commitment. These are discrete, explicit acts—not continuous monitoring.
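The aggregation-only design can be sketched in a few lines. The bucket keys and the suppression threshold here are assumptions for illustration; the point is structural: individual identities never enter the pipeline, and buckets too small to hide in are dropped entirely.

```python
from collections import Counter

K_ANONYMITY = 5  # suppress any bucket with fewer entries than this (assumed threshold)

def demand_signal(orders: list[tuple[str, str]]) -> dict[tuple[str, str], int]:
    """Aggregate (neighborhood, weekday) order counts, discarding who ordered.
    Buckets below the anonymity threshold are suppressed, so the planner sees
    'more deliveries needed Tuesday' but can never recover 'Maria ordered extra'."""
    counts = Counter(orders)
    return {bucket: n for bucket, n in counts.items() if n >= K_ANONYMITY}

orders = [("riverside", "tue")] * 7 + [("hilltop", "tue")] * 2
print(demand_signal(orders))  # {('riverside', 'tue'): 7}; hilltop bucket suppressed
```

Note what the function signature cannot express: there is no person identifier anywhere in the input, so no amount of downstream processing can re-identify anyone.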

Designed Against Weaponization

The Diversity Guard mechanism—requiring consensus across demonstrably different Commons for major decisions—makes it structurally difficult to weaponize reputation systems for political purposes.

If the Heritage Commons (traditionalist, religious) and the Synthesis Commons (progressive, pluralist) both have to agree that someone deserves sanctioning, it’s a lot harder for any one faction to abuse the system against its enemies.

And the Axiomatic Hierarchy—the ranked list of principles where higher ones override lower ones, like how constitutional rights override regular laws—makes certain rights truly inviolable. Not even emergency powers can strip someone’s Foundation access based on their reputation score. The architecture makes certain abuses impossible, not just prohibited.
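Both mechanisms can be expressed as a single decision rule. This is a hedged sketch: the function names, the unanimous two-Commons quorum, and the exact list of Foundation rights are assumptions, but the ordering is the point—the hierarchy check runs before any vote is even counted, so no consensus can touch Foundation access.

```python
FOUNDATION_RIGHTS = frozenset({"housing", "food", "healthcare", "energy", "education"})

class SanctionBlocked(Exception):
    """Raised when a sanction targets a right the Axiomatic Hierarchy protects."""

def approve_sanction(target_right: str, approvals: dict[str, bool]) -> bool:
    """Diversity Guard: a sanction passes only if demonstrably different
    Commons unanimously agree. Axiomatic Hierarchy: Foundation access
    outranks any reputation-based sanction, regardless of consensus."""
    if target_right in FOUNDATION_RIGHTS:
        raise SanctionBlocked("Foundation access is inviolable by design")
    # Require unanimous approval across at least two distinct Commons.
    return len(approvals) >= 2 and all(approvals.values())

# One faction acting alone cannot sanction its enemies:
print(approve_sanction("frontier_priority", {"heritage": True, "synthesis": False}))  # False
```

The abuse the architecture forbids is not merely disallowed at runtime; it is unreachable code: there is no parameter combination under which `approve_sanction` returns `True` for a Foundation right.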


The Deeper Lesson: Systems Create Incentives, Incentives Create Cultures

The reason to study social credit systems—both China’s bureaucratic version and the West’s corporate-fragmented version—isn’t just to avoid copying their obvious mistakes.

It’s to understand a deeper truth: the systems you build shape the people who live within them.

A society where every act is scored, where surveillance is total, where your material survival depends on conformity—that society will produce anxious, performative, cynical citizens who smile for the camera while harboring resentment in their hearts.

A society where basic dignity is unconditional, where reputation is narrow and transparent, where influence must be continually re-earned through genuine contribution—that society might produce citizens who act authentically, because they don’t have a sword hanging over their necks.

We say “might” because nobody has actually tried it at scale yet. But the architecture matters. The incentives matter. The defaults matter.

Unscarcity’s bet is that if you remove the survival threat, narrow the scope, maximize transparency, and build in decay—you can have trust metrics that enhance coordination without corroding autonomy.

The Chinese experiment proved that comprehensive behavioral scoring doesn’t even work on its own terms—the system stalled, the pilots fizzled, the population grew cynical. The American experiment proved that fragmented corporate scoring creates a Kafkaesque nightmare where no one is in charge but everyone is judged.

There has to be a third way. The architecture of Civic Standing is our attempt to find it.

