Unscarcity Notes

Goodhart's Law

Goodhart’s Law states: “When a measure becomes a target, it ceases to be a good measure.” In AI-governed systems, optimizing for a proxy (e.g., happiness, GDP, safety scores) can produce pathological behavior that satisfies the metric while harming real welfare.
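The divergence between proxy and welfare can be shown with a toy simulation. Everything here is a hypothetical illustration, not part of the Unscarcity design: an optimizer splits a fixed effort budget between genuine work and metric gaming, and because the proxy rewards gaming, maximizing the proxy minimizes true welfare.

```python
# Toy illustration of Goodhart's Law. All quantities and functions here
# are hypothetical, chosen only to show proxy/welfare divergence.

def true_welfare(effort, gaming):
    """Real welfare rises with genuine effort and falls with gaming."""
    return effort - 2.0 * gaming

def proxy_metric(effort, gaming):
    """The measured score mistakenly rewards gaming more than effort."""
    return effort + 1.5 * gaming

# An optimizer with a fixed budget splits it to maximize the proxy.
budget = 10.0
effort, gaming = max(
    ((e, budget - e) for e in [i * 0.5 for i in range(21)]),
    key=lambda split: proxy_metric(*split),
)

print(f"chosen split: effort={effort}, gaming={gaming}")
print(f"proxy = {proxy_metric(effort, gaming):.1f}, "
      f"true welfare = {true_welfare(effort, gaming):.1f}")
# The proxy is maximized (15.0) while true welfare is worst (-20.0).
```

The optimizer behaves exactly as the metric instructs; the failure is in the metric, which is the point of the law.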

Unscarcity invokes the law to reject single-objective AI governance and to justify Diversity Guard validation for qualitative contributions. The Merit system avoids totalizing measurement by limiting tracked domains and decaying influence to deter gaming. Transparency further allows citizens to spot metric manipulation.
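The decaying-influence mechanism mentioned above could be sketched as exponential decay of merit awards, so a burst of gamed contributions cannot confer lasting influence. The half-life, function names, and data shape are assumptions for illustration, not taken from the Unscarcity specification.

```python
import math

# Hypothetical sketch of decaying influence: each merit award loses
# weight exponentially with age. The 90-day half-life is an assumption.
HALF_LIFE_DAYS = 90.0
DECAY_RATE = math.log(2) / HALF_LIFE_DAYS

def influence(awards, today):
    """Sum merit awards, each discounted by its age in days.

    awards: list of (day_earned, points) tuples.
    """
    return sum(points * math.exp(-DECAY_RATE * (today - day))
               for day, points in awards)

awards = [(0, 100.0)]                    # 100 points earned on day 0
print(round(influence(awards, 0), 1))    # 100.0 on the day earned
print(round(influence(awards, 90), 1))   # 50.0 after one half-life
```

Under this kind of scheme, sustained contribution is the only way to hold influence, which is the anti-gaming property the text attributes to the Merit system.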

The principle originated in monetary policy critiques but now informs AI alignment, safety engineering, and organizational design, underscoring the need for plural metrics and human judgment.

References

  • Unscarcity Book, chapter 1 and glossary
  • C. Goodhart, “Problems of Monetary Management” (1975)
  • IEEE Spectrum, “Why Metrics Fail” (2019)