Curious, Miriam dug into the bank’s digital tomb. She fed ten years of rejected applications into a model Thomas himself might have built. The result was quiet heresy: sixty percent of those rejected, mostly immigrants, women, and the elderly, would have repaid. The bank’s “fair” scorecard had systematically coded historical bias as risk.
She didn’t go to her boss. Instead, she taught a class of junior data scientists from the book. They built a new algorithm, one that learned from Thomas’s principles but added a conscience: fairness constraints, transparency logs, and a “human override” flag. They called it the Thomas Lens.
When the bank’s quarterly audit revealed the old scorecard’s hidden discrimination, Miriam presented her evidence. The board, cornered by regulators and dazzled by her prototype, adopted the Thomas Lens. Loans began flowing to a forgotten side of the city. Bakeries opened. Repair shops thrived. A single mother bought a delivery van.
Years later, retiring, Miriam placed her worn copy of Credit Scoring and Its Applications by L. C. Thomas into the hands of a young intern. “Remember,” she said, “Thomas taught us how to predict the future. But we decide which future to build.”