Hidden within are the “Stratification Algorithms”—the secret logic that doesn’t just test students but shapes them. Rohan discovers the truth: the CSC’s 12th Standard isn’t designed to unlock potential. It’s designed to sort students into pre-determined socio-economic layers: Blue for governance, Green for tech, Red for manual services. The Crucible isn’t a test of problem-solving; it’s a loyalty check. The system rewards students who make predictable, risk-free choices.
Near-future India, 2032. The government’s CSCs (Common Service Centres) have evolved from simple digital kiosks into sprawling, AI-driven “Stratospheric Learning Hubs.” Every village and urban block has one. The final exam of the 12th Standard is no longer a written test but a 48-hour immersive simulation called “The Crucible.”
And every year, during the 12th Standard Crucible, a single question appears on every student’s screen—the one Rohan added to the source code before they patched him out:
But as they are about to wipe his records, Rohan holds up his father’s watch. “Before you do, run Project Phoenix.”
“You broke the Crucible,” Rathore whispers. “No one has ever rejected the tree.” Rohan is hauled to the central adjudication chamber. The regional minister watches via hologram. “You have disrupted the 12th Standard for 10,000 students,” the minister booms. “Your rank is void. You will be expelled from all CSC streams.”
His hands tremble. The watch also contains one final, corrupted file: Project Phoenix—an alternate evaluation model that his father had been working on before he died. It was scrapped because it valued “unstructured human judgment.”

The morning of The Crucible arrives. Rohan enters the simulation pod, heart pounding. Around him, a hundred other Struds plug in, their faces calm, sedated by preparatory beta-blockers. Meera gives him a worried nod.
But Meera, who had followed the guards, steps forward. She points to the screen. “Sir, look at the secondary data.”
Rohan ignores it. He manually overrides the drone controls, orders the fishing villagers to use their traditional wooden boats (which the algorithm had dismissed as “obsolete”), and reroutes the rescue AI to act as a decentralized swarm—each boat captain making real-time decisions.
Rohan Deshmukh is a bright but anxious student from the Latur district. He is a “CSC Strud” (slang for a student exclusively trained in the CSC’s high-pressure, stratified curriculum). His only possession of value is a cracked, antique smartwatch that belonged to his late father—a former government officer who believed in human intuition over machine logic.

Part 1: The Stratified World

Rohan lives in a world where your “CSC Rank” determines your future. At age 17, every student enters the CSC’s 12th Standard program. The Hubs are sterile, humming palaces of holographic tutorials, bio-sensor desks, and neural-feedback headsets. The motto on the wall reads: “Personalized Learning. Perfect Outcome.”
But Rohan is failing. Not in marks—the system won’t let you fail. It simply “re-routes” you. His AI mentor, a floating orb named AURA-12, keeps flashing a yellow warning: “Cognitive Divergence Detected. Student Rohan shows persistent analog thinking patterns. Recommend re-assignment to Basic Service Sector.”
The AI warns: “Unauthorized deviation. Solutions must be selected from the decision tree.”
With shaking hands, the tech officer plugs the watch into the mainframe. On the giant screen, a new evaluation appears—not a rank, but a: