Scan Your Eyes to Get Paid

Nine countries banned it. Altman's company is still scanning.

The device is a silver sphere about the size of a bowling ball. Sam Altman's company calls it an Orb. You sit in front of it, it captures the pattern of both irises, and in exchange you get two things: a "World ID" (a digital proof that you're a human being) and a share of WLD cryptocurrency tokens. The company frames this as a form of universal basic income for the AI age.

In Part 1, I covered 50 years of UBI pilot data. The results were consistent: people don't stop working, poverty drops, spending goes to food and rent. In Part 2, I followed the money behind UBI research and found the same tech executives funding basic income pilots were spending billions to block regulation of the AI displacing workers. This part is about what happens when income distribution requires you to hand over something you can never get back.

The Orb's Terms of Service

World, rebranded from Worldcoin in October 2024, is co-founded by Sam Altman, the CEO of OpenAI. The stated purpose is "proof of personhood," a system for verifying that a person is real in an internet increasingly filled with AI. The mechanism is an iris scan and the reward is cryptocurrency, which makes the exchange explicit: no scan, no ID, no income.

World says it deletes raw iris images and keeps only encrypted iris codes that are "mathematically irreversible." Regulators in nine countries have disputed some combination of whether consent was genuine, whether data was actually deleted, and whether offering cryptocurrency to economically desperate populations constitutes coercion.

The regulatory record so far spans suspension orders, fines, and outright bans: Kenya suspended enrollment in 2023, Spain and Portugal ordered temporary halts in 2024, and Hong Kong ordered the scanning stopped entirely.

A cybersecurity expert quoted by Fox Business during World's US launch put it plainly: "Once you link an unchangeable biometric like your eye to a global ID system, you can't take it back. It's the ultimate honeypot for surveillance."

The encrypted iris codes cannot be revoked. If the system is breached in 2030, the biometric data is permanently compromised, and unlike a password, there's no reset option.
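The asymmetry with a password can be sketched in a few lines. This is an illustration only, not World's actual scheme: real iris systems use fuzzy matching against templates rather than exact hashes, and the strings below are stand-ins.

```python
import hashlib

def credential(secret: str) -> str:
    """One-way transform: easy to compute, infeasible to invert."""
    return hashlib.sha256(secret.encode()).hexdigest()

# A password is replaceable. After a breach, you pick a new secret,
# and the stolen credential stops matching anything.
old_pw = credential("hunter2")
new_pw = credential("correct horse battery staple")
assert old_pw != new_pw  # the credential rotated; the leak is now worthless

# A biometric is not replaceable. The input is fixed for life, so
# re-enrollment reproduces the same code the attacker already holds.
iris = "stand-in for a fixed biometric template"
assert credential(iris) == credential(iris)  # nothing to rotate
```

The "mathematically irreversible" claim addresses inverting the code back to an iris image; it does nothing about the code itself being stolen and matched forever.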

India Already Built This

If you want to know what biometric-gated welfare looks like at scale, India has been running the experiment for over a decade.

Aadhaar is a 12-digit national identification system using fingerprints and iris scans, now linked to welfare benefit delivery for more than one billion people. It was launched as voluntary and limited in scope. Enrollment is now effectively mandatory for most national welfare programs, and India still has no comprehensive data protection law governing how the biometric data is used.

The failure rates are the part nobody building these systems wants to discuss. According to data cited in Economic and Political Weekly and peer-reviewed academic analysis, the biometric authentication failure-to-match rate hit 49% in Jharkhand state. In Rajasthan: 37%. The people failing authentication are elderly citizens whose fingerprints have degraded, manual laborers whose prints are worn smooth, and rural residents in areas where connectivity drops during authentication. The system was designed around people whose bodies cooperate with the scanner, and anyone who doesn't gets locked out of their food rations, pension payments, and wages. In 2025, the Aadhaar authority itself acknowledged the high failure rates and issued new scrutiny guidelines.
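Percentages like these are easy to understate. A back-of-the-envelope calculation shows the scale; the enrolled-population figures below are hypothetical placeholders, and only the 49% and 37% failure rates come from the reporting cited above.

```python
def excluded(enrolled: int, failure_rate: float) -> int:
    """People locked out per authentication cycle at a given
    failure-to-match rate."""
    return round(enrolled * failure_rate)

# Hypothetical ration-program populations, illustrative only:
for label, enrolled, rate in [
    ("Jharkhand-scale program (10M users, 49% failure)", 10_000_000, 0.49),
    ("Rajasthan-scale program (20M users, 37% failure)", 20_000_000, 0.37),
]:
    print(f"{label}: ~{excluded(enrolled, rate):,} failed authentications")
```

At a 49% failure rate, a program of ten million users produces roughly 4.9 million failed authentications per cycle, each one a person standing at a ration shop with nothing to show for it.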

The human cost of those failure rates is documented. Activist-gathered data compiled by the Right to Food Campaign recorded 42 hunger-related deaths since 2017, with researchers attributing 25 to Aadhaar-related access failures. An 11-year-old girl in Jharkhand, Santoshi Kumari in Simdega district, reportedly died of starvation after her family's food ration access was cancelled for failure to link to Aadhaar. The Indian government disputed the causal attribution in several cases, arguing other factors contributed. But the pattern is consistent: biometric authentication fails, benefits stop, and the person on the other end of that failure has no fallback.

If you require a scan to receive food rations, some percentage of the people who need them most will be excluded by the scanner. That's not a theoretical risk. It's what happened.

What Happens After the Scan

Biometric identity handles enrollment. The question is what happens once someone passes the scan and receives the payment.

If a future UBI is delivered through a Central Bank Digital Currency, the payments can carry conditions baked into the currency itself. The OECD's 2023 analysis of CBDCs and democratic values described how programming money enables regulators to limit when, where, and how it's spent. The IMF's 2024 FinTech Note on CBDC data use described the same instruments as programmable, capable of restricting, redirecting, or denying access to funds based on rules set by central authorities. A 2020 CoinDesk analysis outlined the convergence early: "How Central Banks Could Use Digital Cash to Deliver Universal Basic Income."

What does "programmable" look like in practice? Expiration dates, where your UBI vanishes if you don't spend it by the end of the month. Category restrictions, where the currency works at a grocery store but not a liquor store. Geographic limits that confine spending to your registered area. Behavioral triggers that adjust your balance based on compliance with conditions set by whoever administers the system. All of these are described as features, not hypotheticals, in official policy documents from the institutions that would implement them.
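Those four features amount to a rule engine attached to every unit of currency, and that is essentially what a programmable CBDC is. A minimal sketch, with field names and rules invented for illustration rather than drawn from any central bank's design:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ProgrammableGrant:
    """A benefit payment whose conditions travel with the money."""
    balance: float
    expires: date                 # expiration date
    allowed_categories: set[str]  # category restrictions
    allowed_region: str           # geographic limit
    compliance_ok: bool = True    # behavioral trigger

    def can_spend(self, amount: float, category: str,
                  region: str, today: date) -> bool:
        return (self.compliance_ok
                and today <= self.expires
                and category in self.allowed_categories
                and region == self.allowed_region
                and amount <= self.balance)

grant = ProgrammableGrant(
    balance=500.0,
    expires=date(2030, 6, 30),
    allowed_categories={"grocery", "rent", "utilities"},
    allowed_region="district-12",
)

# Groceries, in-district, before expiry: allowed.
assert grant.can_spend(80.0, "grocery", "district-12", date(2030, 6, 1))
# The same purchase one day after expiry: the money is simply gone.
assert not grant.can_spend(80.0, "grocery", "district-12", date(2030, 7, 1))
# A disallowed category, even in-district and in-date: denied.
assert not grant.can_spend(20.0, "alcohol", "district-12", date(2030, 6, 1))
```

Every deny branch in `can_spend` is a policy lever the issuer retains after the payment has nominally arrived.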

With a paper check or a bank transfer, the money is yours once you receive it. The government can set eligibility rules on the front end, but once the payment clears, the recipient decides how to use it. A programmable CBDC changes that relationship. The issuing authority retains control of the money after it reaches the recipient's account. That's a fundamentally different kind of payment, and it's the kind that every major central bank is actively researching.

China is already running a version of this intersection. On the private side, Ant Financial's Zhima Credit Score (the credit rating system for Alipay's hundreds of millions of users) rates people based on spending behavior. Li Yingyun, technology director of Zhima Credit, told Chinese magazine Caixin in February 2015 that buying diapers is "responsible" while playing video games for ten hours a day would lower one's score. On the government side, people on social credit blacklists face restrictions on travel, restaurants, housing, and insurance. A 2021 report from the Center for a New American Security specifically analyzed the risk that China's digital yuan would be integrated with the social credit system to expand behavioral control over financial transactions.

English-language reporting has sometimes overstated how uniformly China's system operates. Some reported features are proposals, not implemented policy. But the technical capability is real: digital currency programmed with behavioral conditions, delivered through a biometric identity system. The OECD and IMF documented exactly that architecture, and China is the closest thing to a live deployment of it.

Who Benefits

Put both halves together: a biometric identity system that gates who receives income, and a digital currency that controls how they spend it. The people who build and operate that infrastructure hold two levers over every recipient at once.

World gets a global biometric identity database. Even if the stated intent is proof-of-personhood, the infrastructure is a registry of iris patterns linked to financial accounts. Whoever controls that registry controls enrollment. Sam Altman's company is building this while simultaneously running the AI company most aggressively pursuing artificial general intelligence, the technology most likely to displace the workers that UBI is supposed to help.

Governments adopting CBDC-delivered benefits get something they've never had with cash or checks: the ability to dictate what recipients buy, when they buy it, and where. That's a policy tool with legitimate uses (preventing fraud, for example) and obvious abuse potential (punishing disfavored behavior, restricting political activity, conditioning payments on compliance).

The people who benefit least from this architecture are the ones the payments are supposed to help. India's Aadhaar proved that the most vulnerable populations (elderly, disabled, rural, manual laborers) are the most likely to fail biometric authentication. Adding programmable spending restrictions on top of that doesn't make the system more equitable. It builds a second gate behind the first one.
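Stacked gates compound. If each gate independently blocks some fraction of legitimate recipients, the share who clear both is the product of the pass rates. The 10% spending-restriction figure below is a made-up placeholder; only the 37% biometric failure rate comes from the Rajasthan data above.

```python
def clears_both(biometric_fail: float, spend_block: float) -> float:
    """Fraction of recipients fully served by a two-gate system,
    assuming the two failure modes are independent."""
    return (1 - biometric_fail) * (1 - spend_block)

# Rajasthan's 37% biometric failure rate, plus a hypothetical 10%
# of needed purchases blocked by programmable spending rules:
print(f"{clears_both(0.37, 0.10):.0%} of recipients fully served")
# prints: 57% of recipients fully served
```

A system that already fails a third of its users does not become more equitable by adding a second way to fail.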

The Design Choice

UBI can be delivered by check, direct deposit, or prepaid debit card. Alaska has been distributing dividend payments for 43 years without scanning anyone's eyes or programming what the money can buy. The Stockton pilot used prepaid debit cards. Finland used its existing social insurance system. GiveDirectly's Kenya program sends mobile money transfers. None of them required biometric enrollment or a blockchain. The money reached recipients and the recipients decided what to do with it.

The surveillance layer being built around income distribution is a design choice. There's no technical reason a cash transfer requires an iris scan or a programmable currency. It's being built by specific people with specific financial interests in the identity infrastructure, and tested first on populations where regulatory pushback is weakest. Nine countries have looked at the terms and rejected them. India has been running the experiment for over a decade and documented the outcome. The countries that haven't acted yet have both of those records available to them.

Part 4 of this series does the cost math. What would a national UBI actually cost, what could fund it, and what are the alternatives that nobody with a venture capital portfolio is promoting?