Author: The Guardian
Compiled by: Deep Tide TechFlow
Deep Tide Introduction: This investigative report reveals a rapidly growing gray industry: thousands of people worldwide are earning money for AI training by selling their voices, faces, call records, and daily videos.
This is not a general discussion about privacy controversies, but an investigation with real people, real amounts of money, and real consequences—an actor who sold his face later saw "himself" promoting an unknown medical product on Instagram, with people in the comments evaluating his "looks."
When the data hunger of AI companies combines with global economic disparities, it is creating an unequal transaction.
Full text as follows:
One morning last year, Jacobus Louw, who lives in Cape Town, South Africa, went out for his usual walk, feeding seagulls along the way. But this time he recorded a few videos—filming his feet and the view ahead as he walked along the sidewalk. The footage earned him $14, roughly 10 times the country's hourly minimum wage and equivalent to half a week's food expenses for the 27-year-old.
This was a "city navigation" task Louw completed on Kled AI. Kled AI is an app that pays users to upload photos, videos, and other data for training AI models. In just a few weeks, Louw earned $50 by uploading photos and videos from his daily life.
Thousands of miles away, in Ranchi, India, 22-year-old student Sahil Tigga regularly earns money through Silencio—an app that crowdsources audio data for AI training by accessing his phone's microphone to capture ambient noise in places like restaurants and busy intersections. He also uploads recordings of his own voice. Tigga makes special trips to distinctive locations, such as hotel lobbies not yet recorded on Silencio's map. He earns over $100 a month this way, enough to cover all his food expenses.
In Chicago, 18-year-old welding apprentice Ramelio Hill sold his private phone conversations with friends and family to Neon Mobile—a conversational AI training platform that pays $0.50 per minute—earning a few hundred dollars. For Hill, the calculation was simple: he believes tech companies already have vast amounts of his private data, so he might as well get a share of the profits.
These "AI training gigs"—uploading surroundings, personal photos, videos, and audio—are at the forefront of a new global data gold rush. As Silicon Valley's hunger for high-quality human data exceeds what can be scraped from the open internet, a booming data market industry has emerged to bridge this gap. From Cape Town to Chicago, thousands of people are micro-licensing their biometric identities and private data to the next generation of AI.
But this new gig economy comes at a cost. Behind the few dollars earned, these trainers are fueling an industry that may ultimately render their skills obsolete, while exposing themselves to future risks of deepfakes, identity theft, and digital exploitation—risks they are only beginning to understand.
Keeping the AI Gears Turning
AI language models like ChatGPT and Gemini require massive amounts of material to keep improving, but they are facing a data shortage. Roughly a quarter of the highest-quality data in the most commonly used training corpora—C4, RefinedWeb, and Dolma—now comes from websites that restrict generative AI companies from using their content to train models. Researchers estimate that AI companies could run out of fresh, high-quality text as early as 2026. Although some labs have begun training models on synthetic data generated by AI itself, this recursive process tends to produce error-filled "garbage" outputs and can eventually lead to what researchers call model collapse.
This is where apps like Kled AI and Silencio come in. In these data markets, millions of people feed and train AI by selling their identity data. Beyond Kled AI, Silencio, and Neon Mobile, AI trainers have plenty of choices: Luel AI, backed by the well-known incubator Y Combinator, buys multilingual conversation recordings at about $0.15 per minute; ElevenLabs lets users digitally clone their voices and make them available for others to use, at a base rate of $0.02 per minute.
Bouke Klein Teeselink, an economics professor at King's College London, says AI training gigs are an emerging work category that will grow significantly.
AI companies know that paying people to license their data helps avoid the copyright disputes that can arise from relying entirely on web-scraped content, Teeselink says. AI researcher Veniamin Veselovsky adds that these companies also need high-quality data to teach their systems new and better behaviors. "For now, human data is the gold standard for sampling outside the model's distribution," he said.
The humans driving these machines—especially those in developing countries—often need the money and have few alternatives. For many, AI training gigs are a pragmatic response to economic disparity. In countries with high unemployment and depreciating currencies, earning dollars is often more stable and lucrative than local work. Some struggle to find entry-level jobs and turn to AI training out of necessity. Even in wealthier countries, rising living costs make selling one's own data a logical financial choice.
Cape Town-based AI trainer Louw is well aware of the privacy cost. Although the income is unstable and not enough to cover all his monthly expenses, he is willing to accept those terms to earn money. He has suffered from a neurological condition for years, which makes finding work difficult, but the money he earned from AI data markets (including Kled AI) allowed him to save $500 to enroll in a spa training course to become a massage therapist.
"As a South African, receiving dollars is more valuable than people think," Louw said.
Mark Graham, a professor of internet geography at Oxford University and co-author of "Feeding the Machine," acknowledges that for individuals in developing countries the money may matter in the short term, but he warns: "Structurally, this work is unstable, has no upward mobility, and is essentially a dead end."
Graham added that AI data markets rely on "a race to the bottom in wages" and a "temporary demand for human data." Once that demand shifts, "workers will have no security, no transferable skills, and no safety net."
Graham said the only winners are "the platforms in the Global North, which capture all the lasting value."
Full Authorization
Chicago-based AI trainer Hill has mixed feelings about selling his private phone calls to Neon Mobile. About 11 hours of call recordings earned him $200, but he said the app often went offline and delayed payments. "Neon always seemed suspicious to me, but I kept using it just to earn some extra pocket money to pay bills," Hill said.
Now he is reconsidering whether the money was really that easy. Last September, Neon Mobile went offline just weeks after launching, after TechCrunch discovered a security flaw that allowed anyone to access users' phone numbers, call recordings, and transcripts. Hill said Neon Mobile never notified him of the breach, and he now worries his voice could be misused online.
Jennifer King, a data privacy researcher at Stanford University's Institute for Human-Centered Artificial Intelligence, is concerned that AI data markets are not clear about how and where user data will be used. She added that without understanding their rights or being able to negotiate them, "consumers face the risk of their data being reused in ways they dislike, do not understand, or did not anticipate, with almost no recourse."
When AI trainers share data on Neon Mobile and Kled AI, they grant a full authorization (global, exclusive, irrevocable, transferable, and royalty-free) allowing the platform to sell, use, publicly display, and store their likeness, and even create derivative works based on it.
Kled AI founder Avi Patel said his company's data agreement limits use to AI training and research purposes. "The entire business model relies on user trust. If contributors think their data might be misused, the platform cannot function." He said the company vets buyers before selling datasets, avoiding organizations with "suspicious intent," such as the pornography industry, and "government agencies" it believes might use the data in ways that violate that trust.
Neon Mobile did not respond to requests for comment.
Enrico Bonadio, a law professor at City, University of London, pointed out that these agreement terms allow the platform and its clients to "do almost anything with that material, permanently, without additional payment, and contributors have no practical way to withdraw consent or renegotiate."
There are more worrying risks, too: trainers' data could be used for deepfakes and identity impersonation. Although data markets say they strip identifying information (such as names and locations) from data before selling it, biometric patterns are inherently difficult to anonymize in any meaningful way, Bonadio added.
Seller's Remorse
Even if AI trainers could negotiate more detailed protections for how their data is used, they might still come to regret it. In 2024, New York-based actor Adam Coy sold his likeness for $1,000 to Captions—an AI video-editing company since renamed Mirage. His agreement stipulated that his likeness would not be used for political purposes or to promote alcohol, tobacco, or pornography, and that the license would last one year.
Captions did not respond to requests for comment.
Soon after, Coy's friends began forwarding videos they had found online featuring his face and voice, which had garnered millions of views. In one Instagram video, Coy's AI replica claimed to be a "vaginal doctor" promoting unverified medical supplements to pregnant and postpartum women.
"It's embarrassing to explain this to others," Coy said.
"The comments were weird because they were evaluating my appearance, but it wasn't even me," Coy added. "My thinking when I made the decision (to sell my likeness) was that most models would scrape data and portraits from the internet anyway, so I might as well get paid."
Coy said he has not taken any AI data gigs since. He said he would only consider doing it again if a company offered significant compensation.