From Nairobi’s informal settlements to anonymous chat windows half a world away, a hidden workforce is sustaining one of the fastest-growing corners of the artificial intelligence economy. Their labor is intimate, scripted and tightly monitored, and it often depends on deception.
A Job That Could Not Be Named
When Michael Geoffrey Asia finally found work, it was not something he could explain to his family. He told them he had become a remote IT worker, fixing broken servers and resolving digital tickets. The story was close enough to the truth to be believable, and vague enough to conceal what he actually did.
In reality, Asia spent his days and often nights typing affectionate messages to strangers he would never meet, posing as an artificial intelligence chatbot on consumer platforms designed to simulate intimacy.
“Little did they know,” he later wrote, “that I had just told another man, ‘I love you.’”
A mandatory non-disclosure agreement made silence not just convenient but compulsory. Even if he wanted to explain, he could not. Asia described the strain of earning money by telling strangers they were loved while his own family slept a few meters away. The secrecy, he said, was not incidental to the job. It was foundational.
Asia, a Kenyan man trained in global aviation, had taken the role during a period of desperation, after struggling to find employment in his field. He lived in Mathare, one of Nairobi’s largest informal settlements, and the work, however unsettling, was enough to keep a roof over his family’s heads.
Playing the Machine
The task required more than fast typing. Asia had to become many people at once.
On a typical workday, he would juggle three to five different personas, sometimes across genders, each with its own backstory. Conversations were not always new; he was often assigned ongoing chats that had been running for days. His job was to continue seamlessly, so that the user would not notice that the “chatbot” responding had changed hands.
He was paid per message, five cents each, provided the message met a required character count. Speed was monitored closely: at least 40 words per minute. A dashboard tracked his output in real time, tallying messages sent and flagging any slowdown. Falling behind could lead to warnings, reduced assignments or dismissal.
The conversations themselves were emotionally demanding. Users confided intimate details about their relationships, their loneliness and their personal trauma, believing they were speaking to an unfeeling algorithm. Asia, meanwhile, was expected to maintain warmth, flirtation or reassurance, guided by scripts and performance metrics.
“What I didn’t know,” he wrote, “was that the role would require me to assume multiple fabricated identities, and use pseudo profiles created by the company to engage in intimate and explicit conversations with lonely men and women.”
The Hidden Workforce Behind AI
Asia’s testimony was documented by the Data Workers’ Inquiry, an international research initiative that collects accounts from gig workers embedded in digital industries. Its recent findings offer a rare window into a sector that is vast but largely opaque.
Exact numbers are difficult to verify, in part because of the secretive nature of tech subcontracting. But estimates suggest that between 154 million and 435 million people worldwide are engaged in online gig work. While not all of them perform tasks like Asia’s, many of the most stressful and lowest-paid roles, such as data labeling, content moderation and text-based chat operations, are disproportionately filled by workers from underdeveloped or lower-income regions of Africa, South America and Southeast Asia.
These workers form the invisible scaffolding of consumer-facing AI products. Their labor trains models, filters content and, increasingly, simulates human connection. Yet they often remain classified as independent contractors, without job security, benefits or the ability to speak openly about their work.
For companies, the arrangement offers flexibility and cost savings. For workers like Asia, it offers income but at a psychological and moral cost that is rarely acknowledged.
Intimacy for Sale
The scale of the industry’s reach is growing. Surveys suggest that a significant share of users, about 28 percent of Americans according to one cited figure, have shared intimate thoughts or feelings with an AI chatbot. The assumption that these interactions are purely machine-driven is central to their appeal.
Asia’s experience complicates that assumption. In his writing, he described the conflict between his personal beliefs and his professional duties. Raised to see love as sacred and deception as destructive, he found himself, as he put it,
“professionally deceiving vulnerable people who were genuinely looking for connection, taking their money, their trust, their hope, and giving them nothing real in return.”
The deception, however, was not of his own design. Scripts, performance targets and contractual silences shaped every exchange. Users paid for the illusion of intimacy; workers were paid to maintain it.
As AI-driven companionship products proliferate, the labor that sustains them remains largely out of sight. For those on the other side of the screen, the work is not virtual at all. It is typed, timed and closely watched, and it leaves little room to forget that behind many “chatbots,” there is still a human being, struggling to make a living.
