When Google's algorithms work like they're supposed to — when they show you the bit of information you're looking for and not nonsense or weird porn or propaganda or dangerous medical advice — you have people like Ed Stackhouse to thank.
Stackhouse is what Google calls a "rater." When its search engine is unsure about the results it's produced for a given search, Stackhouse's job is to decide which one people should see.
This can be a time-consuming and research-intensive process. It can also be harrowing, a window into the darkest corners of the human experience. "The one I've been seeing the most lately are the suicide forums," he said. "'What's the best method to hang myself if I'm 5'9"?' 'What length rope should I use?' "
Another thing he's been seeing in the last few weeks is a new kind of search result: what appear to be AI-generated articles, with a prompt to designate them as accurate or false, appropriate or creepy, stylistically on point or off target.
"They don't actually tell us this is Bard," Stackhouse said, "but this has Bard's fingerprints all over it."
Bard, if you're not familiar, is Google's entrant in what's being hailed as an AI arms race, which has the biggest players in Big Tech running all-out to perfect and market chatbots that can converse like humans. Its primary rival is Microsoft, which is eager to show off a new version of Bing, its also-ran search engine, freshly infused with the internet sensation known as ChatGPT. (Facebook parent Meta, circling the periphery, is testing the waters to see if its scandal-laden reputation will allow it to compete.)
The stakes could prove to be quite high: As things stand, Google has a near-monopoly in search, collecting $40 billion in revenue per quarter from that part of its business. Microsoft Chief Executive Satya Nadella thinks the shift to conversational search could reset the scoreboard. "I'm excited for users to have a choice finally," he told the Wall Street Journal.
Investors seem to agree: The market capitalization of Google's parent company, Alphabet, plunged by more than $100 billion after Bard made an error in its initial demo, offering a glimpse of both the financial incentives at play and the costs of getting it wrong.
Search engines have always been unreliable conduits for truth, but chatbot search interfaces are particularly treacherous, owing to what's been dubbed the "one true answer" problem: Rather than presenting a list of possible results and inviting users to select among them, they serve up one canonical answer. When the question is "Is it safe to boil a baby?" and the answer is "Yes" — a real-life example from Bing — you've got trouble.
That being the case, you might expect the companies involved to put a premium on the quality-control work that helps their supposedly smart new software tools avoid stupid blunders. Instead, the humans who do that work are treated as disposable.
"We make some of the lowest wages in the U.S.," Stackhouse, who has been a rater for nearly a decade, says. "I personally make $3 less per hour than my daughter working in fast food."
In early February, as Google and Microsoft were prepping their splashy chatbot announcements, Stackhouse joined a group of raters in delivering a petition to Google, demanding that the company meet its own stated minimum standards. Raters often make as little as $14 an hour — less than the $15 that Google promises contractors. Stackhouse has a serious heart condition requiring medical management, but his employer, Appen — whose sole client is Google — caps his hours at 26 per week, keeping him part-time and ineligible for benefits. This is standard policy for thousands of raters.
The raters join a wave of discontented contractors in speaking out: workers who ensure YouTube videos contain the correct metadata went on strike the same week, alleging unfair labor practices. That's not even the bottom of the barrel. A Time magazine investigation revealed that contractors in Kenya were being paid less than $2 an hour to review content for ChatGPT — much of it so toxic that it left them traumatized.
There's a certain cruel irony in the fact that as the highest-profile technology in years makes its debut, the people best suited to keep it on the rails hold the most precarious positions at the companies that need them. That's no accident. A chatbot is a sort of magic trick; for the illusion to work properly, the assistants curled up inside the box must remain hidden from the audience, their contribution unremarked.
While Google and Microsoft want you to forget that they exist, for the workers, forgetting doesn't come so easily.
"I have seen tasks with racial slurs, bigotry, and violence," Stackhouse says, as well as grisly medical gore and hardcore pornography. "There are times that I have seen some of the graphic content replayed in my dreams. This is why I never work late at night anymore. Twice in my 10 years, I have seen child porn but thank God that is ultra-rare. I would quit."
Worst of all, perhaps: if you want work, you can't limit the kind of content you review. "If you opt out of pornographic content, you might see your tasks diminished," Stackhouse says. "So most people do not opt out."
There are as many as 5,000 raters like Stackhouse working at Appen, and thousands more at other contractors rating data for other parts of Google's, Microsoft's and Meta's businesses.
"It's like we're anonymous," Stackhouse says. "We're ghosts."
"We're not asking for a whole lot, just fair pay and to work remotely," Katie Marschher says, "just to be acknowledged as human beings, not just numbers on a spreadsheet." Marschher is currently on strike from a similar job with Cognizant, a contractor whose workforce does quality control for YouTube.
"We are really part of this two-tiered system," she said. "There are Google employees, and they get stock options and paid really well, and there is this whole other workforce."
Marschher and her colleagues recently voted to form a union with the Communications Workers of America. Shortly after, Cognizant informed them it was instituting a mandatory return-to-office policy. Since many of them can't afford to relocate, the workers viewed the move as retaliation. When the company wouldn't budge, the YouTube workers voted to strike.
"We want to take a stab at dismantling this system; it exploits workers and makes them really vulnerable, and we don't think it's right," she says.
It's never been right. But with some of the world's most profitable companies now poised to extend their dominance with a whole new business that would not exist without the toil of their lowest-paid workers, it's high time that the ghosts in the machine show themselves, and get the recognition they're due.