Today’s guest post comes from Debbie Ginsberg, Faculty Services Manager at Harvard Law School Library.
I was supposed to write a blog post about the Harvard AI summit about six months ago. For various reasons (e.g., “didn’t get my act together”), that hasn’t happened. But one of the things that was brought up at the summit was who wasn’t at the table—who didn’t have access, whose data wasn’t included, and similar issues.
Since then, I’ve been thinking about the haves and have-nots of AI. There’s one group that I don’t think gets discussed enough: the giant human workforce that AI needs to function.
Whenever I think of how AI is trained, I imagine a bunch of people somewhat like her (ok, there aren’t so many women and POC in real life, but I’m not going to tell ChatGPT to draw more white men):
And that they’ve been working on processes that look somewhat like this:
But that’s only part of the picture. Underlying all these processes are people like this:
Who are they?
Large AI companies like OpenAI and Google need people to label and refine training data and to handle content moderation. These tasks require workers to view thousands of examples of images and text and say, “This is a cat,” “The AI got this right,” or “This is not offensive,” and then do this over and over again. These are the “ghost workers” behind the machine. Without them, AI doesn’t function.
The workers are generally paid by the piece, which means they often earn very little per hour. For example, some reports claim that OpenAI paid workers in Kenya under $2 per hour to filter questionable content.
The working conditions are not optimal, especially for workers reviewing content. They generally do not receive sufficient training or enough time to do the work they are asked to do. Workers may be employed directly by an AI company, or the company may use a third party like Appen to hire and manage ghost workers (Google used Appen until March 19, after announcing earlier in the year that it was ending the contract).
That said, this work is an essential source of income for many around the world. The jobs are relatively flexible in terms of location and hours, and the workers take pride in their output.
As AI continues to grow, there has been more focus on improving working conditions. For example, the US has launched investigations into some of the large tech companies in response to concerns about how ghost workers are treated. And while some AI experts predict that AI will eventually be able to do this work itself, many others believe that AI will continue to depend on ghost workers for a long time to come.
And considering how much profit is at stake, I’m thinking that maybe they should be paid more than $2/hour.
Footnote:
Did I use AI to write this? Kind of? I used Google’s NotebookLM tool to review my sources and create notes. In addition to the sources above, check out:
- Behind ‘miracle’ AI is an army of ‘ghost workers’ — and they’re speaking out about Appen
- AI: Ghost workers demand to be seen and heard
- The “Ghost Workers” Underpinning the World’s Artificial Intelligence Systems
- ghostwork.org/
- Lost in the crowd? An investigation into where microwork is conducted and classifying worker types – DOI: 10.1177/09596801231171997