Digital Labor Day
Like steelworkers, data workers worldwide deserve safe, humane working environments, fair pay, and recognition of how their human intelligence contributes to AI
My dad worked in a steel mill, mostly electrical maintenance, for 25 years. For decades, steel mills were notorious for being dangerous places to work. Laborers were maimed or killed in the melt shop or on the production lines. The mill was one of the few major employers in our hometown — and the only one, after the Jeep factory and the railroad business shut down — so speaking up about workers’ rights was a risky thing to do for someone who needed to feed their family.
Thanks to people who raised awareness, the US labor movement, and union initiatives, some protections were secured for mill workers over time. By the time my dad was born, the workers in that mill had a strong independent local union that ensured the mill bosses couldn’t force someone to do a hazardous job alone when two people were needed to do it safely, to work on operating equipment that wasn’t properly locked out for safety, or to work strenuous double shifts without proper compensation, and couldn’t fire someone for advocating for workers’ rights.
My dad’s work was still physically strenuous, and could be mentally taxing in bursts when a line was down and every repair minute cost money, but it was manageable. The company even took pride in its safety record and used it in recruiting.
Little-known fact about me: I worked in that steel mill as an intern for two summers while in college (after my dad died), first in the business office typing up purchase orders, then as an Industrial Engineering intern. I had my own hard hat and steel-toed shoes and went out into the mill with my mentor (male, like the rest of the department). Most of the mill workers who had known my dad were pretty nice to me. One I’ll never forget, though, told me directly that women did not belong in the mill. 🙄 If I recall correctly, I somehow managed to say to him, “Thank you for letting me know how you feel,” and calmly walked away with my mentor. Still not sure how I mustered that.
As I’ve been doing research for my Everyday Ethical AI book and connecting with smart people, I’ve been learning more about how data workers supporting AI systems are being treated — or, I should say, mistreated — worldwide, while the human intelligence they contribute to our “collective intelligence” AI ecosystem remains unrecognized. Many of them are in geographic areas where jobs are scarce, and being a complainer could cost them their job.
Like the steelworkers of old, data workers today are being exploited and don’t have much power to fight back. But unlike steelworkers’ jobs, which are mostly physically demanding, many of these data enrichment jobs are mentally demanding: workers often have to review and classify horrific words, images, and videos for hours.
Some details (more are in my 2024 article and in my book): In May 2020, Facebook settled a lawsuit brought by some of its content moderators in four US states (California, Arizona, Texas, and Florida) who suffered from PTSD.1 Many AI companies have now moved their ‘data enrichment’ operations out of the USA. Similar reports of exploitation of data workers have emerged from Kenya, Pakistan, India, the Philippines and Venezuela.2
Over the past year, I’ve seen awareness increasing of how data workers are being mistreated. The exploitation has moved offshore and around the world (“ethics dumping”), but it hasn’t really stopped. It’s a global problem and needs a global solution.
Unionizing might be feasible for data workers in some countries, but would be challenging worldwide. Regulation is feasible, but data workers should not have to wait decades for protection, and our current US and global AI regulations are fragmented.
I couldn’t get this similarity out of my head today while seeing messages about the September 1 “Labor Day” holiday (USA and Canada).
My ask: please do what you can to help raise awareness about this exploitation. And if you use AI tools, please think carefully about supporting (whether with your money, your time, or your data) AI tools that are built on the suffering of inhumanely treated data workers.
(And if you know anyone who has worked for Scale AI as a data worker, please put them in touch with .)
1. Exploited data workers: Facebook will pay $52 million in settlement with moderators who developed PTSD on the job, by Casey Newton / The Verge, 2020-05-12.
2. Exploited data workers: How the AI industry profits from catastrophe, by Karen Hao and Andrea Paola Hernández / MIT Technology Review, 2022-04-20.


Great post, Karen,
Men and women like your father built our country, and no matter how much we immerse ourselves in the Metaverse, we will always need infrastructure. And just as steel built the modern world—bridges, skyscrapers, railways—data is creating the digital world—models, algorithms, intelligent systems. My favorite Tech Bro, Alexandr Wang, refers to Scale AI as a “data foundry,” and sadly, the metaphor doesn’t stop there.
It took centuries (since industrialization) to get fair wages and safe working conditions for laborers. And now AI Tech is pushing us back hundreds of years in its treatment of workers. The great disadvantage of gig workers is that they are distributed across locations, working in isolation, interacting with a platform, not humans. When they are shut out or not paid for work, there is no one to complain to, no recourse.
There’s a reason it’s called “collective bargaining.” Laborers in factories band together, and their physical presence is imposing. They can also walk off the job, shutting down production and giving them leverage against ownership.
In the new digital labor ecosystem that Alexandr Wang developed and deployed, gig workers are not employees. Scale AI has zero skin in the game. “Taskers,” the cute nickname for the workers toiling at their computers to label, tag, and correct data (tedious work at best, psychologically disturbing at its worst), are nameless, faceless, and dehumanized. They are treated as necessary, but as the lowest, most inconvenient part of the data refining process before the algorithms take over. As Karen mentioned, I am digging deep into this problem and hoping to push for accountability from Big Tech. Please check out my post on this topic, the first in a planned series on this issue: <https://gettingrealaboutai.substack.com/p/labor-exploitation-at-scale>