AI needs your humanities skills
Harvesting and plagiarizing the vast resources of digital content on the Internet isn’t enough; AI companies are recruiting experts in the humanities to help train their systems.
There’s a long history of knowledge and skills being taken from a group of people for cynical purposes and ultimately being used against the very people from whom they were taken. For example, rice cultivation involved a special kind of knowledge held by certain cultures of West Africa, particularly the women, which was especially sought after by slave traders—knowledge which rice plantation owners depended on and which then became a target of brutal exploitation.[1] The practice of inoculation was first introduced in North America by an African slave, a practice that was later experimented on slaves as well as used on them to maximize profits. George Washington had the Continental Army soldiers inoculated to protect them from smallpox, a move that was likely decisive in the victory of the American Revolution—a move which, again, depended upon the knowledge of African slaves.[2] A medicine (quinine) useful for treating malaria, developed by indigenous peoples in Peru using the bark of a tree (cinchona) native to the region, was highly sought after by European colonizers—who weren’t satisfied with receiving the medicine via export and instead sought all sorts of ways to smuggle the seeds of the tree out of the country, eventually succeeding via an illegal bribe.[3] The scientific knowledge and methods of craftspeople, essential to the origins of the Scientific Revolution, would later be incorporated into the factory system of the Industrial Revolution, in an economy that eventually turned craftspeople into dependent wage workers in factories.[4] Today, the critical knowledge of experts in the humanities is being sought by AI companies, and I assure you, the results won’t be for the benefit of these experts—not so long as these AI systems remain in private hands, and not so long as we live in a society that enables the expulsion of workers into unemployment via the introduction of new automated technology.
Virtually nobody enters the humanities with making money as their number one priority. Folks primarily motivated by money will be found in business school. Rather, we enter the humanities because we’re passionate, curious, and creative—motivations and skillsets that are generally shunned and devalued in capitalist society. Sure, some of us have fantasized about fame or fortune from writing the next Great Novel, or just had the foresight to dream of a life free from the drudgery of working in a cubicle, but we all know there are far better routes to riches than poetry if that’s one’s motivation. Despite social and familial pressures, despite living in an economy that prioritizes exchange value above all else, salability over usefulness or spiritual or intellectual enrichment, students of the humanities persevere by staying close to their creative passions and intellectual curiosities. And yet, after being told that they’ve made the wrong choice, that they’ll never get a job, that “there’s no money in [x],” after enduring life in a society thoroughly hostile to their critical thinking skills, humanities experts now find AI companies seeking them out (among experts in other fields) precisely for those critical thinking skills. That’s right: the technology being hyped as the pinnacle technology of the future, the technology venture capitalists are investing grotesque amounts of money into, appears to depend on a set of skills we’ve so long been taught aren’t valuable.
Humanity’s Last Exam?
Since the beginning of this year, I think I’ve been targeted by 4 or 5 sponsored ads on LinkedIn from companies aiming to recruit me to help train AI. At first they were aimed at people with expertise in writing, but the last two have been more specific: one for philosophy, one for humanities M.A.s or PhDs. Here’s one I received last week, inviting me to participate in the rather ominously titled “Humanity’s Last Exam, a rigorous new way to test AI’s intelligence and reasoning”:
Hi Kevin,
My name is [x], and your expertise in humanities is exactly what we need for our research at Outlier (a company owned & operated by Scale AI).
Our Mission: we're challenging AI with Humanity's Last Exam, a rigorous new way to test AI's intelligence and reasoning. To do so, we need experts with Master's or PhDs in Humanities to create high-level problems that AI struggles to solve—in topics like Psychology, Sociology, Literature, Fine Arts, Music, & History.
When you click on the link in the ad, you learn that their program is used to “accelerate AI development at leading enterprises,” including Microsoft, Eureka Labs, Cisco, and TIME. They hype you up a bit, make you feel smart, valued, important even: “Outsmart Artificial Intelligence with Your Human Brilliance,” they say. Like it’s a game, a chess match, and they need you to intellectually wrestle their relatively weak AI into submission. Don’t worry, nothing economic at stake here! Oh, it’s also exclusive and elite: “Join the top 1% of scholars in shaping the future of AI.” They must’ve mistaken us humanities people for those business school people who believe they’ll become part of the economic “1%.”
Well, at least for those highly prized skills, essential for advancing their technology, they must be offering top dollar, right? Full-time with benefits? An offer you couldn’t refuse even if you were already living relatively comfortably? They’re offering “up to $118 per submission (contributed problem & solution pair)” and “Unlimited submissions - high earning potential.” Ok, what does that come out to per hour? How many hours are people putting into a submission? Don’t worry about that! You also get the cool knowledge that you’re outsmarting AI! Well, for now, at least… That is, until they decide they no longer need you. Then what?
Here's a “problem and solution pair” for you
Despite my revulsion at the prospect, despite the cynical and dystopian nature of the pitch, the elephant in the room is that the job market is full of critical thinkers who are no doubt compelled to seriously consider such an offer because of how little work there already is, not least due to the reckless introduction of AI into various industries. And precisely because of the critical skills they have, many are likely aware of what’s going on here and may still feel compelled to accept.
I can sympathize on an individual level with someone accepting the deal because they have virtually no other options. However, they should know that they’re participating in a historically significant process of appropriation. I think we should not aid the AI companies in subsuming our already limited lines of work; rather, we should resist. I’m not against the advance of technology, or even of AI in particular, but I am against the ways in which automation is used in capitalist production—ways that regularly undercut the livelihoods of large swaths of people while providing them no safety net, no alternative. You’re left to play musical chairs against everyone else on the market.
This is all part of the long process of the accumulation of capital, a process of appropriation which, in the modern capitalist era, extends all the way back to the enclosures of the commons in the U.K. And just as the people never benefited from the commons that were taken from them, humanities experts won’t personally benefit from the commodification of their skills in AI technology—until they take it back. As an English folk poem, circa 1764, goes:
They hang the man and flog the woman
That steal the goose from off the Common,
But let the great villain loose
That steals the Common from the goose.
The law demands that we atone
When we take things we do not own.
But leaves the Lords and Ladies fine
Who take things that are yours and mine.
The law locks up the man or woman
Who steals the goose from off the Common,
And geese will still a Common lack
‘Til they go and steal it back.
Notes
[1]. Clifford D. Conner, A People’s History of Science (New York: Nation Books, 2005), 90-3.
[2]. Ibid., 102-5.
[3]. Ibid., 95-7.
[4]. Ibid., 349-50.