The Ghost Work Behind Artificial Intelligence

https://slate.com/technology/2021/06/ghost-work-fake-artificial-intelligence-skeleton-crew.html

An expert on how data and algorithms are changing work responds to Janelle Shane’s “The Skeleton Crew.”


“The Skeleton Crew” asks us to consider two questions. The first is an interesting twist on an age-old thought experiment. But the second is more complicated, because the story invites us to become aware of a very real phenomenon and to consider what, if anything, should be done about the way the world is working for some people.


The first question explores what it would mean if our machines, robots, and now artificial intelligences had feelings the way we do. (Recall the Haley Joel Osment child A.I. that was created to suffer an unending love for its human mother while society dies around it.) “The Skeleton Crew” offers an interesting twist because the A.I. indeed has feelings just like us, because it is, in fact, us: The A.I. is a group of remote workers faking the operations of a haunted house to make it seem automated and intelligent.



It’s a fun take on the trope. That the A.I. actually is real people with real feelings underscores the villainy, heroism, or oblivious indifference of other characters around them. The villains interact with the A.I. in murderous ways, and their fear of it is their ultimate downfall. The badass damsel in distress graciously thanks the A.I. for saving her life before she knows that it’s humans. The billionaire is oblivious to the actual workings of this world he’s created, whether it is shoddy A.I. or real people, and he ghosts as soon as his moneymaking is in question. Interestingly, the crowds of people who go through the haunted house seem most interested in seeing whether they can break the A.I. and prove it’s not actually intelligent (recall the Microsoft Tay release). Perhaps this represents our human bravado, wanting to prove we’re a little harder to replace than A.I. tech companies think we are.



The second question, less familiar and comfortable, is cued up when Bud Crack, the elderly Filipino remote team manager, says to his team: “I’m trying to explain things to them. What we are. They’re confused.”


Before “they”—those operating in expected, visible roles in society—can offer any kind of assistance, they need to wrap their minds around the very existence of remote workers faking the operations of A.I. In this case, the 911 operator has to get from her belief that “the House of A.I. is run by an advanced artificial intelligence” to a new understanding that there is a frantic remote worker in New Zealand who has been remotely controlling the plastic Closet Skeleton in the House of A.I. and is now the only person in the world with (remote) eyes on a dangerous situation.



This fictional moment mirrors an actual reality detailed in the award-winning book Ghost Work by Mary Gray, an anthropologist at Microsoft Research and a 2020 MacArthur Fellow, and Siddharth Suri, a computer scientist at Microsoft Research. Ghost work refers to real, in-the-flesh human beings sitting in their homes doing paid work to make A.I. systems run. Most machine learning models today use supervised learning, in which the model learns how to make correct decisions from a dataset that has been labeled by people. That labeling is the ghost work: paid, piecework tasks such as labeling images, flagging X-rated content, tagging text or audio, and proofreading, among much more. You may have done some of this data labeling work for free by completing a reCAPTCHA, identifying all the bikes or traffic lights in a photo in order to sign in to a website.
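For readers curious about what this looks like in practice, here is a minimal sketch, not drawn from the book or the story, of supervised learning on human-labeled text. It assumes the scikit-learn library and an invented toy dataset; the human-assigned labels stand in for the paid annotation work the book describes.

```python
# A minimal sketch, not from the article or the book, of supervised learning
# on human-labeled data. Assumes scikit-learn; the texts and labels below are
# invented examples standing in for paid annotation work.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training examples: short texts plus human-assigned labels
# (1 = flag as explicit, 0 = leave alone).
texts = [
    "family friendly haunted house tour",
    "explicit content not suitable for minors",
    "kids loved the animatronic skeleton",
    "graphic adult material",
]
labels = [0, 1, 0, 1]  # produced by human annotators

# Turn the texts into simple bag-of-words features.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)

# The model "learns correct decisions" only because people labeled the data.
model = LogisticRegression()
model.fit(X, labels)

# The trained model can now score new, unlabeled content.
new_text = vectorizer.transform(["spooky but harmless skeleton prop"])
print(model.predict(new_text))  # prints the predicted label, e.g. [0]
```

The only reason the model can score new, unlabeled content at the end is that people supplied the labels in the first place, which is exactly the work the term ghost work names.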

