Ghost Work, Artificial Intelligence, and Janelle Shane's "The Skeleton Crew"


An expert on how data and algorithms change work responds to Janelle Shane's "The Skeleton Crew."

"The Skeleton Crew" asks us to think about two questions. The first is an interesting variation on a centuries-old thought experiment. The second is more complicated, because the story invites us to become aware of a very real phenomenon and to consider what, if anything, should be done about the way the world works for some people.

The first question explores what it would mean if our machines, our robots, and now our artificial intelligences had feelings like ours. (Remember the Haley Joel Osment child AI, created to suffer from endless love for his human mother as society decays around him.) "The Skeleton Crew" offers an interesting twist, because the AI indeed has feelings just like ours: it is, in fact, us. The AI is a group of remote workers who simulate the operations of a haunted house to make it appear automated and intelligent.

It's a fun take on the trope. The fact that the AI is actually real people with real feelings underscores the meanness, heroism, or unconscious indifference of the other characters around them. The villains interact with the AI in deadly ways, and their fear is their ultimate downfall. The Badass Damsel in Distress graciously thanks the AI for saving her life, not yet knowing it is human. The billionaire is oblivious to how this world he created actually works, whether it runs on shoddy AI or real people, and he ghosts whenever his money is in question. Interestingly, the crowds of people walking through the haunted house seem most interested in seeing whether they can break the AI and prove it isn't actually smart (recall Microsoft's Tay). Perhaps this represents our human bravado, wanting to prove that we're a little harder to replace than AI tech companies think.

The second question, less familiar and less comfortable, comes into focus when Bud Crack, the team's elderly Filipino remote manager, tells his team, "I'm trying to explain things to them. What we are. They are confused."

Before "they" – those who play expected and visible roles in society – can offer any kind of assistance, they must first grasp the very existence of remote workers simulating AI operations. In this case, the 911 operator must move from the belief that "the AI House is run by advanced artificial intelligence" to a new understanding that a frantic remote worker in New Zealand controls the plastic closet skeleton in the AI House from a distance and is now the only person in the world with (distant) eyes on a dangerous situation.

This fictional moment reflects a real phenomenon detailed in the award-winning book Ghost Work by Mary Gray, an anthropologist at Microsoft Research and 2020 MacArthur Fellow, and Siddharth Suri, a computer scientist at Microsoft Research. Ghost work refers to real, flesh-and-blood human beings sitting in their homes and doing paid work to keep AI systems running. Most machine learning models today use supervised learning, in which the model learns to make the right decisions from a dataset that has been labeled by people. Ghost work is the piecework data labeling that humans do so models can learn those decisions: tagging images, flagging X-rated content, labeling text or audio, proofreading, and much more. You may have done some of this data labeling work for free by completing a reCAPTCHA ("identify all bikes or traffic lights in this photo") in order to log in to various websites.
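The relationship between human labels and model decisions can be sketched in a few lines of code. This is a deliberately minimal illustration, not how any production system works: the data, feature values, and 1-nearest-neighbor "model" are all invented for the example, standing in for the millions of human-labeled examples and far more complex models used in practice.

```python
# Each training example pairs some features with a label that a human
# annotator provided -- e.g., whether an image patch shows a traffic light.
# The feature values here are made up for illustration.
labeled_data = [
    ((0.9, 0.1), "traffic_light"),
    ((0.8, 0.2), "traffic_light"),
    ((0.1, 0.9), "bike"),
    ((0.2, 0.8), "bike"),
]

def predict(features, training_data):
    """A toy 'model': return the label of the closest human-labeled example
    (1-nearest-neighbor by squared Euclidean distance)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    closest = min(training_data, key=lambda example: sq_dist(example[0], features))
    return closest[1]

print(predict((0.85, 0.15), labeled_data))  # -> traffic_light
```

The point of the sketch is simply that the model's "intelligence" is downstream of the human labels: change what the annotators tagged, and the predictions change with them.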

A decade of academic research on this subject offers a way to understand these working conditions, as well as the experiences of people entering and leaving these platforms. Three themes connect to "The Skeleton Crew" and shed light on this work experience.

First, many (but not all) of these workplaces are subject to "algorithmic management," which includes functions such as automated hiring and firing, as well as gamified performance reviews with scores linked to wages. Here in Silicon Valley, these automated management functions are designed to enable "scalability," because human supervisors or evaluators are no longer needed. In "The Skeleton Crew," the automated management functions that monitored workers' success in scaring visitors, among other things, had "a hostility that was matched only by [their] deep stupidity." In the story, as in many real settings, people working on these platforms face automatic termination without recourse, which is particularly cruel. I suspect many of us have encountered some form of "algorithmic cruelty": getting locked out of an online account, or getting ripped off by a fake flower website, with no recourse, no phone number to call, no one to talk to. Now imagine your income and livelihood being subject to such automated systems and dehumanizing responses. Or, per Hatim Rahman's research, imagine losing income and professional status on an automated platform for reasons that will never be explained to you and that indeed seem intentionally opaque. "The Skeleton Crew" suggests such shoddy systems and dehumanizing treatment are completely unnecessary and almost baffling, perhaps existing because the billionaire's company needed to pretend there were no humans running the system. Real-world examples of people or companies simulating AI operations are strange, but not uncommon. A New Zealand company appears to have faked a digital AI assistant for doctors, with absurd interfaces like clients needing to email the AI system. The founders berated a questioning journalist for choosing "not to believe." But companies don't have to make explicitly false claims about AI to engage in ghost work.
Some academics and activists, including Lilly Irani, have argued that many human-in-the-loop automated systems, such as Amazon Mechanical Turk, rely on the invisibility of the people involved, because it makes the technology appear more advanced and self-sufficient than it actually is, with rhetoric and system design working to keep that human labor invisible.

Second, despite some of these cultural conditions and system designs, these workplaces, like almost all workplaces, are collaborative, social, and meaningful. Take, for example, Uber and Lyft drivers banding together to game algorithmically managed pricing. Such collaboration is common, even on purposely individualistic crowd platforms. Another research article by Gray and Suri showed the collaborative network created by people working on Amazon Mechanical Turk, where "crowdworkers" cooperated to secure top wages and create social connections (see also the Turkopticon system). Likewise, the Skeleton Crew actively collaborated to create workarounds within their dysfunctional system, including dividing up Closet Skeleton shifts because the gamified "Scare-O-Meter" was so bad for that role that it usually meant no pay (until one of them realized the Scare-O-Meter registered a mop as a scared human face, and the whole team could again be paid for Closet Skeleton shifts). Because of their collaboration and the way they had worked their way through this bizarre system together, it would have been catastrophic to lose a colleague, even in what might appear to be individualistic jobs.

Third, "The Skeleton Crew" offers a glimpse into these worlds of work through its vivid examples of how humans are incredibly good at improvising and developing situated expertise, abilities that remain difficult for automated systems. Lucy Suchman and her colleagues have written several books and articles analyzing human improvisation and situated expertise, and "The Skeleton Crew" illustrates these ideas in fun detail: Cheesella knows she can eject one of her cheap plastic skeleton hands to distract the baddies, and she thinks to sound the fire alarm when she realizes her remote-controlled skeleton has no way of communicating with the people in the room. The Skeleton Crew's understanding of their idiosyncratic context, and the collaborative improvisation it took for its members to expertly use that setting to thwart the attack, provide a fun and realistic take on how groups of people work together, even when completely remote and mediated by virtual communication.

These themes of the story give us insight into these working conditions and prompt us to reflect on the second, more complicated question the story asks us to consider: As society begins to better see and understand the potential cruelties of ghost work conditions, is there anything that can be done? Gray sometimes compares the present moment to when society began to truly understand the realities of child labor and the urgent need for more protective laws. She argues that what is needed is regulation, especially regulation that recognizes a new "form of employment which does not correspond to full-time or part-time or even clearly self-employment." Such regulation means getting the new employment classification right, and also securing the necessary provisions and benefits for all kinds of relevant work, even as technologies, jobs, and employment statuses change. Lobbying for this new job classification, and the associated provisions and regulations, requires us to see working conditions that have not been readily visible, and requires companies to recognize that these are not just temporary working conditions "in the process of being automated" and to take action. Hopefully "The Skeleton Crew" helps start, or continue, this awareness and conversation.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
