Dark Content (2015–ongoing), the centrepiece of the show, probes the Internet’s most secret corners by looking closely at the largely anonymous labour force of content moderators that has emerged with the rise of social media. Content moderators are, in essence, the gatekeepers of any material accessed online. Their role is to screen all uploads according to specific corporate standards of propriety: making sure beheadings and gang rapes don’t crop up in our Facebook news feeds, and nipples don’t surprise us on Instagram. Along with the major corporations that employ them, these de facto patrols of the Internet shape our daily existence. With each ‘yes’ or ‘no’, they edit the experience of our culture in the name of many different forces: political, moral, ethical, even religious. Once the offending material has been removed, it’s virtually impossible to know it was ever there.
Content moderation is, for the most part, a very discreet service, as the companies that hire this workforce seek to present themselves as free and transparent tools of self-expression. The moderators in turn often hide their work from their friends and families, preferring not to acknowledge the kinds of things they are required to witness every day. Over a period of more than a year, the Mattes interviewed a number of them about their work. These conversations developed into a series of video episodes – first released on the Darknet – with the interviewees’ identities withheld through the use of voice software and stock avatars, a further distancing from both the person and the content itself. Each video is displayed in sculptural surrounds made from mass-produced office furniture, alluding to the home offices and identikit cubicles around the world where this work often takes place. Alongside, a new series of wall-mounted insulation panels is printed with corporate moderation guidelines that were leaked to the artists in the course of their investigations. They serve as a reminder that this act of filtering and ‘safeguarding’ is always a reflection of the ideology of the organisation commissioning it.
By Everyone For No One Every Day (BEFNOED, 2014) also calls attention to a new form of labour in the digital age. For this series, the artists used a range of crowdsourcing websites to commission anonymous workers to realise a variety of absurd performances for webcam, laying out instructions for them to follow and interpret. Once received, the videos are dispersed across obscure, peripheral or forgotten social networks around the world. While in Dark Content the artists act as a sort of confidant, here their position is more complex, as their unusual demands appear to both entertain and exploit their performers. In the gallery, the monitors are positioned so that, in order to watch the work, viewers are forced into a series of physically awkward and bizarre positions, in a sense taking on the role of performers themselves.
In much the same way that the performances in BEFNOED are crowdsourced, the production of the series Image Search Result (2014–ongoing) is entirely outsourced to the Internet. For each work on display, the artists have selected a search term from their personal browsing history during their preparation for this exhibition. Adopting an idea purchased from a fellow artist, they take the first image a search engine yields for that term and have it printed on a range of objects by online custom printing services. Once produced, the objects are delivered directly to the exhibition venue, where they are unpacked, exhibited and seen by the artists for the first time.
Eva and Franco Mattes share a long-held fascination with the invisible. In Emily’s Video (2012), they present a disturbing video sourced on the Darknet only through the reactions of its viewers. They have also programmed a computer virus (Biennale.py, 2001) and co-curated an inaccessible exhibition in the Fukushima exclusion zone (Don’t Follow The Wind, 2015). For the New York-based duo, what’s concealed is often what matters most.