[Senne] is thrilled to present Algorithmic Bias, a group exhibition featuring new and previous works from 13 international artists, including Zach Blas, Joy Buolamwini, Bob Bicknell-Knight, Jacob Ciocci, Ami Clarke, Heather Dewey-Hagborg, Stephanie Dinkins, Ben Grosser, Joel Holmberg, Esther Hovers, Claire Jervert, JODI and Lynn Hershman Leeson.
Curated by Bob Bicknell-Knight
Algorithmic Bias is an exhibition concerned with the systems and structures embedded within the internet of things, many of which were, and continue to be, created with an in-built bias. Algorithms, written by anonymous coders, have become a common tool in the framework of social media platforms, reinforcing social biases of race, gender, sexuality, and ethnicity. The works in the exhibition explore and critique the procedurally generated invisible rules that control our online and offline lives.
In a newly built city, glittering in the early morning light of a dying sun, I weave in between crowds of tourists. Every day more of them appear, transfixed by the appearance of the citadel, an incredibly tall and thin building that stretches high into the sky.
The citadel was built over the course of a year, alongside the city that surrounded it. Constructed swiftly and primarily from concrete, it quickly depleted the world’s supply of sand.
Small, incredibly conspicuous, permanent surveillance towers had been built on every street. Some saw them as overt forms of hostile architecture, restricting any and all acts of rebellion. Others, who lived and worked in the citadel, believed that the towers and their embedded technologies helped the city’s civilians live happier, healthier lives.
You see, each tower was equipped with high-tech CCTV equipment, connected to a complex algorithm that detects abnormal behaviour within an observed environment. These vision-based surveillance techniques detect and classify intrusion, loitering, and violence, among other, more militant, behaviours.
I used to wonder what those terms actually meant, in the world of the real, detached from their original spawn points: overtly clinical boardrooms and glass office spaces. I guess a more important question might be who, then, is tasked with allocating examples of such behaviour? Examples that will train the algorithm to detect these occurrences swiftly and reliably, continuing to keep the city safe.
Algorithms, and the process of machine learning in particular, require a huge amount of data, labelled and tagged. To identify certain intrusive or violent behaviour, machines need to have their vision refined, being fed massive quantities of manually annotated imagery to efficiently identify patterns. For some time, low-paid human beings have been used, crowdsourced through companies that employ a global workforce to annotate endless datasets. These growing collections of data are then used by companies in platforms and programmes around the world.
Some of these temporary employees live on the outskirts of the city, housed in vast, hive-like structures buried beneath the ground. They, like you and me, live and work every day of their lives, toiling under the weight of the world whilst dreaming of avarice and the pure potentiality of what might have been.
Who, then, can be trusted to teach the machines, when human beings are both born with, and taught to have, an in-built bias? We are intricate creatures, living highly charged, emotional lives. These workers are merely one layer of an increasingly complicated stack, filled with interlocking grids, connecting digital and physical beings to create a cohesive, evolving, cybernetic organism.
Generally, I keep my head down when near the citadel, afraid of what worlds reside within this fortress-like apparition. It is, as they say, better to be a toad hidden under a stone than a butterfly crushed beneath it.
Unlike citizens of the city, tourists are permitted to come and go as they please, unobstructed by a complex social credit system that prevents many from leaving this concrete abyss. As a citizen, you are tracked and recorded in everything you do, from the purchases you make to reprimands at work. The local government works with public and private companies to maintain a constantly shifting database. Residents are ranked on a varied scale, determining whether they’re able to eat at certain restaurants, socialise in public, or travel outside of the city.
There have been times in the past when I’ve been flagged by the system, mistaken for someone else of similar age, stature, or race. It has affected my overall ranking, but if I continue to work hard, and prove that I’m a happy cog, I’ll be able to raise my score and escape this city.