When I look in the mirror I don't see a man's face or a woman's face; what I see is just me, my face. Many people probably read me as a man when I'm out in the street, but I am non-binary and my face does not come with a binary gender attached. Gender is an embodied construct that I get to establish for myself. Everyone tends to make judgements about other people's gender based on presentation, stereotypes, social context, cultural background and so on, but the ultimate and truest call on anybody's gender is, well, theirs to make. Faces in general are not gendered a priori; we gender them only because we know, or assume we know, the gender of the humans they belong to.
A very simple concept, if one knows what gender is. And yet it's also often a difficult one to grasp in a mostly cis-normative culture: to get it, to have been informed enough, to have come across enough experiences and people to even understand it, requires some labor.
It's easy to find flaws in the many ways gender is encoded in computational systems, and data science often does not play well with queerness. But there's an extra level of concern for me when I come across algorithms that claim to automatically infer it, guess it, predict it based on biometric traits. Automatic Gender Recognition is the name of the field preoccupied with this kind of task. And it's flawed from inception. As researcher Os Keyes puts it, it's based on the misunderstanding that gender is something immutable that can be inferred entirely from physiological features. No one can feel the utter wrongness of that under their skin better than a trans person.
I've spent years working with technology and witnessing all sorts of uncomputable things become outputs of some machine learning algorithm. Gender is the one that gets me most deeply, because it took me so many years to free myself from the rigid gender categorisation European culture had taught me. When an algorithm tries to impose one on me, it feels like violence. And I am white and live in a rich country; I am not even among those worst impacted by this kind of technology.
The proclivity for pretending computers can answer questions that should never have been asked of machines in the first place is a pattern we are getting used to when it comes to algorithms and technologies under the label of "Artificial Intelligence" (read: machine learning). As Kate Crawford writes in "Atlas of AI":
"What epistemological violence is necessary to make the world readable to a machine learning system? AI seeks to systematize the unsystematizable, formalize the social, and convert an infinitely complex and changing universe into a Linnaean order of machine-readable tables. Many of AI’s achievements have depended on boiling things down to a terse set of formalisms based on proxies: identifying and naming some features while ignoring or obscuring countless others. To adapt a phrase from philosopher Babette Babich, machine learning exploits what it does know to predict what it does not know: a game of repeated approximations. Datasets are also proxies—stand-ins for what they claim to measure. Put simply, this is transmuting difference into computable sameness."
Marginalized people often fall through the cracks of algorithmic systems that reproduce the violence of the political context they are brewed in, and trans people belong to this category. The step from prediction to prescription is short. And 'diversity' and 'inclusion' are not magical fixes that can mend the damage done by ill-defined questions.
Some technologies, like Automatic Gender Recognition (AGR), should never have existed in the first place, and their abolition is now being called for. I quite like this approach to escaping the matrix of domination (a concept Prof. Sasha Costanza-Chock borrows from Patricia Hill Collins) that technology can perpetuate: refusal, the ability to push back and just say no, this thing we won't have. And refusal goes beyond absurd things like AGR; it applies to any kind of technology impacting socio-technical systems.
As much as visibility is important for trans and non-binary people in different circumstances, it's a double-edged sword: I do not want to be included in algorithmic systems of oppression, and I don't want to be intelligible to machines that are bound to reproduce patterns of harm and trauma.
Inspirations in this direction are numerous, from the Feminist Data Manifest-No, to the Refusal Conference held in 2020, to Transmediale's year-long 2021 festival "for refusal".
My favorite piece of writing on the strategy, perils, and labor of refusal (as a framework against oppression, way beyond technology) is No by Sara Ahmed.
On AI and gender specifically, what can refusal mean in practice, beyond the outright abolition of a piece of technology, when abolition is not possible? The book Design Justice by Sasha Costanza-Chock is a great starting point, as an "exploration of how design might be led by marginalized communities, dismantle structural inequality, and advance collective liberation and ecological survival".
I also appreciate projects that try to resist harmful technology with counter-surveillance practices, like spoofing and adversarial techniques. The Dragging AI workshop is an example, and a lovely mix of art and trans excellence, reminiscent of Zach Blas' Facial Weaponization Suite. Unsurprisingly, art is often a field where the exploration of AI technology from a critical/embodied perspective can lead to very interesting results.
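To make "adversarial techniques" a bit more concrete, here is a minimal sketch of the fast gradient sign method (Goodfellow et al.), one of the simplest ways to perturb an image so that a classifier misreads it. The PyTorch `model`, `image`, and `true_label` names are hypothetical stand-ins for illustration, not code from any of the projects above.

```python
# A minimal FGSM sketch: nudge an image in the direction that increases a
# classifier's loss, degrading its prediction. `model` is assumed to be any
# PyTorch classifier returning logits for images with pixel values in [0, 1].
import torch
import torch.nn.functional as F

def fgsm_perturb(model: torch.nn.Module,
                 image: torch.Tensor,      # shape (C, H, W), values in [0, 1]
                 true_label: int,
                 epsilon: float = 0.03) -> torch.Tensor:
    image = image.clone().detach().requires_grad_(True)
    logits = model(image.unsqueeze(0))                 # add a batch dimension
    loss = F.cross_entropy(logits, torch.tensor([true_label]))
    loss.backward()                                    # gradient w.r.t. the pixels
    # One step along the sign of the gradient maximally increases the loss
    # for a given maximum per-pixel change (epsilon).
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()          # stay a valid image
```

Real-world spoofing, like the makeup and accessories explored in Dragging AI or the collective masks of the Facial Weaponization Suite, has to work on a physical face in front of a camera, which is far harder than perturbing pixels; the sketch only shows the underlying idea.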
One of my current favorite artists is Everest Pipkin, and I dearly recommend their On Lacework: watching an entire machine-learning dataset and Shellsong, where they use deep-fake voice technology to explore "what a voice is worth, who can own a human sound, and how it feels to come face to face with a ghost of your body that may yet come to outlive you".