The need to dream beyond becomes necessary when we recognise that some concepts [like AI] have no revolutionary or liberatory potential insofar as they are inherently welded to infrastructures of domination, extraction and oppression.
Much of my policy work seeks to contest, not improve, inherently harmful technologies, in particular when they form part of broader ecosystems of domination. And yet, that becomes incredibly difficult, if not impossible, when advocating to institutions that themselves seek to catalyse tech-fuelled surveillance and discrimination.
Without meeting face to face, Buse [Çetin] and I have, in our many conversations through a screen, been discussing how far it is possible, through advocacy in the digital field, to dismantle the systemic ways technology is used to harm, dominate and exclude people at the margins.
Often we end up discussing the work of those we know and admire as a way to find answers to these questions. This piece extends those conversations, using Buse and Nushin’s proposed citational practice – centering in the piece the work of those you cite, going deeper than a passing reference by actively featuring their works. I like this because it assumes and requires a personal connection with those whose work you want to share.
Since moving into digital activism, my thinking has been shaped by my encounters with critical feminists researching, organising and contesting the ways technologies – and the structures they are in service to – harm us.
I suppose that’s why the carrier bag became a kind of rambling love letter to black and brown queer and disabled feminists, dreaming beyond AI. Catchy.
A resounding feature of policy work and tech activism is an optimistic attitude to what technology can achieve. As such, many of the policy demands in the face of ‘AI harms’ have been greater transparency, accountability, and tech-based solutions to harmful technological practices.
This has been incredibly frustrating for those of us who came to this work from broader social justice struggles. Inherently sceptical of how far ‘technical checks’ and transparency could achieve justice, we looked to conceptualise how technologies are necessarily woven into broader structures of oppression: racial capitalism, border industrial complexes, queer- and transphobia.
As a result I think that those best equipped to practice dreaming beyond are those engaged in broader struggles. Those who can properly contextualise technological harms, putting them in their place:
“A picture starts to emerge, in which a continuum of data-driven surveillance is built which serves the interest of both states and tech corporations, and mutually reinforces their powers. As over-policed, criminalized and marginalized communities are disproportionately targeted by the tools and policies of control that these technologies serve, the programmable infrastructures of surveillance are in fact tools for institutional racism. This is not surveillance capitalism, but rather a racialised digital capitalism that boosts technologies of discriminatory policing. It is not ‘public’ versus ‘private’ actors, but rather the powerful versus the marginalised.”
This is how Esra Özkan and Sanne Stevens approach the issue in the context of their work for the Justice, Equity and Technology Table. They show exactly why dreaming beyond is necessary, because it is so much more than a question of better design, of fixing the technologies. Technology can’t save us when it works for those structures. The only way out is resistance: putting the needs of the marginalised at the centre, and for us to take the right to review, revoke and refuse.
But, in the policymaking space, refusal, revoking and dismantling often doesn’t go down well. Advocating for certain technologies to be banned because they are structurally racist is often contested as unrealistic and ‘anti-innovation.’
In ‘Who gets to write the future?’ Erinma Ochu, Caroline Ward and I discussed both the limits and the potential of policymaking. Part of the issue we identified was that policymaking is often ill-suited to radical transformative change because of its commitment to discipline in multiple senses. Siloing policy areas – separating, for example, ‘equality’ and ‘digital’ policy in institutional structures – is a major barrier to addressing concerns at those intersections, and to the need to dream beyond existing governing structures. Here are a few snippets from our conversation:
Caroline: [H]ow do you bring imagination into policymaking? … we were wondering, how do those that are minoritised become centred in this process, how can we “theorize from the borders,” evoke “border thinking” to come into play. There are the power forces of imperialism that govern what counts and what gets pushed to the side.
Erinma: So it’s the failure of imagination of the policymakers and where they’re drawing their evidence and practices from? Just like the plantation thinking of siloed disciplines? The disciplinariness of the policy space cuts out everyday life in a spectacular fashion… so how do we bring it back in? How does the policymaker embrace the everyday?
Sarah: I’d really like policy to be more about imagining. I often question how far transformative work is possible within current policymaking circles, or whether this space is only capable of engendering different futures through reform, step-by-step change... Often ‘policy’ is dictated by what is feasible rather than what is necessary. Resigning ourselves to the slowness of progressive change, or limiting our demands so as not to be perceived as ‘too radical’, always felt convenient for those who have the luxury of waiting patiently for change. I’ve never really liked this.
Rarely is dreaming considered practice. More often than not, dreaming is dismissed in a policymaking space. Framed in opposition to “practicality” (the policymakers’ virtue), dreaming is a discouraged activity, threatening conversations and ideas that are ‘unrealistic’ and ‘out of scope’.
Often I see the work of activists dismissed as unrealistic by those holding power. The closer you are to governing institutions, the more you are co-opted to defend them. In the course of this defence, you are also required to practice a dismissal of radical ideas and concepts, especially those that challenge the structures you are upholding.
More than simply envisioning a utopian future, dreaming requires a detailed understanding – and feeling – of how the structures of oppression currently work, before we even start to untangle the strands and imagine how things could be different. Far from being unrealistic, dreaming is the essence of pragmatism: the proposal that how things work doesn’t actually work at all, not for us anyway.
Octavia Butler’s work is the best example of this. In Parable of the Sower and Parable of the Talents, Butler extrapolated her present world’s racism, capitalism and division to dream up a dystopian future of company towns, extreme class inequality and completely malfunctioning social systems including a redundant police force.
Technologies of oppression feature strongly. When communities are attacked by fundamentalist Christian paramilitaries in Parable of the Talents, digital collars are used to enslave people, inflicting pain at any sign of defiance against those controlling the technology. Octavia was dreaming of the full extent of harms that can come when ‘innovation’ is simply a project of domination in disguise.
I recognise similar practices of dreaming visionary futures in the work of Irene Fubara Manuel. In their video game Dreams of Disguise, Irene plays with the lack of agency experienced by those crossing borders. Bordering is increasingly digital, and increasingly invasive and evasive in its processes. Yet, in the game, Irene dreams up an alternative space of freedom and autonomy outside of those borders, one the player can strive for.
Exploring how technologies function within systems of oppression – now and in worse futures – is part of a practice necessary to dream beyond AI as it is instrumentalised now: as a tool of surveillance, categorisation, ordering and extraction.
Abolitionist thinking has emphasised, though, that visionary work must be more than dismantling; it’s also about building anew – developing counter-structures that model community-centred ideas of justice.
"An abolitionist vision means that we must build models today that can represent how we want to live in the future. It means developing practical strategies for taking small steps that move us toward making our dreams real and that lead us all to believe that things really could be different.”
Critical Resistance, ‘Not So Common Language’. Available at: http://criticalresistance.org/about/not-so-common-language/
I am always amazed at the creative approaches to tech justice activism that are out there. Yasmine Boudiaf, in her AI Justice Matrix, plays with power, epistemic justice and knowledge formation. The matrix is a space for collaborative gathering of information on the issue of AI ethics, mirroring in its very methodology the tech justice principles it discusses in content.
AI Justice Matrix (2021). Available at: https://aijusticematrix.com/ (Accessed: 7 November 2021).
Working with Laurence Meyer, I am constantly inspired by their commitment to building resilient spaces for tech activism that centre marginalised communities. Coordinating the Digital Rights for All programme, Laurence articulates beautifully the difference between tokenistic inclusion and centering.
“Very often the lived experiences and resistances of racialised, poor, migrant and queer communities facing adverse conditions are used as example instead of expertise.”
In these cases, we are dreaming beyond current, extractive modes of organising as well as extractive technologies. We can learn to dream beyond by constantly seeking to dismantle and divest from the oppression in our current realities, and instead build anew.
I’m grateful to all of the black and brown, queer and disabled feminists, with whom this is already in motion.