Illustration showing hands with different skin colours connected around three pathway signs: 'Utopia' pointing to the left, 'Future' ahead, 'Dystopia' to the right.

Dr. Ruha Benjamin advocates for ‘Ustopia’, challenging tech’s harmful impacts and urging community empowerment. Catalyst’s recent research report explores Tech Justice in the UK, emphasising inclusivity, accountability and grassroots involvement. Vision: tech serving people, fostering equity, and dismantling oppressive systems.

In her latest, stirring TED Talk, the thinker, dreamer and do-er Dr. Ruha Benjamin challenges us to start working towards an ‘Ustopia’: a world in which we consciously move away from nightmarish, harmful ways of using technology against each other in service to the visions of multi-national tech companies (the few), and towards putting power into the hands of our communities (the many), enabling possibilities for human solutions to the many systemic issues we face as local and global societies.

Encountering this provocation from Dr. Benjamin of tech as a ‘saviour’ to some and a ‘slayer’ to others is timely, as we have just launched the Tech Justice research report I’ve been working on for a number of months in collaboration with a range of practitioners. Powered by Catalyst, in partnership with The Engine Room, this inquiry was undertaken by myself and an outstanding team of researchers - Nikita Shah, Quito Tsui and Rachel Arthur - with the intention of exploring whether a Tech Justice movement or landscape exists in the UK and, if so, what opportunities there might be for shifting power towards the most racially marginalised. I explored the motivations for initiating this work in this introductory blog post - after all, the personal is indeed political. And the work of thinkers, academics, practitioners and campaigners (including Ruha Benjamin) has been hugely inspirational in grounding this research and helping us to identify its key strands.

Key Learnings

Being able to share the key learnings, more insight into the process, the conversations that took place, and the feelings woven into this work at our recent online launch was a real pleasure. We invited folks who were part of the work from its very early stages, including our roundtable, through to those who took part in in-depth research interviews with Nikita to tell us more about their specific work within one or more of the report’s key themes of focus, which were:

  • Emerging Tech (including AI & Ed Tech)
  • Digital World (including the internet and social media ownership)
  • World of Work (anti-racism, intersectionality and EDI in the digital workforce; tech literacy)
  • Surveillance
  • Data
  • Networks & Communities

We were also able to connect with folks working more widely in this emerging ecosystem: some first came to the work of Tech Justice through their work on social justice, which is of course inextricably linked to questions of technology and society; and in a few cases, folks came to this work from a desire to use ‘tech for good’, soon understanding that questions of power, justice and oppression go hand-in-hand.

Across the space, when we convened as a whole group, and especially during the smaller breakout room sessions, we had stimulating and thought-provoking conversations, prompted by questions such as:

  • In what ways are self-identifying women of the Global Majority, in particular, drawing boundaries around anti-racism, emotional burden/labour, rest as resistance, and healing from being burnt out and extracted from? What role does tech play?
  • What role do funders play, if any, in shifting EDI work towards anti-racist, decolonial and equitable forms of Tech Justice at work?
  • How do we shift conversations about surveillance towards stronger consideration of people’s human rights - our right to privacy and freedom of expression - alongside freedom from discrimination of any kind online (and in turn offline)?
  • How, in our work, are we able to - and going to - ensure that community groups, grassroots organisations and members of the social sector - especially those working on the front lines, closest to where the need is - are centred, heard and resourced in the design and development of technology, especially tech that impacts them? How do we shift away from the ‘tech being done unto us’ dynamic?

Framing further conversations

Helping to further frame the conversations, Nikita challenged us to consider how we keep thinking about enabling shifts in power dynamics in the spaces we work and organise in. How do we actually use tech as the means, as a tool, and also as a framework in itself to help support the shifts in power that we need to see?

Across the breakout room explorations, folks were interested in how we action the research findings in our spheres of influence; how we make the intangible (with regards to data, for example) feel tangible; and how we can help empower our communities with vital knowledge, skills and a deep understanding of the risks that tech poses in our increasingly digitised world, whilst not feeding into hysteria and disempowering narratives. One of the biggest questions and concerns on the table was the role of funders in moving this work forward.

With all the best will in the world, our communities need resourcing to engage with the work of building alternatives and new worlds; likewise, practitioners seeking to shift us towards more life-affirming practices across the board, and campaigners working to bend the arc of truth towards justice - including Tech Justice - all need resourcing. Funders are often hesitant to move resources towards anything that feels ‘risky’. There were also frank conversations about the need to decentre funders who weren’t ready to come along on this necessary journey of unlearning, whilst simultaneously recognising the immense power they hold over our work. Can we accept that there will be many who can’t be brought along as we journey forward, because they choose to remain stuck in ‘old world’ thinking? And what do we do to sustain our work in different ways? What alternatives are there?

There was also a keen interest in figuring out how we prioritise our efforts and capacity, recognising that we don’t have the same heavyweight power and resources as multi-national tech conglomerates like Meta, and in how we use technology itself (including and especially social media platforms) to respond to contemporary events. How do we reckon with our complicity when it comes to atrocities taking place in the Congo, a mineral-rich country powering much of the world’s technology? What does the citizen journalism we see coming out of Palestine teach us about how tech can help rewrite age-old narratives and shift hearts and minds? In direct contrast, what does the distinct lack of widespread stories from on the ground in places like Sudan show us about what does and doesn’t get amplified to the masses, and the consequences of this?

We are reminded time and time again that the question of Tech Justice is not just theoretical and philosophical but powerfully material - something done as practice.

A Manifesto for a Tech Just World

In the ‘Manifesto for a more Tech Just World’, we summarised what a vision for Tech Justice looked like:

  • Technology that is accessible, enabling, empowering, available, representative, inclusive, non-extractive, owned by the people, made by designers and developers who are answerable to the people, and held accountable by the people.
  • Technology that has practical applications at the grassroots level for those working on the frontlines of social justice issues, addressing root causes such as health inequality, social housing inequality, injustice in the education system and racism in the workplace, and promoting diversity in all its breadth as an essential part of everyday life.
  • Technology that is community-owned by default, or at least has the potential for co-ownership, with the primary function of tech being in service to our communities.
  • Technology that can be reused for good. Enabling repurposing and reusing feels important given the amount of waste produced by the need to create new things constantly as a by-product of capitalism.
  • Tech Justice looks like individuals having more rights and knowledge over our data and how it’s used, with the agency and right not to be digitalised if we so wish.
  • Tech Justice looks like the most marginalised being leading voices in the design of technology and having decision-making power over its uses in their lives. As technology develops, humans will still need to have involvement and decision-making power in order to ensure ethical considerations are always front and centre in the work of technological advancement. With artificial intelligence, for example, automated decisions will involve human intervention, with a human having the final check over a decision. People will have the right to receive an explanation for an automated decision and will be able to file a complaint and appeal against decisions (see the illustrative sketch below).
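
To make that last point a little more concrete, here is a minimal, purely illustrative sketch in Python of what a ‘human in the loop’ flow could look like. None of the names here (Decision, human_sign_off and so on) come from the report - they are hypothetical. The idea is simply that the automated outcome is only ever advisory, a named person makes the final call, a plain-language explanation is stored alongside the outcome, and an appeal reopens the decision for fresh human review.

```python
from dataclasses import dataclass, field
from typing import Optional, List

# Hypothetical sketch: an automated decision that only becomes final once a
# human reviewer has checked it, with an explanation and an appeal route
# attached to every outcome. Illustrative only - not drawn from the report.

@dataclass
class Decision:
    subject_id: str                      # the person the decision is about
    automated_outcome: str               # what the automated system recommends
    explanation: str                     # plain-language reason, owed to the person
    human_reviewer: Optional[str] = None
    final_outcome: Optional[str] = None
    appeals: List[str] = field(default_factory=list)

    def human_sign_off(self, reviewer: str, outcome: str, explanation: str) -> None:
        """A named human makes the final call; the automated output is only advisory."""
        self.human_reviewer = reviewer
        self.final_outcome = outcome
        self.explanation = explanation

    def explain(self) -> str:
        """The affected person can always ask why a decision was made."""
        return self.explanation

    def appeal(self, grounds: str) -> None:
        """An appeal reopens the decision; it no longer stands until re-reviewed."""
        self.appeals.append(grounds)
        self.final_outcome = None


# Usage: the automated system proposes, a person disposes.
decision = Decision(
    subject_id="applicant-42",
    automated_outcome="reject",
    explanation="Automated screening flagged missing documents.",
)
decision.human_sign_off(
    reviewer="case-worker-A",
    outcome="approve",
    explanation="Documents were supplied by post; the automated flag was wrong.",
)
decision.appeal("I was never told which documents were missing.")
```

This is just one way such safeguards might be expressed; the point of the manifesto is the principle - explanation, human oversight and a route to appeal - not any particular implementation.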

What's next?

So, what comes next?

Roseanna Dias of Promising Trouble, who facilitated one of the breakout spaces on the day, reminded us that the time for intervention is now. Our collective relationship to tech has to move towards being life-affirming.

Naturally, just as with the research project itself, the conversations at our event left us with many questions that we’ll struggle to answer alone. And this is why one of the hopes for the work’s ongoing life is to convene some sort of Community of Practice around this, bringing together work already being done by the likes of Promising Trouble, The Engine Room, Multitudes Co-op, and many others, to ensure we all stay in community, especially those of us who are practitioners of Global Majority descent. We know from this research (and others), and more importantly from lived experience, that race is a defining axis for all questions of power and anti-oppression.

The pursuit of justice involves recognising and challenging the structural inequalities perpetuated by both historical and technological factors. It requires fostering inclusive technological innovation, holding powerful entities accountable, and engaging in meaningful conversations that consider the diverse needs and experiences of individuals and communities.

I concluded the sharing event by emphasising a core belief I hold, which is that ‘Justice is love in action’, to echo the words of many others before me. And love happens to be one of Catalyst’s core values also. 

Circling back to how we began, Dr. Benjamin ends her talk by inviting us to envision an ‘Ustopia’ - a world in which we prioritise people over profit; respect that ecological and social wellbeing go hand-in-hand; lift up our communities; and reject deadly systems as inevitable. 

Engaging with technology in a just and more liberatory manner is part of this new world-building, and is intricately connected to all questions of social justice and radically (re)imagining our ways of being with ourselves and each other.  

More from Siana at www.sianabangura.com | medium.com/@Sianaarrgh

Post-event Resources
