Developing the Not-Equal agenda - workshops

Workshops aimed at those interested in Big Data, Artificial Intelligence and Algorithmic Service Design

big data, artificial intelligence, algorithms

Co-investigators

Swansea University

Project Type

Tool

These workshops were aimed at those interested in aspects of social justice and fairness affected by developments in Big Data, Artificial Intelligence and Algorithmic Service Design in public services and the Sharing Economy. They provided an occasion to help shape Not-Equal’s agenda and activities.

We invited researchers from a variety of disciplines (social sciences, engineering, design, arts and humanities, computer science, law and business) to come together and share perspectives on issues of social justice in the design and application of new and emerging digital technologies within the three challenge areas the network has identified.  

The Sharing Perspectives, Exploring Responses workshop

This three-hour workshop took place in the Urban Sciences Building at Newcastle University. Participants worked in groups to explore and reflect on issues affecting social justice in the digital economy, and to identify problems and solutions for a specific scenario.

For example, a group might look at the use of health data: asking how it could affect people, thinking about what would happen when that data was in the hands of different actors, from hospitals to insurance companies, and considering the long-term consequences of such a system.

There were lots of ideas and thoughts floating around the room, from trying to hack the gig economy to asking whether data gathered in an unjust world can ever be free of bias.

Outputs from the day were shared with participants and used to shape and influence the agenda for the Network+.

Bits Leak Out workshop

This workshop took place at the Computational Foundry, Swansea University. Some technology has obvious and intentional social implications: election rigging, surveillance, cyber-weapons. It is also clear that the increasing dependence on digital access to government, welfare, health and commerce may shut out those without the money to afford technology or the skills to use it.

However, even when technology is apparently well-intentioned and the rules appear unbiased, it may still have a differential impact on the vulnerable. Computers are not natural black boxes: the low-level details of algorithms leak out.

In this event, interested participants shared current research on the ways digital infrastructures affect social justice, explored what transdisciplinary responses may be required for technology to support social justice, and influenced the agenda and funding process of Not-Equal.

© Not-Equal.tech 2025