By Lisa May Thomas and Debbie Watson
Every day, vast amounts of data are being collected, analysed and used to train AI on a scale we have never experienced before. Right now, decisions are being made by big businesses and government that will determine how and by whom this data can be used.
But this data impacts all of us; we need to make these decisions together.
These are decisions about what data gets collected; decisions about how we are grouped and compared, about how data is processed and used to teach AI systems to analyse and predict our behaviour. To make assumptions about us.
Data-driven systems now underpin our lives. It’s not just about us as individuals. It’s not in the cloud. Data is everywhere. Data is collective. Data is powerful.
So are we.
We need to decide together how this data about all of us should be used, now and in the future.
We are all connected by data.
Caring for Futures
These words, spoken in a short video produced by the not-for-profit organisation ‘Connected by Data’, underpin the essential message of our project Caring for Futures.
Throughout 2024 we worked with groups of young people across Bristol to explore what data means to them and to ask questions about AI systems and the futures this technology might bring about.
We wanted to understand what the young people we engaged with thought about AI, and its take-up and use in our world today.
Working with small groups of young people, we thought critically about bias in AI systems and the effects of these biases on people and communities, as well as exploring the creative potential of this technological medium.
The workshops
Between May and December 2024, the Caring for Futures research team at the Centre for Sociodigital Futures ran a series of creative workshops with young people aged 11–17 from four different community organisations across Bristol.
Each workshop was delivered by academics and artists and considered a range of ways that AI features in young lives. Through a six-session programme, the young people explored data journeys and datafication in their bodies and in VR; used different Large Language Models (LLMs) and made holograms; designed their own games and considered algorithmic bias and racism; developed branching narratives which they filmed in 360; considered the futures of care and AI; became co-researchers investigating surveillance technology; and created their own AI manifesto.
Each of the groups, which ranged in size from 5 to 15 individuals, was very different. A mix of ages, backgrounds and genders were represented. Some of the groups were meeting each other for the first time; others were existing groups and came to the sessions as a community with strong relationships. All of them brought interesting and diverse ideas, thoughts and opinions.
This method of engaging with young people has shown us that creative approaches to understanding data and AI can raise important questions and help young people understand data privacy in new ways.
For example, the use of both physical and digital methods was key in helping participants to explore and understand sociodigital futures. We explored the idea of ‘datafied bodies’ by mapping the young people’s bodies onto paper (by drawing around their bodies) and adding post-it notes inside their paper avatars naming the technologies, apps or platforms through which they give or share their data.
[Datafied bodies created by the young people, displayed on the wall at KWMC]
Outside the bodies, we placed post-it notes for the companies that own the technologies, apps and platforms, adding strings to trace the journeys and networks created between them and other bodies. Using a drawing app in VR, the young people then explored ways to imagine and create visual ideas for their possible future datafied bodies.
This method of visualising data journeys helped the young people to understand how data moves and is shared, and to articulate questions around the privacy, consent and control of data. These sessions gave them new skills and provided opportunities for their own reflections and conversations.
Next steps
We’re continuing to reflect on outcomes and themes from the data we have collected with the young people. We know that the sessions raised important questions about the use of different technologies in their everyday lives and about their understanding of data and how it is used, and gave us insights into their thoughts and opinions on AI and futures.
Each session was designed to work as a standalone process, while the six sessions together formed a curated programme through which the young people could build on their knowledge and critique ideas around AI systems and sociodigital futures.
We plan to adapt these sessions to create an online resource to share more widely across schools and youth services. We are already working on some follow-up resources for the young people we have engaged with, to support their onward learning and as a way of staying connected with the groups.
This project was made possible by the amazing skills and support of our CenSoF team (Marisela Gutierrez Lopez, Edward Knight, Nicola Horsley, Matt Dowse, Beckie Coleman, Paul Clarke); other University of Bristol colleagues: Ed King and Richard Cole (Concept Games Jam) and Sophia Walsh (app snap); and our artists Vince Baidoo, Emmanuella Blake-Morisi, Steve Hutson, and Lawrence Hoo. We thank you all for your generous contributions!
Particular thanks to Dom Pottle (Film and TV Studies student) who filmed and edited the video.
For more information on the work of the Centre for Sociodigital Futures, join our mailing list, follow us on X, Bluesky and LinkedIn or visit our webpage.