This workshop is designed for 13- to 24-year-olds interested in understanding how the internet works. Contrary to popular opinion, young people care about their personal data and want a more transparent digital world, a digital world they can trust. Little is known, for example, about how Amazon tailors advertisements and recommends products that are actually interesting to potential online customers, or how Facebook decides which news its users may be more inclined to read. The mechanisms that support this filtering of information and products are obscure, and internet users would like to know more about them, such as possible bias in an algorithm’s behaviour, and, more importantly, to have some control over these recommender and personalisation systems.
This event aims to work closely with young people to better understand how aware ‘digital natives’ are of algorithm bias, and to gather their attitudes, main concerns and recommendations when interacting with such systems. This information will help us understand the way young people interact with these systems and identify youth-led solutions for teaching critical thinking about digital information systems. We will apply a range of engagement tools and methodologies, including hands-on exercises using the computer facilities at CityGames and ‘youth juries’ to facilitate discussion, reflection and a deeper understanding of youth online behaviour and youth-led software solutions.
All participants will receive a £10 gift voucher for their contribution.
In order to promote digital literacy among young people about algorithm bias, the UnBias team at Horizon, The University of Nottingham, presents an innovative format to bring people together and facilitate reflection about how recommender systems work. During the event, participants will be invited to become part of a ‘jury’ that will reflect on and offer advice relating to:
• What are fair algorithms?
• How can we filter all the information contained on the www?
• Are recommender systems transparent?
• Ways in which algorithms can affect us
• Youth-led policy recommendations
• Ways of further engaging with young people in thinking about and acting upon algorithm bias
The aim of our juries is not only to find out what participants think and feel about their experiences of the digital world (e.g., social media, data privacy, cyberbullying), but also to discover whether they are open to changing their minds in the light of discussion with others or exposure to new information. To explore these questions, we are interested in discussing i) the reasons that jury members give for adopting particular perspectives and positions; and ii) the extent to which participants’ perspectives and positions change (individually and collectively) between their arrival at the jury session and their departure. The jury session will last two hours, including refreshments, and will be led by a trained facilitator whose task is to provide a safe space for participants to express themselves freely and critically.
1. Do I need to bring a laptop?
No, you will be able to use a computer at CityGames.
2. I am under 16 years of age. Do I need parental consent to participate in this workshop?
Yes. Once you register your participation, a researcher from the University of Nottingham will contact you via email and send you all the documents required to take part in this workshop. We are going to collect participants’ opinions and concerns regarding the way algorithms and recommender systems work. To collect this data we need your consent, and if you are a minor we also need parental consent.
3. Where can I contact the organiser with any questions?
You can email researcher Dr Elvira Perez at firstname.lastname@example.org with any questions.