Zahraa Karim

Selfies for Inclusion


‘Selfies for Inclusion’ explores the process of categorisation in training data sets, the biases within Artificial Intelligence (AI), and the real-world consequences of those biases: by perpetuating injustice, they can shape your civil liberties and your relationship to legal, social, and economic systems of power. My practice begins this conversation by posing a question rather than an answer, and offers a glimpse of a world free of bias. The absence of bias in society has no clear form, and fairness is difficult to define mathematically, but this project is a step towards equity and justice in the use of AI.