Tech & Childhood co-founder Katrine K. Pedersen is part of an EU-wide research project on data racism and coded biases in the digital transformation of Europe.
The European Network Against Racism is conducting research to shed light on the often hidden discrimination and racism in the digital transformation of Europe. Katrine K. Pedersen will represent Denmark among the researchers and experts representing the EU countries. The research focuses on biases, AI and discrimination.
ENAR - The European Network Against Racism has taken a leading role in the work towards global digital sustainability:
“The growth of AI and other data-driven sorting systems is often pigeonholed as a ‘US issue’. Quite the opposite: the use of data-driven technology (including Artificial Intelligence, automated decision-making systems, algorithmic decision making, the merging of large datasets with personal information, and good old-fashioned social media scraping and surveillance) is increasingly unveiled throughout Europe. It is very much a European reality.
What is less explored is how such technologies discriminate. The flip side to the ‘innovation’ and enhanced ‘efficiency’ of automated technologies is how they in effect differentiate, target and experiment on communities at the margins – racialised people, undocumented migrants, queer communities, and those with disabilities.
Automated or data-driven decision-making tools are increasingly deployed in numerous areas of public life, and they disproportionately affect people of colour. Increasingly, we witness this experimentation on marginalised communities in policing, counter-terrorism and migration control functions.
In ENAR’s recent study, Data-driven policing: the hardwiring of discriminatory policing practices across Europe, Dr Patrick Williams and Eric Kind highlight the range of data-based techniques deployed by police forces across Europe with potential discriminatory impact on racialised communities. From the increased resort to facial recognition tools in crime investigation, despite evidence that they misidentify people of colour (in particular women), to older technology such as automated number plate recognition, which has been used to track and discriminate against Roma and Traveller communities, we see that the application of these technologies is only exacerbating trends of over-policing and under-protection.”