
#podcast: Facial Recognition: Automating Oppression?

#facial recognition #LGBTQIA #surveillance

Coding Rights launches its first podcast: an episode that opens the Privacy Is Global program by Internews

All over the world, facial recognition technologies are gradually being deployed in many moments of our lives: in surveillance cameras across city streets, or when we need to authenticate our IDs to access social services, bank accounts, and other private services. But what happens when we use a binary algorithm to recognize our non-binary diversity? Who gets excluded? What historical oppressions are being automated?

Facial Recognition: Automating Oppression? is the topic of the first podcast produced by Coding Rights for the series “Privacy Is Global”, an initiative of Internews and the Heinrich Böll Foundation. In this episode, Vanessa Koetz and Bianca Kremer, both Coding Rights fellows on Data and Feminisms, talk to Mariah Rafaela Silva, a Brazilian scholar and transgender rights activist, and Pablo Nunes, a black scholar and activist specializing in public security and racism. Our goal was to understand the risks of implementing this tech without an informed public debate about its consequences. And, to make sure no questions were left behind, we had a special appearance by the one responsible for making all this tech work: Dona Algô (Ms. Algô… yes, from “algorithm”), a fictional character created by content producer Helen Fernandez, known on social media as @Malfeitona.

What are the consequences of deploying technologies that actually serve to reinforce discrimination against historically oppressed and persecuted populations?

For instance, Brazil has the third largest prison population in the world, and approximately 67% of the 770 thousand people imprisoned in the country are black and brown. In a similar trend, research by the Rede de Observatórios da Segurança shows that 90.5% of the people arrested in Brazil after being flagged by facial recognition technologies are black and brown. How accurate is this tech, and how might it be automating oppression? In his work at the Center for Studies on Security and Citizenship (CESeC), Pablo Nunes is tracking how the deployment of facial recognition technologies reinforces racism and challenges debates on criminal justice. He kindly joined us to chat about these challenges.

We must remember that these technologies are designed based on the worldviews and lived experiences of those who develop or contract them: mostly white cis men from the Global North. This context has consequences: a trend that makes historically oppressed portions of the population (basically, most of us: black, brown, trans, immigrant, and LGBTQIA+ people) simultaneously visible for surveillance and invisible when it comes to accessing rights. Complex? Mariah Rafaela develops this idea further in our conversation, focusing particularly on the context of transgender people.

Unfortunately, we must also remember that Brazil is considered the deadliest country in the world for the trans community: it is the country where the most transgender people are killed. In 2020, 175 trans women were murdered, a 41% increase over the previous year. As we can see, the situation is already brutal and complicated. Imagine what can happen when technologies that reinforce binary views of gender are added to a context that is already highly violent for trans people. It is already happening.

What to do about all this? Campaigns to ban facial recognition technologies are spreading and raising awareness about how these tools enable mass surveillance and gender and racial discrimination. Last June, a letter signed by more than 170 human rights organizations around the world, including Coding Rights, highlighted some of these concerns and called for a ban on biometric surveillance. The next step is to push for legislation banning certain usages of these systems in our countries and regions. But, for that to happen, we need a lot of people concerned about the topic. So come join us and listen to the episode on your favorite audio platform! If you like it, please help us spread the link!

https://anchor.fm/privacyisglobal/episodes/Facial-Recognition-in-Brazil-Automating-Oppression-e17bn59/a-a6h7pom