Brown Don't Frown was born out of a personal journey with womanhood. As a British Bangladeshi, navigating mainstream feminism often felt exclusionary to me because it didn't seem to value the experiences or views that shaped my grandmother's, aunts', mother's or friends' lives. Through this podcast, we seek to build a more inclusive discourse, one that breaks down presumptions about different cultures and shines a positive light on the stories of underrepresented women. Featuring new guests from different walks of life in each episode, Brown Don't Frown seeks to engage ordinary women and facilitate openness towards entirely new perspectives. It hopes to spark honest and meaningful conversations about intersectional feminist themes in contemporary society, with the acknowledgement that our views are shaped by our cultural, racial, religious, social and political experiences. Whether it's discussing society's preconceptions about the hijab with a British-born Jamaican Muslim woman or examining the impact of gendered expectations on our ability to grieve on our own terms, we hope listeners finish each episode feeling more rounded than they did before.

Follow us on:
Instagram: https://www.instagram.com/browndontfrownpodcast/
Twitter: https://twitter.com/bdfpodcast?lang=en
Facebook: https://www.facebook.com/browndontfrownpodcast
LinkedIn: https://www.linkedin.com/company/browndontfrownpodcast
Episodes
Sunday Nov 29, 2020
*Apologies in advance for the slightly off sound quality; we hope you still find the content meaningful.*
What is Artificial Intelligence? What role can it play in ensuring information is factually correct (especially during a pandemic) and that data is gender-disaggregated? Today I'm joined by our first honorary male guest, Freddie Kalaitzis, Senior Research Fellow in Machine Learning at the University of Oxford. AI is modelled on human behaviour, and humans are innately biased, so can we trust AI to be unbiased and neutral?
Freddie also talks about his work with Amnesty International on ‘Troll Patrol', which investigated the extent of abuse against women online. One statistic that stood out: women of colour were 34% more likely than white women to be mentioned in abusive tweets, with Black women particularly affected at 84% more likely.
Globally, public health policies have not addressed the gendered impacts of pandemics. Campaigners have argued that this research gap is caused by a lack of gender-disaggregated data. Emerging evidence suggests that more men are dying, potentially due to gendered immunological or environmental differences, such as exposure to pollutants from working outside or smoking. Meanwhile, 89% of nurses in the UK are women, underlining the gendered make-up of the healthcare workforce and the risks that women face. I ask Freddie whether AI can drive information-gathering techniques that make it simpler to collect sex-disaggregated data, and what solutions AI might offer in response to COVID-19. At a time when we are saturated with news and information at our fingertips, AI could play a crucial role in preventing misinformation. Finally, Freddie reminds us why we should be mindful that correlation doesn't always mean causation.