Coded Bias | WKAR

When MIT Media Lab researcher Joy Buolamwini discovers that many facial recognition technologies fail more often on darker-skinned faces or the faces of women than others, she delves into an investigation of widespread bias in the technology that shapes our lives. Followed by a panel discussion.
Video Descriptions
Filmmaker Introduction
Filmmaker Shalini Kantayya introduces Coded Bias.
Coded Bias - Indie Lens Pop-Up
Racial bias in facial recognition algorithms — MIT Media Lab researcher Joy Buolamwini makes a startling discovery, and fallout ensues.
55min 17sec
Moderator
- Jen Preslar
Chat Moderator
Panelists
- Tawana Petty
Tawana is a mother, social justice organizer, youth advocate, poet, and author. She is deeply involved in water rights advocacy, data and digital privacy education, and racial justice and equity work.
Before you get started in OVEE:
1. Ensure you are using the most current version of a popular browser.
2. Run a test to confirm that OVEE works properly on your computer.
The views and opinions expressed in this online screening are those of the presenters and participants, and do not necessarily reflect the views or policies of ITVS, public broadcasting, or any entities hosting the screening.