Bias in the Algorithm w/Q&A | WKAR

June 3, 2021
7:00pm Eastern Time
The screening has ended.



Welcome

Can a search engine produce racially biased results? Can software recognize faces with unbiased accuracy? Join the film screening and conversation exploring the impact of bias in internet search and facial recognition technologies.

This virtual film screening with panel discussion features two short films: "Search Engine Breakdown," from the NOVA television series seen on PBS; and "Algorithmic Injustice," a report from WKAR Public Media produced in collaboration with NOVA. The WKAR report explores the case of Michael Oliver, a Black man falsely arrested in Detroit after being misidentified by facial recognition technology.

Angelo Moreno, adult services librarian at East Lansing Public Library, moderates a Q&A session following the film screenings and presents a how-to session on using library catalogs and other resources to find information, comparing them with internet search engines.

The evening is presented by WKAR Public Media in partnership with East Lansing Public Library.


Attendees registering for this event will need to create a free OVEE account and will be asked to provide their birthdate to verify that they are 13 or older.


If you have questions about this event or the OVEE platform, please send an email to events@wkar.org.


Video Descriptions

OVEE TitleCard

5 minutes

Welcome to OVEE!

Julie Sochay, Senior Director of Content and Communication

Beyond the Search: The Biases Inside Google’s Algorithms | Full Film

Why does a widely used internet search engine deliver results that can be blatantly racist and sexist? "Beyond the Search" tells the story of two leading information researchers who made shocking discoveries about hidden biases in the search technology we rely on every day. It begins when Dr. Safiya Umoja Noble set out to find activities to entertain her young nieces and entered the term "Black girls" into her search bar: pages of pornography appeared as the top results. Subsequent searches for "Latina girls" and "Asian girls" led to similarly sexualizing and racist results. Concerned about the effect of such dangerous stereotypes, Noble embarked on research that would lead to her groundbreaking book, "Algorithms of Oppression." Along the way, she discovered the work of another prominent Black researcher, computer scientist Dr. Latanya Sweeney, who had made her own disturbing discovery: when she searched her own name, she got online ads for access to an arrest record. As Sweeney had never been arrested, she began investigating discrimination in online ad delivery. Her findings astounded her: searching a name more commonly given to Black children was 25% more likely to deliver an ad suggestive of an arrest record. Both researchers share common concerns about how everyday online searches can reinforce damaging stereotypes, and explore how technology can be made more equitable.

© 2020 WGBH Educational Foundation. All rights reserved. This program was produced by GBH, which is solely responsible for its content. Some funders of NOVA also fund basic science research. Experts featured in this film may have received support from funders of this program. Funding for NOVA is provided by Draper, the David H. Koch Fund for Science, the NOVA Science Trust, the Corporation for Public Broadcasting, and PBS viewers. This program is made possible by viewers like you. Support your local PBS station here: pbs.org/donate/

How Racial Biases Can Corrupt Facial Recognition Technology

A useful tool for consumers and law enforcement alike, facial recognition technology can help police officers identify—and ultimately charge—criminals caught on camera. But its critics argue that it's discriminatory: Research shows that facial recognition software often misidentifies people of color at a much higher rate than white individuals. Now, Detroit, Michigan, is facing lawsuits for the false arrests of two Black men misidentified by facial recognition technology. Why is it more difficult for this technology to recognize people of color? And do legal, privacy, and human rights concerns outweigh the benefits of its use?

Please Stay for the Q&A

A Brief Pause

Thanks for joining us!

Julie Sochay, Senior Director of Content and Communication

Presented by WKAR Public Media

Michigan State University

60 minutes

Moderator

  • Angelo Moreno, Adult Services Librarian at East Lansing Public Library

Participants

  • Xiaoming Liu, Michigan State University Foundation Professor

  • Arun Ross, Michigan State University Professor; John and Eva Cillag Endowed Chair in Science and Engineering; Site Director, Center for Identification Technology Research

  • Carol Yancho, Filmmaker, "Search Engine Breakdown"

Before you get started in OVEE:

1. Ensure you are using the most current version of your preferred web browser.

2. Run a test to ensure OVEE works properly on your computer.

The views and opinions expressed in this online screening are those of the presenters and participants, and do not necessarily reflect the views or policies of ITVS, public broadcasting, or any entities hosting the screening.