FIRST AIRED: November 27, 2018


Transcript

00:00:01
>> In May, Google rolled out a new feature to demonstrate its artificial intelligence prowess.
>> We call it Smart Compose. We use machine learning to start suggesting phrases for you as you type.
>> Engineers quickly realized that it had a sexist bent, says Reuters tech correspondent [UNKNOWN].
>> The problem in Smart Compose was that in a sentence like, "Hi, I'm meeting an engineer tomorrow, would you like to meet him or her?"
00:00:33
Google's Smart Compose would often suggest "him," because the systems learn from the data that is out there in the real world, and most engineers, for better or for worse, are men.
>> So Google fixed that by creating a list of banned words, including gender pronouns, racial slurs, and expletives.
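A minimal sketch of the kind of blocklist filter described here, assuming a hypothetical suggestion pipeline; the token list and function names are illustrative, not Google's actual implementation:

```python
# Hypothetical sketch: filter gendered pronouns out of suggested completions.
# Not Google's code; the blocked-token list is illustrative only.
BLOCKED_TOKENS = {"he", "him", "his", "she", "her", "hers"}  # plus slurs, expletives, etc.

def filter_suggestions(candidates):
    """Drop any suggested phrase that contains a blocked token."""
    safe = []
    for phrase in candidates:
        tokens = {word.lower().strip(".,!?") for word in phrase.split()}
        if tokens & BLOCKED_TOKENS:
            continue  # never show a suggestion that guesses a gender
        safe.append(phrase)
    return safe

# The model's top guesses after "Hi, I'm meeting an engineer tomorrow."
print(filter_suggestions([
    "Do you want to meet him?",
    "Do you want to meet her?",
    "Do you want to meet?",
]))
# -> ['Do you want to meet?']
```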
00:00:51
Google's decision to play it safe on gender follows some high-profile embarrassments. It had to apologize in 2015 when the image recognition feature of its photo service labeled a black couple as gorillas.
>> Google has gotten rid of this gender bias from Smart Compose, but it still has similar issues in a lot of its other services.
00:01:10
For example, its predictive keyboard for smartphones, which is basically autocorrect, will recommend things like, if I'm typing "police," for example, it'll suggest "policeman," or I'll type in "sales," and it might recommend "salesman."
>> It's not just Google. Apple's keyboard and other predictive services out there also face similar issues, and experts in artificial intelligence say it's hard to suppress some of these biases, as some sentences are too complicated for the machine.
00:01:43
Like "my bra has an underwire" gets corrected to "underwrite," because "underwrite" is more often used out there in the world.
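A toy illustration of the frequency effect the correspondent describes, assuming made-up word counts; real systems use far richer language models, but the tie-breaking idea is the same:

```python
# Illustrative only: made-up counts, not real corpus statistics.
corpus_counts = {
    "underwrite": 9000,   # hypothetical: common in finance and news text
    "underwire": 300,     # hypothetical: much rarer overall
    "policeman": 5000,
    "policewoman": 800,
}

def most_likely(candidates):
    """Pick whichever candidate has been seen most often in the training data."""
    return max(candidates, key=lambda word: corpus_counts.get(word, 0))

print(most_likely(["underwire", "underwrite"]))   # -> underwrite (frequency wins)
print(most_likely(["policeman", "policewoman"]))  # -> policeman
```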