A former exec says Google once pulled the plug on videoconferencing tech because it couldn’t identify people of color

Diane Bryant, the former COO of Google Cloud, speaks on the campus of UC Davis in August 2018. Greg Sandoval/Business Insider

  • Former Google executive Diane Bryant said during a presentation on Tuesday that she had first-hand knowledge about video-conferencing technology that Google had to “pull back” because it couldn’t identify people of color.
  • Bryant suggested that the problem stemmed from a lack of diversity among the people who built the tech.
  • While some of Google’s top execs have warned about the problem of bias in creating algorithms, the company has created software in the past that was allegedly racially biased.
  • Bryant told the gathering of IT professionals: “Diversity is a fact. Inclusion is a choice.”

Google once had to “pull back” videoconferencing software for employees because the technology could not accurately identify people of color, former Google Cloud executive Diane Bryant said on Tuesday.

Bryant, who served as Google Cloud’s COO for barely more than half a year before departing in July, also suggested that the problem with the technology was at least partially caused by a lack of diversity among the people who built it.

Bryant made the comments during a presentation at the University of California, Davis, before a gathering of several hundred IT managers and faculty who work for the UC system, at the annual University of California Computing Services Conference.

Towards the end of her presentation, Bryant began to discuss the need for diversity of viewpoints within IT departments. She cited something that occurred at Google during her brief tenure as an example of what can go wrong when diversity is lacking.

“There’s been a lot of news lately on diversity and AI,” Bryant told the crowd. “For instance, if an algorithm is trained with incomplete data you’re at risk of developing a bias, which can be disastrous when you’re automating decisions, say, behind parole hearings or the approval of loan applications.

“I sadly saw this firsthand at Google,” she continued, when “IT’s initial videoconferencing solution based on AI had to be pulled back” after initially rolling out to employees.

“The algorithms that detected the faces in the room and then zoomed in automatically to center them in the screen failed to accurately identify people of color,” said Bryant.

Through a spokeswoman, Google declined to comment. Bryant departed the auditorium directly after her speech without answering questions from the audience and did not respond to questions sent later to her via LinkedIn.

Diversity is a hot-button topic at Google

The issue of diversity is a sensitive topic for Google, as it is at many other companies in the tech sector. For a long time, Google has pledged itself to diversifying its ranks, though progress has been slow. Google’s latest diversity report disclosed that more than half of its workers (53.1%) were white, a drop of 2.4 percentage points from 2017.

Dr. Fei-Fei Li, Google’s chief AI scientist. Greg Sandoval/Business Insider

That means the number of staff members from other ethnic backgrounds increased slightly, but most figures remained largely unchanged.

In 2015, software engineer Jacky Alciné discovered that Google’s image recognition algorithms were classifying black people as “gorillas.” The company apologized to Alciné and promised to fix the problem.

Meanwhile, Diane Greene, the CEO of Google Cloud and Bryant’s former boss, as well as Dr. Fei-Fei Li, the unit’s chief AI scientist, have been vocal advocates for reducing bias in algorithms and AI.

As Bryant began to wrap up her presentation, she received enthusiastic applause after saying, “Diversity is a fact. Inclusion is a choice.”

She also warned tech companies that they can’t blame under-represented groups for the diversity problem, saying, “You can’t put the burden of the marginalized class on the marginalized class. That simply is illogical.”

If you’re a current or former Google employee and you know more about this, please contact me at gsandoval@businessinsider.com.