Timnit Gebru | Advocating for Diversity, Inclusion and Ethics in AI

36:23
 
Content provided by Women in Data Science Worldwide (WiDS), Professor Margot Gerritsen, and Chisoo Lyons. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Women in Data Science Worldwide (WiDS), Professor Margot Gerritsen, and Chisoo Lyons or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://vi.player.fm/legal.

Timnit recently completed her postdoc in the Fairness, Accountability, Transparency, and Ethics (FATE) group at Microsoft Research, New York. Prior to that, she was a PhD student at the Stanford Artificial Intelligence Lab, studying computer vision under Fei-Fei Li. She also co-founded Black in AI, an organization that works to increase diversity in the field and to reduce the negative impact of racial bias in training data used for machine learning models.

She was born and raised in Ethiopia. As an ethnic Eritrean, she was forced to flee at age 15 because of the war between Eritrea and Ethiopia, and she was eventually granted political asylum in the United States. “This is all very related to the things I care about now because I can see how division works,” she explains during a conversation with Margot Gerritsen, Stanford professor and host of the Women in Data Science podcast. “Things that may seem little, like visas, really change people's lives.”

Last year, she says, half of the Black in AI speakers could not attend NeurIPS because of visa issues. “And in that 20 seconds, that visa denial, it feels like the whole world is ending for you because you have an opportunity that's missed… Not being able to attend these conferences is much more important than people know.”

She has learned through her work with Black in AI that the most important thing we can do is empower people from marginalized communities, which is why diversity, inclusion and ethics are not at all separate. It’s essential to have a wider group of people in the world determining where AI technology goes and what research questions we pursue. She says the industry has been fairly receptive to her proposals around norms, process and transparency because they are easier to operationalize. Other problems, like racism and sexism, require a fundamental shift in culture.

She has seen the potential for unintended consequences in AI research. Her PhD thesis at Stanford used Google Street View imagery to predict income, race, education level, and voting patterns at the zip code level. She later saw follow-up research using a similar methodology to determine what kind of insurance people should have. “And that is very scary to me. I don't think we should veer off in that direction using Google Street View.” She says she wishes researchers could attach an addendum to earlier work describing what they learned and how they intended the work to be used. Timnit is currently working on large-scale computer vision analysis of society using large collections of publicly available images. She says it’s critical that she also spend a lot of time thinking about the consequences of this research.

RELATED LINKS

Connect with Timnit Gebru on Twitter (@TimnitGebru) and LinkedIn
Read more about Google AI and Black in AI
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Find out more about Margot on her Stanford Profile
Find out more about Margot on her personal website


55 episodes
