
Content provided by Sundae. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Sundae or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://vi.player.fm/legal.

270: Accountable AI with Maikel Groenewoud


What good is all of the awareness we’re raising toward a more diverse, equitable, and inclusive world if technology doesn’t follow suit? Why does the hard work of acknowledging our own biases and prejudices within our societies fail to carry over into artificial intelligence?

And this includes growing our intolerance for not just overt racism, which is easy to spot and condemn, but also more subtle affronts.

Regardless, the time to look away or stay quiet has passed. So when we see these infractions, whether in the design lab, at church, on a plane, in the lunchroom, at the bank, or online, we need to call them out.

This week, it’s my pleasure to welcome Maikel Groenewoud to discuss the critical intersection of AI and a lack of diversity and inclusion in technology. A specialist in data science and artificial intelligence (AI), Maikel’s managerial role at Ernst & Young Netherlands focuses on trust and ethics related to technology.

Born and raised in Amsterdam, Maikel is of Afro-Surinamese descent. And in our conversation, Maikel candidly shares how his own personal experiences with racism impassioned him to hold AI programmers accountable.

Today, Maikel reveals the dire consequences that occur when human biases from the individual or societal level bleed into data programming and manifest in AI. His mission is to unite people to fight on the same frequency and ensure the industry truly has a broad perspective that’s fair for all.

What You’ll Learn in this Episode:

  • Why 95% doesn’t cut it
  • When skin color determines loan rates
  • Ways data reflects society & human nature
  • AI’s responsibility to explain decisions
  • Voicing support with your wallet

Listen to the Full Episode

Featured on the Show:

The IN TRANSIT Hub is where our global community leans on each other for support, and we want you with us. There’s a lot of really hard stuff going on right now in the world, and we’re talking about all of it right here.

Catch These Podcasts / Articles:

We’re delighted to be in the Top 5 of the global Best 30 Expat Podcasts!

Subscribe: iTunes | Android

Full Episode Transcript:

Download Transcript

Hello. It is 9:00 am in New York, 4:00 pm in Johannesburg, and 9:00 pm in Bangkok. Welcome to IN TRANSIT with Sundae Bean. I am an intercultural strategist, transformation facilitator, and solution-oriented coach, and I am on a mission to help you adapt & succeed through ANY life transition.

In this week’s episode, we’re going to talk about global transitions, and I have a hunch this is not going to be a stretch for you. Think of all the global transitions we’ve been feeling over the past two years: the pandemic, the resulting travel restrictions, health risks, and more have profoundly impacted all of us. Even this week that I’m recording, we’re seeing another ripple across the world with the Russia-Ukraine conflict. What we don’t think about is the subtler things that impact us. The things that emerge with seeming neutrality that suddenly have a profound grip on our lives. And today, we’re going to talk about one of them.

AI, or artificial intelligence, is, to be very honest, something I relegate to sci-fi movies. Maybe I’ve seen a family film like WALL-E and thought about that adorable robot, but when we really think about artificial intelligence and how it’s taking over so much of what we’re doing, we see how close it comes to our lives. Think about the AI that is built into the palm of your hand, and the ways in which you rely on it as you just use your cell phone. Stephen Hawking said, “AI is likely to be either the best or worst thing to happen to humanity.” Right? Right now, I need some hope, so I am hoping it is the best thing that’s going to happen to humanity. But what I know after this interview that you’ll hear today is that it’s not going to happen on its own.

What we’re learning about AI is that its scope and power are far-reaching. In an article from VentureBeat, they say, “The far-reaching scope and power of Machine Learning, ML, and Artificial Intelligence, AI, means that gender and racial bias at the source is multiplied to the nth power in businesses and out in the world.” I mean, we were busy enough with human bias. Do we now have to worry about bias programmed into our technology? The answer is, “Absolutely, yes.” This is serious. It is so serious because the scope, as you’ve seen, is magnified, and we can’t directly see it.

So in this episode, we’re going to look at the intersections between AI, and DEI. And to help me do that, we’re going to invite Maikel Groenewoud.

Maikel specializes in data science and artificial intelligence (AI). He performs technical implementations and also has a particular focus on trust and ethics as it relates to the technology.

He studied business mathematics and informatics in Amsterdam, nowadays known as business analytics, and began his career at a policy research institute where he worked for over 10 years. He has been working at Ernst & Young Netherlands since 2016 as a manager within the consulting practice, where he leads the data science and AI capability.

On a personal note, Maikel was born and raised in Amsterdam and is of Afro-Surinamese descent. Suriname is a country in South America and a former colony of the Netherlands, and he also has a lot to tell about that history and how it relates to the present. He has what is called a Pan-African mindset and is very much here to aid in healing the Souls of Black Folk, he says. In today’s episode, you will also hear how his work has the power to heal but also prevent harm.

He says, “My background as a researcher always comes in handy. I’m critical and analytical, but I also have an eye for the human side.” That’s exactly what caught my attention about his work. He combines his interest and know-how in history, lived experience, and professional background to make a difference for all of us in this all-important global transition. Stick around to the end, because this is our very first interview since I unveiled ATT: Ambitious Transformation in Transition, and Maikel is the first to answer a set of questions I have about it. This will show you exactly why I believe in this approach so much. I’ll let Maikel help you understand AI more. We know what the definition is, but what’s behind it?

But before we dive in, I want to briefly make sure we’re on the same page about DEI. Now, DEI: Diversity, Equity, Inclusion is something that we’ve been talking a lot about, and I want us to have this shared context. According to Seth Boden from harvardbusiness.org, “Diversity refers to anything that sets one individual apart from another, including the full spectrum of human demographic differences, as well as different ideas, backgrounds, and opinions people bring.” All right. So when we say that, that’s what we’re looking at. The E in DEI stands for Equity: “Fair treatment for all while striving to identify and eliminate inequities and barriers.” And, “Inclusion implies a cultural and environmental feeling of belonging and a sense of uniqueness. It represents the extent to which employees feel valued, respected, encouraged to fully participate, and able to be their authentic selves.” So I wanted to make sure we’re on the same page with that, thanks to Seth Boden from harvardbusiness.org.

In this interview, we also talk about white supremacy. If you are from my generation, when I hear white supremacy, I automatically think of the KKK in white hoods. But if you’ve been following the discussion around racism, equity and inclusion, and diversity, we know the term has been changing over the years. Nobel Peace Prize nominee Barbara Smith talks about how we use it now. In her piece from the Boston Globe, she writes that “systemic racism” clearly conveys the pervasiveness of racial oppression, but goes on to say that “white supremacy goes further by indicating that there is a rigid nexus of power that protects it and reinforces it.” So we want to make sure that when we hear that conversation, we know it’s talking about something more subtle, something more common even, and maybe not intentional or even obvious, right? So I want to make sure we are on the same page with that as well. All right, let’s get started and hear from Maikel.

Sundae: All right, Maikel. I am so excited to welcome you to IN TRANSIT. Thank you for being here.

Maikel: Yes. Thank you, Sundae, and I’m very pleased to be here. Looking forward to chatting with you.

Sundae: So it’s been a long time coming. You and I were talking before we hit record that we’ve known each other over LinkedIn for at least two years, so it’s exciting to be able to connect in this way. And I’m so lucky to have been able to follow your work, because it has truly opened my eyes to the connection between my work and data collection and AI in the world. So I’m excited to share this with our listeners today. And I do so, as I was saying to you before, with, how should I say it? There’s so much responsibility in getting this topic out, because it has such a profound impact on all of our lives. And that’s not something I really realized. So before we begin, can you please help those of us who are not in AI understand just generally what it really is and where we come in contact with it in our everyday lives?

Maikel: Yeah, of course. So yeah, AI is short for Artificial Intelligence, and that term kind of suggests real intelligence. What I often tell people is that these systems aren’t actually intelligent. What they do is work with data. They might look at data that has been collected, and they might be able to detect the patterns in there, and based on those patterns, based on historical data, the system might be able to predict, might try to determine, “Okay, how does that work,” and based on that data think, “What will be the best decision to make, what might be the best recommendation to provide.” But it isn’t so that the system really understands what’s going on. So how does it manifest? Think about speech recognition systems, where maybe you can talk to your computer or to another device. But also image recognition. Those are examples of how AI has manifested.

But it also manifests in a lot of different things, like our mobile phones, all the stuff you have on there. So in many ways, many different areas, you see AI. You can say that it makes life easier in a way, because these systems allow us to do certain things more efficiently. Or they allow us to tailor services specifically to the person, based on maybe your preferences, your previous history, or other characteristics you have, or maybe characteristics of people who have a similar profile to you. And then based on all that data, the system can look at, “Okay, what might be the best fit for you when it comes to services or products or other things you might be interested in?”
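The "similar profile" idea Maikel describes can be sketched in a few lines. This is a toy illustration with invented users, songs, and ratings (nothing here comes from the episode or any real recommender), but it shows his core point: the system only matches patterns in historical data; it doesn't understand anything.

```python
# Toy "people like you also liked..." recommender.
# Ratings: 1 = liked, 0 = disliked. All names and data are invented.

def most_similar_user(target, others):
    # Similarity = number of items both users rated the same way.
    def overlap(a, b):
        return sum(1 for item in a if item in b and a[item] == b[item])
    return max(others, key=lambda name: overlap(target, others[name]))

def recommend(target, others):
    # Suggest items the most similar user liked that the target hasn't rated.
    neighbor = others[most_similar_user(target, others)]
    return [item for item, liked in neighbor.items()
            if liked and item not in target]

me = {"song_a": 1, "song_b": 0}
history = {
    "user_1": {"song_a": 1, "song_b": 0, "song_c": 1},  # agrees with me twice
    "user_2": {"song_a": 0, "song_b": 1, "song_d": 1},  # disagrees with me
}

print(recommend(me, history))  # ['song_c']
```

Note the flip side, which is exactly Maikel's warning: if the historical data only contains one kind of user, everyone gets recommendations shaped by that one group.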

So that’s just a quick introduction to some of the ways you see this technology manifesting, and it’s quite broad, because you see it in many different areas, including areas you don’t think about, like the software we have on our computers, all the word processing. Also the systems we use to create presentations, etc. All that software will also have AI incorporated to make stuff work more smoothly for us. So it’s many different areas.

Sundae: And so for me, as I’ve done a little look into it, I realized it goes way beyond what I had in mind from those futuristic sci-fi movies. The real practical things that are in everyday life. You said that the intention is to make our lives easier. And I had this naive view of, “Hey, this is fantastic. We have data-driven ways to do new things,” because it feels like it actually puts some neutrality into our human experiences. But what I’ve learned from you is, actually, it’s the opposite. Whatever bias we have as individuals gets programmed into the data, and then it manifests at scale.

So can you say a little bit more about the intersection between AI, Artificial Intelligence, and DEI, this awareness that more people are having that we need to build in processes of Diversity, Equity, Inclusion, in all of our things?

Maikel: Yeah, of course. Yeah, because it’s very true that it might seem that, okay, the sense is that decisions can be made more objective. But the reality is that specifically when it’s about people, when the systems are trained on data that comes from people or is about people, what we then see is that the systems basically just reflect what is in the data, and the data is a reflection of society, of human nature. So for instance, if people have certain biases, that’s also reflected in the data, and then the system can learn that from the data. Another way it shows up, for instance, is that the systems are basically only smart about, can only learn from, the data that is presented to them. So for instance, if only one group of people is used to train a system, then it also means that the system only learns about, only gets an understanding of, that group of people, and that can mean that if you apply it to other people, it doesn’t work that well.

So for instance, if you only make a system based on men, then it could be that the decisions don’t work that well for women. Or if you only train the system on White people, only on White men, that also means it might only work for that group. So those are also things to think about, and this isn’t just one side of it. That’s the input data side, but regardless of what data you put into it, the humans that are developing or monitoring the systems also need, at that point, to be able to supervise the system and say, “Okay, I know the society that we live in. I know the biases that exist there. So based on that, I also need to make sure that I check the system, and also check whether what I’ve built actually works for different groups.” And that is something you also need to do yourself as a human, regardless of what data you put into it. You need to check for it, and then also act on it if it doesn’t work for different groups.

And you also asked about the intersection of AI and DEI. It shows up in many different ways. A lot of it is looking at the people that are developing it: do they have diverse backgrounds, and different ways of thinking? So that you can incorporate the different perspectives, not just theoretically but by really including people from those different groups. If you make a system for all people, then it cannot be that only men are the ones developing it. You also need to include women. And if your system is for everyone, you can’t have just one demographic in there, only one ethnicity. You also need to make sure you incorporate the other ethnicities, just to increase the likelihood that you will get a truly broad perspective in there.

Sundae: But let’s hold on a second. This is more than just a broad perspective in something functioning. This actually has implications that are life or death.

Maikel: Truly. Yeah.

Yeah. So one example indeed that I often talk about is oxygen meters, or oximeters. They are basically little devices that are clipped on people’s fingers and used to estimate the oxygen levels in their blood. The only problem is that they don’t necessarily work that well for people with darker skin. What they basically do is overestimate the level of oxygen in the blood, and as a result of that overestimation, those people only get the help they need later. So only later might they get extra oxygen or get put on a ventilator, and that’s of course extra critical right now, because we’ve been living in a pandemic for the last few years. Oxygen levels are very important there, because one of the things you saw was people coming into hospital needing extra oxygen, being put on a ventilator. But if the oxygen meter, the oximeter, overestimates your blood oxygen levels, you have a big problem.

And the thing with that is, it’s already bad enough that it’s like that, but the research saying so has actually been available for quite a long time. The research was already done 30, 40 years ago, and it already said, okay, these devices don’t work well on people with darker skin. The researchers also recommended that the manufacturers incorporate more data from Black people, or from other people with darker skin, so that they would be able to recalibrate the algorithms and get proper readings for them as well. But during the pandemic it was established that these devices still don’t seem to work that well. So the UK is doing a review, as are their US counterparts.

And the original researchers, who first published the article in 1990, published a new article at the beginning of this year, and in it they said that they have seen no evidence that the manufacturers, in all those years, actually implemented the recommendation they gave: to get data from people with darker skin so that those algorithms could be recalibrated. So yes, like you said, life and death. It’s definitely a case of life and death here. But it’s also important for me to note, because people talk a lot about the technology, and of course because I work in the technology field I get asked first of all about the technology, that what I tell people is that this is not really a technology issue.

It’s really about people, because ethical AI really requires ethical humans. And in this case, if you’ve known for 30 years, then that’s not a technology problem. It’s a human decision problem: they know about it and they make the conscious decision to say, “Okay, for me, it’s not important enough to make that change.” So you just leave the technology as it is, with all the dire consequences for other people. So yeah.

Sundae: It’s unbelievable, like, what? 30 years go by and nothing has happened. And I was listening to an interview that you did, and there was a caller that came on, and it broke my heart. She’s like, “What do I tell doctors when I go in and they take my oxygen and I’m having an asthmatic attack, my son is having an asthmatic attack, and they say my oxygen is fine?” So she’s not only experiencing, I wouldn’t even call it a microaggression here, with a human, because they are not believing her, but she’s also getting it built into the technology. Right? And it’s incredible. It’s incredible. That’s massive. I just want to do a shout-out to Caroline Criado Pérez. She has a book called Invisible Women, about exposing data bias in a world designed for men, and how that also has health implications for women who are, let’s say, in a car crash, because the seat belts were designed for men and not for women. So it’s huge.

So doesn’t this all start with who’s getting trained at the university level for the data industry? Or it’s much bigger than that, I’m guessing.

Maikel: Yeah, it’s certainly a part of it. So for instance, if people were only trained to look at the technical aspect, you might say, “Okay, I’ve made a system and it has very high accuracy in total.” So you would think, “Okay, I’m good, I’m fine, everything works properly.” But they also need to be aware that you need to look at the different subgroups: “Okay, does this also work well for all the different groups that are in there?” And if that’s not the case, then you might need to say, “Okay, maybe I need to arrive at a new or different AI system, or I only use it for the specific groups for which I’ve noticed it works well, and for other groups I might do something else.” And if you would use this to diagnose people, when it comes to healthcare, another healthcare example, you can say, “Okay, it has mostly been tested on White people. I know that it works for White people, so I can use it there. But in other cases, based on research, it might be better to use another device,” or another way to determine whatever it is that you want to diagnose.

It is also important to say that it’s not so that you always need to use the technology. It’s more that, okay, it is a tool you can use, but if there is a certain imbalance in there, you can also make an alternative decision. But yeah, the problem that I have with all these technology discussions is that there’s a lot of deflection to the technology, a lot of pointing at the technology as if that’s the big issue. What? All the things that we talk about have everything to do with human nature. The technology reflects human nature, but that’s only one of the issues, because whatever the data is, as humans you can still make the decision: “Am I going to check for it? Am I going to leave it in place or not? Am I going to make different decisions with it? Or am I just okay with whatever it is that’s in there?”

And that just really has to do with the ethical behavior that needs to come from humans. And I think that’s the big thing we need to solve, because with ethics and AI, like I said, the term Artificial Intelligence kind of suggests real intelligence, but the systems aren’t really intelligent. So real intelligence and real ethical and moral behavior need to come from humans. And yeah, they need to step up there. But I think we can just look at the whole world around us, everything that goes on. It’s very clear, very obvious time and time again, that humans and ethics, that’s not a problem we’ve really solved yet. But that’s where we really need to start if we want to solve the problem of ethics and AI.

Sundae: Absolutely. Maikel, I don’t really hear a lot of hope in your voice. Tell me where I’m wrong.

Maikel: Well, I’m just realistic, and it’s also because of my personal background. Even though I work in technology, I love history more, and when I look at history, I see all the stuff repeating all the time, and also how people simply refuse to really acknowledge history. So for instance, I know that you’re in South Africa, and I know that South Africa also has a brutal history, but one thing that I really do appreciate about that country is that they had truth and reconciliation. You can say whatever you want about that process, but it does at least provide a way forward, a way to at least try to heal and come together. But when I look at European countries, when I look at America also, I don’t see truth and reconciliation. I also don’t see any reparations, because that’s also an important discussion. But with no reparations, no truth, no reconciliation, how can you really move forward in a constructive manner?

I think we can definitely make progress, but there needs to be a willingness. And that willingness is something that has very much been lacking. Again, like I said, South Africa’s truth and reconciliation, I think, is a good model, but I have not seen that in Europe, and I have not seen that in America, and I think that’s something that needs to happen if you really want to move forward and really heal. We need to be able to have truth and really acknowledge it. And if not, then we’ll keep having the same issues over and over again. That’s what I do think.

Sundae: Yeah, we’ve seen that for 450 years in the US, for sure. And you said truth, right? And at the same time, you talked about acknowledging it, acknowledging the truth. We can hide it under the covers, and we won’t work on it if we don’t acknowledge it. And you and I have had private conversations about how, in Europe, there is often a denial of the level of racism that goes on. And because it’s not a public conversation, you only see it when you’re in private conversation with people who are directly impacted.

Maikel: Exactly.

Sundae: Which is subtle, right? It’s so much more subtle and implicit, and that’s almost scary, because then you can’t see it.

Maikel: Exactly, yeah. And I can give an example, because of course I work at a big corporate. One thing that I experienced a few years ago was very interesting. My family is from Suriname, that’s how you pronounce it in English, a country in South America, but it used to be a colony of the Netherlands, from when the Netherlands was very active when it came to slavery. So that’s where they enslaved Africans, in Suriname, one of the colonies. The interesting thing happened when I was sitting in the cafeteria, just chatting with some colleagues. We were just joking around, nothing serious, talking about all the different ways that people can speak a language. We were just joking around, not disrespecting anyone, but then someone came in, a senior manager actually, and she clearly misread the room, because what she said, in an irritated voice, was that all Surinamese people do not speak Dutch in a proper way. That was what she said, and I looked at her and corrected her on the spot. I said that what she said was not correct, and also that there are different forms of Dutch.

So you have the Dutch that we speak here in the Netherlands, but also the Flemish Dutch that they speak in Belgium, and also the Dutch that we speak in Suriname. They are different forms; it doesn’t mean that one is better or worse than the other. But the interesting thing there was that, of course, she was wrong in what she said, but also that she had clearly gotten so comfortable with me that she’d forgotten who she was sitting next to, that I was actually one of the people she had just disrespected, because she disrespected the Surinamese people as a whole. So that’s how it then comes out. Those moments are not pleasant, but at the same time, they really show me a person, because she was so comfortable that she showed her real side, and I was like, “Okay, now I know who I’m dealing with.”

And this is one example, but with the same person I had another example, again with language, when a South African was there. Of course, in South Africa you have Afrikaans, and that language basically derives from Dutch. Her comment about Afrikaans was that it’s a very cute language, and she talked about it with affection, and I was like, “Okay, interesting.”

“So the Surinamese Dutch that the Black people speak in Suriname, that you criticize? But Afrikaans, spoken there by the so-called settlers, who actually introduced the brutal apartheid regime, you talk fondly about that?” So those kinds of things. And this is another example of how subtle it is. This is also a person who presented herself as very progressive, but these are then the things she says. So yeah, this is one example, but one that I clearly remember, because it really shocked me.

Sundae: Right. But then you never know when you’re in a safe space with people, because you can’t know where they’re really at. And I mean, I can speak as a White woman. It’s like white supremacy and all of those contexts that you’re raised in; it’s pervasive, and you discover it as you wake up to it in more and more ways every day. Right? Even if you know a lot, it doesn’t mean that there aren’t things lurking inside. It’s an ongoing, lifelong process of deconstructing that in ourselves. It’s incredible. And this is why I think I felt so much respect for this topic, because in my work also, with my background in intercultural communication, it’s around seeing similarities and differences in people, breaking down prejudices and stereotypes. And that’s the human side. I would never think about getting involved in a project that had anything to do with technology, because that’s not my area, not my concern. But when I learned more about AI, I thought how important it is to have multidisciplinary, diverse teams in corporate settings and in education, because it has such pervasive human impacts.

Maikel: Definitely. And I think this is one of the things that I do in my work, because I work at EY, and one of the things that we do is look at what we call “trusted AI,” and we look at different factors. So we say, okay, the technical reliability is one part of it, but when it comes to the trust equation, whether people will trust the technology, there are also factors related to ethics, such as fairness and bias, values, morals, but also the social impact. For example, how will the technology maybe change the way that people work? Will they need to do different work? Will jobs disappear? All those kinds of questions. And accountability and explainability are very important, because no matter what the system decides, there needs to be a person or organization behind it that’s accountable, and the AI also needs to be able to explain its decisions. Especially when it’s crucial, because at the beginning we talked about how AI manifests.

If a system recommends a certain movie to you and it isn’t a movie that you really like, or recommends a song to you and when you hear it you’re like, “No, that’s not really a song that I like,” then the negative consequences are not that big. But for instance, if you use AI to determine whether someone will get a loan, or whether someone will get benefits because they have disabilities or are unemployed for whatever reason, and there’s then a bias in there, then of course the consequences are way bigger. So you need to be even more mindful there.
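For the loan example Maikel gives, one widely used first-pass fairness check, borrowed from US employment-discrimination practice, is the "four-fifths rule": flag the system when one group's approval rate falls below 80% of another's. This sketch uses made-up approval data (none of it from the interview) and is only one of many possible fairness criteria, not the one EY uses.

```python
# Disparate-impact check on loan approvals. 1 = approved, 0 = denied.
# Group names and decisions below are invented for illustration.

def approval_rate(decisions):
    # Fraction of applicants in a group who were approved.
    return sum(decisions) / len(decisions)

def passes_four_fifths(rates):
    # rates: dict mapping group name -> approval rate.
    # Passes when the lowest rate is at least 80% of the highest.
    lowest, highest = min(rates.values()), max(rates.values())
    return lowest >= 0.8 * highest

rates = {
    "group_a": approval_rate([1, 1, 0, 1, 1, 1, 0, 1]),  # 0.75
    "group_b": approval_rate([1, 0, 0, 1, 0, 0, 1, 0]),  # 0.375
}

print(rates)
print(passes_four_fifths(rates))  # False: 0.375 < 0.8 * 0.75
```

A failing check doesn't prove bias by itself; as Maikel stresses, it triggers exactly the human step he describes: someone accountable has to look at why the rates diverge and decide what to do about it.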

Also, when you assess, okay, how can you use technology responsibly, I think the negative impact it can have on people is something you really need to consider, because yeah, it’s very easy to arrive at a decision through such a system, but the consequences can be huge. And repairing those consequences for people can take quite some time. So it’s very important to be quite mindful of those things, but it really has to do with whether you care enough about people to really consider them, and to invest in trying to make sure that it actually does work well for all people. Because if you’re more like, “Okay, it works fine in 95% of the cases, then I’m cool,” no. That little 5% is so important, especially when it’s about people, and especially when it can really hurt people.

Sundae: It’s immense, it’s immense. How do you keep motivated to do what you do, Maikel?

Maikel: Well, I just always need to keep going. Regardless of the results of the work that you will see, it's like, okay, put the work in, trust the process. Try to at least keep pushing forward to move society forward in a better manner. And again, like I said, it's also just giving the signal that I don't accept the nonsense, even in the corporate world when I encounter certain stuff. It's like, "Okay, I'm just going to give you the signal. No, it's not okay, and I'm not the one to play with." I don't accept those things, and that doesn't necessarily mean that that will change the whole world or the corporate world that we live in, but I'm just going to keep giving a signal, because I think that's an important element. Because like I said, my family is originally from Suriname, but before that, we were of course originally from Africa, and we were enslaved there. And a key element of the whole enslavement is not just breaking you physically but also mentally. So for us today to still give the signal that, okay, no matter what the nonsense is that you try to put on us, you don't have me mentally. I'm still resisting and I'll still put you in your place.

And I think that aspect alone is, for me, very important, because in that sense, you can say no matter where you are, that internal you, that internal self, you still have power there and still have the resistance there. So I'm not going to let someone else try to define me. Even though that's kind of what we're faced with, the world around us trying to define us, I'm like, "No, I'm going to resist that." I know where I came from, and that's something that can keep me grounded. Kind of like an anchor; based on that, I can still find the strength to just keep going on.

And also just connecting with different people. And I can give a shout-out to you, because I've actually mentioned you to a lot of other people that I'm connected to. There is a clear difference I see between the interaction I have with you and the interaction I have with a lot of other people, a lot of other White people that I've encountered in the Netherlands, for instance. And that's where you see how racism manifests in many different ways, because you see it in how you treat another human being. When I look at how some people treat me, I see that there is a difference in how they treat White people and how they treat me. Not based on any actions, but you see there's a lack of respect or a lack of grace, or people just stare at you as if you were some strange creature. Or they want to comment about your hair, those kinds of things. With another human being that you actually see as a human being, you would not do that.

So for instance, I would never have the urge to comment about a person's skin color or hair texture or whatever. But still, I encounter a lot of people that think that it's a fun or okay thing to do, and then I'm like, "Okay, you really don't look at us as human beings." You look at us more as some kind of anomaly, or something strange. And that's something I see when I interact with you or with other people: that nonsense is not there. And that's how I know that there are so many layers to this whole racism, how it is internalized in people and how it can manifest. And this is also one of the most underrated ways it manifests. It's so blatantly obvious, because you don't treat people from your own group like that. But then you see people that look like me, and all of a sudden you believe that it's okay. And I'm like, "Yeah, it's not okay." And it's also important for me to put it out there that I'm definitely not going to take that.

But yeah, it's very telling. It really shows how much of it is internalized, and of course it is internalized, because in the same way there's a theory about post-traumatic slave syndrome: what you get as a result of being enslaved. Enslavement really traumatized people, and that's something that can be passed on from generation to generation.

Sundae: That's what's so daunting about it. I'm thinking about Resmaa Menakem's book, My Grandmother's Hands, where he talks about how there's 14, I think it's 14, generations of racialized trauma that stays in your body. And so it's not just about reading a book on anti-racism. It's like a physical practice, like meditation or yoga, and no one is exempt from it, right? Except maybe the people who have been oppressed and have done the work to work out the oppressor, that colonization of the mind. It's so pervasive, and when I think about the connection to AI, there's nothing more important than doing the work, because it literally has an impact on billions of people instantaneously.

Maikel: Yeah, exactly. And with AI, like I said, it can indeed have an impact on so many people, and often the people that develop the technology won't necessarily see the people that are impacted. So there's also a big gap there. If you directly saw all the bad consequences that your actions might have, it might deter you, but where we are now, that's not really the case. It's like everything's happening in some kind of a bubble, you could say. And the people that are affected might be miles away from you, maybe on a different continent, so you don't see it. And I think that's also an issue, because to be able to really empathize, it's important that you know what is going on and how your technology is being used, and that you can really see what it might be doing to people. That's true for AI but also true for a lot of other technologies; I think AI is just the latest iteration. And like what you said before, with the books that you read, all those examples: things were not tested on women, mostly just on men. Medication trials, for instance, often mostly included men, and mostly White men at that. So, as a result, certain medication might not work that well for women.

And for children, a lot of medication was never really tested on children. So they don't necessarily know how well it would work for them; it's like, "They're younger. Give them a half dose." But you don't really know for sure if they were really incorporated in the development and the testing. So in many different ways, even outside of AI, there's really this bias. And yeah, we often say it's really not only about the technology; you need to get different kinds of mindsets involved, because we know society. If people are honest and really acknowledge what's going on in society, then these biases aren't a surprise at all, because when you just look at history, it is racism. It is sexism. It's patriarchy. That is not something that is a total fantasy or something that people really don't know about. I'm like, "Just open your eyes." You can see it, but you also need to be willing to acknowledge it, because the whole acknowledgment part is really where people choose not to acknowledge, and choose to look the other way, instead of really addressing what's going on.

But yeah, I think as long as we as a society keep looking the other way, we will keep finding ourselves back in these difficult situations.

Sundae: Yeah, absolutely. So this leads me to one of my last questions on this topic, in light of the fact that only a handful of companies or leaders control our software companies. And with all of the other technology companies, we can pretty much guess the demographics of the ones who are holding the power. What can we do as just regular citizens? Where is our power in this?

Maikel: Okay, that's a good question, indeed. Of course, as regular citizens, we now have more opportunities to speak out against things, for instance by utilizing social media. But at the same time, the reality is, when we look at the technology, it's so pervasive. In a lot of cases you don't really have the option to opt out of things. But one thing that you can do, and it's important: don't just assume that because it's technology, it will be objective. Realize that the technology is really a reflection of the society that we live in, of the history, of historical human behavior, of present-day human behavior. And if you keep that in mind, that will also make you more critical, in realizing, "Well, maybe I cannot just trust the technology. It isn't just an objective thing. Maybe it does not work that well for me." And that will maybe also spur you to do your own research, to make sure that you are well informed yourself. So that is one thing you can do to at least try to empower yourself more. Because like you said, since you learned more about AI, you also see that it is indeed not just about technology. It's about so many other factors.

Sundae: Yeah, totally. For me, because I hold so many dominant identities, right? I'm White, I'm straight, I've got passport privilege. For me, what that says is, "Keep doing your work, Sundae, because there's so much to deconstruct," and keep having those conversations with my boys, keep having those conversations here and in community with the listeners. And that's what I know I can do. It also makes me think about how I can send a message, and that's with your money: which organizations you support, or speak out against, to hold leaders accountable. So I really want to keep this conversation going, also when this goes live, when we're sharing it on LinkedIn and other forums, to hear from other people if they have ideas on what we as individuals can do to work toward some of these better outcomes.

Maikel: Yeah. Just speaking out is important.

Sundae: Yeah, and it works. We've seen it work. That's one of the positive things about social media right now: when word spreads fast and people apply pressure, real change can happen. Do you mind, Maikel, if we switch to a little bit more of an individual focus on you? Because my whole theme is around Ambitious Transformation in Transition, I'd love to hear a little bit more before we close off. What are some of the transitions that you're feeling right now, whether they're global transitions or personal transitions? And tell us more about some of the transformations you're feeling.

Maikel: Yeah, well, I think one of the real transitions that I'm feeling is that earlier this year, a family member of mine actually passed away. Those kinds of transitions are part of life, but they do show you how things can suddenly end, also unexpectedly. And that's something I find difficult to really process, that reality that it can just end, and that really changes how you move forward. You might have plans for the stuff that you're going to do, but it really shows that it's important to spend time together with family, because it can just be over like that. So I think that kind of transition has been on my mind a lot, because it was also still quite recent.

And another transition, I think, is the way the working world is changing, because of course during the pandemic we were all working from home, and the question is how it will go moving forward. That's a transition that I'm also in, to see what would work for me. But at the same time, I also realize that it's quite a luxury I'm in, because I was able to work from home and it worked fine, no problem. For other people, work just stopped. So it's also inequity again, because the work that I do, that's fine, you can just do it from home, but other people really struggled through the pandemic work-wise. So how that will all play out, that full transition, is something that's been on my mind. For me personally, I'm just trying to find a balance in there. Working from home has been very pleasant for me, and that also has to do with the fact that the microaggressions and all those kinds of things that we talked about, I'm not being confronted with them in that environment all week, all day. So transition-wise, I'm definitely going to maintain a part of what I've now been able to build up during this pandemic, because in a certain way it's been a more healthy environment for me. So yeah, those are some of the transitions that have been on my mind.

Sundae: Thank you for saying that, because that's one of the reasons why it's so important to me to make this transparent. Someone else that might be working with you has no idea that you're in a process of grief, and might not have an identity where they even think about the microaggressions that others have to face when they're just on the tram, right? I'm committed to bringing this idea of straight talk and being a full human, and this is part of you being a full human. Not just an employee at a big fancy company. And when we start to see each other's humanity that way, I think that helps us connect in new ways. So thanks for sharing that.

Maikel: You’re welcome.

Sundae: What do you have going on right now that you would consider ambitious? And remember, ambitious is defined by you and is not influenced by external scope or scale. So it could be just clean laundry. What's ambitious for you right now?

Maikel: Ambitious is basically just living my life authentically. Just being me. Because you hear a lot about showing up authentically; well, the reality is that it's not necessarily appreciated. But I'm like, okay, "I'm just Maikel in whatever space." So you know me from LinkedIn, but the way that I show up there is the way that I present everywhere. The way I speak out is just me, consistently, in all areas. So that's my ambition: to maintain that consistency and just keep it real all the time. And I think it's important, because that's again also giving the signal to the outside world: "With me, you don't have to question where I stand. I make it very clear." So you don't have to guess how I view things, and you might not like it, but that's for you to deal with, because I'm not responsible for your emotional reaction to it. I'm just going to keep it real. And that's my ambition: to just keep it real, keep going the way that I've been going, and speak my mind about the things that really matter. Because I have my job and I do my work, but ultimately, for me, that's not the most important thing. Like we talked about, human rights issues, that is way more important, and I'm not going to pretend like I'm just making my money and everything is going fine. When in our part of the world we see that people are being killed or brutalized or whatever is happening, it would be very hypocritical of me to not care about that or not speak about that.

Because I have been approached by people who say, "Aren't you concerned that people might say something about the things that you say here?" And I'm like, "Well, I don't lie. I just tell the truth, just the facts, nothing more than that." And I also know when I'm on the right side or not, because with a lot of these issues, it's not even very difficult. When it comes to racism, police brutality, sexism, misogyny, it's not like it's very difficult to decide what is right or wrong there. So when I speak out, there's no real concern, because I know which side is right and which is wrong. It's not very difficult.

It's also like with the #MeToo movement. I remember that people were saying, "We cannot say anything right now," and I'm like, "No, you never could say those things." It's not really very difficult. We're talking about certain extreme behaviors that were, okay, maybe tolerated. But even though they were tolerated, you already knew that they were not okay. That's also true for slavery and racism. Also 200 years ago, they knew that it was not okay, but they twisted everything they could to make it seem okay. So they used the law, like Dutch law, which said that we were not humans. They used religion also to try to justify it. And the fact that they spent so much energy on justifying it, or what we now face in America, with the push to try to erase the history, that already shows you that, actually, what we did was very bad. So bad that we really need to twist everything as much as we can, to make it go away, because we know how bad it actually really is.

Sundae: Right. Exactly.

Thank you for being part of this conversation. It has meant so much to me and you’ve represented this topic with such clarity and integrity. I just want to say thank you for being here.

Maikel: Okay. Thank you. I appreciate it. And I also really appreciate our connection. I think we are clearly able to fight on the same frequency on many things. So it has been wonderful connecting with you, and I also appreciate you giving me this opportunity to spend some time with you on your podcast.

Maikel gave us so much to think about. I'm so grateful for the time he shared with us, and it left me with a lot of clarity on things that were quite fuzzy before, but a lot of new questions as well. I'm grateful that this is a conversation that we can have together. So if you're listening to this and see this on social media, chime in with your questions or insights, because this is an important conversation for us to be having. You've been listening to IN TRANSIT with Sundae Bean, steady advice in an unsteady world. Thank you for listening.

Maikel inspired me and left me with some hope, because he is so committed to making change. And that is why I chose to leave you with the words of the poet Vironika Tugaleva: "By serving humanity, I automatically serve myself."

Enjoy The Show?

  • Don’t miss an episode, subscribe via iTunes or RSS.
  • Please leave us a review in iTunes (or here for Android).

The post 270: Accountable AI with Maikel Groenewoud appeared first on Sundae Schneider-Bean, LLC.


Today, Maikel reveals the dire consequences that occur when human biases from the individual or societal level bleed into data programming and manifest in AI. His mission is to unite people to fight on the same frequency and ensure the industry truly has a broad perspective that’s fair for all.

What You’ll Learn in this Episode:

  • Why 95% doesn’t cut it
  • When skin color determines loan rates
  • Ways data reflects society & human nature
  • AI’s responsibility to explain decisions
  • Voicing support with your wallet

Listen to the Full Episode

Featured on the Show:

The IN TRANSIT Hub is where our global community leans on each other for support, and we want you with us. There’s a lot of really hard stuff going on right now in the world, and we’re talking about all of it right here.

Catch These Podcasts / Articles:

We’re delighted to be in the Top 5 of the global Best 30 Expat Podcasts!

Subscribe: iTunes | Android

Full Episode Transcript:

Download Transcript

Hello, it is 9:00 am in New York, 4:00 pm in Johannesburg, and 9:00 pm in Bangkok. Welcome to IN TRANSIT with Sundae Bean. I am an intercultural strategist, transformation facilitator, and solution-oriented coach, and I am on a mission to help you adapt & succeed through ANY life transition.

In this week's episode, we're going to talk about global transitions, and I have a hunch this is not going to be a stretch for you. Think of all the global transitions we've been feeling over the past two years: the pandemic, the resulting travel restrictions, health risks, and more have profoundly impacted all of us. Even this week as I'm recording, we're seeing another ripple across the world with the Russia-Ukraine conflict. What we don't think about is the subtler things that impact us. The things that emerge with seeming neutrality that suddenly have a profound grip on our lives. And today, we're going to talk about one of them.

AI, or artificial intelligence, is, to be very honest, something I used to relegate to Sci-Fi movies. Maybe I've seen a family film like WALL-E and thought about that adorable robot, but when we really think about artificial intelligence and how it's taking over so much of what we're doing, we see how it comes quite close to our lives. Think about the AI that is built into the palm of your hand, the ways in which you rely on it as you just use your cell phone. Stephen Hawking said, "AI is likely to be either the best or worst thing to happen to humanity." Right? Right now, I need some hope, so I am hoping it is the best thing that's going to happen to humanity. But what I know after this interview that you'll hear today is that it's not going to happen on its own.

What we're learning about AI is that its scope and power are far-reaching. An article from VentureBeat puts it this way: "The far-reaching scope and power of Machine Learning, ML, and Artificial Intelligence, AI, means that gender and racial bias at the source is multiplied to the nth power in businesses and out in the world." I mean, we were busy enough with human bias. And now we're going to have to worry about bias programmed into our technology? The answer is, "Absolutely, yes." This is serious. It is so serious because the scope, as you've seen, is magnified, and we can't directly see it.

So in this episode, we're going to look at the intersections between AI and DEI. And to help me do that, we've invited Maikel Groenewoud.

Maikel specializes in data science and artificial intelligence (AI). He performs technical implementations and also has a particular focus on trust and ethics as it relates to the technology.

He studied business mathematics and informatics in Amsterdam, nowadays known as business analytics, and began his career at a policy research institute where he worked for over 10 years. He has been working at Ernst & Young Netherlands since 2016 as a manager within the consulting practice, where he leads the data science and AI capability.

On a personal note, Maikel was born and raised in Amsterdam and is of Afro-Surinamese descent. Suriname is a country in South America and a former colony of the Netherlands, and he has a lot to tell about that history and how it relates to the present. He has what is called a Pan-African mindset and is very much here to aid in healing the Souls of Black Folk, he says. In today's episode, you will also hear how his work has the power to heal but also to prevent harm.

He says, "My background as a researcher always comes in handy. I'm critical and analytical, but I also have an eye for the human side." That's exactly what caught my attention about his work. He combines his interest and know-how in history, lived experience, and professional background to make a difference for all of us in this all-important global transition. Stick around to the end, because this is our very first interview since I unveiled ATT: Ambitious Transformation in Transition, and Maikel is the first to answer a set of questions I have about it. This will show you exactly why I believe in this approach so much. I'll let Maikel help you understand AI more. We know what the definition is, but what's behind it?

But before we dive in, I want to briefly make sure we're on the same page about DEI. DEI, Diversity, Equity, Inclusion, is something that we've been talking a lot about, and I want us to have this shared context. According to Seth Boden from harvardbusiness.org, "Diversity refers to anything that sets one individual apart from another, including the full spectrum of human demographic differences, as well as different ideas, backgrounds, and opinions people bring." All right, so when we say that, that's what we're looking at. The E in DEI stands for Equity: "Fair treatment for all while striving to identify and eliminate inequities and barriers." And, "Inclusion implies a cultural and environmental feeling of belonging and a sense of uniqueness. It represents the extent to which employees feel valued, respected, encouraged to fully participate, and able to be their authentic selves." So I wanted to make sure we're on the same page with that, thanks to Seth Boden from harvardbusiness.org.

In this interview, we also talk about white supremacy. And if you are from my generation, when you hear white supremacy, you may automatically think of the KKK in white hoods. But if you've been following the discussion around racism, equity, inclusion, and diversity, you know the term has been changing over the years. Nobel Peace Prize nominee Barbara Smith talks about how we now use it in context. In her piece in the Boston Globe, she writes that "systemic racism" clearly conveys the pervasiveness of racial oppression, but goes on to say that "white supremacy goes further by indicating that there is a rigid nexus of power that protects it and reinforces it." So when we hear that conversation, it's talking about something more subtle, something more common even, and maybe not intentional or even obvious, right? I want to make sure we are on the same page with that as well. All right, let's get started and hear from Maikel.

Sundae: All right, Maikel. I am so excited to welcome you to IN TRANSIT. Thank you for being here.

Maikel: Yes. Thank you, Sundae, and I'm very pleased to be here. Looking forward to chatting with you.

Sundae: So it's been a long time coming. You and I were talking before we hit record: we've known each other over LinkedIn for at least two years, so it's exciting to be able to connect in this way. And I'm so lucky to have been able to follow your work, because it has truly opened my eyes to the connection between my work, data collection, and AI in the world. So I'm excited to share this with our listeners today, and to do so, as I was saying to you before, with, how should I say it? There's so much responsibility in getting this topic out, because it has such a profound impact on all of our lives. And that's not something I really realized. So before we begin, can you please help those of us who are not in AI understand just generally what it really is, and where we come in contact with it in our everyday lives?

Maikel: Yeah, of course. So AI is short for Artificial Intelligence, and that term kind of suggests real intelligence. What I often tell people is that these systems aren't actually intelligent. What they do is work with data. They might look at data that has been collected, and they might be able to detect patterns in there, and based on those patterns, based on historical data, the system might be able to predict, to try to determine, "Okay, how does this work?" and, based on that data, what will be the best decision to make, or the best recommendation to provide. But it isn't so that the system really understands what's going on. So how does it manifest? Think of speech recognition systems, where you can talk to your computer or to another device, or image recognition. Those are examples of how AI manifests.

But it also manifests in a lot of different things, like our mobile phones and all the stuff we have on there. In so many ways, in many different areas, you see AI in it. You can say that it makes life easier in a way, because these systems allow us to do certain things more efficiently, or allow services to be tailored specifically to a person, based on maybe your preferences, your previous history, other characteristics you have, or maybe characteristics of people that have a similar profile to you. And then based on all that data, the system can look at, "Okay, what might be the best fit for you when it comes to services or products or other things you might be interested in?"

So that's a quick introduction to some of the ways you see this technology manifesting, and it's quite broad, because you see it in many different areas, also many areas that you don't think about. Like the software that we have on our computers: the word processing, the systems that we use to create presentations, etc. All that software will also have AI incorporated to make stuff work more smoothly for us. So it's many different areas.
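The "people that have a similar profile" idea Maikel describes can be sketched in a few lines. This is a toy illustration with invented usernames and songs, not anything from the episode or any real recommender: suggest items liked by the user whose listening history overlaps yours the most.

```python
# Toy listening histories (invented for illustration).
history = {
    "ana":  {"song1", "song2", "song3"},
    "ben":  {"song2", "song3", "song4"},
    "cruz": {"song7", "song8"},
}

def recommend(user):
    others = [u for u in history if u != user]
    # "Similar profile" here is simply the largest overlap in history.
    nearest = max(others, key=lambda u: len(history[user] & history[u]))
    # Recommend what the nearest neighbor liked that you have not heard yet.
    return history[nearest] - history[user]

print(recommend("ana"))  # {'song4'}: ben is most similar, and he also liked song4
```

Real systems use far richer features, but the core move, inferring your taste from people whose data resembles yours, is the same, which is exactly why the data they learn from matters so much.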

Sundae: And so for me, as I've done a little look into it, I realized it goes way beyond what I had in mind, those futuristic Sci-Fi movies, into real practical things that are in everyday life. You said that the intention is to make our lives easier. But what I've learned, looking at it with my naive view of, "Hey, this is fantastic, we have data-driven ways to do new things," because it feels like it actually brings some neutrality to our human experiences, is that actually, it's the opposite. Whatever bias we have as individuals gets programmed into the data, and then it manifests at scale.

So can you say a little bit more about the intersection between AI, Artificial Intelligence, and DEI, this awareness that more people are having that we need to build in processes of Diversity, Equity, Inclusion, in all of our things?

Maikel: Yeah, of course. It's very true that the idea might be that decisions can be made more objective. But the reality is that specifically when it's about people, when the systems are trained on data that comes from people or is about people, what we then see is that the systems basically just reflect what is in the data, and the data is a reflection of society, of human nature. So for instance, if people have certain biases, that's also reflected in the data, and then the system learns that from the data. Another way it shows up is that the systems can only learn from the data that is presented to them. So for instance, if only one group of people is used to train a system, then that means the system basically only learns, only gets an understanding of, that group of people, and that can then mean that if you apply it to other people, it doesn't work that well.

So for instance, if you only build a system based on men, it could be that it doesn’t work that well for women. Or if you train a system based only on White people, only on White men, it may only work for that group. Those are things to think about, and that’s just one side of it, the input data side. But regardless of what data you put into it, the humans developing or monitoring the systems also need to supervise them and say, “Okay, I know the society we live in. I know the biases that exist there. So based on that, I need to make sure I check the system and check whether what I’ve built actually works for different groups.” That is something you need to do yourself as a human, regardless of what data you put in. You need to check for it, and act on it if it doesn’t work for different groups.

And then you asked about the intersection of AI and DEI. It shows up in many different ways. A lot of it is about looking at the people developing the technology: do they have diverse backgrounds and different ways of thinking? So that you can incorporate different perspectives, not just theoretically but by really including people from those different groups. If you make a system for all people, it cannot be that only men are developing it; you also need to include women. And if your system is for everyone, you can’t have just one demographic, one ethnicity, in there. You need to incorporate other ethnicities too, to increase the likelihood that you get a truly broad perspective.

Sundae: But let’s hold on a second. This is about more than just a broad perspective or something functioning well. This actually has life-or-death implications.

Maikel: Truly. Yeah.

Yeah. So one example that I often talk about is oxygen meters, or pulse oximeters. They are little devices that are clipped onto people’s fingers and used to estimate the oxygen levels in their blood. The problem is that they don’t necessarily work that well for people with darker skin. What they do is overestimate the level of oxygen in the blood, and as a result of that overestimation, those people only get the help they need later. So only later might they get extra oxygen or get put on a ventilator. That’s extra critical right now, because we’ve been living in a pandemic these last few years, and oxygen levels are very important there: one of the things you saw was people getting to the hospital needing extra oxygen or a ventilator. But if the oximeter overestimates your blood oxygen levels, you have a big problem.

And the thing is, it’s already bad enough that it’s like that, but the research showing this has actually been available for quite a long time. The research was already done 30, 40 years ago, and it already said, okay, these devices don’t work well on people with darker skin. The researchers also recommended that the manufacturers incorporate more data from Black people and other people with darker skin, so they could recalibrate the algorithms and get proper readings for them. But during the pandemic it was established that these devices still don’t work that well. So the UK is doing a review, as are their US counterparts.

And the original researchers, who first published their article in 1990, published a new article at the beginning of this year. In it, they said they have seen no evidence that the manufacturers, in all those years, actually implemented the recommendation to gather data from people with darker skin so the algorithms could be recalibrated. So yes, like you said, it’s definitely a case of life and death here. But it’s also important for me to note something: people talk a lot about the technology, and of course, because I work in the technology field, I get asked first of all about the technology. But what I also tell people is that this is not really a technology issue.

It’s really about people, because ethical AI really requires ethical humans. And in this case, if you’ve known for 30 years, that’s not a technology problem; it’s a human decision problem. They know about it and they make the conscious decision to say, “Okay, for me, it’s not important enough to make that change.” So you just leave the technology as it is, with all the dire consequences for other people. So yeah.

Sundae: It’s unbelievable. What, 30 years go by and nothing has happened? And I was listening to an interview that you did, and there was a caller that came on and it broke my heart. She’s like, “What do I tell doctors when I go in and they take my oxygen, and I’m having an asthmatic attack, my son is having an asthmatic attack, and they say my oxygen is fine?” So she’s not only experiencing, and I wouldn’t even call it a microaggression here, disbelief from a human who is not believing her case, but it’s also built into the technology. Right? And it’s incredible. It’s incredible. That’s massive. I just want to do a shout-out to Caroline Criado Pérez. She has a book called Invisible Women, about exposing data bias in a world designed for men, where she talks about how that also has health implications for women who are, let’s say, in a car crash, because the seat belts were designed for men and not for women. So it’s huge.

So doesn’t this all start with who’s getting trained at the university level to be in the data industry? Or is it much bigger than that, I’m guessing.

Maikel: Yeah, it’s certainly a part of it. For instance, if people are trained to only look at the technical aspect, you might say, “Okay, I’ve made a system and it has very high accuracy overall.” So you might think, “Okay, I’m good, I’m fine, everything works properly.” But they also need to be aware that you have to look at the different subgroups: “Okay, does this also work well for all the different groups that are in there?” And if that’s not the case, you might need to say, “Okay, maybe I need a new or different AI system, or I only use it for the specific groups for which I’ve seen it works well, and for other groups I do something else.” And if you would use this to diagnose people, to take another healthcare example, you can say, “Okay, it has mostly been tested only on White people, so I know that it works for White people and I can use it there. But in other cases, based on research, it might be better to use another device,” or another way to determine whatever it is you want to diagnose.
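The subgroup check Maikel describes can be sketched in a few lines. The records and group labels below are entirely made up for illustration: a model that looks accurate in total while failing completely for a smaller group.

```python
# Hypothetical evaluation records: each has a model prediction, the true
# label, and a demographic group tag. All values here are illustrative.
records = (
    [{"group": "A", "pred": 1, "true": 1}] * 9   # majority group: all correct
    + [{"group": "B", "pred": 1, "true": 0}]     # minority group: all wrong
)

def accuracy(rows):
    """Fraction of rows where the prediction matches the true label."""
    return sum(r["pred"] == r["true"] for r in rows) / len(rows)

overall = accuracy(records)  # high on aggregate

# Disaggregate: compute accuracy separately for every group present.
per_group = {
    g: accuracy([r for r in records if r["group"] == g])
    for g in sorted({r["group"] for r in records})
}

print(f"overall: {overall:.2f}")
for g, acc in per_group.items():
    print(f"group {g}: {acc:.2f}")
```

The point is the disaggregation step: a single headline accuracy number hides exactly the failure that the per-group breakdown exposes.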

It is also important to say that it’s not as though you always need to use the technology. It’s a tool you can use, but if there is a certain imbalance in there, you can also make an alternative decision. But the problem I have with all these technology discussions is that there’s a lot of deflection to the technology, a lot of pointing at the technology as if that’s the big issue. When all the things we talk about have everything to do with human nature. The technology reflects human nature, but that’s only one of the issues, because whatever the data is, as humans you can still make the decision: “Am I going to check for it? Am I going to leave it in place or not? Am I going to make different decisions with it? Or am I just okay with whatever is in there?”

And that really has to do with the ethical behavior that needs to come from humans. I think that’s the big thing we need to solve, because with ethics and AI, like I said, the term Artificial Intelligence kind of suggests real intelligence, but the systems aren’t really intelligent. So real intelligence, and real ethical and moral behavior, need to come from humans, and they need to step up there. We can just look at the whole world around us, everything that goes on; it’s very clear, very obvious, time and time again, that humans and ethics is not a problem we’ve really solved yet. But that’s where we really need to start if we want to solve the problem of ethics and AI.

Sundae: Absolutely. Maikel, I don’t really hear a lot of hope in your voice. Tell me where I’m wrong.

Maikel: Well, I’m just realistic, and it also comes from my personal background. Even though I work in technology, I love history more, and when I look at history, you see the same things repeating all the time, and you see how people simply refuse to really acknowledge history. For instance, I know that you’re in South Africa, and I know that South Africa also has a brutal history, but one thing that I really do appreciate about that country is that they had truth and reconciliation. You can say whatever you want about that process, but it does at least provide a way forward, a way to try to heal and come together. But when I look at European countries, and when I look at America, I don’t see truth and reconciliation. I also don’t see any reparations, and that’s an important discussion too. With no reparations, no truth, no reconciliation, how can you really move forward in a constructive manner?

I think we can definitely make progress, but there needs to be a willingness, and that willingness has very much been lacking. Like I said, South Africa’s truth and reconciliation, I think that’s a good model, but I have not seen it in Europe, and I have not seen it in America. If you really want to move forward and really heal, then we need to be able to have truth and really acknowledge it. And if not, then we’ll keep having the same issues over and over again. That’s what I think.

Sundae: Yeah, we’ve seen that for 450 years in the US, for sure. And you said truth, and at the same time you talked about acknowledging it, acknowledging the truth. We can hide it under the covers, and we won’t work on it if we don’t acknowledge it. And you and I have had private conversations about how in Europe there is often a denial of the level of racism that goes on. It’s not a public conversation; you only see it when you’re in private conversation with people who are directly impacted.

Maikel: Exactly.

Sundae: And it’s subtle, right? It’s so much more subtle and implicit, and that’s almost scary, because then you can’t see it.

Maikel: Exactly, yeah. And I can give an example, because of course I work at a big corporate. It’s something I experienced a few years ago that was very interesting. My family is from Suriname, that’s how you pronounce it in English, a country in South America that used to be a colony of the Netherlands, from when the Netherlands was very active in slavery. That’s where they enslaved Africans, in Suriname, one of the colonies. The interesting thing happened when I was sitting in the cafeteria, just chatting with some colleagues. We were joking around, nothing serious, just talking about all the different ways people can speak a language, not disrespecting anyone. But then someone came in, a senior manager actually, and she clearly misread the room, because what she said, in an irritated voice, was that Surinamese people do not speak Dutch in a proper way. That was what she said, and I looked at her and corrected her on the spot. I said, “Okay, what you said is not correct; there are different forms of Dutch.”

You have the Dutch we speak here in the Netherlands, the Flemish Dutch they speak in Belgium, and the Dutch we speak in Suriname. They are different forms, but it doesn’t mean one is better or worse than another. The interesting thing was that of course she was wrong in what she said, but also that she had clearly gotten so comfortable with me that she forgot who she was sitting next to: I was actually one of the people she had just disrespected, because she disrespected the Surinamese people as a whole. So that’s how it comes out. Those moments are not pleasant, but at the same time they really show me a person, because she was so comfortable that she showed her real side, and I was like, “Okay, now I know who I’m dealing with.”

And this is one example, but with the same person I had another example, again with language, when a South African was there. Of course, in South Africa you have Afrikaans, a language that basically derives from Dutch. Her comment about Afrikaans was that it’s a very cute language, and she talked about it with affection, and I was like, “Okay, interesting.”

“So the Surinamese Dutch that the Black people speak in Suriname, that you criticize? But Afrikaans, spoken there by the so-called settlers who actually introduced the brutal apartheid regime, you talk about fondly?” Those kinds of things. And this is another example of how subtle it is. This is also a person who presented herself as very progressive, but these are the things she says. So yeah, this is one example, but it’s one I clearly remember because it really shocked me.

Sundae: Right. But then you never know when you’re in a safe space with people, because you can’t know where they’re really at. And I can speak as a White woman: white supremacy and all of those contexts you’re raised in, it’s pervasive, and you discover it in more and more ways every day as you wake up to it. Even if you know a lot, it doesn’t mean there aren’t things lurking inside. It’s an ongoing, lifelong process of deconstructing that in ourselves. It’s incredible. And this is why I feel so much respect for this topic, because in my work, with my background in intercultural communication, it’s all about seeing similarities and differences in people, breaking down prejudices and stereotypes. That’s the human side. I would never have thought about getting involved in a project that had anything to do with technology, because that’s not my area, not my concern. But when I learned more about AI, I realized how important it is to have multidisciplinary, diverse teams in corporate settings and in education, because it has such pervasive human impacts.

Maikel: Definitely. And I think this is one of the things I do in my work, because I work at EY, and one of the things we do is look at what we call “trusted AI.” We look at different factors. Technical reliability is one part of it, but when it comes to the trust equation, whether people will trust the technology, there are also factors related to ethics, such as fairness and bias, values, morals, but also the social impact. For example, how will the technology change the way people work? Will they need to do different work? Will jobs disappear? All those kinds of questions. Accountability and explainability are also very important, because no matter what the system decides, there needs to be a person or organization behind it that’s accountable, and the AI also needs to be able to explain its decisions. Especially when it’s crucial, because at the beginning we talked about how AI manifests in everything.

If a system recommends a movie to you that you don’t really like, or recommends a song and when you hear it you’re like, “No, that’s not really a song I like,” then the negative consequences are not that big. But if you use AI to determine whether someone will get a loan, or whether someone will get benefits because they have disabilities or are unemployed for whatever reason, and there’s a bias in there, then of course the consequences are way bigger. So you need to be even more mindful there.

Also, when you assess how you can use technology responsibly, the negative impact it might have on people is something you really need to consider, because it’s very easy to arrive at a decision through such a system, but the consequences can be huge, and repairing those consequences for people can take quite some time. So it’s very important to be mindful of those things. But it really comes down to whether you care enough about people to consider them, and to invest in making sure the technology actually works well for all people. Because if you’re more like, “Okay, it works fine in 95% of the cases, then I’m cool,” no. That remaining 5% is so important, especially when it’s about people and especially when it can really hurt people.
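The arithmetic behind that 95% figure is worth making explicit. With made-up numbers, not real data: a model that is perfect for the majority group and useless for a 5% minority still reports 95% accuracy overall.

```python
# Illustrative arithmetic only: perfect for a 95% majority, always wrong
# for a 5% minority, yet the headline number still reads 95%.
majority_size, minority_size = 9500, 500
majority_correct = majority_size   # 100% correct for the majority group
minority_correct = 0               # 0% correct for the minority group

total = majority_size + minority_size
overall_accuracy = (majority_correct + minority_correct) / total
minority_accuracy = minority_correct / minority_size

print(f"overall accuracy:  {overall_accuracy:.0%}")   # looks fine on paper
print(f"minority accuracy: {minority_accuracy:.0%}")  # the people it fails
```

This is why an aggregate score alone can never certify that a system is safe to deploy for everyone.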

Sundae: It’s immense, it’s immense. How do you keep motivated to do what you do, Maikel?

Maikel: Well, I just always need to keep going. Regardless of the outcomes or results you will see, it’s about putting in the work, the process, and trying to keep pushing society forward in a better manner. And like I said, it’s also about giving the signal that I don’t accept the nonsense, even in the corporate world, when I encounter certain things. It’s like, “Okay, I’m just going to give you the signal that no, it’s not okay, and I’m not the one to play with.” I don’t accept those things. That doesn’t necessarily mean it will change the whole world, or the corporate world we live in, but I’m going to keep giving that signal, because I think that’s an important element. Like I said, my family is originally from Suriname, but before that we were of course originally from Africa, and we were enslaved there. A key element of enslavement is breaking you not just physically but also mentally. So for us today, it’s about still giving the signal that, no matter what nonsense you try to put on us, you don’t have me mentally. I’m still resisting, and I’ll still put you in your place.

And I think that aspect alone is very important for me, because in that sense, no matter where you are, that internal you, that internal self, still has power and still has resistance. I’m not going to let someone else define me. Even though that’s kind of what we’re faced with, the world around us trying to define us, I’m like, “No, I’m going to resist that.” I know where I came from, and that keeps me grounded, kind of like an anchor. Based on that, I can still find the strength to just keep going.

And also just connecting with different people. I can give a shout-out to you here, because I’ve actually mentioned you to a lot of other people I’m connected to. There is a clear difference between the interaction I have with you and the interactions I have with a lot of other White people I’ve encountered in the Netherlands, for instance. You see how racism manifests in many different ways, in how you treat another human being. When I look at how some people treat me, I see a difference between how they treat White people and how they treat me. Not based on any actions, but you see a lack of respect, a lack of grace, or people just stare at you as if you were some strange creature, or they want to comment about your hair, those kinds of things. With another human being that you actually see as a human being, you would not do that.

For instance, I would never have the urge to comment on a person’s skin color or hair texture or whatever. But still, I encounter a lot of people who think it’s a fun or okay thing to do, and then I’m like, “Okay, you really don’t look at us as human beings.” You look at us more as some kind of anomaly, something strange. And that’s something I notice when I interact with you or with other people: that nonsense is not there. That’s how I know there are so many layers to this whole racism, how it is internalized in people and how it can manifest. This is also one of the most underrated ways it manifests. It’s so blatantly obvious: you don’t treat people from your own group like that, but then you see people who look like me and all of a sudden you believe it’s okay. And I’m like, “Yeah, it’s not okay.” It’s also important for me to put it out there that I’m definitely not going to take that.

But yeah, it’s very telling; it really shows how much of it is internalized. And of course it is internalized, in the same way that there’s a theory about post-traumatic slave syndrome: what you get as a result of being enslaved. It really traumatized people, and that trauma can be passed on from generation to generation.

Sundae: That’s what’s so daunting about it. I’m thinking about Resmaa Menakem’s book, My Grandmother’s Hands, where he talks about, I think it’s 14 generations of racialized trauma that stays in your body. So it’s not just about reading a book on anti-racism. It’s a physical practice, like meditation or yoga, and no one is exempt from it, except perhaps the people who have been oppressed and have done the work to root out the oppressor within, that colonization of the mind. It’s so pervasive, and when I think about the connection to AI, there’s nothing more important than doing the work, because it literally has an impact on billions of people instantaneously.

Maikel: Yeah, exactly. And with AI, like I say, it can indeed have an impact on so many people, and often the people who develop the technology won’t necessarily see the people who are impacted. So there’s a big gap there. If you directly saw all the bad consequences your actions might have, it might deter you, but where we are now, that’s not really the case. Everything happens in a kind of bubble, you could say, and the people who are affected might be miles away from you, maybe on a different continent, so you don’t see it. And I think that’s an issue, because to really empathize, it’s important to know what is going on, how your technology is being used, and what it might be doing to people. That’s true for AI, but also for a lot of other technologies; AI is just the latest iteration. Like what you said before, in the books you read, all those examples, things were not tested on women, mostly just on men. The same with medication: in trials it was mostly men, and mostly White men, who were included. As a result, certain medication might not work that well for women.

And children, for instance: a lot of medication was never really tested on children, so they don’t necessarily know how well it will work for them. It’s like, “They’re younger, give them a half dose.” But you don’t really know for sure if they were really incorporated in the development and the testing. So in many different ways, even outside of AI, there’s this bias. And like we often say, it’s really not only about the technology; you really need to get different kinds of mindsets involved, because we know our society. If people are honest and really acknowledge what’s going on in society, then these biases are no surprise at all, because when you look at history, it is racism, it is sexism, it’s patriarchy. That is not some total fantasy or something people really don’t know about. I’m like, “Just open your eyes.” You can see it, but you also need to be willing to acknowledge it, because the whole acknowledgment part is really about that: people choose not to acknowledge, and choose to look the other way, instead of really addressing what’s going on.

But yeah, I think as long as we as a society keep looking the other way, we will keep finding ourselves back in these difficult situations.

Sundae: Yeah, absolutely. So this leads me to one of my last questions on this topic. In light of the fact that only a handful of companies or leaders control our software and all the other technology companies, and we can pretty much guess the demographics of the ones holding the power: what can we do as regular citizens? Where is our power in this?

Maikel: Okay, that’s a good question indeed. Of course, as regular citizens we now have more opportunities to speak out against things, for instance by utilizing social media. But at the same time, the reality is that the technology is so pervasive that in a lot of cases you don’t really have the option to opt out. One important thing you can do, though, is not assume that because it’s technology, it will be objective. Realize that the technology is really a reflection of the society we live in, of history, of historical human behavior and present-day human behavior. If you keep that in mind, it will make you more critical, realizing, “Well, maybe I cannot just trust the technology. It isn’t just an objective thing. Maybe it does not work that well for me.” And that will maybe spur you to do your own research, to make sure you are well informed yourself. So that is one thing you can do to at least try to empower yourself more. Because like you said, since you learned more about AI, you also see that it’s indeed not just about technology. It’s about so many other factors.

Sundae: Yeah, totally. For me, because I hold so many dominant identities, right? I’m White, I’m straight, I’ve got passport privilege. What that says to me is, “Keep doing your work, Sundae, because there’s so much to deconstruct,” and keep having those conversations with my boys, keep having those conversations here, in community with the listeners. That’s what I know I can do. It also makes me think about how I can send a message, and that’s with your money: which organizations you support, or speak out against, to hold leaders accountable. So I really want to keep this conversation going when this goes live, when we’re sharing it on LinkedIn and other forums, to hear from other people if they have ideas on what we as individuals can do to work toward better outcomes.

Maikel: Yeah. Just speaking out is important.

Sundae: Yeah, and it works. We’ve seen it work: when something spreads fast and people apply pressure, and that’s one of the positive things about social media right now, real change can happen. Do you mind, Maikel, if we switch to a bit more of an individual focus on you? My whole theme is around Ambitious Transformation in Transition, and before we close off, I’d love to hear a little more. What are some of the transitions you’re feeling right now, whether global or personal, and tell us about some of the transformations you’re feeling.

Maikel: Yeah, well, I think one of the real transitions I’m feeling is that earlier this year, a family member of mine passed away. Those kinds of transitions are part of life, but they show you how things can suddenly end, unexpectedly. That’s something I find difficult to process, the reality that it can just end, and it really changes how you move forward. You might have plans for the things you’re going to do, but it shows you it’s important to really spend time with family, because it can just be over like that. So that kind of transition has been on my mind a lot, because it was quite recent.

Another transition is the way the working world, at least our working world, is changing, because during the pandemic we were all working from home, and now the question is how it will go moving forward. That’s a transition I’m also in, seeing what would work for me. At the same time, I realize it’s quite a luxury, because I was able to work from home and it worked fine, no problem, while other people’s work just stopped. So it’s also inequity again: with the work I do, it’s fine, I can just work from home, but other people really struggled through the pandemic work-wise. How that will balance out is a full transition in itself that’s been on my mind. For me personally, it’s about finding a balance in there. The work-from-home world has been very pleasant for me, which also has to do with not being confronted all week, all day, with the microaggressions and those kinds of things you talked about. So transition-wise, I’m definitely going to maintain a part of what I’ve been able to build up during this pandemic, because in a certain way it has been a healthier environment for me. So yeah, those are some of the transitions that have been on my mind.

Sundae: Thank you for saying that, because that’s one of the reasons it’s so important to me to make this transparent. Someone working with you might have no idea that you’re in a process of grief, or might not have an identity where they even think about the microaggressions others have to face just riding the tram, right? I’m committed to bringing this idea of straight talk and being a full human. This is part of you being a full human, not just an employee at a big fancy company, and when we start to see each other’s humanity that way, I think it helps us connect in new ways. So thanks for sharing that.

Maikel: You’re welcome.

Sundae: What do you have going on right now that you would consider ambitious? And remember, ambitious is defined by you, with no influence from external scope or scale. It could be just clean laundry. What’s ambitious for you right now?

Maikel: Ambitious for me is basically just living my life authentically, just being me. You hear a lot about showing up authentically; well, the reality is that it’s not necessarily appreciated. But I’m like, “I’m just Maikel, in whatever space.” You know me from LinkedIn, but the way I show up there is the way I present everywhere. The way I speak out is just me, consistently, in all areas. So my ambition is to maintain that consistency and just keep it real all the time. I think it’s important because it also gives the signal to the outside world: “With me, you don’t have to question where I stand. I make it very clear.” You don’t have to guess how I view things, and you might not like it, but that’s for you to deal with, because I’m not responsible for your emotional reaction. I’m just going to keep it real, keep going the way I’ve been going, and speak my mind about the things that really matter. I have my job and I do my work, but ultimately, for me, that’s not the most important thing. Like we talked about, human rights issues are way more important, and I’m not going to pretend everything is fine because I’m making my money. When in our part of the world we see people being killed or brutalized or whatever is happening, it would be very hypocritical of me not to care about that or not speak about it.

Because I have been approached by people who say, “Aren’t you concerned that people might say something about the things that you say here?” And I’m like, “Well, I don’t lie. I just tell the truth, just the facts, nothing more than that.” And I also know when I’m on the right side or not, because with a lot of these issues it’s not even very difficult. When it comes to racism, police brutality, sexism, misogyny, it’s not like it’s very difficult to decide what is right or wrong there. So when I speak out, there’s no real concern, because I know which side is right and which is wrong; it’s not very difficult.

It’s also like the #MeToo movement. I remember that people were saying, “We cannot say anything right now,” and I’m like, “No, you never could say those things.” It’s not really very difficult. We’re talking about certain extreme behaviors that were never okay; they might have been tolerated, but even though they were tolerated, you already knew it was not okay. That’s also true for slavery and racism. Even 200 years ago they knew that it was not okay, but they twisted everything that they could to make it seem okay. They used the law, like Dutch law, which said that we were not humans. They also used religion to try to justify it. And the fact that they spent so much energy on justifying it, or what we face now in America, with the effort to try to erase the history, that already shows you that, actually, what we did was very bad. So bad that we really need to twist everything as much as we can to make it go away, because we know how bad it actually really is.

Sundae: Right. Exactly.

Thank you for being part of this conversation. It has meant so much to me and you’ve represented this topic with such clarity and integrity. I just want to say thank you for being here.

Maikel: Okay. Thank you. I appreciate it. And I also really appreciate our connection. I think we are clearly on the same frequency on many things. So it has been wonderful actually connecting with you. I also appreciate you giving me this opportunity to spend some time with you on your podcast.

Maikel gave us so much to think about. I’m so grateful for the time he shared with us, and it left me with a lot of clarity on things that were quite fuzzy before, but with a lot of new questions as well. I’m grateful that this is a conversation we can have together. So if you’re listening to this or see this on social media, chime in with your questions or insights, because this is an important conversation for us to be having. You’ve been listening to IN TRANSIT with Sundae Bean, steady advice in an unsteady world. Thank you for listening.

Maikel inspired me and left me with some hope because he is so committed to making change. And that is why I chose to leave you with the words of the poet Vironika Tugaleva: “By serving humanity, I automatically serve myself.”

Enjoy The Show?

  • Don’t miss an episode, subscribe via iTunes or RSS.
  • Please leave us a review in iTunes (or here for Android).

The post 270: Accountable AI with Maikel Groenewoud appeared first on Sundae Schneider-Bean, LLC.
