Stephanie Lampkin has dedicated her career to addressing bias in hiring and to embedding fairness and equity in corporate America. Stephanie, Founder and CEO of Blendoor, recently joined Ben Taylor, Chief AI Evangelist at DataRobot, on the DataRobot More Intelligent Tomorrow podcast and explained the importance of removing an applicant’s identity during the hiring process:
“There were actually orchestras who did this years ago. In an effort to increase gender diversity, they required everyone who was auditioning to do so behind a curtain. They even required you to take your shoes off so you couldn’t tell if the person had on high heels or not, and it increased gender diversity by 5X.”
Stephanie points out that the same principle has been applied to the coding world:
“They’ve seen it even more recently with GitHub repositories. They had some senior engineers review code—anonymized code from some male and some female engineers—and found that people tended to appreciate the female engineers’ code more when it was anonymized.”
Diverse Datasets Must Train Machine Learning Algorithms
Stephanie emphasizes that her mission at Blendoor goes beyond an anonymous recruiting process:
“One of the things that I’m really passionate about, in addition to this whole idea of anonymizing resumes—which is just the beginning—is getting the diverse datasets necessary to train the machine learning algorithms that are going to become more ubiquitous and responsible for our day-to-day lives. Because, historically, there’s been a lot of bias—we’ve seen that with credit scores most significantly.”
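To make the idea concrete, here is a minimal sketch of what stripping direct identifiers from a candidate record before a screening model sees it could look like. This is not Blendoor’s actual pipeline, and the field names are hypothetical; a production system would also need to scrub free-text fields for proxy signals such as school names or pronouns.

```python
# A minimal sketch of resume anonymization before model scoring.
# Field names here are hypothetical, for illustration only.

IDENTITY_FIELDS = {"name", "email", "photo_url", "address", "pronouns"}

def anonymize_resume(resume: dict) -> dict:
    """Return a copy of the resume with direct identifiers removed."""
    return {k: v for k, v in resume.items() if k not in IDENTITY_FIELDS}

candidate = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "skills": ["python", "sql", "machine learning"],
    "years_experience": 7,
}

blind_profile = anonymize_resume(candidate)
print(blind_profile)  # {'skills': [...], 'years_experience': 7}
```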
Stephanie recounts that her mother instilled “the fear of death around credit” in her, and that credit scores and their impact on the ability to buy a house still loom large today:
“I think that is probably one of the most direct ways in which bad algorithms are impacting certain people’s livelihoods. And, as we enter the Fourth Industrial Revolution, those inequalities are only going to be exacerbated, so I’m working to help get more diverse datasets to train some of these algorithms.”
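One common way to surface the kind of credit-scoring bias Stephanie describes is an adverse-impact check on a model’s decisions. The sketch below applies the “four-fifths rule” used in US fair-lending and hiring audits to synthetic approval counts; the numbers and group labels are illustrative, not from any real lender.

```python
# A sketch of a disparate-impact check on credit-approval decisions.
# Approval counts below are synthetic, for illustration only.

approvals = {"group_a": 480, "group_b": 270}   # approved applicants
applicants = {"group_a": 600, "group_b": 600}  # total applicants

rates = {g: approvals[g] / applicants[g] for g in approvals}
impact_ratio = min(rates.values()) / max(rates.values())

print(f"approval rates: {rates}")
print(f"adverse impact ratio: {impact_ratio:.2f}")
if impact_ratio < 0.8:  # the four-fifths threshold
    print("potential disparate impact -- investigate the model and data")
```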
Stephanie also points out how a lack of diversity in datasets has negatively affected the accuracy of facial recognition:
“Facial recognition fails on anyone not pale or male because the datasets that they’re using to train the models are of people who are pale and male. It’s actually a 98% success rate for a white man, and a 65% success rate for black women. Joy Buolamwini was really the leader in bringing this to light and creating this whole program around algorithmic justice.”
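The disparity Stephanie cites is exactly what a per-subgroup accuracy audit surfaces. The sketch below simulates such an audit on synthetic data, with accuracy figures chosen to echo the numbers in the quote; a real audit, like Buolamwini’s Gender Shades study, would evaluate the model on a balanced benchmark dataset.

```python
# A sketch of auditing face-recognition accuracy per demographic subgroup.
# Predictions are simulated; accuracy figures are synthetic, chosen only
# to echo the disparity quoted above.

import random
random.seed(0)

true_accuracy = {"lighter_male": 0.98, "lighter_female": 0.93,
                 "darker_male": 0.88, "darker_female": 0.65}

# Simulate 1,000 match attempts per group against the assumed accuracy.
results = {g: [random.random() < acc for _ in range(1000)]
           for g, acc in true_accuracy.items()}

for g, outcomes in results.items():
    print(f"{g:>15}: {sum(outcomes) / len(outcomes):.1%} accuracy")
# A gap this wide is the signature of an unrepresentative training set.
```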
Stephanie Believes in Celebrating Life, Not Easy Wins
Stephanie bucks a Silicon Valley trend and finds that an even-keeled approach works best in the C-suite:
“This is kind of lame to say, but I don’t celebrate wins, nor do I get really down about losses. I can’t handle the fluctuations. So, if something really big happens it’s just like, ‘That’s great. Going back to work.’ If something really bad happens it’s like, ‘Alright, great, going back to work.’ Because that’s all I’m able to do. If I go up too high, then I might go down too low.”
Outside of work, Stephanie, who is an avid skier, is much more adventurous with her celebrations and pastimes:
“Yeah. I celebrate being alive. I go to Burning Man—that’s a really good life celebration. My birthdays are always epic. That’s something to celebrate: being alive, being healthy, and being able to take care of my family.”
To hear more about bias and fairness in AI, algorithmic justice, and the changing face of datasets, check out Datarobot.com/podcast or http://datarobot.buzzsprout.com/. You can also listen everywhere you already enjoy podcasts, including Apple, Spotify, Stitcher, and Google.
About the author
Enabling the AI-Driven Enterprise
DataRobot is the leader in enterprise AI, delivering trusted AI technology and enablement services to global enterprises competing in today’s Intelligence Revolution. Its enterprise AI platform maximizes business value by delivering AI at scale and continuously optimizing performance over time.