As a young graduate student working on her thesis at the Massachusetts Institute of Technology Media Lab, Joy Buolamwini, Ph.D., experimented with facial recognition technology. While the algorithm readily recognized the lighter-skinned faces of her colleagues, it was unable to detect her own melanin-rich skin tones. Struggling to be “seen” by computer vision, she donned a white mask, finally enabling the artificial intelligence, or AI, algorithm to detect her face. This marked the beginning of her seven-year journey exploring and auditing the discriminatory bias she discovered to be inherent in AI algorithms. Now in her early 30s, she has made this quest her life’s work.
While writing her Ph.D. dissertation, Dr. Buolamwini founded the Algorithmic Justice League, an organization that combines research with art to illuminate the impact of AI on individuals and society. Today, as an internationally recognized computer scientist, she lends her expertise to congressional hearings and government agencies seeking to enact equitable and accountable AI policy.
An outgrowth of the league’s work is her recent bestselling book, Unmasking AI: My Mission to Protect What Is Human in a World of Machines, in which she discusses the social implications of AI technology. “With the rapid proliferation of AI, it is more crucial than ever to ensure that these algorithms acting as gatekeepers serve us all and [do] not impede our civil rights,” she writes.
Is data destiny?
What happens when a facial recognition algorithm is trained using a dataset containing mainly white male faces? As Dr. Buolamwini discovered, it is destined to fail to correctly recognize women and non-white people.
Those who were never accounted for in the design of an algorithm and its training dataset cannot be recognized by it. Left invisible to these systems, they end up being “excoded.” Dr. Buolamwini explains: “You can be excoded when you are denied a loan based on algorithmic decision-making. You can be excoded when your resume is automatically screened out. You can be excoded when a tenant screening algorithm denies you access to housing. These examples are real. No one is immune from being excoded, and those already marginalized are at greater risk.”
“The machines we build reflect the priorities, preferences, and even prejudices of those who have the power to shape technology.”
— Joy Buolamwini
In her book, Dr. Buolamwini examines the social impact of AI technologies applied to high-risk decisions. She takes the machine learning/AI community to task for failing to apply insights from anti-discrimination scholarship. Understanding that algorithmic bias originates in real-world stereotypes, which are then reinforced as AI products become artificially “intelligent” through machine learning, makes these omissions particularly poignant.
Dr. Buolamwini emphasizes: “The machines we build reflect the priorities, preferences, and even prejudices of those who have the power to shape technology.” The opportunity to define classification systems, she writes, is power, and we should not ignore the subjectivity of data classification, shaped as it is by cultural, political, and economic interests.
“The responsibility of preventing harms from AI lies not with individual users but with the companies that create these systems, the organizations that adopt them, and the elected officials tasked with the public interest.”
— Joy Buolamwini
Moreover, people hold unrealistic expectations of the neutrality, infallibility, and objectivity of machines. Dr. Buolamwini describes our tendency to “… swap fallible human gatekeepers for machines that are also flawed but assumed to be objective. Just like algorithms confronted with individuals who do not fit prior assumptions, the human gatekeeper stands in the way of opportunity, supposedly for the safety of those deemed credible enough or worthy to enter the building.”
As a society, we must understand that outsourcing difficult decisions to machines will neither solve underlying social dilemmas nor allow us to build a more equitable society.
Algorithmic justice
In a world in which decisions about our lives are increasingly made by algorithms, we cannot have justice and equity if we create AI-powered tools that erase the existence of certain people and rely on historical data that reflect discriminatory practices.
In Unmasking AI, Dr. Buolamwini shares her journey from eager computer scientist, ready to solve the world’s problems with code, to advocate for algorithmic justice. While raising awareness of the imperfections of AI applications, she advocates for developing more equitable and inclusive artificial intelligence systems. Emphatically, she reminds us: “The responsibility of preventing harms from AI lies not with individual users but with the companies that create these systems, the organizations that adopt them, and the elected officials tasked with the public interest.”
About Dr. Joy Buolamwini
Dr. Joy Buolamwini is a Canadian American with Ghanaian roots who describes herself as a “poet of code” seeking to illuminate the social implications of artificial intelligence. She founded the Algorithmic Justice League in 2016 to create a world with more equitable and accountable technology. She earned a Ph.D. at MIT and is a Rhodes Scholar and a Fulbright fellow. She also holds two master’s degrees, from Oxford University and MIT, and a bachelor’s degree in computer science from the Georgia Institute of Technology. Fortune magazine named her to its 2019 list of the world’s greatest leaders, describing her as “the conscience of the AI revolution.”
Dr. Buolamwini also contributed to an article in the Winter 2020 issue of SWE Magazine, “Examining the Conflict between Technology and Human Rights.” The issue can be viewed at https://magazine.swe.org/past-issues/.
The subject was also discussed in the SWE All Together blog article, “Cameras Everywhere: Examining the Conflict Between Technology and Human Rights.”
References
MIT Media Lab: www.media.mit.edu/people/joyab/overview/
Algorithmic Justice League: www.ajl.org/
Poet of Code: www.poetofcode.com/