How computers see race and gender
Morgan Klaus Scheuerman has one fundamental goal with his research: to show tech companies that marginalized people matter.
To do so, the information science graduate student studies, quite literally, how computers see us, focusing on the facial analysis software hidden in everything from cell phones and computers to surveillance cameras at airports and malls. His work has already found that such platforms frequently misidentify people who are not white, male and cisgender (having a gender identity that matches their birth sex). He wants to understand why.
Where do things go wrong in the making of such products? Can they be improved? And should some technologies not be made at all?
“We have labor laws and advertising laws and housing laws against racial and gender discrimination, but there are no laws specific to embedding discrimination into algorithms,” said Scheuerman, recipient of Microsoft’s prestigious 2021 Research Fellowship. “The only way we discover such discrimination is when it happens to us.”
Principal investigators
Morgan Klaus Scheuerman; Jed Brubaker
Funding
Microsoft; National Science Foundation (NSF)
Learn more about this topic:
How computers see us: Doctoral student working to curb discrimination by artificial intelligence