Londa Schiebinger is the John L. Hinds Professor of History of Science at Stanford University and founding Director of “Gendered Innovations in Science, Health and Medicine, Engineering, and Environment”. She is a leading international expert on gender in science and technology and an elected member of the American Academy of Arts and Sciences. Her work on Gendered Innovations harnesses the creative power of sex and gender analysis to enhance excellence and reproducibility in science and engineering.
Professor Schiebinger received her PhD from Harvard University in 1984. Over the past thirty years, her work has been devoted to teasing apart three analytically distinct but interlocking pieces of the gender and science puzzle: the history of women’s participation in science; gender in the structure of scientific institutions; and the gendering of human knowledge. She presented this work to the United Nations in 2010 and again in 2014. She has received numerous prizes and awards, including the prestigious Alexander von Humboldt Research Prize and a Guggenheim Fellowship. Her prize-winning books include The Mind Has No Sex? Women in the Origins of Modern Science and Nature’s Body: Gender in the Making of Modern Science.
Schiebinger’s recent research analyses gender in Artificial Intelligence (AI). AI is reshaping technology, and with it gender, in ways that scientists could scarcely have imagined in the twentieth century. Yet, as Schiebinger points out, AI may unknowingly perpetuate past bias into the future, even as governments, universities, and companies such as Google and Facebook seek to foster equality. Her research was inspired by the deepening integration of AI into society. http://genderedinnovations.stanford.edu/case-studies/machinelearning.html#tabs-2
According to Schiebinger, AI now pervades societies in which gender and ethnicity are ever-present. In this era of algorithms, she observed that algorithmic bias arises from multiple sources, ranging from human bias in training data to unconscious choices in algorithm design. On close examination, she found that a lack of proper monitoring and regulation of machine learning can lead to social inequities. She urges researchers to understand how gender and ethnicity operate within the context of an algorithm, and has further suggested “avenues to reduce bias in training data and algorithms in efforts to produce AI that enhances social equalities”.
Her paper also offers a detailed analysis of gender, distinguishing gender norms, gender identity, and gender relations.
Furthermore, Schiebinger has mapped known examples of human bias amplified by technology, observing, for instance, that men are more often shown high-paying job advertisements via Google ads than women. She has also analysed factors that intersect with sex and gender, elucidating examples of ethnic bias and of gender intersecting with ethnicity in her work.
To avoid bias, Schiebinger has suggested building better training databases and developing methods that can algorithmically detect and remove bias. She has recommended solutions aimed at creating AI that yields both high-quality techniques and social justice. According to her, attention to infrastructure issues is paramount; a rigorous social-benefit review should be in place; interdisciplinary and socially diverse teams are essential; and integrating social issues into the core computer science curriculum would aid in transforming how AI is used.
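As a minimal illustration of what algorithmic bias detection can look like in practice, the sketch below computes a simple demographic-parity gap: the difference in positive-decision rates between two groups. This is one common fairness measure, not Schiebinger’s own method, and the data, group labels, and function name are all hypothetical.

```python
# A sketch of one way to "algorithmically detect bias": measuring
# the demographic-parity gap of a system's decisions across a
# sensitive attribute. All data below are illustrative only.

def demographic_parity_gap(decisions, groups):
    """Return the absolute difference in positive-decision rates
    between the two groups present in `groups`."""
    labels = sorted(set(groups))
    assert len(labels) == 2, "this sketch assumes exactly two groups"
    rates = []
    for g in labels:
        selected = [d for d, grp in zip(decisions, groups) if grp == g]
        rates.append(sum(selected) / len(selected))
    return abs(rates[0] - rates[1])

# Hypothetical ad-serving decisions: 1 = high-paying job ad shown.
decisions = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
groups    = ["m", "m", "m", "m", "m", "f", "f", "f", "f", "f"]
gap = demographic_parity_gap(decisions, groups)
print(round(gap, 2))  # a gap of 0 would indicate parity
```

A metric like this only flags a disparity; deciding whether that disparity is unjust, and how to remedy it, is exactly the kind of social question Schiebinger argues must be built into AI research teams and curricula.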
She agrees that AI will play an important role in reshaping economies and societies. According to Schiebinger, rather than entrenching social inequalities, humans have the intellect and the power to enhance the quality of life worldwide, and AI should catalyse such efforts.