From left to right: Rachel Coldicutt, David Leslie, Rumman Chowdhury, Noura Al Moubayed and Wendy Hall
Royal Society/Debbie Rowe
On the second day of the Women and the Future of Science conference at the Royal Society in London, a session on artificial intelligence highlights a significant issue: women are being marginalized in the development of new AI technologies. Ironically, during this discussion, my AI transcription software persistently misinterprets a name, revealing an underlying issue of gender bias in AI.
The session, chaired by computer scientist Wendy Hall, explores a deeper problem beyond the known biases in AI datasets. The focus is on the fact that men predominantly design these transformative technologies.
The tech industry has long been male-dominated. In the UK, only 25 per cent of computer science students are women. Meanwhile, in Silicon Valley, the environment has become increasingly unwelcoming to women, especially as generative AI has grown.
According to David Leslie of the Alan Turing Institute, there has been a regression over the past two years. He notes that the policies of the Trump administration have undeniably harmed women’s progress in science, reflecting a period of regressive thinking.
Last year, Donald Trump issued an executive order targeting “woke AI,” suggesting the removal of references to misinformation, diversity, equity, and inclusion from AI risk management frameworks.
Rumman Chowdhury, a data scientist and a former science envoy for AI, was responsible for ethics at Twitter until Elon Musk took over and disbanded her team. She argues that the concept of “woke AI” originated from misogynistic attitudes in Silicon Valley even before Trump’s intervention.
When asked to envision AI without women, several panellists said this is already the reality. “I am in the world of frontier AI, and that is the world of AI without women,” says Chowdhury. Rachel Coldicutt, who studies the societal impacts of new technologies, concurs, saying that the current state of AI reflects a world without women.
The lack of female representation in technology is significant. Historically, technology has been designed with men’s needs in mind, evident in areas ranging from crash test dummies to medical research. This gender data gap can have consequences that are not only inconvenient but also dangerous.
Chowdhury highlights that only 2 per cent of venture capital funding is directed toward women, while less than 1 per cent of healthcare innovation focuses on women’s health. Coldicutt adds, “We need to make tech work for 8 billion people, not eight billionaires.”
Coldicutt suggests that with centuries of biased data ingrained in AI models, alternative models are necessary. These new models should focus on prioritizing care for both people and the planet.
Chowdhury, who co-founded the non-profit Humane Intelligence to promote accountability in AI systems, believes that the sense of urgency surrounding AI often overshadows diversity concerns. She explains that when people perceive an existential threat, they tend to disregard what seems non-essential, including diversity.
Leslie points out that to inspire future generations to use AI for societal benefits, the economic and political structures supporting AI development must be addressed. He emphasizes starting with the basics and transforming incentives.
Hall suggests that redefining intelligence to embrace diverse perspectives may be essential in the AI context. The original concept of AI, formed at a 1950s Dartmouth College conference, came exclusively from men, which Hall notes is significant.