Where are all the 'godmothers' of AI? Women's voices are not being heard
Luba Kassova
Amid the coverage of Sam Altman returning to the helm of OpenAI, women are being written out of the future of AI
Sat 25 Nov 2023 04.00 EST
Last modified on Wed 29 Nov 2023 07.40 EST
"We are heading toward the best world ever," said Sam Altman in an interview earlier this month, just before the saga of his firing and rehiring as OpenAI's chief executive. As an expert on gender equity in news, I wondered: whose world was heading towards being the best ever? As it turns out, the one the Altman team is crafting is largely devoid of women. My analysis amid the furore around his dismissal revealed fascinating insights: for example, of the 702 (out of 750) employees who signed the letter demanding Altman's reinstatement, more than 75% were men, a gender imbalance that matches that identified in AI teams in McKinsey's The State of AI in 2022 report. After Altman's return, OpenAI's newly established board of directors is now made up exclusively of white men, a situation compounded by male dominance among executives. Where are the voices of female AI leaders and experts in coverage of this most dramatic of Silicon Valley stories?
Women's role in crafting our AI-infused future and shaping the news around generative AI has concerned me for some time. From analysing data and conversations with experts, I realise that, whether as developers, news editors or AI experts, women are largely absent from the AI world. Generative AI (GAI) relies on processing vast datasets of text, images and video, all of which have featured overwhelmingly more men than women in the past. This inherited male bias, mirrored in the news, combined with the structural gaps women face in society today, results in a narrative about GAI's risks, limitations, opportunities and direction shaped primarily by men. AKAS's pronoun analysis of the GDELT Project's global online news database shows that so far this year men have been quoted 3.7 times more frequently than women in news about AI in English-speaking nations. According to the most recent Global Media Monitoring Project results, only 4% of news stories focusing on science, technology, funding, discoveries and developments centred around women. An assessment by AKAS of tech news editors in April shows that in Britain and the US only 18% and 23% respectively were female. Men are between three and five times more likely than women to be deciding what constitutes a technology story.
. . .
Given women's peripheral presence in the AI industry and muted voice in news, their concerns are unlikely to be captured, let alone addressed, in future developments. Leslie McIntosh, vice-president of research integrity at Digital Science, says: "If your perspective is not reported, you are not in the story. GAI takes those historical texts and is building and projecting our future. So where women's missing voices were once crevices, they have now become large gaps." Nicholas Diakopoulos, professor in communication studies at Northwestern University in Chicago, says: "Disparities in representation of race, gender or different occupations [in generative AI models] are important, since if the media uses these kinds of models uncritically to illustrate stories, they could easily perpetuate biases embedded in the training data of the models."
Tasha McCauley in 2014. She has just been ousted from OpenAI's board along with Helen Toner. Photograph: Jerod Harris/Getty
. . . .
What can be done to ensure that concerns voiced by women such as Helen Toner and Tasha McCauley (both now ousted from OpenAI's board), or those of the women on the fringes of the AI industry, are not squandered? While there is much debate about the effectiveness of the guardrails (coding aimed at correcting data biases), I detected a consensus among the experts I spoke with: AI itself can remedy the diversity deficit. "What gets measured, gets managed," says Lars Damgaard Nielsen, Mediacatch.io's chief executive, and a proponent of using AI to track gender and ethnic bias in the media. He and other experts argue that an effective way of correcting male bias would be to use AI to measure women's share of presence within the discourse, flagging to us humans the vital need to seek the perspectives of all genders, groups and cultures on one of the century's most far-reaching stories.
Luba Kassova is the author of the Missing Perspectives of Women series of reports and co-founder of international audience strategy consultancy AKAS.
https://www.theguardian.com/global-development/2023/nov/25/where-are-godmothers-of-ai-womens-voices-not-heard-in-tech-sam-altman-openai