
What AI Has to Get Right About Women

March 27, 2026

By Rachael Payton, Senior Director of Communications and Marketing

Pavithra remembers the days as a young girl standing in long lines with her mother for a single pot of clean water. The water in their home was often yellow and sometimes visibly dirty. She remembers routine power cuts, some scheduled and some unpredictable. She remembers the normalcy of life being shaped by uncontrollable interruptions at the hands of a greater power structure she’d grow to learn more about. But it’s not just the scarcity of resources that her memory holds; it’s who carried the burden. “It was almost always the women. Women waiting. Women managing. Women reorganizing entire households around uncertainty,” she says.

A daughter of India’s resilient and resourceful lineage, Pavithra Priyadarshini Selvakumar is now a Postdoctoral Research Scientist at Columbia University’s Climate School, leading research on the intersection of AI, women, and climate justice. What was once her everyday lived experience as a young girl is now her everyday call to action. The question she raises in her article “How Can AI Address Climate Justice When Women’s Voices Are Silenced?” inspired me to look more closely at what happens when women are left out of the algorithm.

According to the United Nations, women and girls make up 80% of the people displaced by climate disasters, putting them at an elevated risk of violence and food insecurity. By 2050, climate displacement may push up to 158 million more women and girls into poverty. Simultaneously, our AI systems are becoming more powerful and increasingly shaping how institutions make environmental decisions, from disaster prediction to insurance pricing to energy forecasting and resource allocation.

Pavithra’s work brings attention to what can happen when the experiences of women, the population most harmed by climate disasters, are not fed into the AI systems shaping how the world responds to crises. If women’s experiences are missing from the data and the decisions, humans will keep building systems that get the future wrong. For example, in a climate disaster, AI systems are more likely to prioritize things like asset recovery or infrastructure restoration over more critical survival issues like prolonged heat exposure, sanitation of evacuation shelters, continuity of medication during displacement, or the loss of income stability. But the gap is not just about who is included in the data, it’s also about who is inputting the data.

In the United States, about eight out of ten women (roughly 58.87 million) hold jobs that are highly vulnerable to automation, such as clerical, retail, or administrative roles. Not only are women at greater risk of job displacement, we are also less likely to use or be trained on generative AI. When you add in the existing research telling us that women are more likely to have negative experiences with technology, like explicit nonconsensual deepfakes or online harassment, it’s not surprising that many women are skeptical about learning AI.

And of course, women are even less likely to be in the AI workforce. The women pioneers who do lead AI startups receive pennies compared to the billions in venture capital being poured into the sector. The CEO of AI software company Palantir, Alex Karp, said himself that AI technology “increases the economic power of vocationally trained working-class, often male, voters,” while it decreases the power of “highly educated, often female, voters.” Who participates in shaping these systems matters, a lot.

Manuela Veloso, Professor Emeritus in the School of Computer Science at Carnegie Mellon University, has been researching and building AI systems since the 1980s. As she puts it: “It’s the human mind that conceived such technology, and it’s up to the human mind to make good use of it.” For decades, Professor Veloso has described AI as the greatest test to humanity. “Humans need to give [AI] feedback and we must work with what is good about the technology to build machines that are trained to continually improve for good.” But if women’s lived experiences are not included in that training, we will fail the test.

Professor Veloso also stresses the importance of educating communities, especially women and girls, about how these systems work and what’s possible with them. “AI participation shouldn’t be limited to a small group of technologists, it should reflect the communities they impact the most,” she says. These impacts do not stop in the code or in the boardroom, they also come by way of physical infrastructure.

Black, Hispanic, and Indigenous women like me are more likely to live in the environmentally burdened communities where AI data centers are being built. These massive digital warehouses often add pollution to communities already carrying a legacy of contamination, increase energy costs, and drink up water in drought-ridden regions. Senator Bernie Sanders and U.S. Representative Alexandria Ocasio-Cortez recently introduced the Artificial Intelligence Data Center Moratorium Act to address these and other harms. The bill calls for an immediate halt to the construction or expansion of data centers until federal regulation is in place.

“Air pollution that comes from these data centers compounded with existing legacy pollution makes us more at risk for miscarriages, low birth weights, premature births and other pregnancy complications,” says LaTricea Adams, founder of Young, Gifted, and Green, as she stresses the role that reproductive justice plays in the environmental justice fight. “These are complications that Black women have already been at a higher risk for without the data centers.”

LaTricea is a lifelong Memphian; her community has been fighting back against xAI’s supercomputer Colossus and its pollution since 2024. She also described Memphis as the asthma capital of Tennessee, with many children missing school and parents missing work because of severe asthma attacks. “The more we are exposed to the pollution, the more we are all exposed to the body burden.”

The lack of women included in the data and the decision-making impacts everyone, not just women. Women are often the glue that holds families and entire communities together. What’s good for us is good for humanity. As Professor Veloso says, “women are not just spectators of the future, we are actors of the future.” We still have a window of opportunity to get this right. For me, getting AI right means getting women right. And per Pavithra, getting women right means:

  • equitable access to digital infrastructure and AI education for women
  • empowering women’s participation in technology and governance
  • data center regulation with attention to environmental and social impacts
  • accountability frameworks so that environmental burdens are not shifted onto already vulnerable populations

We have a once-in-a-lifetime opportunity in 2026. As America celebrates its 250th birthday, we are at another major turning point where we get to decide who is written into the next chapter and who is left out. Let us make the right decisions about AI today so that in another 250 years, history will show that we got AI, and women, right.

The future starts with a dream.
The future starts with us.
Dream.Org - 1630 San Pablo Avenue, 4th Floor, Oakland, CA 94612, USA