Artificial Intelligence may be the fastest-moving technology of our time, but beneath its sleek surface lies a stubborn problem: bias. While engineers refine algorithms and corporations race to deploy AI into every sector, the issue of how these systems reflect and reinforce cultural stereotypes remains dangerously overlooked.
For Lakshmi Pillai Gupta, technologist, author, and entrepreneur, this isn’t a minor design flaw. It is what she calls a “gender glitch - culture embedded in code.” Her forthcoming book, The Gender Glitch: Decoding Identity in the Age of AI, argues that AI is not a neutral tool but a mirror, quietly amplifying assumptions about gender, beauty, family, and even intimacy.
The Invisible Biases of Everyday AI
AI’s gender problem is most visible in its daily interactions with millions of users. Female-coded voices dominate the world of digital assistants (Alexa, Siri, Google Assistant), almost always cast as helpful, obedient, and endlessly patient. Meanwhile, male-coded personas appear more often in roles linked to authority or decision-making.
Gupta believes these choices are not accidental. “When we interact daily with a female voice that always complies without question, it subtly shapes our expectations, not only of machines, but of people,” she says. “The danger is not in one conversation but in millions of micro-reinforcements that teach us who serves and who commands.”
This concern was echoed by UNESCO’s 2022 report, which warned that feminised AI assistants risk normalising the image of women as “docile helpers” who tolerate mistreatment. Early versions of Siri once responded to harassment such as “you’re sexy” with polite deflection: “I’d blush if I could.” For Gupta, this is not harmless banter but a reflection of attitudes women and girls have fought against for generations.
Yet the implications go far beyond how we speak to our devices. Gupta widens the frame to show how AI seeps into every corner of identity: when beauty filters dictate what “attractive” means, young women find themselves measuring against machine-made ideals; when algorithms enforce rigid pronouns, gender-fluid and non-binary identities risk being erased. She imagines futures where artificial wombs and AI caregivers decouple biology from care, where intimacy with machines tests our ideas of love and consent, and where body upgrades raise questions of rights and dignity. At the root of it all lies a deeper concern: who gets to design these systems, whose values are encoded, and whose voices are left out.
In short, Gupta argues that AI is not only about machines that work faster but also about machines that shape who we are allowed to become.
Why It Matters for India
The stakes, she stresses, are particularly urgent for India. Generative AI is being introduced into classrooms, hospitals, banks, and even governance. From education to healthcare, young Indians will grow up with AI in their daily lives, often without realising the subtle biases embedded within these systems.
“If the defaults are not questioned today, we risk shaping an entire generation’s understanding of gender roles in ways that reinforce inequality,” Gupta warns. For her, inclusivity in AI is not just a matter of fairness but a cultural necessity for societies on the cusp of massive digital adoption.
A Global Concern
Gupta’s insights align with leading global voices. Safiya Noble, in Algorithms of Oppression, demonstrated how search engines reproduce racial and gender stereotypes. Joy Buolamwini at the MIT Media Lab exposed how facial recognition technologies misidentify women and darker-skinned individuals at dramatically higher rates. Even creative AI tools, when prompted to “show a woman,” often recycle Eurocentric ideals of beauty.
“These patterns reveal that bias isn’t just data, it’s design,” Gupta explains. “The way systems are built, the voices we choose, the images we privilege, all of these are human choices. If AI can inherit bias, it can also inherit aspiration. But only if inclusivity is intentional.”
Possible Alternatives and Experiments
There are glimmers of hope. Some technologists are rethinking defaults: creating assistants with no human gender, experimenting with rotating personas, or designing systems that occasionally push back to remind users they are interacting with a tool, not a servant.
Projects like Mozilla’s Common Voice are building open speech datasets that are more gender-balanced and demographically diverse. In Kenya and Brazil, participatory design initiatives are bringing women, queer, and non-binary users into dataset creation. In Finland, AI literacy is being introduced in schools, teaching students to question the cultural assumptions hidden in their tools.
But Gupta cautions that such examples remain exceptions. “The reality is that most AI development is driven by speed-to-market and commercial gain. Ethical questions become secondary,” she says. Without wider adoption of inclusive design practices, the danger is that stereotypes will not only persist but become invisible defaults.
The Values Question
For Gupta, the debate is not whether AI will replace human work or decision-making. Instead, the deeper question is what values these machines will inherit and amplify in the process.
“AI is not just about intelligence; it is also about identity,” she says. “Every click, command, and interaction feeds into that story. If we don’t question it, the future of who we are will be written by default, not by choice.”
A columnist, speaker, and soon-to-be-published author, Gupta wants to make these cultural blind spots visible before they become permanent. Early readers have described her book as “razor-sharp, unsettling, and oddly hopeful,” and it promises to spark important debates about the intersection of technology, feminism, and identity.
In a global conversation that increasingly revolves around AI, voices like those of Lakshmi Pillai Gupta, Safiya Noble, and Joy Buolamwini are essential. Together, they remind us that building machines is not only about innovation but also about responsibility.
About Lakshmi Pillai Gupta
Lakshmi Pillai Gupta is the Founder of Ammara Technologies Pvt. Limited and an author based in Noida, India. With over 22 years of experience in digital transformation, she has designed enterprise systems at Infosys and SAP Labs, scaled ERP platforms at Attero Recycling, founded ventures in logistics (UrbanHopperz) and FMCG (Eiwa), and served as Senior Digital Products Advisor at McKinsey, where she developed strategies for agri-tech marketplaces.
An electronics engineer from Kurukshetra University with a postgraduate diploma in advanced software design from ER&DCI, Gupta has worked across clean-tech, logistics, FMCG, and AI. She is now developing an AI-driven healthcare startup (in stealth mode) while pursuing her larger mission: building empathy-driven, inclusive technologies and stories that challenge bias and reimagine identity.
Her forthcoming book, The Gender Glitch: Decoding Identity in the Age of AI, promises to spark one of the most important debates of our time: not just about innovation, but about the kind of humans we become when technology starts to define us.