Women Don’t Seem to Care for AI

It was sometime in the spring of 2023 when Harriet Kelsall, a Cambridge jeweller who knows rather more about the gemstones in royal crowns than most of us, decided to interrogate ChatGPT about the St Edward’s Crown, the one placed upon King Charles III’s head during his coronation. The AI, with its characteristic confidence, proceeded to tell her about an entirely wrong crown. Large chunks of misinformation, delivered with the unshakeable certainty that has become the technology’s signature trait.

This small moment of disillusionment might seem inconsequential. One woman, one error, one decision to stick with doing things the old-fashioned way. But it illuminates something rather more consequential: a chasm opening between men and women in their adoption of artificial intelligence, one that could reshape individual careers and the very architecture of the workplace for decades to come.

The numbers, when you examine them, are startling. A Harvard University study uncovered a gender gap of twenty-five percentage points in AI usage. Across the Atlantic and beyond, research from the Oliver Wyman Forum found that fifty-nine per cent of male workers use generative AI tools weekly, against fifty-one per cent of women. In Britain specifically, fifty-four per cent of men now employ AI in their professional or personal lives; for women, that figure collapses to thirty-five per cent. And here’s the detail that should give us pause: the divide is widest among the youngest workers, those aged eighteen to twenty-four. Seventy-one per cent of young men use AI weekly. Among young women? Fifty-nine per cent. We are watching a generation of women, ostensibly digital natives raised on smartphones and social media, choosing to stand apart from what technologists insist is the most revolutionary advance since the internet itself.

The reasons for this reluctance are manifold, overlapping, and, when you listen to women themselves, entirely rational.

Consider Michelle Leivars, a London business coach who refuses to let AI write for her. Her clients, she notes with some satisfaction, have told her they booked sessions specifically because her website copy felt authentic, unpolished by the smooth artificiality of machine-generated prose. They could hear her voice. Or Hayley Bystram, whose matchmaking agency deliberately eschews algorithms in favour of face-to-face meetings and hand-crafted member profiles that can take half a day to create. Using ChatGPT, she says plainly, would be cheating. It would strip the soul from the work.

Some experts describe AI content generation as “heavy photoshopping”, a comparison that carries more weight when set against concerns about AI-generated images that render people as their “slimmest, youngest, hippest” selves.

These are aesthetic and ethical positions, rooted in values that our broader culture claims to cherish: authenticity, craftsmanship, truthfulness, and the irreducible value of human creativity. Kelsall puts it succinctly: “I value authenticity and human creativity.” It’s a statement that ought not be controversial, and yet in the current moment, it positions her as something of a holdout.

But there are deeper structural forces at work here, beyond individual preference or philosophical stance. Rembrand Koning, the Harvard Business School professor behind that twenty-five-point gender gap study, points to a sobering reality: women face greater penalties for being perceived as lacking expertise. If you’re already fighting for credibility in your field, and research suggests most women are, the risk of being caught relying on AI-generated information, particularly if it’s wrong or deemed unethical, carries disproportionate consequences.

Lee Chambers, a psychologist studying this phenomenon, identifies what he calls the confidence gap. Women, he notes, tend to want high competence before using a new tool; men are content to fumble through. Women are more likely to be accused of incompetence, more likely to have their credentials questioned, and more likely to see their ideas appropriated by male colleagues.

Using AI becomes not a productivity boost but another potential weapon in the armoury of those who would question whether you really wrote that report, really understood that analysis, really deserve that promotion.

The workplace penalties are already documented. Women’s median wages in Canada sit seventeen per cent below men’s. Women are overrepresented in precisely those entry-level office positions that AI is predicted to eliminate. They’re three times more likely than men to have their jobs replaced entirely. And yet they’re adopting the technology more slowly, even as the premium for AI skills surges. American job listings requiring AI competence advertise a twenty-eight per cent wage premium, roughly eighteen thousand dollars annually. When listings specify two or more AI skills, that premium leaps to forty-three per cent. The mathematics of this is brutal. Women need to upskill to compete, yet they’re less likely to believe that AI will help them develop new skills or create opportunities. Only forty per cent of women think generative AI will enable skill development, against fifty-one per cent of men. Thirty-three per cent see it creating opportunities; forty per cent of men share that optimism.

Part of this stems from training, or rather, its absence. A joint American-Danish study found that women citing barriers to ChatGPT adoption pointed to insufficient training. But who provides that training? In most organisations, AI strategy resides within IT departments, where female representation remains dismal. The technology is being shaped, refined, and evangelised by the very demographic already dominating its use. This becomes self-reinforcing: tools developed predominantly by men, for men, learning from men’s usage patterns. Koning asks the pertinent question: “If it is learning predominantly from men, does that cause these tools to potentially respond differently or be biased in ways that could have long-term effects?” We already know the answer, because we’ve seen this film before. Facial recognition systems that fail to recognise Black faces. Virtual reality devices that disproportionately cause nausea in women, a problem that has moved with glacial slowness toward resolution. The pattern is established; the question is whether we’ll allow it to calcify around AI.

Then there is this: science, technology, engineering, and mathematics remain stubbornly male-dominated. In Britain, just twenty-four per cent of STEM workers are women. As Jodie Cook, founder of an AI coaching platform, observes, “If more women don’t view themselves as technically skilled, they might not experiment with them.” Even if the tools themselves don’t require deep technical knowledge, the cultural association persists. AI still feels like science fiction, and science fiction, as marketed in our culture, skews heavily male. Some women are, admittedly, experimenting with more personal forms of AI, such as journaling bots or conversation-based AI companions that explore intimacy and agency in safer digital spaces. But those numbers are low as well.

There’s a certain irony here, almost baroque in its convolutions. AI, the technology that promises to level playing fields, to enhance human capability, this technology is widening existing inequalities precisely because those inequalities shaped who felt comfortable adopting it first.

The early adopters, as is the case with most new technologies, tend to be men. Young girls may show initial interest in technical fields, but they lose it over time, for reasons that have been exhaustively documented and insufficiently addressed.

The most promising suggestion involves turning the generational dynamic on its head. Generation Z, already more familiar with AI tools, could provide peer-to-peer mentorship to older colleagues. Young women could lead these initiatives, creating wider awareness whilst establishing themselves as authorities in the technology’s deployment. It’s an elegant solution: address the gender gap whilst leveraging the enthusiasm of younger workers who value community and collaborative learning.

But here’s what we must reckon with: perhaps some women’s reluctance stems not from ignorance or timidity but from a clearer view of what’s being sold. When Kelsall tested ChatGPT and found it confidently wrong, when Leivars heard her clients praise her authentic voice, when Bystram saw the soul in half a day’s careful work on a client profile, these were not failures of nerve but clear-eyed assessments of the technology’s limitations.

The question here is whether AI serves the values and working styles that many women have articulated, and whether the workplace of the future will have room for the authenticity, craftsmanship, and human connection that some are choosing to protect.

We stand at an inflection point. The gender gap in AI adoption could indeed magnify existing workplace inequalities, particularly in “pink collar” occupations already vulnerable to automation. Or it could prompt a necessary reckoning with how we’re deploying these tools, who benefits, and what we’re sacrificing in the rush to automate.

Ninety-eight per cent of surveyed workers believe they’ll need AI upskilling in the next five years. But the challenge isn’t simply ensuring women have access to training, though that remains essential. It’s ensuring the training, the tools, and the workplace culture they’re being trained for actually deserve their participation.