
Gender bias in Health AI - prejudicing health outcomes (or getting it right!)

This week, Intelligent Health AI will bring together the global AI and health community in Switzerland to advance discussions on how emerging technology can be used to prevent and solve some of the world’s greatest healthcare problems and improve the health of the human race. The hashtag for the meeting is #saveliveswithAIHealth. Caitlin Pley, Women in Global Health Policy Associate, will deliver a presentation at the conference challenging the AI and health community to address gender equality as critical to transforming global health for all genders.

In dystopian sci-fi versions of the future, Artificial Intelligence (AI) machines and robots develop more intelligence than their makers intended, turn on human beings and wipe us out. But the real danger in Health AI is already with us: the risk that Health AI will entrench existing gender bias in Western medicine, bias that kills women through inappropriate diagnosis, clinical guidelines and treatment, neglects women’s health conditions, and gives men more access than women to the design and use of AI innovations.

Women in Global Health have three headline messages on Health AI and gender:

1. Western medicine isn’t gender neutral – AI Health could change that

Western medicine may be based on scientific principle and rigour, but it isn’t gender neutral. There are many striking examples of gender bias in medicine: women wait longer to be prescribed the painkillers they need, they wait longer to receive a cancer diagnosis, and they are more likely to have physical symptoms attributed to mental health issues. Many drugs, including general anaesthetics and pills for heart disease, have historically been tested only on men, despite the obvious fact that the physiology of men is profoundly different to that of women. For example, although 70% of sufferers of chronic pain are women, 80% of pain medications have been tested only on men.

The sleeping pill Ambien, also known under the generic name zolpidem, was found, after it was licensed, to be metabolised faster in men than in women. As a result, some women who took the recommended dose were found carrying out daily activities, including driving, while not fully awake. One woman who took Ambien remembers nothing after sleep-driving in the early hours of the morning; she was found by police officers slumped behind the wheel, having stopped her car at the bottom of an exit ramp. The FDA has since halved the recommended dose of Ambien for women.

Women have routinely been excluded from clinical trials because of their "complicated biology", amid worries that hormonal cycles would confound trial results. Other factors are also important determinants of drug action, including sex differences in liver metabolism, kidney function, some gastric enzymes and whether or not a woman takes the oral contraceptive pill. If a drug affects half the population differently, we need to know about it, but gender bias in science and medicine has often led to men being treated as the norm and women as ‘atypical men’. The results of this bias can mean life or death for women.

One of the most devastating examples of gender bias in medicine is the diagnosis of heart attacks. The hallmark symptom of a heart attack, the one commonly depicted in TV dramas and films, is the man who falls to the ground with a crushing central chest pain, often described as a tight band wrapped around the chest, that may radiate to the jaw or left arm [1]. However, these "typical" symptoms are typical male symptoms. Women are much more likely to experience other symptoms with a heart attack: around 40% of women do not have any chest pain at all. Instead, women are more likely to complain of nausea, vomiting, abdominal discomfort, light-headedness, dizziness or shortness of breath. Basing diagnosis on male symptoms and assuming women experience the same results in misdiagnosis and mistreatment, and as a direct result, women are more likely to die after a heart attack than men. In one study, women under the age of 45 experiencing a heart attack but presenting without chest pain were 14 times more likely to die than men of the same age with chest pain.

Image from the British Heart Foundation

Human bias in diagnosis is not limited to gender: recent research in the UK has shown that black women are more likely to experience serious complications during childbirth than white women, linked in part to a racial bias, the false belief that black women can withstand more pain [3].

Artificial intelligence has the power to radically transform the way we practise and experience healthcare, for better or for worse. A well-designed algorithm can learn faster than any human and make fewer errors. However, the way that AI algorithms are designed, and the data they are trained on, will have an enormous impact on learning outcomes. If societal stereotypes and biases are woven into the very fabric of an AI system, gender biases and inequalities may become further entrenched, with negative health outcomes for women.
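
To make this concrete, here is a minimal, hedged sketch in Python using entirely synthetic data: the feature names, symptom probabilities and 90/10 sex split are illustrative assumptions, not real clinical figures. A simple classifier trained on a dataset dominated by male patients learns the "typical male" presentation of a heart attack and misses more cases in women, a gap that only becomes visible when accuracy is reported separately for each sex.

```python
# Illustrative sketch only: synthetic data and assumed symptom probabilities.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_patients(n, is_female):
    """Generate synthetic patients with two symptom features and a heart-attack label."""
    mi = rng.random(n) < 0.5  # half the patients are having a heart attack
    # Assumed pattern from the text above: women present with chest pain less often
    # and with "atypical" symptoms (nausea, dizziness) more often.
    chest_pain = np.where(mi, rng.random(n) < (0.60 if is_female else 0.95),
                              rng.random(n) < 0.10)
    atypical   = np.where(mi, rng.random(n) < (0.80 if is_female else 0.30),
                              rng.random(n) < 0.20)
    X = np.column_stack([chest_pain, atypical]).astype(float)
    return X, mi.astype(int)

# Training data skewed 90/10 towards men, mirroring historically male-dominated evidence.
X_men, y_men = make_patients(9000, is_female=False)
X_wom, y_wom = make_patients(1000, is_female=True)
model = LogisticRegression().fit(np.vstack([X_men, X_wom]),
                                 np.concatenate([y_men, y_wom]))

# Disaggregated evaluation: the headline accuracy hides a worse result for women.
X_men_test, y_men_test = make_patients(5000, is_female=False)
X_wom_test, y_wom_test = make_patients(5000, is_female=True)
print("accuracy for men:  ", round(model.score(X_men_test, y_men_test), 3))
print("accuracy for women:", round(model.score(X_wom_test, y_wom_test), 3))
```

The numbers are invented, but the pattern is the point: the algorithm is not malicious, it simply reproduces the imbalance in the data it was given.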

2. Inclusive Health AI will be better designed and more effective

To ensure that gender bias is not built into the fabric of AI systems and does not influence the diagnosis and management of disease, AI companies must deliberately include female perspectives at all stages of the development process. A gender-balanced team, including at the highest echelons of leadership, minimises the effect of societal biases and norms on patients' health outcomes; it also draws on 100% of the talent pool. Gender disparities in the STEM workforce are well documented and particularly acute in the field of AI. We are, in fact, regressing on gender equality: the proportion of female researchers in AI was higher in the mid-90s than it is now, having fallen from 14% to 12% [5]. While universities do slightly better than tech firms, the tech giants Google, IBM and Microsoft employ fewer than 15% women in their AI projects [6].

A major funding gap exists too: in 2018, of the $130 billion invested in venture capital, only 2.2% went to female founders [7]. The importance of including a diversity of perspectives in AI design and leadership is not limited to gender. An AI system designed for a particular health system and patient population must be created by a team that is representative of that population, including its gender, race, religion and socioeconomic mix. This is crucial to ensure that cultural and social barriers are taken into account and that the health service is accessible and acceptable to all members of the population.

If Health AI aims to save lives – and its potential is enormous – tech companies cannot wait for gender inequalities in the system to self-correct. On present trends it will take decades or even centuries before there is a gender balance in STEM higher education, senior leadership posts and tech start-ups. Deliberate action will be needed to bring women and diverse perspectives to the decision-making table as AI Health innovations are developed, tested and evaluated.

3. Gender inequity in access to Health AI can widen social and economic inequality

For Health AI to benefit the whole of society, AI systems for health must be accessible to those who will be using them, both health workers and patients. Women account for 70% of the total health and social care workforce, but occupy only 25% of senior leadership roles. Women tend to be clustered into lower-status and lower-paid roles in the health workforce, with less access than men to new technology and training. Given the role of women in the global health workforce, health system leaders must ensure that all health workers whose roles will change with the advent of AI receive training tailored to their needs and experience. For example, if a new AI-assisted health system requires a high standard of digital literacy, existing gender differences in access to digital education could exacerbate inequalities for both female health workers and female patients.

Leaders in the field must ensure that novel technologies like AI do not leave parts of the world even further behind, widening existing inequalities between resource-rich and resource-poor countries and within countries. This is particularly relevant in the month that world leaders meet at the United Nations for a landmark High-Level Meeting on Universal Health Coverage. The digital gender gap is especially pronounced in sub-Saharan Africa, where millions of women lack access to smartphone technology and mobile banking, meaning that women may be less able to benefit from emerging technologies designed to improve health. In resource-poor countries, girls may have less access to secondary schooling than boys, limiting their ability to enter the sectors of the health workforce that use Health AI, which are likely to be better paid and higher status. The responsible use of novel technologies includes recognising the limits of their application in society and taking the necessary steps to level the playing field so that everyone can reap the fruits of progress.

Conclusion

The advent of artificial intelligence in healthcare is as much a challenge as it is an opportunity. AI systems are capable of accelerating progress on global health: ensuring equal access and standards of care for all genders, improving the safety of healthcare work, providing equal opportunity for leadership, and deconstructing gender norms that lead to stigma, bias and discrimination. By working with large quantities of data, machine learning should be able to identify where we are failing in women's health and improve health outcomes. Crucially, however, leaders in the field must ensure that novel technologies like AI do not leave parts of the world further behind and widen existing inequalities. Since women are half the population, and areas of women’s health such as menstruation, menopause and endometriosis have historically been under-researched, there are huge business opportunities for companies that invest in better diagnosis, management and treatment of conditions affecting women. But companies that fail to address sex and gender differences risk high business costs, including product recalls, lawsuits, compensation claims and reputational damage, if they get it wrong.

When building AI systems for health, it is vital that our own bias is not programmed into the system. This requires us first to understand and recognise our own bias, and second to ensure that the datasets on which algorithms are trained are not biased by sex, gender or other factors. Caroline Criado Perez, author of Invisible Women, writes: "Because of the way machine-learning works, when you feed it biased data, it gets better and better - at being biased. We could be literally writing code that makes healthcare for women worse." [4] Instead of allowing in-built human bias to become further ingrained, incorporating negative feedback loops into system design would allow the AI to compare its own diagnoses with the clinically confirmed diagnoses. This form of checks and balances turns a vicious cycle into a virtuous one, allowing the AI system to ultimately become more reliable than a human practitioner.
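
As a sketch of what such a feedback loop might look like in practice (the column names and the 5% threshold below are our own illustrative assumptions, not a description of any real system), the snippet compares the system's predictions with the clinically confirmed diagnoses, breaks the missed-diagnosis rate down by sex, and flags the model for review when the gap between groups grows too large.

```python
# Hedged sketch of a bias audit / negative feedback loop; column names are assumed.
import pandas as pd

def audit_by_sex(records: pd.DataFrame, max_gap: float = 0.05) -> bool:
    """Compare AI predictions with confirmed diagnoses and check the gap between sexes.

    `records` is assumed to hold one row per patient with columns
    'sex', 'ai_diagnosis' and 'confirmed_diagnosis' (0 = negative, 1 = positive).
    """
    confirmed = records[records["confirmed_diagnosis"] == 1]
    missed = confirmed["ai_diagnosis"] == 0                 # confirmed cases the AI missed
    missed_rate = missed.groupby(confirmed["sex"]).mean()   # missed-diagnosis rate per sex
    print(missed_rate)
    gap = missed_rate.max() - missed_rate.min()
    if gap > max_gap:
        print(f"Missed-diagnosis gap of {gap:.0%} between groups: flag for review and retraining.")
        return False
    return True

# Toy example: the AI misses two of the three confirmed cases in women, none in men.
records = pd.DataFrame({
    "sex":                 ["F", "F", "F", "M", "M", "M"],
    "ai_diagnosis":        [0,   0,   1,   1,   1,   0],
    "confirmed_diagnosis": [1,   1,   1,   1,   1,   0],
})
audit_by_sex(records)
```

Run regularly, a check like this turns the feedback loop described above from a principle into a concrete operational safeguard.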

Recommendations

Women in Global Health have 5 key recommendations to reduce the impact of gender bias on AI systems for health:

  1. Ensure all health professionals have equal access to AI for health. This includes ensuring that training is appropriate and suitable for everyone.

  2. Consider the impact of AI on existing inequalities and use it to disrupt gender bias for all genders. For example, given the large difference in smartphone ownership between men and women in sub-Saharan Africa, new technologies need to ensure that all desired end-users can actually access the technology.

  3. Use unbiased datasets and disaggregate results by sex, age and other factors. This is to ensure that machine learning does not become better and better at being biased if it starts out with unbalanced data sets.

  4. Publish all data and software open access and open source, so that the benefits of new technology can be shared more widely, inviting feedback from diverse sources and ultimately improving design.

  5. Bring women and diverse perspectives to the table for better results. Inclusive products can only be created if everyone is included in design and decision making.

AI in Health has the potential to entrench and reproduce gender inequality in health outcomes far more deeply than human beings could alone. Or it can quite deliberately disrupt gender bias and inequality and transform global health outcomes for all genders, for the better. Which will it be?

References:

[1] https://www.bbc.co.uk/news/health-17116820

[2] https://www.ahajournals.org/doi/full/10.1161/JAHA.117.007123

[3] https://www.bbc.co.uk/news/uk-england-47115305

[4] https://www.evoke.org/articles/july-2019/data-driven/deep_dives/the-dangers-of-gender-bias-in-design?utm_source=TW&utm_medium=MG_OG&utm_campaign=EV_JUL&utm_content=72619_POST_GB_TW&linkId=70880903

[5] https://www.nesta.org.uk/report/gender-diversity-ai/

[6] https://www.nesta.org.uk/report/gender-diversity-ai/

[7] https://twitter.com/DKTechAmb/status/1154401232842186752

[8] https://mg.co.za/article/2019-06-28-00-digital-tech-can-boost-health

[9] https://www.who.int/hrh/resources/health-observer24/en/

[10] https://www.theguardian.com/world/2019/jul/17/g7-leaders-to-back-tech-revolution-for-african-women
