Omnia Health is part of the Informa Markets Division of Informa PLC


Why data should fuel healthcare for all

According to Dr Maliha Hashmi, diversity and inclusion in data should be given top priority, from building mixed teams to strengthening regulations.

We are well into 2022, and it is not unreasonable to consider that the worst of the pandemic is now behind us. But rather than a return to “normal”, we can expect many of the changes to the way we work and live that emerged over the past two years to accelerate and deepen.

It is a significant opportunity. We can and should do better.

One area that merits further and immediate attention, and that is close to my heart, is diversity and inclusion. Pre-pandemic, we saw offices welcome mothers with toddlers by providing daycare services, while Muslim employees were provided with a prayer area at work so that they did not have to seek out somewhere suitable to pray.

Today we have seen how the pandemic has ignited the digital transformation of industries, including healthcare. This new technological landscape, already sweeping the old world aside, must also take diversity and inclusion needs into account.

In particular, we need to take a closer look at health data – the new oil (as some call it) underpinning many of the innovations disrupting healthcare, and set to grow enormously in volume. By 2025, an expected 163 zettabytes of data will exist worldwide, and 30 per cent of it will be in healthcare. That is a big number: a zettabyte is equivalent to a trillion gigabytes.
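To make those projections concrete, here is a minimal arithmetic sketch using only the figures quoted above (163 zettabytes, a 30 per cent healthcare share, one trillion gigabytes per zettabyte):

```python
# Rough scale check using the article's projected figures.
ZETTABYTE_IN_GB = 1_000_000_000_000  # 1 ZB = one trillion GB

total_zb = 163          # projected worldwide data by 2025
healthcare_share = 0.30 # projected healthcare share

healthcare_zb = total_zb * healthcare_share
print(f"Projected healthcare data by 2025: {healthcare_zb:.0f} ZB")
print(f"Equivalent in gigabytes: {healthcare_zb * ZETTABYTE_IN_GB:.2e} GB")
```

That works out to roughly 49 zettabytes of healthcare data – tens of trillions of gigabytes.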

Here's why we need to take data seriously. A February 2022 paper by Imperial College London found that data-driven technologies including AI, while demonstrating potential in the diagnosis and treatment of diseases such as skin cancer, could worsen the health inequalities experienced by minority ethnic groups if biased algorithms, poor data collection and a lack of diversity in R&D are not addressed. 

Unconscious and conscious bias in AI is partly fuelled by the lack of diversity in academia among AI developers, according to the report, and at strategic levels of the health system and beyond. Having diverse teams is key.

In its call for evidence for a Women’s Health Strategy for England, the UK government saw that future research should recruit and consider groups of women who have been under-represented historically in research and data collection so that research outcomes may benefit all women in society. Its ambition is furthermore to have the right data, and to make better use of the data collected to tackle sex-based data gaps with the goal of improving women’s health outcomes, reducing disparities and supporting a life course approach for women’s health.

Today’s data-driven products and services will become more sophisticated and ubiquitous as the “Fourth Industrial Revolution” continues apace. Talk is growing of quantum AI, and there are already major investments in the “metaverse”, touted as the new internet, meaning that we must begin now and with urgency.

Any investment in or deployment of new technology must factor in diversity and inclusion from the outset – at the design stage – with an understanding of how datasets will work for us all. These technologies are not only used in areas such as imaging and diagnostics; they are increasingly used in the hiring process, with AI-powered tools used to find and recruit professionals.

Key questions include: who is collecting the data, how is the data being collected, and is that data representative of a complete global picture? Who then analyses and interprets the data, and how will it be applied? According to Heather Mattie, co-director of the Applied Artificial Intelligence for Health Care programme at the Harvard T.H. Chan School of Public Health, algorithmic bias can “creep into” the process anywhere.
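One practical way to start answering the representativeness question is a simple audit comparing a dataset's subgroup shares against population benchmarks. The sketch below is illustrative only: the cohort, attribute name and benchmark shares are all hypothetical, not real statistics or any standard methodology.

```python
from collections import Counter

def representation_gaps(records, attribute, population_shares):
    """Compare a dataset's subgroup shares against expected population shares.

    records: list of dicts describing individuals in the dataset.
    population_shares: maps each subgroup to its expected share (0..1).
    Returns observed minus expected share per subgroup
    (negative values flag under-representation).
    """
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    gaps = {}
    for group, expected in population_shares.items():
        observed = counts.get(group, 0) / total
        gaps[group] = round(observed - expected, 3)
    return gaps

# Hypothetical example: a 1,000-person study cohort skewed towards one group
cohort = [{"sex": "male"}] * 700 + [{"sex": "female"}] * 300
print(representation_gaps(cohort, "sex", {"male": 0.5, "female": 0.5}))
# → {'male': 0.2, 'female': -0.2}
```

A check like this only surfaces gaps in whatever attributes were recorded, which is precisely why the questions of who collects the data, and what they choose to record, come first.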

To avoid bias, and to ensure that data is used ethically, mixed teams (by gender, ethnicity, age, religion, and more) must be deployed at every stage.

In healthcare, as the authors of the Imperial College London paper note, patients and the public must be involved and consulted throughout, while a Journal of Global Health paper has also called for clinicians to be involved to offer a deep understanding of the clinical context.

Aside from this, there is a need for specific legislation and regulation of AI to minimise bias and ensure that the rights of the individual are protected. Legal frameworks to watch include the draft AI Act unveiled by the European Commission in April 2021, which focuses on specific uses of AI systems and their associated risks, and the Algorithmic Accountability Act of 2022 in the United States.

International Women’s Day this year had as its theme #breakthebias. As we pause to consider how we can better challenge stereotypes and discrimination, let us also give thought to the data challenge. It is important to get this right.
