Can artificial intelligence close the gap in women’s health care? Pipa News

For decades, women have been vastly underrepresented in medical research. That’s starting to change — Health Canada’s 1997 mandate to include women in clinical trials was a helpful start — but many of the protocols for diagnosing and treating patients are still based on what works for men. That means women take drugs dosed for the average male body weight and experience unexpected side effects due to biological differences. Women are 50 to 75 percent more likely than men to experience a side effect from medication.

Their symptoms are also less likely to be taken seriously. For example, while heart disease is the leading cause of premature death in women in Canada, early signs of heart attack were overlooked by doctors in 78 percent of women, according to the Heart and Stroke Foundation. And issues that specifically affect women receive far less attention than those that only affect men — five times more money goes to research on erectile dysfunction than studies on premenstrual syndrome.

In recent years, researchers have looked to artificial intelligence (AI) to improve healthcare. Experiments with the technology have shown promise in easing patient flows, helping diagnose skin problems, reducing human hours spent on administrative details, prioritizing patient needs and facilitating drug discovery, among other things.

So can AI help close the knowledge gap and improve healthcare for women and other underserved groups? It’s an exciting prospect and very conceivable, says Ashley Casovan, executive director of the Responsible AI Institute in Ottawa. The non-profit organization works to promote the responsible use of AI systems through the development of a certification program. “There are tons of potential opportunities,” she adds — but only if there is rigorous oversight and inclusiveness every step of the way.

AI is only as good as the data it can access. Aren’t these algorithms perpetuating false assumptions and misunderstandings about women’s health?

There are many concerns about using AI systems in healthcare. These tools are built on historical biases and on the systemic issues that already exist in healthcare, where women don’t have the same level of access and there isn’t the same amount of research to understand women-specific conditions.

Those biases then bleed into how an AI system is trained. If it isn’t built responsibly, it can perpetuate those systemic, human-induced healthcare problems.

For example, we saw this in Google’s first attempt to create a tool for diagnosing and treating skin conditions. (Black people made up only 6.8 percent of the sample dataset.) It’s important to make sure these systems work fairly for everyone.

Are you an optimist when it comes to this technology? Is inclusive AI actually possible in healthcare?

There is a way to make it more equitable by identifying the issues and then identifying how to mitigate some of the risks and harms for specific groups through design, development and operational processes.

Responsible AI tries to establish standards for assessing these systems before they are deployed. That means looking at things like what data is used to train the model, what type of patient consent is required, and whether a system supports physician decision-making or operates as more of an end-to-end process. And once a system is in use, we need to make sure people understand the implications and, if something goes wrong, that there are redress mechanisms to support them.

Recommendations have been made to ensure that people affected – women, various minority groups – are involved in the design and development of those systems. The whole process should be more inclusive.

What is the potential of AI to level the playing field for women in healthcare?

It’s like how seat belts were designed for men, not women. How do you design something more nuanced? With AI, there is an opportunity to offer more options tailored to different groups. A major challenge is when gender differences intersect with other factors. If a woman also has pre-existing conditions, is the right medicine available for her?

It comes back to collecting data. If women, children, and people of color are underrepresented in that data, then its interpretations become skewed toward the overrepresented populations.

What systems and rules should there be?

At the federal level, there is recognition that AI needs oversight. That was documented last year in Bill C-27, which combines AI regulation with data privacy and consumer protection — all of these components go hand in hand.

The data that goes in is one factor; the context in which a system is deployed is another. Canada recognizes that a uniform policy will not serve every context in which AI is used.

Health Canada is thinking about what regulation for this industry should look like. And since healthcare is regulated provincially, it is working with the provinces.

Health inequities are recognized problems. They are not new, and they were not created by AI. But we are at a turning point where we can use technology to help us or to perpetuate this harm — and I don’t want to see the latter. There must be regulations, oversight and standards so that AI is used in a way that benefits everyone, rather than just a few.

Ashley Casovan will participate in a panel discussion on what it takes to transform healthcare leadership at a special MaRS Morning networking breakfast and talk on March 8. More information here.

Torstar, the parent company of the Toronto Star, has partnered with MaRS to showcase innovations in Canadian companies.

Disclaimer: This content was produced as part of a partnership and therefore may not meet the standards of impartial or independent journalism.

