Crikey
Health
Cam Wilson

Emergency room patients’ scans used to train AI without their knowledge, documents suggest

Emergency patients' medical scans from public hospitals were likely provided by Australia's largest medical imaging provider to train AI without the patients' knowledge, documents reveal.

Last month, Crikey revealed that I-MED had given potentially hundreds of thousands of chest x-rays to Australian health technology company harrison.ai as part of a partnership.

In response, the Office of the Australian Information Commissioner said it was making “preliminary inquiries” into the practice, as politicians and consumer, privacy and patient advocacy groups raised concerns about patient data being used to train AI without express consent. 

Now, further analysis of a 2021 paper on the training of harrison.ai's chest x-ray AI tool, the Annalise.ai CXR, uncovers new details about the kinds of patients whose data was used.

The methodology section of the paper lists the different settings reflected in the 821,681 chest x-ray images obtained from I-MED and a handful of other sources. These included cases "representative of inpatient, outpatient, and emergency settings".

Emergency settings are one of the few situations in which patients can undergo procedures like a chest x-ray without giving prior consent. This might include cases where a patient is unconscious or otherwise unable to agree to a procedure necessary to save their life or prevent serious injury. NSW Health's consent policy, for example, lists it as one of two situations in which treatment may proceed without consent.

It’s likely, too, that harrison.ai’s chest x-ray training set included patients from the public health system. According to I-MED, it has “clinics based in … public hospitals”. 

I-MED and harrison.ai did not respond to requests for comment. A spokesperson for Health Minister Mark Butler did not answer questions about whether he was aware of I-MED's use of public hospital patient health data to train AI, whether he had made inquiries about it, and whether he had any concerns.

Bond University Associate Professor of Law Dr Wendy Bonython said she would expect a provider to seek consent for the purpose of training AI, even if it was justified in carrying out a procedure without initial consent.

As Bonython told Crikey, “There’s no good reason why, if they wanted to include those images in a data training set, they couldn’t come back and seek consent later on, when the person it relates to or their substitute or delegate decision-maker has had an opportunity to actually think through, well, would the person want their data included?”

Privacy expert Dr Bruce Baer Arnold said that seeking consent for purposes like this is crucial to trust in the healthcare system and for the dignity of the patient. Even when it’s part of a larger data set, he said, there is still a risk to the individual.

“‘Big data’ about cohorts — potentially millions of people — is typically valuable because it’s in bulk, often comprehensive (e.g. age, gender, ethnicity and inferred income, occupation or location, rather than just a specific scan image), and enables pattern-matching that might be used for legitimate purposes and in future are likely to be used for purposes that we currently don’t envisage.”
