Content provided by the Royal Australasian College of Physicians. The Royal Australasian College of Physicians or its podcast platform partner uploads and delivers all podcast content, including episodes, graphics and podcast descriptions. If you believe someone is using your copyrighted work without your permission, you can follow the process described at https://fi.player.fm/legal.

Ep99: When AI goes wrong

38:43
 

Manage episode 373783911 series 2898400

This is the fourth part in a series on artificial intelligence in medicine, in which we try to unpick the causes and consequences of adverse events resulting from this technology. Our guest David Lyell is a research fellow at the Australian Institute of Health Innovation (Macquarie University) who has published a first-of-its-kind audit of adverse events reported to the US regulator, the Food and Drug Administration. He breaks down those caused by errors in the machine learning algorithm, by other aspects of a device, or even by user error.
We also discuss where these all fit into the four stages of human information processing, and whether this can inform determinations about liability. Uncertainty around the medicolegal aspects of AI-assisted care is one of the main reasons that practitioners report discomfort about the use of this technology. It's a question that hasn't yet been well tested in the courts, though according to academic lawyer Rita Matulonyte, AI-enhanced devices don't change the scope of care that has been expected of practitioners in the past.

Guests
Rita Matulonyte PhD (Macquarie Law School, Macquarie University; ARC Centre of Excellence for Automated Decision Making and Society; MQ Research Centre for Agency, Values and Ethics)
David Lyell PhD (Australian Institute of Health Innovation, Macquarie University; owner Future Echoes Business Solutions)
Production
Produced by Mic Cavazzini DPhil. Music licenced from Epidemic Sound includes ‘Kryptonite’ by Blue Steel and ‘Illusory Motion’ by Gavin Luke. Music courtesy of Free Music Archive includes ‘Impulsing’ by Borrtex. Image by EMS-Forster-Productions licenced from Getty Images.

Editorial feedback kindly provided by physicians David Arroyo, Stephen Bacchi, Aidan Tan, Ronaldo Piovezan and Rahul Barmanray, and RACP staff member Natasa Lazarevic PhD.

Key References
More than algorithms: an analysis of safety events involving ML-enabled medical devices reported to the FDA [Lyell, J Am Med Inform Assoc. 2023]
How machine learning is embedded to support clinician decision making: an analysis of FDA-approved medical devices [Lyell, BMJ Health Care Inform. 2021]
Should AI-enabled medical devices be explainable? [Matulonyte, Int J Law Inform Tech. 2022]

Please visit the Pomegranate Health web page for a transcript and supporting references. Log in to MyCPD to record listening and reading as a prefilled learning activity. Subscribe to new episode email alerts or search for ‘Pomegranate Health’ in Apple Podcasts, Spotify, Castbox or any podcasting app.

