Before we get into this complex issue, it should be noted that AI is already used in the medical field: AI systems can already suggest a likely course of treatment given a certain set of symptoms. But now, with AI booming amid both hype and panic over its abilities, a specific fear is coming into focus for doctors who are uncertain about its application in medicine.
AI can currently pass the USMLE (United States Medical Licensing Examination), scoring around 60%, roughly the passing threshold. ChatGPT, for instance, was able to pass the test without being specially trained on the medical material it covers. This stoked fears among medical doctors who spent years in training before they were able to pass the USMLE. Beyond fears of being replaced in diagnostics, a new question has emerged: would deviating from an AI-generated diagnosis be grounds for medical malpractice? Alternatively, could following the AI's suggestion also result in a medical malpractice lawsuit?
How would AI work in terms of medical diagnostics?
AI is good at applying statistical principles to a constellation of symptoms and generating a list of potential maladies a patient may have. In these cases, the doctor inputs the symptoms into the AI, and the AI generates potential conditions that could account for all of them. Both AI and doctors employ statistical reasoning to reach a diagnosis, but in both cases, a patient can have a condition that is not statistically likely. Doctors often discuss this in terms of zebras and horses: if you see a horse-shaped animal, the most likely answer is that you're looking at a horse. It could be a zebra, but that's less likely. An AI would do essentially the same thing, and it could provide the supporting statistics as well.
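To make the idea concrete, the ranking process described above can be sketched in a few lines of code. This is a purely illustrative toy, not a real diagnostic tool: the condition names, symptom lists, and prevalence figures are all invented assumptions, and a real system would use far more sophisticated models.

```python
# Hypothetical sketch of how an AI diagnostic aid might rank candidate
# conditions. All condition names, symptoms, and prevalence numbers are
# illustrative assumptions, not real medical data.

# Each condition gets an assumed prior prevalence ("horses" are common,
# "zebras" are rare) and a set of typical symptoms.
CONDITIONS = {
    "common_cold":  {"prior": 0.30,  "symptoms": {"cough", "runny_nose", "sore_throat"}},
    "influenza":    {"prior": 0.10,  "symptoms": {"cough", "fever", "body_aches"}},
    "rare_disease": {"prior": 0.001, "symptoms": {"cough", "fever", "rash"}},
}

def rank_diagnoses(observed):
    """Score each condition by symptom overlap, weighted by prior prevalence."""
    observed = set(observed)
    scores = []
    for name, info in CONDITIONS.items():
        # Fraction of the condition's typical symptoms that the patient shows.
        overlap = len(observed & info["symptoms"]) / len(info["symptoms"])
        scores.append((name, overlap * info["prior"]))
    # Most statistically likely ("horses") first, unlikely ("zebras") last.
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

for condition, score in rank_diagnoses({"cough", "fever"}):
    print(condition, round(score, 4))
```

Note how the prior dominates: even when a rare disease matches the symptoms well, the common condition ranks higher, which is exactly the horses-before-zebras logic, and exactly why an unlikely but correct diagnosis can slip past both a doctor and an AI.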
In terms of medical malpractice lawsuits, one of the most common types of claims is that a healthcare provider failed to provide an accurate and/or timely diagnosis: the doctor fails to catch a problem in time, and the patient suffers a severe injury. Is the doctor responsible? That depends on a complex test to determine whether the doctor applied the prevailing standard of care to the injured patient. It's entirely possible for a doctor to reach the wrong diagnosis while following the standard of care, for the patient to be injured, and for this still not to amount to medical malpractice.
How does AI factor into medical malpractice?
If the doctor follows an AI's recommendation and it turns out to be wrong, the doctor could be sued for failing to exercise independent judgment. If the doctor deviates from the AI's recommendation and the AI turns out to have been right, the doctor could be sued for ignoring it. Either way, doctors would be held to a potentially ambiguous standard, and that is a cause for concern among physicians.
Talk to a Tampa, FL Medical Malpractice Attorney Today
Palmer | Lopez represents Tampa residents who have been injured by negligent medical doctors. Call our Tampa medical malpractice lawyers today to schedule a free consultation and learn more about how we can help.
Source:
forbes.com/sites/lanceeliot/2023/05/23/generative-ai-is-stoking-medical-malpractice-concerns-for-medical-doctors-in-these-unexpected-ways-says-ai-ethics-and-ai-law/?sh=df389e664373