- 16th April 2018
- Posted by: Manolis
The media is replete with articles about how artificial intelligence (AI) is going to change the medical world, in cancer detection and other diagnostic and treatment disciplines. The articles describe how AI, primarily deep learning (DL) applications, can be as accurate as or better than medical experts. That means they'll be quickly adopted, right? Not really; there's a regulatory picture many ignore.
One of the first expert systems, a subset of AI, was MYCIN, initially developed as a doctoral dissertation by Edward Shortliffe at Stanford University. Development began in 1972 and continued over the next several years. MYCIN's purpose was to identify the bacteria causing different infectious diseases, and its accuracy rivaled, and actually slightly exceeded, that of human experts.
Now think back to your last doctor's visit. Can you remember the doctor running your medical records through an AI system and then following the software's recommendations? Of course not.
Nothing technically prevented MYCIN from being productized and sold. The challenge was overcoming legal and ethical considerations: if the system was wrong, should the doctor or the developers be held responsible?
Regulations and Compliance: Even More Critical in Healthcare
That issue still stands today. Without a regulatory and legal framework to indemnify software companies and to approve the new technologies and medical practices that leverage AI, the risk is too high. There needs to be some way to incorporate software into the regulatory system. In the near term, physicians and others in the medical community will continue to make the final decisions based on AI and other input. That postpones the need for a legal framework, but does nothing to address the regulatory issues.
An overarching issue for the acceptance of DL into medicine is the need for regulatory approval. All medical advances must demonstrate a benefit to patients. Machine learning differs from, say, a new vaccine because of the second word: learning. Think about what it takes to get regulatory approval for an algorithm. Then consider that the algorithm learns from more data and adapts. It's no longer the same algorithm, and the new learning can't be implemented in clinical practice without updated approvals. “The regulatory cycle is not set up to address the machine learning environment,” says Elad Benjamin, Co-founder & CEO at Zebra Medical Vision. “The clinical, testing, approval process will be difficult for algorithmic adaption in the medical industry.”
Another area of concern is data privacy. General rules, such as the EU's GDPR, and medical-specific laws, such as HIPAA in the US, are increasingly protective of patient privacy. Those laws will affect how medical systems move into the cloud. “We’ve seen hospitals requiring certain data stay in local systems, limiting what can go up to cloud servers,” said Fabien Beckers, CEO at Arterys. “Solutions that keep protected health information safe will need to be considered in order to leverage the advantages of the cloud while remaining in compliance with government privacy regulations.”
Machine Learning, Healthcare and Rural Communities
In much of the non-US world, universal healthcare is the standard, which means regulations come from the national government. In addition, in nations such as China and India, with large patient-to-doctor ratios, the pressure to help doctors reach more patients should speed the inclusion of DL in the diagnostic and treatment process. We should therefore expect DL to move into those markets more quickly than into the US. There's global demand, and many nations see the need for the technology in order to improve patient care.
On the other hand, the US is a large market for technology, and the lack of a national policy means adoption here will be slower. With US healthcare being so fractured, what will drive the adoption of AI and DL into this market? I expect the main driver will be support for rural healthcare. In fact, the National Conference of State Legislatures is very aware of this issue and points out that the problem is expected to grow: “The rural population of those ages 55 to 75 is estimated to grow 30 percent between 2010 and 2020 due, in part, to retiring baby boomers migrating from urban areas.”
Our system is expensive, and rural communities are severely underserved. The role of nurse practitioner is one way to extend more health services to those communities, but it addresses only part of the problem. Getting access even to nurse practitioners will be difficult without higher taxes and the creation of incentives to draw healthcare personnel into those communities.
Given the challenges in the US system, I see state legislatures as the place where regulatory and legal compliance issues will be addressed, helping the healthcare community extend its coverage through the use of AI.
The breadth of healthcare diagnostics and treatment methods that can be advanced through artificial intelligence has grown significantly since the 1970s. The regulatory environment hasn't kept pace. For AI to gain traction in healthcare, regulatory frameworks must be established; only then can companies build the compliance structures necessary to smooth the adoption of technology that will save lives.