Since ChatGPT was released to the public, proponents of artificial intelligence have lauded its potential to improve healthcare –– and even cure cancer.
From reducing administrative tasks and streamlining operations to improving diagnoses and reducing mistakes, AI can potentially improve patient care.
Chrystie Howard, ARM, CRM, CIC, an expert in enterprise risk management at HUB International, considers how AI can improve care at emergency clinics and even reduce medical malpractice lawsuits.
Healthcare insurance broker MagMutual conducted an informal study to evaluate ChatGPT's potential to improve emergency care diagnoses. The company selected medical malpractice claims involving a "delay" or "failure to diagnose" and compared ChatGPT's responses to the emergency physicians' actions. It found that ChatGPT might have helped the physicians avoid the diagnostic error, and the resulting malpractice claim, in slightly more than half of the claims reviewed (52%).
While the study was small and informal, most healthcare leaders weighing AI's role in the industry agree its potential is massive. Still, in the highly regulated healthcare field, where new technology tends to be adopted slowly, most organizations are only beginning to consider how best to use AI.
Understanding AI’s potential in emergency care
While AI for public consumption is still in its infancy, research and tasks that once took hours or days can now take minutes. In emergency medicine, machine learning algorithms can analyze vast amounts of patient data to identify patterns and make real-time recommendations, potentially supporting emergency clinicians in delivering faster, more accurate care.
Following are some examples of how AI can help improve care in emergency healthcare settings.
Operational efficiency
AI-powered systems can automate and streamline administrative tasks such as patient triage, diagnostic scheduling and chart notes. This can free up staff to focus on direct patient care.
“When there’s a lag in ‘door to doc’ times in an emergency room, people at best get testy and at worst get inadequate care. Your satisfaction rates could go down,” says Howard.
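To make this concrete, here is a minimal sketch of how an automated system might order a waiting-room queue by acuity and time already waited. The acuity scale, scoring weights and patient data are illustrative assumptions, not a clinical standard; any real system would follow an established triage protocol.

```python
import heapq
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass(order=True)
class QueueEntry:
    priority: float                      # lower value = seen sooner
    patient_id: str = field(compare=False)

def queue_priority(acuity: int, arrival: datetime, now: datetime) -> float:
    """Acuity runs 1 (most urgent) to 5 (least urgent); waiting time slowly
    raises priority so low-acuity patients are not left waiting indefinitely."""
    minutes_waiting = (now - arrival).total_seconds() / 60
    return acuity - 0.02 * minutes_waiting   # illustrative weighting only

now = datetime(2024, 6, 1, 14, 0)
patients = [
    ("A-103", 3, now - timedelta(minutes=50)),  # moderate acuity, long wait
    ("A-104", 2, now - timedelta(minutes=5)),   # higher acuity, just arrived
    ("A-105", 4, now - timedelta(minutes=90)),  # low acuity, very long wait
]

heap = [QueueEntry(queue_priority(a, t, now), pid) for pid, a, t in patients]
heapq.heapify(heap)
while heap:
    entry = heapq.heappop(heap)
    print(f"Next patient: {entry.patient_id} (score {entry.priority:.2f})")
```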
Predictive analytics for triage
AI can predict patient outcomes by analyzing data from symptoms and patient electronic health records, allowing emergency clinics to prioritize high-risk patients and allocate resources more effectively.
“Staffing issues are a gigantic headache for healthcare, and AI could reduce some of the clinical staff needs for triage,” says Howard.
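As a rough illustration of the kind of model such a workflow might rest on, the sketch below trains a simple classifier to flag patients at higher risk of deterioration. The feature names, training data and risk threshold are made up for the example; a real model would be trained on large volumes of EHR data, validated clinically and kept under the oversight discussed later in this article.

```python
# Minimal sketch: predicting a high-risk flag from a few vital-sign and
# history features. All features and training data here are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Columns: age, heart_rate, systolic_bp, temp_f, prior_admissions
X_train = np.array([
    [34,  88, 128,  98.7, 0],
    [71, 112,  92, 101.3, 3],
    [52,  76, 135,  98.2, 1],
    [66, 124,  85, 102.1, 2],
    [29,  70, 118,  98.6, 0],
    [80, 105,  98, 100.4, 4],
])
y_train = np.array([0, 1, 0, 1, 0, 1])  # 1 = deteriorated within 24 hours

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# Score a newly arrived patient and flag for expedited triage review.
new_patient = np.array([[68, 118, 90, 101.0, 2]])
risk = model.predict_proba(new_patient)[0, 1]
print(f"Predicted deterioration risk: {risk:.0%}")
if risk > 0.5:  # threshold is an assumption; in practice set by clinical policy
    print("Flag for immediate clinician review")
```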
Enhanced diagnostics
AI algorithms can analyze symptoms, patient history, lab results and medical imaging with high accuracy.
“Decisions and conclusions can be drawn more quickly if they can be made in a split second by an automated AI process. Of course, you still need clinical judgment to adjust them and make a final decision,” says Howard.
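Howard's point suggests a human-in-the-loop design: the model proposes, the clinician decides. The sketch below shows one way that pattern might look in code; the diagnoses, probabilities and review threshold are placeholders for illustration only, not output from any real diagnostic model.

```python
# Minimal sketch of a human-in-the-loop pattern: the model proposes a ranked
# differential, but nothing is acted on until a clinician confirms or overrides.
from dataclasses import dataclass

@dataclass
class Suggestion:
    diagnosis: str
    probability: float

def route_suggestions(suggestions: list[Suggestion], review_threshold: float = 0.90):
    """Even high-confidence suggestions go to a clinician; lower-confidence
    ones are explicitly marked as needing additional workup."""
    for s in sorted(suggestions, key=lambda s: s.probability, reverse=True):
        action = ("confirm or override" if s.probability >= review_threshold
                  else "consider additional workup")
        print(f"{s.diagnosis:<22} p={s.probability:.2f}  -> clinician: {action}")

route_suggestions([
    Suggestion("Pulmonary embolism", 0.64),
    Suggestion("Pneumonia", 0.93),
    Suggestion("Acute bronchitis", 0.21),
])
```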
Key considerations before implementing AI
Despite its potential, embedding AI into emergency care is not automatic. Howard recommends carefully considering several factors to ensure that AI technology is integrated effectively and ethically.
1. Data quality, integration and interpretation
Anyone who’s ever tried to access patient history from another system’s EMR knows that seamless technology integration in healthcare is easier said than done. It’s hard to imagine a near future where AI can access patients’ full medical histories.
That said, emergency care clinics need to partner with technology companies that will sign a HIPAA-compliant business associate agreement. Many healthcare organizations looking to pair patient data with AI are building proprietary systems so that data never has to be shared with outside vendors.
When AI suggests tests or diagnoses, clinicians still need to review the output to ensure the results are accurate.
2. Ethical considerations of AI in emergency care
The healthcare industry is subject to strict ethical standards. A 2024 National Consumer Insight Study found that patients' comfort with AI in healthcare varies by age, race and gender. Following are some of the study's findings on AI comfort levels among its 1,000 respondents from across the U.S.:
- 23% are either mostly or completely comfortable with AI
- 41% have mixed feelings
- 9% are mostly uncomfortable
- 15% are completely uncomfortable
“There’s definitely concern over how comfortable people are with sharing so much personal information and the risks of data breaches. Hackers can exploit this data for financial gain or to harm organizations. There’s also a general distrust of AI among the public, especially in personal and emergency healthcare, which might not be as accepting as in other areas,” says Howard.
Emergency clinics that adopt AI in patient care will need to develop a roadmap for comfortably and ethically bringing patients along that journey. “Patient education is essential, but where and how it happens remains unclear,” says Howard.
3. Cost-benefit analysis
Like any emerging technology, AI in emergency care can be costly and distracting to implement. New tools promising to save time, lives, and money are developed every day.
Before investing in an AI tool for emergency care, decision makers should conduct a thorough cost-benefit analysis to determine whether its potential benefits outweigh the financial and time investment. Pay special attention to how easily staff can be trained on the technology and how long it takes them to accept it. "This analysis should consider not only the immediate costs but also the long-term savings in improved efficiency and patient outcomes," says Howard.
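For a sense of what such an analysis might look like at its simplest, the sketch below computes a payback period from labor savings alone. Every figure is a placeholder assumption rather than an industry benchmark, and it deliberately omits the harder-to-quantify benefits Howard mentions, such as improved outcomes and avoided claims.

```python
# Illustrative back-of-the-envelope payback calculation; all numbers below
# are placeholder assumptions for the example.
upfront_cost = 250_000          # licensing, integration, initial training
annual_subscription = 60_000
hours_saved_per_week = 40       # documentation and triage time reclaimed
blended_hourly_cost = 55        # average staff cost per hour

annual_labor_savings = hours_saved_per_week * 52 * blended_hourly_cost
net_annual_benefit = annual_labor_savings - annual_subscription

print(f"Annual labor savings: ${annual_labor_savings:,.0f}")
print(f"Net annual benefit:   ${net_annual_benefit:,.0f}")
if net_annual_benefit > 0:
    print(f"Simple payback period: {upfront_cost / net_annual_benefit:.1f} years")
else:
    print("Tool does not pay for itself on labor savings alone")
```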
The future of AI in emergency care
As AI evolves, its potential to transform emergency care –– and healthcare in general –– will only increase. As with the adoption of electronic medical records, successfully integrating AI into emergency care requires careful planning, collaboration and a commitment to ethical and regulatory standards.
With careful consideration, emergency care clinics can harness AI to improve patient care, enhance operational efficiency, stay at the forefront of healthcare innovation, and perhaps even reduce medical malpractice lawsuits.
Chrystie Howard is ERM Leader, Complex Risk Practice at HUB International.