Health systems and ardent cost-cutters believe the answer lies in a small group of patients who account for more spending than anyone else.
If they can catch these patients — often called “high utilizers” or “high cost, high need” — before their conditions deteriorate, providers and insurers can refer them to primary care or to social programs such as food services that can keep them out of the emergency department. An increasing number also want to identify the patients most at risk of hospital readmission, which can pile up still more large bills. To find them, they build their own algorithms that draw on past claims data, prescription history, and demographic factors such as age and gender.
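To make the approach concrete, here is a minimal sketch of the kind of risk scoring the article describes, combining the three feature types it names (past claims, medication history, demographics). Everything here is illustrative: the field names, weights, and threshold are invented for the example, not taken from any real payer's model, and production systems would learn weights from data rather than hand-tune them.

```python
from dataclasses import dataclass

@dataclass
class PatientHistory:
    """Minimal claims-style record; field names are illustrative, not a real schema."""
    prior_year_claims_usd: float
    er_visits_last_year: int
    active_prescriptions: int
    age: int

def risk_score(p: PatientHistory) -> float:
    """Toy weighted score over past spending, utilization, medications, and age.
    Weights are made up for illustration; real models are trained on data."""
    score = 0.0
    score += min(p.prior_year_claims_usd / 10_000, 5.0)  # cap the spending signal
    score += 0.8 * p.er_visits_last_year                 # ER use is a strong flag
    score += 0.3 * p.active_prescriptions                # medication burden
    score += 0.02 * max(p.age - 50, 0)                   # age contributes past 50
    return score

def flag_high_risk(patients, threshold=3.0):
    """Return the patients whose score crosses a referral cutoff."""
    return [p for p in patients if risk_score(p) >= threshold]
```

Flagged patients would then be routed to the preventive programs the article describes; the threshold trades off outreach capacity against missed cases.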
Moataz Al-Shakiwi, a research director at global market research firm IDC, said an increasing number of the providers he works with around the world are experimenting with, and deploying, predictive technology for prevention.
Nigam Shah, a professor of biomedical informatics at Stanford University, said carefully designed models can significantly reduce costs while keeping patients healthy. “We can use algorithms to do good, to find people who are likely to be expensive, and then to identify those for whom we might be able to do something,” he said.
But this requires a level of coordination and reliability that has so far been rare in healthcare algorithms. There is no guarantee that these models, often built in-house by insurers and health systems, will work as intended. If they rely solely on past spending as an indicator of future spending and medical need, they risk overlooking patients who have not been able to access care at all. And experts warn that the predictions won’t help if providers, payers and social services don’t adjust their workflows to get flagged patients into preventive programs.
“There is very little regulation,” Shah said. “There is definitely a need to standardize the industry in terms of how to do that and what to do with the information.”
The first issue, experts said, is that there is no agreed-upon definition of a high utilizer. As health systems and insurers develop new models, Shah said, they will need to be very accurate — and transparent — about whether their algorithms identify potentially high-cost patients by medical spending, by visit volume relative to a baseline, or by medical need based on clinical data.
Some models use cost as a proxy for medical need, but cost often fails to account for differences in a person’s ability to actually get care. In a widely cited 2019 research paper examining an algorithm used by Optum, researchers concluded that the tool — which used past spending to predict future needs — referred white patients to follow-up care more often than Black patients who were equally sick.
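One way health systems can look for this failure mode is a simple audit: among patients with equal clinical need, compare referral rates across demographic groups under a spending-based score. The sketch below assumes a synthetic record layout (`group`, `chronic_conditions` as a stand-in clinical-need measure, `spending_score`) invented for illustration; it is not the methodology of the 2019 paper, just the basic disparity check it motivates.

```python
def referral_rate(patients, score_key, threshold):
    """Fraction of a patient list whose score crosses the referral cutoff."""
    if not patients:
        return 0.0
    flagged = [p for p in patients if p[score_key] >= threshold]
    return len(flagged) / len(patients)

def audit_by_group(patients, group_key, score_key, threshold):
    """Compute the referral rate separately for each demographic group.

    Callers should first filter `patients` to a band of equal clinical
    need (e.g., the same chronic-condition count), so that any gap in
    the returned rates reflects the score, not underlying sickness.
    """
    groups = {}
    for p in patients:
        groups.setdefault(p[group_key], []).append(p)
    return {g: referral_rate(ps, score_key, threshold) for g, ps in groups.items()}
```

If two groups with identical `chronic_conditions` come back with very different referral rates, the spending-based score is encoding access to care rather than need — exactly the gap the researchers found.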
“Patients expected to be high-cost in the future can differ from patients with high medical needs because of confounding factors such as insurance status,” said Irene Chen, a computer science researcher at MIT who co-authored a Health Affairs article describing potential bias in health algorithms.
If a high-cost algorithm is not accurate, or exacerbates biases, it can be difficult to detect—especially when models are developed and implemented in individual health systems, without external oversight or scrutiny by government or industry. A group of Democratic lawmakers has introduced a bill that would require organizations using artificial intelligence to make decisions to evaluate them for bias and create a public repository of such systems at the Federal Trade Commission, although it is not yet clear whether it will advance.
This places the onus, for now, on health systems and insurers to ensure that their models are fair, accurate and beneficial to all patients. Shah suggested that developers of any cost-prediction model — especially payers operating outside the clinical system — review the data with providers to confirm that the patients being targeted also have the highest medical needs.
“If we are able to figure out who is going to get in trouble, or a medical problem, then fully understanding that cost is a proxy for that … we can then engage human processes to try and prevent that,” he said.
Another key question about using algorithms to identify high-cost patients is exactly what health systems and payers should do with this information.
“Even if you are able to predict that next year someone will cost a lot because this year they have stage 3 colon cancer, you can’t get rid of the cancer, so that cost can’t be prevented,” Shah said.
For now, the hard work of figuring out what to do with the algorithms’ predictions is left to the health systems building their own models. So, too, is the job of collecting data to determine whether these interventions actually improve patient outcomes or lower costs.
At UTHealth Harris County Psychiatry, a safety-net center that primarily serves low-income patients in Houston, researchers are using machine learning to better understand the patients with the highest needs and to direct resources to that population. In one study, researchers found that factors such as dropping out of high school or a schizophrenia diagnosis were associated with frequent, costly visits. Another analysis suggested that low income was strongly associated with homelessness, which in turn was associated with the cost of inpatient psychiatric treatment.
Some of these findings may seem obvious, but quantifying the strength of these links helps hospital decision-makers with limited staff and resources determine which social determinants of health to address first, according to study author Jane Hamilton, associate professor of psychiatry and behavioral sciences at the University of Texas Health Science Center at Houston College of Medicine.
The homelessness findings, for example, have led to more local intermediate interventions such as residential “step-down” programs for psychiatric patients. “What you have to do is get all the social workers to really sell it to the social work department and the medical department to focus on one particular outcome,” Hamilton said.
The predictive technology has not yet been integrated into the hospital’s electronic health records, so it is not part of clinical decision support. Instead, social workers, doctors, nurses, and executives are briefed separately on the factors the algorithm identifies as driving readmission risk, so they can refer some patients for interventions such as short-term acute visits, said Lokesh Shahani, the hospital’s chief medical officer and assistant professor in the Department of Psychiatry and Behavioral Sciences at UTHealth. “We rely on the profile defined by the algorithm and then kind of pass that information on to our doctors,” Shahani said.
“It’s a little more difficult to put a complex algorithm into hospital electronic health records and change the workflow,” Hamilton said. Shahani said the psychiatric hospital plans to link the two systems in the coming months so that risk factors are flagged in individual patient records.
Part of changing hospital operations is determining which visits can be avoided and which are part of the normal course of care. “We’re really looking at the elastic factors,” Hamilton said. “What could we do differently?”