Workers’ compensation medical treatment requests must be supported by scientifically based evidence published in the peer-reviewed literature, a body that runs to hundreds if not thousands of publications. So how might a busy practitioner keep up?
U.S. researchers, reporting in the Annals of Internal Medicine, say that “smart” search programs can ease the process of systematically reviewing new medical research, a key step in getting the best practices from laboratories to doctors’ offices.
The Institute of Medicine (now the National Academy of Medicine) says clinical practice guidelines should be based on a systematic review of the evidence, lead author Dr. Paul Shekelle of the RAND Corporation in Santa Monica, California, told Reuters Health by email.
Systematic reviews are a cornerstone of evidence-based care and a necessary foundation for care recommendations to be labeled clinical practice guidelines. However, they become outdated relatively quickly and require substantial resources to maintain relevance. One particularly time-consuming task is updating the search to identify relevant articles published since the last search.
Typically, researchers and their assistants run computer searches that turn up anywhere from a handful to thousands of new studies, then determine which ones are relevant and fold the findings into updated guidelines and recommendations.
Shekelle and colleagues thought machines could do more of the job and do it faster, so they compared machine-learning methods with the standard search methods for identifying new information.
They tested the idea on three health conditions: gout, low bone density and osteoarthritis of the knee. The smart search program “learned” which key terms to look for by analyzing words from studies that were included in prior reviews on each topic.
In all three cases, computers, provided only with the titles and summaries of articles included in previous reviews, cut the number of articles researchers had to screen by 67 to 83 percent, according to the results in Annals of Internal Medicine.
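For readers curious what such a “smart” search looks like in practice, here is a minimal illustrative sketch in Python using scikit-learn. It is not the authors’ actual pipeline; the example titles, labels, and relevance cutoff are hypothetical. The general idea is the one described above: train a text classifier on the titles and abstracts a prior review labeled as included or excluded, then use it to rank newly retrieved citations so reviewers screen far fewer articles.

```python
# Sketch only: rank new citations by predicted relevance, trained on
# the include/exclude decisions from a prior systematic review.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: titles + abstracts from the prior review,
# labeled 1 if the article was included, 0 if it was excluded.
prior_texts = [
    "Allopurinol dosing in chronic gout: a randomized trial ...",
    "Dietary patterns and serum urate levels in healthy adults ...",
]
prior_labels = [1, 0]

model = make_pipeline(
    TfidfVectorizer(stop_words="english", ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(prior_texts, prior_labels)

# Score newly retrieved citations; reviewers screen the highest-ranked
# first and can stop once predicted relevance falls below a cutoff.
new_texts = ["Febuxostat versus allopurinol for recurrent gout flares ..."]
scores = model.predict_proba(new_texts)[:, 1]
for text, score in sorted(zip(new_texts, scores), key=lambda p: -p[1]):
    print(f"{score:.2f}  {text[:60]}")
```

The savings the study reports come from exactly this kind of triage: human screeners focus on the top of the ranked list rather than reading every title the search returns.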
“Machine learning methods are very promising as a way to reduce the amount of time and effort for the literature search, which in turn should make it easier to update the systematic review, which in turn can facilitate keeping clinical practice guidelines up to date,” Shekelle said.
The approach would “shorten the time from completion of research studies to adoption of effective treatments in clinical practice,” said Dr. Alfonso Iorio from McMaster University in Hamilton, Ontario, Canada, who coauthored an editorial accompanying the report.
“In the near future, artificial intelligence will also be used to match individual needs with the best available health care intervention; one necessary step to get there is proper classification of existing and newly generated knowledge,” Iorio said.