Never in history has a technological breakthrough arrived fully optimized and pre-packaged for practical use. By necessity, a trial period must follow every major innovation, during which it is refined through testing; widespread adoption doesn’t happen until most people can be reasonably assured that the benefits of using a new technology outweigh its drawbacks.
Digital healthcare is currently undergoing its own refinement period. Technical advances in telemedicine, remote monitoring, mobile health apps, and wearable devices have cultivated interest in digital tech’s potential as a method of augmenting medical treatment. This potential is evident to many in the medical community, as the American Medical Association reports that most physicians believe implementing digital health in everyday practice could enable more effective care.
It’s clear that the gap is narrowing between digital healthcare’s practical potential and its current reality; however, several significant issues need to be resolved before digital health can claim its place as a fixture in daily clinical practice.
Connecting The Tech And Healthcare Industries
Some medical practitioners are concerned that the current trajectory of tech development shows “a shocking lack of focus on the place where healthcare takes place,” according to John S. Rumsfeld, chief innovation officer of the American College of Cardiology. This disconnect is rooted in a lack of communication between healthcare professionals and tech entrepreneurs; a 2016 survey reported that 11% of health app companies did not involve medical professionals at all in the process of developing medical apps. “Unfortunately it often takes the critical eye of a physician to judge whether there is a credible level of evidence for an app or whether it is just a bunch of hocus pocus,” said David M. Levine, a researcher at Harvard Medical School.
Collecting More And Better Evidence
The evidence employed in defense of certain digital health products, particularly health apps, can be shallow and deceptive. Many clinical trials are conducted in a monitored setting, where subjects are meticulously instructed in the product’s proper use and incentivized with payment to use it consistently and correctly. In the real world, however, the percentage of patients who adhere to correct procedure will be far lower than in the lab; the reliability of any evidence gathered under such stringent conditions is therefore questionable at best. Until their usefulness is proven, primary care providers are (rightfully) wary of incorporating into health regimens any wellness programs that base their claims on insubstantial evidence.
Enabling Programs To Share Data
Another huge roadblock for digital health is the fact that many (if not most) programs cannot share and exchange data with one another. Without functions that allow for integrated data transfer, particularly with regard to electronic health records, the interconnectivity that lets programs access the information needed to provide beneficial service will “remain largely unattainable,” says Dr. Levine. “We want it to all be visible to our entire health team so that anyone can log into it and it is all in one place.”
While these problems are serious, they are by no means insurmountable. With time, proper research, and professional counsel, we will see a substantial rise in digital health adoption rates as the underlying technology, along with our ability to manage and engage with it, improves.