SUMMARY: DigitalHealth.com, a British website, had a fascinating look at the use of health apps during the pandemic. Among their findings:
- Downloads of apps supporting consumers with mental health needs increased by nearly 200% from summer 2019 to summer 2020
- Downloads of those supporting consumers with diets and weight loss rose by a massive 1294% from mid-2019 to mid-2020.
Some may see this as a good sign but we need to proceed with caution.
When I first read these statistics, I thought, “what a great opportunity,” but I wanted to check with some physician thought leaders. They were a bit taken aback when I read the stats to them; when I probed some more, their response was, “if patients are using these applications as a tool, then I’m all for it, but if they’re using these applications instead of coming to see me, then I am quite worried.”
An epidemiologist was so taken aback by one stat (downloads of apps helping consumers manage their diabetes rose by 482%) that she fell silent. I thought our call had dropped; then she responded, “I want to know more, a LOT more.” She told me she wanted to understand better how patients were using these apps, where the apps were developed, and whether they had been studied to ensure they were based on sound clinical data.
There is no doubt that the pandemic has shifted many people to digital health, but we need to better understand the implications of this shift. The epidemiologist I spoke to, for example, said that “type 2 diabetics may believe they are managing their diabetes when in fact they are in trouble.”
According to BI, “as the vanguards of the digital transformation of healthcare, investors are taking notice of digital health startups’ potential—and committing record funds to them. Globally, healthcare funding to private firms reached $18.09 billion in Q2 2020, according to CB Insights—establishing a new quarterly record, with equity investments growing 6.3% quarter-over-quarter (QoQ) from 1,197 deals in Q1 2020 to 1,272 deals in Q2 2020.”
What worries me is that too much money is flowing into digital health fly-by-nights in the hope of scoring a huge return. Digital health can be a game-changer, but only when apps and home devices are studied and get buy-in from HCPs and insurers. No app can ever replace a trained medical professional.
There are also other issues that need to be addressed for digital health:
1ne: AI might pick up hand trembling and detect Parkinson’s disease. Catching a disease early is very beneficial. At the same time, we need ways to guarantee that such information is not misused in practice – e.g., sold to third parties, such as insurance companies, before the patient is even given an early warning to seek a detailed diagnosis. It is also important to ensure that doctors oversee diagnoses.
2wo: Bad data = bad suggestions? AI learns from data sets, but there is a very real risk that algorithms “learn,” and may accentuate, underlying biases present in those data sets.
3hree: Oops! What happens when AI/digital health harms patients? If a doctor makes a mistake in diagnosing and treating your illness, rules establish their responsibility and legal liability. If an app makes a fatal recommendation to a patient, our legal systems fail to define who is responsible and accountable.
4our: Who updates AI and ensures AI has updated medical information? We see new medical data almost every day, some of which the FDA says is questionable. Who is ultimately responsible for ensuring AI/digital health apps are up to date?
With the shift to digital health, the FDA will at some point need a strategic plan for reviewing digital health apps to ensure they aren’t dangerous. Pharma could also recommend certain digital health apps, but it would need to review all the data used to develop them first.
Digital health is exciting and terrifying, and the healthcare industry needs to do a better job in vetting this technology.