Health-care professionals should reasonably assume that data from behaviour-changing apps will be “shared with commercial entities whose own privacy practices have been questioned”, according to the authors of a new study.
Australian and US researchers examined 36 top-ranked apps for depression and smoking cessation and found that 29 were transmitting data to Facebook or Google, although only 12 made that clear in a privacy policy.
They recommend prescribing and using only apps that have been carefully scrutinised to ensure they are not covertly sharing information.
“Because most health apps fall outside government regulation, up-to-date technical scrutiny is essential for informed decision making by consumers and health care professionals wishing to prescribe health apps,” they write in a paper published in the journal JAMA Network Open.
The study was led by Kit Huckvale of the University of New South Wales, Australia.
Only 25 of the 36 apps studied incorporated a privacy policy, and only 16 of those described secondary as well as primary uses of collected data. And while 23 stated in a policy that data would be transmitted to a third party, transmission was detected in 33.
“Data sharing with third parties that includes linkable identifiers is prevalent and focused on services provided by Google and Facebook,” the researchers write.
“Despite this, most apps offer users no way to anticipate that data will be shared in this way. As a result, users are denied an informed choice about whether such sharing is acceptable to them.
“Privacy assessments that rely solely on disclosures made in policies, or are not regularly updated, are unlikely to uncover these evolving issues. This may limit their ability to offer effective guidance to consumers and health care professionals.”
Huckvale and colleagues say their findings are timely, given contemporary concerns about the privacy practices of certain commercial entities and current efforts to establish accreditation programs for mental health apps that account for privacy and transparency concerns.
“Our data highlight that, without sustained and technical efforts to audit actual data transmissions, relying solely on either self-certification or policy audit may fail to detect important privacy risks,” they write.
“The emergence of a services landscape in which a small number of commercial entities broker data for large numbers of health apps underlines both the dynamic nature of app privacy issues and the need for continuing technical surveillance for novel privacy risks if users and health care professionals are to be offered timely and reliable guidance.”
More broadly, the researchers suggest the tension between personal privacy and data capture by health-care apps is largely driven by the business models of the makers.
“Because many national health payers and insurance companies do not yet cover apps (given their often nascent evidence base), selling either subscriptions or users’ personal data is often the only path toward sustainability,” they conclude.