A cross-sectional assessment of 36 of the top-ranked apps for depression and smoking cessation in the US and Australia found that the majority of these health apps share data with third parties, but only 12 accurately disclosed the practice in their privacy policies, according to a recent study published in JAMA.
The researchers, from Australia’s Black Dog Institute and Beth Israel Deaconess Medical Center in Boston, analyzed the privacy policies of the top-ranked apps, as well as their encrypted and unencrypted data transmissions, between April and June 2018.
They found that 69 percent, or 25 of the 36 apps, offered users a privacy policy. Of those with a policy, 88 percent were clear about the primary uses of the data they collected. However, only 16 of those apps disclosed the secondary uses of data sharing.
What’s concerning is that while 92 percent, or 23 of the 25 apps with a privacy policy, notified users that their data would be transmitted to a third party, the researchers detected data transmission in 33 of the 36 analyzed health apps.
“Almost half of the apps (17 of 36 (47 percent)) transmitted data to a third-party but lacked a privacy policy (9 apps), failed to disclose this transmission in policy text (5 apps), or explicitly stated that transmission would not occur (3 apps),” the researchers wrote.
About 70 percent of the apps positively indicated that data would be shared with advertisers, and 61 percent shared data with advertisers and analytics services. Only one app explicitly stated that user data would not be shared with any third party.
More than 80 percent shared data for marketing purposes with just two third parties: Google and Facebook. But only 43 percent of the apps transmitting data to Google, and 50 percent of those sharing data with Facebook, disclosed the practice to users.
“Transmission of data to third-party entities was prevalent, occurring in 33 of 36 top-ranked apps (92 percent) for depression and smoking cessation, but most apps failed to provide transparent disclosure of such practices,” the researchers wrote.
“Commonly observed issues included the lack of a written privacy policy, the omission of policy text describing third-party transmission (or for such transmissions to be declared in a nonspecific manner), or a failure to describe the legal jurisdictions that would handle data,” they added.
And for a small number of these apps, some of the data transmissions observed by the researchers contradicted the disclosed privacy policies. Further, the researchers found that many healthcare apps label themselves as wellness tools in their privacy policies to sidestep HIPAA and other regulations that mandate patient privacy protections.
The latest JAMA report mirrors similar findings from a BMJ study, which found that the majority of health apps (79 percent) routinely shared data with third parties and were far from transparent about the practice.
Currently, the New York Governor is investigating Facebook’s health data practices, following a Wall Street Journal report showing that several apps shared data with the social media platform, and not always with user consent.
Also concerning in the most recent JAMA report: some of the data shared with these third parties was highly sensitive in nature, including health diaries and substance use reports.
Although the researchers did not observe these apps directly sharing personally identifiable information, the traffic sent to the outside parties routinely included linkable information, such as fixed device identifiers on Android and advertising identifiers on both iOS and Android products.
As a result, any transmission of basic user data combined with these identifiers could allow third parties to generate linkable data about the user’s mental health status.
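The linkage risk described above can be sketched in a few lines. This is a hypothetical illustration, not code from the study: the app names, advertising identifier, and record fields below are all invented, and the point is only that two unrelated apps transmitting to the same consolidated provider, keyed by the same identifier, let that provider join health-related events to a more identifying profile.

```python
# Invented advertising identifier shared by two apps on the same device
ad_id = "38400000-8cf0-11bd-b23e-10b96e40000d"

# Records as a single analytics/ad provider might receive them,
# from two unrelated apps (names and fields are hypothetical)
from_mood_app = {ad_id: {"app": "MoodTracker", "event": "depression_diary_opened"}}
from_shop_app = {ad_id: {"app": "ShopPlus", "name_hint": "jane.d", "zip": "02215"}}

# Joining on the shared identifier links mental-health usage
# to a quasi-identifying profile held by the same provider
linked = {
    k: {**from_shop_app.get(k, {}), **v}
    for k, v in from_mood_app.items()
}

print(linked[ad_id])
# The merged record now pairs "depression_diary_opened" with a name hint and ZIP code
```

The dictionary join stands in for what a consolidated service provider could do server-side; no personally identifiable information needs to be transmitted directly for the linkage to succeed.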
“The observed consolidation of services offering advertising, marketing, and analytics may exacerbate this risk by increasing the likelihood that a given service provider holds data from multiple sources,” the researchers wrote.
“Consequently, users should be aware that their use of ostensibly stand-alone mental health apps, and the health status that this implies, may be linked to other data for other purposes, such as marketing targeting mental illness,” they added. “Our data highlight that, without sustained and technical efforts to audit actual data transmissions, relying solely on either self-certification or policy audit may fail to detect important privacy risks.”
The researchers stressed that healthcare providers prescribing the use of any mental health app should not rely on privacy disclosures, but should assume any data will be shared with outside commercial entities – some of which have questionable privacy practices.
“If possible, [providers] should consider only apps with data transmission behaviors that have been subject to direct scrutiny,” the researchers wrote.