Transparency & Consent
Unpacking the Threats to User Privacy
Privacy Threat Categories
Understanding the privacy risks associated with mental health apps is crucial. The LINDDUN framework offers a systematic approach to identifying these risks through seven threat categories: linkability, identifiability, non-repudiation, detectability, disclosure of information, unawareness, and non-compliance.
These categories highlight how third parties might re-identify or track users, exposing sensitive data such as a mental health condition. The risks are compounded by the fact that many apps do not adequately inform users about these potential privacy breaches.
Linkability
Mental health apps are particularly exposed to linkability threats: when third parties can connect a user's data across apps and platforms, the combined records may reveal sensitive information, such as a mental health condition, that no single dataset exposes on its own.
Dangerous Permissions
Some mental health apps request permissions that are unnecessary for their function, posing additional risks to user privacy. Most mental health and wellbeing apps require users to disclose personal data and consent to its use before they can be used, typically presenting the privacy policy at download time. Understanding and managing these permissions is a vital step toward protecting your personal information.
- Unnecessary Risks: On average, mental health apps request several unnecessary "dangerous permissions" that could compromise your data, for example permissions that allow apps to access external storage or the user's profile information without clear necessity.
- The Power of Knowledge: Knowing which permissions are unnecessary, and how to revoke them, empowers users to protect their data proactively. Yet the majority of privacy policies require a college-level reading ability, making them inaccessible to many users; this complexity obscures how apps actually handle data.
A study on the privacy of mental health apps by Iwaya et al. (2022) found examples of unnecessary permissions, including READ_EXTERNAL_STORAGE and WRITE_EXTERNAL_STORAGE, which grant indiscriminate access to sensitive user information stored on the device. The Android permission READ_PROFILE likewise allows apps to read user profile data. These permissions pose serious risks: apps may collect data they do not need, fail to explain the access clearly in their privacy policies, potentially violate compliance regulations, and leave users unaware of what is being accessed.
However, users can revoke these permissions for any app through the device's settings, provided they know where to find the configuration.
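As an illustration of the developer side, the sketch below (assuming a standard Android project using the AndroidX core library) audits at runtime which of these flagged "dangerous permissions" the app currently holds; an app that does not genuinely need them should simply not declare them in its manifest.

```kotlin
import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import androidx.core.content.ContextCompat

// Storage permissions flagged as unnecessary for many mental health apps
// (Iwaya et al., 2022). A privacy-conscious app should avoid declaring them
// unless a concrete feature depends on them.
private val RISKY_PERMISSIONS = listOf(
    Manifest.permission.READ_EXTERNAL_STORAGE,
    Manifest.permission.WRITE_EXTERNAL_STORAGE
)

/** Returns the subset of flagged permissions the app currently holds. */
fun grantedRiskyPermissions(context: Context): List<String> =
    RISKY_PERMISSIONS.filter { permission ->
        ContextCompat.checkSelfPermission(context, permission) ==
            PackageManager.PERMISSION_GRANTED
    }
```

On the user side, the same permissions can be reviewed and revoked at any time under Settings > Apps > Permissions on most Android versions.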
Towards Clearer and More Accessible Privacy Policies
Challenges with Current Privacy Policies
Privacy policies, crucial for informed consent, are often inaccessible due to their complexity and length. This barrier is particularly problematic for users with mental health needs, compromising their understanding and consent.
- Accessibility and Compliance Issues: Studies and user feedback have shown that privacy policies are not only difficult to understand but also fail to meet regulatory standards such as the General Data Protection Regulation (GDPR), which states that privacy policies should be written "in a concise, transparent, intelligible and easily accessible form, using clear and plain language." The use of jargon and legal terminology further alienates users.
- User Perspectives on Privacy Policies: Many users ignore privacy policies because they are long, hard to navigate, or seem irrelevant. However, when these policies are more accessible, users feel empowered to make informed decisions, enhancing their sense of control and trust in the app.
An example of a complicated 100-word passage taken from an FDA-compliant privacy policy. While the overall reading grade level of the policy might be acceptable, this particular passage scores at a higher grade level, indicating it may be more difficult to understand.
A Call for Clarity and Accessibility
Many privacy policies contain complex language and exceed recommended reading grade levels, making them harder for users to comprehend. Understanding the particular needs of individuals facing mental health challenges is therefore important for mental health app developers: by addressing them, developers can make their digital products more accessible and user-friendly, potentially improving mental health recovery outcomes.
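To make "reading grade level" concrete, here is a minimal sketch of how a developer might screen a policy passage using the Flesch-Kincaid grade formula; the syllable counter is a rough heuristic and the sample passage is invented for illustration, so the output should be treated as indicative only.

```kotlin
// Rough readability screen for a privacy-policy draft using the
// Flesch-Kincaid grade formula:
//   grade = 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
fun fleschKincaidGrade(text: String): Double {
    val sentences = text.split(Regex("[.!?]+")).count { it.isNotBlank() }.coerceAtLeast(1)
    val words = text.split(Regex("\\s+")).filter { it.isNotBlank() }
    val syllables = words.sumOf { approxSyllables(it) }
    return 0.39 * words.size / sentences + 11.8 * syllables / words.size - 15.59
}

// Counts vowel groups as a crude proxy for syllables.
private fun approxSyllables(word: String): Int =
    Regex("[aeiouy]+", RegexOption.IGNORE_CASE).findAll(word).count().coerceAtLeast(1)

fun main() {
    // Hypothetical policy-style sentence, used only to exercise the function.
    val passage = "The controller may process personal data to the extent necessary " +
        "for the performance of its contractual obligations and legitimate interests."
    println("Estimated reading grade level: %.1f".format(fleschKincaidGrade(passage)))
}
```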
Practical guidelines for the design of privacy policies developed by service users.
- Adopting Clear Language: Developers should prioritize the clarity and accessibility of their terms, conditions, and privacy policies, using plain language to ensure users fully understand the implications of consent.
- Separate Health Data Consents: Implement distinct, understandable consent processes for health-related data, clearly outlining how data will be used, stored, and shared.
- Engaging with User Feedback: Regularly consult with users, especially those with mental health challenges, to identify ways to make policies more user-friendly and transparent.
- Committing to Privacy by Design: Integrate privacy considerations into every stage of app development, ensuring that user consent is informed, voluntary, and revocable at any point (one way to model such consents is sketched below).
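To illustrate what separate, revocable health-data consents could look like in practice, here is a hypothetical sketch (all names and scopes are invented for illustration, not taken from any particular app) of a small consent ledger that keeps health-related consents distinct from the general terms of use and records revocation with a timestamp.

```kotlin
import java.time.Instant

// Hypothetical consent scopes: health-related processing is kept separate
// from acceptance of the general terms of use.
enum class ConsentScope { TERMS_OF_USE, MOOD_DATA_PROCESSING, ANONYMISED_RESEARCH_SHARING }

data class ConsentRecord(
    val scope: ConsentScope,
    val grantedAt: Instant,
    val revokedAt: Instant? = null
) {
    val isActive: Boolean get() = revokedAt == null
}

class ConsentLedger {
    private val records = mutableMapOf<ConsentScope, ConsentRecord>()

    /** Records an explicit, scope-specific grant of consent. */
    fun grant(scope: ConsentScope) {
        records[scope] = ConsentRecord(scope, grantedAt = Instant.now())
    }

    /** Revocation is always possible and leaves an auditable timestamp. */
    fun revoke(scope: ConsentScope) {
        records[scope]?.let { records[scope] = it.copy(revokedAt = Instant.now()) }
    }

    /** Data processing for a scope should proceed only if this returns true. */
    fun hasActiveConsent(scope: ConsentScope): Boolean =
        records[scope]?.isActive == true
}

fun main() {
    val ledger = ConsentLedger()
    ledger.grant(ConsentScope.MOOD_DATA_PROCESSING)
    ledger.revoke(ConsentScope.MOOD_DATA_PROCESSING)
    println(ledger.hasActiveConsent(ConsentScope.MOOD_DATA_PROCESSING)) // false
}
```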
Conclusion
The road to truly informed consent in mental health apps is paved with transparency, simplicity, and user engagement. By reimagining how terms, conditions, and privacy policies are presented, developers can foster a more trustworthy and respectful dialogue with their users, ultimately enhancing privacy protection in the digital health landscape.
References
- Borner, I. (2024, March 27). Data privacy myths marketers still believe. The Data Privacy Group. https://thedataprivacygroup.com/us/blog/data-privacy-myths-marketers-still-believe/#Myth_2_Compliance_Takes_Priority_Over_Consumer_Trust
- Cybernews. (2023, November 15). Dangerous permissions detected in top Android health apps. https://cybernews.com/security/dangerous-permissions-android-health-apps/
- Data Privacy Day: Top six common privacy myths debunked. (2022, April 27). Spiceworks. https://www.spiceworks.com/it-security/data-security/articles/debunking-data-privacy-myths/
- Government of Canada, Interagency Advisory Panel on Research Ethics. (2023, January 11). Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans – TCPS 2 (2022) – Chapter 5: Privacy and Confidentiality. https://ethics.gc.ca/eng/tcps2-eptc2_2022_chapter5-chapitre5.html
- Iwaya, L. H., Babar, M. A., Rashid, A., & Wijayarathna, C. (2022). On the privacy of mental health apps. Empirical Software Engineering, 28(1). https://doi.org/10.1007/s10664-022-10236-0
- Jilka, S., Simblett, S., Odoi, C., Van Bilsen, J., Wieczorek, A. M., Erturk, S., Wilson, E., Mutepua, M., & Wykes, T. (2021). Terms and conditions apply: Critical issues for readability and jargon in mental health depression apps. Internet Interventions, 25, 100433. https://doi.org/10.1016/j.invent.2021.100433
- Robinson, M. D. (2019, April 3). Do you know the terms and conditions of your health apps? HIPAA, privacy and the growth of digital health. Bill of Health, Petrie-Flom Center at Harvard Law School. https://blog.petrieflom.law.harvard.edu/2019/04/03/do-you-know-the-terms-and-conditions-of-your-health-apps-hipaa-privacy-and-the-growth-of-digital-health/