Apps designed for female health monitoring are exposing users to significant privacy and safety risks due to poor data handling practices, according to a new study from King’s College London and University College London (UCL).
In the most comprehensive evaluation of the privacy practices of female health apps to date, researchers found that these apps, which handle medical and fertility data, often coerce users into providing sensitive information, potentially putting them at risk.
Analyzing the privacy policies and data safety labels of 20 of the most popular female health apps available in the U.K. and U.S. Google Play Stores, the study found that in many cases these policies permitted user data to be accessed by law enforcement or security authorities. Only one of the apps reviewed explicitly addressed the sensitivity of menstrual data in relation to law enforcement and made efforts to protect users against legal threats.
Many pregnancy-tracking apps required users to disclose whether they had previously miscarried or had an abortion. Additionally, some apps lacked data deletion functions or made it difficult to remove entered data. Experts warn that these poor data management practices could pose serious physical safety risks for users in countries where abortion is a criminal offense.
The findings are being presented at the ACM Conference on Human Factors in Computing Systems (CHI) 2024, taking place from May 11-16, 2024.
Lead investigator Dr. Ruba Abu-Salma from King’s College London stated, “Female health apps collect sensitive data about users’ menstrual cycle, sex lives, and pregnancy status, as well as personally identifiable information such as names and email addresses. Requiring users to disclose sensitive or potentially criminalizing information as a pre-condition to deleting data is an extremely poor privacy practice with dire safety implications. It removes any form of meaningful consent offered to users.”
Dr. Abu-Salma added, “The consequences of leaking sensitive data could result in workplace monitoring and discrimination, health insurance discrimination, intimate partner violence, and criminal blackmail. These risks intersect with gendered forms of oppression, particularly in countries like the U.S., where abortion is illegal in 14 states.”
The study examined well-known apps, including Flo and Clue, and uncovered stark contradictions between privacy policy wording and in-app features, flawed user consent mechanisms, and covert gathering of sensitive data with widespread third-party sharing.
Key findings included:
35% of the apps claimed in their data safety sections that they did not share personal data with third parties, yet described some level of third-party sharing in their privacy policies.
50% provided explicit assurances that users’ health data would not be shared with advertisers, but were ambiguous about whether this covered data collected through use of the app.
45% of privacy policies disclaimed responsibility for the practices of third parties, despite also claiming to vet those parties.
The study also found that many apps linked users’ sexual and reproductive data to their Google searches or website visits, risking de-anonymization of users and enabling inferences about their fertility status.
Lisa Malki, first author of the paper and a former research assistant at King’s College London, now a Ph.D. student at UCL, stated, “There is a tendency by app developers to treat period and fertility data as ‘another piece of data’ as opposed to uniquely sensitive data which has the potential to stigmatize or criminalize users. Increasingly risky political climates warrant a greater degree of stewardship over the safety of users.”
Co-author Dr. Mark Warner from UCL emphasized, “These apps are crucial in helping women manage various aspects of their health. Asking users to delete these apps is not a responsible solution. The responsibility is on app developers to ensure they design these apps in a way that considers and respects the unique sensitivities of both the data collected directly from users and the data generated through inferences made from that data.”
To help developers improve the privacy practices of female health apps, the researchers have developed an evaluation resource that can be adapted for both manual and automated analysis of female health app privacy policies in future work. They are also calling for critical discussions on how health apps more broadly, including fitness and mental health apps, handle sensitive data.
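The researchers’ evaluation resource itself is not reproduced here, but a minimal sketch can illustrate what automated privacy-policy screening of this kind might look like. The Python example below scans policy text for phrases that may signal third-party sharing, law-enforcement disclosure, or data-deletion rights; the rubric categories, search terms, and function names are assumptions for illustration only, not the study’s actual instrument.

```python
import re

# Hypothetical rubric: categories of privacy-relevant practices and phrases
# that might signal them in a policy (assumed terms, not the study's actual
# coding scheme).
RUBRIC = {
    "third_party_sharing": [
        r"third[- ]part(?:y|ies)", r"\bpartners\b", r"\badvertis\w+",
    ],
    "law_enforcement_disclosure": [
        r"law enforcement", r"legal (?:request|process|obligation)", r"subpoena",
    ],
    "data_deletion": [
        r"delete your (?:data|account)", r"\berasure\b", r"right to be forgotten",
    ],
}

def evaluate_policy(policy_text: str) -> dict[str, list[str]]:
    """Return, for each rubric category, the sentences that match it."""
    # Naive sentence split; a real pipeline would use a proper NLP tokenizer.
    sentences = re.split(r"(?<=[.!?])\s+", policy_text)
    findings: dict[str, list[str]] = {category: [] for category in RUBRIC}
    for sentence in sentences:
        for category, patterns in RUBRIC.items():
            if any(re.search(p, sentence, re.IGNORECASE) for p in patterns):
                findings[category].append(sentence.strip())
    return findings

if __name__ == "__main__":
    sample = (
        "We do not sell your data. However, we may share information with "
        "third parties for analytics. We may disclose data to law enforcement "
        "where required by legal process. You can delete your account at any time."
    )
    for category, hits in evaluate_policy(sample).items():
        print(f"{category}: {len(hits)} match(es)")
        for hit in hits:
            print(f"  - {hit}")
```

Keyword matching of this kind can surface passages worth comparing against an app’s data safety label, but it cannot judge intent or completeness; the study’s manual analysis remains the authoritative method.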
The study was led by Dr. Ruba Abu-Salma, Lisa Malki, and Ina Kaleva from the Department of Informatics at King’s College London, alongside Dr. Mark Warner and Dr. Dilisha Patel from UCL.