Apps designed for female health monitoring are exposing users to significant privacy and safety risks due to inadequate data handling practices, according to new research from University College London (UCL) and King’s College London.
The study, presented at the ACM Conference on Human Factors in Computing Systems (CHI) 2024 on May 14, represents the most extensive evaluation to date of the privacy practices of female health apps. Researchers found that these apps, which manage sensitive medical and fertility information such as menstrual cycle data, often coerce users into providing personal details that could endanger their privacy.
The research team analyzed the privacy policies and data safety labels of 20 of the most popular female health apps on the UK and US Google Play stores, apps used by hundreds of millions of people. Their analysis revealed that, in many cases, user data could be accessible to law enforcement or security authorities.
Only one app explicitly addressed the sensitivity of menstrual data in relation to law enforcement in its privacy policy, taking steps to protect users from legal threats. In contrast, several pregnancy-tracking apps required users to disclose whether they had previously miscarried or had an abortion, and some apps either lacked data deletion functions or made it difficult for users to remove their data once entered.
Experts warn that these poor data management practices could pose severe physical safety risks for users in regions where abortion is criminalized.
The study also uncovered significant contradictions between privacy policy statements and in-app features, flawed user consent mechanisms, and covert data collection practices involving third-party sharing.
Key findings included:
- 35% of the apps claimed in their data safety sections not to share personal data with third parties, but contradicted this in their privacy policies by acknowledging some level of third-party sharing.
- 50% provided explicit assurances that users' health data would not be shared with advertisers, but were ambiguous about whether this included data collected through app usage.
- 45% of privacy policies disclaimed responsibility for third-party practices, despite claiming to vet those third parties.
Additionally, many of the apps linked users’ sexual and reproductive data to their Google searches or website visits, raising concerns about the risk of de-anonymization and assumptions about users’ fertility status.
Lisa Malki, the study’s first author and a former research assistant at King’s College London who is now a PhD student at UCL Computer Science, stated, “App developers often treat period and fertility data as just ‘another piece of data,’ failing to recognize its unique sensitivity and potential to stigmatize or criminalize users. Given the increasingly risky political climates, developers need to take greater care in protecting user safety and privacy, moving beyond the ‘notice and consent’ model that places an undue privacy burden on users.”
The researchers have developed a resource to help developers improve the privacy policies and practices of female health apps, which can be used for both manual and automated evaluations in future studies.
The team also calls for critical discussions on how other health apps, such as those for fitness and mental health, handle sensitive data.
Dr. Mark Warner, an author of the paper from UCL Computer Science, emphasized, “These apps are crucial for women to manage various aspects of their health, so simply asking them to delete the apps is not a viable solution. The onus is on app developers to design these apps with a consideration for the unique sensitivities of the data being collected and the inferences drawn from it.”