
What is in Your App? Uncovering Privacy Risks of Female Health Applications

Muhammad Hassan ([email protected], ORCID 0000-0002-5713-9658), Mahnoor Jameel ([email protected]), and Tian Wang ([email protected]), University of Illinois Urbana-Champaign, 614 E. Daniel Street (the Hub), Champaign, Illinois, USA 61820-6211; and Masooda Bashir ([email protected]), University of Illinois Urbana-Champaign, Champaign, IL, USA
Abstract.

FemTech, or Female Technology, is an expanding field dedicated to providing affordable and accessible healthcare solutions for women, prominently through Female Health Applications that monitor health and reproductive data. With the leading app exceeding 1 billion downloads, these applications are gaining widespread popularity. However, amid contemporary challenges to women’s reproductive rights and privacy, there is a noticeable lack of comprehensive studies on the security and privacy aspects of these applications. This exploratory study delves into the privacy risks associated with seven popular applications. Our initial quantitative static analysis reveals varied and potentially risky permissions and numerous third-party trackers. Additionally, a preliminary examination of privacy policies indicates non-compliance with fundamental data privacy principles. These early findings highlight a critical gap in establishing robust privacy and security safeguards for FemTech apps, especially significant in a climate where women’s reproductive rights face escalating threats.

Femtech, Mobile Apps, Privacy Policy, Security
copyright: acmcopyright; doi: XXXXXXX.XXXXXXX; conference: CHI EA ’24, May 11–16, 2024, Hawai’i, USA; booktitle: Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems (CHI EA ’24), May 11–16, 2024, Hawai’i, USA; price: 15.00; isbn: 978-1-4503-XXXX-X/18/06; ccs: Security and privacy → Social aspects of security and privacy; ccs: Human-centered computing → Smartphones

1. Introduction

In a rapidly developing digital age, technology continually proves itself an integral part of individuals’ lives. The efficiencies it provides allow specific aspects of daily life to be targeted to fulfill particular needs, educate, and expand accessibility. This rings especially true for female health. Through well-known tools such as Flo Health, Maven Clinic, and Progyny, which place specific emphasis on reproductive health, technology has taken on a role that promotes women’s empowerment and agency over their health (Narwani, 2023; Bose, 2023).

Tools such as Flo Health and Progyny belong to the technological category of FemTech, a term that refers to the use of technology to address the needs and challenges of women’s health and wellness (Yashar and Wannon, [n. d.]; Bose, 2023). FemTech mobile apps are one of the most popular and accessible forms of this technology, offering a variety of features and services for women, such as period tracking, fertility and contraception support, menopause management, hormonal disorder and chronic condition education, and sexual wellness guidance. Given technology’s strong presence and its design for education, personal monitoring, and solution-based recommendations, one must consider the vulnerabilities that accompany it, the most significant being privacy and policy.

The recent reversal by the US Supreme Court of Roe v. Wade, the decision that recognized a woman’s constitutional right to choose whether to have an abortion, has raised serious concerns about the reproductive rights and privacy of women in the United States (Totenberg and McCammon, 2022). The reversal has not only banned or restricted access to safe and legal abortions but also undermined the right to privacy on which Roe v. Wade was based, with possible implications for other aspects of women’s health care such as fertility treatment, contraception, and cancer care (Scott, 2022; Suran, 2022). Moreover, there have been cases where women were prosecuted or persecuted based on their online purchase of abortifacients, their text messages, or their search history related to abortion or miscarriage (Valenti, 2015; Elliott, 2022; Ant-Wuorinen et al., [n. d.]). Therefore, some experts have advised women to delete their FemTech apps or to avoid using them to track their periods and fertility cycles, fearing that their personal data could be used against them under anti-abortion laws or by authorities (Ant-Wuorinen et al., [n. d.]).

Female health applications (FHA) are a form of FemTech that provide various services and products to women, such as period tracking, fertility support, pregnancy monitoring, menopause management, and general health care. However, these applications also collect sensitive health data from their users, which could expose them to privacy and security risks. Previous studies have shown that some FHA have poor data security and privacy practices, such as tracking and profiling users, sharing data with third parties, or not disclosing data policies to users (Matsakis, 2019; Shipp and Blasco, 2020; Alfawzan et al., 2022; Mehrnezhad and Almeida, 2021). Moreover, in some contexts women’s bodies are under political surveillance, and their data could be used against them by authorities or adversaries. For example, in Missouri, the state health department tracked women’s period information to determine if an abortion took place (Arwa Mahdawi, 2019). It is therefore important to understand the security, privacy, and data practices of FHA and how they affect users. In this exploratory study, we aim to provide a comprehensive overview of the current state of FemTech apps and examine their potential benefits and challenges for women’s health, with an emphasis on how data vulnerabilities threaten female health. We conduct a preliminary analysis of what information is collected by FHA and how, and of what their privacy policies disclose to users about their data practices.

We present an initial exploration of the FHA space that aims to answer the following research questions.

  • RQ1: What are the data collection and data practices of popular FHA?
    We analyzed the permissions requested by the applications to determine their level of privilege and access to sensitive data. We also examined the presence of third-party trackers in the applications that may collect and share user data.

  • RQ2: Do the privacy policies of selected FHA apps adhere to data privacy principles?
    We assessed the privacy policies of 7 popular FHA apps using the Fair Information Practice Principles (FIPPs) framework to evaluate their data practices, focusing on privacy and security measures.

2. Background

The academic literature on the use of technology for female health (FemTech) reveals the importance and the risks of collecting and processing female healthcare data. In this section, we review the relevant literature to understand the current state of FemTech and the challenges it faces in terms of data privacy and security. We also examine the laws and policies that regulate healthcare data in general and female healthcare data in particular, focusing on prominent frameworks in this domain. We compare and contrast these regulations to investigate how they address the specific needs and concerns of female healthcare data privacy. Our goal is to identify the gaps and opportunities for improving FemTech data privacy and security in light of the existing literature and regulations.

2.1. Female Health Data Risks

Female Health Apps (FHA) collect sensitive user data, often revealing intimate aspects of women’s lives. Trust and data safety are paramount, especially for FHA users, as they entrust these applications with their sexual and reproductive health (SRH) information(Aïmeur et al., 2016; Kesan et al., 2015; Muller et al., 2023). However, uninformed users may share data without understanding the implications(uuI, 2023; Vallor, 2018; umi, 2023). Previous studies highlight the privacy risks in FemTech and FHA due to inadequate consent or protection(Almeida et al., 2022), and the misuse of this data could lead to discrimination, violence, and legal repercussions(Maas, 2022; Olivero, 2022; Almeida et al., 2022; Arwa Mahdawi, 2019; Court, [n. d.]; con, 2022; rep, 2023; Masling et al., 2023).

FHA’s collection of personal health data poses privacy risks. Sharing this data with third parties could lead to unwanted targeting or exploitation. Despite privacy promises, some FHA have been found to share user data with third parties(Rosato, 2020; Nguyen, 2020), and some even shared health information without user consent(Cohen, 2022), breaching trust and privacy rights.

3. Methodology

To begin, we select seven women’s health applications and conduct a static analysis of the applications’ code to reveal their functionalities. Next, we analyze the privacy policies of these applications to determine their scope, accessibility, and ease of understanding.

Our research employs a mixed-method approach, combining both qualitative and quantitative analyses, to address a range of issues related to women’s health applications. This approach is essential as it allows us to investigate both the linguistic analysis of privacy policies and the technical behavior of the applications. This section outlines the various steps we undertook in our research methodology.

3.1. App Selection

Table 1. List of the Applications Analyzed
App # Installs Category
1 1,000,000,000+ Health and Fitness
2 100,000,000+ Health and Fitness
3 100,000,000+ Health and Fitness
4 10,000,000+ Health and Fitness
5 500,000+ Medical
6 100,000+ Health and Fitness
7 100,000+ Health and Fitness

Our study focuses on Android applications from the Google Play Store, which are widely available and offer many resources for analysis. To identify relevant applications, we used search terms such as ’woman’, ’female’, and ’feminine’ in combination with ’health’, ’wellness’, and ’well-being’ on the Google Play Store. We ranked the search results by number of downloads and ratings and downloaded the applications using a new Google account in May 2023. Our selection of applications follows the research methodology of previous studies such as Adhikari et al. (Adhikari et al., 2014) and Shipp et al. (Shipp and Blasco, 2020). During our analysis, we excluded applications unrelated to our research objective, such as general awareness applications that appeared in the search results due to associated tags or descriptions. Additionally, we excluded ’erotica’ applications as they were irrelevant to the functionality we aimed to investigate.

3.2. Static Analysis

In this step of our analysis, we examined the applications by installing them on an Android device and using the Android Debug Bridge (adb) to extract each APK file onto a workstation. To conduct the static analysis of the applications, we utilized the Mobile Security Framework (Abraham et al., 2020), a widely adopted tool for this purpose, in accordance with the methodology of (Owens et al., 2022). An application’s source code contains vital information about its data practices, including the permissions it requests, which act as data sources and may lead to privilege escalation (Felt et al., 2011; Li et al., 2021). Google groups permissions into four categories based on the risk associated with the permission and its use (Google, 2023c). These permission categories include:

  • Normal Permissions are considered low-risk permissions for system and other applications.

  • Dangerous Permissions are higher-risk permissions granting an application access to private user data or control over the device.

  • Signature Permissions are granted automatically to an application signed with the same certificate as the application that declared the permission, without informing the user or requiring their approval.

  • signatureOrSystem Permissions are used to share specific features between multiple vendors’ applications built into a system image.

To gain deeper insight into the applications’ data practices, we also examined the presence of third-party libraries and trackers within the application code. Developers often integrate these third-party services to monetize, analyze, or add new functionality to their applications. However, these libraries may threaten user privacy by leaking users’ information (Balebako et al., 2014). Therefore, a higher number of permissions declared by an application may escalate its privilege, and a higher number of trackers may indicate greater privacy vulnerability.
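As an illustration of the manifest-level portion of such a static analysis (a minimal sketch, not the study's actual MobSF pipeline), one can parse an extracted AndroidManifest.xml for requested permissions and flag those that fall in Android's dangerous group. The manifest fragment and the small dangerous-permission subset below are assumptions for demonstration only:

```python
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

# A small, illustrative subset of Android's dangerous-permission group.
DANGEROUS = {
    "android.permission.READ_EXTERNAL_STORAGE",
    "android.permission.WRITE_EXTERNAL_STORAGE",
    "android.permission.READ_CONTACTS",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.READ_PHONE_STATE",
}

def requested_permissions(manifest_xml: str) -> list[str]:
    """Return all <uses-permission> names declared in a manifest."""
    root = ET.fromstring(manifest_xml)
    return [
        elem.attrib[ANDROID_NS + "name"]
        for elem in root.iter("uses-permission")
    ]

def dangerous_permissions(manifest_xml: str) -> list[str]:
    """Filter the requested permissions down to the dangerous subset."""
    return [p for p in requested_permissions(manifest_xml) if p in DANGEROUS]

# Hypothetical manifest fragment, similar in shape to what apktool-style
# extraction yields after pulling an APK with adb.
sample = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
</manifest>"""

print(dangerous_permissions(sample))
# ['android.permission.READ_EXTERNAL_STORAGE', 'android.permission.READ_CONTACTS']
```

Tracker detection works analogously, but over package names in the decompiled code rather than the manifest, which is why a dedicated tool such as MobSF is preferable in practice.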

3.3. Privacy Policy Analysis

In this phase of our analysis, we collected the privacy policies of the seven selected applications. Ideally, the privacy policy should be readily available on the app page in the Google Play Store; where it was not, we ensured that it was available after downloading the application. We used the Fair Information Practice Principles (FIPPs) as a standard to evaluate the privacy policies of the selected applications (FPC, 2022). FIPPs are a set of established principles that form the foundation for both GDPR and HIPAA. In addition to the evaluation based on FIPPs, we also analyzed the privacy policies to identify the user (’data subject’) rights listed in them. This analysis provides insight into the transparency and accountability of the application developers regarding the handling of user data.

4. Results

To comprehensively analyze the selected women’s health applications, we conducted a thorough examination of their permissions, privacy policies, and third-party library usage. In this section, we present our findings for all the applications included in our study.

4.1. Permission: Information Source

Figure 1. Dangerous Permissions in Applications

The scope of data that apps can collect is largely determined by the permissions they request. To thoroughly evaluate the privacy risks to individuals using these apps, it is essential to understand the types of data that can potentially be collected. Permissions on smartphone operating systems are in place to safeguard restricted data and actions. Consequently, apps that request more permissions have greater privilege to transmit a larger amount of data to backend and third-party entities. Analyzing the distribution of permissions requested by these female health apps allows comparison against the app with the fewest privileges among them. If the app with the least permissions shares similar goals with the others, it may serve as a standard for the ”minimum number of permissions” these apps require to operate optimally. This method is in accordance with Owens et al.’s approach for permission analysis (Owens et al., 2022).
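The baseline comparison described above reduces to a simple set operation over each app's dangerous-permission set. A minimal sketch, with entirely hypothetical app names and permission sets:

```python
# Hypothetical dangerous-permission sets per app (illustrative data only).
# The app requesting the fewest dangerous permissions serves as a rough
# "minimum needed" baseline; other apps are diffed against it.
perms = {
    "App A": {"READ_EXTERNAL_STORAGE", "WRITE_EXTERNAL_STORAGE",
              "READ_CONTACTS", "ACCESS_FINE_LOCATION"},
    "App B": {"READ_EXTERNAL_STORAGE", "WRITE_EXTERNAL_STORAGE"},
    "App C": set(),
}

baseline_app = min(perms, key=lambda a: len(perms[a]))
baseline = perms[baseline_app]

for app, requested in sorted(perms.items()):
    extra = sorted(requested - baseline)
    print(f"{app}: {len(extra)} dangerous permission(s) beyond {baseline_app}")
```

The diff only makes sense when the baseline app genuinely offers comparable functionality, which is the caveat the method above states.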

The most prevalent dangerous permissions gave applications access to external storage: READ_EXTERNAL_STORAGE and WRITE_EXTERNAL_STORAGE were the most common permissions found in the study, requested by four of the seven applications (4/7). These permissions allow applications to read and write data on external storage devices, potentially exposing sensitive user data to unauthorized access.

We observed several requested permissions that were not commonly used. One such permission is REQUEST_INSTALL_PACKAGES, which, if granted, can be used to install any app package on the user’s device (Google, 2023f). Another permission, WRITE_SETTINGS, was requested by an application and can be used to change a system setting of the device. The permissions READ_CONTACTS and READ_PHONE_NUMBERS were also requested once and grant apps access to the user’s contact information and device phone number(s), respectively. The READ_PHONE_STATE permission was requested by an application and grants access to the phone state, including the current cellular network information, the status of any ongoing calls, and a list of any Phone Accounts registered on the device. This is a highly privileged permission that includes the READ_PHONE_NUMBERS permission and the PhoneAccount class as a subset (Google, 2023d).

The permissions ACCESS_FINE_LOCATION and ACTIVITY_RECOGNITION were each requested once and grant an app access to the user’s precise location and physical activity (such as running, walking, or biking), respectively (Google, 2023a, b). Lastly, the SYSTEM_ALERT_WINDOW permission was also requested only once; it allows an app to create windows displayed over all other apps, such as pop-up notifications or floating widgets. Google recommends that very few applications use this permission, as the windows created are intended for system-level interactions (Google, 2023e).

4.1.1. Most Privileged App

Our analysis of mobile applications and their data practices revealed that App 1 has the highest level of privilege among the applications in our study. This is because it requests a large number of dangerous permissions, which grant access to sensitive data and resources on the device. Moreover, we found that App 1 requests most of the least widely used dangerous permissions, which are uncommon among other applications. Considering that App 1 has over 1 billion downloads on the Google Play Store, its data practices have a significant impact on a large number of Android users. We are currently conducting a further study to examine the functionality and features of App 1 and to evaluate the necessity and justification of its permission requests.

4.1.2. Least Privileged App(s)

In contrast to App 1, our analysis revealed that two other health applications, App 2 and App 5, did not request any dangerous permissions. Despite this difference in data practices, both applications have a significant user base. App 2 has over 100 million downloads, while App 5 has over 500K downloads.

4.2. Trackers: Information Sink

As shown in Table 2, our analysis detected the presence of various trackers within the applications studied. The majority of these trackers were used for analytics and advertising purposes. Notably, Facebook had the highest number of unique trackers present (5), serving a range of functions including analytics (based on user interaction data within the app), user identification (for login and sign-up purposes), and content sharing. Google also had a significant presence with 3 trackers, primarily focused on analytics and advertisement.

Our analysis revealed that the highest number of trackers present in a single app was 8. Specifically, App 4 contained 8 unique third-party trackers, including those from both Facebook and Google. These trackers primarily served advertising and marketing purposes, as well as analytics based on user interaction with the application. The app with the second-highest number of trackers was App 3, with 7 trackers present, serving similar functions to those found in App 4.
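The per-app tally and per-tracker prevalence behind such a comparison are straightforward to compute once detection output is available. The mapping below is a hypothetical example shaped like the reported results, not the study's actual data:

```python
from collections import Counter

# Hypothetical detection output: app -> set of third-party trackers found
# in its code (illustrative data mirroring the shape of the real results).
trackers = {
    "App 3": {"Facebook Ads", "Facebook Login", "Facebook Share",
              "Google AdMob", "Google CrashLytics",
              "Google Firebase Analytics", "AppsFlyer"},
    "App 4": {"Facebook Ads", "Facebook Analytics", "Facebook Login",
              "Facebook Share", "Google AdMob", "Google CrashLytics",
              "Google Firebase Analytics", "AppsFlyer"},
    "App 5": {"Google Firebase Analytics"},
}

# Trackers per app (the "information sink" count for each application).
per_app = {app: len(found) for app, found in trackers.items()}

# How many apps embed each tracker (the per-tracker prevalence in Table 2's
# "# of Apps" column is computed the same way).
prevalence = Counter(t for found in trackers.values() for t in found)

print(max(per_app, key=per_app.get))  # app with the most trackers
```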

Table 2. Third-party trackers detected in the applications
Trackers # of Apps Category
AppsFlyer 2 Analytics
AutoNavi / Amap 1 Location
Facebook Ads 3 Advertisement
Facebook Analytics 2 Analytics
Facebook Login 3 Identification
Facebook Places 2
Facebook Share 3 Content Sharing
Google AdMob 3 Advertisement
Google CrashLytics 3 Crash Reporting
Google Firebase Analytics 5 Analytics
IAB Open Measurement 1 Identification & Advertisement
myTarget 1 Advertisement
myTracker 1 Analytics & Marketing
OneSignal 1 Push Notification & Messaging

4.3. Privacy Policy: Information Notice

As part of our analysis of privacy policies, we evaluated the availability of privacy policies on the Google Play Store page for each application. Ideally, an app should provide its privacy policy on its Play Store page for easy accessibility. Our findings revealed that approximately 40% of the applications in our study did not provide a privacy policy on their Play Store page. These applications account for more than 20 million downloads, so this lack of accessibility affects a large number of users.

Our analysis of the privacy policies of the selected applications revealed that none of them adhered to the principles of Data Minimization and Quality and Integrity. We found that these apps’ privacy policies claimed to collect more personally identifiable information (PII), such as user information and device identifiers, than was necessary for their core functionality. Furthermore, these apps did not limit their retention of PII and user data. Instead, they would either claim to retain it for business purposes even after a user deleted their account, keep it for an extended period of time, or use unclear and vague language regarding data retention. Some applications did not provide any notice on how long data would be kept. Additionally, none of the applications provided any notice on how they would ensure the accuracy and relevance of the data they collected to ensure fairness in their services.

The analysis further reveals that the least followed principles by the apps’ privacy policies were Accountability, Authority, and Purpose Specification and Use Limitation. We found that these apps’ privacy policies did not provide clear information on who is responsible for ensuring compliance with privacy regulations, who has the authority to access and use user data, and whether any training is provided to those who access users’ data and PII. Furthermore, there was a general lack of detail regarding the specific purposes for which user data is collected and used, which also correlates with the lack of adherence to the principle of Data Minimization. As a result, users may not have a clear understanding of how their data is being collected, used, and protected by these apps.

A majority of applications (70%) were observed to follow the principle of Individual Participation. Our analysis of the privacy policies of the applications under study revealed that they purport to involve user consent in their data practices. Specifically, these policies provide mechanisms for users to submit complaints or share concerns and queries regarding privacy and data processing. This indicates that the privacy policies claim to take measures to ensure that users have a degree of agency in providing input or feedback on personal data processing and can participate in decisions regarding its collection and use.

Table 3. Policies Analysis of Female Health Application (LEGEND: ●= Followed, ◑= Partial, ○= Not followed)
Privacy principles App 1 App 2 App 3 App 4 App 5 App 6 App 7
1. Access and Rectification
2. Accountability
3. Authority
4. Minimization
5. Quality and Integrity
6. Individual Participation
7. Purpose Specification and Use Limitation
8. Security
9. Transparency

5. Conclusion

In this paper, we conducted an exploratory study on the privacy and security of 7 popular Female Health Applications. We found that these applications requested a varied number of dangerous permissions, which gave them access to sensitive data and resources on the device. We also detected numerous third-party trackers in these applications, which could collect and share user data with external parties such as advertisers, analytics providers, or social media platforms. Furthermore, we analyzed the privacy policies of these applications using the FIPPs framework and found a general lack of adherence to several principles, such as notice, choice, access, security, and accountability. These findings raise concerns about the privacy and security of user information, especially in the context of the current political and commercial surveillance of female health data in the post-Roe v. Wade era. To the best of our knowledge, this is the first comprehensive study of female health applications as a whole, extending the limited prior work on sub-categories. We are currently working on an extended set of applications with an enhanced analysis pipeline for a further study on this topic. We hope that our current and future results will help all stakeholders improve privacy design by ensuring more informed user policies and data privacy practices.

References