What is in Your App? Uncovering Privacy Risks of Female Health Applications
Abstract.
FemTech, or Female Technology, is an expanding field dedicated to providing affordable and accessible healthcare solutions for women, prominently through Female Health Applications that monitor health and reproductive data. With the leading app exceeding 1 billion downloads, these applications are gaining widespread popularity. However, amidst contemporary challenges to women’s reproductive rights and privacy, there is a noticeable lack of comprehensive studies on the security and privacy aspects of these applications. This exploratory study delves into the privacy risks associated with seven popular applications. Our initial quantitative static analysis reveals varied and potentially risky permissions and numerous third-party trackers. Additionally, a preliminary examination of privacy policies indicates non-compliance with fundamental data privacy principles. These early findings highlight a critical gap in establishing robust privacy and security safeguards for FemTech apps, especially significant in a climate where women’s reproductive rights face escalating threats.
1. Introduction
In a rapidly developing digital age, technology constantly demonstrates itself to be an integral part of individuals’ lives. The efficiencies it provides allow specific aspects of daily life to be narrowed in on and targeted to fulfill particular needs, educate, and expand accessibility. This rings especially true in the case of women’s health. With well-known tools such as Flo Health, Maven Clinic, and Progyny placing specific emphasis on reproductive health, technology has taken on a role that promotes women’s empowerment and agency over their own health (Narwani, 2023; Bose, 2023).
Tools such as Flo Health and Progyny belong to the technological category of FemTech, a term that refers to the use of technology to address the needs and challenges of women’s health and wellness (Yashar and Wannon, [n. d.]; Bose, 2023). FemTech mobile apps are one of the most popular and accessible forms of this technology, offering a variety of features and services for women, such as period tracking, fertility and contraception support, menopause management, hormonal disorder and chronic condition education, and sexual wellness guidance. Given technology’s deep presence in, and design for, education, personal monitoring, and solution-based recommendations, one must consider the vulnerabilities that accompany it, the most significant of which concern privacy and policy.
The US Supreme Court’s recent reversal of Roe v. Wade, the decision that recognized a woman’s constitutional right to choose whether to have an abortion, has raised serious concerns about the reproductive rights and privacy of women in the United States (Totenberg and McCammon, 2022). The decision has not only led to bans and restrictions on access to safe and legal abortions, but also undermined the right to privacy that was the basis of Roe v. Wade, which could have implications for other aspects of women’s health care such as fertility treatment, contraception, and cancer care (Scott, 2022; Suran, 2022). Moreover, there have been cases where women have been prosecuted or persecuted based on their online purchases of abortifacients, text messages, or search history related to abortion or miscarriage (Valenti, 2015; Elliott, 2022; Ant-Wuorinen et al., [n. d.]). Consequently, some experts have advised women to delete their FemTech apps or avoid using them to track their periods and fertility cycles, fearing that their personal data could be used against them under anti-abortion laws or by authorities (Ant-Wuorinen et al., [n. d.]).
Female health applications (FHA) are a form of FemTech that provide various services and products to women, such as period tracking, fertility support, pregnancy monitoring, menopause management, and general health care. However, these applications also collect sensitive health data from their users, which could expose them to privacy and security risks. Previous studies have shown that some FHA have poor data security and privacy practices, such as tracking and profiling users, sharing data with third parties, or not disclosing data policies to users (Matsakis, 2019; Shipp and Blasco, 2020; Alfawzan et al., 2022; Mehrnezhad and Almeida, 2021). Moreover, in some contexts, women’s bodies are under political surveillance, and their data could be used against them by authorities or adversaries. For example, in Missouri, the state health department tracked women’s period information to determine whether an abortion had taken place (Arwa Mahdawi, 2019). It is therefore important to understand the security, privacy, and data practices of FHA and how they affect users. In this exploratory study, we aim to provide a comprehensive overview of the current state of FemTech apps and examine their potential benefits and challenges for women’s health. This emphasis on data privacy highlights the threats that data vulnerabilities pose to female health. We conduct a preliminary analysis of what information FHA collect and how, and of what their privacy policies disclose to users about their data practices.
We present an initial exploration of the FHA space that aims to answer the following research questions.
- RQ1: What are the data collection and data practices of popular FHA?
We analyzed the permissions requested by the applications to determine their level of privilege and access to sensitive data. We also examined the presence of third-party trackers in the applications that may collect and share user data.
- RQ2: Do the privacy policies of selected FHA adhere to established data privacy principles?
We assessed the privacy policies of 7 popular FHA using the Fair Information Practice Principles (FIPPs) framework to evaluate their data practices, focusing on privacy and security measures.
2. Background
The academic literature on the use of technology for female health (FemTech) reveals the importance and the risks of collecting and processing female healthcare data. In this section, we review the relevant literature to understand the current state of FemTech and the challenges it faces in terms of data privacy and security. We also examine the laws and policies that regulate healthcare data in general and female healthcare data in particular, focusing on prominent frameworks in this domain. We compare and contrast these regulations to investigate how they address the specific needs and concerns of female healthcare data privacy. Our goal is to identify the gaps and opportunities for improving FemTech data privacy and security in light of the existing literature and regulations.
2.1. Female Health Data Risks
Female Health Apps (FHA) collect sensitive user data, often revealing intimate aspects of women’s lives. Trust and data safety are paramount, especially for FHA users, as they entrust these applications with their sexual and reproductive health (SRH) information (Aïmeur et al., 2016; Kesan et al., 2015; Muller et al., 2023). However, uninformed users may share data without understanding the implications (uuI, 2023; Vallor, 2018; umi, 2023). Previous studies highlight the privacy risks in FemTech and FHA due to inadequate consent or protection (Almeida et al., 2022), and the misuse of this data could lead to discrimination, violence, and legal repercussions (Maas, 2022; Olivero, 2022; Almeida et al., 2022; Arwa Mahdawi, 2019; Court, [n. d.]; con, 2022; rep, 2023; Masling et al., 2023).
FHA’s collection of personal health data poses privacy risks. Sharing this data with third parties could lead to unwanted targeting or exploitation. Despite privacy promises, some FHA have been found to share user data with third parties (Rosato, 2020; Nguyen, 2020), and some even shared health information without user consent (Cohen, 2022), breaching trust and privacy rights.
3. Methodology
To begin, we select seven women’s health applications and conduct a static analysis of the applications’ code to reveal their functionalities. Next, we analyze the privacy policies of these applications to determine their scope, accessibility, and ease of understanding.
Our research employs a mixed-method approach, combining both qualitative and quantitative analyses, to address a range of issues related to women’s health applications. This approach is essential as it allows us to investigate both the linguistic analysis of privacy policies and the technical behavior of the applications. This section outlines the various steps we undertook in our research methodology.
3.1. App Selection
App # | Installs | Category
---|---|---
1 | 1,000,000,000+ | Health and Fitness
2 | 100,000,000+ | Health and Fitness
3 | 100,000,000+ | Health and Fitness
4 | 10,000,000+ | Health and Fitness
5 | 500,000+ | Medical
6 | 100,000+ | Health and Fitness
7 | 100,000+ | Health and Fitness
Our study focuses on analyzing Android applications from the Google Play Store, which are widely available and offer many resources for analysis. To identify relevant applications, we used search terms such as ’woman’, ’female’, and ’feminine’ in combination with ’health’, ’wellness’, and ’well-being’ on the Google Play Store. We ranked the search results based on the number of downloads and ratings and downloaded the applications using a new Google account in May 2023. Our selection of applications follows the research methodology of previous studies such as Adhikari et al. (Adhikari et al., 2014) and Shipp et al. (Shipp and Blasco, 2020). During our analysis, we excluded applications that were unrelated to our research objective, such as general awareness applications that appeared in the search results due to associated tags or descriptions. Additionally, we excluded ’erotica’ applications as they were irrelevant to the functionality we aimed to investigate.
3.2. Static Analysis
In this step of our analysis, we examined the applications by installing them on an Android device and using the Android Debug Bridge (adb) to extract each APK file onto a workstation. To conduct the static analysis of the applications, we utilized the Mobile Security Framework (Abraham et al., 2020), a widely adopted tool for this purpose, in accordance with the methodology of Owens et al. (Owens et al., 2022). An application’s source code contains vital information about its data practices, including the permissions it requests, which act as data sources and may lead to privilege escalation (Felt et al., 2011; Li et al., 2021). Google groups permissions into four categories based on the risk associated with the permission and its use (Google, 2023c):
- Normal permissions are considered low-risk permissions for the system and other applications.
- Dangerous permissions are higher-risk permissions granting an application access to private user data or control over the device.
- Signature permissions are granted automatically to an application if it is signed with the same certificate as the application that declared the permission, without informing the user or requiring their approval.
- SignatureOrSystem permissions are used for sharing specific features between multiple vendors’ applications built into a system image.
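As an illustration of this pipeline, the sketch below parses a decoded AndroidManifest.xml (the form a tool like MobSF produces) and flags declared permissions against a small, hypothetical subset of Android’s dangerous-permission list; it is only a minimal sketch of the idea, not the tooling we used.

```python
import xml.etree.ElementTree as ET

# Hypothetical subset of Android's "dangerous" permissions, for
# illustration only; the real set is much larger.
DANGEROUS = {
    "android.permission.READ_EXTERNAL_STORAGE",
    "android.permission.WRITE_EXTERNAL_STORAGE",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.READ_CONTACTS",
}

ANDROID_NS = "http://schemas.android.com/apk/res/android"

def extract_permissions(manifest_xml):
    """Return all <uses-permission> names declared in a decoded AndroidManifest.xml."""
    root = ET.fromstring(manifest_xml)
    # The android: prefix resolves to the namespace-qualified attribute key.
    return [e.get(f"{{{ANDROID_NS}}}name") for e in root.iter("uses-permission")]

def classify(permissions):
    """Split declared permissions into dangerous vs. other, per the list above."""
    return {
        "dangerous": [p for p in permissions if p in DANGEROUS],
        "other": [p for p in permissions if p not in DANGEROUS],
    }

# A toy manifest standing in for one decoded from an APK.
sample = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
</manifest>"""

perms = extract_permissions(sample)
result = classify(perms)
```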
To gain deeper insights into the applications’ data practices, we also examined the presence of third-party libraries and trackers within the application code. Developers often integrate these third-party services to monetize, analyze, or add new functionalities to their applications. However, these libraries or third parties may pose a threat to user privacy by leaking users’ information (Balebako et al., 2014). Therefore, the presence of a higher number of permissions declared by an application may potentially escalate its privilege, and the presence of a higher number of trackers may indicate a privacy vulnerability.
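Tracker detection of this kind can be sketched as a signature match over class names recovered from the application code, in the spirit of signature lists such as Exodus Privacy’s; the signature table and class names below are hypothetical and far smaller than any real list.

```python
# Hypothetical tracker signatures: package-name prefix -> tracker name.
# Real signature databases contain hundreds of entries.
TRACKER_SIGNATURES = {
    "com.facebook.ads": "Facebook Ads",
    "com.google.android.gms.ads": "Google AdMob",
    "com.appsflyer": "AppsFlyer",
}

def detect_trackers(class_names):
    """Return the set of tracker names whose package prefix matches any class."""
    found = set()
    for cls in class_names:
        for prefix, name in TRACKER_SIGNATURES.items():
            if cls.startswith(prefix):
                found.add(name)
    return found

# Class names as they might appear in an app's dex metadata (made up).
classes = [
    "com.example.period.MainActivity",
    "com.appsflyer.AppsFlyerLib",
    "com.facebook.ads.AdView",
]
trackers = detect_trackers(classes)
```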
3.3. Privacy Policy Analysis
In this phase of our analysis, we collected the privacy policies of the seven selected applications. It is ideal for the privacy policy to be readily available on the app page in Google Playstore; however, if not, we ensured that it was available after downloading the application. We used the Fair Information Practice Principles (FIPPs) as a standard to evaluate the privacy policies of the selected applications (FPC, 2022). FIPPs are a set of established principles that form the foundation for both GDPR and HIPAA. In addition to the evaluation based on FIPPs, we also analyzed the privacy policies to identify the user (’data subject’) rights listed in them. This analysis provides insight into the transparency and accountability of the application developers regarding the handling of user data.
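A minimal sketch of such a policy-coding scheme, assuming a three-level score per principle (rendered as ●/◑/○ in our results table); the app names and scores below are hypothetical.

```python
# Symbols used to render per-principle scores in the results table.
SYMBOLS = {"full": "●", "partial": "◑", "none": "○"}

# Hypothetical coding of two policies against two FIPPs principles.
coding = {
    "App X": {"Transparency": "partial", "Data Minimization": "none"},
    "App Y": {"Transparency": "full", "Data Minimization": "none"},
}

def adherence_rate(coding, principle):
    """Fraction of apps showing at least partial adherence to a principle."""
    hits = sum(coding[app][principle] != "none" for app in coding)
    return hits / len(coding)
```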
4. Results
To comprehensively analyze the selected women’s health applications, we conducted a thorough examination of their permissions, privacy policies, and third-party library usage. In this section, we present our findings for all the applications included in our study.
4.1. Permission: Information Source

The scope of data that apps can collect is largely determined by the permissions they request. To thoroughly evaluate the privacy risks these apps pose to individuals, it is essential to understand the types of data they can potentially collect. Permissions on smartphone operating systems are in place to safeguard restricted data and actions. Consequently, apps that request more permissions have greater privilege to transmit a larger amount of data to backend and third-party entities. Analyzing the distribution of permissions requested by these female health apps allows for comparison against the app with the fewest privileges among them. If the app with the fewest permissions shares similar goals with the others, it may serve as a standard for the ”minimum number of permissions” required for these apps to operate optimally. This method follows the permission-analysis approach of Owens et al. (Owens et al., 2022).
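This baseline comparison can be sketched as a set difference against the least-privileged app; the app names and permission sets below are hypothetical.

```python
# Hypothetical dangerous-permission sets for three apps with similar goals.
apps = {
    "App A": {"READ_EXTERNAL_STORAGE", "WRITE_EXTERNAL_STORAGE", "READ_CONTACTS"},
    "App B": {"READ_EXTERNAL_STORAGE"},
    "App C": set(),
}

# Treat the app requesting the fewest dangerous permissions as the baseline.
baseline_app = min(apps, key=lambda a: len(apps[a]))

def excess_permissions(apps, baseline):
    """Report what each app requests beyond the baseline app's permissions."""
    return {app: sorted(perms - apps[baseline]) for app, perms in apps.items()}

excess = excess_permissions(apps, baseline_app)
```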
The most prevalent dangerous permissions gave applications access to external storage: READ_EXTERNAL_STORAGE and WRITE_EXTERNAL_STORAGE were the most common permissions found in the study, requested by four of the seven (4/7) applications. These permissions allow applications to read and write data on external storage devices, potentially exposing sensitive user data to unauthorized access.
We observed several requested permissions that were not commonly used. One such permission is REQUEST_INSTALL_PACKAGES, which, if granted, can be used to install any app package on the user’s device (Google, 2023f). Another permission, WRITE_SETTINGS, was requested by one application and can be used to change system settings on the device. The permissions READ_CONTACTS and READ_PHONE_NUMBERS were also each requested once and grant apps access to the user’s contact information and device phone number(s), respectively. The READ_PHONE_STATE permission was requested by one application and grants access to the phone state, including the current cellular network information, the status of any ongoing calls, and a list of any PhoneAccounts registered on the device. This is a highly privileged permission that includes the READ_PHONE_NUMBERS permission and the PhoneAccount class as a subset (Google, 2023d).
The permissions ACCESS_FINE_LOCATION and ACTIVITY_RECOGNITION were each requested once and grant an app access to the user’s precise location and physical activity (such as running, walking, or biking), respectively (Google, 2023a, b). Lastly, the SYSTEM_ALERT_WINDOW permission was also requested only once; it allows an app to create windows that are displayed over all other apps, such as pop-up notifications or floating widgets. Google recommends that very few applications use this permission, as the windows it creates are intended for system-level interactions (Google, 2023e).
4.1.1. Most Privileged App
Our analysis of mobile applications and their data practices revealed that App 1 has the highest level of privilege among the applications in our study. This is because it requests a large number of dangerous permissions, which grant access to sensitive data and resources on the device. Moreover, we found that App 1 requests most of the least widely used dangerous permissions, which are uncommon among other applications. Considering that App 1 has over 1 billion downloads on the Google Play Store, its data practices have a significant impact on a large number of Android users. We are currently conducting a further study to examine the functionality and features of App 1 and to evaluate the necessity and justification of its permission requests.
4.1.2. Least Privileged App(s)
In contrast to App 1, our analysis revealed that two other health applications, App 2 and App 5, did not request any dangerous permissions. Despite this difference in data practices, both applications have a significant user base. App 2 has over 100 million downloads, while App 5 has over 500K downloads.
4.2. Trackers: Information Sink
As shown in Table 2, our analysis detected the presence of various trackers within the applications studied. The majority of these trackers were utilized for analytics and advertising purposes. Notably, Facebook had the highest number of unique trackers present (5), which served a range of functions including analytics (based on user interaction data within the app), user identification (for login and sign-up purposes), and content sharing. Google also had a significant presence with 3 trackers, primarily focused on analytics and advertising.
Our analysis revealed that the highest number of trackers present in a single app was 8. Specifically, App 4 contained 8 unique third-party trackers, including trackers from both Facebook and Google. These primarily served advertising and marketing purposes, as well as analytics based on user interaction with the application. The app with the second-highest number of trackers was App 3, with 7 trackers present, serving similar functions to those found in App 4.
Trackers | # of Apps | Category
---|---|---
AppsFlyer | 2 | Analytics
AutoNavi / Amap | 1 | Location
Facebook Ads | 3 | Advertisement
Facebook Analytics | 2 | Analytics
Facebook Login | 3 | Identification
Facebook Places | 2 |
Facebook Share | 3 | Content Sharing
Google AdMob | 3 | Advertisement
Google CrashLytics | 3 | Crash Reporting
Google Firebase Analytics | 5 | Analytics
IAB Open Measurement | 1 |
myTarget | 1 | Advertisement
myTracker | 1 |
OneSignal | 1 |
4.3. Privacy Policy: Information Notice
As part of our analysis of privacy policies, we evaluated the availability of privacy policies on the Google Play Store page of each application. Ideally, an app should provide its privacy policy on its Play Store page for easy accessibility. Our findings revealed that approximately 40% of the applications in our study did not provide a privacy policy on their Play Store page. These applications account for more than 20 million downloads, putting easy access to privacy information out of reach for a large number of users.
Our analysis of the privacy policies of the selected applications revealed that none of them adhered to the principles of Data Minimization and Quality and Integrity. We found that these apps’ privacy policies claimed to collect more personally identifiable information (PII), such as user information and device identifiers, than was necessary for their core functionality. Furthermore, these apps did not limit their retention of PII and user data. Instead, they would either claim to retain it for business purposes even after a user deleted their account, keep it for an extended period of time, or use unclear and vague language regarding data retention. Some applications did not provide any notice on how long data would be kept. Additionally, none of the applications provided any notice on how they would ensure the accuracy and relevance of the data they collected to ensure fairness in their services.
The analysis further reveals that the least followed principles by the apps’ privacy policies were Accountability, Authority, and Purpose Specification and Use Limitation. We found that these apps’ privacy policies did not provide clear information on who is responsible for ensuring compliance with privacy regulations, who has the authority to access and use user data, and whether any training is provided to those who access users’ data and PII. Furthermore, there was a general lack of detail regarding the specific purposes for which user data is collected and used, which also correlates with the lack of adherence to the principle of Data Minimization. As a result, users may not have a clear understanding of how their data is being collected, used, and protected by these apps.
A majority of applications (70%) were observed to follow the principle of Individual Participation. Our analysis of their privacy policies revealed that they purport to involve user consent in their data practices. Specifically, these policies provide mechanisms for users to submit complaints or share concerns and queries regarding privacy and data processing. This indicates that the privacy policies claim to give users a degree of agency in providing input or feedback on personal data processing and to let them participate in decisions regarding its collection and use.
Privacy principles | App 1 | App 2 | App 3 | App 4 | App 5 | App 6 | App 7
---|---|---|---|---|---|---|---
Access and Amendment | ○ | ● | ○ | ○ | ○ | ● | ○
Accountability | ○ | ◑ | ○ | ○ | ○ | ○ | ○
Authority | ○ | ● | ○ | ○ | ○ | ○ | ○
Minimization | ○ | ○ | ○ | ○ | ○ | ○ | ○
Quality and Integrity | ○ | ○ | ○ | ○ | ○ | ○ | ○
Individual Participation | ◑ | ● | ◑ | ○ | ◑ | ◑ | ○
Purpose Specification and Use Limitation | ○ | ● | ○ | ○ | ○ | ○ | ○
Security | ○ | ● | ◑ | ○ | ○ | ○ | ○
Transparency | ○ | ◑ | ○ | ○ | ◑ | ○ | ○
5. Conclusion
In this paper, we conducted an exploratory study of the privacy and security of 7 popular Female Health Applications. We found that these applications requested a varied number of dangerous permissions, which gave them access to sensitive data and resources on the device. We also detected numerous third-party trackers in these applications, which could collect and share user data with external parties such as advertisers, analytics providers, or social media platforms. Furthermore, we analyzed the privacy policies of these applications using the FIPPs framework and found a general lack of adherence to various principles, such as notice, choice, access, security, and accountability. These findings raise concerns about the privacy and security of user information, especially given the current political and commercial surveillance of female health data in the post-Roe v. Wade era. To the best of our knowledge, this is the first comprehensive study of female health applications as a whole, extending the limited prior work on sub-categories. We are currently working on an extended set of applications with an enhanced analysis pipeline for a follow-up study. We hope that our current and future results will help all stakeholders improve privacy design through more informed user policies and data privacy practices.
References
- con (2022) 2022. Dobbs v. Jackson Women’s Health Organization (2022). https://constitutioncenter.org/the-constitution/supreme-court-case-library/dobbs-v-jackson-womens-health-organization.
- uuI (2023) 2023. Informed consent for data sharing. https://www.uu.nl/en/research/research-data-management/guides/informed-consent-for-data-sharing.
- umi (2023) 2023. Informed Consent Language for Data Sharing. https://www.icpsr.umich.edu/web/pages/datamanagement/confidentiality/conf-language.html.
- rep (2023) 2023. The Case in Depth - Center for Reproductive Rights. https://reproductiverights.org/case/scotus-mississippi-abortion-ban/dobbs-jackson-womens-health.
- Abraham et al. (2020) Aijn Abraham, D Schlecht, G Ma, M Dobrushin, and V Nadal. 2020. Mobile security framework (MobSF).
- Adhikari et al. (2014) Rajindra Adhikari, Deborah Richards, and Karen Scott. 2014. Security and privacy issues related to the use of mobile health apps. ACIS.
- Aïmeur et al. (2016) Esma Aïmeur, Oluwa Lawani, and Kimiz Dalkir. 2016. When changing the look of privacy policies affects user trust: An experimental study. Computers in Human Behavior 58 (2016), 368–379.
- Alfawzan et al. (2022) Najd Alfawzan, Markus Christen, Giovanni Spitale, and Nikola Biller-Andorno. 2022. Privacy, Data Sharing, and Data Security Policies of Women’s mHealth Apps: Scoping Review and Content Analysis. JMIR mHealth and uHealth 10, 5 (2022), e33735.
- Almeida et al. (2022) Teresa Almeida, Laura Shipp, Maryam Mehrnezhad, and Ehsan Toreini. 2022. Bodies Like Yours: Enquiring Data Privacy in FemTech. In Adjunct Proceedings of the 2022 Nordic Human-Computer Interaction Conference. 1–5.
- Ant-Wuorinen et al. ([n. d.]) Sini-Marja Ant-Wuorinen, Maria Knaapi, Heli Koskela, Emma Lindberg, Anni Lintula, and Linda Palenius. [n. d.]. “Your period starts in two days” Risks of period-tracking post Roe v. Wade. ([n. d.]).
- Arwa Mahdawi (2019) Arwa Mahdawi. 2019. If the government tracks women’s periods, why not track male ejaculation, too? https://www.theguardian.com/world/commentisfree/2019/nov/02/if-the-government-tracks-womens-periods-why-not-track-male-ejaculation-too.
- Balebako et al. (2014) Rebecca Balebako, Abigail Marsh, Jialiu Lin, Jason I Hong, and Lorrie Faith Cranor. 2014. The privacy and security behaviors of smartphone app developers. (2014).
- Bose (2023) Priyom Bose. 2023. The Femtech Revolution: How Technology is Empowering Women’s Health and Wellness. https://www.news-medical.net/health/The-FemTech-Revolution-how-technology-is-empowering-Womens-health-and-wellness.aspx
- Cohen (2022) Kristin Cohen. 2022. Location, health, and other sensitive information: FTC committed to fully enforcing the law against illegal use and sharing of highly sensitive data. https://www.ftc.gov/business-guidance/blog/2022/07/location-health-and-other-sensitive-information-ftc-committed-fully-enforcing-law-against-illegal.
- Court ([n. d.]) U.S. Supreme Court. [n. d.]. Dobbs v. Jackson Women’s Health Organization, 597 U.S. https://supreme.justia.com/cases/federal/us/597/19-1392/
- Elliott (2022) Vittoria Elliott. 2022. The fall of “Roe” would put big tech in a bind. https://www.wired.com/story/big-tech-roe-abortion/
- Felt et al. (2011) Adrienne Porter Felt, Erika Chin, Steve Hanna, Dawn Song, and David Wagner. 2011. Android permissions demystified. In Proceedings of the 18th ACM conference on Computer and communications security. 627–638.
- FPC (2022) Federal Privacy Council FPC. 2022. Fair Information Practice Principles (FIPPs). https://www.fpc.gov/resources/fipps/
- Google (2023a) Google. 2023a. ACCESS_FINE_LOCATION permission — Manifest permission. https://developer.android.com/reference/android/Manifest.permission#ACCESS_FINE_LOCATION
- Google (2023b) Google. 2023b. ACTIVITY_RECOGNITION permission — Manifest permission. https://developer.android.com/reference/android/Manifest.permission#ACTIVITY_RECOGNITION
- Google (2023c) Google. 2023c. Permission - Android Developers. https://developer.android.com/guide/topics/manifest/permission-element#pgroup
- Google (2023d) Google. 2023d. READ_PHONE_STATE — Manifest permission. https://developer.android.com/reference/android/Manifest.permission#READ_PHONE_STATE
- Google (2023e) Google. 2023e. SYSTEM_ALERT_WINDOW permission — Manifest permission. https://developer.android.com/reference/android/Manifest.permission#SYSTEM_ALERT_WINDOW
- Google (2023f) Google. 2023f. Use of the REQUEST_INSTALL_PACKAGES permission. https://support.google.com/googleplay/android-developer/answer/12085295?hl=en.
- Kesan et al. (2015) Jay P Kesan, Carol M Hayes, and Masooda N Bashir. 2015. A comprehensive empirical study of data privacy, trust, and consumer autonomy. Ind. LJ 91 (2015), 267.
- Li et al. (2021) Rui Li, Wenrui Diao, Zhou Li, Jianqi Du, and Shanqing Guo. 2021. Android custom permissions demystified: From privilege escalation to design shortcomings. In 2021 IEEE Symposium on Security and Privacy (SP). IEEE, 70–86.
- Maas (2022) Mary Maas. 2022. Data Privacy in the Femtech Industry — vanderbilt.edu. https://www.vanderbilt.edu/jetlaw/2022/11/08/data-privacy-in-the-femtech-industry/.
- Masling et al. (2023) Sharon Masling, Shane Sigel, and Megan Lipsky. 2023. Evolving Laws and Litigation Post–Dobbs: The State of Reproductive Rights. https://www.morganlewis.com/pubs/2023/05/evolving-laws-and-litigation-post-dobbs-the-state-of-reproductive-rights-as-of-may-2023.
- Matsakis (2019) Louise Matsakis. 2019. The WIRED Guide to Your Personal Data (and Who Is Using It). https://www.wired.com/story/wired-guide-personal-data-collection/.
- Mehrnezhad and Almeida (2021) Maryam Mehrnezhad and Teresa Almeida. 2021. Caring for intimate data in fertility technologies. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–11.
- Muller et al. (2023) Regina Muller, Nadia Primc, and Eva Kuhn. 2023. ’You have to put a lot of trust in me’: autonomy, trust, and trustworthiness in the context of mobile apps for mental health. Medicine, Health Care and Philosophy (2023), 1–12.
- Narwani (2023) Deepa Narwani. 2023. Unlocking the full potential of Femtech in addressing women’s health needs. https://insights.omnia-health.com/technology/unlocking-full-potential-femtech-addressing-womens-health-needs
- Nguyen (2020) Stephanie Nguyen. 2020. Reproductive Health Apps: A Digital Standard Case Study - Innovation at Consumer Reports. https://innovation.stage.consumerreports.org/reproductive-health-apps-a-digital-standard-case-study/.
- Olivero (2022) Amy Olivero. 2022. Privacy and digital health data: The femtech challenge — iapp.org. https://iapp.org/news/a/privacy-and-digital-health-data-the-femtech-challenge/.
- Owens et al. (2022) Kentrell Owens, Anita Alem, Franziska Roesner, and Tadayoshi Kohno. 2022. Electronic Monitoring Smartphone Apps: An Analysis of Risks from Technical,Human-Centered, and Legal Perspectives. In 31st USENIX Security Symposium (USENIX Security 22). 4077–4094.
- Rosato (2020) Donna Rosato. 2020. What Your Period Tracker App Knows About You - Consumer Reports. https://www.consumerreports.org/health-privacy/what-your-period-tracker-app-knows-about-you-a8701683935/.
- Scott (2022) Dylan Scott. 2022. It’s not just about abortion. https://www.vox.com/23137822/abortion-birth-control-health-care-roe-wade
- Shipp and Blasco (2020) Laura Shipp and Jorge Blasco. 2020. How private is your period?: A systematic analysis of menstrual app privacy policies. Proc. Priv. Enhancing Technol. 2020, 4 (2020), 491–510.
- Suran (2022) Melissa Suran. 2022. Treating cancer in pregnant patients after Roe v Wade overturned. JAMA 328, 17 (2022), 1674–1676.
- Totenberg and McCammon (2022) Nina Totenberg and Sarah McCammon. 2022. Supreme Court overturns Roe v. Wade, ending right to abortion upheld for decades. https://www.npr.org/2022/06/24/1102305878/supreme-court-abortion-roe-v-wade-decision-overturn
- Valenti (2015) Jessica Valenti. 2015. It isn’t justice for Purvi Patel to serve 20 years in prison for an abortion — Jessica Valenti. https://www.theguardian.com/commentisfree/2015/apr/02/it-isnt-justice-for-purvi-patel-to-serve-20-years-in-prison-for-an-abortion
- Vallor (2018) Shannon Vallor. 2018. An introduction to data ethics. Course module.) Santa Clara, CA: Markkula Center for Applied Ethics (2018).
- Yashar and Wannon ([n. d.]) Moryel Yashar and Sabrina Wannon. [n. d.]. Future Flora as a Case Study for FemTech’s Role in Science: Tackling the Taboo Head-On. ([n. d.]).