
for myself, which comes as a femtech app. And as I was searching for it (I have a habit of looking for the privacy policies and data practices of apps before I give them my data), the deeper I looked, the more concerning and stranger the policies became. That's how I fell into a rabbit hole of really weird data collection practices, and that's why I'm here to share my insights with you. I hope you enjoy it. So let's start with: what is femtech? FemTech, or female technology, is a term for software and services that use technology tailored towards women's health and wellness. These are fertility trackers, period trackers, pregnancy and nursing care, women's sexual health and wellness: applications like Flo, Clue, Maya. You must have heard these names; they are really big right now. And femtech is not limited to mobile apps; it also includes wearables, diagnostics, and telehealth services. But today we will focus mostly on the mobile apps. Femtech in Europe is a big thing now. Europe accounts for 28% of the global femtech market, with more than 540 active companies in 23 out of 44 countries. And by 2032, the European femtech sector is expected to grow beyond $35 billion. Isn't that insane? And still we are at a stage of infancy with femtech, because it's a highly underfunded and under-researched field. Even if it's growing in terms of money, we still lack a lot of data collection and regulation around it. And I'd say this gap is not just something to be bothered by; it's a good opportunity to create more transparent and evidence-based femtech in the coming years. Femtech apps have been around for a long time now, since the early 2000s, but they got traction after the Flo app's data privacy controversy in 2021, when it emerged that the app was sharing users' data with third parties. And it's continuing: it happened again this year, twice now. But I can't really go into
the political nuance here, but it's high time that we look into it. These apps are incredibly popular. They have tens of millions of users per month and hundreds of millions of downloads worldwide. But why do we seek femtech apps? To understand our bodies better, to manage fertility, to achieve or avoid pregnancy, and for emotional validation and community. A lot of these apps have a community feature, so you can go in and ask your questions. They do show a disclaimer saying not to post personal data, but there's a lack of digital literacy around the world, so we put our personal data there as well. We also use them for convenience, because it's not always easy to track your period manually, and to feel empowered and informed. And I relate to all of this. When I was young, I was taught to manually track my cycle on a calendar in our kitchen. Today, with all the femtech apps, we have so many more insights, so many more data points to look at. But the thing is, femtech apps also come with a lot of empowerment wrapped around them. You might recognize some of these slogans that very famous femtech apps use. "Your health, simplified with us." "The birth control app your friends won't stop talking about." My favorite: "all-natural birth control powered by your body's signals." Does that sound empowering to you? Does it? Well, it's not just the slogans. Empowerment comes wrapped in aesthetics as well, and these are deliberately chosen. Once you spot this aesthetic, I'm sure you won't unsee it. Bear with me. Can you see all these apps? These are the most famous ones. If you go to the Google Play Store or the Apple App Store and type "femtech" or "period tracker," these are the ones at the top of your list. Can you sense a theme? Is it all too pink? I find it quite stereotypical, but that's my opinion and you can have your own. Some people like pink, and that's cool. But this is the aesthetic they use: all of these apps are pink, and the second most used color is purple. I get that it's a marketing choice; pink reads as a feminine color, used for visibility and inclusivity. But these apps were made to empower women. Don't you think that with this pink branding and these empowerment slogans, they're drifting away from empowerment, the very trope they were made for?
So what is beyond the pink? Have you heard about the pink tax? The pink tax is usually associated with everyday products like razors, deodorants, shampoos, and lotions, which women pay much more for because they come in pink. It's not usually associated with femtech, but with AI-powered insights and premium versions of these apps, I think the pink tax has found its way into femtech as well. And it's not just money: we also pay in privacy and data exploitation. Isn't that a pink tax too? So behind these friendly colors and empowerment slogans lies a complex ecosystem of data, algorithms, and sometimes invisible risks. Let's get into some technicalities.
These are some of the data collection practices femtech apps currently use. Whenever you download any kind of app, it takes your device data, which I'm fine with. It takes your behavioral metadata, which again is general data; all apps do that. Fine. But then, with femtech, they ask you to log into the app, so there goes your name, email, and contact information. Most apps now have premium features, so there goes your payment information as well. A lot of them collect location data for some reason, which I find unnecessary, but they do. All of this is personally identifiable information, or PII. What goes unnoticed with femtech is that, beyond this, these apps collect far more personal and intimate information. There goes our menstrual cycle data: period dates, irregularities, flow intensity, ovulation test results, pregnancy tracking, conception date, due date, menopause symptoms, birth control. Then there's sexual health data; femtech apps not only take your data, but indirectly take your sexual partner's data as well. Then mental health and emotional well-being: they have really good mood-logging features, which are insightful. But all of this information, when combined, creates a digital profile. Even if it's not coming from just one app, once information about you arrives from third parties too, it adds up to a whole profile of a person. Then there's lifestyle data; these apps even have a water-intake feature, which is a good example of how niche this is getting. Then sleep patterns, breastfeeding and lactation data, medical history, and doctor visits. This is what I find really concerning: you can log your doctor visits, and some apps also take what medication you're on or what treatment you're going through, so you can log your diagnosis as well. It comes with a disclaimer: when they provide AI-powered health insights, they say you should not rely on them and should seek help from a medical practitioner. But it's written so small that you hardly see it, or if it's a pop-up, we don't take the five seconds to read it; we just close it because it's irritating. Let me give you an example of why this is so crucial. Imagine a woman logging her breastfeeding data into the app while the app is also collecting her location data, and somehow this data is leaked, or sold, or ends up in the hands of data brokers.
They then have a full profile of her as well as her child. They know where she is, alone with her child, at her most vulnerable. And this is not just a digital risk anymore; it has escalated into a physical safety risk. That invites us to look at the privacy policies of these apps. Do we actually read privacy policies? Nobody does, right? They are long, complex, full of jargon, and the accept button is in bold pink or blue or whatever color you like. And there's hardly an option to reject, because if you reject, you might not be able to use the app. But if you take some time and read them, you might find some concerning patterns. In 2022, Mozilla did a study of these femtech apps and their privacy policies and rated them on a scale of creepiness, from most creepy to least creepy. And kid you not, the most famous apps right now are the creepiest. I thought, well, that was 2022, so I did a review myself three years later, expecting it might be better. It was not any better. This is what I found. First, there's ambiguity in language. They use phrases like "we may share your data with trusted partners" or "your data is used to improve our services." It sounds harmless, but it's vague. Who are these trusted partners? What data is being used to improve which services? We don't know. I read one of these privacy policies and it said "we do not sell your data," and just three lines below that, it said they use your data for marketing and research purposes. Doesn't that mean it's still being monetized somehow? Then there are difficult opt-out options. Of course, because of the GDPR, these apps have to provide user control and deletion. But in any of these apps, it is so difficult to find where to opt out of tracking or delete your data. This is one thing an app should make easy, isn't it? We don't read the whole privacy policy anyway; at least let us find the settings clearly. Then there's third-party involvement, of course. A lot of these apps let you sync with your partner's device, and with wearables like Fitbit, and they also integrate with other apps, like your phone's built-in health apps. All of this sounds nice: you can integrate, everyone gets visibility. But if you read the accompanying privacy policies, there's a real lack of responsibility. The femtech app might say you are free to integrate your data, but that it takes no responsibility for data that goes outside the app. And then
if you read the privacy policy of any wearable, it says it gets its data from third parties and is not responsible for how that data is tracked or obtained. So who do we complain to? In terms of regulation, the EU is still in decent shape, but we don't have one single regulatory body for femtech. Right now we have three frameworks that partially align with it. There's the MDR, the Medical Devices Regulation, which determines when a femtech app counts as a medical device and must meet safety and testing standards. Then there's the Digital Services Act, which is more reactive than proactive; by that I mean they act only if you complain, otherwise it's just guidelines. It addresses platform transparency, targeted advertising, and accountability, so it's more about influencers and how you market the app. And of course there's the GDPR, which governs how sensitive health data is processed, stored, and shared. There are upcoming EU regulations, the EHDS (European Health Data Space) and the Digital Fairness Act, which I think we should keep an eye on. But because regulation is so fragmented, and given how fast the femtech industry is growing, I think we need a dedicated regulation for it, or at least we need the MDR to act on it. So let's talk a bit more about the MDR. Medical device regulation applies only when a company labels its product as a medical device. Most of these apps label themselves as "wellness" or "health and fitness," and that's how they bypass the MDR. The MDR does state that if your device or app is used to diagnose or predict things related to reproduction or fertility, it should be termed a medical device. In practice, however, it is not, and that's a big gap. That's how they bypass all the safety testing and oversight. What do you think a privacy-first future would look like for femtech? Have you heard about data anonymization? I bet everyone has. That's just the baseline you need under the GDPR. But research has shown us over and over that you can often trace back to the original data even after it's been anonymized, and none of the femtech apps have really gone beyond basic anonymization. There's another technique called differential privacy, which means adding calibrated noise to your data: an advanced form of anonymization that makes it much harder to trace results back to an individual. It's time we used better measures like these. A lot of these apps use cloud backups, I assume because it's easy, but cloud storage comes with its own vulnerabilities. Apple's Health app, by contrast, does on-device data processing, and it asks users whether they want to back their data up at all, which I think is a fair deal. Why should somebody be able to look at my data just because it's backed up somewhere? If I know I don't want to back this data up, that feels empowering; that feels like I'm in control. But again, it's just the Apple Health app doing on-device processing with no default backups; it's not a norm yet. User-controlled deletion, as I said earlier: those settings are buried deep in the app and really difficult to find. And what's the guarantee that the data is collected responsibly, or that when I delete my data from the app, it's actually deleted? When the 2021 Flo controversy happened, a lot of women deleted the app completely, but that doesn't mean the data was deleted; only the app was. Again, there's such a lack of digital literacy that we think: if I don't use the app anymore, I'm safe.
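The differential privacy I mentioned a moment ago can be sketched in a few lines of Python. This is a toy illustration, not code from any real femtech app: it adds Laplace noise, calibrated by a privacy budget epsilon, to an aggregate count before release, so the reported number no longer pins down any single user's log.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse-CDF sampling on a uniform draw."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with epsilon-differential privacy.
    One user joining or leaving changes the count by at most
    `sensitivity`, so the noise scale is sensitivity / epsilon."""
    return true_count + laplace_noise(sensitivity / epsilon)

# e.g. "how many users logged a symptom this week"
noisy = private_count(4213, epsilon=0.5)
print(round(noisy))  # close to 4213, but any single user remains deniable
```

Smaller epsilon means more noise and stronger deniability; the aggregate stays useful while individual contributions are hidden.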
Then there are encrypted data syncs. This is something I was really shocked to learn. As I said, these apps can sync with your partner's app or with wearables. If you read the policies or data collection practices of a lot of these apps, they explicitly mention that there is no encryption in between, and that is crazy. We have advanced so much with other security measures; how is this data still not encrypted in transit?
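To make "encrypt before it leaves the device" concrete, here is a minimal sketch. This is not any real app's code; it uses a one-time pad (a fresh random key as long as the message, used once) purely to illustrate that a sync server should only ever see ciphertext. A production app would use a vetted library such as libsodium or an AES-GCM implementation, not this toy.

```python
import os

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """One-time-pad encryption: XOR each byte with a key byte.
    Secure only if the key is truly random, as long as the
    message, and never reused."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# What the phone would do before syncing a log entry:
entry = b"2024-06-01,period_start,flow=heavy"
key = os.urandom(len(entry))      # stays on the device / with the user
ciphertext = encrypt(entry, key)  # this is all the sync server sees

assert decrypt(ciphertext, key) == entry
```

The point is the architecture, not the cipher: the key never leaves the user's devices, so a breach of the sync server exposes only unreadable bytes.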
When we talk about data breaches, we usually talk about monetary or reputational losses. But with femtech collecting so much intimate information, I think we need a more sociological approach, and I'll tell you why in a bit. If we look at this as a threat model, the key assets are personal and intimate health data, behavioral metadata, user identities, shared privacy, and user-generated content. The threat actors are not just developers and advertisers; there are also employers, governments, and abusive partners. For example, if an employee's data is breached from her femtech app, it might first lead to a lot of stigmatization. But also, if everyone in the company knows she's trying to conceive, and suddenly her manager passes her over for a promotion, isn't that workplace discrimination? It's an extreme example, I get it, but it could still happen. That's why I say employers, governments, abusive partners, communities, stalkers, and cybercriminals are all threat actors. In a lot of societies and countries right now, menstruation is a taboo topic, and not many people talk about it openly. If somebody there is using a femtech app and someone finds out they are trying to conceive, or that they have a missed period, it can cause a lot of social backlash. And in terms of government prosecution, in countries with really stringent abortion laws this could go haywire, because even a missed period, or even a gap in your logging, can be used as evidence to prosecute someone. Then there are the attack vectors: dark patterns, SDKs, GPS (I still don't get why these apps need location data), APIs, coercion and surveillance, inferred disclosures, and legal loopholes. One thing to mention about the health insights these apps provide: we don't really have a cohesive EU dataset for femtech, just a limited amount of data coming from the US, because it all started there. We'd need more local and comprehensive datasets to base health advice on. Instead, these apps assume a lot about your gender and emotions, and that's how the health advice goes. That's what I mean by inferred disclosures: they draw conclusions from generic assumptions. There are critical gaps, as we spoke,
about weak and manipulative consent mechanisms, lack of data minimization, and the absence of collective privacy safeguards. By collective privacy I mean not just the logged-in user's privacy, but also the privacy of their child or their partner, because we are indirectly giving all that data away as well. Then there's algorithmic bias and lack of transparency. Psychological harm is something we always miss: over-reliance on flawed AI, because the apps do say these are health insights, not health advice, but in an emergency we take them as health advice. There have been plenty of comments and social media posts saying "the app is so good that it told me about my health condition before my doctor did," which is just crazy. Then false reassurance, health anxiety, and obsessive tracking. It can get obsessive because, much like Instagram reels, it's addictive: the app gives you health insights every now and then, and pops up notifications like "you must be cranky today because you might be ovulating," which is really annoying at times. I'm not cranky today, but now that I saw the notification, I am. There is a lot of shame and stigma around this as well: if any of this data leaks, it could lead to a lot of stigma for women and their families. Overall, this creates a loss of trust in digital care. These apps were made to empower women; I don't think they're doing that anymore. This has been going on for a long time, with news story after news story about sensitive data and really concerning data practices, and that erodes trust, which is the last thing we want, isn't it? So I think real empowerment in femtech requires re-centering privacy, autonomy, and ethical accountability, while transforming invisible harms into visible change. Our data belongs to us: not to governments, not to advertisers, nobody but us. So I just want to urge everyone to be aware, ask questions, and take control of your data. Thank you.
Do we have time for questions?
>> We have about two minutes for questions. So, does anybody have a question?
>> Hello. Thank you for the great talk. I think it's a very important topic, and I'm just curious whether you know of any digital alternatives we can use. We can of course use normal calendars, but what about the digital world?
>> There are a few options that are pretty fine, like Apple's Health app, or an app called Euki that does on-device processing, so it's almost offline. And for myself, I've written a Python script that I use to track my cycle. A lot of these apps now have anonymous modes as well, which I'm not very sure about; they don't have the same functionality as the full app, but it's still better to be anonymous somehow, I guess. The best way, though, is to write the code yourself if you can.
>> All right, one more question maybe. Okay.
>> Thank you for the talk, it was really insightful. You spoke about regions globally where some of this is taboo to talk about. Have you noted in your research any apps or companies that have adapted their policies to be more discreet in regions where it's considered taboo?
>> I did look at that, and no, none of the apps have actually taken a sociological approach. They're very generic and mostly targeted towards the US and Europe; they hardly take into consideration regions, like parts of India or Africa, where it's really critical to even speak about these things. There's absolutely no consideration. And there's a lot of lack of awareness around this. If I talk to the older people in my family, they don't even know these apps exist. When it comes to the younger generation, even if they know the apps, there's no consideration of how much data we should give them. With social media it just becomes a bubble; the word spreads, and again, no consideration. This has already happened in some countries with really stringent abortion laws, where this kind of data has been used for prosecution, and it has gone really, really wrong.
>> Okay, we're out of time. Thank you very much for the talk.
>> Thank you so much.
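The do-it-yourself approach suggested in the Q&A (a small script you write for yourself) could look like the following. This is a hypothetical sketch, not the speaker's actual code: it keeps period start dates in a plain local list, averages the gaps between cycles, and predicts the next start date, with nothing ever leaving your machine.

```python
from datetime import date, timedelta
from statistics import mean

def predict_next_period(starts: list[date]) -> date:
    """Predict the next period start from logged start dates,
    using the average gap between consecutive cycles."""
    if len(starts) < 2:
        raise ValueError("need at least two logged cycles")
    starts = sorted(starts)
    gaps = [(b - a).days for a, b in zip(starts, starts[1:])]
    return starts[-1] + timedelta(days=round(mean(gaps)))

# Example log, kept in a local file or even in this script itself
log = [date(2024, 3, 2), date(2024, 3, 30), date(2024, 4, 28)]
print(predict_next_period(log))  # 2024-05-26
```

No account, no cloud, no third parties: the entire threat model discussed in the talk disappears when the data never leaves your device.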