
Femtech And Data Privacy: What's At Stake?

BSides Leeds · 2026 · 26:53 · 26 views · Published 2025-08 · Watch on YouTube ↗
About this talk
Femtech apps (period trackers, fertility monitors, and digital contraceptives) are a rapidly growing industry, but security and privacy lag far behind functionality. This talk examines the intimate health data these apps collect, the weak privacy policies and regulatory gaps that fail to protect it, and concrete threats from data brokers, stalkers, governments, and employers. The speaker proposes technical and policy solutions, including device-side processing, end-to-end encryption, and threat modeling that accounts for physical safety risks.
Transcript [en]

Hi, I'm Adohi Nik. I work as a security engineer at SQR, and today I'm talking about femtech and data privacy: what's at stake? Let's get started with what femtech is. Femtech, or female technology, refers to apps, services, or software that deal with women's health and wellness. These are period trackers and fertility trackers, nowadays also known as digital contraceptives, and you might have heard of some quite famous ones like Flo, Glow, Maya, or Eve. The screenshot you can see on the screen is taken directly from the Google Play Store. Can you sense a bit of a theme going on there? For me, it's a bit too pink, and I find it quite stereotypical. However, it is meant to make us feel more feminine; that's a topic for another day. Can I have a quick show of hands: how many of you use these apps or have heard of them? Cool.

While this might amuse you, femtech is now considered a separate industry, with a growth rate of 36.3%, which is huge; it is expected to reach 310 million by 2026 and maybe billions by 2031. And that statistic is just for the UK; globally the numbers are far bigger, and there is a lot of investment going into femtech, this year especially. But it is all going towards making the apps more functional or integrating more AI into them, and hardly anything towards securing the infrastructure or defending the users.

So why did I start getting into this? These apps have been in use for a long time, since the early 2000s, but they got significant attention after the Flo app's data privacy controversy in the US in 2021. Flo, which is pretty famous for period tracking, was found to be sharing its users' private data with companies like Google and Facebook for analytics, advertising, and marketing, without the users' consent. I got pretty skeptical then about why I should use these apps if my data is not private, and I think a lot of people did as well; people were scared to download the apps after that. Since then there has been a bit more research into femtech, and there have been other news items too: data breaches, apps not following security practices, or apps simply not working the way they should.

What caught my eye was that last year Amazon shut down its secret project to develop a fertility tracker and laid off hundreds of people. I found it pretty insulting that they treated it as an experimental business. Amazon is such a big conglomerate that we expect them to do well with any kind of product, and if they had built a good fertility tracker, it would have reached millions of women and might also have helped smaller companies get better reach. But they shut it down, and to be honest I found that infuriating. That's the point: if our reproductive health is going to be treated like a side hustle, why are we expected to function like a full-time job? Let's get into some technicalities here. What are these apps' data collection practices?

When you start with any of these femtech apps, they all want you to create an account and log in first. So of course they take your name, email address, and contact information, plus device data, a user ID, and a device ID. They also collect your location data, which is a bit dodgy. Then you log your everyday mood, which I actually like, and you grant permissions like cookies, where 90% of the time you just accept all and move on. Some of these apps have now started premium versions with AI-powered health advice, so there goes your payment information as well. This is all still fine in a sense, because it is the PII and non-PII that practically every app collects.

But with femtech, what we mostly don't consider is the intimate data. It's not just menstrual cycle data. It is ovulation test results, your BMI, your mental health and sexual health data, your lifestyle, diet and nutrition, sleep patterns, breastfeeding and lactation routines, and also medical history, doctor visits, and prescriptions. The app makes you log all of it, and it's pretty critical, but developers have a tendency to treat all data as just another piece of data rather than as uniquely sensitive, and I think that needs to change.

Let me give you an example of why this is so serious. Imagine a woman logs her breastfeeding routine into the app while her GPS is on in the background. Now we know when and where she is during the day, alone with her child. To the app developers this is just information, but if it is leaked or sold, stalkers or abusers can leverage it to track, manipulate, or take advantage of someone at her most vulnerable. So it's not just a privacy risk any more; it has become a physical safety risk. Do you think this much information is needed just to predict your cycle? Do we need to log that much? Do you think some of it, like location data, is actually unnecessary? I don't see the point. And what is the guarantee that this data is collected and handled wisely? That's something to think about, isn't it?
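To make the minimization point concrete: a basic cycle predictor needs nothing beyond period start dates. Here is a minimal sketch in Python; the dates and the averaging rule are my own illustration, not any app's real code.

```python
from datetime import date, timedelta

# The only input a basic cycle predictor needs: period start dates.
starts = [date(2025, 5, 3), date(2025, 6, 1), date(2025, 6, 30), date(2025, 7, 28)]

# Average cycle length from consecutive start dates.
gaps = [(later - earlier).days for earlier, later in zip(starts, starts[1:])]
avg_cycle = sum(gaps) / len(gaps)

# Predict the next start. No location, mood, BMI, or medical history required.
next_start = starts[-1] + timedelta(days=round(avg_cycle))
print(f"average cycle: {avg_cycle:.1f} days, next predicted start: {next_start}")
```

Everything else the apps ask for is surplus to this core function.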

That brings me to privacy policies. Do we actually read them? Let's be honest, nobody does. They are long and complex, full of legal jargon, and pretty boring as well. But if you take the time to read them, like I did with the femtech apps, you will see a lot of ambiguity in the language. They use phrases like "we may share your data with trusted partners": we don't know who these trusted partners are. Or "your data is used to improve our services": what data is used to improve which services? They also say things like "we do not sell your data", but then three paragraphs down they say they use it for marketing and research purposes, which still means the data is being monetized. Another issue I have with these apps: even if they say they have an option to delete your data, what is the guarantee that the deletion happens not just in the app but on the servers as well? And these deletion settings are buried so deep in the app that they are difficult to find. Then there is no single legal or regulatory framework.

In the UK, femtech operates under the MDR, the ASA, and UK GDPR. The MDR, the Medical Device Regulation, does safety testing for apps and provides guidelines on how to develop a secure app. The issue is that these apps mostly label themselves as wellness apps, not medical devices. Since these apps do provide health advice and do ask you to log your medical history, they should be labeled as medical devices, not just health and lifestyle tech. By labeling themselves as wellness apps, they bypass all of that safety testing and oversight. The ASA gives guidelines to influencers and marketing teams on how to market an app, but it only takes action when a complaint is made against it, and since we are hardly aware of app labeling or transparent labeling, nobody complains. It's the same with UK GDPR: under GDPR we have the right to ask questions about our data and to ask for its deletion, but there is a lack of awareness, and we just don't ask. From a security point of view, I believe in privacy-first practices: privacy and security should be ingrained from the very start of the design. From a technical point of view, I think decentralization is a way we can go, which means processing data on the device.
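As a sketch of what on-device processing could look like: all storage and computation stay local, and data leaves the phone only on explicit opt-in. This is illustrative Python with a hypothetical local database, not Apple's or any vendor's implementation.

```python
import sqlite3
from datetime import date

# A local SQLite file on the device, not a cloud database.
db = sqlite3.connect("cycle.db")
db.execute("CREATE TABLE IF NOT EXISTS periods (start_date TEXT)")
db.execute("INSERT INTO periods VALUES (?)", (date.today().isoformat(),))
db.commit()

def export_backup(user_opted_in: bool):
    """Data leaves the device only if the user explicitly opts in; a real app
    should also encrypt this client-side before uploading it anywhere."""
    if not user_opted_in:
        return None
    return [row[0] for row in db.execute("SELECT start_date FROM periods")]

print(export_backup(False))  # None: nothing is ever sent without consent
```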

Apple's Health app does that: it processes all the data on the device and then asks users whether they want to back it up. No other apps do that, absolutely none, and it's not impossible, right? These apps also have functionality to share data with other devices, such as a partner's device, but there is no encryption in place. We absolutely need end-to-end encryption there, like WhatsApp, because of the intimate and private data being shared.
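A minimal sketch of what end-to-end encrypted partner sharing could look like, using PyNaCl; the keys and payload are illustrative assumptions, not any existing app's protocol. The point is that the server only ever relays ciphertext it cannot read.

```python
from nacl.public import PrivateKey, Box

# Each person generates a keypair on their own device; private keys never leave it.
my_key = PrivateKey.generate()
partner_key = PrivateKey.generate()  # in reality, generated on the partner's phone

# Encrypt a cycle entry so only the partner can read it.
box = Box(my_key, partner_key.public_key)
ciphertext = box.encrypt(b'{"cycle_start": "2025-08-01"}')

# The partner decrypts with their private key and the sender's public key.
plaintext = Box(partner_key, my_key.public_key).decrypt(ciphertext)
print(plaintext)
```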

When it comes to policy and regulation, we spoke about MDR supervision. There is an ISO standard for digital health companies, but it is not very granular to femtech, so we need something specific to femtech, given the growth of the industry. Of course there also need to be clearer data protection guidelines, transparent app labeling, and accountability for companies and for influencer claims. And to reiterate GDPR: we need to exercise our rights. Going into the granular security bits, I think we need a better-suited sociotechnical threat model. In our traditional threat models we focus on assets like devices or APIs, but we don't consider the personal, intimate health data, the user-generated content, or the user identities. When it comes to threat actors, it's not just hackers or cyber criminals. It is advertisers, data brokers, stalkers, abusive partners, or, you know, a backward community. Then there are governments.
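A toy sketch of how such a sociotechnical threat model might record entries a device-and-API-only model misses; the actors, vectors, and impacts are illustrative, drawn from the examples in this talk.

```python
from dataclasses import dataclass

@dataclass
class Threat:
    actor: str
    asset: str
    vector: str
    impact: str

threats = [
    Threat("data broker", "cycle and location logs", "third-party SDK sharing", "profiling and resale"),
    Threat("abusive partner", "linked-device access", "coercion, shoulder surfing", "physical safety"),
    Threat("government", "missed-period inference", "subpoena or purchased data", "legal action"),
    Threat("employer", "fertility and pregnancy data", "breach or leak", "workplace discrimination"),
]

for t in threats:
    print(f"{t.actor}: {t.asset} via {t.vector} -> {t.impact}")
```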

Let me give you an example. Some countries have restrictive abortion laws, and if such data gets out, it could be weaponized legally against you: just a missed period could get you sued in court, if this kind of data is surveilled by a government, sold on, or leaked somehow. As for attack vectors, we are not just looking at dark patterns or unreliable SDKs, but also at surveillance, inferred disclosures, and legal loopholes.
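To show how trivially a "missed period" can be inferred from leaked or purchased logs, here is a small sketch with made-up dates. Anyone holding the raw data, whether a broker, a prosecutor, or an insurer, can run something like this.

```python
from datetime import date

# All an outside analyst needs is the raw log of period start dates.
starts = [date(2025, 3, 2), date(2025, 3, 30), date(2025, 4, 27)]  # logging then stops
today = date(2025, 7, 1)

days_since_last = (today - starts[-1]).days
if days_since_last > 45:  # well past a typical cycle length
    # The inference may be wrong (the user may simply have stopped logging),
    # but it can still be sold, subpoenaed, or acted on.
    print(f"{days_since_last} days since last logged period: possible pregnancy inferred")
```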

When it comes to impact and risk, this is the most important part, I think, because it has escalated to physical risk now. Then there is privacy violation, surveillance and profiling, and data monetization. Discrimination is a good one: say somebody's femtech app data is leaked in your organization, and she is trying to conceive. The manager finds out about it and passes her over for a promotion, because they think she is going to take maternity leave. That's workplace bias, and along with it comes a lot of shame and stigma. What we don't really think about is the emotional and social stress around all of this.

The traditional threat models, which all the femtech apps also use, have nothing specific to the sociotechnical side, or even to how the user experience is designed. Instead there are weak or manipulative consent mechanisms. There is no granular consent: you either accept everything or get no functionality. That is mostly the case with location data: either you share your GPS or the app doesn't work at all. Then there is a lack of data minimization, and an absence of collective privacy. Collective privacy, or shared privacy, means the data collected is not just about the enrolled user but about others as well, like a sexual partner or a child, and we never consider this.
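Going back to granular consent: a toy sketch of per-category, opt-in consent where the app degrades gracefully instead of withholding all functionality. The categories and defaults are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Consent:
    # Every category is opt-in on its own; nothing is all-or-nothing.
    cycle_dates: bool = True      # the one thing prediction actually needs
    location: bool = False        # never required, so off by default
    mood: bool = False
    partner_sharing: bool = False

def log_entry(consent: Consent, category: str, value):
    """Accept data only for categories the user has consented to."""
    if not getattr(consent, category):
        return None  # degrade gracefully instead of refusing to work
    return value

c = Consent()
print(log_entry(c, "location", (53.8, -1.55)))    # None: not consented, app still works
print(log_entry(c, "cycle_dates", "2025-08-01"))  # accepted
```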

Then of course there is algorithmic bias and a lack of transparency. When it comes to data breaches, we usually think about reputational or financial damage, but hardly about mental health or psychological harms. These apps also rely a lot on AI now, as you just heard in the last talk, and sometimes there is over-reliance on it: all these apps have AI integration and provide health advice. When I mentioned inferred disclosures, I meant that you give just a little bit of information, the AI assumes things and draws conclusions from it, and people can trust it and fall for it, which might lead to health anxiety and obsessive tracking. And there are certain countries where talking about menstruation or periods is still considered taboo; if that information is leaked, or even just spoken about, it can lead to a lot of shame and stigma, or, at the extreme, a social call-out. In the end it means a loss of trust in digital healthcare, and that's the last thing we want. That's why I just want to say: our health data belongs to us, not to advertisers, governments, corporations, or anyone else, and it's time to demand privacy in femtech. I would also urge everyone to please be aware, ask questions, and take control of your data. That's my time.

Thank you. Thanks so much for listening. [Applause]
>> Questions. Anyone got questions?
>> Yeah. Have you ever submitted a subject access request to see exactly what they do?
>> Sorry?
>> In the UK we can submit subject access requests. Have you ever seen the results of one of those?
>> Oh, I have not, but yes, we can. Also, in the EU you can directly contact the developers of an app to ask for data deletion, or to ask what kind of data they hold. I think that is a great thing to incorporate. Question there. What we'll do is, if you can, repeat the question back for the recording. Yeah.

>> Can you please repeat? Sorry.
>> So, is data monetization ever good for privacy? Or could it be, do you think?

>> Well, if the data is being monetized for research, then there should be a boundary to it, right? What kind of data is actually used for monetization? The thing is, they don't mention very clearly what kind of data they are using for research. And just recently I saw that there is a lot of investment going on in these things; the UK government is also investing in femtech, but nobody is speaking about how that is going to make the apps more secure. It's more capitalism. So I really don't think data monetization is helping right now.

>> Yeah. So, Mozilla... sorry, I'll repeat the question: has anyone analyzed whether these apps are creepy or not? Mozilla has done a review of the privacy policies of all these femtech apps. It's on their website, and it's pretty funny: they have ranked the apps by level of creepiness, from least to worst. You should absolutely check it out; it's very interesting. It is quite an old review of these apps by now, though; there have hardly been any surveys since 2023. Question at the back.

>> Do you think there's an appetite for people to pay for these apps?
>> I think so, yes. Some people rely a lot on these apps, usually for fertility tracking. The way such apps advertise themselves is very funny, but they do make sure you fall for it, with the AI integration and so on. Some of these apps have general functionality as well as a paid version: without the paid version you can only log your period cycle dates and nothing else. So yes, I guess people do pay for it.
>> It feels almost wrong, doesn't it, to pay a company to keep your data safer. You'd think they should be keeping your data secure from the start, not "oh, if you pay an extra 10 a month, we'll make sure we definitely don't sell your data". Anyway, sorry, that's different.
>> Sorry, I was going to ask the same thing. Essentially, everybody says if the product is free, you're the product. So would you suggest premium versions of femtech apps are safer, or is that a false sense of security?
>> Well, if you're asking whether the premium versions are safer, I would say no, because I don't think they're doing anything differently; they are just making money out of it. There's this app called WomanLog, which has the most transparent privacy policy I've read, and they are a free app. So I think you can keep an app free for everyone and still do good security.
>> I was just going to say, free or premium versions of apps, they're all at risk of dark patterns.
>> Yeah, and developers and designers know that.
>> So I'd say just because you sign up for the premium version of something, it's probably exactly the same; you just pay to layer on whatever it is.
>> Yeah, it's not different.

And if there is a version where you're paying more on your phone subscription for a supposedly better, more private service, that's actually a red flag.
>> Just money.
>> Just money. Any other questions? Yeah, we've got another question there.
>> Do you think part of the problem with security is that society has become a little complacent about what's already out there, with social media companies in particular? We've given away so much information already that this is just another part of it, and the security is lacking because we think we've given 90% of it away anyway.
>> Absolutely. We are complacent now, with all the social media going on, and even with ChatGPT. I mean, it's so

easy: people don't Google nowadays, they just go to ChatGPT. AI is there to make things easier, maybe, but at what cost? Any other questions?
>> A quick two-parter. First of all, I've seen a lot of these talks, and this is the first one that really addressed the threat model we need, so that's a great angle to take; thank you for doing that. But secondly, what are your suggestions for how we fight this?

I mean, a few of us are going to send nastygrams to the companies, and okay, they'll give you a month's payback, or what you paid for the app, or something like that. What are we really going to do about it?
>> To fight back, I would say we need to talk to people at board level at these apps; if they change some things, that might bring a bit of change. But there are so many of these apps now, and the problem is that they lack resources: they are small teams, and they do compliance just as a checkbox. That needs to change; the way we think needs to change. Even I don't have a solution for how we are going to change it.

But in terms of, like I said, incorrect labeling: if Google did something about it when these apps are put on the Play Store, or if we had a specific regulation just for publishing such apps to the public, that might help as well. In terms of individual responsibility, I guess: please don't go for premium versions, don't rely on AI, and exercise your GDPR rights. That's the least we can do. One more question?
>> More questions? Oh, we do. Well, we have a winner.
>> Perhaps the problem is diversity in the development teams. As a guy, I'm not impacted by that, and you had the

courage to put them up on the list there. Wow, you have given up a lot.
>> Well, speaking about diversity, there is so much to say, but yes, there is a lack of diversity in development teams. Even in cyber security there is such a gap, and we need more women in leadership positions, at least in femtech, I would say. And yes, I have seen that even men are not very open to hearing about these problems; I'm very glad that all of you are. But it's not just you: your partners are involved, your family is involved, even you are involved in a way. And yes, we need to take it seriously.

>> Thank you very much.
>> Thank you so much.