
Anonymity or Security: Encrypted Messaging Apps in Modern Espionage and Counterintelligence

BSides Tampa · 36:45 · Published 2025-07
About this talk
By Bidemi Ologunde

In recent years, encrypted messaging applications such as Signal and Telegram have become integral tools for secure communication, valued for their ability to protect user privacy through robust end-to-end encryption. However, these platforms have also been exploited for illicit activities, including espionage and unauthorized disclosure of classified information. Notably, the case of Jack Teixeira, a former Air National Guardsman, underscores the dual-edged nature of these technologies. Teixeira was sentenced to 15 years in prison for leaking highly classified U.S. military intelligence on social media platforms, highlighting significant concerns about access controls and the misuse of secure communication channels.

This presentation aims to explore the complex intersection between the preservation of anonymity in encrypted messaging apps and the imperatives of national security. We will examine how these platforms are utilized in counterintelligence and espionage operations, assessing both their benefits and the challenges they pose to law enforcement and intelligence agencies. The discussion will include an analysis of recent advisories from U.S. officials urging the use of encrypted apps following significant cyber espionage campaigns attributed to foreign actors, such as the Chinese-linked "Salt Typhoon" operation.

I'll use this presentation to provide a nuanced understanding of how to balance the right to privacy with safeguarding national security. Attendees will gain insights into the strategies employed by cybercrime and intelligence professionals to navigate the challenges posed by encrypted communications, fostering a more informed approach to the use and regulation of these technologies.
Transcript

Thank you so much, Sam. I've been privileged to witness Sam's career. She got into cybersecurity two, three years ago, and I want to say I played a minor part in her journey into cybersecurity, but she's going to disagree with me and say maybe I played a much bigger role. But anyways, I'm proud to be part of your cybersecurity story, and it's an honor for you to do the intro. So, thank you so much. Welcome, everyone. Thanks for coming to this presentation. My name is Bidemi, or Bid; it's just shorter and easier. I have a background in cyber threat intelligence, and I have some research

and investigations background as well. I'm originally from Nigeria. I came to the US for grad school here at USF; that was in December 2012. I was able to get my master's and PhD in electrical engineering. My research in electrical engineering was on military network security, and that exposed me to cybersecurity and IT security. I got my A+ and Network+ because of that research in academia, and then I got my first job at Verizon. I have one of my senior managers from Verizon here, William. I learned quite a lot from him, actually, including that attention to detail a cybersecurity analyst needs. And while at

Verizon, I realized I like research and investigations. From Verizon, I got into Raymond James, working on cryptocurrency cases and all types of fraud investigations; again, that research and investigation skill set I tend to naturally have. I started acquiring all these skills and leveraging all the knowledge I was able to gain at those jobs. From Raymond James, I got a job at Expedia working on threat intel while also doing some trafficking investigations. Threat actors will abuse a platform any chance they get, and Expedia is one of those platforms. They would use fake IDs to book flights, hotels, and rental cars to move victims around and stay under the radar. So, I was privileged

to do those types of investigations. Now, I work back in the financial industry, working on insider threat cases; again, research, investigations, threat intelligence. On my weekends and days off, I work on AMBER Alert investigations, doing some quick OSINT to find out: this person is missing, where could they be, where could they have gone, who are their family and relatives, and so on. I would partner with police departments; almost every police department in the country has a website where they list how the community can help them solve cases. Just type any random city in the US plus "police department" or "sheriff's department" and you'll see a website. On that website, there's a section for

community engagement. I would go on those websites and see the cases they need help with, to build my own OSINT skills. And over the years, I realized different communities in the US have different needs: trafficking and fentanyl cases are common among the border communities, for both Canada and Mexico. Florida is known for trafficking; all ages, actually, are being trafficked in Florida, and so on. All of this is to say that I'm here in front of you speaking about anonymity or security: the use of encrypted messaging apps in espionage, counterintelligence, cybercrime, etc. And I just laid out why I'm qualified to present on this topic. So,

prior to March of this year, Signalgate happened, and we heard about all the people in Washington, DC who used Signal to communicate about war plans, and a journalist got added to that Signal chat. You would imagine these people should know better than to use that type of messaging app for that sensitive type of planning. But then, to their defense, not that I'm defending them: Salt Typhoon, the Chinese threat actors, went after Americans' text message data and call records, and they've been stealing that since spring 2024, which led the FBI to come out with an emergency advisory saying calls and text messages are no longer secure because China is stealing

everybody's information; therefore, everyone should use encrypted messaging apps. So that happened last year. The politicians doing all that stuff in Washington, DC used that as their justification: we have these sensitive communications about planning for something going on in the Middle East, we cannot use calls and texts, so we use Signal. Well, there are other things you could use, like secure communication systems built specifically for DoD employees, but that's another talk for another day. So, what I'm going to talk about today is basically why you're here: encrypted messaging apps in communication systems, espionage, counterintelligence, enterprise crisis communications, and so on. I'm going to be presenting some threat models. What are the types of insider threats you

might face in your companies if you decide to use an encrypted messaging app for off-channel communications? Maybe you face a ransomware attack. Maybe there is some emergency crisis you have to deal with within your organization. So you say, you know what, we're not going to use Teams or Slack or Google Chat; we're going to use this encrypted messaging app because we want it to be more secure. We don't know if a threat actor is on our network, so we're going to migrate to this off-channel encrypted app. That is all well and good, but there are risks inherent in that move, in that business continuity plan you have in mind with good intentions. I'm going to talk about how criminals and

APTs use encrypted apps. I'm going to talk about the lawful and covert use by intelligence agencies. There are intelligence operatives in Ukraine who use Signal to do artillery corrections. There are law enforcement agencies in the US who seed cutouts on encrypted messaging apps; I'm going to give some definitions later on. I'm going to look at the legal and policy landscape. There is something called the Espionage Act; there is the Computer Fraud and Abuse Act in the US; there are tons of GDPR-type regulations in Europe, and so on. I'm going to talk about Apple and the FBI, the case that happened in 2016, as a sort of precedent for these types of conversations. I'm

going to give some thoughts on how to balance anonymity and security based on four key principles. You give someone access only if that person has a need to know that information. The example I like to use: the president of the US has the highest security clearance, but he or she does not have access to every possible piece of information. Just as some company CEOs have, in quotes, access to information on everything going on in the company, but you don't give them all that information, because it's going to be too much and they actually don't need it for anything. A CEO only needs to make a decision to approve or not approve something. They don't need to know the

granular, grassroots detail of the process of reaching that approve-or-not-approve decision. Just like Obama did not have the detail of how bin Laden was found and then killed, and so on; he only got the question, do we need to move in or not, on that night in May 2011. So, all of this is to say that there are four key principles. You need to know how to minimize metadata, so you don't just collect every possible thing. There's no need to collect all the timestamps on an email for login purposes; you're going to end up spending so much more with your cloud service provider, and so on and so forth. You need to

know how to detect insiders, and then, of course, zero trust communications. And then I'm going to present a five-step actionable framework for how to choose an encrypted app that fits your needs. Not everyone needs to use Signal. Not everyone needs to use Telegram or Matrix or Element or Threema and all the other different apps available. And then some resources, and then questions and answers. So, why do insiders and spies love encrypted chat right now? I mentioned Salt Typhoon, the Chinese threat actors that have been coming after Americans' call and text records since 2024; that's the one we know of. In November of last year, there was this individual named Jack Teixeira, a 21-, 22-

year-old National Guardsman. He got 15 years in prison; I was telling Charles the other day why that sentence seems extravagant for that offense. He posted pictures and screenshots of classified military information on a private Discord channel. Not a public Discord channel: a private, invite-only Discord channel he was on with his friends, to get some clout, as kids do these days. He didn't think that information would leak out. I'm sure he'd done some training on how sensitive that information could be. So, in his twenty-something-year-old mind, he figured, okay, this is a private Discord channel; it's not going to leak beyond this Discord channel. But guess what? All it takes is a screenshot of a Discord channel

to make that encryption on Discord null and void. Say I'm using Signal to communicate on a sensitive case I'm working with some law enforcement agency in the US. If I take a second phone and take a picture of the phone where that Signal channel is running, that defeats the encryption. That is what happened in this young man's case, and he got 15 years in prison. He's currently in prison until 2024 plus 15; my math is not good, but you get the point. So, in other words, insiders believe they can go under the radar. DLP protocols would not detect what they're trying to do if they use Signal to take a picture of

their work device and send sensitive market information to a journalist, because they think they're a whistleblower, and no one is going to be any the wiser for it. Well, guess what: if that journalist ends up publishing something, there are ways and methods to figure out who took that picture. And that's what happened in this young man's case. Someone took a screenshot of a Discord channel, and then, with forensics and so on, all the gymnastics that the NSA can do, they were able to realize this picture was uploaded by this person. Of course, the news would not tell us how they figured out he was the one. But again, this is just a reminder: you can detect insiders,

because there are methods by which these people can be detected, and all it takes is the people, process, and technology. The technology is sound; he used an encrypted messaging platform. But the human, and the motivations of the human, are questionable. The process: maybe he shouldn't have been able to take a phone into the sensitive compartmented information facility, or maybe he could have been searched properly as he was leaving the base every day, to actually find out, okay, does he have what he's not supposed to have in his pocket, in his boots, in his hat, and so on. So people, process, and technology all need to be sound for us to be able to say that insiders are

not a problem for our organization. Mass adoption is another one. Signal is gaining popularity; Telegram now has 1 billion users as of two months ago. The Signal app itself is secure, but the team behind it is running on a lean security budget. So, as much as people are saying, okay, move to a secure encrypted app, the apps themselves, of course, are not going to say, okay, we're not secure; they want people to download them. But all it takes is probably some further digging if we're going to migrate our crisis communications to an app running on a lean security budget; $50 million is small for an encrypted app

in terms of Silicon Valley numbers. So these are the reasons why insiders, spies, and intelligence agencies are all migrating to encrypted chats: insiders think they can use them to get away with stealing information. You don't have to email stuff to yourself anymore. You can just take a picture of your work screen, put that picture on an encrypted app, in quotes, and then send it to whoever you want. That is happening every day; it's probably happening right now as I'm giving this presentation. So, insiders are realizing they don't need to email stuff to their personal Gmail, because it's going to get flagged and they're going to get a question from HR and their manager.

They're realizing they don't even need to upload something to the drafts in their email account, log in from somewhere else, and then download that draft so that they're technically not sending the email. They use encrypted apps for that process instead. And then nation-state threat actors are recruiting spies. The traditional ways of espionage are still being used; someone approaches you at a park and slips something beside you, like we see in the movies. But they also do all of that on encrypted messaging apps, because now they think the NSA cannot exactly see what's going on in an encrypted group chat, and they can just approach you, send you a few messages, do some drop-offs, and so on.

Next, I'm going to talk about who's really on the other end of your private channel. Similar to the previous slide: insiders and leakers, and the different motivations, money, ego, compromise, ideology, community clout, as in the case of the National Guardsman. So many different reasons why someone might want to steal the financial report you've not yet released to the press and leak it to your rival. So many reasons why someone might want to do some insider trading, because they know your stock is about to rise because someone is about to acquire a division or a component of your company, and they would send the information to their cousins, nieces, and uncles to buy a

bunch of shares so the money can somehow flow back to them. Nation-state threat actors, again, have all sorts of reasons. Some people claim that China is vacuuming up all these emails to train their AI, which they released sometime early this year or last year. Some people think they stole all those OPM records to be able to create fake identities for spies to then do traditional spy work with. Whatever their reason, these are the kinds of people that might be on the other end of your supposed whistleblowing operations. Criminals use encrypted apps as well, for selling malware, drugs, child sexual abuse material. They use them because they figure, well, law

enforcement cannot come after us if we do all these things on Telegram. On group size: Telegram has a supergroup size of 200,000 people; Signal says, we limit ours to 1,000 members because of privacy and usability. On top of that, like I mentioned earlier, Signal runs on a small operating budget of about $50 million. That makes them liable to incidents like we saw with Signalgate, where someone could potentially add someone else to a closed group. Maybe there was a vulnerability they didn't even tell us about in the news, but that is happening on Signal; maybe because they have a lean operating budget, who knows. And then intelligence agencies actually use Telegram, Signal,

and so on. Like I mentioned earlier, they seed cutouts. So what's a cutout? Cutouts are people who have information that the CIA, FBI, and so on need. The intelligence officers would join specific groups and start a conversation in a way that elicits information from the people who have it. Those people don't know they're being encouraged to say something, because they think this other person is also a group member, since someone vouched for them. So all of that is happening on encrypted messaging apps. Before, like five, ten years ago, it would take traditional spy tradecraft to get that information: you had to approach someone at a party, in

a bar, on the street. Now it's easier. So those are the lawful, in quotes, uses of these encrypted messaging apps. Police officers, as far as I know, use these apps to find trafficking victims and trafficking operations, and actually make arrests. So there are good uses of these encrypted messaging apps as well as bad uses. And I mentioned Ukraine: there are these volunteer OSINT groups, people who know how to use geographic coordinates and all those geo apps, who actually make artillery corrections. Maybe the military miscalculated where a shell is going to land, and these people, civilians, would make the right calculations and post them on the encrypted apps for Ukrainian

military officers to then do it properly. So it's like farming out warfighting, basically. Now, some legal and policy landscape in the US. I mentioned how there are different laws over the years that address the use of encrypted messaging apps and the use of encryption as law enforcement carries out its job. There is the Espionage Act, which actually goes all the way back to 1917, then the Patriot Act after 9/11, and so on. And then the Computer Fraud and Abuse Act, which says if you use your computer to do anything illegal, this law will make sure you get the maximum possible sentence. Wire fraud, criminal stalking, whatever people use computers for to break the law, this is the law that basically

takes care of that. And then there is something called the EARN IT Act. The screenshot down there shows how lawmakers, senators and House members, are proposing this: the EARN IT Act says that if anyone is suspected or confirmed to be involved in cybercrime, crimes against children, domestic violence, and so on, law enforcement has the legal ability to look into their phones. Currently, that is not done unless there is a subpoena from a judge, unless there is a warrant, like a search and seizure warrant, and so on. But now this EARN IT Act is saying that is going to be possible without a warrant, without a subpoena from a judge. But everyone, including

privacy practitioners, is saying you can't do that. Because, I'm a parent of young boys. If my child has a diaper rash and it's the middle of the night, the weekend, the pediatrician is not open, I should be able to take a picture of the diaper rash, send it to my child's doctor, and have them tell us, okay, use this ointment, use this cream, etc. That picture technically looks like child sexual abuse material, but it's coming from a parent concerned about their child's diaper rash, going to the pediatrician. If this law passes, it's going to make me a criminal, unless someone looks at, okay, what is the context of this picture, and so on. That is why there is

pushback on that law. There are so many things that cross that line, or sit right on that line, of child sexual abuse material: my son in kindergarten has a rash on his arm, which could look like someone tied up a kid and the kid is reacting to whatever was used to tie him up. Again, these pictures can be right on the line between child sexual abuse material and a concerned parent sending a picture to a doctor to figure out what's going on with their child. So again, there is heavy pushback on the EARN IT Act: how about we keep what exists, and unless someone is involved in criminal activity,

then a judge would issue a subpoena and their phone can be searched, not just searching everyone's phone 24/7. No. And for allied democracies in the UK and the EU, there are similar laws that address this privacy-versus-security kind of conundrum. The precedent I mentioned earlier happened in 2016. In 2015, there was a terror attack in San Bernardino, California. A couple with ties to the Middle East conducted a terror attack; they planned everything from start to finish on an iPhone. The FBI approached Apple saying, we need you to unlock this phone because it's part of a terror investigation. They came with the Patriot Act; they came with all types of laws that

existed in 2015. Apple said no, we cannot do that, because it's against privacy. Apple quoted privacy laws back to the FBI; the FBI quoted terrorism laws to Apple. Eventually, Apple did not unlock the phone, but the FBI was able to get an Israeli company to unlock it. So, Apple refused in order to be able to say: look, we did not set a precedent for the FBI to be able to get into anyone's iPhone or MacBook, even if that person is a suspect in a terror investigation. And then the emerging doctrines, based on these six, seven years of privacy versus security, are now saying, and the EARN IT Act on that screenshot is leading to this

case, that there should be exceptional access without backdoors. So the law enforcement agencies are coming back to the tech companies saying, okay, how about you create backdoors, so that if anyone is suspected of a crime, we can get into these devices through those backdoors. And of course Apple and Google and all of them are saying nope: if there is a backdoor, a threat actor or an APT group will abuse that backdoor, as threat actors do. Even when there are no backdoors, they already use vulnerabilities. So when you create a legal backdoor, you're just making their jobs easier. A cartel in Mexico would pay top dollar to find a way into

that backdoor; a uniformed officer in Russia, China, Iran, or North Korea would find a way into those backdoors if they legally exist. So, how about we just maintain the vulnerability and bug bounty system, where, okay, you see something, maybe you don't disclose it, you milk it for as long as possible, and then a white hat hacker comes out with a patch and everybody patches, just like we've been doing for the past twenty-some years. And then another doctrine that is emerging is client-side scanning, again coming out of this EARN IT Act. The client is like me and you, our phones: you should be able to scan, on the client side, what is going on in this

person's phone, because they sent a picture of their baby's diaper rash to their pediatrician. No, do not do that, unless I'm a suspect in some criminal investigation and a judge comes to me with a warrant. Just like a police officer cannot just enter your house and say, we want to do a search. No: where is your search warrant? Before you can come into my house, don't even step on my driveway. And then key escrow is basically saying, you have a private key on your device, your phone maker has a public key, and how about you both put those keys in escrow, and the law enforcement agency has access to that escrow. No, I'm not putting my

private key in any escrow because you claim the escrow is going to be secure. A Chinese APT group is just waiting to breach that escrow, and now they have access to everyone's phones on the USF campus on May 17, 2025. No, I'm not going to allow you to do that. So, how do we design chat that is private and defensible? I mentioned these four principles: need to know, metadata minimization, insider detection, and zero trust communications. That's the way my brain works: I'm going to tell you what I know, why I know what I know, and what benefit this information is going to have for you. Call it how I was trained in my intelligence agency; call

it how William taught me at Verizon, who knows. The first principle is need to know. You don't tell someone information unless they need to know that information. I tell my son, put on your socks, it's time to go to school, and he asks me why. What do you mean, why? It's time to put your socks on, and then I'm going to do drop-off at school. What are you talking about? He doesn't need to know why he needs to put the socks on. All he needs to know is: you need to put your socks on in two minutes, or we're going to have a problem. Anyway, the defensive technique that this maps to is attribute-based access

control. Because someone is the CEO of a company, they don't need to have access to everyone's password in the organization. Technically, they could, but do they need to? No. The domain controller should be the one to have that access, and even then, nobody should just go looking into people's passwords all the time, unless someone is breached: a phishing email comes in, and then we say, okay, I've reset your password for you, call me and I'll walk you through the steps to change it. The CFO, for example, doesn't need to have access to everyone's W-2 in the company. They could, technically, but do they need that access? No.
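As an editorial illustration (not from the talk), here is a minimal sketch of that need-to-know check as attribute-based access control; the roles, departments, and resource names are hypothetical:

```python
# Minimal ABAC sketch: access is granted on matching attributes
# (need to know), not on rank. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Subject:
    name: str
    role: str
    department: str

@dataclass
class Resource:
    name: str
    required: dict = field(default_factory=dict)  # attributes needed to read

def can_read(subject: Subject, resource: Resource) -> bool:
    # Every required attribute must match; seniority alone grants nothing.
    return all(getattr(subject, attr, None) == value
               for attr, value in resource.required.items())

w2_records = Resource("employee-w2s",
                      required={"role": "payroll-analyst", "department": "finance"})
ceo = Subject("ceo", role="chief-executive", department="executive")
payroll = Subject("payroll", role="payroll-analyst", department="finance")

assert not can_read(ceo, w2_records)   # highest rank, but no need to know
assert can_read(payroll, w2_records)   # matching attributes, access granted
```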

The privacy guardrail for that is the ability to minimize logging scope, so I don't end up collecting logs for everything and anything on the network. I only collect logs for the things I need access to: for the things the assistant controller in the CFO's organization needs access to, for the things the email security analyst needs access to, and so on. Next, metadata minimization. This is basically saying: if you take a picture on your phone, you don't need to collect all the geolocation data and the IP address of the device, because I'm going to be sending this picture to my auntie in Chicago and now she's going to see the location. No. All of that is just

exposing your threat footprint, because, again, transparent audit programs would check: are you minimizing as much metadata as possible? So that if, God forbid, you get breached, you don't have tons of metadata sitting in your organization that you don't actually need; you're building this chat application to be able to communicate during a crisis, and yet you're collecting every device location and IP address. At the very least, it shouldn't be collected on a system where everyone can see it; maybe only the admins should be able to see all that data. Otherwise, if your communication system gets breached, now everyone is seeing this person's private key and that person's public key, and the timestamps, and all the different metadata, and the pictures, and so on.
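A minimal sketch of that kind of metadata minimization, assuming Python with the Pillow imaging library (editorial example; the file names are hypothetical). It re-saves only the pixels, so EXIF data like GPS coordinates never travels with the picture:

```python
# Strip EXIF metadata (GPS, device model, timestamps) from a photo
# before it leaves the device. Requires: pip install Pillow
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)   # blank image, no EXIF attached
        clean.putdata(list(img.getdata()))      # copy pixel data only
        clean.save(dst_path)                    # saved without the EXIF block

strip_metadata("photo_for_auntie.jpg", "photo_for_auntie_clean.jpg")
```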

So the defensive techniques that make this possible are basically Tor bridges, VPNs, DNS over HTTPS, and OMEMO. OMEMO is basically a multi-client end-to-end encryption method. So if you are designing a chat system for your enterprise communications, most people buy a chat system, Slack, Microsoft Teams, which claim to have this built in, but just in case you want to have a backup for crisis communications and your developers are thinking about how to go about this, that's what this table is referring to. Have a system where, if multiple people are sending messages to multiple people, that system is encrypted end to end. That's what that

OMEMO is basically describing: a multicast key exchange, so that even if someone breaches your internal communications app, they might be able to see a point-in-time message, but they won't be able to read the previous messages that led to that point, or the future messages after it. That's what that communication protocol is describing. And of course, DNS over HTTPS; that's table stakes at this point. There shouldn't be any domain or URL in your organization that's plain HTTP, whether you're an e-commerce shop or whatever your business is, and then DNS over HTTPS on top, like a double-protection kind of deal.
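To make that point-in-time property concrete, here is a toy hash ratchet in Python. This is only an editorial sketch of the forward-secrecy idea; real protocols like OMEMO and Signal's Double Ratchet also do Diffie-Hellman ratcheting, authenticated encryption, and much more:

```python
# Toy hash ratchet: each message gets a fresh key, and the chain key
# advances through a one-way hash. Stealing today's chain key does not
# let an attacker recover the keys used for earlier messages.
import hashlib

def ratchet(chain_key: bytes) -> tuple[bytes, bytes]:
    message_key = hashlib.sha256(chain_key + b"\x01").digest()
    next_chain_key = hashlib.sha256(chain_key + b"\x02").digest()
    return message_key, next_chain_key

chain = b"initial shared secret from the key exchange"  # hypothetical value
for i in range(3):
    msg_key, chain = ratchet(chain)
    print(f"message {i}: key {msg_key.hex()[:16]}...")
# Whoever breaches the app now only holds the *current* chain key;
# messages 0-2 stay unreadable, because SHA-256 cannot be run backwards.
```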

And then the ability to detect insiders within your custom-built communications platform is crucial; it's necessary, like using HTTPS. What that looks like is user and entity behavior analytics. Is someone logging into this chat platform at 2:00 a.m. on a weekend? Do I see them sending a bunch of messages to an unknown client on this application? So, UEBA plus natural language processing: if they take a picture and they're exporting that picture, there should be language processing run on images, just like online banking apps run it on checks. You deposit a check, you take a picture, the check says $200, but you type $2,000; it's going to flag it. How is it flagging it? Because it can actually detect what's written on that

check; that is the language processing being done. So if someone is taking a picture of their W-2, the UEBA would flag it: okay, this person and this image, the names match, they're doing it because it's tax season, they want to use it to file their taxes, no problem. If this same person is taking the W-2 of another employee, a senior employee or any other employee not related to them, maybe some tax fraud is going on, then we're going to flag it. That's UEBA plus NLP on outbound content previews. Content previews means you're not reading the entire message or the entire image; you're just reading a preview of

key fields, PII, name, Social Security number, credit card information, and so on, to find out: is this person exfiltrating a list of credit card numbers? Is this person exfiltrating sensitive market information that is non-public? Is this person exfiltrating a W-2 that does not belong to them? Is this person exfiltrating sensitive communications during a crisis to an email address that belongs to the Wall Street Journal, the Tampa Bay Times, and so on and so forth? All of that would be captured under UEBA combined with natural language processing. And the privacy guardrail for that is basically zero-knowledge analytics: I'm collecting all these analytics, but I don't actually see every single thing, because of privacy considerations.
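A minimal sketch of that outbound content preview idea, assuming Python; the patterns and the preview length are illustrative, and a real deployment would feed these hits into the UEBA risk scoring rather than print them:

```python
# Scan only a short preview of an outbound message for high-risk
# fields (content previews), instead of reading the whole message.
import re

PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def flag_outbound(message: str, preview_len: int = 512) -> list[str]:
    preview = message[:preview_len]   # preview only, never the full content
    return [name for name, pattern in PII_PATTERNS.items()
            if pattern.search(preview)]

hits = flag_outbound("W-2 attached. SSN 123-45-6789, card 4111 1111 1111 1111")
print(hits)   # ['ssn', 'credit_card'] -> raise this user's UEBA risk score
```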

And the last principle is zero trust communications. Do not trust anyone just because they're using this app, because it could be that someone hacked the app and is running BEC fraud, asking you to wire some money because you think it's the CFO, like in all the AI presentations we've listened to this morning. So, the defensive techniques here look like reproducible builds: if your developers are building something for you, this is basically deterministic compilation. Have someone external to your company verify it, like a bug bounty kind of deal; have a senior developer look over what this developer did, and vice versa. And mandatory code signing for mobile clients: if an app is going

to live on a mobile device that doesn't necessarily have the EDR component of your work machines, there has to be mandatory code signing. Do not just plug in apps and software packages from GitHub; have a more stringent way of saying, this app is internal. A lot of financial companies develop apps internally, so this should be table stakes: mandatory code signing, reproducible builds, etc. And what the privacy guardrails look like is open-source verification and bug bounties.
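As a rough sketch of how a reproducible build gets verified in practice (editorial example; the paths are hypothetical): two independent builders compile the same tagged source, and you only code-sign when the artifacts hash identically:

```python
# Compare SHA-256 digests of two independently produced builds.
# With deterministic compilation, matching hashes mean nobody's
# toolchain slipped anything into the artifact.
import hashlib

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

internal = sha256_of("build/internal/app-release.apk")
external = sha256_of("build/external/app-release.apk")  # reviewer's build

if internal == external:
    print("Builds match; safe to code-sign and publish:", internal)
else:
    raise SystemExit("Build mismatch: do not sign; investigate the toolchain.")
```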

So, the framework I mentioned earlier: how do we pick the right channel for mission-critical communications? Let's say you don't want to go with Slack, Signal, etc., and you want to build something in your organization, or you even want to go with SMS, Signal, or Telegram. What are those five things you need to take into consideration? First: how do you classify the conversation? Is this conversation public? Some workplaces have categories you can select before you send an email: public, because you're sending it to a client; internal only, because it's an internal-type email; sensitive, because you're dealing with HR and legal and privacy, etc. So that's one way to classify the conversation.
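A minimal sketch of what that classification step can look like in code (editorial example; the labels mirror the talk, the channel routing is hypothetical):

```python
# Classify a conversation, then allow only channels appropriate
# to that classification before anything is sent.
from enum import Enum

class Classification(Enum):
    PUBLIC = "public"        # going to a client or the press
    INTERNAL = "internal"    # internal-only email or chat
    SENSITIVE = "sensitive"  # HR, legal, privacy, crisis comms

ALLOWED_CHANNELS = {
    Classification.PUBLIC: {"email", "sms"},
    Classification.INTERNAL: {"teams", "slack"},
    Classification.SENSITIVE: {"e2ee-crisis-app"},  # the off-channel backup
}

def channel_allowed(label: Classification, channel: str) -> bool:
    return channel in ALLOWED_CHANNELS[label]

assert not channel_allowed(Classification.SENSITIVE, "slack")
assert channel_allowed(Classification.SENSITIVE, "e2ee-crisis-app")
```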

If you're a small business or a startup, you can basically say, okay, whatever the publication