
Um, so this is Mario Almeida and Justin Steven. Their talk is Sleepless Strings: Template Injection in Insomnia. So, big round of applause.
>> Thank you, everyone. My name is Justin. I'm the head of research at Tanto Security, and I'm joined by Mario, one of our co-founders and our technical director. We've got a story that we're really excited to share with you today. We call it Sleepless Strings, and it's a story of a template injection vulnerability in a client-side application. Before we get into it, a little bit of housekeeping. We want to talk you through the lineage of our communication with the vendor on this one, and as an illustrative tool we're going to be using these little chat bubbles. These aren't copy-pastes of the email conversation we had; they're just an illustrative tool to show you the timeline and the cadence of the journey we went through with the vendor, to see these bugs be patched, or maybe not be patched. Before we get going, we'd like to level set: what are templates, and how do we at Tanto Security think about templates? A template is like a document with placeholders in it. You can think of these as gaps. Different templating engines have different syntaxes; some of them use these little curly braces, but these are like little holes where data can go. A developer can write a template once and then slot data into it multiple times. The template is fed into a templating engine, and alongside it are some parameters: the values to fill in those gaps. You turn the handle on the engine, and out pops a rendered document with those little placeholders filled with the values. These are often used in web technologies: the developer writes an HTML document with these little gaps in it and then reuses it multiple times, for different users, to fill in some dynamic data. And what's neat about template rendering is that it's a way for data to be safely interpolated into a document. It's intended for attacker-controlled or untrustworthy data to go into the parameters side. In fact, a lot of templating engines will automatically HTML-encode stuff for you, so it's a good way to get XSS protection if you're building HTML documents from dynamic data. This automatic HTML encoding can be turned off if you're not using it to build HTML; it's a handy feature to have. For the rest of this talk we're going to put HTML to the side, because the app we're talking about isn't building HTML. So you can forget we said this, but it's a bit of background on how templates are often used. What's important is that it's generally considered safe to feed untrustworthy data into the parameters side only. Untrustworthy data going into the template side can have interesting consequences. This was really brought to prominence in 2015 by James Kettle from PortSwigger. I think it had been discussed prior to that, but James really brought it into the limelight with a bug class called SSTI, or server-side template injection, because when an attacker can control the template and bring some squiggly boys to the party, some more interesting things can happen. These template engines are quite powerful technologies. Here's a little collage we've built of various syntaxes for various templating engines, to show you the complexity and the power and the grunt behind some of these engines, and what the squigglies can do. Not all templating engines are made equal. A lot of them are really, really powerful. And these squiggly braces, or
whatever the syntax is, can do some interesting things. Some of them aren't so sophisticated, and it may be considered safe to allow an attacker to control the template, but as a general rule it should be considered unsafe for an attacker to control the template. If they can, then weird and wonderful things may happen. And speaking of wonderful, I'd like to hand over to Mario.
>> Thank you, Justin. All right, guys, now let's talk about the tool that actually brought us here today. Cool. So, what's Insomnia? It's not the problem that keeps us awake at night; it's actually an API client developed by Kong Inc. The main goal of this tool is that users can create example requests. Developers take a Swagger file, OpenAPI documentation, or a WSDL, and create example requests for their API so they can quickly test it: send the request to the API and grab the response. It's built in Electron, in this case with Node.js integration enabled, and most of the tools that do the same thing, like Postman for instance, are also built in Electron. Insomnia is also a great alternative to Postman itself. I think most people are familiar with Postman, because it's probably the most well-known tool that does this, but I personally decided to switch, since they kind of moved away from, let's say, a business model where you could load local scratch pads. That's, in my opinion, a very important feature for us as pen testers, because when a client sends us a collection file like that, it's probably full of credentials, right? Sometimes it's just the credentials for the development environment, but on some occasions, when they ask us to test in the production environment, if they haven't configured, for instance, an API to fetch authentication material, it will be in those files. And from our perspective, at least my perspective, synchronizing this kind of information to a cloud environment that doesn't belong to the client but to you yourself: if anything bad happens to your account and those scratch pads end up leaking, it'd be pretty bad. So the idea of having a local scratch pad, for me, is essential, and when I looked for tools I ended up finding Insomnia. I really liked it; straight away it had most of the functionality I was using with Postman itself, and yeah,
that was the main reason I decided to use this tool. Cool. Now I'm going to explain how I do my API pen-testing workflow, how most of us at Tanto do it. It's pretty common, I guess; everyone does it the same way. So here I have Insomnia, where we load the API collection we ask the client for. That's one of the files we ask for when we're talking with them about their APIs: besides all the documentation they can give us, we usually ask for this collection file, because it's an easy way for us to load it inside our API client and start testing those parameters straight away. On top of that, I like to use Burp Suite as a proxy. Why? Because Insomnia is great for having all those API requests there, but if I want to hit a specific parameter, for, I don't know, IDOR testing or input-validation vulnerabilities, Burp Suite gives me that power with Intruder, right? Or if I need to manipulate the headers of the request in a better, quicker way, Repeater can do that for me as well. And not only that: when we take evidence, we like to have the request and response in the same screenshot, so the client can see what's going on in the request. So yeah, that's the way I set up my environment, and of course Burp Suite is going to be the client for the target API. Cool. Now that we understand that, let me tell you guys a little story: how I found this vulnerability, right? So here's me one day doing a penetration test of one of our clients' APIs, and I remember the API had one piece of functionality that took some parameters and returned some sort of built HTML in the JSON response.
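To make the template-versus-parameters split from earlier concrete, here's a toy sketch of a substitution-only renderer. This is not Insomnia's code, and it's far simpler than a real engine like Nunjucks; the `render` function and its placeholder syntax are our own illustration. It only does literal placeholder replacement, which is exactly why untrusted data on the parameter side stays inert here.

```javascript
// Toy renderer: substitutes {{name}} placeholders with parameter values.
// A real engine like Nunjucks is far more powerful; this one does
// straight text substitution only, so the parameter side is safe.
function render(template, params) {
  // Replace each {{name}} placeholder with the matching parameter value.
  // Unknown placeholders are left untouched.
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match, name) =>
    name in params ? String(params[name]) : match
  );
}

// Untrustworthy data on the PARAMETER side is just inert text:
const safe = render("Hello {{user}}", { user: "{{7*7}}" });
// safe === "Hello {{7*7}}" -- the payload is never evaluated.
```

An engine that actually evaluates expressions inside the braces, as Nunjucks does, would instead turn a template of `{{7*7}}` into `49`, which is exactly the probe behaviour that comes up next in the story.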
So I immediately thought, okay, what can I test here? Cross-site scripting, of course, and also template injection, right? So I went and sent the most infamous payload for testing template injection, which is pretty much curly braces, curly braces, 7*7: a simple mathematical operation that should return 49 if it gets resolved. In my mind, when I clicked send from Insomnia, it was sending that payload to Burp Suite, Burp Suite was passing it through to the API, and it would come back. And to my surprise, the response actually came back as 49. Instant celebration. Nice, that's it, SSTI. Now let's take some evidence and try to exploit further, right? However, when I tabbed in to take my screenshot, I noticed something very odd. In the request there, instead of the payload I'd put in, I actually had 49 directly there. And that was the reason I received 49 as a response from the API. And when I used Burp Suite directly to send my payload to the application, the application was actually not vulnerable; it was returning my string back as-is. And then I'm like, okay, what's going on here? So I tabbed back to Insomnia and opened the request I was manipulating. And as you guys can see here, the place where I put the payload: it had become automatically handled in the application's UI as something they call an environment variable. Environment variables are used for multiple things, but their principal use is to configure, for instance, credentials or anything that's used across multiple requests. Then when I clicked on the 7*7, I understood that it was actually being parsed by some sort of template engine inside Insomnia itself. And I found this a little bit odd, because it was just after I'd changed tools, and in Postman that does not happen: inside Postman, the curly-brace thing is just used to map a string to another string, and that's it. Here it was actually being evaluated. And since it was an environment variable, I tabbed back to Manage Environments, which is where you configure, let's say, the credentials you want to use across requests. Here I have some examples of fake data, and I noticed something very interesting: at the bottom, this page says environment data can be used for Nunjucks templating in requests. And I'm like, oh, I'm not a JavaScript developer, and it was the first time I'd actually heard of this template engine, right? So, okay, is this even safe? Then I started researching, and when I went to the Nunjucks documentation, the first thing they have, right at the top, is a user-defined templates warning: Nunjucks does not sandbox execution. It is not safe to run user-defined templates or inject user-defined content into template definitions. And funny enough, that's exactly what Insomnia is doing as functionality, by default, right? And that was pretty sus. However, like I told you guys, I was in the middle of a penetration test, so I needed to finish the job I was doing, as much as I love finding these kinds of unexpected bugs in tools that I was using
and that would probably deviate my attention, and it was completely out of scope, and I needed to deliver this job. So I took notes of everything I'd found so far, put it in my research notebook, and that was it. A few weeks later my next research opportunity came up, and I went back to exactly the point where I'd stopped. Okay: Nunjucks, after I started analyzing it a little more, is a very powerful template engine that's pretty similar to Jinja2; they actually have a lot of similarities in functionality. One of the most powerful features in Jinja2 is the ability to manipulate strings using filters, and Nunjucks has the same thing. Another thing they have in common is the potential ability to evaluate arbitrary code: in Jinja's case Python, in this case JavaScript. Cool. So I did a quick Google: hey, are there any potential payloads to get code execution using SSTI in Nunjucks? And there are multiple ways; this one is the simplest one I found. I don't know what Kong was thinking when they decided to implement this functionality, but if they'd done the same search, they'd probably have found that massive red flag. However, they presumably thought it was fine. So I just copied this payload inside Insomnia to see if it would work, and first try, there it was: code execution, easy as. To better visualize it, I slightly modified the payload: I added an alert, so you can see the command-execution popup. Cool. Now I have the payload; I just need to convince someone: hey, can you copy that inside your Insomnia client? And that's it, code execution on the machine, right? But that would probably be an obvious phish; it's not going to work. So I needed a better way, and the obvious way would be to use the import of those collection files we ask people for. Imagine, I don't know, North Korea finding something like that: instead of convincing devs to download and execute things on their machines, you just need to generate one of the collection files, put your payload in there, get them to load it inside Insomnia and tab to the request that has the payload, and you get code execution. So how would that work in practice? Pretty much like this: just go to the Scratch Pad, import, load your malicious file. It does a quick scan of the contents and that's it: tab to the request, get code execution. Easy as. Nice. So we had a very cool proof of concept and a plausible pretext to convince someone to download our malicious file and open it. However, the payload is still there
in the file and can be spotted, right? But hopefully that would be enough, and at that point I discussed it with Justin and we thought, okay, time to report it to them and see what they say. Cool. So, the first thing we do when we're trying to report vulnerabilities to vendors is look directly in the GitHub of the product we're looking for vulnerabilities in, and go to the Security tab to report through GitHub. Usually they handle all the CVE parts and everything there, but Kong is a CNA themselves, and they don't like to receive reports this way. The way they ask is for you to send them an email. So we visited their web page, and we were surprised: at the time we reported these vulnerabilities, they had a self-managed bug bounty program, meaning they offered up to $1,000 USD in gift cards for critical bugs, and the Insomnia desktop client was inside the scope of this bug bounty program. For researchers that's a pretty cool bonus, right? On top of wanting to report the vulnerabilities you've found, seeing that someone actually offers a bug bounty is a great indicator that they're keen to work with researchers, and not only keen to work with them but also to compensate them financially, somehow, for their efforts in reporting vulnerabilities. So it's pretty cool. Um, yeah, so remember, these chat bubbles are just a reenactment of how the conversation went, not verbatim. We sent the first message: hi, we want to report a bug that leads to code execution when a user imports a collection file. We're writing a report and we'll send it to you soon. And by the way, we have a public vulnerability disclosure policy, and we publish details once it's fixed. Our vulnerability disclosure policy is pretty similar to what other companies like Google do; ours is heavily based on theirs. Once we send the details of the vulnerabilities to the vendor, we give them 90 days to come up with a viable patch, and after the patch is out, we also give 30 days before we start talking publicly about it. So that's it, and we told them our plan from the first message. And they said, "Oh, thank you very much, but we already know about this vulnerability, and duplicated issues aren't covered by our bug bounty program." So yeah, I guess everyone who has done bug bounties in their life has at least
seen this message at least once in their life, right? So yeah, cool. They said they already knew about it, so I talked with Justin and we did a more thorough search inside the GitHub, and it turns out someone had actually created a public issue and talked about this vulnerability. I don't know if you guys can see here: the payload they used is slightly different from the one we used, but the result is the same, right? And they also mentioned the potential for this vulnerability to be triggered when importing files. And when we messaged Kong, they actually said, oh, you can also trigger this vulnerability, I don't know, via cross-site scripting, or when you navigate to insomnia://, which is the URL handler they use to import files. So there were multiple ways to do it. Okay, then I talked with Justin, and Justin said, "Okay man, it'd be awesome if we could improve this exploit." Okay, let's try to do it. I didn't like the payload anyway, because it was right there in the file and could be easily spotted. So, can we do better than this? My mindset for finding that was: okay, we know this app handles templates everywhere, so let's start mapping where. Just to give you an idea, it's a lot of places, mate,
and the first one: as you can see here, pretty much all the fields there are template-handled, and Manage Environments as well, as I told you guys, and Manage Cookies too. When I went to Manage Cookies, I noticed I could actually create templates in there. In my mind, I think they allow that just because you could potentially create some sort of template that dynamically fetches some kind of cookie to put there, and it would be automatically filled in every time someone sends a request, right? But since the cookie functionality was template-handled, I thought, okay, maybe it works as a browser as well, meaning that if I receive a cookie in a Set-Cookie header, it will also set the cookie inside the application. To test my theory, I quickly wrote a Python script that does just the simple job of setting a cookie, and that's it. When I sent a request to that app using Insomnia and tabbed back to Manage Cookies, there was the cookie. So it was actually behaving like a browser, right? The next step was obvious: okay, can I set a remote template, and will it also be added to the app? That's what I did: I put the same payload there, with a slight modification so we wouldn't get any errors in the application interface, and when I sent the request, I got the popup. But here I was still navigating to Manage Cookies, right? At this point we were thinking: okay, do we need to navigate to Manage Cookies every time to get code execution? It turns out we didn't. Another behavior we noted, which was pretty interesting: how it works, pretty much, is that Insomnia sets the cookie in the cookie jar, and on every next request that's sent, the application automatically loops over every cookie, tries to evaluate them, and matches them against the domain of the request you're trying to send, and if they match, they just add the cookie
to the request and submit. So that meant that, instead of poisoning the cookie and navigating to Manage Cookies, I just needed to send two requests using Insomnia: one to the malicious server, and the next one to anywhere. Okay, so let's see that working. Pretty much, here we have a normal Insomnia. We send the first request, and if you guys notice, in the top left here, Manage Cookies shows a one: that means the cookie jar was poisoned. And on the next request, we get our code execution. It meant that every request after that would be code execution. Cool. So we thought, nice, we could simply put a connect-back shell there. But in case you have, let's say, a firewall and stuff and can't get a connect-back out: how about we use the cookie itself to leak the information? So we wrote a quick Python script that executes our command, grabs the result of the execution, and puts it in a cookie, and on the next request that information is sent to the server. It works pretty much like that: first request poisons, second request you get the results of the command, and we can keep doing that in a loop, right? So why use a C2 when we can use the cookies themselves? Nice. So we ended up with a very cool exploit this time, where the file we send to the user is completely clean. The only thing in there is a URL to a potentially malicious server. And not only that: since we control the server the payload is going to come from, we can increase the opsec. Basically, let's say, match on the User-Agent header to see if it's Insomnia, and if it is, we deliver our payload; if it's not, we completely ignore those requests, so we'll be in the clear. And from here, my friends, your imagination is the limit. You could even write a whole C2 implant if you wanted, using this idea. Nice. So,
it's time to tell Kong about our findings again. So: we've been working on this, we can now exploit it over HTTP via a malicious server, and how do you want to receive the report? And they say, "Okay, please send the details via email." And, on top of that: we've reconsidered what we said before, and we want to treat your findings as a new finding, because it seems like you guys are onto something new, and yeah, we're pretty keen to see your PoC. So we sent an email where we attached our PoCs and described how Insomnia could be exploited via the multiple ways we'd found, at least the ones we'd found, in the template handling, and we also said we think the cookie technique is pretty interesting, because it can be completely remote, right? And yeah, now I'm going to pass over to Justin, who's going to discuss the recommendations we made, and a lot of fun stuff.
>> Thank you, Mario. So, we looked at what Mario had found, and the cookie technique, we thought, was really interesting. As Mario said, it turned this into a completely remote bug that didn't require a user to open a file or perform significant user interaction. But at the heart of it, we thought it fit this pattern: we thought it was a case of untrustworthy data making it into the template side of a
template rendering engine. And we're big fans of PortSwigger's advice on template injection. They've got a Web Security Academy article with a discussion of how developers can protect against template injection or resolve template injection issues. It's this four-step process, this four-step grief-handling process for template injection, and it seems to be presented in the order in which PortSwigger suggests these types of issues be remediated. First step: don't do it. Don't pass attacker-controlled or untrustworthy data into the template side of a template engine. But, they say, sometimes you have to for functionality reasons. If you're going to do that, use one of those less sophisticated engines, like Mustache. If you can't do that, it's getting pretty grim at this point: maybe try and sandbox the template renderer, but they do call out that this can be, in their words, inherently difficult and prone to bypasses. Then the last step is: accept that arbitrary code execution is basically inevitable, and stick the template rendering stuff in Docker or something like that. We think this is pretty salient advice. So we pondered this and thought: what can Insomnia do about this issue? They can't do that first step; they can't stop rendering templates. It's a feature at this point, one their users have come to depend upon. They probably can't switch out the template engine; that's going to break compatibility and change the syntax. So, at this point, they're looking at stopping the bleeding. And we had two ideas for them. Maybe throw a warning when someone's importing a file, saying this can be dangerous; we're all used to those now with things like Visual Studio Code. But more importantly, we thought one of the best things they could do would be to simply not render a cookie that comes from a remote server as though it's a template. So, in our report, we expressed these thoughts to them, and we followed it up by saying that, as PortSwigger says, trying to sandbox the templating or
sanitize templates can be brittle and error-prone. They responded and said: we think this is a CVSS 9.3 issue, and we're rating it as a critical bug. Which was a huge change of heart from it sitting in an open GitHub issue for a few years with them saying "we know about this issue". So it seems like this vector had made them realize the true impact of the issue. They went on to say: we're going to prevent Nunjucks from rendering require invocations, and we're going to work on isolation. Which sounds similar to the sandboxing and sanitization measures we'd just said we don't think are sustainable. At this time, the fix to ban or block or prevent the use of the word require was already in GitHub; they hadn't shipped it in a release yet, but the fix was there. The trouble being: at the point in the payload where the string require comes up, the one they're trying to ban, it's a string context, and there are a teen bajillion ways to obfuscate or encode or otherwise mix around that word so it's not spotted by this kind of sanitization. I particularly like the one at the end, which uses Nunjucks against itself to flip the string around. So we sent this off and said: look, like we said, we think sanitization can be brittle and error-prone; here's a small handful of ways in which this approach can be bypassed.
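To illustrate why banning a literal word in a string context is so fragile, here's a minimal sketch. The `naiveFilter` below is a stand-in for the idea behind the patch, not Insomnia's actual implementation, and plain `Function(...)` stands in for the evaluation a template engine ultimately performs.

```javascript
// A naive word-ban filter, standing in for the idea of the patch
// (NOT Insomnia's actual code).
function naiveFilter(template) {
  if (template.includes("require")) {
    throw new Error("blocked: template contains 'require'");
  }
  return template;
}

// Inside the payload, "require" only ever appears in a STRING context,
// so an attacker can assemble it at runtime instead of writing it out:
const obfuscated = '"req" + "uire"';

naiveFilter(obfuscated); // passes: the literal word never appears

// Using Function() as a stand-in for the engine evaluating the
// expression, the banned word reappears at runtime:
const word = Function("return " + obfuscated)();
console.log(word); // "require"
```

Nunjucks itself offers plenty of equivalent tricks, such as building the word with its string filters or the string-reversal approach mentioned above, so a denylist on the literal word never really closes the door.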
They came back and said, "Thanks, we appreciate that feedback. We still think this patch is worth it, because it's going to break exploitation in the wild, if there's any going on." And there was also telemetry in there: if the word require was spotted in a template, it was going to phone home to Insomnia. Then they said, you know, this is just temporary; we're still working on isolating the renderer within a web worker. So we kept an eye on Insomnia's changelogs, and then version 11 came out, and in the changelog it said they'd shipped this behavior in which the word require was going to be blocked. We got in touch with Insomnia and said: look, we haven't tested this release yet, but we can see in the changelog that it's got a fix. We wanted to know if they were working on any additional fixes or hardening or mitigations, because now that a fix had shipped, our policy is that we publish details to the public in 30 days, so that everyone can be informed about what's going on with this vulnerability. They got back to us and said: actually, we had to pull that fix from this release due to some stability issues, but we're working on integrating it into the next minor version. We said, okay, like we said, we didn't actually test it, we just saw the changelog; keep us posted and let us know what you're working on, we'd appreciate it. So we sat tight and waited for that next minor version. 11.1.0 came out. We didn't see anything security-relevant in the changelogs, but we fired it up and found there were actually a few new protections in it. Basically, they were wrapping that require function and only allowing certain external JavaScript modules to be included from within the web worker. We didn't realize until we were preparing this talk: our blog post says this fix came out in 11.1.0. The reason the changelog is blank on this is that it actually shipped in a prior release, about a month earlier, in a change called "show how to run send request from worker". We didn't realize this was released earlier than we'd thought. But this is what that fix looks like: they created an allow list of certain modules that can be used with the require function from within the sandbox, and if you try to pull in child_process, so you can run an external process, you get yelled at, because it's not on the trusted list. Turns out you can use module.require, which bypasses this wrapper, and you can still achieve arbitrary external command execution. So we let them know. We got in touch and said: hey, we saw 11.1.0, looks like it's got
some new protections in it, and it can be bypassed using module.require. At this point we'd hit the 90-day mark and a patch had been delivered, so we said: look, in line with our VDP, we're planning on talking about this publicly; we're going to publish a blog post in 30 days. They also said they were going to allocate a CVE, and we were curious whether they were going to publish a notice as well, so we could cross-link to it. And remember how they'd had that change of heart? We said: any updates on the bug bounty, have you come to a decision? And this was, mind you, after we'd been very, very crisp and clear that we were planning on publishing. And to our delight, they said: yeah, we'd love to give you an Amazon gift card, we just need to know which region you want it made out for. We told them we really appreciated it; we thought it was super, super appreciated. We haven't gotten that gift card. This chat went cold shortly after. They came back and said all the payloads we'd provided so far no longer worked in 11.1.0, which was strange, because in this very email thread we had just delivered a new bypass with module.require. They said they were still continuing to work towards a long-term mitigation. So, like we said, we started planning for our public disclosure of this issue, and while we were doing that, while we were writing our post, Insomnia 11.2.0 came out. Nothing in the changelog, but over on GitHub they said they'd patched out the range global. And if you dig into that PR, it also patches out something called cycler. So what they'd done is ripped out cycler and range from the Nunjucks environment. Our payloads so far had been using range.constructor to kick off this chain of arbitrary code execution, so our payload had broken and we were getting yelled at with an error message saying the range constructor is undefined or falsy. And it made us have to wonder... no, it didn't; that comes shortly. This rang a bell for us: range and cycler rang a bell, because remember that warning Mario talked about in the Nunjucks documentation? Elsewhere on that page, it talks about three global functions inside Nunjucks: range, cycler, and joiner. With two of them booted out, would the last one work? Yep. joiner.constructor also did the job for us. So we published our blog post, as we said we were planning to do. We discussed this bypass that we'd just found while writing the post, got in touch with Insomnia and said: hey, look, we've just published a blog post, FYI; also, joiner can be used instead of range. And that's the last we've heard from Insomnia at this point. So now we're into the post-blog-post
world. So if you've read our blog post, we hope you enjoyed it. But all of what we're going through now is fresh; we haven't talked about it anywhere else. In 11.3.0, there were two changes: joiner was addressed, and tags in Set-Cookie were also addressed. We'll tackle these one by one. As far as joiner goes, the third of our three stooges has been kicked out of the Nunjucks environment, and so we can't use range.constructor, cycler.constructor, or joiner.constructor, which begged the question for us: what had that been achieving in that payload so far? Because we had a kind of vibe of what it was doing. It seemed to set up a bit of an eval sink, so we could kick off arbitrary JavaScript and then, from there, pull in child_process. So we needed to find a replacement for that joiner.constructor bit. If we break down how this payload's been working so far: it starts off with joiner, which is a function in JavaScript. Everything in JavaScript is an object, and objects have things called constructors. And the constructor of a function is the Function function. It's a function-building function. So we then call that with an arbitrary string, which produces an anonymous function, and we then call that function. That's how that chain's been working so far. So we needed to find a new way to get to the Function function using only the bag of tricks that gets put into the Nunjucks environment. Turns out there's heaps. Range, cycler, and joiner weren't actually that special to begin with; they were just what the published SSTI chains used. So if you tool around in a JavaScript console: the empty string's constructor is the String function, which is itself a function, which has a constructor, which is the Function function. The empty string's toString's constructor works too. There are a lot of ways to get to an eval sink from within Nunjucks. We'd just been using the one that was public, which was the one that Insomnia had focused on breaking. So that handles the joiner concern. But
what's going on with tags in Set-Cookie? This was exciting for us. They had banned the rendering of cookies as templates, which is something that we had suggested early on as a way to stop the bleeding on this one. >> Remote cookies as templates. >> The sad thing is, they did it with sanitization, and here's how it worked. Given a cookie with curly braces in it, it was going to zap anything that looks like Nunjucks template elements. So in this case, it would strip the curly braces off and just give you the string inside of them. The trap is that this replace chain happens one by one by one by one, which we can use against
itself. It works left to right. Given this example sequence of characters here, we're going to look for left squiggly brace, left squiggly brace: we're not going to see any. Look for right squiggly brace, right squiggly brace: find it, zap it, and bring that outside together. Then look for the rest, not find them, and we're left with what we were looking for. We can build this sandwich where the meat protects the bread, which comes together. This type of sanitization works if you repeat it until the water runs clean, but that only works if the string is shrinking. If it's changing or growing, you can end up being put in an infinite
loop. So it was an admirable effort, but it was able to be used against itself to still smuggle in template sequences. We got bored trying to find a second way to do it for the right squiggly brace, right squiggly brace, so we just brute-forced it with Python. Putting these two tricks together, the cookie chain still worked: we could still get to arbitrary code execution, and we were still able to pop shell and pop calc. But just this last weekend, as we were building slides, we had a thought and found another way to defeat that cookie sanitizer. And that is that Insomnia can be made to do what's called double interpolation. If rendering a template results in
something that looks like it's still a template, it's going to run it again, which is its own concern. Rendering a rendered template can be problematic, even if an attacker only controls the parameters on the left-hand side. But in this case, we could use comment tags as a way to bring that sandwich back together, because the comment tags weren't being replaced or zapped by the cookie sanitizer. And we can use these little dashes on the inside of template tags to also consume surrounding whitespace. And so this is what that looks like. Given that monstrosity, after the first round of templating it's going to result in something that still looks like a template string. And nothing in
that first line is going to get zapped by the cookie sanitizer. Which is to say that sanitization as a security control can be risky. We'd missed this way of doing it for so long. Solving a problem with sanitization relies on your ingenuity as a developer to think of all the edge cases, and all the ways that your sanitizer might miss something. Or it depends on your pen tester, who's doing a retest or being your trusted adviser, to have that creative moment. We didn't have this creative moment for a long time. So sanitization can be tricky to get right. Moving on: 11.4.0 and 11.5.0 have come out
since. In 11.4.0, this was a significant change: what had happened is Insomnia had disabled Node integration from within the sandbox, which entirely prevents you from using privileged Node APIs like require. Big change. And it means that if you could still get code execution through template rendering, you're then kind of trapped in JavaScript land, which is still a fun place to be, but you can't break out through naive shell command execution. There's also considerable attack surface from within an app like Insomnia. We haven't thoroughly explored it; if you can break out of the sandbox, we'd love to hear how. Hit us up, shoot us an email. Then 11.5.0 had this change to the file tag,
which allowed a user of Insomnia to bless certain paths as being able to be read by the file read operation from within the sandbox. And within the pull request, someone pointed out that this could be vulnerable to dot dot slash. And then there was some confusion; I think the person who wrote this PR thought that that might have been a feature request, so that dot dot slash could be used within the set of allowed paths for the user. But yeah, this bypass works as well. So for the rest of the talk, we'll go full send. We'll bump all the way up to the latest version, which came out a bit more than a week ago,
which is 11.6.1. We can still get JavaScript execution through cookies with the template sandwich trick, but we're then stuck there: we can't get out through child_process to run an external command. If you want to play along with this, we'd suggest you grab Insomnia from GitHub. You can then open DevTools, render a template that just has the JavaScript statement debugger, you'll get a breakpoint, and you can then explore what the inside of that sandbox looks like from JavaScript, from Electron. Even with Node integration disabled, we can still make arbitrary HTTP requests to stuff on the local network and read the response. Maybe you could build a tunnel so you
could kind of squeeze yourself through someone's Insomnia. I'd love to read the pentest report where someone was able to burrow into an internal network through something like this. I think that would be super cool. We can still read local files: as long as an Insomnia user has added a path to their allow list for the file read, we can use dot dot slash to burrow back up to where we want to be. But that doesn't really matter anyway; there was a new API added with the disabling of Node integration that wasn't protected by that allow list anyway. So we can just head straight for that internal API and read
arbitrary files. And remember how Mario was talking about that environment variables tab, where users of Insomnia could put things they want to reuse across different requests, whether it be API keys or other credentials, and it said that the data in this tab can be used in templating? Well, that means that the data in this tab is exposed to the sandbox for each template render. So you can just yoink those out if you're executing arbitrary JavaScript in that sandbox. In fact, from within the sandbox, you can even see the entry point into the sandbox. The main thread activates that sandbox template renderer through a postMessage, and from within the sandbox you can see that
onmessage handler. You can change the onmessage handler. You could replace it with your own function that observes the templating request and then passes it through to the original renderer, or returns arbitrary strings if you wanted to. At this point you could change that function, hook it, observe all of the template rendering going on across all the request tabs that the user has open, keep tabs on the environment variables that are configured within the environment variables screen, create a repeating setTimeout poll... the world is your oyster at this point. If you wanted to, you could build an entire JavaScript implant that would live inside Insomnia until the next time it's restarted. You could
write your own C2 server. You could do all sorts of things. I don't know why you would do this, but if you did, it might look like this. So, we've got an attacker's web server on the left and Insomnia on the right. The user sends a first request to the web server, which sets a couple of cookies, two of them. We've got two bypasses; may as well use both. We paid for both. Use both. Upon the second request, when these cookies get rendered, that hook gets installed. And while that hook's getting installed, we may as well read some files. Maybe we'll grab /etc/passwd, loop over each line, look at the home directories, grab some
SSH keys, and phone those back to the server.
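The two cookie bypasses those requests are using can be sketched in a few lines of plain JavaScript. To be clear, this is not Insomnia's actual code: the sanitizer below is a hypothetical reconstruction that assumes the four token types are stripped one pass each, in a fixed order, and stripComments only simulates the part of a Nunjucks render that matters here.

```javascript
// Bypass 1: a hypothetical single-pass sanitizer that strips each
// Nunjucks-looking token type once, in a fixed (guessed) order.
function sanitize(cookie) {
  return cookie
    .replace(/\{\{/g, "")   // zap variable openers first...
    .replace(/\}\}/g, "")   // ...then variable closers...
    .replace(/\{%/g, "")    // ...then tag openers...
    .replace(/%\}/g, "");   // ...then tag closers.
}

// A "sandwich" whose meat is a token zapped later in the chain:
// removing the meat fuses the bread into a live template token.
const payload1 = "{}}{7*7}{%}";
console.log(sanitize(payload1)); // "{{7*7}}" - a template survives

// Bypass 2: Nunjucks comment tags {#- ... -#} aren't on the zap
// list, and the "-" trim markers also eat surrounding whitespace.
// Simulating the comment-stripping part of the first render pass:
function stripComments(s) {
  return s.replace(/\s*\{#-[\s\S]*?-#\}\s*/g, "");
}
const payload2 = "{ {#--#} { 7*7 } {#--#} }";
console.log(sanitize(payload2));      // unchanged: nothing matches
console.log(stripComments(payload2)); // "{{ 7*7 }}" - rendered again
```

The second payload never contains a banned token, yet the first render fuses it into a template that double interpolation then evaluates.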
And then if the user tabs across to a different request and does any template rendering, it's going to activate that hooked handler. And that replacement, that hook, can then observe everything about that other, unrelated request: all the body parameters, all the path parameters, all the stuff from the environment. It can grab that and siphon it back off to an attacker's server. Sometime later, 30 seconds, a minute, what have you, there's a background loop that could poll an attacker's server to ask: do you have any JavaScript for me to run? In this case, the server is just saying, hey, what's 7*7? Just to pay homage to template injection. But at this point,
you could do anything, make local network requests, what have you.
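The constructor chain all of these payloads bottom out in is ordinary JavaScript, and you can replay it in any console. A minimal sketch, with a plain function standing in for Nunjucks' joiner:

```javascript
// Any function's .constructor is the Function constructor
// ("the function function"), which compiles a string into code.
const joinerLike = function () {};          // stand-in for Nunjucks' joiner
const evil = joinerLike.constructor("return 7 * 7");
console.log(evil());                        // 49

// With range/cycler/joiner gone, plenty of other paths remain:
// a string's constructor is String, whose constructor is Function.
const evil2 = "".constructor.constructor("return 6 * 7");
console.log(evil2());                       // 42
```

This is why removing individual globals didn't help: almost any value reachable in the template environment walks back to Function.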
So, in closing: bug triage is really, really hard, and it's really important. This issue was reported publicly on the GitHub for Insomnia in 2020. It seems to have been disregarded, maybe because the only paths to exploitation seemed to require high user interaction. Maybe it was deprioritized, and then when Mario cracked the case of the remote cookie vector, that seems to have spurred some action. Could this have been looked at five years ago with the thought: we don't know how bad this could possibly be, but it seems like a dangerous, sketchy thing that's going on? Perhaps vendors such as Kong could look at bug reports like this and, without
having to be shown the worst that could happen, look at smells like this and remediate them regardless of the seeming exploitation vectors or the seeming maximum impact. It's tricky, though. Mitigating bugs rather than fixing them at the source can be quite brittle. There were multiple rounds of back and forth on this one: various attempts to break proofs of concept, as opposed to reasoning about and considering the root cause of this and fixing it there. And as I said earlier, using things like sanitization depends on your ingenuity as a developer, or the ingenuity of your pentester or your security team, to furnish you with all the different routes and all the
different edge cases. Whereas fixing root causes, where possible, can take care of it all in one fell swoop. So mitigating can be tricky. And design decisions: Insomnia chose Nunjucks and then passed things into Nunjucks that shouldn't be passed into Nunjucks. And this is now really hard to change later on, because people are used to the syntax of Nunjucks when using Insomnia. But perhaps if a less sophisticated engine had been chosen earlier on, something like Mustache, things could have gone differently. So early design decisions can be really critical. And lastly, it's 2025 and vulnerability disclosure is still really difficult. We feel like we were kept in the dark somewhat about the plans that
were in motion and the fixes that had been delivered. But we think we also could have been more patient in our communication, too. When we were preparing this presentation, we looked back on that email chain and saw instances of things we'd said where, with hindsight, we wish we'd elaborated a bit more. Here's our initial advice on recommendations for this. They were quite terse, especially that last line. I don't think it would have been the right thing to write an essay to a developer or to an organization or to a vendor on why we think a certain fix is not suitable. But if we had been a bit more patient in this case, I'd love to
know if that could have changed the way that the remediation efforts were approached. I'd love to go back and split the universe and do an A/B test. Because we don't want to berate someone we've never met before, who frankly probably knows more about JavaScript and Electron than we do, but we want to furnish people with recommendations and advice that hit that balance. And I feel like we're still trying to find that balance. Thank you very much.
>> Heaps of time for questions. >> Heaps. Don't know what happened there. Have we got any questions? Down the back? >> Liam?
>> Awesome talk. Thank you. >> Thank you. >> Did you do any sort of Google dorks looking for any exploitation of this in the wild? Like, are there any YAML files out there that look like they're doing dodgy stuff? >> No. That sounds like a cool exercise, but no, we didn't. >> A quick follow-up question: what did you spend the Amazon voucher on? >> We're still waiting for it. What would we buy? >> Maybe, for me, a sleep mask and sleeping aids. I'd like to get some sleep after this conference, so maybe that.
Any other questions? Yep. >> Hi. Yeah, good talk. Was it ever pointed out that if a server sets a cookie with braces in it, it doesn't expect it to be changed on the way back? Like, that's just a bug, right? Sanitization is one thing, but... >> Yeah, we didn't go into that. We thought we were going to be pressed for time, but absolutely: if a server wants to set a cookie that contains curly curly 7 star 7 curly curly, that's its prerogative. It's probably not expecting to get back 49. So I agree, I think this is also a functional bug. >> Yeah. I think maybe they were just not
expecting people to send templates. >> Yeah. Maybe they're not expecting people to send templates through cookies, I guess. >> Partly valid. Hardly valid. >> Yeah. You were saying that your usual setup is to have Insomnia piped into Burp Suite to, you know, log everything for you. If you implemented a full JavaScript C2, is that still going to go through Burp Suite if your victim had that set up as well, or is there some way that you could ignore those proxy settings so that they can't even see this? >> Great question. Mario is the Insomnia user. How do you configure
Insomnia to talk to Burp? And then I'll take the second part. >> It's just a proxy setting: you configure Insomnia to use Burp Suite as a proxy, and that's it. Go from there. >> What I would check, and it's a great question, is whether fetch operations from within the sandbox respect that proxy setting, or if it's only an application-level proxy setting that governs how that purple Send button works. I'm not sure; that would have to be tested. >> Too easy. Thank you. >> I know the public reason you called it Sleepless Strings was because the client was Insomnia, but was the real reason
actually because Justin got no sleep doing three CFPs? >> Yeah. Yeah. I'm getting home and crashing out. Can't wait. >> Awesome. Great talk. Thanks. >> Thank you. >> Any other questions out there? Yep. >> Great talk. Really enjoyed it. I just wanted to ask about one of the last points you made, about vulnerability disclosure being so difficult. What advice could you give to software companies to improve that? And I know you also said you had some self-reflection about improving it on your end as well. >> Mhm. Yeah, I think we covered off what we wish we'd done differently, which is to know what level of verbosity to hit. What
I wish vendors would do, and this is me personally: I'm yappy. I've worked in organizations alongside developers, and I love working with them on reasoning about security and reasoning about vulnerabilities and reasoning about fixes. I would have loved to have jumped on a 20-minute, half-hour call with a developer who was working on this tranche of bypasses, to understand their point of view, express my point of view, and see if we could teach each other things about JavaScript, Electron, and template injection. I would have loved to have chatted with a developer. I don't know who we were speaking with behind that security@ mailbox at the vendor; I feel like we weren't directly
communicating with a developer. And so, to vendors: if you've got someone turning up with a vulnerability, it can be scary exposing developers to an unknown quantity outside your organization. I like to think that we try to be friendly, and I would have loved to have talked to a developer directly to see if we could shortcut some of this back and forth. That would be my suggestion.
Any more questions? Okay, let's give another big round of applause to Mario and Justin.