
Translating Mobile App Security Lessons To The Flutter Stack

BSides Munich · 2025 · 29:00 · 53 views · Published 2026-02 · Watch on YouTube ↗
About this talk
Flutter's cross-platform abstraction layer creates new blind spots for traditional security analysis tools. Samuel Hopstock walks through how Flutter apps compile differently than native applications, why standard decompilers miss vulnerabilities in Dart code, and how classic mobile security issues like certificate pinning failures and insecure deserialization persist in the Flutter ecosystem.
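One of the certificate-handling pitfalls the talk discusses is the all-accepting badCertificateCallback in dart:io. A minimal sketch of the anti-pattern and a safer alternative (the pinned fingerprint value and the injected sha256Of helper are placeholders/assumptions, not part of the talk):

```dart
import 'dart:io';

// ANTI-PATTERN: returning true for every certificate disables TLS
// validation entirely and enables man-in-the-middle attacks.
HttpClient insecureClient() {
  final client = HttpClient();
  client.badCertificateCallback =
      (X509Certificate cert, String host, int port) => true; // never do this
  return client;
}

// Safer sketch: accept only one known, pre-verified self-signed
// certificate, identified by a fingerprint computed by a caller-supplied
// hash function (placeholder value below).
const String expectedSha256 = 'PUT-YOUR-PINNED-FINGERPRINT-HERE';

HttpClient pinnedClient(String Function(X509Certificate) sha256Of) {
  final client = HttpClient();
  client.badCertificateCallback =
      (X509Certificate cert, String host, int port) =>
          sha256Of(cert) == expectedSha256;
  return client;
}
```

As the talk notes, prefer a well-tested library for certificate pinning over hand-rolled callbacks like this.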
Transcript [en]

All right, hello everybody. Thank you, Radu, for the introduction. As you mentioned, I'm working in the area of app security. So I'm not developing apps myself; I'm working both on analyzing apps (that's the AppSweep product we have at Guardsquare) and on protection, and I'm doing both of those in the iOS and Flutter world. Previously I also worked on the Android side of our scanning solution, and when I transitioned to the Flutter part, I thought there was a very interesting distinction between how Flutter apps look to an analysis tool and how regular apps look. So let's look

at that a bit today. First, what actually is Flutter, if you haven't seen it before? The idea is that it's an app development framework where you use one code base for, I think, five platforms even nowadays. So you write the code once and deploy it on Android, iOS, on the web, and also on Windows sometimes, and that's a very neat thing for developers, of course, because you write all of your code in one language. But the structure of the app thus needs to be a bit more complex to handle all of this. You have the Dart code that compiles to native code that

looks exactly the same for all platforms. And to be able to run this code properly, you have the Flutter engine and the Dart VM underneath it that actually execute it and make sure it is wired up correctly with the APIs of the native platform we're talking about. You can also write platform-specific native parts in your Flutter app, for example if certain plugins should have specific behavior on iOS or on Android, and third-party plugins you use can also have platform-specific parts. Those pieces of code can then communicate directly through something Flutter calls the method channel API. So you define methods and

the signature, and you can then interact between the Dart code and the platform-native part of your app. Now, the thing with Flutter apps, and maybe mobile apps in general, is that some parts of them, from a security point of view, are a bit unclear sometimes, or people have misconceptions about them. For example, something you typically hear in the iOS world: if you have an Android app with Java code in it, the Java code can be decompiled very easily, so attackers have a field day with your app whenever they want to find vulnerabilities in it or if they want to

find your sensitive data, whereas native code, since it compiles to machine code, is completely opaque, so it's much better. That, of course, is not really true, but we'll get to that a bit later. Now, Dart is even a bit more extreme in that regard, because it not only compiles to native code for your platform, it also compiles to a slightly non-standard ABI. And if you then throw that into a standard disassembler or decompiler, you may see code that doesn't look as clean as regular native code would. That is also, for example, the reason why you can upload your Flutter app to existing analysis tools and you

will not get any real findings in there, apart from the native part of your app. And so you might think: an app analysis tool must be detecting my findings, and if there are no findings, the app should not have any vulnerabilities in that regard. Another issue is that Flutter out of the box provides this flag called --obfuscate when you build your app. That sounds pretty cool on the surface, I guess, if you are dealing with a cool new algorithm that you're shipping, or, what's very popular nowadays, shipping local AI models in your app. So --obfuscate sounds like the thing you want to do, because that makes

reverse engineers' lives a bit harder. But the issue with that is that it really only swizzles the names in your app; the actual control flow and the actual algorithms you wrote are not obfuscated in any way. So you need to think about whether it's really something for your threat model or just a neat gimmick that doesn't provide any extra security for your use case. And the last part is that Flutter and Dart, as a cross-platform framework, need to use a lot of abstractions to make the code work nicely on all platforms. And you might think: they then surely don't make the same

mistakes or provide the same hard-to-use APIs that, for example, the Android SDK or the iOS SDK provide, so as a consequence it should be much easier to write safe apps in Dart using the Flutter framework. But unfortunately, and that's what we're going to look at today, you can still have the exact same insecure code that you saw in other mobile apps in your Dart code. To walk through the first few findings, I prepared a very little snippet of Dart code. From a language point of view it looks, I think, very much like a mix of native code and

maybe some JavaScript-y APIs. So it's really nice to read sometimes. What we have here is a verySecret method that you want to keep away from reverse engineers' eyes under all circumstances. In there you're using some logging functionality, which is also something attackers may want to look at to see what your app is doing in which places, and we are handling secrets, a Google API key in this case. So we have a nice package that we can now step through and see how you would approach a Flutter app if you have it in front of your eyes.
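The snippet itself isn't reproduced on this page. A minimal reconstruction matching the description, where the function name, the log message, and the key value are all assumptions, might look like:

```dart
import 'dart:developer' as developer;

// A hard-coded secret like this survives compilation and can be
// recovered with the `strings` command, as shown later in the talk.
const String googleApiKey = 'AIza-placeholder-not-a-real-key';

// The function name also survives as metadata in the Dart snapshot,
// even though `nm` reports no symbols in the binary.
void verySecret() {
  // Log output is visible via adb logcat / the iOS device log.
  developer.log('about to use API key');
  print(googleApiKey);
}

void main() {
  verySecret();
}
```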

The logging part is the easiest, in my opinion. You don't even need to look at the code or disassemble it in any way; you just hook your device up to adb, for example, on Android, and on iOS you can also look at the device logs, and the log messages pop up. That's no different from any other app. I wouldn't really say it's a very security-sensitive thing per se, but it's something we see in the wild very often: people log a lot, and they also log API keys, for example, right before they use them, which is not ideal and you

shouldn't be doing that. Now, if you look for hard-coded secrets, the first thing that's usually done with any app or compiled code is to run the strings command on it, and usually your secret pops up somewhere. That's the same for Dart code; it doesn't do anything weird with your strings in that regard. It handles string objects slightly differently than regular apps, but the strings command can still find them. So that's an easy first view of your app and nothing spectacularly new. And for the verySecret method, which, you remember, we didn't want to share with reverse engineers because it, for example, contains password handling or

your new AI algorithm that you don't want to share: finding that is usually done by looking at the symbols in your app. So we use nm, the classic tool for looking at symbols. And then the very first weird thing happens: nm tells you there are literally no symbols in that binary, which is interesting, I think. But we have to dig a bit deeper, because if you look at the strings output of that Dart binary again, suddenly we do see strings that look very much like method names and class names, including the verySecret name that we tried to hide. So if we don't have symbols, but we have names

of functions in there, that means there is some metadata about your code that, for example, the Dart VM needs to use, but it is stored in a bit of a non-standard way in Dart. And that's where we look at how the Dart VM actually executes your code. This is a bit of a simplified diagram, but you have your libapp.so, or in iOS apps it's called App.framework, which contains the entirety of the Dart code you wrote for your app. And next to that Dart code, it also has something called the Dart snapshot. The snapshot is something that the Flutter engine, libflutter.so in this case, loads from within your binary. So that's the only

real symbol that is exposed in libapp.so. That Dart snapshot is a serialized form of what Dart calls the object pool. This object pool contains all the Dart objects that you are going to use at runtime. For example, if you're printing a string, that string is an object that lives in the Dart object pool. After the Flutter engine deserializes the snapshot, it gets put on the heap, and whenever the Dart code needs to access a certain object, it asks the object pool: can you please give me a reference to this? Then it gets a pointer back and can use it. And the reason why we saw those

function names is that the object pool also contains objects for all code entities. This is because Flutter uses this VM structure, which, as a side effect, also lets you run non-compiled Dart code in the VM; in debug mode, for example, that's what's happening. So the VM needs to know about those function objects and class objects. And if you want to look at this object pool and see what the app actually contains, there's a very nice tool called Blutter that parses the object pool and gives you a summary of what it contains. So if you throw any libapp.so into this tool, you see that it does some parsing. It

talks about the Dart heap, it talks about the object pool. Sounds very nice. So let's see what that actually produces. The output tree contains at the top level a few scripts that you can use if you want to tinker with your app in Frida, or, if you are using IDA Pro as a decompiler, there are some very useful scripts that make reverse engineering the app easier, using metadata retrieved from the Dart object pool, for example function names and function signatures. And if we take a look at the asm directory, that is supposed to reconstruct the file tree of the Dart files your libapp.so was compiled

from. Every function in your app also has debugging metadata, like the file where it was defined, the line number, and how big the function is, and that's what Blutter tries to reconstruct here. So, for example, it sees that we have a main function and a verySecret function, tells us about the types, and gives us some disassembly, but in my opinion that disassembly is maybe a bit unclear to look at sometimes. So what I would rather suggest is that we take those addresses, and when we look at the binary in an actual disassembler, we can just map the names to those

addresses that Blutter gives us, and then we have the best of both worlds. And this is what comes out of that. We see there's a main function at the address that Blutter told us. It's not very big, just a few instructions; those are ARM64 instructions, if you've not looked at assembly before. The main thing happening here is a call to a function called verySecret, and this is exactly what happened in the Dart code that we wrote, but there's currently nothing else really happening. If you look at verySecret itself, you remember we had a print in there that just printed some nice API

key. But there's no reference to any API key or a known string in there, because otherwise this disassembler would have noted that it found a string reference. It just finds a call to print. And this is one quirk of Flutter: it does very aggressive code optimization. In this case the argument to print is even pushed inside the function, because the compiler saw that print was called with a hard-coded string, so it can do some constant propagation and move the string down into the internal print-to-console call. And here is where the actual argument that we want to print gets loaded. But still, it's loaded in a

very weird way. We have some register, x27, that we read some data from, and there's no string anywhere to be seen. This is actually what the object pool is for. At runtime, x27 on ARM, for example, holds a pointer to the object pool itself, and this code requests to load something from offset 0x1cb8 in the object pool and then prints it to the console. And if we look at the data that Blutter gave us, in this file pp.txt (I don't know why it's called pp, but it actually contains the object pool), and we just grep for this offset, maybe it

tells us something. And here it says: at offset 0x1cb8, the object pool contains a string object with exactly the string we expected to see. And that's how you see that a traditional analysis tool, even one that can analyze native code, would not have spotted this, because this is a very non-traditional way of loading strings. Usually strings are hard-coded somewhere in your binary, not in something that you first need to deserialize while knowing that x27 has a specific value at runtime. And that's just how findings get lost. But if you do a little bit of analysis with Blutter, for example, you

still find all of this in your binary. Now, once you are past those, let's say, easy findings that you would usually expect, like hard-coded secrets, there are also the security issues I was talking about initially, where the abstractions that Flutter provides could maybe help make things less error-prone. Unfortunately, one very classic thing from Android still lives on in Flutter: in Flutter it's called the badCertificateCallback, and on Android it's a mix of TrustManager implementations and other APIs. The gist of it is: you have a TLS connection that you, for example, want to load data from, and for whatever reason you can't use the default setup

because your server doesn't provide a certificate that the system accepts. This could be because you're using a development instance with a self-signed certificate, or because your company has its own certificate authority that isn't accepted by the system certificate store. If you go through Stack Overflow, it's a very prominent question: my TLS connection was not accepted, what can I do? And the typical response is: there's this badCertificateCallback. It's just a lambda that gives you the certificate, the host, and the port you wanted to connect to, and you can then decide what you want to do. And the

easiest thing, since it's a boolean lambda, is to just return true. Then everybody is happy, the connection works again, but people don't really see that this is not what you should be doing, and that maybe the system should do something with that certificate first. And so it's the classic setup for a man-in-the-middle attack. If you accept any sort of certificate (this demo setup, for example, just has a self-signed certificate that was generated) the app doesn't complain, the attacker can replace any data in the request, and the user thinks they are connecting to the correct service, but they are not. And so this is

something that could maybe have been prevented by the Flutter developers by saying, no, we don't allow you to use any non-standard certificates, but unfortunately that is not what happened, probably because somebody said: this is a use case I definitely need to support, so they added that API. Now, if you see issues like this, and, for example, some analysis tool tells you, hey, you shouldn't be using an all-accepting TLS validator, then you should first ask yourself why you actually need this non-standard behavior. Why can't I just use a standard TLS certificate? There are free certificates nowadays, so pricing should not be the issue anymore. And if you have

self-signed certificates, or your company needs to use a custom CA, you should again ask yourself, and maybe your company's IT person, whether that is something they really need. For internal domains, you may sometimes think you need your own CA because nobody can see that internal domain, but nowadays this works very nicely if you have access to the top-level domain. And if you then have use cases where you do need to set a specific badCertificateCallback, please don't use some custom code that you found somewhere on Stack Overflow. Instead, look for that specific use case of yours

and see if there are well-known and well-tested third-party libraries that, for example, support the use of specific self-signed certificates instead of just allowing everything. Now, the next vulnerability kind that you can still have in your Flutter code is the classic class of injection vulnerabilities. I would say the evergreen of this class is SQL injection, where you have, in this case for example, an SQLite API in Flutter, and you execute some queries against it. This sample code is actually modeled very closely on an existing library (I think it was a library, not an app) that two years ago had a CVE for exactly that: it had a list of downloads

of paid videos that you could purchase in that app. And if you could provide a different download URL, it would just be pasted into the SQL query, and you could circumvent certain security checks. Which is very unfortunate, because of course, if you just do string formatting for your SQL queries, that is simply SQL injection again, and it's 2025, this should not be happening anymore. Also, sqflite, for example, the library that's pretty common for SQL in Flutter, does support parameterized SQL queries and all the defenses that have been out in the wild for a while. But if you again look at Stack Overflow,

those raw queries are still apparently what people sometimes prefer, because they're easier to use. In general, not just for SQL injection (the cool new thing is prompt injection, I suppose), the same applies anywhere you get data from somewhere external that you maybe shouldn't trust and then do something high-impact with it, like dynamic code loading, something you can also do in Flutter on certain platforms. You should maybe not take data that you got from an intent from the outside and use it to load code dynamically, and all these sorts of things. So please, as usual, you need

to verify where that data comes from, sanitize it if necessary, and take special care if that data ever reaches APIs that you should use with extra care. Those are, of course, only some examples of mobile-specific security vulnerabilities you could have, and one thing that I'm very happy exists nowadays is the OWASP project called the Mobile Application Security Verification Standard and the testing guide that accompanies it. One thing it does is show you how to actually analyze your app and check whether certain vulnerability classes exist in it, and the other part is that it shows you what

you should do instead. So it's a really good guide that you can basically go through from front to back and check against your app. You can also do a pentest, for example, one that specializes in these specific issues, however you want to do it. You can of course also do automated security testing; that's more the approach that we, for example, are taking, because if you do a pentest only once before every release, that can start costing you more money than checking incremental changes whenever somebody pushes a pull request to the repo, where you immediately see, okay, this introduced a new vulnerability, maybe you should

fix it immediately. If we go outside the classic "is my app secure" topics, we can also look at some extra attack vectors, such as runtime tampering. This is a bit of a different scenario: it's no longer about protecting the user data that you're handling, but about protecting against the user, because maybe in your scenario the user should not get access to certain parts of your app; maybe some IP of yours is in there. That's where runtime tampering comes into play. At runtime you can patch functions, you can do inline hooking, for example with Frida,

and that's also true for Flutter and Dart code, so nothing really changed in that regard. There are maybe fewer one-click solutions, because you need to do this custom Dart hooking, but then again, if you hook the Flutter VM itself, you get access to a vast amount of information about the running app, because naturally the VM needs this more comprehensive view of your app. If you don't do runtime patching, but instead patch the app once and distribute it to users, that would be repackaging. That's a very popular thing on Android, of course, if you don't want to pay for certain services, or on iOS when you want to cheat in

games. So repackaging uses all the techniques from runtime tampering, packages the result, and distributes it to users. What you would do is unpack the app bundle; modify the resources, the binary, or the Flutter VM, for example; patch out any checks; remove paid features; add your own features if you want to skip ads in a certain video platform, for example. You can also add permissions, change the package name, rezip it, and then distribute everything. You can protect against this with something called runtime app self-protection: you inject certain checks inside your own code that make sure that, for example, a specific sensitive function of yours is

not tampered with and still has the code you wanted it to have. But those checks have the same issue as your app itself: the attacker can just patch them out if they find them. So if your threat model needs this, you should also make sure that the checks you insert are obfuscated and not that easy to locate and patch out, because otherwise they're not really helpful. Those checks can alternatively also send data to some server, where you can collect statistics on whether your app's user population resorts to these sorts of attacks: do they distribute repackaged versions or not? And then you can

decide what you want to do with this. Now, to come to the conclusion of everything: what you should take away is that you need to know what your app actually contains when you ship it. Be it a Flutter app or a native app, it contains your code. That is of course pretty obvious, but remember that everybody can see that code, and if it is insecurely written, people will eventually find that and maybe find an exploit for insecure data handling. And if that is your attacker model, you should make sure to protect against it. And also take a more holistic approach to security. So really think

about what it is you want to protect against, and what you do not need to protect against, and make sure that you use the right tools and maybe also don't overshoot. If you have an open-source app, you don't necessarily need to obfuscate it, I guess. And if you integrate all of that into your CI setup, it happens very naturally that you detect new issues and protect against them, and you don't end up having to scramble after the fact. And of course, Flutter apps are just as secure or insecure as any other mobile app. They are really not special; they just look a bit different to the analysis

eye. So with that, thank you very much, and I think we have some time for questions. >> Thank you very much, Samuel. We have a few minutes for questions if you have some. Yeah, one second. >> Okay. So, just taking the first example where you printed an API key: in a typical production Flutter app, I won't really name a method verySecret or something like that. >> That's what you would think, yes. But we see that very often. I mean, it's nice to name it that way, and then surely, since you compile it, nobody cares about that name anymore. >> Oh yeah, but in a production app, the Dart object

pool will have a lot of methods, obviously. >> It will, yes. So if you then, for example, use this --obfuscate flag, the snapshot still has metadata about the function, but the name indeed changes. So if it's just the name you're interested in, that changes, but usually I think it's not the name that's sensitive about a function, but what it does at runtime. Other questions?

Hello. Hello. Yeah, how do you deal with all the abstractions that are introduced by these several layers that you mentioned? >> Dealing as whom, the app developer, for example? >> Like, do you try to wrangle all that abstraction, or do you just go for the strings? >> Ah, it depends how you do it. There are some tools, for example, that really just run strings on every binary, and that's it, but that stops working very quickly; you wouldn't be able to detect most of the issues that the MASTG, for example, tells you about. So it really makes sense to have some Flutter

specific analysis that knows about those abstractions and also knows, for example, about the interaction between the native part of the code and the Flutter code, so that you can see whether, for example, data that the user provides flows into APIs it shouldn't, and not treat the app as a black box. >> And is that done with the tool that you mentioned, or do you also engineer something yourself at Guardsquare? >> Yeah, I mean, we do have our own analysis as well. But Blutter, for example, is one of the most prominent things that does Flutter-specific analysis. >> Okay, thanks. >> Other questions? If not, I would like to thank you one

more time, Samuel, for being here with us today.