
Hello Belfast! First of all, shout out to BSides: this is amazing, seeing everyone here, and I'm really excited and happy that I was asked to come out. Also, shout out to Belfast itself; I've never been here. I'm from Atlanta, Georgia, where it's 90° Fahrenheit and 100% humidity and it's not raining, and I won't say anything good about the weather here. Again, I'm Ray Kelly. I work at Synopsys. Many of you may not know the name Synopsys, but you're probably familiar with the tools, whether it's Coverity, Black Duck, or WhiteHat; that's our family there. I run our DAST R&D group, so I focus on automated scanning technologies: how do we hack websites better, faster, more accurately? I've been a developer for around 25 years and in AppSec for 20. I got my start in 2003 with a company called SPI Dynamics; you might know the scanner WebInspect. I wrote most of that all through the 2000s, and that's where I really got started. We also went through a whole litany of acquisitions, like most cybersecurity companies. While I was at HP, I ran our mobile pen test department, so we had a team of mobile pen testers, and a lot of the examples you'll see come from that experience, as well as from things that have been publicly announced.

A couple of considerations. Everything I'm about to show you is real: hacks and issues that we've actually seen. But everything is either publicly disclosed or it's been scrubbed, so no zero-days for any of you. The other consideration is that these aren't malicious people I'm showing you. These weren't hackers exploiting these apps, beyond what we did while we were testing them. These are developers who made mistakes in code, and those mistakes caused the vulnerabilities. So keep that in mind: these aren't people trying to make insecure apps. They're trying to do their
best.

Talking about insecure devices: this one's a bit of a stretch, but I like telling the story, so I'm going to tell it anyway. Take a paper shredder. Back in the '70s, during the Cold War, Russia and America had their beef going on, and there was a hotel in Washington, D.C. where famous dignitaries would stay: presidents, queens, whoever had to be there. In all of the rooms, they were kind enough to provide you with a paper shredder, because you're important, you have classified documents, and you need to shred your papers while you're traveling. The problem was that this particular paper shredder was made by a three-letter agency of the US government. In the top of the shredder, right before a page got shredded, was a document scanner. As you shredded your documents, it was scanning them and sending them out over the electrical line, varying the electrical current with impulses like an old-school modem, and they were reconstructing the documents in another room of the hotel. When I saw that, I thought: that is awesome.

Looking at the mobile landscape: mobile development is hot; I don't have to tell anyone that. Unfortunately, because of that, security usually takes a back seat. So we
have developers, but you're getting pressure from upper management and product managers: hey, we've got to get that new feature in; when are we shipping that app; why is the review taking so long? There's just constant pressure, and security is rarely, if ever, number one on the priority list. That's a big part of why so many things don't get caught. Another challenge is that mobile app development is relatively easy. With things like Cordova, you don't have to understand Xcode (horrible, by the way; sorry if anyone writes in that). There are frameworks that make it really easy to almost drag-and-drop your application together, ship it up to the App Store, and you're good to go. But did you think about security? No. It was easy, it was fast, no problem. The other problem is that the vulnerabilities developers typically introduce are platform-agnostic. For instance, if my backend API is vulnerable to SQL injection, it doesn't matter whether I'm on Android or iOS. So that's another
challenge. Then there's the number of devices out there. Again, I don't think I need to tell you, but they're exploding. I know it might be hard to read, but that's 255 billion mobile app downloads, let's say last year. 255 billion downloads of applications going out, with nobody knowing whether they're secure or not.

Looking at the mobile threat surface: we did a study where we took 120 mobile applications from a single customer. I know that sounds crazy, but when you're dealing with large corporations, depending on what products they sell, each product may have its own individual application written by different teams or different outsourced teams. It sounds a little crazy, but it does happen. We found that 66% of them contained a critical or high vulnerability. And when we say critical or high, we're talking about either disclosing personal information (say, through third-party data leakage; I'll show examples of that) or being able to completely compromise the backend
system.

So again, many of you probably understand a lot about the server side, DAST, hacking, that side of things, but there are two big differences to consider when looking at mobile applications. One is magnified network exposure, and by that I mean Wi-Fi. You take your phone around and get on free public Wi-Fi at the airport. Is that really the airport's Wi-Fi? Maybe it is, maybe it isn't. You don't know if someone is doing a man-in-the-middle and sniffing that traffic, so shady networks are definitely a problem. The other is magnified physical exposure. I have my phone here, and I could leave it anywhere; if some nefarious person comes along and wants to steal my device for the information on it, they'd be able to get it. And since we're in a safe space here: I did this presentation somewhere else, said "hey, you know, physical exposure," reached down, and my phone wasn't there. I'm like, where's my phone? I'd left it on a table in the back while I was getting ready. See, I meant to do that.
So, the mobile threat surface. When you're doing a pen test and you want to analyze a mobile application, it's difficult: it's time-consuming, it's tedious, and it gets even harder when the developers actually do some good things (we'll talk about that). In general, you want to look at three areas. First, the client, meaning the device itself. Are we encrypting sensitive information on the device, with all the data your app is collecting? Is there poor certificate management, things like that? You're actually looking at the device: what's happening, what's getting stored, what data is being accessed. Then we move to the next level, the network. We want to look at the actual traffic going across the network to the backend API. Is it secure? That's one of the simplest things to check. What kind of authentication is it using? Is there third-party data leakage; where is the data going? I'll show you an example of that as well. And lastly, we have the server side: the back end, which is typically an API. It can be vulnerable to all the same things you're familiar with: SQL injection, command injection, all of those. Sometimes developers go, "Well, this is for a mobile app, what does it matter? It's just my app talking to it." And I guarantee you: bad people will find that API and be able to exploit those API calls.

So what I'm going to do is go through some examples, starting on the server side. Again, back ends are typically vulnerable to all the same DAST-type findings you're familiar with. One thing is, again, young developers or people not familiar with security will say, "Well, this is an API endpoint, you can't Google that. You can't search for my backend endpoint." It typically won't come up; it doesn't get indexed by Google. But the bad guys will find that endpoint, guaranteed. So you need to be sure that
you sanitize your inputs; all the same rules apply on the back end. This particular example was one we actually performed. We had a mobile app that talked to a backend API. Unfortunately, on the API end they had WebDAV enabled and they allowed the PUT method, so essentially we could upload any file we wanted to their backend server. We could turn it into a OneDrive for anyone we wanted, or upload a nefarious link ("click here to win tickets") onto this particular website, and then spam that out to millions of people: hey, we've got tickets, click here. And when you click it, of course, it takes you to ransomware or some other bad stuff that goes and infects your computer. So again, they were not protecting that endpoint: arbitrary upload of any file could happen.

On to British Airways. This might be sensitive to some of you; it's okay when I talk about this over in the US. They had a problem: they got hit by the Magecart skimmer. Anyone familiar with Magecart? Essentially it's a malicious piece of JavaScript that, when you fill out your credit card
information on a website, not only submits it to British Airways (in this case) but also sends a copy of that same information to the bad guys' server. I'm not sure what the entry point for the vulnerability was, but it ended up on their server somehow. And then it got expanded: the mobile app was an HTML5-based application, and it pulled its JavaScript files from that same server. So now we went from one hacked website to, well, how many mobile app users does British Airways have? A hundred thousand? Anybody buying tickets on their phone had that same information sent up to the bad guys' server: email addresses, credit card numbers, expiration dates, the CVV number, all of it. I just thought it was interesting that one vulnerable web server ended up infecting that many mobile apps. If I'm not mistaken, this was the first instance of a GDPR fine (I'm pretty sure it was the first), and they were hit with a $190 million fine over it. I believe that's based on either the revenue of the corporation or the amount of data that was stolen, and
somehow they whittled that down to 20 million. I guess maybe people didn't realize, when GDPR came out, how costly it could actually be, and unfortunately British Airways was the first to find out. When I saw that, I thought: that's kind of interesting. My Delta mobile app has all kinds of information in it: passport number, all of my information. So I had my team do a study: let's download as many airline apps as we can and take a look. Now, we couldn't legally attack those endpoints, but we could do binary analysis; anyone can take a mobile app and take a look at it. And when we did (I know it's hard to read), we found things like weak crypto; basic authentication being used between endpoints; unsafe certificate handling, where they didn't pin the SSL certificate in the application; weak certificate signing; third-party data leakage. We found all kinds of things in these applications purely through binary analysis, without actually poking at the back end. You can see I have it split between iOS and Android. Usually, when we get to the question-and-answer part, people ask, "Okay, so which is more secure, Android or iOS?" And again, like I said at the beginning, these findings are platform-agnostic. The back end doesn't care whether you're on Android or iOS, and if the code in my mobile app is concatenating strings to build my SQL query (SQL injection), that has nothing to do with the OS. That's purely in the code.
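That string-concatenation mistake looks identical whatever OS the app runs on. Here's a minimal sketch of the difference between a concatenated query and a parameterized one, using Python's sqlite3 as a stand-in for whatever database the backend API uses (the table and values are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret'), ('bob', 'hunter2')")

def lookup_vulnerable(name):
    # Concatenating user input straight into the query: the platform-agnostic
    # mistake. It doesn't matter whether this string came from Android or iOS.
    return conn.execute(
        "SELECT secret FROM users WHERE name = '" + name + "'"
    ).fetchall()

def lookup_safe(name):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "x' OR '1'='1"
print(lookup_vulnerable(payload))  # every secret in the table comes back
print(lookup_safe(payload))        # [] -- no row literally has that name
```

The payload turns the WHERE clause into a tautology in the first version; the second version never interprets it as SQL at all.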
Another example on the server side; this one came out in the press. There was an application called Bright City, and what it did was let a homeowner, or anyone, go around and take pictures of their valuables. If there was an accident, say your house burned down, or a natural disaster, it would have cataloged everything in your house, so you could go to the insurance company and say: look, I really did have diamond rings, I really did have a big-screen TV. It kept all of that for those purposes. The problem came up on the backend API, though. It's kind of small and hard to read, but in the URL to the API, in the last parameter, you can see the method called getUser, and then an ID. The ID is something like 1000. Well, what happens if I hit 1001? A huge chunk of data comes back: the next person that uses the application. So right there we're doing account enumeration; we can just walk through all of the accounts. The really bad part, though, is what's inside that data: cell phone number, email, date of birth, username, and password for the account. So now we have access to everybody's information, and we can get into any account just by enumerating accounts on the backend API. Again, this was for a mobile application, but the back end is the thing that's vulnerable here. I called it Amazon for criminals: if I need a new big-screen TV, let me go through and see what I can find, see who lives nearby, and go rob the house.
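That getUser bug is a textbook insecure direct object reference: the endpoint trusted whatever ID it was handed and never asked who was asking. A toy sketch of both versions of the handler (the data and names are hypothetical, not Bright City's actual API):

```python
# Sequential IDs are exactly what makes walking 1000, 1001, 1002... so easy.
USERS = {
    1000: {"email": "a@example.com", "dob": "1990-01-01", "password": "pw-a"},
    1001: {"email": "b@example.com", "dob": "1992-02-02", "password": "pw-b"},
}

def get_user_vulnerable(requested_id):
    # No authorization check: any caller can pull any record, secrets and all.
    return USERS.get(requested_id)

def get_user_safe(session_user_id, requested_id):
    # Fix 1: the authenticated session decides whose record you may read.
    if session_user_id != requested_id:
        return None  # would be a 403 in a real API
    record = USERS.get(requested_id)
    # Fix 2: never echo secrets like passwords back in the response at all.
    return {"email": record["email"]} if record else None

# Enumeration works against the vulnerable handler...
print(get_user_vulnerable(1001)["password"])  # another user's password
# ...but the safe one only answers for your own account.
print(get_user_safe(1000, 1001))  # None
```

Non-sequential identifiers (random UUIDs) also raise the cost of guessing, but the authorization check is the real fix; obscure IDs alone are still enumerable.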
Next, we move into the network area. In the network section, the things we're looking for are third-party data leakage, clear-text data, not using SSL. In this example (and it is blurry, because there's a lot of sensitive information here, redacted on top of redacted), we had an application for a big-time boy band. Think New Kids on the Block. Any fans? No? Okay, we've got two over here. So you can figure their target demographic is teenage girls. We were testing the application, and the tester was man-in-the-middling it: using a proxy server while physically using the application. He created an account, and as he's watching the traffic through the proxy log, he sees a chunk of data go by that catches his eye. He looks at it, and he sees his own home address in the data going to the backend API. And he's thinking: I didn't put my home address in here, what's going on? What we determined the application was doing was using GPS coordinates to narrow down the actual home address, then taking that and uploading it to the backend server. So they were collecting, harvesting, the home addresses of these people. And, you know, is that a vulnerability? It's certainly private information, but did anyone read the EULA? Who knows; it's that sort of iffy situation. Did they mean to do that? It's very possible they didn't; they may have been using GPS for something else and never meant to collect that information. So that one was a little bit shady, and even if the EULA covered it, or they intended it, it's probably not a great idea, so we did flag it as an issue.
So, like I was explaining, it's difficult to do this testing. You have to physically use the app, you have to man-in-the-middle it, you have to pen test the back end. I was thinking: how can we make this easier? Because it takes a lot of time, and like I said, people are under pressure: hey man, let's ship this. So I thought: well, Android is somewhat open source; why don't I make my own OS? What I did was take the source code of a version of Android and modify several of the files in there around things like network communications, java.net for instance. For anything that happens within that library, I said: take that traffic and also send it over here, to a little monitor I built, so I can see what happens. That way, when you're using the app, it's spitting out everything before it gets encrypted on its way to the backend server, so we can map out the back end even if we don't have a definition of it. It also captured storage: any file writes would get spit out too, so we could see what was being stored and whether it was encrypted. So I created this thing and got it all going.
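The core trick in that modified OS, wrapping the platform's networking classes so every write is mirrored to a monitor before encryption happens, can be sketched in a few lines. This is a Python stand-in for the idea behind the patched java.net code, not the actual Shadow OS source:

```python
import io

class TeeStream:
    """Wrap a writable stream and copy every write to a monitor,
    the way patched networking classes can mirror traffic before
    it gets encrypted on its way to the back end."""

    def __init__(self, inner, monitor):
        self.inner = inner      # the "real" network stream
        self.monitor = monitor  # a list standing in for the monitor UI

    def write(self, data):
        self.monitor.append(data)      # the tester's copy, still cleartext
        return self.inner.write(data)  # then the normal send proceeds

monitor = []
net = TeeStream(io.BytesIO(), monitor)
net.write(b"POST /api/login user=alice&pass=hunter2")
# The app's request still goes out unchanged, but the tester sees it too.
```

The same tee pattern applies to file writes: wrap the write call, record a copy, then pass the data through untouched so the app behaves normally.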
It seemed like it was kind of working, so I figured: let's really put it to the test. It ran on an emulator; you just installed this thing (I called it Shadow OS) in the emulator, and then you could drop any app on it and see exactly what it was doing as you exercised it. So I put it on there, took a popular, let's say social networking company's, app, dumped it on, and started exercising it. Immediately I see this huge hunk of data, this big mess, go out over the network. I'm like: what did I break? What is this all about? I start looking at it, and I'm seeing crazy stuff in there: rear camera true, front camera false, key guard type, Wi-Fi enabled, screen brightness, how much disk space is free on the device. All of this was just being harvested off the device and sent up. And again: did you get permission for that? Questionable. I was thinking: why would this even be important? Anyone want to take a shot at why somebody would want this information about your device? Fingerprinting? That's a good thought; it could be that, and I don't know the exact answer, but I have a hunch, and that's one option. The direction I went down is: if my free disk space is low, all of a sudden, within my feed, here comes an advertisement. Hey, need a new SD card? Being able to target you while you're using your device in certain situations: targeted advertising. And I'm sure companies sell this data to other folks all the time. I just thought it was an interesting way to go about
that.

Also on the network: Bose headphones. They were outed as well. Nowadays you can't use your headphones without a mobile app; you've got to have that. (Yeah, that's a different story, I won't get into it.) What Bose was doing was collecting your listening habits: taking what you were listening to. That doesn't sound terrible, but when I saw this quote, it really stuck with me. The complaint accuses Boston-based Bose of violating the Wiretap Act and a variety of state privacy laws, adding that a person's audio history can provide a window into a person's life and views. Based on what I'm listening to, people can make assumptions: again, targeted advertising, or who knows what they want with it. I thought that was kind of interesting. Me, I like big-hair rock music; I think the only thing they'd get out of that is that I have horrible taste in music.

The Starbucks mobile app. This made a big splash; it's pretty old now. Their application was also HTML5-based, and they were using Crashlytics, just another one of the hundreds of third-party analytics and data-capture services. What they had done was: they
were capturing screens, because they wanted to track which screens people were going to, and for whatever reason they were actually uploading the HTML content to Crashlytics along with that data. The problem is that they were uploading it after the user typed in their username and password. So Crashlytics ended up holding everybody's Starbucks username and password. Again, that's an example of third-party data leakage. I thought the quote that came out was interesting: when reached Wednesday, Crashlytics, a Boston-based firm that specializes in crash reporting solutions, couldn't comment on specific customers, but they did reiterate that the firm doesn't recommend developers log sensitive information. What's the fix? Don't do
that.

Client-side logging. Anyone can actually do this with a mobile app: if you have Xcode installed, you can open up the developer console (they keep changing the name, so I'm not sure what it's called right now), plug in your device, open the console, and you'll see all kinds of information flying through there, a lot of it about the OS itself. And for developers writing code: you do a log.debug of a certain thing, and that's where it shows up. A lot of times developers don't remove those log messages. We saw examples where we could see usernames and passwords being thrown out into the device log. That's probably a developer at the time saying: you know what, my login isn't working, let me just output some log information so I can see what's happening. And then they don't remove it before uploading to the App Store or Google Play. So logging is another area to look at.
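One mitigation for the credentials-in-the-log problem, besides stripping debug logging from release builds entirely, is to scrub known-sensitive fields before anything reaches the device log. A minimal sketch; the field names are assumptions, not from any particular app:

```python
import re

SENSITIVE_KEYS = ("password", "passwd", "pin", "token", "secret")

def scrub(message: str) -> str:
    """Mask the values of known-sensitive key=value pairs before logging."""
    pattern = r"(?i)\b(" + "|".join(SENSITIVE_KEYS) + r")=[^\s&]+"
    return re.sub(pattern, r"\1=***", message)

print(scrub("login failed: user=alice password=hunter2"))
# -> login failed: user=alice password=***
```

A denylist like this is a backstop, not a guarantee; the safer habit is never passing secrets to the logger in the first place.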
In this case, we were handed a banking application. This was a while ago, kind of when banking applications were really starting to hit the market, and mobile deposit was a new feature. Anyone do mobile deposits with their phone? What we found was that when you took a picture of the check, it was storing the image to the global camera roll of the device, which all applications have access to. It wasn't in the sandbox, where that data should be stored. So we flagged that. The bank had handed it to us saying, "we're in a hurry, we need to get this out." Okay, give us some time, we'll test. We reported it and said: look, don't ship this; you've got a critical vulnerability, we think you should fix it. We sent it back. They come back a day or two later and say: we fixed the issue, we're all good now, and by the way, we just pushed it up to the App Store. Okay... we'll check and make sure it's fixed. So we take it, we look at it, and they did fix that problem, when you make the deposit. But we found they had another feature where you could go through everything that had gone through your account by swiping through the images, and as you swipe, it's writing each one of those back to the global camera roll. Everything that's been through your account. We go back, and no: they ended up having to pull it, go back through review, and start all over again. I just cannot stress enough how much pressure there is. I know, we're all hardcore security guys, hey, we know what's best, but developers are under pressure to push features out, and product managers are pressured by executives to get these things out. Security, in this case, was just: look, we pushed it up, we're hoping for the best. Those are the kinds of pressures developers are under, and I think most of us have an understanding of that. We see dumb stuff all the time, but there's usually a reason.
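That check-image bug boils down to a one-line choice: write into shared storage that every app can read, or into the app's private sandbox. A language-neutral sketch of the decision in Python; the paths are illustrative stand-ins, not the real Android (Context.getFilesDir()) or iOS (app container) APIs:

```python
import os
import tempfile

ROOT = tempfile.mkdtemp()  # stand-in for the device filesystem

# Shared storage: every app on the device can read this (the mistake).
SHARED_CAMERA_ROLL = os.path.join(ROOT, "DCIM")
# App sandbox: private to this one application (the fix).
APP_SANDBOX = os.path.join(ROOT, "data", "com.example.bank", "files")

def save_check_image(data: bytes, use_sandbox: bool) -> str:
    """Persist a captured check image; sensitive images belong in the sandbox."""
    target = APP_SANDBOX if use_sandbox else SHARED_CAMERA_ROLL
    os.makedirs(target, exist_ok=True)
    path = os.path.join(target, "check.png")
    with open(path, "wb") as f:
        f.write(data)
    return path

path = save_check_image(b"\x89PNG...", use_sandbox=True)
print("com.example.bank" in path)  # True: the image stays app-private
```

Encrypting the image before writing it adds a second layer, so even a sandbox escape or a device backup doesn't expose the raw check.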
Right: client-side debug screens. This one's really interesting. A lot of times there will be a debug screen embedded in an application that you never see. In the apps you download from the store, the screen is in there, but it's not enabled; usually it's a build flag that enables it, or it shows up if you have admin credentials or certain types of credentials. But there are tools that let you modify memory on the device and set different flags. If the code isn't obfuscated, you can see some of these variable names, play with them, and tweak them. These are some of the examples we saw, screens normal people should never see. I know it's hard to read, but let's see: there's a flag there for "disable SSL certificate pinning." Fantastic. We have "upload log file"; in this particular application we looked at it, and it wasn't just uploading a log file. You could use that endpoint to upload anything you wanted. It didn't even validate that the file was coming from the application itself; it was just a wide-open server endpoint for uploading any files you want. This example over here lets you pick your server: it has production, but it also has staging and Canary. As an attacker, that just gave me two more assets, two more targets I can go hit that are probably less secure on the back end, where I can hopefully gain access into the network or hit it with SQL injection, because surely it's not as fortified as the production server. So again, it's just a way for a pen tester, or a bad guy, to harvest more information and figure out more ways to steal data or get into the servers.
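A safer pattern than shipping a debug screen behind a runtime flag (which memory-tampering tools can flip) is to compile it out of release builds entirely, so there is nothing to enable. Sketched here with a constant standing in for a compile-time flag such as Android's BuildConfig.DEBUG; the screen names are illustrative:

```python
DEBUG_BUILD = False  # stand-in for a compile-time build flag

def available_screens(debug=DEBUG_BUILD):
    """Screens the app can ever show; debug tooling exists only in dev builds."""
    screens = ["login", "account", "deposit"]
    if debug:
        # Server picker, pinning toggle, log upload: never in a release build.
        screens += ["choose-server", "disable-pinning", "upload-logs"]
    return screens

print(available_screens())            # release: user-facing screens only
print(available_screens(debug=True))  # dev build: tooling is present
```

In a compiled app the guarded code is stripped by the build system, so there is no in-memory flag left for an attacker to discover and flip; Python can only model the decision, not the stripping.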
servers uh this one was one of my favorites so uh we tested an application and it was so so secure that it used voice recognition to log into the app so what you would do is you start up the app and when you set it up you would say anything you wanted uh to log in you know I could say asparagus and if that's where I recorded it worked and we would hand it around to other people other pentesters you know we'd say carrots or whatever right and it worked it was really good no one else even if someone else said asparagus it didn't work okay so it was pretty good voice recognition one of the techniques for uh
pent Testing mobile apps is to take a look at the directory where the files store right so if you're not familiar that's where pist is stored um all of your uh files that your application needs will be put in a folder just like a normal PC for that to work and what you do is you take a snapshot of that and say okay here's what the directory looks like now maybe there's 10 files in there now I open the app I do a spare right and I exercise the application I do things I'm trying it out great now I go back and look at the directory again and I see 12 files okay cool now that's
interesting to me what are those two files and uh one of them made no sense and it was something like zebra doxyz made no sense okay well whatever all right let's let's throw it in the notepad and just see what happens right or or a hex editor and see and it looks like garbage complete binary stuff like okay not sure but as we look closer near the top we see things like genre year artist anyone Voice MP3 so what we did is we took that file pulled it off onto the computer renamed it to MP3 held the phone up hit play asparagus it opens up so an example of security through obscurity right that hey I'm just going
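That before/after snapshot technique, plus sniffing a file by its magic bytes rather than trusting its made-up extension, is easy to script. A sketch of both steps; the "ID3" check matches the metadata tag most MP3s carry, which is exactly the genre/year/artist block that gave the renamed voice file away:

```python
import os

def snapshot(directory):
    """Set of all files currently under the app's data directory."""
    return {
        os.path.join(root, name)
        for root, _dirs, files in os.walk(directory)
        for name in files
    }

def sniff(path):
    """Guess a file's real type from magic bytes, ignoring its name."""
    with open(path, "rb") as f:
        head = f.read(8)
    if head.startswith(b"ID3"):
        return "mp3"  # MP3 with an ID3 tag: genre, year, artist...
    if head.startswith(b"\x89PNG"):
        return "png"
    return "unknown"

# Usage during a pen test:
#   before = snapshot(app_dir)
#   ...exercise the app (e.g. record the voice login)...
#   for new_file in snapshot(app_dir) - before:
#       print(new_file, sniff(new_file))
```

The set difference isolates exactly the files the exercised feature created, so you examine two files instead of the whole directory.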
So that one was pretty clever. Resources: OWASP is a great resource here. They have the Mobile Application Security Testing Guide, and they also have several deliberately vulnerable mobile applications you can use, think WebGoat or Juice Shop but as mobile apps for iOS and Android, so you can actually try this yourself and walk through the guide to see all these different ways to find vulnerabilities in mobile applications. They also have the OWASP Mobile Top 10: the top 10 vulnerability categories for mobile applications. So they're a great resource as well. Okay, with that, I'll take some questions. If I don't know the answer, I'll pretend I didn't understand your language, and we'll see what happens. Any questions? (I'll repeat the question.) Yes, absolutely. We have a
question. Yes. So, he was asking about the account enumeration example. Like I said, it exposed date of birth, username, and password. And actually, it wasn't plain text; it went over SSL, so it was "secure." But at the endpoint, where we sit, the data comes back to you in clear text; it only traveled over an SSL connection. It's that classic thinking, "it's over SSL, it's safe," when all you've really done is hide it from the network devices in between while all that information gets stolen.

Next question: we talked about logging vulnerabilities, and on the server side one of the vulnerabilities often discussed is log injection. When you're logging locally on the device, how bad is log injection? So, regarding that logging issue I talked about: for a nefarious person it's not that bad, because it's on the device, so you have to have the device to see that information. As a pen tester, though, it's incredibly useful to us to see the information being spit out in those logs. But to use it for bad stuff, the risk level is pretty low, because you actually have to get physical access to that device.
Hopefully that kind of answers your question. And I guess the follow-up is: where are those logs stored on the device, and what's the probability that another app would have access to them? Exactly, so: other applications. A good example is Angry Birds. How many fake Angry Birds applications were created? They were typically made in China, and when you jailbreak or root your device, all bets are off. I can sideload applications onto it, my free Angry Birds app, but in reality what it's doing is going through those logs on the device, or going through your global camera roll, stealing all that information and uploading it to the bad guys' server. So that's an example of where that could happen.

Yes? So, which one's more secure, Android or iOS? None of them. I would say, though: I had a mobile application of my own that was fairly popular, Minecraft-related, and I would upload it to both Android and iOS. I think Apple did a better job of making sure the app was somewhat secure. I know they did things like blacklist testing, to make sure any URLs embedded within your app weren't blacklisted, and it would usually take a week for the review. I don't know if they were actually working on it the whole time or what was going on. I do know that people actually used it: I would get emailed by their assessment team saying, "hey, we tried your username and password and it didn't work." So someone on Apple's team was actually trying to use the app, or I was getting phished. On Android, it was usually up within an hour; the app's just ready, you know? So I don't know if that means one is better or not, but that's the one takeaway I can really give you. I think there's, yeah, one way in the
back. Yeah, a question about facial recognition. I don't have an answer for that; hopefully someone does, and I hope it's secure, because I use it all the time. I hate typing in passwords and trying to remember them. But I don't have any information on that.

Yes, about social media: in your experience, which apps collect the most? I don't know; I don't have data on which applications collect the most intrusive data, by classification or industry. I think it's completely up to each application, each organization shipping those apps, as to what they want to collect off the device. Every app is just different; I've seen everything from a whole lot to nothing. So I'm not sure there's really a category; it's up to the vendor.

Yeah, that's a great point about protecting yourself. I do media inquiries all the time through Synopsys; we'll get hit up when a zero-day comes out, like last week on iOS, where a malicious image or PDF causes command execution on your device and opens you up to arbitrary command execution. And the question is always: okay, so what do users do? Don't use your device? I mean, what can you do? If it's in the OS, or if it's a developer mistake, most of the time you don't even know. So there's not a lot a user can do. I know there are scanners that look for malicious Angry-Birds-style apps, more so on Android, but a lot of times those will be malicious too, so I don't know if I can even recommend that. How's my time? A couple of minutes. Anything else?
Yes. Yeah, so, I moved out of the pen testing group back into the DAST world, but while I was doing that, the intent was to open source it, and it is on GitHub, publicly available, under my handle, vbisbest. That's usually the first question: is your handle really vbisbest? Yes, it is. So it is up there, and I have some documentation about what it does, and a couple of modified Android files. I'm sure they're out of date now; I can't remember which Android version it was. I would like to get back to it, though, and try to get it working again. So that is up there, called Shadow OS. Hopefully I can get back to that, because it was a really cool tool: you just start exercising the app, and the little monitor shows you, hey, this file got written; hey, this request just went out, even over SSL, and shows you the request and the response. And there was something else it did, I can't remember... oh, I think it was watching the logs getting spit out from the specific application that you're testing. So hopefully I'll get back to it.
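That "this file got written" monitoring can be sketched with an ordinary directory watcher. This is only an illustration of the idea, not the actual Shadow OS code; the directory and file names are made up.

```java
import java.nio.file.*;
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Minimal sketch: watch a directory the app writes to and report every
// file that gets created or modified there.
public class FileWriteMonitor {

    // Blocks until one batch of events arrives, returns the file names seen.
    public static List<String> watchOnce(Path dir) throws Exception {
        Set<String> seen = new LinkedHashSet<>(); // dedupe CREATE + MODIFY pairs
        try (WatchService watcher = dir.getFileSystem().newWatchService()) {
            dir.register(watcher,
                    StandardWatchEventKinds.ENTRY_CREATE,
                    StandardWatchEventKinds.ENTRY_MODIFY);
            WatchKey key = watcher.take(); // wait for the app to write something
            for (WatchEvent<?> event : key.pollEvents()) {
                seen.add(event.context().toString());
            }
            key.reset();
        }
        return new ArrayList<>(seen);
    }

    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("appdata");
        // Simulate the app under test writing a file from another thread.
        new Thread(() -> {
            try {
                Thread.sleep(200);
                Files.write(dir.resolve("session.log"), "token=abc123".getBytes());
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }).start();
        for (String name : watchOnce(dir)) {
            System.out.println("file written: " + name);
        }
    }
}
```

On a device the interesting directories would be the app's private data folders, which is why a tool like this needs a modified or rooted OS to see them.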
Yeah, so, one of the things we talked about was SSL certificate pinning. If the developer doesn't do that, you can simply proxy your device through a proxy, Burp if you will, and just watch the traffic go through. That's all you have to do. You might be thinking, well, why don't developers pin the certificates? Why don't they fingerprint them, make sure the thing they're talking to is really what they're supposed to be talking to? And the challenge there is usually testing. You have a QA department, you have a QA lab, and they don't want to buy a certificate for that; they want to use self-signed certs or whatever. But to do that, you need to disable pinning in the application: in the validation callback, something like onCertificateValidation for example, just return true. Hey, it's always good, no problem. And then the testers are all happy, ah, thank you, that's great, and then they ship it to the app store without undoing that. Ideally that should be a build flag, so a debug build allows it, but when we do a release build, certificate pinning is re-enabled to prevent that. But a lot of the time it just slips through and they don't catch it.
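A rough sketch of that build-flag idea, with an illustrative trust check. DEBUG_BUILD, PINNED_SHA256, and isCertificateTrusted are made-up names; on Android the flag would typically be BuildConfig.DEBUG, and the input would be the server certificate's DER bytes.

```java
import java.security.MessageDigest;

// Illustrative pinning check: the trust-all shortcut exists only when the
// debug flag is set, so release builds always enforce the pinned fingerprint.
public class PinningCheck {
    // Flipped by the build system; hardcoded here for the sketch.
    static final boolean DEBUG_BUILD = false;
    // SHA-256 fingerprint of the expected server certificate
    // (here just the hash of the placeholder bytes "good-cert").
    static final String PINNED_SHA256 = sha256Hex("good-cert".getBytes());

    static boolean isCertificateTrusted(byte[] certDer) {
        if (DEBUG_BUILD) {
            return true; // QA lab with self-signed certs: skip the pin
        }
        return PINNED_SHA256.equals(sha256Hex(certDer));
    }

    static String sha256Hex(byte[] data) {
        try {
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(data);
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (java.security.NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(isCertificateTrusted("good-cert".getBytes()));   // matches the pin
        System.out.println(isCertificateTrusted("self-signed".getBytes())); // rejected
    }
}
```

On Android the same idea is usually handled declaratively, with the network security config's debug-overrides or OkHttp's CertificatePinner, so the relaxation lives in the build configuration rather than in an always-return-true callback that someone has to remember to remove.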
So I guess I'll end it there. I'm here all day, and hopefully at the afterparty. I love talking DAST, I love talking pen testing, so please come up; I like to meet everyone. Everyone here in Belfast has been amazingly friendly, and I've had a great time, so thank you all.