
Hello everybody, let me present Tony and Tushar.

All right, all right, people. Our presentation is on the topic of mobile SSL failures. We're really excited to be here at BSides, and this is my first talk ever at a security conference, so thank you all for the opportunity. Our plan is to showcase, in the next 45 minutes or so, some of the systemic issues we found in popular mobile applications and operating systems. We'll also be presenting a new technique for achieving almost undetectable and persistent man-in-the-middle capabilities in certain iOS and Android applications.

A little background: I'm Tushar Dalvi, and I have with me Tony Trummer. We're both senior security engineers at LinkedIn, primarily responsible for vulnerability research and penetration testing there, though like most security folks we spend most of our time herding cats. Previously I worked at McAfee as a security consultant, and I got my master's in information security from Johns Hopkins. Tony, on the other hand, is a serial dropout: he comes from a military and networking background and has almost 20 years of experience in IT. And just to be clear, this presentation is purely our side-project research, so our opinions do not necessarily reflect the views of our employer.

All right. The premise of this presentation is that mobile
applications have come a long way from what they used to be, and they offer much more control to developers to define exact behavior. In most cases an application will be written for several different platforms: Android, iOS, Windows Phone, and so on. Of all the problem apps we found, most were on Android; we found a few on iOS, but surprisingly none on the Windows platform. Since each of these platforms can do the exact same thing in different ways, developers need a clear understanding of the security implications of that flexibility, and that's especially important when it comes to security controls such as SSL.

Speaking of SSL: since most of us are familiar with the concept, let's recap the basics in, say, one slide. SSL provides two key benefits. First is secrecy, the encryption of your data; second is authenticity, which guarantees the identity of the entity you're trying to connect to. Typically, in the event of an SSL handshake failure, a browser will show a warning saying this is probably not the site you're trying to visit, so you get the gist of it. But in mobile applications this safeguard is left to the mercy of the developers, and that has some security implications, as we'll see. And if you think about it, SSL is the only real protection we have right now against man-in-the-middle attacks, which are significantly easier to carry out against mobile devices. We have reasons to believe so: most users' cell phone data plans are not unlimited, and connectivity speed is significantly
slower over cellular networks compared to open Wi-Fi, so users tend to hop onto any open Wi-Fi network they find and stay connected for as long as possible. Cell phone providers actually encourage this to save bandwidth, and it's the reason your iPhone automatically connects to anything called "attwifi". Previous research has shown that even without you actively trying to connect to an open Wi-Fi network, the device itself will probe for any previously known SSIDs. And if you're aware of the Snoopy framework, it's designed specifically to look for those Wi-Fi probes from devices trying to connect to open networks, and then to imitate the requested network so the device connects to it. We'll show why, if an attacker can lure a victim onto the same network as them, or worse, onto a network they control, it's essentially game over. Add to this the fact that researchers recently claimed to have cracked WPA2 encryption.

Many of you may be thinking: what are the chances of such a large-scale man-in-the-middle attack actually taking place? Well, as a matter of fact, Nokia was found to be doing exactly this in 2013. They reportedly did it for performance reasons, but just think about that for a minute: one of the leading device manufacturers had the capability, and was actively intercepting and decrypting, their customers' traffic. And let's not forget that, according to allegedly leaked documents, the NSA and GCHQ were performing similar man-in-the-middle snooping on SSL traffic.

So the real question is: how do companies show their customers that they really care about SSL? It turns out many of them just put up those padlock logos on their websites without actually considering what they're promising their customers. Naive consumers, on the other hand, have a reasonable expectation that these logos signify their data is immune to eavesdropping. We're not lawyers, but it turns out that stating you use SSL on your communications in this way, even as part of a disclaimer, is legally binding, which we'll discuss later. And encryption is just a part of security, not the end of it.

So one night, after a few drinks, we decided to start hacking on some mobile applications, and as usual we started by examining the traffic that flows in and out of the device, used by
the applications. And then we noticed some very strange behavior. Tony will now discuss the details of what we found.

Thanks. Hello everybody, I'm Tony; hopefully you're enjoying yourselves at BSides so far. Before getting into exactly what we found, we wanted to make sure everybody is on the same page, since not everybody here is necessarily a deeply technical person. In general, for SSL to validate the identity of the remote side of a conversation, it checks that the certificate it receives is cryptographically signed with a private key corresponding to a public key it already trusts: a certificate authority's public key. CAs like VeriSign, DigiCert, GoDaddy and the rest are generally included in these trust stores automatically. Obviously the entire trust model falls apart if any random attacker could spin up a CA, create a certificate for www.google.com, and have the victim's browser or app accept it.

Before proxying mobile SSL traffic, as anybody who's done this knows, you have to install the proxy's CA certificate onto your device so the device will trust it, since it's generally not already trusted. For example, we use Burp Suite all the time, and in that case it would be the PortSwigger CA.

Having established those facts: what we initially found was that we were still seeing traffic in our proxies despite not having installed the PortSwigger CA on our device, so something was wrong. It turns out the reason we were still seeing SSL traffic was that, during development, validation of certificate authorities had been disabled. Unfortunately this is a common practice. It may have been done because the developers didn't fully understand what they were doing, or simply because they disabled it for testing, wanting to proxy their requests and being too lazy to install the CA certificate on their devices or emulators.
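To make the anti-pattern concrete, here is a minimal sketch using Python's standard ssl module (the vulnerable apps were written in Java and Objective-C, but the idea translates directly): a properly configured client context verifies the CA chain and the hostname, while the "disabled for testing" configuration silently accepts any certificate, including one minted by a proxy or an attacker.

```python
import ssl

# What a client should ship with: create_default_context() loads the
# platform trust store, requires a valid CA chain, and checks hostnames.
secure = ssl.create_default_context()
print(secure.verify_mode == ssl.CERT_REQUIRED)  # True
print(secure.check_hostname)                    # True

# The anti-pattern described above, translated to Python:
# certificate validation switched off "just for testing".
insecure = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
insecure.check_hostname = False       # must be disabled before CERT_NONE
insecure.verify_mode = ssl.CERT_NONE  # any certificate is now accepted
```

With the `insecure` context, a TLS handshake with any server, including a Burp proxy presenting its PortSwigger-signed certificate, completes without complaint, which is exactly why SSL traffic kept showing up in our proxies.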
And then they just forgot to turn it back on. Since you can simply install the certificate on your device or emulator, there's no real reason to do this in code; we just think it's an example of "just because you can do something in code doesn't mean you should."

To be clear, for anybody who didn't follow: at this point an attacker could simply have gone down to the local coffee shop, or any place with open Wi-Fi, joined the network, and gained a man-in-the-middle position via DNS cache poisoning, ARP spoofing, something along those lines. Alternatively, they could have gone anywhere and set up their own hotspot with a common SSID, and devices may have connected to it. They might even have set up their own femtocell and just waited for victims to connect. There's plenty of gear for this; there's even a drone now that will fly around, follow you, and broadcast SSIDs at you, so they could potentially just hover above you, owning you all day long.

At this point, basically, an attacker on the same network, or with a man-in-the-middle position, would be able to intercept and decrypt all of the traffic for all of these apps. Tushar and I were probably thinking, wow,
we're really cool, we're the next-generation elite hackers. We'd found something really cool, and then our boss was like, no. There's a paper that came out in 2012 called "The Most Dangerous Code in the World"; if you haven't read it, you should. Essentially it talks about failures of SSL certificate validation across many platforms. Anyway, not being discouraged, we decided to push on and see just how common these issues were.

First, a disclaimer: we'll try to avoid naming companies by name, simply because some of them didn't validate our findings, or never even responded to us. We did do our due diligence in making sure these were the actual apps, we did contact the companies, and we validated our results, etc. But there is a phenomenon in the app stores where some people just publish fake apps: they'll put up a logo and say, hey, it's bob@gmail.com and I published the Gmail app, and some people will download it. So it's always possible we got an errant result. We didn't actually find that, but just in case, we don't want to get sued.

When we selected the apps, we obviously tried to focus on the top apps by download count, and we decided to focus on the ones where interception yielded passwords, significant session tokens, credit cards, or some other piece of sensitive PII, not just "oh, I can change the page you looked at" or something stupid like that. So unless we specifically state what could be pilfered, we're just saying there was an issue with the app, partly because we don't have time to go through each case and mostly because we don't want to get sued if we misstate anything. In total, we found about 10% of the top apps were
vulnerable to this problem, or I should say the problems we outline here, in some immediately meaningful way: you could do something really bad inside the app, not just shenanigans. We didn't necessarily include all the apps we found, simply because if you're using bad SSL but not doing anything sensitive, sure, I could stuff some JavaScript into the responses or send you a malicious redirect or something silly like that, but that's inherent to plain HTTP traffic as well, so it didn't really seem noteworthy.

Moving on: you might be thinking these aren't really the top apps, and clearly some of these companies aren't very big, or maybe they're big but don't have a significant IT-security presence, something along those lines. But again, this is just a taste. As with Tribbles, bad coding practices just seem to multiply uncontrollably. Unfortunately, while Windows Phone may have fared well in the apps published for that platform, Microsoft's own apps for other platforms didn't do so hot. Don't get me wrong, I'm not a Microsoft basher, but they were probably the top offender we found. It sort of makes sense if you think about the number of apps they have, and any place with a developer-driven culture tends to let the inmates run the asylum.

Another noteworthy thing up here: Google Cloud Messaging. It's really a service, one used by nearly every Android app in the world. It's basically how an app registers with Google to receive push notifications. It's only half of the equation, because you generally still have to register with the app developer's server to get the actual
messages. I'm also told it's used by all the Chrome extensions too, so the bug might actually have been in a Chrome library somewhere, but we didn't dive deep into that. Since we didn't figure out how to exploit this in any meaningful way, we just figured it was noteworthy and included it.

Taking a look at the other apps we found, we thought to ourselves: you wouldn't really have had any problem, unless you went and bought some shoes, or some books, or electronics; or went to a store using one of these credit card swipers; or bought movies; or went to your bank or managed your investment portfolio; or accessed your home security system, your networking vendor's support site, your help desk, or the largest payroll provider in the country; or used one of the numerous vulnerable two-factor authenticators; or made a conference call; or accessed your ISP account or your corporate VPN; or logged into your Microsoft account with either Bing or Outlook. I'm not sure why you'd need to log into Bing, but it's there.

And just because I like these guys, a shout-out to the Zappos staff, who were one of the few teams who were like, "oh, we know exactly what's wrong." We didn't have to hand-hold them through the process and explain what was going on; they were just, boom, got it. And we got free shoes.

Moving on: if it wasn't clear how to test for this, you know, call Scotty and he'll tell you. We'll be posting this; we put up seco.com, so don't worry if you can't write this down. Basically, you can just use Burp Suite out of the box with the default proxy settings. It generates a dynamic certificate, signed by its CA, to match whatever hostname you're trying to reach. You just configure the device to proxy through it and step through the app; if you see SSL traffic, there's a problem. That's provided you did not install the PortSwigger CA certificate on the device first; either way, you want to double-check that you see a certificate warning when you visit an SSL-protected site in the native browser, just to make sure the device would actually warn you if something were wrong.

Of course, after looking at these, we wanted to see what else could
go wrong, so we just stepped down the certificate validation chain: what would you check next? The second aspect of validating certificates is basically: does the subject common name, or a subject alternative name, match the hostname of the site I'm trying to contact? Again, the entire trust model falls apart if anybody could go to, say, DigiCert, get a certificate for f.com, and have your app accept it even though it's trying to reach google.com or whatever. The example we gave was: if a certificate for nsa.gov is accepted when you're trying to visit torproject.org, there are probably going to be bad, bad times ahead for somebody.

So while these apps all correctly validated that the certificate authority was someone they trusted, they never bothered to check that the name on the certificate they received actually matched who they were trying to talk to. We went and got a free certificate from StartSSL.com for our own domain, tmitm.com, and basically just offered that certificate to all the apps and asked, who still wants to talk to us? To be clear, the StartSSL.com CA is already trusted out of the box on both iOS and Android, so we didn't have to install any certificate first, which we thought made the testing purer, easier to understand, and more valid. There is a configuration in Burp that would let you do basically the same thing, but I didn't want to mess with that; I thought this was a cleaner example.

Again, looking at the apps, we see several significant financial applications; tax software; the leading blog software's admin app; a domain registrar and, ironically, an SSL certificate authority; an ISP; some security software; a cable television company; one of the biggest Chinese internet companies; and the California DMV, which sucks for all of us who live in
California. Note that if you see an app on multiple slides, it's because it had multiple problems, or was vulnerable to different things on different platforms; we don't want to waste time on "well, this was iOS and this was Android," etc.

Oracle wanted us to let you know... we got a lot of "please say how well we did at fixing this, and how quickly, and if you have to include us, tell everybody how awesome we are," and we basically told them all to go away. But Oracle told us they chose the nuclear option and just got rid of the app: it's just gone from the store. And the folks at Citrix wanted us to tell you that the Citrix Receiver app was some sort of demo, and in their opinion no customer data was ever at risk. Okay, I have no problem with that.

We also had some interesting experiences, namely with AT&T. We submitted this information to them, and about four months later they came back and said, yeah, seems fine. I said okay, well, at least for the first one, the myAT&T app, you say it's fine, I'm going to go check it and do my due diligence. I checked it, and I happened to notice it had been updated the Friday prior to the Monday they contacted me. And I was like, oh, okay, I see what you did there. So yeah, it's fixed, I'm sure; I'm just going to exclude all their apps. I found some replacements, so no big deal. And then Yahoo. I don't want to bash them as badly, but it was a similar kind of experience: I told them, and four months later they got back to me saying it looks fine now. So, okay, found someone else.
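The check these apps skipped can be sketched in a few lines. Here's a deliberately simplified illustration in Python (the function name and rules are our own; real clients must follow the full RFC 6125 matching procedure): given the names on the presented certificate, does any of them match the host we meant to reach, honoring wildcards only in the leftmost label?

```python
def hostname_matches(cert_names, host):
    """Simplified certificate name check: cert_names are the CN/SAN
    entries from the presented certificate, host is who we intended
    to contact. A wildcard covers exactly one leftmost label."""
    host = host.lower()
    for name in cert_names:
        name = name.lower()
        if name.startswith("*."):
            # "*.example.com" matches "api.example.com" but not
            # "a.b.example.com" and not the bare "example.com"
            if host.count(".") == name.count(".") and host.endswith(name[1:]):
                return True
        elif name == host:
            return True
    return False
```

An app that omits this step will happily talk to any holder of any valid certificate, which is exactly why our StartSSL certificate for an unrelated domain was accepted.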
So yeah, we had some funny trials with some of these. As you can imagine, a lot of the people we contacted didn't want to be in any kind of presentation about bad security practices. Anyway, stepping through this again, I've pretty much explained it already: obtain a valid certificate for any domain, use the custom-certificate option in Burp and plug it in there, then configure the device to go through your proxy. If you're still seeing SSL traffic as you step through the app, something's wrong. Again, I'd caution you to make sure you see a certificate warning in the mobile browser, just to confirm everything fails the way you expect; there's always a chance a mobile browser could be broken too, but luckily we didn't see that.

Oh, I'm skipping ahead, sorry. When we were doing this, probably around February or March as we were wrapping up, that was about the time the "goto fail" bug came out for iOS, and we were like, well, you could man-in-the-middle everything on iOS, and had been able to for years, so what difference does it make if you can man-in-the-middle a certain app? Rather than trying to explain to people "it's not that, it's this," which we figured would be a pretty hard sell, we thought, forget it, we'll just leave iOS alone. And then an Ars Technica article came out, and it looked like some people might be doing research similar to this, so we said, well, we're not going to keep going down this road and duplicate efforts. They obviously decided to go to the media with it, where we thought more responsible disclosure was the better path, and I didn't want to get into a flame war with anyone, so we decided
to take a different approach.

Okay. We did, however, find another way in which some apps fail to use SSL correctly, and that was by not using it at all, at least not on important requests. This included leaking session tokens from Quora, a popular information-sharing application for luminaries or something like that. The entire registration process for the Cisco WebEx app, including password creation, wasn't encrypted. The usernames and passwords for Angie's List Business Center, which is probably no big deal since it had something like 5,000 downloads, weren't encrypted either. And there's an application called Rockbot, which we use at our work; it's basically a digital jukebox where you can buy credits, using a credit card, to vote up your own songs. Those credit card numbers were also sent unencrypted.

But then we found one we thought was particularly interesting, and we consider it Pwnie-worthy. First of all, we hope you enjoy the animations on this slide; they cost several thousand dollars, so we want to make sure we get our money's worth. Obviously, for an app to be Pwnie-worthy, you have to leak credit cards, and here we had unencrypted credit card information. The difference is that this company is a Tier 1 PCI merchant, so we're talking about scale: Tier 1 status means you process lots and lots and lots of credit cards. I'm pretty sure the last time I looked at a PCI checklist there was something in there about encrypting with SSL; I'm not 100% sure, but I'm pretty confident. Their installed base was more than 10 million users, so this was probably bad news for a lot of people, and as you'll see, it's probably an application most of us use. I know I do.
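The failure mode in this section is simpler than the previous two: sensitive fields POSTed to plain http:// URLs. A trivial guard in the client, sketched here in Python with made-up field names (our own construction, not any of these apps' actual code), would have caught every one of these before release:

```python
from urllib.parse import urlsplit

# Illustrative field names -- a real app would tag these at the source.
SENSITIVE_FIELDS = {"password", "card_number", "session_token"}

def checked_post_url(url, payload):
    """Raise rather than let anything sensitive leave over cleartext."""
    if urlsplit(url).scheme != "https" and SENSITIVE_FIELDS & set(payload):
        raise ValueError("refusing to send sensitive fields over " + url)
    return url
```

A check like this in the network layer, or an equivalent release-build assertion, turns "forgot the S in https" from a silent leak into a crash during QA.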
So who was it, exactly, that forgot that HTTPS ends with an S? Redbox. You probably all know them: the little video-rental kiosks at all the 7-Elevens and supermarkets, so they can milk people a dollar at a time. Anyway, just to show you, and I know the slide is blurry as hell, you can clearly see here there's a POST to an API account save-card endpoint going to http://dd.redbox.com, and I'm highlighting the full 16-digit credit card number. It's not my real one, it's a fake number, so don't try to steal it off the slide.

We thought this was a pretty good representative sample of all the messed-up apps in the app stores, good enough to underscore the need for improvement in the mobile SDLC regarding transport-layer encryption. There are several different ways you can fail, and we obviously want to encourage anybody whose icon looks like anything up here to make sure they've updated to the latest version, where it should be fixed. Ideally, we wish these companies had to come forth and notify their customers that, while their data wasn't necessarily actively stolen, it was leaked all over the place, and there were plenty of opportunities for somebody to have pilfered it. Again, your mobile devices are transient by nature: you're always bouncing in and out of networks, whether you know it or not.

As for why this is so important: getting back to those SSL usage disclaimers made by the many sites and apps you see, we're all familiar with those padlock logos Tushar was talking about earlier. During our research we found out there was a
prior FTC action against Fandango and Credit Karma for essentially the same thing: promising to use SSL but then not validating SSL certificates properly. That was one of the main citations in the suits, so obviously you don't want your company to go through this. What ended up happening was that the companies settled with the FTC, and in addition to being publicly scolded for breaking security promises to their customers, they were compelled to institute what was termed "comprehensive security programs," which you'd think companies should have anyway, so I'm not exactly sure what that means. Unfortunately, it also came with a 20-year penalty: essentially oversight by the FTC, where the FTC will be in their business for the next 20 years, which is a pretty stiff penalty. And what happens in this situation, when you're under FTC guidance, is that if you screw up again the penalties are worse, because it's not your first offense. So hopefully there's no one here from the FTC, at least for the sake of the companies we've shown; I don't know how they'd pursue this.

If you've ever been to a security conference, you knew this was coming: we didn't want to leave out
the tinfoil-hat crowd. So, sorry, Bones, but there's this thing called SSL session caching. If you're not familiar with it, it's an effort to make SSL handshakes, the overall SSL process from start to finish, more efficient and faster. Someone decided it would be a good idea that if we validated that the site we wanted to talk to was the right site the first time we shook hands with it, we wouldn't have to do it again, because we already know it's the good guy. So what they do is basically validate the certificate on the first handshake, cache the result by storing an identifier of some type, and from then on just say, okay, we can keep talking to you.

We found that Google Maps, which was the app we happened to be testing, had this feature enabled, as many others do. So we thought to ourselves: okay, what if a bad guy could make that first connection? What would happen? Obviously the skeptics say, well, how would a bad guy get my phone? That's a logical question. Unless you travel and get stopped by the TSA or customs; or have to hand your device to law enforcement when you get detained; or have a jealous girlfriend; or have to give your phone to your IT department. Remember that the tech at the cell phone store, and various people along the supply chain, already had access to the device. And as Tushar mentioned, Nokia was busted for playing with their customers' SSL traffic in the past, so it's not like it hasn't happened before, and it will probably happen again. Obviously, you could also drink too much this week and have your hacker buddies mess with you; that's another option. So I expect that most
skeptics would say, well, if I ever misplaced my phone, I would scour that thing top to bottom to make sure nobody had monkeyed with it before I started trusting it again. We'll show why that's not going to be as easy as you might think. Now, session caching is not a mobile-only phenomenon; people like to use it on mobile because of latency and bandwidth issues, but it exists elsewhere for other reasons as well. The reason we think it's more problematic on mobile is that there are simply more opportunities for people to get access to the device. They can't get into your data center to reach your secure servers, but your devices are all over the place, all the time. You eventually have to go to sleep; you leave the phone plugged into a wall charger somewhere.

Obviously most devices have screen locks, but a lot of people don't use them (not security people, obviously), and people use weak screen-lock passwords. New screen-lock bypasses come out all the time, smudge attacks are surprisingly effective, and if you've unlocked your device lately in, say, an airport or a casino, somebody has already recorded your PIN code, whether you've thought about it or not. There are the creepy Google Glass people spying on you while you unlock your phone, and all that sort of thing, so it's pretty much a given that somebody, some security camera somewhere, has captured your PIN by now. You might say biometrics are the answer, but I'm willing to bet the cops already have most of your fingerprints for one reason or another.

So you say, okay, if I get physical access to the device, there's all kinds of fun stuff: I can install malicious apps,
uh I could just take the data right off the phone um so malware can be installed but you know if you have one of these things like Lookout potentially it's going to you know detect the the uh malware um you might also ask well why wouldn't they just you know rip rip the data off directly again um if the apps are foolish enough to store data on the SD card or if you are um U then yeah you're you have problems but if it's if it's not and uh generally accessing the data may require you have a rooted device or or something like that uh just just depends on the platform Etc but um if the device isn't
already rooted um rooted or at least the boot Lo loader unlocked uh it might require that you wipe the device so the data is not going to be there when they go to get it right or you as an attacker go to get it um so if you're if you're obviously if you're running your device rooted at right now then you're already doing it wrong and you're probably [ __ ] and don't know um so uh lastly so all those things that are attacks against data that's at rest on your device now but let's say you got you hadn't quite done what somebody wanted to know about yet right you got detained by police but they wanted to essentially
know what you're going to do in the future. None of those attacks really helps there. But if we can get a cert installed onto your device, it only takes a few seconds, a lot quicker than installing an app, and as part of that install process we can also delete it very quickly. You may wonder why that matters: the latest Android versions have these little banners that come up and say somebody might be monitoring what you're doing. Well, on both iOS and Android, you can delete the man-in-the-middle certificate as soon as you establish that first connection, due to session caching, because the certificate is never checked again, and then there's virtually no way for anyone to ever know it was there. Even that "network may be monitored" warning disappears, so there's no way you'd ever know the certificate was there. On iOS you'll still be able to intercept all the traffic from the device, the application I should say, until the device is rebooted. But on Android it's a little bit different. With session caching, the server essentially gets to decide how long it's going to accept the cached session for. We verified that we could maintain this position, this ability to decrypt, for over 24 hours.
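To give a feel for the server side of this, here's a rough sketch in plain Java using the standard JSSE API (this is the desktop JVM's interface; the class name is ours): the server's session context, not the client, decides how long a cached TLS session stays resumable.

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLSessionContext;

// Sketch: the server picks how long a TLS session can be resumed.
// A MITM endpoint can crank this up so the session it established with the
// victim keeps resuming long after the forged certificate was deleted.
public class SessionLifetime {
    public static int configureTimeout(int seconds) throws Exception {
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, null, null); // default key/trust material is fine here
        SSLSessionContext serverSessions = ctx.getServerSessionContext();
        serverSessions.setSessionTimeout(seconds); // lifetime of cached sessions
        return serverSessions.getSessionTimeout();
    }
}
```

An attacker-run endpoint could set this to 48 hours, or far more, with one call: `configureTimeout(48 * 60 * 60)`.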
And unless there's some limit we missed, we're pretty sure we could choose to do this forever; and, you know, two years is probably forever on a mobile device anyway. Getting back to the Android behavior: unfortunately, the session cache files Android creates are persistent, and they persist across reboots. That means anytime anyone has ever had access to your device, they could have installed a certificate, deleted it, and you'd have no way of knowing it was ever there. We refer to this as "ever-pwning" an app, and there isn't much a user can really do to prevent it if they lose physical access or control of their device, even momentarily, short of rooting the device themselves, because the cache files are actually protected on disk. You'd basically have to root the device just to get into the directory and know the file was there, in a lot of cases anyway. So really your only protection is to reinstall the OS altogether, or throw the phone in the trash and go get a new one, if you ever lose control of it. This was reported to the Apple and Android security teams, and that's why we're in the
Android security hall of fame, which they just published. So anyway, we thought we'd leave you with a few tips to take back to your organizations on how to protect your apps from these problems. Initially we thought we'd give the vanilla recommendations: review your code, implement policies saying you can't disable certificate validation or hostname validation even in development and QA, test pre-release for the problems we described, and train your developers. Honestly, it's not that hard to install a test certificate, so just don't disable validation, right? As far as the code review goes, it's not an exhaustive list, and I'm not an iOS guy, but you want to look for these sorts of things. A trust manager is something you build, and it basically manages who the app is going to trust. So you look for trust managers that don't actually do anything; they're just empty bodies that say, okay, fine, we'll just keep moving on. There are also hostname verifier interfaces that commonly just fail open: verification failed, but we'll keep moving along anyway. And look for any use of SSLSocket where there's no call to the default hostname verifier with the hostname specified, because that check returns a boolean, and if you never call it and check the result, hostname verification simply doesn't happen. Some of this is on the Android security website; they obviously realize these are possible pitfalls. And there's some iOS stuff here as well.
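Those grep targets can be sketched in plain Java (the class and field names below are ours, purely for illustration; the interfaces are the standard `javax.net.ssl` ones an Android app would implement):

```java
import javax.net.ssl.HostnameVerifier;
import javax.net.ssl.SSLSession;
import javax.net.ssl.X509TrustManager;
import java.security.cert.X509Certificate;

// The two code-review red flags described above, as compilable Java.
public class BrokenTls {
    // Red flag #1: a TrustManager with empty method bodies. It never throws
    // CertificateException, so every chain -- including an attacker's -- passes.
    public static final X509TrustManager TRUST_EVERYONE = new X509TrustManager() {
        public void checkClientTrusted(X509Certificate[] chain, String authType) {}
        public void checkServerTrusted(X509Certificate[] chain, String authType) {}
        public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
    };

    // Red flag #2: a HostnameVerifier that fails open -- any hostname "verifies".
    public static final HostnameVerifier ALLOW_ALL = new HostnameVerifier() {
        public boolean verify(String hostname, SSLSession session) { return true; }
    };
}
```

Both of these compile cleanly and make certificate errors go away in testing, which is exactly how they leak into production builds.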
Conventional wisdom these days says you should just use certificate pinning in your app, and we don't disagree with that at all. But if you've ever tried to mess around with certificate pinning for testing, it's a pain in the ass, and that's usually why people don't do it. They say it's just too much of a pain, we can't actually test effectively, we're not going to hand the production certificates to people to test with, so we're at an impasse and we just don't do it. But Tushar and I are both big fans of self-defending code.
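The pinning idea itself is simple. Here's a minimal sketch in plain Java of the usual public-key (SPKI) variant; the names are ours, and a real app would run this check inside its trust manager against the server's certificate chain, not in isolation:

```java
import java.security.MessageDigest;
import java.security.PublicKey;

// Minimal public-key pinning sketch: hash the server's encoded public key
// and compare it to a hash baked into the app at build time.
public class PinCheck {
    // SHA-256 over the SubjectPublicKeyInfo encoding of the key.
    public static byte[] spkiSha256(PublicKey key) throws Exception {
        return MessageDigest.getInstance("SHA-256").digest(key.getEncoded());
    }

    // True only if the presented key matches the pinned hash exactly.
    // MessageDigest.isEqual is a constant-time comparison.
    public static boolean matchesPin(PublicKey presented, byte[] pinnedHash) throws Exception {
        return MessageDigest.isEqual(spkiSha256(presented), pinnedHash);
    }
}
```

With a pin like this in place, even a certificate the OS trusts, say one an attacker installed by hand, fails the check unless its key matches.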
To eliminate the chance of human error, we found what we think is a novel concept, or at least a novel implementation. The IDEs for both Android and iOS have, essentially, development signing certificates that they use when you're working in QA and development, and those can't be used to sign apps that you push to the stores; the stores will reject them, is our understanding. So we thought: if you absolutely need the ability to disable certificate validation in any way, build the code so it checks which certificate the app is signed with. Point being, if it's signed with the dev or QA certificate, it can allow certificate validation to be disabled; if it's not, it can't. We think this is a foolproof toggle: it's not something you can forget, not something you can accidentally leave on and push out to prod. It's a simple, straightforward solution that we think would be unlikely to meet with any realistic objections.
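The gate they describe might look something like this in plain Java. Everything here is a placeholder: `QA_CERT` stands in for the real DER-encoded dev signing certificate, and on Android the signing certificate bytes would actually come from the package manager's signature info; this only shows the toggle logic.

```java
import java.security.MessageDigest;

// Sketch of the signing-certificate toggle: validation may be disabled
// only in builds signed with the known QA/dev certificate.
public class DebugOnlyTls {
    // Placeholder bytes standing in for the real QA signing certificate (DER).
    public static final byte[] QA_CERT = "placeholder-qa-cert".getBytes();

    public static byte[] sha256(byte[] data) throws Exception {
        return MessageDigest.getInstance("SHA-256").digest(data);
    }

    // Compare fingerprints; anything not QA-signed fails closed.
    public static boolean mayDisableValidation(byte[] signingCertDer) throws Exception {
        return MessageDigest.isEqual(sha256(signingCertDer), sha256(QA_CERT));
    }
}
```

A release build signed with the store certificate can never take the trust-all code path, no matter what a developer left switched on.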
So, thanks, that's all we have. I hope you enjoyed the presentation, and if you have any questions or comments, we'd love to hear from you.
Like client-side certificates? No, we didn't try; we honestly didn't look, because we were looking at the top apps, and those would be a niche case. Yes? I did not. Did you double-check the iOS one? I assume we did at the time; you've got to remember this was five months ago. But yeah, they were prompted, as you can imagine. Yeah, so, yes on the fix for sure, a prompt fix, and iOS didn't have the problem as far as we know. Yes,
sir? I have no idea, but yeah, I wouldn't necessarily want to be them. I mean, that's pretty blatant. That's why we highlighted them; generally we weren't trying to call people out specifically, but ten million installations and you don't even bother to encrypt? It's just like, come on, that's such a simple thing to miss; it's unforgivable.
So, right, fundamentally I question why the design models exist that put this in the developers' hands at all. I don't know why you needed to give this power to developers; I could honestly see just retiring it. The exception is that, theoretically, if you use certificate pinning, this can be more secure than a person visiting your website, because it's not a general-purpose computing application; you can say, I only want that certificate from that CA, done, and you're good. But why did you need to give them the ability to turn validation off in the first place? Since you can install the certs on the devices and on the emulators, there's really no reason for it. So I agree, it just shouldn't happen. And that's what we discussed initially: all the different platforms have given a lot of flexibility to the devs, and so these kinds of problems do come up. We also found some apps actually have a toggle in there to trust all certificates, so they were aware they needed the functionality and they solved it in a different way. And Windows Phone, or at least the one we were testing on, actually had that toggle at the device level, which is kind of interesting. Yeah, those were some of the things that would cause you potential problems; again, I'm not an iOS expert, I'm just good at Googling. So yes, the handsome man in the third... fourth row?
So we tried looking at exactly that. I played with it and was able to control the lifetime of the session, how long the client would hold on to it. And when I actually examined the session file (admittedly I didn't give it a thorough, scientific investigation), it basically looked like it had just pulled down the certificate. But there's supposed to be an identifier in there somewhere, some sort of TTL; well, a TTL, but also an identifier that says, hey, this is the session I'm trying to resume. So the quick answer to the question is: I don't know. [Inaudible audience comment.] Right, right. So, I winged some of the slides and didn't cover it, but we theorize there have got to be problems with apps accepting expired certificates, revoked certificates, not recursing the chain to make sure it actually goes up to a trusted root. They're out there; we know it. Just because of the other stuff that was going on at the time, we didn't pursue it any further than that. Anything else? Yes,
sir? They were far better, yeah. I hate to admit it, but most of the problem apps were Android; a few were iOS, very little. We thought we were seeing about the same rate initially, but we weren't, so it really was primarily an Android problem. But it's not a problem with the platform, it's a problem with the app developers, right? So maybe the people who develop for Android aren't as good, or are more prone to security problems for some reason. And again, Windows Phone performed excellently, so if you really want a secure device, go get a Windows Phone, right? All right, thanks everybody. [Applause]