
Choice Architecture for Security Practitioners

BSides Augusta · 2018 · 29:31 · 378 views · Published 2018-10 · Watch on YouTube ↗
About this talk
Chris Sanders (@chrissanders88)

The security of a device or network often hinges on a single choice made by a non-technical user. This could be the choice to click a button to enable macros in a Word document, the choice to enable Flash on a page in Chrome, or the choice to execute an attachment sent over e-mail. Each of these choices was designed for the user, in part, by a security practitioner. In this talk, we'll examine the concept of choice architecture and the delicate balance that exists between allowing users to make choices that benefit them while also nudging them in a direction that keeps them from unwittingly unleashing disaster. We'll go through several practical examples of common choices users make that have security implications and discuss a framework for architecting software and websites in a way that better aligns choices with security best practices. You'll walk away from this presentation with a greater awareness of how security practitioners influence and impact user behavior.
Transcript [en]

I recommend not clapping till the end, because it very well may be horrible. I don't know why you clap at the beginning, but I'll take it nonetheless. Well, thank you for the kind introduction. I want to thank all the volunteers who are here at the conference today. As you have noticed, this is a massive, massive, massive conference; there are over a thousand people registered, and it doesn't work without the volunteers. As you leave today and you see these folks in the blue shirts and the purple shirts, make sure you thank them, because they're doing a lot of great work today. My name is Chris Sanders. Thank y'all for coming out today and spending a little bit of

time with me. There are a lot of ways you could spend your Saturday; I appreciate the 30 minutes we're gonna spend together here today. We're going to talk a little bit about choice architecture and its application to the practice of security. If you don't know me, you can go read my bio. I'm not gonna spend a lot of time on me, other than to say that my background originally was in public education, working in the school district I graduated from. Eventually I landed in a career in security. I worked for a few places: the DoD, a company called InGuardians, Mandiant. I left a couple years ago to start my own training firm called Applied Network Defense. I'll talk

more about that a little bit at the end. I've written a few books along the way, and my research area is currently focused on the intersection of cybersecurity and psychology, and that's what we're going to talk about today. Specifically, we're going to talk about choice and decisions, because a lot of the things we do are based upon the choices and decisions of the people we serve, the users, and we do indeed serve the users. I want to frame that through something I like to frame lots of things through, and that is food. Now, most of us have familiarity with the concept of a buffet, right?

Pretty simple. The earliest form of it that most of us dealt with is the school cafeteria. So I want you to put yourself in the mindset of the cafeteria worker, the lunch lady or lunch gentleman, if you will. What I want you to do is think about how you arrange the lunch line. When someone walks into the lunch line, the food is laid out, and it may be self-serve, maybe you're serving them, but it's laid out in a specific order, and that order is something I want you to think about. Because let's say you're an enterprising lunch line worker, and you started tracking the consumption of food based upon where it is placed in

that line, and what you find is that no matter what you place at the very first position of that line, at the entry to the line, whatever is placed in that first position gets consumed at a much higher rate than it normally would. So now you're in an interesting situation, because you've been empowered with knowledge. You can't unknow it. You're in a position where you can actually influence behavior, because whatever you put in that first slot is going to be consumed much more than it would be normally. What do you do about that? Do you choose to ignore it and try to intentionally randomize what goes into that slot, or do you try to influence behavior by, let's say, putting people's

favorite item there, so it's easier for them to get? Maybe something that is the least favorite item, so that it's easier to clear the inventory? Maybe you put the healthy items there, because you want to influence the health of those people? There are a few different options here, but the fact of the matter is, you are empowered with information. You can't unknow what you know, so you're going to be influencing behavior, for the most part, in some way or another. And that's an interesting thing, and behavior is complex. What influences our behavior is one of the most studied and, to some degree, least understood things in the world. We have a lot of famous

psychologists who have studied behavior, and depending on who you ask, they might cite different reasons for our primary influencers of behavior. John Watson founded what we think of as behaviorism; he would probably say your environment influences your behavior more than anything. Alfred Binet, of the Stanford-Binet intelligence test, might say that it is your intelligence that influences the choices you make. Carl Jung was a humanist; he believed the whole was greater than the parts, and he thought that maybe your feelings influence your behavior. Albert Bandura created something called social learning theory, which basically states that most of the things we learn are somewhat of a product of the interactions we have with others, so he might say other people do

it. And of course we have Freud, who is probably the most popular name in common psychology, and he would say that the weird choices you make are probably based upon whether or not you got enough hugs as a kid, or something related, right? So all these different psychologists are not wrong, and they're not necessarily right; it's probably a combination of these things. They probably wouldn't necessarily agree on what most strongly influences behavior, but they would all agree on one thing, and that's that, generally speaking, humans will always do what's in their best interest, given all available information, and no significant bias, and no significant stressors. Oh yeah, and one other big one: no strong outside influences. That's a big one.

So they all think that the primary influence on behavior is going to be a little bit different, but they do agree: humans do what's in their best interest, absent all these other things. So behavior is complicated, and it certainly affects what we do in security. I don't have to harp too much on the consequences of less than desirable behaviors: clicking links, opening attachments, enabling macros, installing unwanted software and browser plugins. That's on the user side, but even on the practitioner side, choosing to not look at an evidence source, or not look at an alert. We have all those things that are all based upon choices. And I don't talk about this to shame our users. It's not about shaming

users; it's about engineering solutions designed to better help them in some way. We all are involved in that process, whether we're actually creating the solutions, or informing them, or just talking about them publicly and helping people understand them. So I gave you a few options with the lunch line situation: putting the least popular foods up front, the more popular foods... let's actually do a show of hands real quick. Who would choose option A, put the least popular foods up front? A few of you. How many would put the most popular foods up front, giving the people what they want? Very few. Who would put the healthiest foods up front? That's the biggest majority right there. It looks

like... How many of you would randomize the foods completely? A few of you; you're choosing not to participate in the influencing of behavior, which is fine. So ultimately, most security practitioners are choice architects in some way. When you have information that allows you to influence behavior via choices that someone else makes, that's the process of choice architecture. The buffet represents what we would call a trigger, which is an opportunity for choice, right? An opportunity for choice is a trigger, and when you have the ability to allow choice while still influencing behavior, that's what we call a nudge. Many of you may have heard that term used in popular psychology. It's a nudge: we're arranging the choices in favor of

specific outcomes. So if I have a buffet or lunch line, and I know the first slot gets consumed more, and I put corn in that slot, I know that people will eat corn more than they would otherwise, right? Let's think about that from a security perspective. Starting with Windows 7, I believe it was, Windows would download updates in the background for you, and when you went to shut down or restart, it would change your default option from merely shut down and restart to shut down and update or restart and update; it would do both of those at once. So what you have here is Microsoft making use of a very powerful nudging technique called using defaults.
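As a quick illustration of the technique (my own sketch, not anything from the talk), any tool can bake this nudge into a prompt by making the secure option the one a plain Enter selects. The function and option names here are invented:

```python
def resolve_choice(answer, options, default):
    """Map raw user input onto one of the options. Anything
    unrecognized, including just pressing Enter, falls back to the
    default -- so make the secure action the default, and the path
    of least resistance is also the safest one."""
    answer = answer.strip().lower()
    return answer if answer in options else default

def prompt_choice(question, options, default):
    """Render the choice with the default highlighted in brackets."""
    labels = ", ".join(f"[{o}]" if o == default else o for o in options)
    return resolve_choice(input(f"{question} ({labels}): "), options, default)

# Mirroring the Windows shutdown dialog:
# prompt_choice("Shut down?", ["update and shut down", "shut down"],
#               default="update and shut down")
```

The design choice is the whole nudge: the user keeps full freedom, but doing nothing deliberate yields the secure outcome.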

Right, we are more likely to go with the default option, all things being equal, so they decided that by nudging users in this direction, we're gonna make them more secure, because updates, for the most part, make us more secure. So that's what they're doing here. Microsoft is taking the role of choice architect in this case, nudging our behavior while still allowing freedom of choice, right? The trigger allows the choice to be made, and the nudge pushes us in that certain direction. Now, the concept we have to work with here is something we call libertarian paternalism, and that's kind of a big word, so we'll break it down. There's the paternalist view, wherein the Borg would say, resistance is futile, you

will be assimilated. They believe it is best for you to be assimilated, and you do not deserve any choice in the matter. That is one end of the spectrum, the paternalist. On the libertarian side of things, the Borg really just doesn't care, and they think you should have the ultimate choice: you can do what you want, I don't care, be assimilated, don't, it's up to you. In the middle of those we have libertarian paternalism. It's very simply the idea that it is both possible and legitimate for someone to affect behavior while also respecting freedom of choice. So the libertarian paternalist Borg says, you will be assimilated unless you opt out. Very nice of them.

This term kind of freaks people out sometimes. They see the word libertarian and they think, oh, is this about politics? And it's not. The word libertarian just means, in this case, freedom of choice. It existed long before a political party decided to co-opt it as its name. If this was about politics, this would probably be a picture of Ron Paul up here instead, although maybe I'm not far off. Anyway, so I gave you the choice earlier of A, B, C, or D, and as you'll notice, most of you raised your hands for C, which is not uncommon. Most security practitioners are ultimately practitioners of libertarian paternalism. We believe we have to give users choice for usability and

flexibility, but we also know that most of the time, not all the time, but most of the time, from a keeping-them-safe perspective, we generally know what's best, and we try to nudge them into making the best decisions for themselves. So that's kind of what we're doing, and again, I think most security practitioners are participants in the notion of libertarian paternalism. Now, for the rest of the talk, I want to talk about just a few ways choice architecture manifests, both out in the broader world and in security, and how we can think about the choices we put in front of people as well as the ones we make. I'll start with the notion of defaults, and I think this really

comes into play well when we talk about organ donation. If you're familiar with how that works in the U.S., when you get your driver's license, you have the option to opt in to organ donation, a little red heart on the back of your license, and when you die, they will know that you are an organ donor. So you have to opt into it, and as a result, the organ donation rate in the US is about 15 percent, which is not super great. So there are other countries who experiment with different forms of opting in or out on this, and there's another country out there, I believe it's Austria, who experimented with flipping this from opt-in to

default opt-out, so you're an organ donor unless you specifically say you will not be. Anybody care to guess what the organ donation rate is in Austria? All right, eighty-five? That's pretty good. Ninety percent. Ninety percent of people in Austria are organ donors. Did you know that, or was that just a really good guess? Very good. So, correlation does not equal causation; we're not gonna say that this is the only reason for this. Cultural norms and things like that come into play, but I would say it is certainly a big part of it, right? The default heavily influences behavior in this case, and we see this in our field all the

time. We use defaults as one way to relay expertise. This is a screen on the left from Security Onion; of course, we had the Security Onion conference yesterday, with Doug Burks and all the great work he does. I like to call the screen on the left the "Doug Burks knows more about this than you do" screen, because he probably does. So you have the option to go and customize the setup, or you can say best practices, again, let Doug set it up for me, which for the vast majority of people is what they want to do. So this is Doug subtly nudging you to set up your sensor in the way he believes it will work best, which is a

great idea. We do this a lot with antivirus as well. Most of the time, when an antivirus pops up and it's found something, the default option is generally not do nothing; it's generally quarantine, or clean, or something along those lines. So antivirus is a way that users interact with that same type of choice architecture via defaults. Some of our defaults evolve over time. It used to be, if you went to a website over HTTPS with a bad or malfunctioning certificate, it would just go there, and maybe you'd get a little pop-up. Then it came to the point where it would go there, and it would say click OK to proceed, and that was the default option.
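That pattern, extra friction in front of the unsafe branch, with the safe choice as the fallback on any non-deliberate input, is easy to borrow in your own tools. A small sketch of the idea (the function name and the confirmation words are my own, not from any real browser code):

```python
def confirm_unsafe(responses):
    """Gate a risky action behind two deliberate steps, like a
    browser's Advanced -> proceed flow. A plain Enter, a typo, or
    running out of answers all fall back to the safe choice (abort)."""
    it = iter(responses)
    if next(it, "").strip().lower() != "advanced":
        return False  # back to safety: the default on any other input
    return next(it, "").strip().lower() == "proceed"

# confirm_unsafe([]) -> False (pressing Enter stays safe)
# confirm_unsafe(["advanced", "proceed"]) -> True (deliberate bypass)
```

Two clicks instead of one sounds trivial, but it is exactly the System 2 speed bump the talk describes.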

Nowadays you get something like this, where your default option is actually "Back to safety," and it takes you back to where you were previously. If you want to bypass it, you have to click that little Advanced button and then click another thing; you have to do two clicks. Advancing and moving forward is not necessarily the default option, so that's what we have going on there. Also worth noting that defaults often reveal priorities. This is a screenshot from Instagram when you sign up on a mobile device, and notice that there's a Next button, and then there's a little button under it that says continue without syncing contacts. So the default is actually going to sync all of

your contacts to your Instagram account, which is not really great from a privacy perspective. So this isn't to say Instagram doesn't care about privacy, but at least the way this decision is architected, they're less concerned about privacy than about getting all of your contacts signed up for Instagram, right? So you can kind of reveal priorities in that way; it's interesting to kind of derive that as well. The next way choice architecture influences behavior is through incentives, and I use the picture here of the car dash, because you'll notice in the middle there's the thing that shows you your miles per gallon, right? Which a lot

of cars have that now, and it's interesting because it's beneficial for both the car manufacturer and for you. For the car manufacturer, they want you to get really good fuel mileage, because it helps their ratings and your perception of their car; you'll tell all your friends you get really good fuel mileage in this vehicle, and more people will want to buy it. It helps you because you can see it update in real time and change the way you drive to get better fuel economy, right? So it ultimately influences your behavior by changing the way you drive. You start maybe accelerating a little slower out

of a stop, easing onto the brake when you pull up to someone, so you can do the whole hypermiling thing. I know once I started thinking about this, it certainly influenced my behavior and how I drive when I get that real-time feedback, which basically says if the line is all the way to the right, I'm costing myself more money, and if it's all the way to the left, I'm saving a little bit. So this is incentivizing you to drive better in a way that helps both the car manufacturer and you, right? So by answering these questions, who uses, who chooses, who pays, who profits, you can kind of figure out where and how the

incentives need to be applied when you're doing choice architecture. I started thinking about choice architecture, and I started thinking about denial of service, and I thought about where my service gets denied most times, and it actually is when I'm on conference calls, because I'm not actually doing work a lot of the time. So I was thinking about WebEx, and I was like, well, how can we change WebEx, something that denies service for me a lot of the time? How could we make it better, such that we could use these concepts of choice architecture? And it really kind of dawned on me: let's hook it up to the payroll database, and let's just have it show the average salary of all the

attendees and what this meeting is costing the company, right? So I use Cisco as an example, but any web conferencing software... If there's anybody from Cisco in this audience, and I'm not staring at anyone directly at all right now, feel free to take this idea. You don't even have to credit me. I think it'll make the world, generally speaking, a better place: fewer useless meetings. That's simple choice architecture. Next is understanding mappings. This is something Apple is really, really good at, particularly when they came out with the iPod, because you get an iPod and you really only have two options: what color do you want it in, and what size? And most

people at the time didn't understand megabytes and gigabytes. So the option was actually not, I want 32 or 64 or 128 megabytes or gigabytes; it was, I need to hold 200, 500, a thousand songs, right? People understood songs; they didn't understand size. That is one of the ways that really helped Apple sell their product, and you see that actually mirrored across a lot of their products. So in an era where Dell was the leader in that space, and you had a thousand configurable options on your Dell computer, and you practically had to call someone to help you do it, when you went to buy an iPod or any other Apple product, you often had

less choice. It was simplified more for you, and you could maybe customize a little more if you needed to, if you talked to an expert or someone at the store or something, but for the most part, they simplified those mappings for you. They took complex choices and made them simple choices, primarily to influence behavior, to help people pick the thing that was best for them, and obviously to maybe help upsell a little bit too, but it would influence the behavior in that way. So Apple's really good at that, and a lot of other people have kind of started co-opting that for their own uses. I think we struggle with clear mappings in what we do a lot of the time.

Everybody probably knows this screen. If you open a document and it has a macro in it, you have the option here, and I would also say the default option here, to enable macros. So a user sees this: they open up a document, there's no content. Well, why isn't there content? Well, there's this nice yellow bar at the top with an enable content button; let me click that. It's the default option. There's not a clear mapping for the user of what the consequences are of doing this. Most users don't understand this. It says security warning, macros have been disabled, and many users don't actually know what a macro is, right? It's not even a great name for it from a user

perspective. If you want to close this document or get rid of this warning, you have to go to the X at the top right of the screen, or the X on this bar, which is gray and hard to see. Oh, it's out of the screen; I've cropped it out here, because most users are not even going to see it. That's not the way our eyes move through things; we typically move in somewhat of an S pattern, and depending on how you do that, you may actually even skip the part with the X in it. So, generally speaking, this is an unclear mapping. We don't really do a great job of this. There's a lot of

opportunity to make better software to help users make better choices, where they consider the consequences. Number four is giving feedback, and I chose the picture of the speedometer sign because, I don't know about you, but I drive the same way every day. If I go to work on the same drive every day, eventually I get kind of on autopilot, and you know, I'm paying attention, but I'm not constantly looking at my speedometer; I go the speed I go and I don't think about it. When I see one of these speedometer signs, I do what I think most people do. Number one, I look around to see if there are any cops around, right? Even if I

don't think I'm speeding, I just look around to see if anybody is looking to see if I'm speeding as well. Then I look down at my speed to see what it is on the sign, and if I am speeding, I probably slow down, right? This is influencing my behavior. It's taking me out of kind of this automatic thought mode and putting me into a more deliberate thought mode, which connects to the way humans think: we have System 1 thinking, which is automatic thinking, and System 2 thinking, which is very slow, deliberate thinking. So it's forcing me into this deliberate mode of thought. It makes me think about what I'm doing more often in something that's a lot more automatic, by creating

what we would call a forcing function. That's why these signs exist. They're not necessarily there to catch you; most of them don't have cameras in them, and they're not sending out tickets in the mail. They're there to make you think about how fast you're going, and to increase safety by making you go slower in an area where people are maybe speeding a little bit too much. So it's giving feedback and helping you make decisions based upon that feedback. A great example of this, I think, is the way we do authentication, and we used to do this a lot more. Most UNIX and Linux systems do this, or have always done this: when you log in, you see, right there,

it shows you your last login time, right? That's a pretty cool thing from a security perspective. There's a really great story about this. Hey, has anybody read Cliff Stoll's The Cuckoo's Egg? Yes? Yeah, a lot of you, right? So there's a story in that where, I believe it was at the NSA, there was a secretary there, and she came and logged into her computer on Monday, and it said, hey, welcome back, your last login time was Thursday. And that was weird, because she was on vacation all that week. So what happened? She told somebody, they did some research, and actually found some kind of insider threat going on, where someone was

accessing accounts they shouldn't be, and someone had logged in as her. The Cuckoo's Egg is a true story, and the story within the story is also true. So that's an actual thing that happened; they found somebody and prosecuted them as a result of that, essentially turning the human into the intrusion detection system, which is pretty slick. So most Unix systems do this. I think it's not nearly apparent enough, because it's something that, again, forces you... we don't think about logging in a lot, but it takes you out of that automatic thinking mode and into deliberate thinking. So you see that, and that's pretty useful.
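Wiring this last-login nudge into an application takes very little code. Here's a minimal sketch with SQLite (the schema, function name, and message wording are my own invention, just to show the shape of the idea):

```python
import sqlite3

SCHEMA = """CREATE TABLE IF NOT EXISTS logins
            (user TEXT, device TEXT, when_utc TEXT)"""

def record_and_report_login(conn, user, device, when_utc):
    """Report the previous login before recording the new one,
    turning the user into their own intrusion detection system."""
    conn.execute(SCHEMA)
    # Fetch the most recent prior login for this user, if any.
    row = conn.execute(
        "SELECT when_utc, device FROM logins WHERE user = ? "
        "ORDER BY when_utc DESC LIMIT 1", (user,)).fetchone()
    conn.execute(
        "INSERT INTO logins (user, device, when_utc) VALUES (?, ?, ?)",
        (user, device, when_utc))
    if row:
        return f"Welcome back. Last login: {row[0]} from {row[1]}."
    return "First login on record for this account."
```

The message only nudges if it's shown somewhere the user can't miss, which is exactly the point made about Facebook's buried login-history page below.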

Most applications collect this data, right? Most of them store authentication history. One of those, in this case, is Facebook, and if you log into Facebook, and you go to Settings, and then Security and Login Settings, you see this, and it shows you your last logon time, and actually the device and location you logged on from. Really good. Big problem: nobody's ever gonna go look at it, right? Nobody. No normal user, on a regular basis, goes to Settings and then Security and Login Settings and looks at this. So my thing is, why not make it more apparent? Why not put it in the left bar when you log in? Why not maybe make the user click OK

after they log in to get to their main page? To some degree that might become a little bit automatic for them as well, but you can also do some interesting things there to highlight things that haven't been seen before: you've logged in from a state you've never logged in from before, you're logged in on a new device, things like that. So that way, when a user sees this, they can click on the right-hand side of that, and see that "not you?" button, and click on it, and report it, and again, try to be a little more secure. So, generally speaking, any application that supports authentication, I'm a firm believer, should support some sort of last-time-and-place-you-logged-in system for

them, and it should be very, very apparent, because it plays upon these concepts of choice architecture, and will help influence behavior, to help users be secure and report things to you, the security practitioner, which will help you do your job better. Last one, number five: expecting error. When you get into the business of architecting decisions for people, sometimes it doesn't go the way you think it will, and that's no surprise to those of us who create software and deal with computers all the time. A great example of this is chip-based credit cards. If you had a chip-based card early on, before they became so mainstream, you probably noticed that you would put it into the chip slot,

and you would pay the attendant, he or she would print up the thing for you to sign, you would sign it. Let's say you're at the grocery store: they hand you your items, you go and leave. So what happens? You forgot your card, right? Today you don't do that. Why? Because a lot of the time the thing beeps at you like crazy, so you know your card's in there, but also, most of the time it will not print your receipt until you remove the card, and the sales associates are often trained not to give you your items until you have removed your card. So they put kind of an intermediate forcing function in that process to make sure people don't

leave their cards behind, because that's not really good for anyone. There's a lot of liability associated with that; the place has to hold on to your card for a while. Not a good situation for anyone. So the chip-based credit card was a decision that was being architected, in how you deal and process with that, and there was an error that was discovered there, so it had to be accounted for in that way. Now, we have these in security. I think probably one of the most common is passwords. One of the policies that we preached for so, so long is that you should change your password every three months. And what happens when you have a user changing

their password every three months? It looks like this, right? And this is at best, right? Oftentimes they just put a one or a two at the end and increment the number every time. So this is at best. As a result, we've had to take what we're doing with passwords and maybe amend this, or put some software controls in place, so we really have to expect error in this case regarding these things. So, everything I've talked about today, again, is rooted a little more in psychology than in InfoSec, but we can certainly apply it to those things. A lot of the ideas I've talked about today are found in a book called Nudge, by Richard Thaler and Cass Sunstein. The book was

popular enough that Richard Thaler was awarded the Nobel Prize in economics last year, a very deserving thing. So the book is definitely not about security, but it is a great intro to these concepts. I don't recommend stopping there if you're really into this; there's a lot of great research and other examples of nudges and how we develop and do this whole thing of choice architecture. There's also the counter to a nudge, which is sludge, which is when choice architecture goes really, really wrong, and people are pushed into bad choices as a result, such as enabling macros. All right, so that's sludge versus nudge. For everything I've talked about today, there's a nice little acronym, NUDGES; I talked about five of them.

There's one more I can't get to due to time, but that's the NUDGES acronym, where you can remember these different ways you can nudge people using choice architecture. Again, libertarian paternalism: the idea that it is both available to you and okay to subtly influence behavior, when you have the opportunity, while still allowing choice. So that's what we talked about today. My name is Chris Sanders. If you want to find me, I'll be around for the rest of the day. I teach a training course based on a lot of this stuff; we have an Applied Network Defense booth out there by where you checked in, and I'll be there most of the rest of the day. So I've got a few minutes, and I

can take probably a few questions, if y'all have them.

Yes, so the question was: what if someone like Microsoft took a version of Windows that kind of took all this and went almost hyperactive with it, and everything you did, you were getting some type of nudge on some type of behavior? I think we have that, and honestly, I think that is like the iPhone OS, right? I think that's a subtle version of that, where your choice is severely limited in favor of optimizing the experience, and also, to some degree, making you more secure. I think that's one way to go about that. The other way to go about that is, often you see web apps where you log in for the

first time, and you have everything grayed out, and you have the highlighted circle, and this is over here, and that's over here. And I know this is kind of silly, but everybody remember Clippy from Microsoft Office? Yeah, that's choice architecture right there. That's a way to do it. I'm all for it, you know, bring Clippy back in 2020, right? Let's do it. So yeah, I have given thought to it; I do think it's something that could be done, and could be done better than we're doing it now.

Yeah, absolutely. I mean, choice architecture applies to not just the users and the operating systems and the tools they use, but the tools we use as well. So you know, if you're a security analyst and you're using Security Onion and Sguil and all these other user interface tools, and even the commercial solutions out there, using Splunk or ArcSight or what have you... You know, not every analyst is an expert right off the bat. We have to teach them, and ultimately, nudges are a great teaching tool. If someone doesn't know how to investigate a specific alert, then if you have a tool that kind of nudges them in the right direction, guides them through the investigation, so to speak,

eventually they start to learn how to do those investigations as a product of it. So I do believe that's certainly legitimate, applicable, and required to progress the industry. There's a question up here? OK, I think that's it. I have a couple... oh, there's one back here.

Yeah, so the question is: there's user behavior analytics, a big thing right now, particularly in terms of detecting things that aren't being detected by other things, and how does this kind of play into that? I think, generally speaking, from the detection side, that's a little bit of a different story, but I do believe any time you're influencing behavior, you should try to collect as much data as you legitimately can about that behavior, so you understand it really well. So if you, you know, have a web app, and you're concerned about how users are clicking around on that web app, and if they're doing the right things, you really need to have hooks into that web app to

understand where they're going, where they're coming from, what behaviors lead to other things. You have to really understand that behavior, again, in the best and most ethical way possible, because just collecting data on a bunch of people is a tricky and slippery slope. So, absolutely. So, I have just a couple things to give away real quick. The conference was nice enough to buy a copy of each of my recent books, so I have Practical Packet Analysis right here, and I have Applied Network Security Monitoring. My co-author Jason Smith is actually sitting in the front row, so whoever wants these, we're glad to sign them. The trick is, we get to write in them whatever we want. So here's the deal:

First hand I see. I named a bunch of... OK, not now, OK, calm down, calm down. After the question, after the question. I had a bunch of psychologists on the screen earlier; name one. I saw you. Freud? OK, you went first. Which one? That guy? OK, go again. I saw you. Who? Yep, yep, that's you. So you, pass it to that guy right there, and we'd be glad to sign them for you later as well. So again, thank you all for coming out so much. Again, be sure, when you see the volunteers and the people helping out with the conference, thank them; they're doing a lot of great work today. Thank you all for coming out today.

[Applause]