
I Am The Cavalry (IATC) Introduction and Overview - Joshua Corman

BSides Las Vegas · 23:33 · 200 views · Published 2016-08 · Watch on YouTube ↗
About this talk
IATC Introduction and Overview - Joshua Corman. I Am The Cavalry (IATC), BSidesLV 2016, Tuscany Hotel, Aug 02, 2016.

Here we go. All right. Thank you all for coming. This is an awesome room. We were actually born here at BSides Las Vegas exactly three years ago yesterday, so we'll get to that in a minute. But BSides has really been integral to getting us launched and getting this kind of silly experiment off the ground. I'm Josh Corman, one of the founders of I Am The Cavalry. I'm Beau Woods, another person in I Am The Cavalry. Beau may or may not tell this story, but Beau actually competed with our reveal. While Nick Percoco and I were presenting the reveal of I Am The Cavalry and asking you to

join us on this crazy mission, Beau was actually presenting in a different track here and never even saw it. But we like to say Beau is our first and one of our best recruits, because we met over Japanese whiskey in the speaker room immediately after, and he's been incredibly committed, and you'll see how committed throughout the day. All right, let's do this really quickly: how many of you are brand new to the Cavalry, versus how much can I assume you know? Okay, so I'll split the difference a little bit here. Really quickly: we just turned three years old yesterday, so we're going to explain a little bit of what's happened over this

first three years of experimentation. There are certain things that are working incredibly well, and you're going to see them throughout the day. You'll see some of the people we're able to bring to bear from government and from safety critical industries, and the kind of success stories that give us energy and encouragement in the face of some really, really hard problems. The last 12 months have been particularly challenging, and even though we're getting better, we're getting a lot worse faster. So Beau and I are going to talk about some of the things that have happened that may not be obvious to everybody. In fact, I'm shocked how very few hackers know some of these stories

we're going to share. But also, they really compelled and catalyzed some pretty big changes that we've made personally, changes that we think are necessary for this mission to continue. And then I'm going to talk a little bit about terraforming, but I'm not going to try to take all the oxygen out of the whole two days that we have. We have an amazing lineup for two days, and you're going to see what's happening in automotive cyber safety. You're going to see what's happening in medical cyber safety. You're going to see some of the fruits of working with government from Jen Ellis and a whole bunch of folks, on where we're working to

preserve researcher rights, to change the way government looks at our kind, and to try to drive safer outcomes sooner. Then, just to frame it a little bit, tomorrow is really different. It may be unclear what it is on the schedule, but today is really about where we're winning and where things look promising. Tomorrow is really, really, really hard problems that we don't have solutions for. And we're going to be very honest and candid with each other; in fact, we'll probably shut off the camera for large chunks of it. One of those speakers later today is with me on the Health and Human Services cybersecurity task force that Congress asked

for. And the closer we look at clinical medical environments, the scarier it gets. It's not funny. Not funny at all. And there are really no obvious solutions. So tomorrow morning we're going to try to tackle really uncomfortable truths in medical, in oil and gas, in agricultural technology, to try to surface in the afternoon maybe some uncomfortable and unnatural responses that we can be innovative enough to come up with. All right, but let's dive in. So, a little bit of celebration. I didn't know if this thing was going to last three weeks or three months, but ultimately we just turned three on August 1st, which was yesterday. And you guys have been amazing. And it's

not going to be for everybody, and our fight isn't necessarily your fight, but the idea of trying new things and trying to get outside the echo chamber on things that matter and have consequence... it just felt like we were ready to do it, and we were ready to take the next step in our own evolution. And essentially, for those of you that were here, what we said is that I looked high and low in the government. Nick Percoco also tried through his role at SpiderLabs. And we kept trying to look for the adults in the room. We kept trying to find out: who can we find that cares about these public safety, human life

kind of issues? We're all focused on credit card protection. We're all focused on privacy. We're all focused on things that matter on their own, but in the greater context, our belief when we looked at the Internet of Things, the unifying statement we made, was: our dependence on connected technology is growing faster than our ability to secure it, in areas affecting public safety and human life. We've gotten a little pithier since then, and now we point out that it's where bits and bytes meet flesh and blood. But as we add software to things, we make them weak. And as we add connectivity to everything else, we expose them to every sociopath on the internet. So

when we said the cavalry isn't coming, the natural consequence, we said, is that it falls to you to fill that void. So I Am The Cavalry wasn't Josh, it wasn't Beau, it wasn't Nick, it wasn't Katie, it wasn't Jen Ellis or Space Rogue. You had a group of folks that really wanted to get moving, but this was your personal attestation that you're going to be part of the solution. And the challenge was: will you be a voice of reason and technical literacy? Will you be an ambassador who translates what we know to the public, to the public policy makers, and to these safety critical industries like automotive, medical, industrial control systems, the internet of everything, essentially,

because what we saw is we were adding bacon to everything, or adding Bluetooth to everything, or adding software and connectivity to everything. When Marc Andreessen said software is eating the world, what he meant was every company is becoming a software company. When I hear it, I hear software is infecting the world, and we're making everything weak and hackable. And there are going to be certain use cases where that's entirely appropriate and it's going to make our lives better, and there are going to be other use cases where it's entirely inappropriate and we're going to regret it. We just wanted to be that voice of reason. And many of you answered that call, including Beau, who wasn't even in the

room, right? Yeah. All right, anything before I go to the next chunk? No, I think that's it. Okay. Good recap. You're going to hear lots of stories. In fact, I hope Beau does tell why he joined even though he didn't see the talk, but perhaps that'll be in his medical session. Now, this is a little controversial, but I want most of the talks throughout the next couple of days to use this. This is an oversimplification on purpose. Every one of them starts with a P on purpose, because when you're being a translator and ambassador, you're going to end up using words like "cyber" when you talk outside the echo chamber. And I already have my shot of

scotch, so no one can tell me to drink. But one of the things that's turned out, in hindsight, to be one of the most important points to make when we talk outside the echo chamber is this. When you go to an auto company that's been around for a hundred years, they don't know why anyone would ever want to hack their cars. Like, why would anyone do that? They must be trying to extort us, right? We hear that all the time. In fact, you're going to hear from Allan Friedman from the US Commerce Department. He's helping to run the NTIA working groups on voluntary, coordinated

vulnerability disclosure policies. And when we do the working groups in these safety critical industries, they believe that people are out to extort them or trick them or entrap them. And then we went to the Food and Drug Administration: you're going to hear from Suzanne Schwartz, who's basically the head of cybersecurity standards and guidance for all connected medical devices at the US Food and Drug Administration. She too didn't initially understand: why would white hats even want to hack an insulin pump? Why would they care about a bedside infusion pump? And while many of us kind of know this, I just want to put this into your consciousness, because it

becomes important for dozens of other use cases later. I essentially ask people I meet: why do you hack? What got you into hacking? And I find that most of us major in one of these motivations and minor in another. And we're not all the same. So just to outline what these are: I think there are protectors, who want to protect things or make the world a safer place. They just wake up every day, and they don't have superpowers; they use their hacking as their superpower. Number two, there are puzzlers. And this is probably the original zeitgeist of the hacker community. We want hard problems. We want to solve the Rubik's cube. We want the challenge. It's

the curiosity, right? You know, if we're criminals, if there's any crime we're committing, it's the crime of curiosity. So protectors want to make the world a safer place. Puzzlers want hard challenges and problems. But we also, as many of you have experienced, have a rockstar culture, right? We both put people up on pedestals and people love being on pedestals. And that's the prestige, or the pride. Make a name for yourself. Win the white jacket at Pwn2Own. And there's nothing wrong with this motivation either; it's just one of the reasons people might do this. Then comes profit, right? We can make a living off this. You might be able to make a lot of money. Maybe you start off

in something like a bug bounty program, and you get little payouts for lots of little things. Maybe you turn it into a consulting practice. But sometimes there's a personal gain motivation. And then lastly, this is the one that most people get confused about: there's also protest, or some sort of political bent. And this might be helping dissidents in different countries. It might be helping to out corruption. It's the folks that work with Citizen Lab, or if you've ever seen Morgan Marquis-Boire, who does stuff under the handle Headhntr. It's really kind of lawfully hacking for something or against something. And if you think about protectors and puzzlers

and prestige, profit, and politics, this becomes really necessary. And what I think I've noticed is one way that we've eased these safety critical industries into trusting us, and it's a little counterintuitive: we say, don't do a bug bounty program. We don't want you to jump straight to a HackerOne or a Bugcrowd. You can use their platform, but if you simply say, "We will not sue you," if you put out a welcome mat and say, "We won't sue researchers acting in good faith," with no recognition or reward, then you're pretty much only going to attract the protectors and the puzzlers. So you'll get fewer bugs, and they might be easier to handle while you

build your muscles. You might get your feet underneath you. You might see how many bugs, and what type of bugs, before you open the floodgates to everybody else. So sometimes the Cavalry takes a stand on something that's a little confusing, like: why wouldn't you pay researchers? We're not against paying researchers. In fact, some of these organizations, like GM, who came out with a coordinated vulnerability disclosure program earlier this January, did it with no cash prize, not because they're cheap, but because they wanted to see what kind of bugs and what kind of volume, and they're perfectly willing to pay very good money in invite-only, private hackathons and whatnot. And if you think of Tesla, they started

off with no cash prize, then they added a small cash prize, and at DEF CON last year they upped it to $10,000. So you'll see these kinds of things happen. And while this isn't exclusively about bug bounty programs, we found that to help the outside world understand who we are and where we're coming from, we've tried to present a friendlier, more inviting face by focusing them on the protectors and the puzzlers. For my sake, I'm first and foremost a protector: I want to save lives. I happen to also like really tough problems, so that's why I choose puzzler as my second one. So whether this resonates with you or not,

someone always has an argument with me, says I should add a sixth one. We could be pedantic if we want. They're not perfect, they're not comprehensive, but they're a fairly decent way to articulate that we're really a domestic resource, not a domestic threat. Right? All right. So, really quickly, especially because we got a little bit of a late start and you're going to get a lot of really great content throughout the day, a few highlights of what's worked over the last three years. One of the things we did, and I had a lot of help from a lot of people in this room, was to make some sort of gesture to the outside world. So, Beau had done

a TEDx, and he invited me to come help do one in Chicago. So if you haven't seen this, it's a good way for your neighbor or your co-workers to understand it. I did a TEDx called "Swimming with Sharks: Security in the Internet of Things." And without getting into the whole metaphor here, I went on a shark dive with David Litchfield and I realized how incredibly similar the Internet of Things is to getting in the water with apex predators. And while most people can't understand the jargon and the gobbledygook, they do get this, and it's been pretty effective. But our first real big deliverable was on our first birthday. So two years ago yesterday, at DEF CON,

we launched a Five Star Automotive Cyber Safety Framework for connected vehicles, and this was essentially our olive branch to the automotive industry, to say: look, you're masters of your domain; you've been making cars safer for a hundred years. But we're masters of our domain in cybersecurity, and now that our domains have collided, we will be safer sooner if we work together. And we thought this would be not a PCI framework or a checklist or an ISO standard, but really just the admission that all systems fail, and here are five basic foundational capabilities you need in the face of failure. Now, our belief was this would give us a way to meet these people, meet the entire ecosystem: the

suppliers, the insurers, the drivers, the dealers. And it did. We got in front of government folks, we built relationships, we had arguments, we were accused of being extortionists and whatnot initially, but ultimately we have been heavily embraced, and you're going to see some of the fruits of that later today. The sad part, though, is we thought this five-star framework would really be the starting line, and that once all cars had these things, then we could start doing real security on top of it. That way, when there's a car hack, the damage will be less, the public fear and anxiety will be lesser, the response times will be compressed, and so on. And

it's been very, very good. And I have a slide a little bit later that shows what these five things are. But essentially, it was an opportunity to work with all the different stakeholders, put ourselves on the map, and show that we're not just going to hack stuff on television and scare people; we're actually going to be part of the solution. And it took a little while, but it's worked. And we've been invited since then to at least a dozen automotive conferences over the last two years. After we put that out, we got a lot of responses in, some of which were: what the hell are you guys doing?

Who are you? Some of them were: I've been saying this inside my own organization for several years; this gives me the external ammunition to go and take it to my executive stakeholders again, and it will probably move the ball forward. So it wasn't us pushing from the outside; it was us identifying people on the inside who could then invite us in to have those dialogues. Yeah. And it's important to know that we got a lot of guff, even from our own community, that this wasn't comprehensive enough, or needed more things, or that we should have bothered to talk to a single auto worker. The truth was, we had spoken for

nine months with most of the stakeholders we could find in the auto industry. We talked to Tier 1 suppliers, OEMs, regulators, people who were working on the vehicle-to-vehicle encryption standard. We had done quite a bit of groundwork before we revealed it. So part of the ambassador job is finding willing allies and working with them. That said, there's a lot more to do. One piece of encouragement: two Fridays ago I was in Detroit for one of these invited cybersecurity conferences, and the CEO of GM gave a keynote that was highly similar to one I had given on cyber safety. We have successfully infected one of the leaders in that space, and by

her taking that leadership stance and using a lot of our jargon, like the importance of working with third-party researchers by putting out a welcome mat instead of a "beware of dog" sign, that's setting the tone for all of her competitors and for the regulators as well. In fact, we have the National Highway Traffic Safety Administration here today for one of the panels. Similar in shape and spirit, Beau took the lead this past January. Because of the relationship we built with the Food and Drug Administration, we made a Hippocratic Oath for Connected Medical Devices. So, want to say a minute on that? Yeah. So, I'll talk in more detail

about this a little bit later today, at 3 PM, in the medical device cyber safety talk. But the idea here was that, just as physicians take a Hippocratic oath to act in the best interests of their patients, increasingly medical devices are the delivery mechanism for that patient care. So shouldn't those things also have a similar type of ethos? It's kind of a mashup of the Hippocratic oath and Isaac Asimov's Three Laws of Robotics. So if you think about it that way, it's very, very clear. But subversively, we also designed it so that each person within the stakeholder ecosystem in that chain of care delivery can see themselves and

their own role in engaging on these topics. So each one of them can be read in a way that a physician can say, "Oh my god, that's my job," and a medical device maker can say, "That's my job too," and a nurse, a hospital administrator, and a healthcare procurement officer who buys the things can all see themselves reflected in that framework. Yeah, it really changed the tone as well. We revealed it at the FDA. They had a two-day workshop, and in the two days of panels, I think we had a Cavalry person on every single panel, at least one. It was pretty amazing. But some guy got up that we'd never met

before and said, "This is amazing. We all came from healthcare delivery. We want to save lives. Is there a single person in here who won't make this pledge right now?" He tried to get everybody on camera to at least volunteer for this, because we share a similar zeitgeist and a similar motivational structure, and they got into that field because they wanted to save lives. So we found that they went from looking at the security people as the ones that make it impossible to log into the crash cart when people are dying, to saying, "Wait a second, they can be a teammate here." So we kind of

changed the tone. I'm going to get into this one a lot more tomorrow, but just as a reveal: we've been using this, probably since day one, informally, whenever a safety critical industry wanted to understand how IoT is different, how safety critical is different from traditional IT security. So I've had this framework of six things, and we're going to publish it after this week, after we battle-test it here amongst you a little bit tomorrow as well. But it's incredibly important: everything that you've done in your cybersecurity career, or your pentesting career, or your researcher career, has mostly been focused on the confidentiality of data, keeping secrets secret. More specifically, most of it's

been on credit cards, or more recently maybe some intellectual property. But we spend an awful lot of time, and our best practices are really the sum of an equation, focused on the confidentiality of data. Things get really, really different when you start talking about cyber-physical impact: loss of life and limb, critical patient care delivery. The idea of detect-and-respond breaks down immediately. Some of us groan at the term "kill chain," but we've kind of resigned ourselves to the fact that we have to wait for a failure, notice it sooner, and respond. But how do you respond and undo dead patients? What if you had to push earlier, into not indicators of compromise but indicators

of reconnaissance or indicators of tampering? We don't have good answers for that, because we've resigned ourselves that breaches will happen, they're going to happen often, and we'll just respond. So one of the things I've had to remind people is we're not necessarily facing a Russian mafia card scammer when we're talking about a hospital. We have different adversaries. We have different consequences of failure, measured in loss of life, measured in public confidence, measured in GDP hit. We have different environments and contexts where these things operate: you can't put a perimeter around your insulin pump; you can't run an IDS or IPS device on your hip with a 9-volt battery. So the

environment: some of these are migratory in nature, some of them are underneath the ocean, some of them are on an oil rig. So the context is different. The composition is very different: this isn't all Windows-based. It's going to be different hardware, firmware, and software stacks that don't necessarily have aftermarket security available to them, maybe not even the computational power to do anything like encrypted comms layers. There are different economics: some of these smaller IoT devices have price points so low you'll never get a penny spent on security. They'll never have a CISO. They'll never pay for a pentester. They won't have a coordinated disclosure program. Conversely, some of these really big generators, it took a train to

move them into the facility before they built the facility. They're $30 million, and they're meant to last for 30-plus years. So the economics are wildly different. And then the time scales: like I said, maybe up to 30 years. Our cell phones are meant to last maybe three before we replace them. Our cars are durable goods that are meant to last for 11, but the software in them gets old really quickly. So these guys are grappling with these things: again, different adversaries (mostly ideological, not so much rational), different consequences, different context, different composition, different economics, and different time scales. It'll become clearer tomorrow as we dive in, but a lot of our best practices

and a lot of our conventional wisdom fall apart really, really fast. If I were to ask right now: if the vendor won't fix the bug that you found in their product, what's the best recommended action? Most of you would say, "Well, we'll just have to do full disclosure, right? They won't fix it until there's a disclosure event." Here's the problem. When you have a forever-day, an unpatchable flaw, in a life-safety-critical system whose manufacturer is out of business, if you were to do that, you might be the thing that puts people in harm's way. I mean, as I say to Beau, we have that old cliché that security through obscurity is no

security at all. But what we keep encountering are situations where that's all we have. That's all we have. And that's not comforting, but it is the truth. So we're going to dive into this a little bit more tomorrow. I've spent about as much time in the opening remarks as I can, but this is becoming a really important tool to use when we engage policy makers, insurers, hospitals, medical device makers, automakers, etc. And a lot of what we believe changes when you start running through this matrix. All right, a couple of big changes in the last 12 months. I made a bold claim and no one's pushed back yet, so I'm going to keep making

it: we have not yet had a high consequence failure in cybersecurity, period. Try to think of the worst one you've ever thought of, the worst one you've seen, and I'm going to say we haven't really had a high consequence failure. Was it Target? Well, their stock price is fine. The CEO lost his job; people still shop there. So was it Ashley Madison? Well, I'm sure some people got divorced, but it didn't end the world; we didn't all of a sudden start making much more secure things. Was it OPM? OPM was pretty devastating to the people affected, but has it catalyzed a massively different policy response? The answer is no. And I think

sadly we're about to leave the era of low consequence failures. So what I'm defining as a high consequence failure is something that shatters public confidence and creates a crisis of confidence in a key industry, something that will have a loss of life, an exotic loss of life, will have