
Just because you can, doesn't mean you should - Mark Orgeron

BSides Albuquerque · 38:13 · 24 views · Published 2025-08 · Watch on YouTube ↗

These people, I think with the exception of one of them, are in public service. So I would add to that: what can we do together to serve the public? So thank you all for that contribution. All right, let's get you hooked up, man. Yeah.

All right. So Mark Orgeron is a data privacy specialist at the University of New Mexico and a current co-chair for the IAPP's Albuquerque KnowledgeNet chapter. He holds several master's degrees and is certified through the IAPP and GIAC. So without further ado, thank you very much for being here. Take it away. >> Excellent. >> Thank you. Hopefully this works. Can you guys hear me? Okay. I'm horrible with microphones. So real quick, I just want to give a shout-out to Dell. Dell is one of my good friends. Everything that he does, everything that I see him present on, is just always insightful. And so I'm kind of honored today to get to present after

Dell. I'm a little disappointed he didn't invite me to the panel. Hopefully I'm invited to one later. But anyway, let's give one more round of applause for Dell.

So, and I appreciate the introduction. My name is Mark Orgeron; I'll jump into that in a second. But along with the panel that just happened, I've got to say I really appreciated a lot of the presentations this morning, because they're going to tie into this. Mark started off talking about philosophy and Kant and Plato, and that's all right up my alley, and then Mari was talking about community. This is going to be more of an ethics presentation, and there's nothing that really shapes our ethics and values more than the communities that we're a part of. And so I think it's been a great tie-in for what I'm going

to talk to you guys about today. So the presentation is "Just because you can, should you?", and it's understanding the ethical divide between permission and protection. And because I'm from academia at the University of New Mexico, I'm going to spend 37 and a half minutes explaining who I am and then be confused as to why I ran out of time. Actually, thank you for the great introduction; you pretty much covered everything for me. Jeff Gasway was supposed to be a co-presenter with me today. Unfortunately, due to the events that happened on campus, he was pulled away and is over on main campus dealing with that. But the big takeaway

from this slide that I want everybody to really understand is that Jeff and I are both engaged people, and we're both very community-centric people. And so later this afternoon, a week from now, six months from now, if you have a question about any of this stuff, or something comes up specifically with data privacy, feel free to reach out to us. We are more than happy to talk about this stuff pretty much until your ears bleed, because we live it every day at the university. So then again, coming from the university, I always like to have some student learning outcomes, because this is more of an ethical presentation. I

want everybody to walk away from this with just a different perspective today. So I'm not going to outline anything other than this loose principle: cyber security protects the institution, while data privacy protects the individual. One of the approaches we take a lot at the university, especially when it comes to data privacy, is we look at it really as a human rights issue and a civil rights matter. We work very closely with our ethics and compliance office, and I'll get a little bit into that with some examples later on. But one of the things about this presentation is there's no right answer. And so if I say something that

aggravates you, good. If I say something that you totally agree with, good. I want everybody to leave today with just a slightly different perspective on how you're thinking about cyber security and data privacy, and to start to think: are these two sides of the same coin, or are these competing entities? I think at times they're both; they can switch back and forth. And so that starts to really pull in our own ethics and morals on how to engage some of these problems, especially with AI. Talking about AI, I love agentic AI. One of my favorite hobbies has been making deep fakes with my friends, and

I charge my friends' kids a hundred bucks if they want a deep fake of their parents saying they don't have to go to school today. So, AI, it's a blast. I think you probably just got a little taste of my ethics and where I stand on some of this stuff. But I like to play around and see where we can push the envelope. So we're going to define some concepts real quick. For cyber security, we're going to focus on protection of systems, networks, and data from attacks, and our main concern is really some of those traditional principles, like the CIA triad, right? Like

very technical and very system-focused. Whereas for data privacy, we're going to focus on the rights of individuals over their personal data. So data is important, right? We need to understand that, we need to define that. But on the privacy side, we're really trying to ask the question of: do we need to collect this, and what is ultimately going to happen to this data, not just what are we doing with it today? And then we're concerned with consent, purpose, transparency, and minimization. So when we talk about people, process, and technology, the privacy side is really the people and the process, whereas cyber security, I look at it as the process and the technology.

And then again, just to reiterate: cyber security is about protection, data privacy is about permission. And then we have some technical versus ethical areas. So the goal for cyber security: prevent unauthorized access. For data privacy: ensure proper and ethical use. And you know, I'm not going to read through these; I'll let you guys just digest them. But again, the big thing that I want folks to start to understand is really looking at: where are we going with this? What is the problem that we're trying to solve? What are the potential outcomes, both good and bad? Right. And so this last one, the primary

question is: can we secure it? That's always an important question. And then: should we even collect it? We get into some case studies later that sort of deal with that. But on the previous panel, we mentioned security cameras. If you've been following the news today of what happened at UNM, security cameras all over campus would have been huge. They would have been a great resource for the police right now. However, security cameras collect a lot of data. And so that's going to differ based on your environment and how those are really deployed. So, a university: some people love the university setting, some people hate the university setting, because it's a

diverse environment where young people are allowed to come and express their minds and sort of grow who they are as a person. And so if we have a campus laden with security cameras, that can be a huge asset, being right off Central and University. It can especially help with police investigations, and can mitigate crime. So there are a lot of positives with that. But then the question is: our faculty who are doing research in kind of unique or risky areas to understand the human condition, are they going to be limited by some of this additional data collection? Are our students going to be allowed to fully grow and develop and engage in social

discourse if we have a campus that sort of becomes a Big Brother state? Is there a right answer to that? I don't know. I think everybody in this room probably has a slightly different opinion. And that's a good thing, and these are the questions we need to talk about. And so as we get more into the surveillance, the big data, you know, this is the "because you can": we have the access for it, but should you? And so, is the user aware? You know, cameras are getting smaller and smaller; we have all sorts of pixel tracking, things like that, where our everyday users don't know what data

they're necessarily giving up, and that can then be misused, even unintentionally. So I'm going to kick off my first case study here with Target. I kind of like Target, right? Because in data and cyber security, we always have a target on our back with what we're doing. But does anybody remember, maybe about 10 years ago, Target was sending notifications to pregnant women? Does anybody remember this controversy? >> Yes. >> So, this has always fascinated me. And I actually had a friend who got caught up in this and every few months would post on Facebook about Target sending her coupons for prenatal vitamins, for diapers, for different things you would need along the way for a

pregnancy. Now, great marketing idea from Target, right? Like, you're going to increase sales if you have this group who's looking at things that are going to kick off something that's on a nine-month cycle like pregnancy. This is a great idea, a great marketing plan. However, this friend of mine had a very early miscarriage, and every time Target sent her another marketing notification, it sort of reopened that wound for her. And so, was that Target's intention? Absolutely not. But when I talk about unintended consequences, and what type of data we're collecting and why, these are the types of scenarios we need

to sort of formulate and ask: what could go wrong? And I just remember my friend going through this, and you know, it's "here's your advertisement for diapers, here's your advertisement for onesies." And this woman was a paramedic who worked on an ambulance in a high-crime area much like Albuquerque, and it threw her off her game. So you can kind of see the cascading events here: you have a data collection, and then there wasn't a way for her to remove herself from this, because she wasn't even aware that that data had been collected. And so this is a study that I like to kick things off with, or a scenario to kind of

get people thinking about how this is bigger than cyber security. This is bigger than just the regular threat vector. This is bigger than zero trust. There are more things at play, especially when we talk about cultivating environments. So, we just talked about Target; now, 23andMe. Who's heard of what's been going on with 23andMe? So, I'm going to give you guys some information about me. This is probably like second- or third-date material. I was originally adopted as a child, and so early on, when 23andMe came about, you know, I spit in a little tube, sent it off, and got all the information I

wanted. It was a great service for me. I don't look anything like my family. You know, you look at a family picture and it's like, "Who doesn't belong? There's Mark." And so for me, 23andMe was a great tool to sort of understand where I came from as a person. However, 23andMe is going through a bankruptcy now, and there were some questions about what was going to happen to the user data. Now, 23andMe has gone out and said, "We're going to purge this data if you send in a request." And to date, about 15% of their database has been deleted

from the potential sale. And so for me, this is an example of a company standing by their privacy policy. And so, if you're going to create a privacy policy, you have to follow through on what it says. And there have been a lot of advocacy groups; the state attorney general for California has really put a lot of pressure on 23andMe to give people an easy avenue for deleting that data. I've deleted my data. I'm operating under the assumption that it got deleted. We'll find out in six to 10 years. But again, as we're talking about GRC and those types of things, it's really trying to have that forethought

of: we're doing this today; what is the landscape going to look like five, six, 10 years from now? And then along those lines, this is another fascinating one to me. There's a run of fine-dining restaurants in the San Francisco Bay Area, and what they're doing is, when you book a reservation, they're basically creating a portfolio of you based off of your online footprint. So that way, when you go into the restaurant, they're going to give you more of a tailored, customized experience. This is in the article that's linked, which everybody should have access to later on. Yes. Yes. You guys can read this at your leisure,

but what they're doing is, one of the people mentioned has gone to one of these restaurants that's been doing this three years in a row for her birthday. So she's a repeat customer, and the restaurant found out that she likes penguins. So they created a penguin theme for the restaurant for her birthday that night, right? Totally benign, right? It's kind of a fun thing. Again, a great marketing idea. But the question is: what's happening to this data once the restaurants don't use it anymore? Or is there a line that these restaurants are crossing with the information that they're collecting? Because they're basically creating intelligence portfolios on their clients. Are those

getting secured? Does all the staff have access to it? So there are a lot of different things there that come up in my mind. It's sort of like: we can do it, there's a good business reason for it, but then should we really do it, or are there ways to mitigate risk in those areas? And I kind of picked these examples because I didn't want to come up and bore you guys about UNM. We do a lot of cool stuff there, but I wanted people to think about these different types of diverse scenarios that are happening every day across every walk of life, and start to apply your ethics to them. You

know, your morals: is that something you would want to have happen to you? Would you like to walk into a restaurant for the first time and everybody on the staff knows your name? There are some days for me that'd be awesome. Like, oh, you know, I like Clyde May's whiskey, drop it right in front of me. And then there are other days I just want to be totally anonymous in the crowd. You know, I don't want anybody to talk to me. And so giving people the ability to sort of live their lives and form those communities how they want is a big piece of our approach to how we're looking at things within data

privacy. And so, kind of toward an ethical data governance, right? We're going to build cyber security and privacy by design: kind of a standard process. Minimize data collection to only what's needed: standard process. Communicate data practices clearly: this is one that should be a standard process, but I see people fail at it all the time. Who remembers when the Facebook Messenger app became a thing, right? A good friend of mine from college was actually on the team that built that app. They never intended that app to steal all the data that it did. However, something happened in the development that they could never actually figure out. And so they never... I shouldn't say they never, right? But

with Facebook and a lot of their practices, they're not always communicating to the end user what's being taken, what's being marketed based off their data. And one of the big things we're big on at UNM is empowering user control over your data. We give students the ability to opt out of things. We give students the ability to have a privacy flag on their account, so you can't look them up in our online directory. And so we're giving that individual the ability to choose whether to be in that community or not. And then the last big thing is, a lot of times in cyber security and data privacy, you know,

we're always seen as kind of a pathway to no, right? Like, you come to us, we're going to tell you no, you can't do that, for X, Y, and Z. But the goal is not to limit innovation; it's to have those conversations. It's to have those debates. I wish Jeff was able to make it today, because he and I debate all the time. He comes from a very traditional IT side; I came from more of a physical security side. I had the fortune or misfortune, however you want to look at it, of working for our governor during COVID for a while, and had some very interesting perspectives there about what type of

data we were collecting during COVID. So he and I are constantly debating what our pathway forward and what our compliance should look like, and through those debates we generally get to a right answer. And so having those open communications, being able to have those discussions with your teams, is huge, especially with people who are not necessarily technically savvy, with those end users. We work a lot with our attorneys, our compliance group, and then faculty. I like to say that at UNM we call them faculty because if we could call them adults, we would. Glad that one got a laugh. You know, we have some interesting groups at UNM who

don't always understand the things that we do, and so being able to engage them is important, especially when we're talking about what type of data we're collecting. And then there are some questions we tell our stakeholders to ask themselves if they're going to be doing this: Are we building trust, or are we eroding it? Would I be okay if this was my data? How does this serve the public good? So along those lines, if you have any questions, I'm happy to answer them. Otherwise, all my contact information is up here. We have an IAPP chapter; I'm one of the co-chairs. If folks are interested in that, we'd love to get you involved. We

do sort of alternating educational events that are generally over Zoom, and then local networking events. So, if that's a community you're interested in getting more involved in... Absolutely. So, the question was, can I tell you guys more about that group? IAPP stands for the International Association of Privacy Professionals. I would look at it as sort of like heads and tails, right? So on the heads side you have the IAPP on the privacy side, and on the tails side you have SANS and the CISSP. The IAPP is going to be more law, ethics, compliance, and regulatory focused, whereas SANS and the CISSP are going to be more technically focused within the information security

and privacy environment. >> Yes sir.

things like that. >> So, yeah, a little outside of the specifics of, you know, the framework of what you're talking about here. But once you start getting into the end users' experience with things, that's where oftentimes a lot of these "no"s and other things really impact people's lives and their livelihood and everything else. Do you have thoughts or feelings on specific things like blockers, ad blockers, strippers, the things that give end users some tools to sometimes evade or get around some of the annoyances that get tossed in their way by the designers? >> So if our UNM attorneys were sitting right here, they'd hear my answer and

they'd be like, "He does listen." So my answer to that is: it depends. I think it really depends on your environment and what you're trying to do. And when you think about the CIA triad, there are certain environments where you want a lot of availability and maybe confidentiality doesn't always matter, but then we have certain things in federal government where we want a lot of confidentiality and, you know, low availability. Sort of like my experience with 23andMe: I wanted my data to have a high level of confidentiality. I wanted the integrity to stay, and then I wanted a high level of

availability for me, but nobody else. So there are certain things we look at; we look at everything like that on a case-by-case basis. We try not to limit things, really, for academic freedom; we want people to have that freedom. When Facebook first came about (and I wish Jeff was here, because he would explain this to you; he was the one that did it), we banned Facebook from campus. So you could not access Facebook from the university network, because of how poor their privacy policy was at the time. Was that a good idea? I don't know. We did it.

But more and more, generally, within our environment, the only things that we try to block or strip away are things where we can clearly say: this is a threat to our campus community.

>> Yeah. >> Sometimes I feel hopeless about privacy, because the concession of privacy is built into the terms of service on the device, the operating system, the application, the service; all levels are a concession. So how can you reasonably approach best practices for privacy when so much of that concession is built into being allowed to use the service in the first place? >> So this is one of those things where I think, as individuals, we really need to empower ourselves. I think that's first and foremost, and I think that's going to be one of the hardest things in this. In the United States, we give so much information out for free. And the question is for us, right,

as a community, whether it's this BSides community or our regional community of IT professionals: what are we comfortable with sharing, and what do we need to keep private? So, as a society, you know, HIPAA information, personal health information: we've decided that this is going to be an area where we're not going to talk about it, so much so that people get it wrong in a positive way, I think. And so I think the first thing is facilitating these conversations, but then looking at things like GDPR: what are its foundations, right? Like, there's a lot from GDPR that goes back to pre-World War II

information the Nazis used to basically target other political factions. And so it's an interesting paradigm. I think there's a lot of thinking in that area that's probably the correct thinking, but gets viewed as radicalized thinking a lot of the time. But I think it's our willingness as people to come in and look at a company like Facebook, or look at the Googles (you know, I use all these companies, so I'm not trying to bash them or throw them under the bus; they're just easy household names), and say, "I'm going to limit my usage of your service because you're engaging in these practices." Sorry, does that answer your question?

>> I felt a little politician there. I feel like I was channeling Michael Padilla for a second. >> Mark, what can we do to protect or educate people about this? I know young people are still posting.

>> So, that's a great question. I'm going to start that off with a joke, because I do fashion myself as a little bit of a comedian; I've got some things I'm going to be working at Casada's comedy club later. So I appreciate your tolerance today. But I grew up in that generation, right? It was the early days of America Online, AIM, things like that, and my parents said: don't give out where we live, don't meet anybody online, don't meet strangers, right? And now, from my phone, in five minutes, I'm going to order some food for dinner, I'm going to get on Tinder and find a date

for this weekend, and I'm going to have Uber drive me around wherever I need to go. So the culture around all of that has shifted significantly, right? There was that additional fear compared to the comfort level that we have now. And I think one of the biggest things, especially working in the campus environment and seeing the amount of crime that we have in Albuquerque (I've been fortunate to wear other hats doing things like anti-terrorism and anti-crime work, and to work with people like our district attorney and the Albuquerque Police Department), is there's a lot of needless crime by young people that's just

sort of, "I'm doing this for Instagram. I'm doing this for likes." And that's a big problem. That's probably bigger than what you were envisioning, but getting away from this culture, this social media influencer culture: it definitely has its place in our society, but I think, and this is totally the world according to Mark, so you'll hear some radical ideas: I think kids shouldn't put football pads on until they're at least 16 years old. With some of our cyber security and COPPA laws and things like that, I think 13 is too young. I have some neighbors that I really like. They

don't have a TV; they have a projector screen, and they do family movie night. Three kids, elementary school; their oldest is 13. He's got a flip phone, and their parents have told their kids they get $100 a year for every year they're not on social media. Are there ways that that's sort of depriving them? Absolutely. But you know what? These kids are playing outside. They're reading. You look at my neighborhood, and it's like the 1980s, with kids out on bikes and pogo sticks. And so again, I think it comes back to our individual ethics and morals. I have other friends whose kids... I don't have any kids, by the way.

So I'm totally looking into the fishbowl, judging everybody on this one. But then I have other friends whose kids just sit behind an iPad all day, and their parents have no idea what they're doing online. And so I think it's one of those things: it takes a village. You know, in New Mexico we have one of the worst education systems in the country. Does that factor into this issue? Absolutely. And so I think it's a multifaceted problem that we just need to get people talking about. And I'm not talking about it in terms of the political yelling that we have

with people; you know, facilitated conversations, panels like what y'all had, events like this. Just to start to shift that paradigm and get people to understand that this can have an impact. You know, these five or six likes and shares that you got on the video are going to be that dopamine hit for the next five minutes, but are you even going to remember it five years from now, when you can't apply for a loan? Things like that. At the university, one of our compliance regulations is FERPA, and that protects student work up to a certain point. At the university, we like to give students that environment where they can express themselves. And

it's important to us to protect that data. So if a student writes something in one of their classes that's totally radical (you know, I don't care if it's left-wing, right-wing, whatever the opposite of your belief is, right?), we want to give them the environment to express that, and then have the discourse. But what we don't want is, as that student's grown, evolved, changed, as their perspective has changed, maybe they've done things in their life to travel a little bit and broaden their horizons, we don't want that paper to come back and be a hindrance to them later on in life. And so I think that's one of the

things we do very well at the university: sort of protecting that in the long run, and giving our faculty environments where they can let kids (because that's what our college students are, they're kids) have that freedom, but where we can then protect them. It's just all these other environments where there isn't that oversight. So, I had kind of left this presentation a little bit short, more for questions and conversation. And so... see, no, I think somebody was just giving me the finger over there. So feel free to ask questions. I'm happy to field anything. Again, all my contact information is up here. I'm happy to talk offline about anything.

But you know, the time is yours. Otherwise, I'm happy to give everybody a little... >> You know, I have a comment, actually, just reflecting on what you're saying. To me, it sounds like contextual integrity. You said the word integrity a moment ago, but it's this contextual integrity that exists now for the little one, right? Which then will change over time as their identity changes, because, like you led with in the beginning, our identity is coming out of the communities that we're in. And so we really want that contextual integrity to follow our identity, to follow us, and not have anybody exploit that. I just wanted to see if you have any thoughts

on that. >> So I'm going to take you down a rabbit hole. For a while, I was working on a PhD that really focused on radicalism: how people radicalize and then ultimately jump to extremist behavior. Has anybody ever heard of the historical event in the United States, Harpers Ferry? So Harpers Ferry is kind of like the first-ever domestic terrorism event in the United States. And the people who pulled off this heinous act of violence were radicals and extremists. But what they were fighting for was an anti-slavery movement. And so this attack that they pulled off at Harpers Ferry was basically aimed at slave owners, trying to highlight

that owning slaves was unjust. And I like using this as an example because today, right, when we look at equal rights and how the country's evolved, we would never consider those people radicals or extremists. And so it's really the cultural shifts within our society that are going to dictate this. And you look at some of our political environments right now: it's difficult to just have honest conversations with people without people getting upset. And I think the power is really with the people, as cliche as that sounds, to make those changes. And it's all interconnected: the data privacy, some of the things that have passed

recently with AI and its not being able to be regulated for 10 years. There are going to be some great innovations, right? But that's going to create all sorts of quagmires for us. And so again, I think it really comes down to us taking ownership of our space and our communities. So, does that... >> I mean, yeah, I think that's a great response. To go further into the rabbit hole, I see us as needing this shared or collective understanding of reality. I was just having this conversation with this gentleman yesterday. We tend to have a different interpretation of facts as individuals, right? And so at this conference, or in a group discussion,

what we're then trying to do is create this shared or collective understanding, right? But to do that, we need to understand each other's goals and what principles we're operating with, right? And what strategies we will use to get to those goals, right? And so it's this collective understanding that I think we're missing right now, but we're trying to create it, right? >> 100%. And you talk about truth: one of my pet peeves in society right now is people saying, "It's my truth. This is my truth." No, no, it's your perspective. And we need to remember that. And I think we've somehow subbed out the words

truth and perspective too much. And it's getting back to being able to share perspectives in healthy ways. And a lot of this stuff, right? Like, there's no wrong answer. There are definitely things we could do very wrong, but when it comes to privacy, there are a lot of right answers, and there are a lot of things we can do better. And I think if we start focusing on the things we can do better, and making those things better, that's going to be the pathway to sharing those perspectives. I'm a very open person; that's part of the dynamic between Jeff and I. And if you've ever had the pleasure of meeting my boss

Jeff, the ISPO from UNM: we are polar opposites. It's crazy. Jeff is very, very private, whereas I'm a person where, if you ask me a direct question, I'll give you a direct answer. It doesn't necessarily bother me so much, and that's my perspective. Whereas in the environment that Jeff grew up in, he needed to be a private person to sort of survive. And I probably know more about him than he would be comfortable with. But he's not here, so we'll just... trying not to make an Epstein joke right now. There's a minute there on that camera we can edit out, I'm sure.

But, uh... sorry, I totally just threw myself off. But I think the point is, we all need to look at: okay, what are my privacy values, what are your privacy values, what are your privacy values? And then bring that to the conversation and say, okay, what works for the group? And then, what additional controls can we put in for those people who need a little bit more to feel comfortable? >> I feel like you and I could talk for hours and hours. Does anybody else have any questions?

All right. Well, thank you very much. We're going to go ahead and take a break right after this. We'll be back.