Cyber Equity and Empowering Through Security
Season Two · Episode 10


In this episode, host Raghu Nandakumara sits down with Nicole Tisdale, Founder and Principal of Advocacy Blueprints. Nicole spent 15 years as a national security expert at the White House's National Security Council and the U.S. Congress's House Committee on Homeland Security. She joins the podcast to discuss cyber equity and security policy.

Transcript

00:00 Raghu Nandakumara  

Well, Nicole, it's such a pleasure to have you here on The Segment. Welcome!

00:06 Nicole Tisdale

Thank you. Thank you for having me. I'm happy to be here.

00:09 Raghu Nandakumara

Your background is incredible, right? National Security Council, US Congress, and you founded Advocacy Blueprints. I think it's always interesting to hear about the background and the journey our guests have taken to get to where they are and what they're up to, and to understand what motivates them. So, let's start there. Take us on that journey.

00:35 Nicole Tisdale

Sure! Journey would imply that I've made it where I'm going and that I knew where I was going when I started, and that has not been my intro into cyber at all. As you'll be able to hear from my accent, I'm very southern. I grew up in a no-traffic-light town in Mississippi—they are real, they exist. I got into public policy, really, because I started in undergrad at the University of Mississippi, and I was taking an intro to political science class, and just the idea that there are people who advocate for others through public policy seemed really fun to me. Up until that point, I wanted to be the first lawyer in my family and the first lawyer in my town. And I remember thinking, "Oh well, I'd rather advocate for people in public policy because then nobody's going to go to jail, and no one's going to lose any money if I get it wrong." Clearly, I didn't have a full understanding of how public policy works! But all during undergrad, and then subsequently when I went to law school, I just always did public service internships. Over the course of seven years, I did eight internships, and I really just liked dealing with people, hearing about their problems, but then also trying to figure out public policy solutions. And I will say, early on, I fell in love with the idea that public policy could solve people's problems because it just helps scale. Right? Like, if there was a public policy solution to a problem that you brought me, I can use the same public policy solution for someone who lives in Mississippi and someone who lives in New York. And so early on, I just saw and really kind of fell in love with how scalable public policy is in helping people with their everyday lives.
I started working on Capitol Hill. A number of my internships were on Capitol Hill. And I ended up on the House Homeland Security Committee, which for most people, they're like, "Oh, you just jumped into the middle of the fight." And I always tell people it's one of the most bipartisan committees in the US Congress, but you only hear about it when they're fighting about something, right? So, you hear when they're fighting about immigration policy or when they're fighting about counterterrorism policy. But they also have jurisdiction over the entire Department of Homeland Security, and so they work on cyber. When I was on the committee, we worked on cyber policy just as much. But, as you know, sometimes the press doesn't cover cybersecurity policy the way that they cover some of the other things. And so that was kind of my introduction to cyber policy. And honestly, I had the cyber portfolio at the same time that I was working on counterterrorism policy and counterintelligence policy, which is just a long word for espionage policy. And when I first started working on The Hill in 2009, cybersecurity policy was not at the top of that list, right? Like, I was really focused on ISIS and Al Qaeda and the growth of white nationalists, but then also the Snowden leak happened. It was just all of these other things. And every once in a while, there would be a data breach and a data leak, and so Congress would focus on it for a little bit, but it just wasn't a priority. And I would say, really, 2016 and the election interference is when, for most people, cybersecurity showed up to their doorstep, in their polling place.
And before that, it had just been data leaks, which are very important, but it's hard for people to see the immediate impact of a data leak, and I know we'll get into that a little bit. A lot of cyber policy is not tangible and it's not immediate, but the 2016 Russian interference in the US elections was something that was very close to the average person. So, a lot of people are like, "What is election security? What is cybersecurity?"

04:27 Raghu Nandakumara

I mean, that's already interesting. And just that last point that you hit on, about when the average Joe or Jeanette thinks about cyber: you're absolutely right, data leaks, okay, they are a dime a dozen. But when it is something that you can associate with, okay, I go to a voting booth and I'm wondering whether the vote I cast is the vote that's recorded, right? That suddenly makes it real. And I think in the same way, if we think about why cybersecurity now is very top of mind, and people are aware, I think it's because the impact of some of these cyberattacks is becoming much more tangible to the average person. It's that connection of, I actually see that, oh, that cyberattack, let's say Colonial Pipeline a few years ago, impacted my gas prices, right? "Oh, god, okay, I now care about this." I think that was the tipping point.

05:29 Nicole Tisdale

Yeah, and I mean, it falls in line with some of those other issues that I told you I was working on, right? Like, the thing I always tell people about national security, especially my friends and my family, because we've set up these barriers to entry and education as it relates to national security—we can be really snobbish. I think the cyber community is guilty of that a little bit too. I tell people, I'm very honest, I don't actually want you thinking about national security policy every day. It's not a good sign. When I was working on counterterrorism policy, I used to tell people, I don't really want you to be afraid to go to the football game because you're afraid that there might be a terrorist attack. The same thing is true with cybersecurity policy. I don't want people to feel fearful about cybersecurity. I want it to be empowering. And so, to your point, I do think there is something to be said for when cybersecurity becomes tangible to people, because you have their attention for a fixed amount of time. And in my world, public policy, you need timing just as much as you need good policy. And so it's funny you bring up Colonial Pipeline, because, you know, for me, Colonial Pipeline was an opportunity to talk about cyber incident reporting again. I had worked on that legislation when I was on Capitol Hill. When Colonial Pipeline happened, I was working at the White House, where I was the Director of Legislative Affairs for the National Security Council. I tell people that wasn't a new concept. I just had Congress's attention, and Congress had their constituents, and they had the public who were pushing them to do something. People were actually really shocked that you don't have to report major cyber incidents to the US government. And so Colonial Pipeline was followed by the JBS meat processing attack. I was in the White House, so it was a little annoying to me when the press was saying Joe Biden was the reason we weren't going to have hamburgers for the Fourth of July. But you know, for people who weren't impacted by the gas shortage—they want their beef, they want their hot dogs. And then after that, we had the Kaseya attack. We had Log4j. There were just a number of things that happened in terms of timing that really showed the American public: we may not have all the solutions right now, but at a minimum, we need people to report these incidents to government.

07:51 Raghu Nandakumara

Yeah, you're so right. And I mean, you rattled off so many of the high-profile cyber incidents that we've had over the last handful of years, right? And it's actually quite scary how, if you think about it and rewind the clock, you probably had one of these major things once a year, right? And now it's not even one major event a quarter; it's like once every few weeks we have something that is reportable. And you're absolutely right, I think the reporting is so important because the transparency is important. Keeping this sort of closed and secret is a far greater risk than being transparent about it, owning it, and then discussing, okay, how do we ensure that we minimize the probability of this happening again.

08:45 Nicole Tisdale

Yeah, and good public policy does that, right? Like, to me, that bill was about transparency, and not just transparency for the government's sake. The need to report those attacks was also in line with the rise of ransomware, right? And, like, what we needed was to be able to start trying to put the dots together. And when we have some people reporting and some people not reporting, it's really, really hard to protect the next person. But it's important, when you're doing public policy, to also think about the other side of that. And so I always tell people—and it got a little contentious when we were at the White House—but I always refer to the people reporting as victims. Not businesses, not critical infrastructure owners and operators; it was very intentional to call them victims. And I would talk to members of Congress, but also talk to other policymakers. And I would say, you know, in one of my internships I worked at a domestic violence clinic, and the way we treat victims is you don't punish them because they waited too long to report, or because they don't have all the information when they report. You thank them for reporting. You take the information that you have and you try to gather more. And it was really contentious, right? For some people who felt like some businesses were taking advantage of the fact that there was no requirement to report, and who wanted to kind of bring the hammer down really fast. I'm like, good public policy understands you don't do all the things at one time. What we need people to do is start reporting. We can talk about penalties, we can talk about incentives, all of that stuff will come. But at a minimum, just getting that framework of what is reported, who reports, and who do we report to was a really good place to start. But also, because I'm really good at public policy, I knew I had a window that was going to close. Right? And I couldn't spend two years fighting about penalties or fighting about incentives for reporting. I needed to get that framework done, with the idea that while Congress doesn't repeal laws often, they do amend them very often. And so, being able to say there may be another set of cyber incidents that come in; maybe we will see where someone was reckless in how their security was set up, and so maybe we need to have a conversation about penalties. Maybe we see, two to three years from now, people are reporting. When you report, not only should you give data to the US government, it should trigger things. Like, we're going to talk about cyber equity: if a critical infrastructure owner or operator reports, and we know they don't have the resources to respond when they report, are they going to get technical assistance from the government? So, being really clear that, like, this is the baseline, and this is the framework, and we're going to keep building on that.

11:34 Raghu Nandakumara

I really appreciate you going into that level of detail, because I think it's important for listeners. I know there'll be some who are very aware and very informed about this, but equally, there will be many who are really, truly understanding this for the first time, and will be questioning what the need for reporting is. I think you made that very clear. You used the word victim versus an organization that is impacted, or a business that is impacted, et cetera. Go into a bit more detail about why it's important to use that word and to think of those affected as victims.

12:15 Nicole Tisdale

Yeah, I will. So, I like to use examples. One of the things when you work in public policy and you work on cyber: you have to have examples. You have to have analogies. I tell people I just like to roll with a toolbox full of explanations for lay people, but also for people that have expertise in different areas. To me, one of the examples that I use for victims is the water owners and operators in Flint, Michigan, or Jackson, Mississippi. Most of your listeners are going to be really familiar with the Flint water crisis that happened a few years ago, but then also the Jackson water crisis that started a couple of years ago and is still ongoing. Both of those were physical water crises. But I think when we think about it in our cyber domain, we now have intelligence reports that the People's Republic of China, the PRC, may be pre-positioning on some of our critical infrastructure. Maybe some of that critical infrastructure is the water sector. So, when you think about that, I always tell people, if Flint, Michigan, and Jackson, Mississippi, didn't have enough money to invest in the physical infrastructure for their water facilities, it is not shocking that they don't have the money to invest in their cyber infrastructure as well. And so the idea that we would be thinking about them as anything other than victims, right, with a state actor targeting their infrastructure, is almost ridiculous. I mean, how could you possibly think that they would have enough resources to counter a government actor or a state actor? And so when I think about those victims, I think about, yes, the city of Flint, Michigan, or the city of Jackson, Mississippi, as a victim in terms of its city infrastructure, but also the people that they serve, right? These are also low-income communities, or communities that are completely reliant on their city infrastructure. To me, all of the people in that chain are victims. And so I always give that as an example to people to say, like, "How could they not be?" And people will say, "Well, you know, they should have been investing in the infrastructure over time." Shoulda, woulda, coulda. Public policy is not about penalizing people for what they could have been doing or should have been doing. It's about making it better in the present and then making it better in the future.

14:46 Raghu Nandakumara

That's fantastic, like making it better in the present and making it better in the future, rather than punishing them for what maybe they should have done previously. And you gave such a great example there about the disproportionality between the target and the person essentially attacking. In this case, you've got a local or state organization, who we already know are struggling for budget. They're typically in significant debt. And then you've got a nation-state actor, which potentially has bottomless pockets. So, there is that disproportionality between the attacker and their victim. And I know that something you're particularly passionate about is cyber equity. How do you reduce that disproportionality? So, before we get into that discussion, define cyber equity for us.

15:43 Nicole Tisdale

Sure, I will say this is a working definition, because the definition doesn't exist, and so it's something that I have had to create as I talk to people like you and others, to actually name something that we all know exists, but then structure a definition that is complete. So, cyber equity refers to the fair and just distribution of cybersecurity resources, protections, and opportunities across all segments of society, with a focus on vulnerable and marginalized communities. And what that means is cyber equity means equal access to secure technologies and infrastructure. It means inclusive cybersecurity education and awareness programs. The example I always use is, I don't say digital literacy. I don't say cyber literacy. That is not inclusive language, and also it's technically wrong. I always tell people, literacy implies that you're going to get to a level of understanding where you're not going to have to learn anything anymore, right? When you're at a sixth-grade reading level, you'll always be at a sixth-grade reading level. If you've learned something about cybersecurity best practices today, I guarantee you in 10 years it's going to be timed out. It's not going to be relevant anymore. And so being really clear that we're not going to show up to communities who have not had equal access to education, who have had discrimination in education, and talk to them about their literacy rates. Like, we're not doing that. But it also means proportionate protection from cyber threats and cyberattacks. That goes back to what we were just talking about, in terms of understanding that some communities are going to be a higher target for our adversaries, not because of the communities they serve, but because of the protections that they don't have. People go for the easier targets. But then also fair representation in the cybersecurity workforce and decision-making processes. So, one of the things I'm very passionate about is, I, too, believe we need more diverse voices in cybersecurity, but I don't want them all over-indexed in entry-level operational roles. Because I work on the policy side of the house, I see that when there are not diverse voices at the policy table (and a lot of times those are not entry-level positions; they're more senior, leadership-level positions), policy suffers for organizations, for businesses, for government, and the people in the operational roles can't change the policy. And so being really clear that when we say diverse representation for vulnerable and marginalized communities in the workforce, we mean at all levels of the workforce, and we mean in all positions. And then the last part of that, which I might build out more, is just being really clear about equitable policies and regulations that address unique cybersecurity needs for diverse populations. So, we'll get into some of the most basic things that we're asking from a cybersecurity standpoint. But I always tell people I'm always a little disappointed when it seems like a lot of cyber policy people opt out of rural broadband and broadband access programs or policies. And they're like, "Oh, we don't really work on that." And so I always ask, "Well, as a general rule, do you support password managers? Do you support multi-factor authentication?" We're on a zero-trust podcast, so, "Do you support zero trust?"
And it's like, "Yes," and I'm like, "Don't you need internet to implement all of those things?" So how can you opt out of a broadband conversation? If people don't have access to reliable and secure internet, they start using sketchy internet, and they have very sketchy cyber practices. And so you can't opt out of the regulations and the policies just because it's not something that is in your normal technical toolbox.

19:37 Raghu Nandakumara

Thank you for that very detailed overview. Cyber equity is such an important thing because, almost tying it back to that last thing you said, in this connected world that everyone has to operate in, it doesn't matter where you are in terms of your economic status, race, ethnicity, and so on. Ultimately, we're all in this connected world, and we're all exposed in some way or another to the same threats. But if we don't all have access to the right level of awareness and protection, then the risks that some of us carry are disproportionately greater than for those who do have that access. And I assume that is kind of a key focus of what you're looking to address through cyber equity.

20:34 Nicole Tisdale

Yes, it's the operational roles and the policies, as you just stated, but it's also all the things that we need to make good policy. One of the things that I tell people is there just isn't a lot of research that even centers on the impacts on, or the targeting of, humans in cyber research, right? Like, we are over-indexed on our adversaries. What tools are they using, what techniques are they using, what tactics, their TTPs. And I never say that we shouldn't be doing that. But then, when I try to talk about, well, I want to know the demographics of race, income, and age of who is using password managers or who has adopted multi-factor authentication, the data just doesn't exist. It's almost to the point of, like, we want to focus on the adversaries and how they are taking advantage of that. It's like, okay, two things can be true. We can focus on what the adversaries are doing, but then we need to connect the dot to what the people, the average Joe, the average Jane, are doing too, so that we can start to build in those defenses. Because, to your point, it does me no good. I use my family as an example: I think I have very secure practices, but when someone in my family's banking information or financial information is stolen, it looks very different, because a lot of them are already living under the poverty line. I wrote an article for WIRED magazine at the beginning of the year just kind of laying this out for people, and I talk about the cyberattacks that have been happening against the SNAP program, which is our supplemental food program. Cyber criminals have been attacking the EBT card, the electronic benefit transfer card, which is how recipients use their benefits. Even in the language that we use to talk about this in the policy field right now (although I'm working to change this), we call those low-dollar crimes. And I think that that is a misnomer. Those are high-impact crimes; the average amount of money stolen is $1,000 from a family. And I use Maryland as an example because Maryland had the most money stolen in 2023. A family of four can't make more than $40,000 a year in Maryland. They also cannot have more than $2,000 in their checking and their savings accounts combined. So, what they do, what we all do: sometimes you need more money for food, during the Christmas holidays, or when kids are out for summer vacation and they eat more. So many of them are kind of storing money on their EBT card so it will be there in case of an emergency. Same thing with natural disaster victims or veterans; they're all kind of using the EBT card as a way to manage their food and their necessity financials. If you steal $1,000 from a family of four that makes $40,000 a year, that is devastating. And EBT cards don't have chips like our debit cards and our credit cards. They don't have fraud protection. You can't just call and have the money put back into your account. So, understanding that that is high impact. And our adversaries appear to have figured out that there is less of a focus on public policy, or less of a focus on the solutions, because they continue to attack those programs. Every state, every territory, has its own SNAP program, and they're all being attacked. So it's figuring out what a solution looks like that is not just, "Oh well, they should have fraud protection, and so whenever someone loses the money, the state should just put it back in."
And I'm like, these states don't have the money for that, y'all. Or this: I've talked to cyber folks, and they're like, "We gotta get chip cards on all of these EBT cards." And I tell them that is not going to happen at the speed that you think it's going to happen. What is going to happen is California is going to have chip cards first, and then maybe New York. But what about Mississippi? What about Michigan? What about states that have a large portion of people already living under the poverty line, so they don't have a big tax base? You're just going to put a target on their back, because when you get to a place where California has the protections and New York has the protections, you're drawing more attention to the programs in the states that don't have them. So, that's a long example, but I think in terms of the question of what it looks like and how you integrate it, it's just making sure the cyber community knows you're not going to be able to come up with solutions to these issues without taking in social equity policy and social equity viewpoints. Because the security answer may be just that, a security answer, one that is not practical and can't be implemented, and thus it's not good public policy.

25:32 Raghu Nandakumara

This is a really interesting area that we're discussing here, but I think part of the challenge is actually in how a cyber incident is reported in the media. You often hear about, okay, let's say a million records were stolen, right? Or, if we think about something that happened very recently, which wasn't a cyber incident but an IT incident, a massive outage: millions of systems going offline, et cetera. But very, very rarely, tying it back to the example you gave about the EBT cards, do you hear about the social impact of those incidents. Or it'll be a tiny subpoint in a bigger article about, oh, they should have done X, they should have done Y, et cetera, right? I feel that has to change, and more of the reporting of these high-profile cyber incidents must focus on the social impact. Because I know reporting is very important, but we also need to think about how we ensure this doesn't happen again, and how organizations focus on avoiding this in the future. I think the only way that's going to happen is if more of the social impact is reported and people do see that. "Oh my god, this is affecting people every day," and often the most impacted people are the ones who can't afford to be impacted.

27:07 Nicole Tisdale

Yes, and to your point, the people most impacted are going to have the hardest time recovering, and some of them are never going to recover, right? If you don't get that money put back into your account from the EBT program, that is just two months of your food budget that is gone, and so you have to go to a food shelter. You have to figure out some other way, or parents are just going to skip meals. But to your point, the cyber equity issues underlie really every cyber policy. When you talk about healthcare attacks, I tell people all the time: when you think about some of these healthcare facilities that are being attacked, you can look at the socioeconomic indicators of who they are serving. So many of them are serving the most vulnerable groups, whether that is low-income people, whether that is our veterans, whether that is our aging population. It creates an erosion of trust that the cyber community is not connecting, right? So many of these communities are already hesitant to receive care. So many of these communities are already hesitant to use digital health services and telehealth services. When you attack the systems that they have entrusted, that's the difference between someone just not rescheduling that preventative health exam, or saying, "I'm not going back to that hospital, because the last time I went, two months later my checking account was wiped out." That does happen, and people do make those connections. And so being really clear that that cyberattack is not just about what happened to that facility. And I know people want to get very dismissive and be like, "Well, they had ransomware insurance," and I'm like, that means that the facility is in a place where they can recover. What about all of the patients who have now had their data stolen? Same thing with schools. The fact that people are doing ransomware attacks on schools, and a lot of them public schools, we know that that is not about people being like, "Oh, the Los Angeles Public School System, that's just a bag of money, let's attack them." It's like, okay, 100% of the students who are in the LA public school system are on the free lunch program. That's not about the money. That is about stealing the identities of those students. And these are students who come from communities where their parents don't have credit monitoring set up on their kids; they don't have it set up for themselves. These kids, if they can survive all the things that are in between them and getting a quality education, when they graduate and they try to go get a student loan, or try to go get a job, or try to get a security clearance, their identity was stolen 10 years ago, and it's wrecked. And so being really clear that some of these things in the cyber world that we think of as protections are not protecting the people who are most impacted.

30:12 Raghu Nandakumara

I think that's such a powerful and devastating example of the impact: taking away the identity of people who are struggling to have an identity. And I'm playing on the word identity there, right? These are people struggling to establish their own identity, and then they essentially have their digital identity stripped away from them, which makes that struggle even harder. So, that's a very crushing example of the impact of cyberattacks. And we see this in the data, like, let's say, the IBM Cost of a Data Breach report, about the increase in truly disruptive cyberattacks. And often, when we think about disruption, we think about production lines not producing a bag of chips, but I think what you just described, that's the real disruption, right? That's the disruption that is completely unquantifiable. So, I want to kind of move on from here, and you said earlier that this is a zero trust podcast. So, firstly, to you, what does zero trust mean?

31:22 Nicole Tisdale

That's a really good question. Depending on the audience that I'm talking to, I just explain it in different ways. If I had one ask of the cyber community, I wish we would stop coming up with these terms that sound like they're always the villain in a Marvel movie, like they're just the scariest, the worst term. So, I will say my examples tend to minimize the fear that exists around cybersecurity. I usually ask people questions, like a Q&A, because it's a good way to get people talking. And I'm like, "Do you lock your car doors?" And they're like, "Yeah." I'm like, "Do you lock your car door when your car is parked in your yard?" And people are like, "Yes." I'm like, "Why? Why?" And they'll say, "I mean, it's at my house, but I just don't trust that people won't come into my yard and unlock the car." And so that gets me to a conversation of: that is what zero trust is. Zero trust is not about saying no one is ever who they say they are. Zero trust is not about saying the login is never to be trusted, or the people and the technology can never be trusted. But we do need to do some things to make sure that the trust is earned. And so, for me, zero trust in public policy, and the way that I explain it to people, is about making sure that we have the hardest defenses possible. Understanding that nothing is going to be perfect, we're just trying to make it harder to attack us. And in public policy, I find that works, because that's like a three-minute conversation with a lawmaker or policymaker, but it's a really good way to get them to understand what zero trust is in public policy. And I will tell you, I've never gone into the technical aspects of it. I don't have to, usually, with good public policy. I always tell people this, because the cyber community seems obsessed with lawmakers understanding the technical details of what is going on. And I'm like, they don't have to understand the technical details. They have to understand the implications for people and the implications for things. And so for me, that's usually the example that I give, and once I get someone locked in on, "Oh, well, yeah, zero trust makes sense, and we should have that," then I can talk to them about, all right, we need to make sure that the US government is only procuring products and services that implement zero trust from the ground up. And also, now we're going to be changing the policy at this department or at this agency, because their previous policies trusted that we didn't need to lock car doors in the driveway. And that makes sense to people. And I always use the example: your members of Congress don't actually understand how open-heart surgery works, but we do have laws and regulations around what is required if you're going to be cutting people's chests open. And so, to the question of what zero trust means, I would say I have a very broad, nontechnical definition, but it works for a public policy attorney.

34:21 Raghu Nandakumara

I think you absolutely nailed it, full stop. I have heard very technical definitions of zero trust. I have heard lots of analogies about zero trust. I've never heard the car door analogy, and I liked it a lot. But I think you absolutely encapsulated the need to be able to take what is potentially a technical concept and make it completely accessible to every single person that is going to encounter it. And, to add to your point about the Marvel analogy, we must remove the fear from these concepts. Right? And make people say, ah, that's common sense, right? I think that's what you landed on. The way I sometimes think about it is that if you put all the marketing hype aside, zero trust is a common-sense approach to building security and securing whatever it is you're trying to secure, right? And in the same way, locking your car door, even if it's in your garage or in your yard, as you say in the US, is common sense. So, I absolutely love it. Thank you, Nicole, thank you so much for sharing that, in amongst all the other gems that you've shared. So, now, let's connect zero trust to cyber equity. How are they linked?

35:45 Nicole Tisdale

So, this is something that I'm in the process of working on: trying to create a cyber equity policy framework so that people start to understand. A lot of people have your reaction; they hear it and they're like, "Oh, yeah, I should be doing this." And then they're like, "Okay, but how do I do this on everything?" So, I prepared for this. And I think the way to think about zero trust in terms of cyber equity would be secure by design. And I know you've had a couple of people come on and talk about secure by design. So what I will say is, in cyber equity, zero trust has to remove the burden from the end user. And I need us to do secure by design. I need us to fully implement secure by design, because it makes it easier for the end user to understand and then implement zero trust. I will say, part of my job at the White House and in Congress was meeting with people from industry, because as we're trying to figure out the policies, we would never want the US government to pass a federal regulation that everything has to be zero trust before we've actually talked to the organizations that are going to make the products, but then also talked to the advocacy groups. And so part of it (I'm trying to avoid "trust but verify," because I really hate using that; it means something else in the public policy world) is the idea that the end user can be trusted if there are certain measures that they meet. If the products are secure by design from the beginning, it just makes it easier for them. And so one of the things that I talk about all the time is, I advise people that the best way to use multi-factor authentication is not through text messages, but most of the time, I am talking to communities that have never even heard of two-factor authentication, let alone multi-factor authentication. And I'm being really clear with them: the reason that I don't want you to rely on text messages is because you also get a lot of spam text messages. And so I'm trying to teach them how to use passkeys from the beginning, so they never even know a world where security comes through text messages; they just kind of know that passkeys are the way to go. And so, for me, what zero trust requires in cyber equity is that before I can even get to the end user, the equitable piece, the people who have created the products and designed the products have already built in the security. So, it's a little bit of a strange way to come at zero trust, because I think for most people, zero trust is about products and the people who procure the products. It's not always about the end user, right? So, a company decides we're going to have a zero-trust architecture, and then they are saying, if you're going to work here, you have to use this no matter what. Zero trust in an equity space is not always going to be a requirement, right? Like, one of the reasons why EBT systems have not all moved to passkeys is because the communities that they serve don't have access to those devices. And so it's not that they don't see the value in having a zero-trust architecture. They cannot implement that, because then they cannot serve the communities that the programs are designed for.

39:19 Raghu Nandakumara

That's a great point, and I never thought about secure by design in that way. Actually, by adopting secure by design (and this is both security vendors and also organizations that are providing services and building apps, et cetera), you make it easier for your customers, those that you support, to better adopt security practices, right? So, if you're not adopting secure by design, then you're not making it easy for them to live a more secure life, which means that they then default to practices that are convenient for them. I say this: I think that good security is security that is easy to adopt, because it then encourages good practice. It's kind of like, when do you use a product? When you love interacting with it on a daily basis, on a regular basis, right? And I feel that a key part of cyber equity, and I speak as an employee of a security vendor, is making your security product as easy to adopt as possible, not just for your super sophisticated user but also for your least sophisticated user. That's the way to drive change. That same principle applies down to the end user that you're supporting via apps or chip-and-PIN cards or whatever it may be.

40:48 Nicole Tisdale

Yeah. And another example that I use that makes it really tangible for people is, it is driving me crazy that our awareness and education on phishing is like, "don't click on bad links." And I'm just like, y'all, people say this, but it is absolutely true: criminals only have to be right once. We're asking my grandmother to be right 500 times a day. That's insanity. With secure by design, as the federal government is moving toward it and we're pushing people, I'm like, how do we make sure that these links don't show up on people's phones and in their inboxes in the first place? In the United States, we know the US Postal Service, their website ends in .gov. Why are people getting spam text messages that are ups.com? That's actually not real, and anyone that is doing that is trying to spam people. And so being really clear: don't rely on me going out to a community and saying, okay, so when you open the text message and you see it, you need to also just think about how Amazon spells Amazon. And also, you know, once you click on the link, make sure there's a lock at the top. It's like, y'all, we know these domain names are not registered to the people who are actually Amazon or UPS. How is that showing up on the phone? And what can we do? It is much easier for me to put the burden on the DNS community and the Amazons of the world and say, you need to come to the table so that we can solve this, than us launching a multi-million-dollar "don't click on bad links" campaign.

42:33 Raghu Nandakumara

Oh, absolutely. And it's funny you use that example, because very regularly, a few times a month, my dad says, "Hey, I've just got this text message. Should I click on it?" And I take one look at it and say, "Nah, that's not a legitimate domain." You're so right, because there are certain things where education moves the needle, but there are other areas where this constant stream of the same repetitive message has no impact, because it's practically not possible. And I'll speak for my dad here: I could tell him, "Okay, this is how you detect whether it's a legitimate domain or not." He's never going to be able to do it. He's never going to be able to put it into practice. So, yeah, I think we should put that burden, and it goes back to your secure-by-design principle, on the organizations who can deliver that. Move that upstream, so that the end users have fewer of these things to learn.

43:35 Nicole Tisdale

Yeah, and they feel empowered. I think that is part of it. I tell people all the time, kind of going back to the terminology and the terms that we use: so many of them are just not empowering, and that is why you have people who are like, "Well, I don't really care about this data leak, because so-and-so already lost my data," or, "all these other companies have lost my data before." It's not a message of empowerment. Whereas it could be: yes, your information may have been stolen in a previous data leak, but that was also 10 years ago, and 10 years ago, you weren't making as much money as you are now. You also didn't have kids. There are things that have changed in your life that we want to keep off of the internet and out of the hands of bad actors. And we want to empower you to do that. And that also is a message of resilience, right? I tell people all the time, literally, when I'm at beauty shops or when I'm in the grocery store, it's a whole thing; it's actually absolutely crazy. My thing is, can I get everybody to use a password manager, and can I get people to use two-step authentication? And I know there are more sophisticated things that we could be having people do, but again, I'm in communities where no one has even said the words "password manager" to them; they're over here thinking that their birthday plus an asterisk is secure. Right? And so it's so funny, because I'll be at the beauty salon or the nail salon, and I'm like, "Everybody, we're going to go in, we're going to just start using the password managers on your phones." That's what I tell people. I'm like, let's just start where you are. I have things that I use that are different, but I just want you to start using it on your phone. We can have another conversation in a year. And afterward, people are like, "Okay, so I'm not going to get hacked. Like, I'm going to be good." And I'm like, "No, but it's going to be much harder for you to be hacked, and this is going to help you recover faster if it does happen," and that is a good message. No one has been like, "Oh, you wasted my time." They're like, "Oh, okay. What? I'll take that." And so, empowering people through security is not just something that sounds good; it's reality-based. And I kid you not, most of them come back, or they just start doing it everywhere. So many of these folks are, like, social media influencers. I'm like, that's your money. Like, yes, you need two-factor authentication set up on your Instagram, because what if your Instagram gets locked for two weeks? That's actually your livelihood. And they will come back to me and be like, "Okay, so I was using the password manager, but I got really into it, and so now I'm thinking about downloading something else." So, I'm like, "That's great. Also, while I have your attention, let me talk to you about passkeys, because you're on the password train. Let's just step up your security." And when people are empowered, they want to do more. They want to tap in. They don't self-select out because you've told them that no matter what they do, they're still going to get hacked. You can be honest with people and still empower them.

46:39 Raghu Nandakumara

Yeah, absolutely. And that is living the change, right? That's living your message and taking it to your community, to that community that you interact with day to day, and saying, "Hey, I've made a difference here just by simple things that people can relate to and people can understand." I feel, Nicole, just because of your background and all the things that you have touched, that there are so many things we could continue to talk about, but I'm very respectful and conscious of your time. Still, it would be remiss of us not to ask: you said that the 2016 election was a bit of that aha moment, when there were concerns around tampering with the elections, et cetera, and people were wondering about the threat from nation-states. We're here in 2024, and elections are right around the corner. So, we've talked about concerns with election security. What do you think is going to be essential to safeguard the integrity of the elections this year?

47:44 Nicole Tisdale

Thank you for asking that question. I will tell you, I don't think it is the technical answer that you might hear from other people. When I think about election security, there are three pillars. There is the infrastructure that we use to vote and to register to vote. There is also physical security, so making sure that you can vote without intimidation, without fear, and that you have physical access to the polls. And then the third pillar, which is the mindset and the influence operations. I feel very confident about pillar one, especially in the United States, since 2016. This is a lot of growth for me; I was so mad in 2016 because there were just so many things where I'm like, how did we let this happen? Right? Like, how are we in a place where we can't even tell our secretaries of state, who control our elections, that we have intelligence, because the intelligence is classified and none of them have security clearances? Or we don't even have phone numbers for all the secretaries of state, right? Like, we're calling their 1-800 number. But also, the way we had invested in that physical infrastructure was kind of how the cyber community sometimes invests: you wait until something happens, and then you put a lot of money into it, and then you kind of walk away and say, "Fixed it." So I will say, in the eight years since 2016, we have done so much on the critical infrastructure side and the election infrastructure side of the house that I feel really good about that. In terms of the physical security, in the United States we're dealing with a lot of physical intimidation around voting. I think we are making progress on that. I think we're in a place where people feel like they can go vote without being intimidated, harassed, or physically attacked. But the influence side of the house looks very different in the United States right now, because I think people are starting to wonder if democracy is even worth it. And I think that's a feature, not a bug, of those operations: the idea that you can convince people, not that they should be fighting over which candidate should win or what party should be in control, but that they would just decide they don't care at all. To me, that is much more of an issue and a fear than the reverse. And so, as we go into 2024 in the US, I was recently talking about this at DEF CON, but then also talking about this on the news: what you have to do in 2024 in the US is commit to voting. You have to commit to voting. You have to say, no matter what happens between now and Election Day, no matter what hack and leak comes out, no matter what deepfake audio or video comes out, if there's a forgery, whatever the case, you will show up when it is time and you will vote. And I think this digital depression is really new in the United States, but it's also global. I think many countries are seeing this as well: people are starting to not fully believe that democracy delivers. Going back to our cyber equity conversation: if you're already struggling, you've had a year where you've had a natural disaster, and now you're on the SNAP program or a food benefit program, and then someone is stealing that money from you, it's really hard to mentally prepare yourself to be like, I've got to go support this democracy, though. And so being really clear that a lot of these cyber equity issues are not nice-to-haves; they are the pillars of democracy. A democracy has to serve its people, or it crumbles.
And so, as we're getting close to the end: election security, to me, is about making sure people have the tools to address the inequities in our society. And when you take away people's democratic rights, or you take away democratic participation, or the will to participate, we can't address these cyber inequities that I've talked about for the rest of the hour. I actually need public policy. I need people to believe that government can figure out how to get criminals to stop attacking our healthcare facilities, that we can protect your child's data when they are in school. And if people don't believe that, to me, that's the beginning of the end.

52:21 Raghu Nandakumara

I can't think of a more powerful message on which to end this conversation. I'd love to just carry on, Nicole, but I think that was a slam dunk. This is really about keeping our children safe in school and educated so they can grow. It's about keeping people off the streets. It's about putting food on the table. It's about ensuring that people who need hospital care get it in a timely manner. And cyber equity, keeping all of our critical infrastructure safe and the people safe, is just a paramount responsibility in any democracy, in any society. So, Nicole, it's been inspiring speaking to you today. Thank you so much for agreeing to be on The Segment and being an incredible guest. Thank you.

53:16 Nicole Tisdale

Of course, thank you for having me. And to all your listeners: I know it can be overwhelming when I start talking about these issues, but you can practice cyber equity in whatever you were doing before you listened to this podcast. You just need to do it through a lens of equity, which is thinking about that end user, thinking about the human impact. And I actually need people to do that in the spaces where they are, right? Like, everybody doesn't have to come join us on the public policy side of cyber. I'd love it if you did, but I also need technical people to commit to looking at cyber solutions through a cyber equity lens.

53:55 Raghu Nandakumara

That's awesome. Thank you.  

53:56 Nicole Tisdale

Well, thank you for having me.