Cyber challenges from attacks to Zero Trust with Meghan Good
With the rising number of cyberattacks reported in the news, and the growing importance of complex cyber environments, there has never been a better time to talk about cyber awareness.
Given the increasingly sophisticated threats, including data breaches and ransomware attacks, that occur all too frequently, how can organizations protect the data within their cyber environments? And what are the core cyber challenges our customers face?
Vice President and Director of the Cyber Accelerator Meghan Good shines a light on emerging cyber technologies, trends in cybersecurity, and the increasing adoption of the Zero Trust philosophy.
“There's a lot of promise about applying machine learning to cyber use cases. For us, we're looking at the detection challenge, and how we can use machine learning to enhance our detection capabilities, which really is very similar to how it would be used in Zero Trust cases.”
On today’s podcast:
- The major challenges for cyber technology
- President Biden's Executive Order
- What Zero Trust means
- Why create “weird” environments
- The challenge facing cyber-physical systems
Meghan Good (00:00): How do we do better? How do we improve the way that we are doing detections, the way we're doing response, the way we're being more proactive? What can we add to change this seemingly constant narrative that there was this breach, there was this incident? What are we doing to make it so that we're resolving those things before they have these tremendous impacts?
Bridget Bell (00:31): Welcome to MindSET. I'm your host, Bridget Bell.
Katea Murray (00:35): And I'm your host, Katea Murray. In this episode, we discuss cybersecurity technology, core challenges, and Zero Trust.
Bridget Bell (00:42): We also get into the presidential executive order, cyberattacks in the news, and the importance of complex cyber environments. Having just wrapped up Cyber Awareness Month, this conversation was a great opportunity to talk through timely issues. Let's get started with Meghan Good, Director of Leidos' Cyber Accelerator.
Bridget Bell (01:14): Welcome to MindSET. Today, we're speaking with Meghan Good, Director of Leidos' Cyber Accelerator. In today's episode, we focus on cybersecurity technologies, which is a massive topic to take on in a 30-minute podcast. Let's start very high-level with what you're seeing as far as core challenges. What would you define as the major challenges for cyber technology today, and how have those changed in recent months or years?
Meghan Good (01:40): I think broadly in the market, there are a lot of different cyber challenges. From a security perspective, you're definitely looking at environments that are increasingly heterogeneous. There are lots of different kinds of IT in our environments: different kinds of servers, different kinds of endpoints, different ways that we're processing information and using it, and different users in those environments who have different levels of access. All of that ties into a larger cybersecurity challenge.
Meghan Good (02:14): From a technology perspective, all of those technologies then have to talk to each other. They have to be communicating. You have to be able to audit them from the security side and really understand what's happening there, and on top of that, make sure that you are getting the right kind of protections for the data that's within those environments. You have to be able to do some sort of analytics across that data to make better decisions about what gets secured, what gets protected and when, and then what your responses are when those protections don't hold.
Meghan Good (02:50): It's a big data challenge. It's a lot of complex devices all put together. Then, on top of that, I think there is this evolving threat landscape of those who want to get access to that data. More and more, what we've seen over time is that those threats are increasingly sophisticated. They're taking advantage of vulnerabilities within that environment. From there, they're able to leverage that access, because most of our environments today are set up so that once you're in, once you're authenticated in some way into the environment, or you have a kind of trusted presence, you can then do all sorts of things within those environments. You can pivot, you can laterally move into another area that you might be more interested in, find other data, and wreak whatever other havoc you want.
Meghan Good (03:44): And so, from a technology perspective, we want to be able to limit that. We want to be really aware of what users are doing within the environment, and then we want to make sure that we're taking the right kind of actions against it.
Katea Murray (03:58): Meghan, it seems like nearly every day there's a new report of a data breach or ransomware attack. Is this just increased media attention or has there been an influx of cyberattacks recently?
Meghan Good (04:09): I'd say it's a little bit of both. I think there certainly is increased media attention that we've seen over time, because the headlines prove it themselves. Just about every day, just about every week, there's something else that's more significant that we're discovering. Now, over time, I'm not sure that means there's been an influx in cyberattacks, or if it's something that we're just more aware of. On the one hand, you could say there are more because there are more attack capabilities out there that are available. If you had the intent and that's what you wanted to do against a particular environment, you now have tools available to you that are open source, and you could use those for initial access to perform some sort of attack.
Meghan Good (04:52): Ransomware, for instance, we're seeing an evolution of that capability where it's becoming ransomware as a service, and those abilities are available for folks who want to do those kinds of attacks. But at the same time, I think we're getting better in our response capability: being able to connect the dots, sharing information between different kinds of defender organizations, and really upping our game in that space so that we're able to better detect these kinds of incidents. But all in all, I think it's something where, no matter what, it's a growing emphasis. Over the course of my career, when I tell somebody that we work in cybersecurity, you often hear them say, "Well, that's really important. That's something that we need now." In the beginning, that was never the case; people wouldn't have found my job interesting. Now they do, because it is in the media. There's this sense that it could happen to lots of different organizations and that it affects a whole range of industries, critical infrastructure, and governments. Everybody is feeling this set of cybersecurity challenges that we were just discussing.
Katea Murray (06:04): With that, what do you think we can learn from these news stories?
Meghan Good (06:08): I think there are a couple of points we can really learn from these news stories. First, I think there's an instinct of shaming the organization that comes across in these news stories: that they didn't do something right, that something went wrong, and that an egregious mistake led to this incident. Oftentimes, I don't think that's true. There may be a combination of errors that happened, but there's also a more persistent kind of adversary trying to get into an environment, an adversary whose capabilities outmatched the kinds of systems that were there.
Meghan Good (06:47): Secondly, what we hear from all of these news stories is that it matters to protect the kind of data an organization is using and has stored within its environments. That means organizations really do need to raise the bar on how they're protecting that information, their systems, and their environments. That level of importance, and that presence in our news headlines, means that it matters. It means that we have to take action.
Meghan Good (07:20): I think from the third side, from a technologist's perspective and someone who's been in this industry for nearly 20 years, it's a lot about how we do better. How do we improve the way that we are doing detections, the way we're doing response, the way we're being more proactive? What can we add to change this seemingly constant narrative that there was this breach, there was this incident? What are we doing to make it so that we're resolving those things before they have these tremendous impacts?
Bridget Bell (07:54): Speaking of news related to cyber, we can't talk about cybersecurity without also talking about President Biden's Executive Order on improving the nation’s cybersecurity. It's been almost six months. How has this changed the cybersecurity industry?
Meghan Good (08:12): Well, that's a good point. I really think that what the Biden administration has done with the Executive Order is take that step to raise the bar on what cybersecurity needs to look like across the federal government, and then use its buying power to extend that across the industry, to those who are service providers to the government and those who are providing technologies that are then integrated into government systems and networks.
Meghan Good (08:40): With this, I think the industry response has been largely very positive about the executive order: it's showing a path to change and innovation and pulling in technologies that should be commonplace, technologies that help us move toward the trends we're seeing across the market. I mean, Zero Trust was mentioned in there about 11 times, as our Zero Trust expert would say. Then there are also things around identity, around moving to the cloud, and around securing our software, which is often what introduces the vulnerabilities that get taken advantage of in these larger incidents.
Meghan Good (09:24): All in all, I think it's something very positive for the government to take that step from a customer requirements perspective. Industry is now responding to the need to address the requirements within the executive order, and we're starting to see that flow through in a lot of conversations and discussions as we move forward through this time period.
Katea Murray (09:49): Another trend we are seeing in cybersecurity, and you've actually just mentioned it, is Zero Trust. Help our listeners understand this philosophy.
Meghan Good (09:57): Sure. Zero Trust, the first time you hear it, definitely makes you wonder what's going on there, and it's a term that's touted often. But effectively, it's where you start to take more granular control over what's allowed within your environment. You're no longer trusting a user just because they're already within your environment's bounds. That's the castle-and-moat view of a network, where you fortify the walls around all the data and all the endpoints inside, and then, once you're inside, you can do what you want.
Meghan Good (10:36): With Zero Trust, it's really the shift to checking every interaction between a user, a device, and the data or other resource they're trying to access, and asking whether it makes sense for that combination to occur. With that, there's a lot of data that you need to make those decisions. There's a lot of computing power that you need to make decisions really quickly so that it doesn't impact the legitimate usage of those systems. There's always this balance in cybersecurity engineering: you want to be putting security measures in place, but at the same time, these systems just have to work. They have to work for the purpose they were there for.
Meghan Good (11:16): I think with Zero Trust, it's getting those combined through a set of policies and decisions so that you can ensure that balance is there, but with a big emphasis on security. I think the promise of Zero Trust for all of us from a cyber perspective is that it cuts down on what we call lateral movement: getting into one particular device within an environment that might not be as well protected, or that might have more vulnerabilities than something else, the weak link, and then maneuvering to things of higher interest to capture data that you couldn't have seen from that initial device.
Meghan Good (12:03): Zero Trust is meant to really limit, minimize, or actually make impossible that lateral movement. It also addresses persistence, another capability where, from an attacker's perspective, you want to be able to stay in an environment in a hidden way. With Zero Trust, we're looking to make sure that as that particular actor moves within the environment, we can see more of it. We're making more decisions about what they're able to do, which limits their capability to persist in those environments.
Meghan Good (12:37): There's a lot of promise around Zero Trust. I think one of the challenges is that it's become somewhat of a marketing term, or a term of art, at this point. There are a lot of products that would go into a particular Zero Trust architecture, and the challenge becomes figuring out what combination of those products, along with the policies, the rules, and the data you need to make all of that happen, really works for an organization. We're finding a lot of organizations coming to us to talk about their readiness for Zero Trust and to get help progressing through the kinds of systems they have to have in place to get there: what they need to understand about the identities within their environment, about the kinds of data resources those identities might be accessing, about the kinds of devices and what the configurations of all of those need to look like, and how they can slowly build toward creating this Zero Trust environment.
Meghan Good (13:43): It's a lot about transformation and transition, not a rip-and-replace of that castle and moat. Think of it as a home renovation project rather than a complete rebuild: you're reusing the pieces that have worked and adding in new capabilities over time. In the end, you're creating something that's more fortified from the inside out, not from the outside in.
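As an illustration of the per-interaction checking Meghan describes, here is a minimal sketch of a Zero Trust-style policy decision in Python. The attributes, roles, and rules are hypothetical stand-ins, not any specific product or Leidos implementation; a real architecture would gather these signals from identity, device, and data-governance systems and evaluate them in a dedicated policy engine.

```python
# A minimal, illustrative sketch of a per-interaction Zero Trust policy check.
# All attribute names, roles, and thresholds here are hypothetical examples.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    role: str                  # e.g., "analyst", "admin"
    mfa_verified: bool         # identity strongly verified for this session
    device_compliant: bool     # managed, patched endpoint
    resource: str
    resource_sensitivity: str  # "low" or "high"

def evaluate(req: AccessRequest) -> str:
    """Decide a single interaction; nothing is trusted just for being 'inside'."""
    if not req.mfa_verified:
        return "deny: identity not strongly verified"
    if not req.device_compliant:
        return "deny: device out of compliance"
    if req.resource_sensitivity == "high" and req.role != "admin":
        return "deny: role not authorized for high-sensitivity data"
    return "allow"

# Every request is checked, even from a user already on the internal network.
print(evaluate(AccessRequest("meghan", "admin", True, True, "hr-database", "high")))
```

The key design choice is that the decision is made fresh for every combination of user, device, and resource, which is what limits the lateral movement and persistence discussed above.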
Katea Murray (14:08): Excellent. You mentioned a large amount of data that has to be analyzed and resources that have to be checked over time to implement Zero Trust. How do you think artificial intelligence or machine learning comes into play there?
Meghan Good (14:22): Well, I think artificial intelligence and machine learning are absolutely critical in that case, because it's about the speed of making those decisions. It's also not just about the decision that Meghan should be able to use this laptop to access this website and get this kind of data in one individual case. It could also be how that spans across an organization: should multiple folks from different areas be able to access this at one time? What is the overall trend? That has to happen so quickly that it really can't be done on an individual basis. It can't really be done with a human in the loop. We're getting to a point where it has to be powered by something that is learning over time about the environment and can continue to react faster than a human-enabled system could.
Meghan Good (15:20): A lot of our defensive systems today are really rule-based, and a number of our research and development efforts recognize that with those rules comes inherent bias. I know Ron Keesing has discussed trusted AI/ML in the past. So what we're looking at is how we can use machine learning to look at those rule sets and really evaluate where there are actual gaps in the rules, gaps that a machine learning-powered capability can actually identify.
Bridget Bell (15:51): That's really interesting. It sounds like there are implications even more broadly than just within Zero Trust. What about AI with cyber as a whole?
Meghan Good (16:03): I think there's a lot of promise about applying machine learning to cyber use cases. For us, we're looking at the detection challenge and how we can use machine learning to enhance our detection capabilities, which really is very similar to how it would be used in that Zero Trust case: detecting, through some combination of those policies, a larger event that might be occurring.
Meghan Good (16:30): I think we're also looking at how machine learning can help us with our defensive systems and really test the efficacy of those defensive systems. A lot of them today, whether at the perimeter, at an endpoint, or at any point in between with the layers of security that we put within environments, are largely rule-based. Those systems are fed by a set of rules that are human-generated, based on the threat intelligence that we have, different network policies and settings, configuration settings, and compliance-driven needs that we've implemented.
Meghan Good (17:11): Now, the opportunity there is that we can start to use machine learning to identify the gaps in those rules. Rules inevitably have some form of bias, which means there are loopholes in them. We can use machine learning to start to figure out what some of those loopholes are and then enhance our detection based on that. And so, we're exploring a lot of research in that area and see a lot of promise in using machine learning to identify those gaps in advance, so that we're enhancing our detections faster.
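As a rough illustration of that idea, the sketch below pairs a hypothetical human-written detection rule with a generic anomaly detector (scikit-learn's IsolationForest) and flags events the model finds unusual but the rule misses. The features, thresholds, and toy data are invented for the example and are not the actual rule sets or models described here.

```python
# Illustrative sketch: use an anomaly detector to surface events that a
# hand-written detection rule misses, i.e., candidate gaps in the rule set.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [bytes_out, distinct_ports_contacted, after_hours_logins]
events = np.array([
    [1.0e4, 2, 0],   # typical activity
    [2.0e4, 3, 0],
    [1.5e4, 2, 1],
    [9.0e5, 40, 5],  # unusual activity below the rule's threshold
])

def rule_based_alert(event) -> bool:
    # A simple human-written rule: alert only on very large outbound transfers.
    return event[0] > 1e6

model = IsolationForest(contamination=0.25, random_state=0).fit(events)
anomalous = model.predict(events) == -1  # -1 means the model sees an anomaly

# Events the model flags but the rule does not are worth a new or tuned rule.
for event, is_anomalous in zip(events, anomalous):
    if is_anomalous and not rule_based_alert(event):
        print("possible rule gap:", event)
```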
Bridget Bell (17:45): Switching topics, somewhat. I've heard you talk about the importance of complex environments or even weird environments. Why is that so important and how can we keep up with these ever-changing landscapes?
Meghan Good (17:59): Right. Well, I certainly have said my share of "let's keep environments weird." Bridget, we've talked a lot about this before; there's some security in obscurity as well. If your environment looks strange when someone tries to scan it, they don't quite see what is really there. There's a little bit of a hall-of-mirrors effect that we're looking for. Now, you might think that, since there are already so many vulnerabilities in our environments and they're so complex, adding another layer of complexity would make them more difficult to defend.
Meghan Good (18:38): I would contend that if you're adding that extra complexity, you're making it harder for an adversary to find where the critical data actually lives and what those assets really look like. And so, what we're really looking to do is figure out ways that, from a defender perspective, you can employ that kind of strategy. You can orchestrate the creation of these weird environments, but still make them a protective capability, keeping adversaries busy within environments, going after targets they think are interesting but that don't actually affect critical assets.
Meghan Good (19:18): A lot of this has fallen under terms in the past like honeypots, decoy technologies, or what could be classed as cyber deception capabilities. But in this world where we have a lot of software-defined capabilities, like software-defined networking, we can really spin up these kinds of environments so that they're very protective and impose costs on getting to the real data, the real devices, and the real capabilities within an environment.
Meghan Good (19:50): That's my "keep it weird." I think it has to be done at scale, done intentionally, and done with all the capabilities to automate and orchestrate it, to make it something that's even feasible within environments, but it's somewhere we need to drive toward.
Meghan Good (20:09): Our environments already are weird and complex. Let's use our weird complexity to our advantage to really help with defense. The other benefit of having those weird technologies out there is that we gain the capability to detect the kinds of activity happening against them, which would only be caused by someone who's not already within the environment, or who is not an identity that we know and trust. In our Zero Trust world, if you looked at that, we would know what that identity was. We would know what kind of data they were trying to access. So, we would have more of a chance to make those decisions within that environment and use it for threat hunting, where you're trying to figure out what sorts of threats are in your environment. Then, you'd be able to take action with more confidence about what's going on in your own environment.
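As a simple illustration of the decoy idea, the sketch below stands up a bare TCP listener on a port where no legitimate service exists, so any connection attempt becomes a high-signal event for threat hunting. The port number and log format are arbitrary choices for the example; production deception platforms orchestrate far richer decoys at scale.

```python
# Minimal sketch of a decoy service: a listener on a port with no real service
# behind it, so every connection attempt is worth investigating.
import datetime
import socket

DECOY_PORT = 2222  # arbitrary example port; nothing legitimate lives here

def run_decoy(port: int = DECOY_PORT) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", port))
        server.listen()
        while True:
            conn, (addr, src_port) = server.accept()
            # Known, trusted identities have no reason to touch this decoy,
            # so each hit is a strong signal for threat hunting.
            print(f"{datetime.datetime.now().isoformat()} decoy hit from {addr}:{src_port}")
            conn.close()

if __name__ == "__main__":
    run_decoy()
```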
Bridget Bell (21:06): To bring that full circle, I'm sure you could use AI in that threat hunting so that it all comes back to everything you've already talked about.
Meghan Good (21:15): Well, certainly. I think there's a machine learning opportunity there in figuring out what kinds of environments give us the right kinds of information and how we learn over time to make those environments more useful and practical. So there's a machine learning challenge and problem there too.
Katea Murray (21:32): Meghan, moving away from weird machines and keeping it weird, what do you think the challenges are surrounding cyber-physical systems?
Meghan Good (21:39): I think there's a lot with cyber-physical systems. For those out there, cyber-physical systems are really those that can be accessed using communications, using connectivity, just like our standard IT, like laptops and computers and cell phones. But cyber-physical systems also create an impact in the physical world. You can think of things like medical devices or, for us, security scanning kinds of products. Then there are also industrial control systems, things that control utilities and energy and those sorts of critical infrastructure pieces, as well as things from a defense perspective like weapon systems and platforms. Those are all things that we would class as cyber-physical systems these days, because they really are still connected.
Meghan Good (22:36): I think that's the challenge around cyber-physical systems: a lot of them have been built over the years without thinking that they were going to be connected to the internet. We have this legacy challenge of things that weren't designed to operate in the world as we know it today. Then there are the ones that are operating in today's world but weren't designed with security in mind. There are vulnerabilities within the embedded software, firmware, and hardware that exist within those systems. The challenge is that there are a lot of layers of potential vulnerabilities, just like there are within our IT environments, but these systems have more of an impact when they can be exploited.
Katea Murray (23:25): How do you think we, or the industry that doesn't control those systems, can help them protect themselves better?
Meghan Good (23:34): From an industry perspective, from a technology vendor perspective, I think we're going to start to see a lot more solutions addressing that market and really looking at that convergence. There are a number of different products and capabilities already available there, but I think we will see increasing sophistication over time, too. I'm sure more of those will include the kinds of themes we've already talked about: Zero Trust kinds of capabilities, machine learning, better analytics, and better automation and orchestration of both detection and response actions within those environments. That really is an area that didn't need to exist when a lot of those systems were built or put into place, but it needs to exist now for them to operate in the current environment, in the current world.
Bridget Bell (24:32): It goes back to what you were saying earlier: when you started your career, people weren't quite as interested. Now, when you say that you're in cybersecurity, that's very important and we need more of it out there. So I think those physical systems, they had no idea that they were going to be interconnected and need the type of security that we now know is so vital.
Meghan Good (24:57): I think over the course of my career, too, what's pretty interesting is that the folks I've worked with from a cybersecurity perspective often have a basis in some of those legacy capabilities. They were the ones who had to figure out the security that wrapped around them. Now, I think we have a lot of folks with cybersecurity training and backgrounds who are coming into these career fields through college and through other training and upskilling programs. It's a matter of how we apply their skills alongside the folks who have that legacy knowledge of these systems. It's this giant collaboration, almost a mind meld from lots of different perspectives, that has to happen.
Meghan Good (25:48): I think with that, we're standing on the precipice of a lot of innovation that's going to come in so many different ways: from very small changes in how we interpret data to see if a particular threat activity is occurring within a particular environment, to some of those more macro challenges of how that's happening across a larger enterprise environment, to the cyber-physical challenges that we're seeing across lots of critical infrastructure industries. Then, what are the cybersecurity implications? What are the technology needs to really drive all of that and protect those systems in the future?
Meghan Good (26:31): We're on the edge of a lot of things happening here, and they already are happening. It's just a matter of time before more and more comes. But I would say, in this market, there's a need for technologies. There's a need for change. It's an evolving threat landscape, which means it's an evolving innovation landscape, so that we're actually countering those threats and staying ahead.
Bridget Bell (26:55): Sounds like a lot of exciting things to come.
Meghan Good (26:58): More to come for sure.
Bridget Bell (27:00): All right. Well, thank you so much, Meghan. This has been a great conversation.
Meghan Good (27:05): Thanks for having me.
Katea Murray (27:06): And thanks to our audience. If you like this episode, please share with your colleagues and visit leidos.com/MindSET.