Sara Duffer:
So switching topics a little bit, Steve, I'm going to go to you. There's quite often and quite frequently when we talk security, we very much get into the new technology and the evolving worlds that we have in front of us. But at the end of the day, a threat actor is a threat actor and is a human. And I would love to hear a little bit more around how you think about the human dimension associated with cybersecurity.
Steve Schmidt:
Certainly. So breaking news, computer security, information security, cybersecurity, whatever you want to call it, is not a technical problem. It is a people problem. One of the things I learned a long time ago when I was in the FBI focused on counterintelligence work was that while yes, chasing the spies is the job, and it's interesting, they're there for a reason. They're motivated by something. Traditionally in the espionage world, it was money, ideology, coercion, or ego. The same thing is true in the cybersecurity world. People are interested in money. That's your ransomware actor. Ideology. It's your traditional nation state actor who's gathering intelligence or preparing a battlefield. Influence, which is a new one in this space, is getting a population to think in a particular way, shifting their opinions, causing things to happen in the world. Or ego. That's the script kitty, who really wants to be the biggest, baddest hacker out there and causes a DDoS attack as part of that.
And why do we care about knowing why these people are doing this or their motivation? Because it helps us understand the kinds of tools, the kinds of capabilities that they're going to have, where they're likely to go after us, and their tolerance for risk or exposure. As in, is this a big deal if they get caught and the FBI comes knocking on the door, or is this something that doesn't really matter because they're currently sitting in a basement in Belarus or something like that? Which is the kinds of spectrum that we have to work with to understand what we have to do as defenders and how we build systems that help prevent those people from gaining access.
The interesting thing is that same mindset needs to be applied for our own employees. Our own employees are universally well-intentioned. They want to do the right thing, they want to help, et cetera. But let's face it, they're also human. So sometimes they get in money trouble. Sometimes they don't like the direction something is going. Sometimes they just have a bad day. And as a result, we as defenders need to be able to be prepared to understand what they're doing, why they're doing it, and how we're going to make sure that they're not doing something that they shouldn't.
But a lot of what the really important component of cybersecurity and people within a company is, is the culture of the company. The security culture of a company is the thing that'll make or break. We have all seen what happens in the public news when you have a deficient security culture. You end up with nation-state actors breaking into an organization repeatedly and exploiting them for their own benefit. Why? Because the people in the company were not measured or motivated by the right thing.
They weren't motivated on protecting your data or your information. They were goaled on something else. And so building your culture, which says, the most important thing for you as a person, a developer in my company, is to, number one, be safe physically yourself, and number two, protect your customer's data. Because that allows them to make good decisions every time during the day when they're thinking about something. “Should I go left? Should I go right? Should I do one thing? Should I do another? Should I ask for help because I really don't know? Let me go find an expert in this space.”
And I think the incentive there of making sure that you've got the right culture is that it leads you to lower cost down the road because you don't have to clean up the messes that somebody made because they were moving quickly to hit a profit goal, a margin goal or a delivery goal as opposed to getting security right for your end customers in the businesses.
Sara Duffer:
And it makes my life easier as well in the security assurance world as well. It's a nice outcome. Chris, taking that point around culture, we talk a lot at AWS about security being a top priority. Talk to me a little bit about how we actually do that. How do you build that culture?
Chris Betz:
One of the reasons why I think culture is so important is not only does it lead to the long investment, but I think every company that I know works to train, provide tools, provide capabilities around cybersecurity. And one of the major differentiators is culture. Because security is constantly changing. To your conversation earlier, I can't tell you how many times we end up talking about AI recently, right? AI is constantly changing. And that ability to adapt. That ability to raise your hand and say, "You know what? I'm seeing a conflict here, or I'm seeing a better way to do security." Let's think about this. Can we do this better? Not just blindly follow the process and the tools, but actually go ask the question.
Or, “I think these processes and tools are missing something. I see this risk, I see this issue. How do I bring that up?” Those things are incredibly important. And so as you said, the culture pays off over time in an amazing way. Building that culture takes deliberate time and energy. It starts at the top. It starts with aligning the culture to the way the organization functions. Part of that is telling yourself who we are. It's internally just as much as externally, hearing Matt say, "Everything starts with security."
And furthermore, it's how people spend their time. Steve and I both talked about the weekly meetings that happen that led by our CEO. Again, making sure that that's part of how the organization functions is incredibly important. Once you have the security built into the culture of the organization, it's important to emphasize that security is everybody's job. Each person gets a specific role. That's the opportunity to raise your hand and say, "We have to do something different. We think we're missing something. I'm confused. I'm not sure." Security, everybody needs to understand that security is their job and it's our job as security leaders to make that job as easy as possible. Because if people are spending their time focused on security every step of the way, that's going to add friction to the organization. What goes hand-in-hand with making securities everybody's job is security working to make it easy and natural for people to do security. That means that security needs to be distributed across the company. We need to make sure that training, knowledge, capabilities are well-designed to make sure that happens across the whole of the organization.
And then lastly, we need to be willing to invest. We need to be willing to invest in innovation that improves security. We need to be willing to invest in innovation that makes it easier to be secure. Because if you don't, you end up getting caught in the past and you're never able to move forward as an organization. One of the ways to do some of this is through things like a security guardians program, where we rely on our people, we train them on deep security matters within the service teams, within the engineering teams, so they can make sure that people are thinking about security very early during the development processes and continuously and they've got that right knowledge. And it helps make things very, very, very scalable. So there's one thing that you all can do with your teams, it's look very deeply at can we create a security guardians program, and how do we create a culture of security within our company?
Sara Duffer:
Okay. So what are three questions that business leaders in the room can bring back to their security and compliance programs?
Chris Betz:
I'll give you one that I always love to ask. For all of us technology leaders, I don't know how many of you have what we call a builder tools or developer tools organization. As a security leader, those organizations are my favorite organizations in the whole organization. If you don't have one, these are the teams that build tools that make your developers' lives easy. There's massive leverage there. If there's a place that I love to see companies put their top talent, it's in the builder tools organization, because in one organization, you can make all of your development processes much, much better. From a security standpoint, that's where the leverage is. Because you can take your security knowledge and your security capabilities and build it into those tools and get massive scalability and make security a natural motion.
And so the question that I would take back if I were you all is ask your security leaders and your builder tool leaders what their relationship is, how well they're working together, and how well all of the security outcomes that you're looking for are built into the builder tools capability.
Sara Duffer:
Steve?
Steve Schmidt:
So this is to sort of reiterate something I said before is ask your teams, "Where are we building Gen AI applications right now?" And then ask your teams, "What is the mechanism that we have in place so that we know tomorrow where we're building GenAI applications, and what is the latency between someone building a new one and us knowing about it?" And you'll find that in many cases the answer is, "Scramble, scramble, scramble. Quick, find some data. Here's the answer." Awesome. It's now the next day. Okay.
So you need a method, a mechanism, a tool which allows you to do that on a regular basis, to keep up to date with it, to make sure that you are responsible operators and stewards of that infrastructure, and that you can be responsible owners of the data that you collect on behalf of your customers.
The second piece there is what guardrails do you have in place, and is there a mechanism to update those guardrails as the world changes around generative AI? In the time we've been sitting on stage here, the generative AI world has advanced incredibly. There's something new that's going on. There's some new way to cause a problem with the foundation model that some smart person has thought up, and we have to be able to defend against it. So what is the rapid iteration method to be able to influence the guardrails you've got around your Gen AI applications?
Sara Duffer:
I think you cheated. I think that was two. I, too, am going to cheat a little bit. And I would say it's very much asking teams about how they're actually ensuring compliance. And what I mean by that is it's not just a case of in the moment, how are we able to tell whether or not we are compliant with the various either standards, laws, et cetera, but it's also actually understanding how you're able to get continuous assurance over time so that you can really determine what the cost of the builders is.
So I would say there's two core questions, which is, how are you compliant with either your internal practices or laws, et cetera, and what is the cost to the builders, which is really important because you want to be able to innovate very fast and you want to ensure that you're monitoring how much it's costing your builders and still ensuring that you remain compliant.
So in closing, the final question that I have is, in speaking with customers on a regular basis, what is the best advice that you have for customers to immediately approve their security posture?
Chris Betz:
For your internal company and for your customers, find ways to implement passkeys. Getting away from passwords is just game changing for your people and for your customers. Take advantage of that technology. It's a major leap forward. Implement it today.
Steve Schmidt:
And then not only is it much more secure, it's a better user experience. It is so smooth. So get your tech people focused on this and find out why they're not doing it right now.
Number two is much less exciting than passkeys, and its security vegetables. Vulnerability management. Patch your stuff. It is the single best defense that you've got against people out there.
Chris Betz:
Or make us patch it for you.
Steve Schmidt:
Exactly.
Chris Betz:
Use Lambdas and other things.
Steve Schmidt:
Yep.
Sara Duffer:
Well, thank you very much for joining us today and thank you again for the time.