Especially when you're in the cloud, I think you want to do security from the inside out and that addresses a lot of these things.
Our security experts touched on lots of interesting topics like DevSecOps, Privacy, and more.
Ben: Welcome to the Masters of Data Podcast, the podcast where we bring the human to data. I'm your host, Ben Newton. We are excited to bring you our second security experts panel discussion for 2019. We brought together a high-powered set of individuals who know security inside and out. As always, we have George Gerchow, the Chief Security Officer at Sumo Logic, and he is joined by Jadee Hanson, the Chief Information Security Officer at Code42, and Teri Radichel, the CEO of 2nd Sight Lab. We had a wide-ranging and fun discussion about everything from DevSecOps to privacy. Buckle up and enjoy the show.
All right, welcome everyone to the live Masters of Data Security panel. We should have people joining and rolling in here in just a minute. As you're coming on, and I'll remind you again later, you can all ask questions as we go along, so please drop those questions in. We'll be able to see them live and we will try to get to them.
Ben: With no further ado, I want to introduce our very illustrious panel here. We'll start on the west coast. With me we have Teri Radichel, did I get that right?
Teri: Yes you did.
Ben: Okay. I got the first one right. She's coming out of Seattle and she is the CEO of 2nd Sight Lab. Thank you for coming on, Teri.
Teri: Thanks for having me.
Ben: Okay. There's this other fine fellow over here to the right, Mr. George Gerchow, who is the Chief Security Officer at Sumo Logic and works with me. Welcome on, George.
George: Thanks Ben. It's good to be with you again.
Ben: Yeah, absolutely. Last and certainly not least is Jadee Hanson. She is joining from Minnesota and she's the CISO, Chief Information Security Officer, at Code42. Welcome.
Jadee: Thank you. I'm excited.
Ben: I'm very excited to have all of you. Like I said in one of my tweets, I'm the only one on here without a C-level title, so I'm feeling a little-
George: You got the biggest mic, right?
Ben: That's right. That's why I need the biggest mic, I have to make up for it. I was actually thinking before I got on this today, I spent half an hour yesterday trying to wrangle the Mueller report onto my Kindle. It feels like it has some applicability here. There are a few points of crossover. We talked about a bunch of different topics that we wanted to start off with, and one of the things that came up was insider threats. Teri, you actually posted something really interesting about an insider threat. Why don't you go ahead and start with that? What was it that you posted and why did it interest you?
Teri: Well, it was really interesting. I didn't get the reason behind it, but someone went into a university with a USB killer and basically took down a bunch of machines, and he apparently wasn't a student there anymore. Somehow he got in, so whatever security was there, he got past it and was able to do that. In full disclosure, when I started my first business, I used to go back to the university to print some stuff because I didn't have a printer. I think it's gotten a lot better, but it still goes to show that insider threats can come from a lot of different places, you don't know where.
Ben: Jadee, from your viewpoint, what does that look like to you? Because you had suggested this as a topic even before, when we were planning this. From your perspective, what are you seeing right now and how does that concern you?
Jadee: First and foremost, the trend that I'm seeing right now is that we've gotten good at some of the perimeter defenses, and as those get better and better, the bad guys still have to find a way in. It's scary to think about, but what we're seeing is that a lot of times internal people are being recruited by bad actors outside. When you can't come in through the front door and those things are locked down, in order for them to actually get a foothold in the environment, they'll target individuals within a company to essentially work for a bad actor and then allow certain things to happen. There are even places right now on the dark web where you can go, so let's pretend you worked for some large company. You can go and say what information you know of, and also talk about what access you currently have, and malicious actors can bid on that and say what they're willing to pay you in order to get it, so that they can get a foothold within an environment. That to me is really scary.
Up until this point, we've talked a lot about the external threat coming in, people we don't know, people who aren't part of our own organization, but I see a trend where this insider issue is going to become larger and larger as our perimeter defenses get stronger.
Ben: It's a little scary to think about it that way. What do you see George?
George: Well, I think from our viewpoint there is no perimeter anymore, really. I think the days of a hard shell, soft center approach to security are over, it just doesn't work. Especially when you're in the cloud, I think you want to do security from the inside out, and that addresses a lot of these things. To Jadee's point, and Teri's point, it always comes down to the people, right? Oh, my gosh, I broke the golden rule. My phone is ringing in the background.
Ben: Come on George.
George: I told you I was going to be the weakest link. I [inaudible 00:05:31] though.
Ben: Okay, so how do I kick you off? I think I can kick you out right here.
George: Speaking of the insider threat, it can come from anywhere. I'll tell you one of the tough things, especially being at a software company, and I'm sure both of you can relate to this as well, is that developers usually have godlike rights. I mean, to be able to innovate and get things done, and so you want to trust them. Their point is always, we're the ones working on things so we should be trusted, and it's like, no.
Because it's not even just about trust, a lot of times it's also about mistakes. Do you remember the S3 outage that happened at AWS? I think it was about two years or so ago, it was a fat finger, which is mind blowing that something like that would happen at AWS, but it's not even just malicious activity. It's also someone making a mistake because they have too many rights.
Ben: Jadee, some of the things you mentioned about somebody actually putting themselves up for sale on the dark web, which sounds like the start of some movie, but-
George: Black Mirror.
Ben: ... is that going to be very, very technical people doing this? Or is it something else? Because it seems like you'd actually have to have some level of knowledge and skill set. I don't even know what the dark web is, much less how to post something on there. Is this really a technical threat from people like developers and security professionals, or are other people getting into this?
Jadee: I think other people are getting into it, and here is why. I did an interview during RSA, and the conversation that took place was actually fascinating to me. It's not necessarily people going out and looking for places where they can do this, it's also people being targeted. This particular individual was telling me the story of how a malicious actor externally was targeting lonely developers with pictures of attractive women to incite some engagement. Then from there, it goes a step further of, well, what access do you have, and moving beyond that.
I think the tactics that are being used right now are pervasive. I want to come back to the point that I made about perimeter. I totally agree with George, there is no perimeter. It's almost like, as our controls get better, whether it's the endpoint controls or the controls getting to cloud, our controls are actually getting better from a security standpoint than they were five, 10 years ago. As our controls get better, it's harder for malicious actors to get in without that compromise on the insider side.
Ben: Yeah. It seems like what you're getting at is we're getting better on the technology side, but the humans are still the weak point. We shouldn't be surprised [inaudible 00:08:18].
George: That was fun though. Ben, did you hear Teri? She's pointing me out as one of those lonely developer types that could get scammed by the pictures of women.
Teri: I [inaudible 00:08:26].
Ben: You know what it reminds me of? I worked selling to the government and working with government contracts for years and years when I lived in DC. I remember there was a period of time where there were all of these questionable LinkedIn profiles you'd see, and it was that same thing, attractive women reaching out, trying to connect, asking questions. I remember being warned about that. It was the oddest thing. It's fairly pervasive. I figure that's been going on for a long time with intelligence agencies, so now it sounds like you even have organizations that are more in it for the money, hackers that are basically doing the same thing, which is a little scary, I guess. The flip side of that is, okay, so you're a CSO. What do you do?
George: It is a good question. I mean, I think it's about being able to vet things out in a deeper way with people. A lot of times someone will line up, connect all the dots, but it's conversation after conversation after conversation, and seeing them in different social settings. For example, whenever we recruit somebody, at least, and I'm just speaking for our team, we put them in all kinds of different social situations. We do the hardcore interviewing, the technical piece of it, but there will also be lunches, sporting events, anything to see how people behave, how they act, having them talk about competitors and everything else. Because what you're looking for is really someone very ethical, like the conversation we were having earlier before we got on this, where Jadee and I are doing an employee exchange. We can leave it at that.
Ben: Is that what you want to call it?
Jadee: Do you think I'm getting in contact with this employee?
Teri: Plus you're getting [inaudible 00:10:09].
George: We're working on that, I owe you one. Okay. Let's leave it at that. This gentleman happened to be super ethical, right? He demonstrated that yesterday by telling the truth and coming out with it. He could have given a fictitious answer or said he didn't want to say, but the relationship was more important. I think when you put people in different social settings, you see how they behave. In the hiring process, yes, you want to move quickly on talent, but you also want to see how people behave in different situations.
Jadee: Teri, what do you think?
Teri: I think this gets into one of our other topics and something I feel strongly about. When it comes to the cloud, I was one of the early people to say, "Hey, I think some companies could be more secure in the cloud because there are all these controls you can put in place a lot more easily than you can in a traditional environment." I came up as a software developer over 25 years, I also ran my own company and hired contractors and did the whole mini data center thing. I looked at the cloud and all the companies I consulted with, and I saw a lot of benefits. In the cloud you can put controls in place, you can put monitoring in place, like George's company does, like your company does, and trigger actions when people are doing things that look a little suspicious.
It's also a lot easier to design segmentation and segregation of duties into your processes and into your applications. Even with microservices, so you don't give one person all the code and you can do things like separate the people from the data, which is something that Amazon talks a lot about. There's a lot of things you can do with cloud technologies and deployments to monitor what people are doing and set up guardrails for what they're doing.
Ben: I mean, at the risk of asking the obvious, why is that easier in the cloud? Is it more because Amazon and Azure and GCP have opened up a level of controls and automation that you don't have if you build it yourself, or what's really, truly different?
Teri: That is really the difference. When I looked at the cloud, I saw a big automated platform, and I've seen companies try to do this entirely themselves with their own internal clouds, but the thing is, Amazon has been doing this for 10 years. They specialize in it. They've built this completely automated configuration management platform. Everything that you can do with the push of a button, you can write code to do. You can automate your deployments. A lot of security problems come from human error. George mentioned one, the S3 outage. I was still wondering why there was not some automation there. I'm always curious, but the more we can automate, the better. You don't hear about a ton of mistakes from Amazon because there is so much automation there.
I recently read an article that said they had one person in their SOC, because that one person babysits the automation, and if something goes on, then they call other people. The automation of the whole platform is what brings all the power. You could definitely do the same thing internally, but just think of all the things you would have to build internally to match that platform.
Ben: Yeah, that makes a lot of sense to me. Jadee, from your perspective, does what they've said resonate with you?
Jadee: Yeah, absolutely. We have both worlds going on. We have a lot in the cloud, but then we also have our own cloud. We have pieces of the customer environment that are part of our own cloud, and then we have development pipelines and whatnot in the different cloud platforms. To run the business, we run everything software as a service, so we know those are running in some cloud, and from a security perspective, exactly like Teri said, we are 100% focused on automation and putting our security controls in place in an automated way. We're using Cloud Custodian, which is a Capital One homegrown solution. Oh, Teri loved that.
Teri: That was my alma mater, I helped Capital One move to the cloud.
Jadee: Awesome. Yeah. We're using Cloud Custodian. We love it. It's free, open source. It's great. We've written a ton of Cloud Custodian policies so that as our development team is pushing stuff into production, Cloud Custodian is seeing all of that. We have two groups of policies: one group that tells them this is bad and they should go fix it, and then a group of policies that take action, and slowly we're moving much more into the take-action group of policies. Things like an unencrypted S3 bucket should never, ever happen. If a developer makes a mistake and deploys an open, unencrypted S3 bucket, Cloud Custodian will see that, and in a matter of minutes it'll just tear it down and the risk is gone. That level of automation, I agree with Teri, is why the cloud is much more secure, because we can't do that in the traditional sense.
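[Editor's note: Cloud Custodian expresses the check Jadee describes as declarative YAML policies. As a rough illustration only, here is a minimal Python sketch of the underlying logic, assuming bucket data shaped like the response of boto3's GetBucketEncryption call; the bucket names are hypothetical and no AWS calls are made.]

```python
def bucket_is_encrypted(encryption_config):
    """Return True if default encryption is enabled for a bucket.

    `encryption_config` mirrors the GetBucketEncryption response shape
    (a dict with a ServerSideEncryptionConfiguration key), or None when
    the bucket has no default encryption configured.
    """
    if not encryption_config:
        return False
    rules = encryption_config.get(
        "ServerSideEncryptionConfiguration", {}).get("Rules", [])
    # SSE-S3 ("AES256") or SSE-KMS ("aws:kms") both count as encrypted.
    return any(
        rule.get("ApplyServerSideEncryptionByDefault", {}).get("SSEAlgorithm")
        in ("AES256", "aws:kms")
        for rule in rules
    )


def flag_unencrypted(buckets):
    """Given {bucket_name: encryption_config}, list the non-compliant
    buckets a remediation policy would act on."""
    return [name for name, cfg in buckets.items()
            if not bucket_is_encrypted(cfg)]


# Hypothetical inventory: one compliant bucket, one misconfigured one.
inventory = {
    "payments-data": {"ServerSideEncryptionConfiguration": {"Rules": [
        {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
    ]}},
    "dev-scratch": None,
}
print(flag_unencrypted(inventory))  # → ['dev-scratch']
```

In a real deployment this detect-and-act loop lives inside Cloud Custodian itself, triggered by events as resources are created, so the tear-down Jadee mentions happens within minutes without anyone writing imperative code like this.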
Ben: That makes a lot of sense to me. It kind of applies across the board, because once you lay a foundation of that automation and those tool sets, it lets you focus more on actually getting it done rather than going out there and getting all the tools. That makes a lot of sense.
George: There's also another piece to it though, Ben, which is that it's also a chance to refresh, because in a traditional data center you're always bolting things on. You have applications coming in and going into this traditional data center, you're bolting security around that, and you never really have many opportunities in your career to have a brand new start. Yeah, you can build a new data center, but you're still moving the same applications over to it. You went from P to V, right? I mean, I worked for VMware, where I did security, but we pretty much lifted and shifted the same shit that we had in these physical images into these virtual images, the same services running and everything else. It wasn't an opportunity to really cleanse what you're doing and start from scratch the right way. I think cloud provides that quite a bit.
One thing I'd love to know from the two ladies, which neither one of them has mentioned yet, is, yeah, you can put more controls in, and I'm totally down with that and I think it's a great way to keep people from doing things. Again, we said it, Jadee, you and I talked about this, it's about the people. What are you doing now to start taking a look at the profile of the people your company is hiring, to try to ensure that someone who is a little shady doesn't get brought on board and sell off information and access?
Jadee: Great question.
George: Or what could you do?
Jadee: Yeah, what could you do? I mean, today I think it's the traditional stuff related to background checks and whatnot. Once a person is in, and this is a soft control, not a hard technical control, we've built a ton of company culture around security. Our CEO is talking about security all the time. This is cool: we just launched a Security Ninja program, so we have a series of different color belts, and everybody in the company can do it at their own pace and work through the belts. I'm a green belt right now.
George: It's awesome.
Jadee: We just launched in January, our first quarter. We had the goal of getting 20 people in the company through their white belt, and we had 121 who wanted to do it, signed up, and went through it.
Jadee: People were really excited about the security space and learning more aspects of it. There's a component of holding other people accountable to the same standards that you're held to. Are bad people probably going to get hired? I would guess so, but do we have enough of the cultural elements in place to try to find it and correct it? Hopefully.
George: That's awesome.
Teri: I really like that answer. I think this is a challenge, and I think the challenge comes from the fact that we all want to trust people and we all want to be nice. Some people are maybe a little more suspicious than others because of things that have happened to them or things they know, like working in security. I think exposing more developers, and more people in the company, to training, just like Jadee is saying, on what should be happening, what might be a problem, and what could happen, the potential risk, will help people as a group collectively correct that behavior or point it out.
For example, I heard this in a class, I don't know if it's true, but another student said that Edward Snowden was a really nice guy, and he asked to borrow someone's key and that's how he got the documents. The other person probably felt awkward and thought, oh, he's a really nice guy, I should just give him this credential. Right? I've seen that happen with developers and QA, and they don't understand the potential or the risk. Just instilling that culture, exactly like Jadee just said, I think is really important.
Ben: Now I'm envisioning those posters from [inaudible 00:19:14], loose lips sink ships. What's the equivalent?
George: But it's true though. I mean, Ben, I don't know if you remember, you and I have worked together for a long time. Do you remember my first day at Sumo Logic and the joke that I played, by chance?
Ben: Yes, I have a vague memory. Something about a sticky on your computer about ... I don't remember. You tell the story. I do remember something.
George: Yeah. I got hired March 31st, and I knew I was at the right company for a couple of different reasons on day one. The first one was that the security onboarding was literally eight hours long. Do you remember how brutal that used to be, Ben?
George: I'm sitting there and I'm like, this is too much, by the way, way too much, but it showed the company's commitment to security, no matter if you're in sales or anywhere else. At the end of the day I took a sticky note and I wrote all of our applications on it, and then I put fake passwords next to them all, and I stuck it on a monitor and went home that night. Well, the next day happened to be April Fools', right? That night, at least 50 people emailed our CEO going, do you realize the new security guy has a sticky note on his laptop with all his passwords, like what's going on here?
The next day I walk into the office and everyone is avoiding me like the plague. I could have had the bubonic plague. I walk up to our founder Christian and I was like, "Hey, are we still on for drinks tonight?" He's just looking at me like he wants to be nowhere near me. A marketing lady walked up and she goes, "Hey, do you realize that everyone can see all your passwords, all your applications on that sticky note?" I was like, "Don't worry about it. I'll have it up there for two weeks until I memorize it, and then I'll take it down." Finally she freaked out, and everyone started freaking out so hard that we had to bring the joke back in, or our CISO actually discovered it, because the very last password I had said, if you were reading this, you were duped.
It made me so proud though, in a way, because like Jadee was saying, it's hard for people to turn other people in and to hold other people accountable, but so many people at Sumo Logic were like, "The new security guy is an idiot, look what he's doing. Look what he did." I thought it was really cool that that actually happened.
Teri: Awesome. When you think of it, it's like you created buzz about security, and there's no bad part about creating buzz about security. Right?
George: It's true. Some exposure. A lot of people didn't like me for it, but, oh well. That's funny.
Teri: I know a fun story too. When I started, we actually didn't have a security awareness training video, and we looked out there and there were a bunch of really old, dry ones, and nobody really liked them. People would have done it and we would have checked the box from a compliance standpoint, but nobody would have learned anything. We ended up buying a bunch of Star Wars costumes off of Amazon, and we filmed our own security awareness training. I got my boss, the CEO, to dress up as Obi-Wan Kenobi and do the opening. I recruited my little seven-year-old and she was Yoda for parts of the video. It was a blast. Again, and like-
George: I love it.
Teri: ... I've got to show it to you, but again, you're creating buzz in the office, like, what are they doing? What are they talking about? What are these topics about security? Plus you're doing security in a way that's fun. It doesn't have to be dry and scary, and people actually want to know more. I think anytime you can do that and create buzz, it's probably a good thing.
Ben: I think that's great. The thing I like about what you guys are talking about is that a lot of it is about culture. Having worked with various agencies, let's leave it at that, the way that they would enforce these sorts of things ended up being so draconian. The fact that we can actually do this in a way that doesn't involve things being strapped across your chest, or having to deliver your tax documents to the company, means we're in a good space. One thing I'm hearing from you guys too, having been in that space myself before, is that a lot of the things that make people vulnerable are cultural things. They're having a hard time at home, or they're struggling, they're stressed out, they're burned out, they're having financial issues. Those things are much more likely to be uncovered by a culture that's caring and that cares about its people. It does make a lot of sense.
Now, one thing before we go on, this is for everybody who's listening: you can ask your questions live, so take advantage of that. We will answer them. Any embarrassing questions go only to George, the appropriate ones go [crosstalk 00:23:39]. Yeah, so onto the next topic, because I think it relates directly to this. I had the pleasure of interviewing, on basically the first podcast episode I published, Bill Burns. He's the Chief Trust Officer of Informatica. I remember he talked about what he does there and what he did at Netflix, and he talked about this idea of guardrails. Security teams are typically the team of no, right? They tell everybody no, and then nobody likes them, but that doesn't really work in this fast-moving modern environment.
He had this idea of guardrails where he tried to tell them, "Look, if you stay within this area, you're good. If you go outside, then we've got to talk." He was trying to figure out ways to make that relationship more productive, where it feels like the security team is actually helping accelerate company innovation, not holding it back. Jadee, starting with you, what have you seen there? Does that resonate with you? What do you do to maintain that image of security as not just a team of no?
Jadee: Yeah, good question. I think this is critical, and the team-of-no thing just doesn't work. You could try it at any organization, and no matter what, people are just going to go around you, and you're not going to have visibility into it, you're not going to see what you need to see, and you're not going to be engaged in the places that you need to be engaged. Starting out, it just doesn't work. Don't do it. For me, this was really important, and it's not a calculation: you have to really figure out what the risk appetite is for your organization. Then you have to take that into the guardrails concept of, okay, what is our risk appetite, and what are the guardrails that we actually need to have in place in order for the company to move as fast as we can while still maintaining the risk level that we need to?
One of the things that I did when I started here at Code42 is we talked about what we want to be known for as a security company. When people talk about us, how should they be talking about us? What do we want to hear? What we came up with was not things like, oh, we do really great security. We want to be supportive, we want to be enabling the business, we want to be helping drive the business forward. All of those things are things we needed to think about, making sure we have the right security things in place, but letting the team move as fast as they can.
One of the guys on my team talks about it like this: if all of our security controls are working like they should, nobody should even know they're there. It should just be part of their every day, and it should be in the background. The analogy he uses is, it's like a brake pedal on a car. If you do something really stupid and you find yourself going 95 down a highway, you're going to need it, you should step on it, and it will be there, but for the most part, you shouldn't know it's there until you need it.
Teri: I think this is interesting. There are a lot of people who say, "Well, there's the old school model of just getting in the room with the security folks, having the meeting, and they say no at the end of the project," and as far as I can tell, that's going away. There may still be people doing it, but I hear a lot of people saying it just doesn't work. I remember at a particular company I was at, where I was trying to deliver a project, we got in a room with a guy, I had worked there for maybe three years, and he's like, "I have to approve everything before it goes to production." And I'm like, "I've never seen you before." I don't know. I've been building and leading a team deploying into production. We know that doesn't work, right?
The idea of guardrails is great, but sometimes what I see, and I teach a lot of security people, right, and when I say this they all nod their heads: the DevOps folks and the developers got there first, the security team didn't really think the cloud was going to catch on, and they come in after, trying to implement all these controls, and then there's conflict, right?
My belief is twofold. One, for all these companies who say they don't have security people, I say, yes you do. You just haven't trained them yet. There's a whole bunch of developers, an army of potential security people, really smart people who could understand this. The reason they're fighting against security is because they don't understand it. Developers need to know why. They need to know why you're doing things. They don't know all the security stories that we in security have all heard a million times. My belief is that if they get trained, they hear those stories, and they understand the reasoning behind these guardrails, they will actually help build and support those guardrails. Yes, guardrails are really important, and I think we need to enlist more people in building them, helping put them in place, and understanding them.
Ben: Good. George, jump in.
George: I think a lot of it is about empowering developers. I totally agree with that, and the same goes for the rest of the company. For example, it can be project based. We're standing up, Jadee and I talk about this all the time, FedRAMP. I had to mention FedRAMP, you knew it was coming. We were setting up this federal environment and we were looking for new tools to do code scanning, and this is something I've seen at a lot of different companies I was with too, where the security team has the budget, but then they go out and select the code scanning tool and they're like, "Hey developers, here you go. This is the tool we selected." The developer is going to turn around and do exactly what Teri just said. Why? First of all, why that tool? How does it work? For us, we were like, "Look, we've got the budget, these are the five we recommend. Why don't you guys go test them out with us, and then you recommend to us which one you'd like to use?" That way it empowers them. Now when we have the guardrail in, which is going to be the tool, we get to audit and monitor the tool, but they're the ones who feel empowered because they selected it and rolled it out.
Does that makes sense?
Ben: Yeah. Well, you know what, George, you bring one thing to mind that I'd love to get all of your reactions on. I definitely spent a lot of my career on the operations side, and particularly in the DevOps movement. One of the things I definitely saw is that early on, DevOps was this idea of, okay, you've got these dev people, you've got these ops people, and you've got to get them to work together. I think what I've seen over time, particularly in organizations that really went all the way to the cloud with their microservices and all their modern whatever, is that instead of there being this melding, in some sense engineering started to take over. Because in some sense what the cloud did, and this is back to what we were talking about before, is it reduced the barrier to entry and made it easier to get up and going.
To that point, these engineers are smart. If they can learn a subject, they can go implement it. In some sense, what you saw is that traditional IT might have screamed at the top of their lungs about shadow IT, but at the end of the day they were being sidelined. They were being pushed to the side. You were seeing things like site reliability engineering and disciplines like that, where the development team was directly taking over what you might normally consider operations.
Now, what I'm hearing you guys talk about sounds a little bit like a way for security to maintain relevance and maintain a partnership in a way that maybe IT operations sometimes didn't do five-plus years ago. What do you guys think? Does that resonate at all? Maybe starting with Jadee, do you feel there's that tension going on, or is it different than what, say, a CIO was having to deal with 10 years ago?
Jadee: Honestly it feels like, [inaudible 00:31:01] that what you maybe feel is just the change in responsibility and accountability. A lot of times people talk about DevOps, and you take these ops people and you embed them with engineers and they're off to the races. One of the core components of DevOps that we sometimes forget about is that it's a mindset shift around accountability and ownership. The only tension that I see is when we do all this and we do it really quickly, sometimes the Scrum Masters and the POs are like, "Wait a minute, what am I responsible for now and where am I supposed to go?" Just that lack of knowledge around, what am I now accountable for? Because in the old world, I use this analogy all the time, they just had to build the house, build the house as fast as you can, and then add features to the house as fast as you can.
Somebody in the ops world would come and clean it; they would update servers and they would catch vulnerabilities. Now, with it all being under one umbrella, it's more about education as to what they're now accountable for and what they now own. They own it all. You've got to build the house, you've got to wash the tables, you've got to clean the floor.
Ben: Wait, I have to push back on that analogy a little bit, Jadee, because I think the way I usually felt about the developers is, I had this great house, I decorated it really well as an operations person, and then the developers moved in with their nasty futon that hadn't been cleaned in 10 years. So I don't know.
Jadee: [inaudible 00:32:26] more to it.
Ben: I mean, George, have you managed to balance that yourself? Because I think at Sumo Logic we have a pretty productive relationship, but I don't think that was a given.
George: No, it wasn't. It was definitely not a given. Because what typically happens, and we started talking about this early on, is developers rule the roost. The SRE team, like you mentioned, they're just going out and doing things at high speed, but if you can start getting them to think about security first, I think it solves a lot of those problems. I actually believe most of the issue is with the traditional security team. In my mind, your job should be to get the hell out of the way as fast as possible. If you can provide the right guard-rails and provide the right system, you should be almost transparent and seamless, because you're really auditing things. I'll give you a great example, and this is something really dumb that we were doing at Sumo, on my team. Whenever someone wanted escalated rights or permissions to do something, it would have to go through us.
One day we sat around and looked at each other and we're like, why in the hell are we reacting to this? Why shouldn't the manager decide if this individual gets those rights or not, as long as we're auditing everything? We were slowing things down, and we don't know exactly what rights that person needs or doesn't need. We handed that off, and that's been our team model for a while now, which is get out of the way as much as you can. If you have the right controls in place, if you have the right processes and procedures in place, you can get out of the way, and everyone else in the company can be an arm of security, especially developers.
It takes a lot to get there for sure because, especially at our company, the SREs were first rolling out all the different vulnerability products and doing all the patching and everything. Then it's like, well, who's really supposed to do that? Eventually we sat back and said, "Hey, let's not have any pride. If they're doing the job and they're doing it well, great, let's just audit it, make sure that they're doing it properly, and give them some guidance when they're not." It's become so effective, man. It's like one big security family, but it does take a minute or two to get there, and usually it's the traditional security team's fault, not the developers' fault.
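The model George describes, managers approve and security audits after the fact, could be sketched roughly like this in Python. Everything here is illustrative, not Sumo Logic's actual system: the function and field names are hypothetical, and the "audit log" is just an append-only list standing in for whatever logging pipeline a real team would use.

```python
import json
import time


def grant_access(user, permission, approved_by, audit_log):
    """Record a manager-approved access grant.

    Security doesn't sit in the approval path; it reviews the
    append-only audit_log after the fact and follows up on anything
    that looks wrong.
    """
    event = {
        "timestamp": time.time(),
        "user": user,
        "permission": permission,
        "approved_by": approved_by,
    }
    # Serialize once so the record can't be mutated after the fact.
    audit_log.append(json.dumps(event))
    return event
```

The point of the sketch is the shape of the workflow: the security team owns the log and the review, not the approval decision itself.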
Ben: I think it's interesting. We've basically already delved into this idea of DevSecOps. I remember when that term first started coming up at Sumo, I was having PTSD flashbacks. I got into a Twitter war with somebody over at HP about whether you could call it rugged DevOps or not. That was the old term, and he's like, "Whoa, DevOps should be rugged." I'm like, "Whatever, dude." Teri, I'll let you start with this one. I think there is some back and forth about DevSecOps as a term, but do you think what we're describing here is basically DevSecOps, or is there something else there?
Teri: There are a lot of arguments over different terms, and there are also these terms that become popular, like scrum and then scrum master, and then pretty soon all these consultants jump on the bandwagon, become scrum masters, and completely change it in a way that makes my job hard sometimes. I don't do that anymore, so I don't like to argue about semantics. I like to think about what we're trying to do here. We are trying to bring security into the environment, and I personally think about DevOps and DevSecOps as using automation. I don't think it's going around security, or the other things I've seen that term used for, people with unicorn pictures and conferences that are really nice. I don't get into all that. I just think, let's talk about what we're trying to do here and let's bring everyone together.
I'm a big believer, coming from both the security side and the developer side, that we need people merging and working together, but I also like what George says, which I think is really important. That doesn't mean developers do everything; you still need someone externally auditing. You need that segregation of duties and an external auditing team, but definitely coming together, working on everything, and focusing on what we're trying to do versus a particular term. If someone likes DevOps or whatever they want to call it, I don't care, as long as we're getting security in that [inaudible 00:36:32].
Ben: No, I think that makes a lot of sense. One thing this leads me to, and I know George, you and I have talked about this before, is that it seems to me that if we're going to achieve this vision, part of it is looking for different hiring and recruiting profiles too. The security profile of the people I worked with early in my career seemed very, very different from the person who's going to really thrive on a team today. Jadee, maybe talk a little bit about that from your perspective. When you're recruiting people for your team, what profiles are you looking for now to build the modern security team?
Jadee: They need to know code. We've done a lot within my team where, one, we're hiring for people who actually know code, and we'll put them through a little red team exercise to see if they can find vulnerabilities and whatnot. Then in addition to that, for the people that are part of the team today, we've started an initiative where we're all learning Python. In January we kicked it off. By midyear everybody has to get through a certain training, and then by end of year everybody has to use their newfound Python skills to automate a portion of their job. This is a team thing. Everybody on the team is doing it; whether or not you find yourself looking at code during the day, it doesn't really matter. It's a skill that we all feel is important and that we need to have going forward. That's how we've changed our mix and what we're going after. A guy on my team actually wrote a blog about what we're up to and how important we think it is for security people to have that skillset.
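The kind of task-level automation Jadee is describing might look something like this hypothetical sketch: a few lines of Python that collapse a pile of raw alerts into a per-source summary, the sort of repetitive triage step an analyst could automate with their new Python skills. The alert fields here (`source_ip`, `severity`) are assumptions for illustration, not any particular product's schema.

```python
def summarize_alerts(alerts):
    """Collapse raw alerts into one row per source IP, keeping a count
    and the highest severity seen, so an analyst triages sources
    instead of scrolling through individual events."""
    summary = {}
    for alert in alerts:
        entry = summary.setdefault(
            alert["source_ip"], {"count": 0, "max_severity": 0}
        )
        entry["count"] += 1
        entry["max_severity"] = max(entry["max_severity"], alert["severity"])
    return summary
```

A script like this is deliberately small; the value is that every analyst on the team can read, run, and modify it.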
Ben: Teri, particularly considering your background, what do you think is more likely to happen: security people learning code, or developers basically moving over into security? What are you seeing?
Teri: Well, I do think that it's really important to have developers as part of the security team, but I also have a little bit of a different perspective, because there are people who have 20, 30 years of security experience and they have invaluable knowledge. They know deep security things that you don't necessarily even get to in the cloud, like crafting packets, bypassing firewalls, and other sorts of deep security knowledge that you don't want to just throw away because they don't know code. Right? I like to think of these people as product managers, someone who's going to come and say, "Here are the rules we need to follow. Go build it for me." They can explain it to the developers; they have a lot of value. Sometimes the problem I see with people who say "I do cloud security" is that they know the cloud platform security tools, but they don't necessarily know security, cybersecurity, all the training you get from various organizations. I think having those people there is super valuable. They just start to play a different role, and I don't think they all necessarily have to be programmers.
Ben: No, I think that makes a lot of sense. George, like I said, you and I have talked about this before. How do you look at it?
George: I'm on both sides. I look for developers first because I think they're hungry, depending on what the task is. For example, when we were building out our team, we went after developers to automate as many inefficient things as possible, not only across our team but cross-functionally within the company, which, by the way, really helps embed security into the company. For example, we just went away from WordPress and went to Craft, and there was this whole new public facing website initiative, and one of our guys, our automation guy, really helped lead that charge. Now the marketing people are just like, "Wow, this guy is so good, and the security team is so awesome and on our side." I do think that you need at least an automation bucket or DevSecOps bucket on your team, whatever you want to call it, that has coding skill.
Well, I think it depends on the position, because take the leader of our SOC, and you know him, you know Roland, by now you've seen Roland. I think sometimes, especially at a software company, you need rigor. For the SOC I was looking for people with military backgrounds, people who could come in, get a mission, and execute on that mission at all costs. I think it just depends on what the positions are. I do think you need a lot of security knowledge, but I also think that you need to have coding experience.
Then the other one, which we haven't mentioned yet, and by the way I'm so lucky to be on this panel today, is the capability to talk, to express yourself and market your department. I think now, as a security leader, you have to have evangelistic skills, you just do, to either recruit people onto your team or to sell your initiatives across the company, all the way from the board on down. I think it's so important to be able to talk, and that's where the risk comes in sometimes with developers. Even the developers I've brought onto my team, they can't talk, man. These are vampires that sit around with all the lights off in the middle of the room that don't ever want to talk to people. Which is cool with me, because I get along with those folks, but there has to be a balance on the team of rigor and development skills as well as evangelistic skills.
Ben: No, I think that makes a lot of sense, because when I first started out in my career, I usually thought of the security people as the ones who pressed the button on the scanning software, sat there while it ran, and then gave me a 30-page output that they didn't understand themselves. We've advanced a lot beyond that. I think that makes a lot of sense, because I've seen you in particular, George, and the recruiting you've done [inaudible 00:41:59].
George: Thanks Jadee.
Ben: I mean, you bring on a lot of difference.
Actually, something you said, Teri, really stuck with me. I was a product manager for years, and I think you're absolutely right, because in some sense you're borrowing the best of an agile software engineering model. You have the people that are subject matter experts who come in and provide that guidance, then you have people that actually go execute it, people that run it, and people that are thinking about talking to the outside world. You've got to have all those different skill sets in it. It makes a lot of sense, and it seems like that's maybe how things are changing, because these companies are taking security more seriously. They have to have a broader, more complete approach to it. Does that resonate?
George: Yeah, I'll go first, because I made a note of that, Ben, and I'm glad you brought it up. When she said PM, that's exactly how we look at things. Again, with that code scanning project, you should be like a PM: you have the requirements. You have the requirements around security and what the best possible outcome can be, per what Jadee said as well, for the business. You go to development or the other lines of business and say, "These are the requirements. How you get there, I don't care. I'll consult with you if you want, I'd love to be part of this, but these are the requirements." Then let them go out and choose the best way to get there, because they're going to know their line of business way better than you do. I think a PM approach is actually a really smart way to put it and a good stance to have.
Ben: It actually reminds me of some other things we said. I think when the DevOps movement first started, it was a lot of people who knew just enough coding to stitch things together and could figure things out. I think that worked in some sense, but it seems to me the big difference with site reliability engineering is, okay, let's take a step back and treat this like you're actually building a product. Let's treat this with rigor, let's treat this with process, and actually think about it end to end, which is a different skill set from, I learned a little coding on the weekend and I can tape it together with gum and sticks and whatever and it works.
Putting a bow on this, this was super interesting, and what I want to do to close is go around. I'd love to hear from all three of you about where your mind is next. What are you thinking about that maybe not everybody else is thinking about? Jadee, why don't we start with you?
Jadee: Sure. I'm thinking about next getting out of here and going to my lake place.
Ben: Okay. That's good. I like that.
Jadee: I'm just kidding.
George: Oh, that's great.
Jadee: That [inaudible 00:44:44] like long term for us. I mean, for us it's really all about automation and orchestration this year. That's what we're very focused on. I am thinking a lot about this insider threat concept and how I make sure that I keep a pulse on that, because I do think that's a risk that's going to continue to rise. From a pure delivery standpoint, we're very much focused on automation. From a CISO risk perspective, I'm worried about the data that we have here at the organization and the internal people and threats related to that.
Ben: That makes a lot of sense. How about you Teri? What are you thinking about?
Teri: The thing I've been thinking about a lot recently is, as you know, there are tons of breaches in the news all the time. We have all this automation and we have all this data, and this even relates to Sumo Logic, right? You have all this data around your processes, your automation, the cloud. It's about really distilling that into, what is the risk from all these different security problems that exist in your environment? Having the guard-rails, reporting on what people are doing that's creating risk, and then also who is granting exceptions, when someone says, oh, it's okay to have that CVE, the concept for tracking bugs in those systems. When you get these exceptions, tracking that and then distilling it into reports that the highest level executives can understand and look at. My theory is they're looking at financials. Financials are not simple. You have to learn to read them, you have to learn to understand them. You have to learn to read and understand your security reports the same way and take that risk into consideration.
We have CEOs going up and testifying in front of Congress now, losing jobs, boards getting sued. We've got legislation coming down because these things are not being handled appropriately. I really have a strong feeling right now that there has to be a better way to do risk reporting up to the highest level, and that executives should get involved and understand what these risks are so they can evaluate them appropriately.
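The kind of distillation Teri describes could be sketched, very roughly, as: count exceptions by severity and flag any that have outlived an agreed grace period. The record format, field names, and the 90-day window here are all illustrative assumptions, not any particular organization's reporting standard.

```python
from datetime import date


def risk_report(exceptions, today, max_age_days=90):
    """Roll individual CVE exceptions up into the two things an
    executive report actually needs: how many exceptions exist per
    severity, and which ones have outlived their grace period."""
    counts = {}
    overdue = []
    for exc in exceptions:
        counts[exc["severity"]] = counts.get(exc["severity"], 0) + 1
        # An exception granted long ago is risk someone accepted and
        # then forgot about; surface it by CVE identifier.
        if (today - exc["granted"]).days > max_age_days:
            overdue.append(exc["cve"])
    return counts, overdue
```

The output is deliberately a pair of small numbers and names rather than a raw scan dump, in the spirit of Teri's point that executives learn to read financial statements, not general ledgers.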
Ben: Well, that makes sense. Is it raining in Seattle this weekend? Are you going to be able to go out enjoy the weather?
Teri: It's raining a little bit. It hasn't iced.
George: I love that.
Ben: Now George, I guess I'll start off with: when are you going to have a cool whiteboard behind you?
George: Never. I'm not that smart. I mean, look, this is the funny thing about having to go last. You have Jadee talking about how she can't wait to get to the lake, Teri with her intellectual prowess over there, and now you've got the lame guy going at the end. No.
Ben: [inaudible 00:47:12].
George: Totally, no whiteboard for me. My things are a little different. I look at pain points first. The two main pain points that I foresee coming up for us are, number one, multi-cloud security. We've focused a lot of this conversation on AWS, and I think all three of us are into AWS, but we're taking a very close look at, and probably launching into, GCP this year. That adds a bunch of new challenges: the same level of security, everything else that we've talked about, we now have to bifurcate those efforts across multiple cloud providers.
The other one, and it's what's killing us the most right now, I'm not going to say FedRAMP, Jadee, it's got to be privacy. Privacy is slowing down our capability to do business so much, with GDPR, CCPA, the California Consumer Privacy Act, Illinois, New York, Australia, Brazil. How do we get our hands around that? One of the things that we've done is work very closely with ISO, ISO 317 in particular, around a privacy by design program. We sit on the technical advisory group for it. I'm hoping that we get an overarching privacy standard that can cover most of these individual regulations, because this is going to get ridiculous. Contracts have slowed down so much, and I guarantee you both of my peers on this call have seen that as well. It's slowed down business because everyone is trying so hard to focus on these individual privacy regulations instead of making a best level of effort and doing what's best to be able to do business together. It's never going to scale at this rate.
Teri: Yeah, it's funny that you mention that. This morning, my senior vice president of research and development, who is a little bit of a jokester and is dealing with a bunch of legal privacy requirements, entered a Jira ticket to our help desk saying he would just like to purchase Canada, because he thinks it'll be easier that way.
George: That's awesome, and very relevant.
Teri: Right. I closed it: won't fix.
Ben: I don't know, I think Canada's value is rising these days. You need to move quickly.
George: They have PIPEDA though, which is a privacy regulation. That's a pain.
George: That's right.
Ben: What does that stand for?
George: Don't ask me that. I just know the acronym. Gosh, ask one of them, they'll know. But interestingly enough, I was just in Canada last week, and you have PIPEDA, which goes across most of Canada, but then BC on the western side, Vancouver, they have their own. I think everyone is just trying to focus hard on individuality, which I really respect. Again, though, when you're talking economies of scale, it's going to keep people from getting things done. Ultimately, and this is the last thing I'll say about it before I get off my bandwagon, because we could do a whole session on this, a lot of the responsibility is on the consumer. Know what data you're giving out, know what data you're passing, and then hold vendors and platforms accountable.
Ben: Yes. We're leaving on a note that you're expecting users to take care of their security?
George: Yes. Totally. Absolutely. It's going to come down to that one day.
Ben: Yeah. You're right. You have to take some personal responsibility.
Well, all three of you, thank you so much for coming on. I think out of all of those answers, Jadee, I liked yours the best. I wish I had the chance to go too. That sounds nice.
George: I just want a bloody mary right now.
Ben: Yeah. Thank you all for coming on and thank you for everybody that joined us live and we will be posting this up so that you can share it out and look at it again and everybody have a great weekend.
George: Thanks, you too. Thanks everyone. I really appreciate it. Nice job.
Speaker 5: Masters of Data is brought to you by Sumo Logic. Sumo Logic is a cloud native machine data analytics platform, and they bring real time continuous intelligence as a service to build, run, and secure modern applications. Sumo Logic empowers the people who power modern business. For more information, go to sumologic.com.
For more on Masters of Data, go to mastersofdata.com and subscribe and spread the word by rating us on iTunes or your favorite podcast app.