MOD: Masters of Data

Bringing the human to the data

Bill Mew: Data Ethics is the new Data Privacy

Founder & Strategist, Mew Era Consulting

January 21, 2019

33:44

Make data ethics part of your cultural behavior [...]

Bill discusses what companies have to do to avoid some of the recent privacy and ethical disasters.

Show notes

Ben: Welcome to the Masters of Data Podcast, the podcast that brings the human to the data. I'm your host, Ben Newton. Some clear themes have arisen on the podcast so far: trust, privacy, and ethics around data. We have talked to authors, company leaders, big thinkers, and influencers. Bill Mew was a guest in June of 2018, talking to us about data privacy and the European General Data Protection Regulation, or GDPR. We have brought him back on to talk about his current focus, data ethics.

Bill has crafted a very solid message around what companies have to do to avoid some of the recent privacy and ethical disasters that led to huge problems for the brands and the loss of their customers' trust. So without any further ado, let's dig in.

Welcome everybody to another episode of Masters of Data. I'm very happy to have a repeat guest on here, Bill Mew. He is a cloud strategist who is now focusing on digital ethics and transformation. I'm really excited to have you on here again. Thanks a lot for coming on, Bill.

Bill: Oh, it was great to meet you last time. The last one we recorded was actually in London, face-to-face. This time we couldn't meet up face-to-face, but it's great to chat again.

Ben: I'm definitely planning on coming out there again, so we'll just have to make this a repeating thing we do, Bill. It's been a little while since we talked. Like you said, we met when I was in London in the July timeframe, just after GDPR had gone through and was formally adopted. What's been happening with you since then? What have you been focusing on?

Bill: Well, when we met last, it was so soon after GDPR that I think a lot of organizations were in a certain state of shock. I think the regulations have all bedded down a certain amount since. I've completed my two-year stint at UKCloud and I've decided to go out on my own. I'm now consulting and speaking and doing a whole lot of stuff around helping companies strike the right balance between digital transformation and digital ethics, between changing the world and doing the right thing. And obviously, the headlines since we met last have been filled with one Facebook scandal after the next. It is evident to all that something needs to be done on the digital ethics side. I'm currently doing a lot of speaking, I've been invited to do a book which I'm just starting on, and a whole lot of papers and other stuff. A number of commentators have focused on this as possibly the biggest issue of the year ahead.

I know that Gartner, for the first time, focused on it as one of their top ten technology issues for the year ahead, and they've said that it's absolutely essential that companies go from a tick-box attitude to one where they're changing the way that they behave, in order to take it seriously and to engender trust by doing the right thing. I think Forrester has come out with some very similar predictions. They've predicted a spike in privacy tool adoption and the use of opt-out settings, and they've said that people will make these decisions based on their own personal preferences. But it will make the life of marketers very difficult, because hyper-personalization is going to be exceedingly difficult when people opt out.

We're also going to see a number of consumers being far more aware and therefore far more active in enforcing their privacy. They'll alert regulators to any kind of malfeasance, and they'll be filing lawsuits; class action lawsuits are starting to appear in Europe, an infection that we're getting from you guys in the States, [crosstalk 00:03:50] cutting companies if they fail to protect their privacy. And there's also been some very interesting research recently where they polled the general public, consumers both in the UK and the US, asking them what they really cared about. Very high on the list of issues is access to healthcare, access to education, protection from terrorism, all the things you'd expect. Then you'd say, well, those are the kind of things that you expect the government to deal with. What is it you expect from companies and from brands?

In the past it's been things like diversity or sustainability or other worthy issues. [inaudible 00:04:26] has jumped straight to number one in the charts, and it's data security and privacy. We've never seen [crosstalk 00:04:32] and it is now the number one issue. If you're a company out there, and I'm not just talking about tech firms here, I'm talking about any company out there, and we've just seen, only in the last few days, the big issue with Marriott hotels, it goes to show it's not just the tech firms that need to be data-ethics aware. It's any company out there that's processing data, any company that could have an issue, and that's almost all of them. And therefore you need to take that really seriously.

I think, for the very first time, we're seeing a single issue not only be the lead brand attribute, in that you ultimately want to be a trusted brand, perceived by your customers as taking a stand and as trustworthy with their data, but at the same time that could turn on the flip of a coin: if you have an incident, all of a sudden data security could be the one issue causing the greatest risk and threat to you. It is the first time that we've ever seen a single issue be both the leading brand attribute and the leading brand risk at the same time.

Ben: It's fascinating. What do you think has changed? Because I'm definitely seeing a lot of that same change in the conversations I'm having in a lot of different places. But what do you think has changed?

Bill: I think previously, and this may have been naivety on all our parts, we just trusted technology. We trusted that it would work. We trusted that wherever the data went, it would be held securely and it would be used ethically and responsibly. In hindsight, you may say, "Well, that was incredibly naïve of us," but it hasn't worked out that way. You've seen the Cambridge Analytica situation. We've seen big companies like Marriott fall over their feet. We've seen all sorts of things, and it's brought home to us that actually we shouldn't be resting on our laurels, we shouldn't be just accepting that our data's being shared and taking it for granted. Under GDPR, we now have the right to have our records removed, we have the right to privacy, we have the right to be removed and forgotten, and people are going to be exercising those rights. They're also going to be coming out and really punishing companies that don't get this kind of thing right.

Ben: And what do you think it is in particular with the average users? Because I can see how the GDPR regulation itself generated a lot more awareness at the corporate level, perhaps, but for your average person on the street, why do you think they're more aware of it now? Are there particular things you think have happened, or is it just a natural evolution of the technology?

Bill: I think things have moved on. It's not only their level of awareness and the ways that companies are using data, it's actually the fact that most companies don't really have a [inaudible 00:07:15] handle on their data. We've seen a migration to the cloud at the same time as we're seeing an exponential rise in data, and this actually causes an enormous risk. When people talk about moving to the cloud, normally their focus is on the applications. They look at all the different applications, and they look at which ones are right to go cloud-native and be developed or re-engineered to be hosted in that environment. They look at which ones are virtualized and could be hosted quite easily in that manner. And they look at which ones possibly aren't cloud-ready.

And most of the focus in the migration strategy is on the different workloads and the different applications. One of the things often overlooked is the data. Once you migrate a lot of your applications into the cloud, if your data is continually growing at an exponential rate, then you've got a real problem. How do you handle all that data? There are hardly any CIOs out there that have budgets growing exponentially. If anything, most of them are under pressure to reduce budgets. Therefore, when you look at their ability to deal with the massively growing volumes of data, you have to ask: what are you doing with it? How are you managing it? How are you keeping it secure? How do you know what personally identifiable information is out there? What if you have customers who've come and asked for their data records to be removed? How do you handle those requests? Are you able to do so?

An organization's ability to manage this exponential growth in data, and to identify what data within it is subject to GDPR because it's personally identifiable, is a real challenge for organizations out there.

Ben: Yeah, I definitely agree. One thing that comes to mind, from some of the things I've been reading about how you've been thinking about this: when you use the term ethics and you're talking about the brand risk and brand attributes around trust, how do you tie the ethics to the brand risk there? Do you feel like people are seeing this as an ethical issue? Because to me those are kind of approaching the problem from two different sides. If a hacker comes in and steals a bunch of user information, like at Marriott, that's obviously a risk to my brand in terms of trust, but how does that relate to ethics, from an ethical standpoint?

Bill: Okay, there are three aspects here. First of all, you need to have a company that has actually got it together in order to have any chance of acting responsibly and ethically with data. This requires an alignment of functions that haven't traditionally worked together. [inaudible 00:09:58] In any organization you have the chief marketing officer, who owns the brand. He will be the person responsible, possibly through his PR department, for having a crisis management plan if things go wrong and there's a big data leak. But at the same time, in his marketing department, he will also be wanting to look at customer data to do some sort of predictive analysis, to use the data to its best extent, to do the sort of analytics that provide data insights.

Then you have the CIO. Now, the CIO will have the application and data management headaches, but he will also have the information security strategy, and that information security strategy won't necessarily be aligned with the crisis management plan. And then you have a third group: the chief risk officer, who is very much focused on compliance. They will have the privacy impact assessment and they will be ensuring that every single box on that particular side is ticked. But obviously, they potentially have a third, different plan.

And the privacy impact assessment isn't necessarily aligned with the information security strategy or the crisis management plan. And actually, to be in a position to actually deal effectively and ethically with your data, you need to have those three aligned. And this is important in order for companies to be on the front foot.

And then the second aspect: when things are going well and we don't have an issue at our door at this very moment in time, we want to think about what our ethical position on the use of data is. What information do we want? How do we want to explain to our customers what we're doing with it? What are the measures that we take? What is the actual culture within the company about how we treat customer data? This is becoming increasingly important, because in order to be a trusted brand that people actually want to do business with, you need to demonstrate the ethical behavior that engenders that kind of trust. It has to be cultural, it has to be widespread throughout the company, because there has been some interesting case law in recent times.

In the UK there's a supermarket chain called Morrison's, and one employee within Morrison's posted the personal data of a hundred thousand employees online. There was a big stink about this. But what was most interesting was that, in the past, a company could not generally be held liable for the criminal actions of its employees. In this instance, though, the judge ruled that because Morrison's had put this employee in a position of authority and trust, and he had access to that data, they were vicariously liable. This opens a whole new door: Morrison's now needs to compensate the hundred thousand employees for the criminal actions of one of its employees. And it's not just that; we're also seeing class action suits.

In France, we've had a massive class action suit brought by the French Internet Society; they're trying to get a hundred million users together to bring a challenge against Facebook in the courts. We're also seeing shareholder action. If a company doesn't respond quickly or doesn't act responsibly with its data, the shareholders will take action. In 2016, we didn't see a single shareholder lawsuit around data security. I think we saw nine in 2017. And in 2018, [inaudible 00:13:26] already seeing that take off, so it is very much a trend.

There are all sorts of legal aspects there, but then it goes beyond that. There's the regulator. We've got GDPR now in Europe, and very soon we are likely to have some form of regulation in the US at a federal level. It's not so long ago that companies were saying this will never happen. Their tune has changed; they're now saying this is how it could happen. They're all trying to influence what comes to the fore in terms of federal regulation, but most people now see it coming. And then beyond that there could be fines. They could even, and under GDPR they can do this, remove your right to process data. That would bring your company to a complete standstill overnight.

And then beyond the law or the regulation, there's the brand aspect. Look at how tarnished brands can be. Just speak to Equifax; look at the guys at Marriott right now. Every single aspect here has to be taken into consideration. When you do this kind of thing, it's important that you focus not only on what happens when things go right, but that you're ready for when things go wrong. When things go wrong, the crisis management plan needs to come to the fore. It needs to be aligned in advance with things like your information security strategy and your privacy impact assessment. But you also need to think about it in a different way than companies have in the past.

Traditionally, crisis management is all about containment. Your strategists will come in from some crisis management agency, or possibly your own PR department if you don't have a crisis management agency, and they will seek to contain an issue, to hush it up as much as possible, to make sure that you're handling and controlling the message. That used to be okay in normal crisis situations.

But regulations like GDPR actually mandate that you make prompt disclosure to the regulatory authorities and to the impacted customers, and therefore containing the issue or hushing it up is no longer an option. In terms of controlling the message, if you've had a major breach, and the guys at Marriott are probably working overtime at the moment on the situation there, you're going to have real trouble countering the level of hysteria and misinformation that you're going to see in the press. And this will be at a time when your credibility, and your ability to hold the line on that media speculation, is at its lowest ebb. That is where we're recommending that you work with influencers out there who can help you get your message across, who can help you counter some of the hysteria and some of the misinformation, because you're going to have your work cut out.

And the threat here, from the legal side, from the regulatory side, and from being tried in the press amid all the hysteria and misinformation that could occur, is massive. This is a threat to any organization; we're not just talking tech firms here. We're talking Marriott, we're talking Equifax [inaudible 00:16:27], any company that handles data. It could happen to you.

From my perspective, I hope that most of the other companies out there, certainly the ones that are listening to us now, aren't vulnerable and that it doesn't happen to them. But ideally, we're hopefully going to reach a position where they're better placed to deal with it, because they've taken digital ethics seriously at a corporate level.

Ben: This is really fascinating, the way you're talking about this. To summarize it a bit: we've moved from this idea of data privacy, data ethics, and trust being a containment thing, where if something goes terribly wrong, how do you contain it, kind of having the check mark of security protection, to these being issues where you have to be more proactive. One thing that stands out from what you're saying is that beyond you having to be proactive, your users are now more empowered because of GDPR, so they can be proactive on their own and push back against you. And then I think there's a lot of awareness among journalists and authors. I've interviewed a couple on the podcast, like Cathy O'Neil with her book Weapons of Math Destruction, and there are a couple more out there like that who are actually pushing back on the companies about how they're using their data. So even if you think you're maintaining and keeping the ship level, people will push back against you. In some sense you have to assume these things are going to happen and you have to prepare for them, right?

Bill: Yeah, any kind of malfeasance is going to be punished exceedingly harshly by consumers. The awareness is now that much higher, and their propensity to act is higher. They're going to be alerting regulators immediately to any malfeasance they see. I've talked about some of the class action lawsuits, not just from users but from shareholders. And then there's brand loyalty. There's been research coming out saying that more than half of customers would abandon a brand if they felt that there'd been any type of breach or that their data had been mishandled in any way. A significant portion would go further than that and actually consider taking legal action. I think we're going to start to see consumers become privacy enforcers in their own right.

Ben: Right. That makes sense. And one thing that stands out, too: I just had a conversation recently with one of the founders of Samsung SmartThings, they make home automation, and one thing that struck me was the way he talked about establishing trust. To that point, they started out early on like this. They built a community where the users feel involved. And maybe that's one thing that stands out to me about the companies that do this well versus the companies that don't: do you feel like you're an object of their data ethics plan, or do you feel like you're a participant?

Because if the users feel like they actually have some insight, if they actually have some involvement, if there's a community voice there, it feels like in their case their users feel pretty empowered and pretty happy with what's going on. Whereas in some other situations, some of the high-profile ones like Facebook, even though it's a community itself, you didn't feel like you were actually a participant in how your data was going to be used until very recently. Does that sound right to you?

Bill: Part of it is that level of engagement, but it's also taking a visible stand. If the way in which you ask for permissions, the way in which you act, and the corporate culture around the use of data are very evident, then people will actually sit up and say, well, here's a bunch of guys who are taking a visible stand in terms of digital ethics and the way that they treat customers, and they'll have a totally different orientation towards you than they will to people who don't act in that manner. It's going to be a source of competitive differentiation. You've already seen a level of backlash, certainly in Europe, against some of the global brands, because there is a lack of trust at this moment in time, and some of these large brands are going to have a battle getting back the level of trust that they've lost. I think Facebook in particular has got a massive uphill battle.

Ben: For some reason, this actually reminds me: do you think this has any parallels with some of the things we've seen around food quality and organic versus non-organic foods, where there's this sense that people wanted to know the origin of their food, and they cared, and they wanted the labeling, and there was a lot of community involvement? It seems to some extent we're seeing that with our data right now.

Bill: Absolutely. A short while ago people wouldn't necessarily have cared what was happening with their data, because that level of awareness simply didn't exist. Back to the sort of naivety scenario that used to exist, where you trusted technology, you trusted it would work, you trusted data would be held safely somewhere and that it would be used ethically. That naivety has vanished. It's a bit like after you have a food scare. In the UK we had BSE with British beef, and it led to a heightened period of awareness amongst consumers about the food chain. I think we're seeing that now in the data management food chain, if you like.

Ben: Yeah, absolutely. I love history, and I've been going back and reading about some of the stuff in the early 20th century when the FDA was formed and we saw some of these books coming out about food safety and the treatment of workers and things like that. It seems like we're going through one of those moments where there's a heightened awareness. But it seems like part of what you're saying here, too, is that it's not just the companies themselves. Talk a bit more about the influencer angle here.

So if I'm hearing you right, what you're saying is that these companies have to actually engage with people who are not necessarily employees to help get that message out. Is that what you're saying?

Bill: It is useful if there are ... I mean, I'm a very public advocate and influencer in this particular arena. If you go to Twitter and type #privacy, I'll probably be the number one individual in the world at any point in time, depending on whether I'm having a good week or a bad week, so I'm one of the most high-profile advocates out there talking privacy. The whole time, I've worked with some of the leading privacy campaigners, such as [inaudible 00:22:42], helping him to crowdfund an NGO to represent people's privacy across Europe. And I've worked with some of the leading cloud firms in this country who have provided particularly secure, data-sovereign services to differentiate themselves against some of the global players. I would be one of those, and there are a number of influencers out there who are well known around either security or privacy.

Those are the type of people, possibly also journalists and analysts, that you need to have an existing relationship with long before things go wrong, such that if and when they do go wrong and your credibility is at its lowest ebb, you're able to brief them and explain the real situation, to get through the hype and some of the misinformation. And hopefully they can then, using their authority in the market and their credibility in this particular space, talk rationally about what is really happening, to cut through that hype and that misinformation.

But relationships with these types of people are not something you can build at the very last minute. You have to have done this beforehand. It's like suddenly deciding to learn to swim once you're drowning. [crosstalk 00:23:55] you're stuffed. But if you [inaudible 00:23:58] some time in advance, you've got a good chance of surviving. In the same way, you need to work with influencers and brief them about your data ethics standards, such that when things go wrong you have an existing relationship with them. You can call out to them and say, look, this is misinformation, this is hysteria; you're a responsible journalist, you're a responsible analyst, you're a responsible privacy influencer. Help us put the record straight. And that is the way I think companies are going to need to go, because the whole traditional crisis management orientation around containment simply isn't going to work in this day and age.

Ben: It's fascinating the way you describe it. Because going back to where you started, where you were talking about the different functions within the company that have to collaborate: typically on the security side, and definitely on the technology side, the CIOs and CISOs in that arena aren't used to engaging like this with influencers, thinking about perceptions, and getting ahead from a communication perspective. And I can see how that's one reason the marketing function, the CMO, getting involved with this is going to be super important. Because just like you said, coming from a software background, we spend a lot of time with analysts because you have to, and you can't just do that when you release a product feature or something goes wrong; you actually have to build a relationship over time.

Is that partly where you see the marketing function coming in?

Bill: Yeah, I think if you're a brand custodian out there, my biggest message to you would be: take digital ethics seriously and try to make it part of your corporate [inaudible 00:25:37] to establish yourself as a trusted brand, because that will differentiate you in the market, that will be a competitive advantage. But you also need alignment between your crisis management plan, the guys in the CIO department and what they're doing about information security management, and the chief risk officer or the compliance group and their privacy impact assessment, to make sure those are aligned.

You also need to have a plan that is practiced at intervals, so you do scenario planning and you do rehearsals. And on top of that, you're developing relationships with those analysts and the privacy influencers, such as myself, so that you have a relationship well in advance of anything going wrong. They reckon that, in a crisis, it is the first hour that will define the outcome of the entire scenario. If you are quick to get all the right people together, to get a holding statement together, to alert all the different key constituents and stakeholders, including those influencers you need to reach out to, and you can pull together a lot of stuff within the first hour, you have a far better chance of cracking the whole problem and getting ahead of it.

If you can't do that, and you suddenly decide, I'm going to start learning to swim now because I'm starting to drown, it's too late. You're never going to pull things together in the first hour or the first day. It may even take you weeks, and you will be behind the game all the way throughout that period, and your brand will be cut to shreds.

Ben: I mean, if anybody's listening to that, I would hope they're getting a little afraid if they haven't thought about this already. Okay, so for somebody listening to you, or hearing this message somewhere else, that they need to take this seriously: where is the right place to get started? What are the first couple of things they need to make sure they get done if they haven't really gone in this direction yet?

Bill: You can get crisis management templates anywhere off the web; they're fairly straightforward to get ahold of. There are crisis management agencies out there, though not many of them are yet oriented specifically around the data incident, or the data crisis scenario. I've got a group called FUD Busters who are out there [inaudible 00:27:55] to try and counter all the fear, uncertainty, and doubt that occurs in this sort of scenario. We're very happy to work with people when disaster occurs, but we'd far prefer to consult with people well in advance: to help them align their different departmental strategies, to make sure they have the correct preparations in place and the right scenario planning, possibly the ability to rehearse those types of scenarios, and that they're engaged with influencers such as myself, but there are others out there they could work with, in order to be well prepared.

The other thing to consider is that if you make data ethics part of your cultural behavior, your employees are far more likely to act responsibly. They're also far more likely to treat customer data as an asset and something precious. And you're going to be far better placed to respond, because you've done this alignment of the different departmental strategies. So actually, the digital ethics approach is not just about differentiating yourself in the market and having a competitive advantage when things go well; it's about making sure that disasters are less likely to happen, and that if they do, you're better positioned and better prepared to deal with them.

Ben: What you're saying about the employee culture makes a lot of sense, too. I'm picturing the early days here, when our chief security officer had her selection of a few different stories that she would scare the crap out of us with: this can happen, and this happens, and this happens, and then this company failed. I think having that early on, so that employees really see the privacy of their customers' data as a top concern ... back to Samsung SmartThings, Scott [inaudible 00:29:44], who I interviewed, said: we believe that our customers, our users, own their own data.

And while that seems kind of obvious when you hear it, I don't think a lot of companies act that way. So having somebody say that and proactively act on it, that's a cultural attribute that pervades the entire organization, and that seems to be key to really being [inaudible 00:30:03]. It can't be just the leaders saying this; it actually has to pervade the entire organization.

Bill: Look at your own company, whichever company you're listening to us from, and just think about it. If your company is the type of organization that thinks information security, well, that's something the IT department does; compliance with GDPR or whatever regulations, that's something the compliance department, the CRO, does; and brand reputation, oh, that's something the marketeers do. If that's the attitude in your company, you're already heading in the wrong direction. There are many organizations already that have an ethos that we don't just have a marketing department: everyone's marketing. Everyone's out [inaudible 00:30:44] sell the brand; everyone has a responsibility in putting forward the corporate ethos. If you have that more positive, forward-looking attitude and you incorporate digital ethics, such that you're not only being responsible about looking after the customer but also thinking about their data and their data security, then culturally you're at an advantage in all the ways I mentioned earlier.

Ben: Yeah. That makes a lot of sense. Going forward here, I mean, what's coming for you? It sounds like you're getting your message out there, and hopefully you're spending a lot of time with companies that are trying to engage you to get ahead of this. Is that right?

Bill: I'm speaking to a number of different organizations. I've been approached by some publishers who've seen a lot of my writing and speaking, and they want me to write a book, and I'm basically planning that out at the moment. I've got some speaking opportunities, and I'm still very active on social. Follow me @BillMew, and I'm sure the link will be included in the write-up here. And I'm passionate about it. I mean, that's probably evident from the way I talk. [inaudible 00:31:45] ways in which I can help companies. Not just help them when things go wrong, and I can do that, but I also want to help companies well before things go wrong and try to prevent them from going wrong.

Ben: Yeah, absolutely. And I think that's a great message. It's all about thinking ahead of time, actually considering where you want to be and planning for that ahead of time, so that you're able to react more quickly and more comprehensively. And I think that's a good message to get out there. Like I said, Bill, I'd love to get you on here again to see how that's going, and if you write a book, I most definitely want to be talking to you about it. So I appreciate you taking the time to come on here.

Bill: It's been a pleasure, just like last time. It's a fascinating and very quickly evolving arena, so no doubt whenever we speak next, more will have happened. It was only Marriott in the last couple of days, and I'm sure there are going to be more incidents, and more brands that really suffer from this. If your company and your brand don't want to be one of them, you need to wake up. Digital ethics. It's going to be the big issue of 2019, just you wait.

Ben: All right. I think you're on the right track. Well, thanks a lot, Bill, for coming on. And for everybody listening, we'll put some links in the notes to some of the things that Bill's been talking about, and definitely subscribe to us on your favorite podcast app and rate us so that other people can find us. We appreciate you taking the time to listen.

Speaker 3: Masters of Data is brought to you by Sumo Logic. Sumo Logic is a cloud-native machine data analytics platform delivering real-time, continuous intelligence as a service to build, run, and secure modern applications. Sumo Logic empowers the people who power modern business. For more information, go to sumologic.com. For more on Masters of Data, go to mastersofdata.com and subscribe and spread the word by rating us on iTunes or your favorite podcast app.

The guy behind the mic

Ben Newton

Ben is a veteran of the IT Operations market, with a two-decade career across large and small companies like Loudcloud, BladeLogic, Northrop Grumman, EDS, and BMC. Ben got to do DevOps before DevOps was cool, working with government agencies and major commercial brands to help them be more agile and move faster. More recently, Ben spent 5 years in product management at Sumo Logic, and is now running product marketing for Operations Analytics at Sumo Logic. His latest project, Masters of Data, has let him combine his love of podcasts and music with his love of good conversations.

More posts by Ben Newton.

Listen anytime, anywhere

Available to stream or download via these and other podcast apps.