In today’s ever-changing business landscape, those that operate using a software-driven model will be the most successful. These businesses recognize the power of transforming enormous volumes of data generated by digital operations into real-time insights that propel further success. The ability to do this in real-time, all the time, across multiple functional disciplines, lies at the heart of continuous intelligence.
As part of its ongoing commitment to innovation, a leading global airline company embarked on a major initiative that, when fully completed, would entail moving hundreds of applications to the cloud. However, essential to this initiative was the need for the company’s nascent cloud platforms to first attain compliance with the highly demanding PCI Data Security Standard. Failing to achieve this milestone would endanger the company’s entire digital transformation efforts.
A cost-effective, cloud-native platform can accelerate your PCI readiness with ease. Cybercriminals are continuously evolving their tactics to access valuable information, and for organizations that handle credit card data, just one breach can have far-reaching consequences that negatively impact brand reputation and the bottom line.
Among other vital responsibilities, Vaimo is tasked with safeguarding its clients’ eCommerce operations. This means relentlessly applying creative techniques to thwart persistent, malicious behaviors that include spam, identity deceit, and bogus user creation, all while attempting to restrict access to key infrastructure resources to a core group of authorized specialists. Concurrently, the company eagerly sought new ways to satisfy an ever-increasing set of regulatory requirements.
As part of Paf’s journey to the cloud, the company needed a data analytics solution to help consolidate tools and give it the visibility to monitor its distributed systems as it worked to modernize its application stack on Amazon Web Services (AWS). Paf was looking for an automated solution that would support its hybrid infrastructure, correlate log and metric data, and support compliance with regulations such as GDPR.
As is the case with every financial services organization, eNett was obligated to comply with a constantly evolving collection of compliance regulations, while concurrently striving to reduce its security exposure. Carrying out these responsibilities consumed a significant amount of time and effort, which siphoned off resources from the company’s primary mission. At the same time, eNett eagerly sought techniques to empower its front-line customer support staff to speedily resolve client issues without needing to disturb its software developers.
Syapse’s innovative precision medicine solution requires ingesting enormous volumes of disparate information from numerous sources. Loading, organizing, and relating all of this raw data is regularly subject to complications such as interruptions, errors, and discrepancies. These issues often required manual resolution, which routinely distracted software development and operations staff from their primary tasks. As a HIPAA-compliant and HITRUST-certified company, Syapse needed standardized processes and audit capabilities to protect patient data.
In this latest SnapSecChat video series, our CSO, George Gerchow, talks about the imperative of building the next-generation security operations center (SOC). Why? Because today's IT landscape is becoming increasingly complex and cloud-based. Everything has changed, from applications being built on microservices, to the CI/CD pipeline, to new and emerging toolsets that help us manage and secure our cloud-based infrastructure via APIs. That means we must completely reimagine how we manage security operations if we are to keep pace with the rate of technological innovation. This includes a new level of rigor, adaptive processes, and industry and team collaboration to ensure we take a proactive security approach and stay ahead of attackers. For Sumo Logic, this means working with customers like The Pokemon Co. International and partners like Nordcloud to share industry best practices, processes and key learnings on how to effectively operate a modern-day global SOC. To learn more about how Sumo Logic is working with Pokemon on joint SOC initiatives, check out this recent blog as well as the customer press release. If you enjoyed this video, then be sure to stay tuned for another one coming to a website near you soon! And don’t forget to follow George on Twitter at @GeorgeGerchow, and use the hashtag #SnapSecChat to join the security conversation!
What are Sumo Cert Jams? Sumo Logic Cert Jams are one- and two-day training events held in major cities all over the world to help you ramp up your product knowledge, improve your skills and walk away with a certification confirming your product mastery. We started doing Cert Jams about a year ago to help educate our users around the world on what Sumo can really do and give you a chance to network and share use cases with other Sumo Logic users. Not to mention, you get a t-shirt. So far, we’ve had over 4,700 certifications from 2,700+ unique users across 650+ organizations worldwide. And we only launched the Sumo Cert Jam program in April! If you’re still undecided, check out this short video where our very own Mario Sanchez, Director of the Sumo Logic Learn team, shares why you should get the credit and recognition you deserve! Currently there are four Sumo Logic certifications: Pro User, Power User, Power Admin and Security User. These are offered in a choose-your-own-adventure format. While everyone starts out with the Pro User certification to learn the fundamentals, you can take any of the remaining exams depending on your interest in DevOps (Power User), Security or Admin. Once you complete Sumo Pro User, you can choose your own path to certification success. For a more detailed breakdown of the different certification levels, check out our web page, or our Top Reasons to Get Sumo Certified blog. What’s the Value? Customers often ask me one-on-one what the value of certification is, and I tell them that we have seen significant gains in user understanding, operator usage and search performance once we get users certified. Our first Cert Jam in Delhi, India with members from the Bed, Bath and Beyond team showing their certification swag! First, there’s the ability to rise above “Mere Mortals” (those who haven’t been certified) and write better and more complex queries.
From parsing to correlation, there’s a significant increase in feature usage by certified users taking Pro (Level 1), Power User (Level 2), Admin (Level 3) and Security. Certified users are taking advantage of more Sumo Logic features, not only getting more value out of their investment, but also creating more efficient, performant queries. And from a more general perspective, once you know how to write better queries and dashboards, you can create the kind of custom content that you want. When it comes to monitoring and alerting, certified users are more likely to create dashboards and alerts to stay on top of what’s important to their organizations, further benefiting from Sumo Logic as a part of their daily workload. Here we can see that certified users show an increase in the creation of searches, dashboards and alerts, as well as key optimization features such as Field Extraction Rules (FERs), scheduled views and partitions: Join Us If you’re looking to host a Cert Jam at your company, and have classroom space for 50, reach out to our team. We are happy to work with you and see if we can host one in your area. If you’re looking for ways to get certified, or know someone who would benefit, check out our list of upcoming Cert Jams. Don’t have Sumo Logic, but want to get started? Sign up for Sumo Logic for free! Our Cert Jam hosted by Tealium in May. Everyone was so enthusiastic to be certified.
This post gives a brief overview of some ideas we presented at the recent Scale By the Bay conference in San Francisco; for more details, you can see a video of the talk or take a look at the slides. The Problems of Sensitive Data and Leakage Data science and machine learning have gotten a lot of attention recently, and the ecosystem around these topics is moving fast. One significant trend has been the rise of data science notebooks (including our own here at Sumo Logic): interactive computing environments that allow individuals to rapidly explore, analyze, and prototype against datasets. However, this ease and speed can compound existing risks. Governments, companies, and the general public are increasingly alert to the potential issues around sensitive or personal data (see, for example, GDPR). Data scientists and engineers need to continuously balance the benefits of data-driven features and products against these concerns. Ideally, we’d like technological assistance that makes it easier for engineers to do the right thing and avoid unintended data processing or revelation. Furthermore, there is also a subtle technical problem known in the data mining community as “leakage”. Kaufman et al. won the best paper award at KDD 2011 for Leakage in Data Mining: Formulation, Detection, and Avoidance, which describes how it is possible to (completely by accident) allow your machine learning model to “cheat” because of unintended information leaks in the training data contaminating the results. This can lead to machine learning systems which work well on sample datasets but whose performance is significantly degraded in the real world. This can be a major problem, especially in systems that pull data from disparate sources to make important predictions. Oscar Boykin of Stripe presented an approach to this problem at Scale By the Bay 2017 using functional-reactive feature generation from time-based event streams.
Information Flow Control (IFC) for Data Science My talk at Scale By the Bay 2018 discussed how we might use Scala to encode notions of data sensitivity, privacy, or contamination, thereby helping engineers and scientists avoid these problems. The idea is based on programming languages (PL) research by Russo et al., where sensitive data (“x” below) is put in a container data type (the “box” below) that is associated with some security level. Other code can apply transformations or analyses to the data in place (known as the Functor “map” operation in functional programming), but only specially trusted code with an equal or greater security level can “unbox” the data. To encode the levels, Russo et al. propose using the Lattice model of secure information flow developed by Dorothy E. Denning. In this model, the security levels form a partially ordered set with the guarantee that any given pair of levels will have a unique greatest lower bound and least upper bound. This allows for a clear and principled mechanism for determining the appropriate level when combining two pieces of information. In the Russo paper and our Scale By the Bay presentation, we use two levels for simplicity: High for sensitive data, and Low for non-sensitive data. To map this research to our problem domain, recall that we want data scientists and engineers to be able to quickly experiment and iterate when working with data. However, when data may be from sensitive sources or be contaminated with prediction target information, we want only certain, specially-audited or reviewed code to be able to directly access or export the results. For example, we may want to lift this restriction only after data has been suitably anonymized or aggregated, perhaps according to some quantitative standard like differential privacy.
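To make the boxed-data idea concrete, here is a minimal Scala sketch. The names (Sec, classify, reveal) are illustrative assumptions, not the exact code from the talk: sensitive values live in a type-indexed container that supports map, and de-classification requires a security-level capability that only trusted code brings into implicit scope.

```scala
// Security levels as types: High marks sensitive data, Low marks public data.
sealed trait Level
final class High extends Level
final class Low extends Level

// A secure box: code can transform the contents with map (the Functor
// operation) but cannot observe them without the right capability.
final class Sec[L <: Level, A] private (private val value: A) {
  def map[B](f: A => B): Sec[L, B] = new Sec(f(value))
  // De-classification: compiles only where an implicit L capability exists,
  // which audited, trusted code creates deliberately.
  def reveal(implicit cap: L): A = value
}

object Sec {
  def classify[L <: Level, A](a: A): Sec[L, A] = new Sec(a)
}
```

Ordinary code is free to map over a Sec[High, String] to hash or aggregate it, but a bare call to reveal on High-tagged data fails to compile unless an implicit High capability is in scope.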
Another use case might be that we are constructing data pipelines or workflows and we want the code itself to track the provenance and sensitivity of different pieces of data to prevent unintended or inappropriate usage. Note that, unlike much of the research in this area, we are not aiming to prevent truly malicious actors (internal or external) from accessing sensitive data; we simply want to provide automatic support to assist engineers in handling data appropriately.

Implementation and Beyond

Depending on how exactly we want to adapt the ideas from Russo et al., there are a few different ways to implement our secure data wrapper layer in Scala. Here we demonstrate one approach using typeclass instances and implicit scoping (similar to the paper), as well as two versions where we modify the formulation slightly to allow changing the security level as a monadic effect (i.e., with flatMap) having last-write-wins (LWW) semantics, and create a new Neutral security level that always “defers” to the other security levels, High and Low.

Implicit scoping

Most similar to the original Russo paper, we can create special “security level” object instances and require one of them to be in implicit scope when de-classifying data. (Thanks to Sergei Winitzki of Workday, who suggested this at the conference!)

Value encoding

For LWW flatMap, we can encode the levels as values. In this case, the security level is dynamically determined at runtime by the type of the associated level argument, and the de-classify method reveal() returns an Option[T], which is None if the level is High. This implementation uses Scala’s pattern-matching functionality.

Type encoding

For LWW flatMap, we can encode the levels as types. In this case, the compiler itself will statically determine whether reveal() calls are valid (i.e., against the Low security level type), and simply fail to compile code that accesses sensitive data illegally.
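As an illustration of the value-encoded variant described above, here is a small hypothetical sketch (again, illustrative names rather than the talk's actual code), with levels as runtime values, a last-write-wins flatMap, and reveal() returning an Option:

```scala
// Security levels as runtime values.
sealed trait Level
case object High extends Level
case object Low extends Level

final case class Tagged[A](value: A, level: Level) {
  // map transforms the data in place without changing its level.
  def map[B](f: A => B): Tagged[B] = Tagged(f(value), level)
  // flatMap may change the level; the inner step's level wins (LWW semantics).
  def flatMap[B](f: A => Tagged[B]): Tagged[B] = f(value)
  // Runtime de-classification via pattern matching: High data is never revealed.
  def reveal: Option[A] = level match {
    case Low  => Some(value)
    case High => None
  }
}
```

An audited anonymization step can lower the level, e.g. raw.flatMap(s => Tagged(s.hashCode.toString, Low)), after which reveal returns Some; on still-High data it returns None.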
This implementation relies on some tricks derived from Stefan Zeiger’s excellent Type-Level Computations in Scala presentation. Data science and machine learning workflows can be complex, and in particular there are often potential problems lurking in the data handling aspects. Existing research in security and PL can be a rich source of tools and ideas to help navigate these challenges, and my goal for the talk was to give people some examples and starting points in this direction. Finally, it must be emphasized that a single software library can in no way replace a thorough organization-wide commitment to responsible data handling. By encoding notions of data sensitivity in software, we can automate some best practices and safeguards, but it will necessarily only be a part of a complete solution. Watch the Full Presentation at Scale by the Bay
To gain a better understanding of the adoption and usage of machine data in Europe, Sumo Logic commissioned 451 Research to survey 250 executives across the UK, Sweden, the Netherlands and Germany, and to compare this data with a previous survey of U.S. respondents that were asked the same questions. The research set out to answer a number of questions, including: Is machine data in fact an important source of fuel in the analytics economy? Do businesses recognize the role machine data can play in driving business intelligence? Are businesses that recognize the power of machine data leaders in their fields? The report, “Using Machine Data Analytics to Gain Advantage in the Analytics Economy, the European Edition,” released at DockerCon Europe in Barcelona this week, reveals that companies in the U.S. are currently more likely to use and understand the value of machine data analytics than their European counterparts, but that Europeans lead the U.S. in using machine data for security use cases. Europeans Trail US in Recognizing Value of Machine Data Analytics Let’s dig deeper into the stats regarding U.S. respondents that stated they were more likely to use and understand the value of machine data analytics. For instance, 36 percent of U.S. respondents have more than 100 users interacting with machine data at least once a week, while in Europe, only 21 percent of respondents have that many users. Likewise, 64 percent of U.S. respondents said that machine data is extremely important to their company’s ability to meet its goals, with 54 percent of European respondents saying the same. When asked if machine data tools are deployed on-premises, only 48 percent of European respondents responded affirmatively, compared to 74 percent of U.S. respondents. The gap might be explained by the idea that U.S. businesses are more likely to have a software-centric mindset. According to the data, 64 percent of U.S.
respondents said most of their company had software-centric mindsets, while only 40 percent of European respondents said the same. Software-centric businesses are more likely to recognize that machine data can deliver critical insights, from both an operational and business perspective, as they are more likely to integrate their business intelligence and machine data analytics tools. Software-centric companies are also more likely to say that a wide variety of users, including heads of IT, heads of security, line-of-business users, product managers and C-level executives, recognize the business value of machine data. Europeans Lead US in Using Machine Data for Security At 63 percent, European companies are ahead of the U.S. in recognising the benefit of machine data analytics in security use cases. Given strict data privacy regulations in Europe, including the new European Union (EU) General Data Protection Regulation (GDPR), it only seems natural that security is a significant driver for machine data tools in the region. Business Insight Recognized by Europeans as Valuable Beyond security, other top use cases cited for machine data in Europe are monitoring (55 percent), troubleshooting (48 percent) and business insight (48 percent). This means Europeans are clearly recognizing the value of machine data analytics beyond the typical security, monitoring and troubleshooting use cases: they’re using it as a strategic tool to move the business forward. When IT operations teams have better insight into business performance, they are better equipped to prioritize incident response and improve their ability to support business goals. A Wide Array of European Employees in Different Roles Use Machine Data Analytics The data further show that, in addition to IT operations teams, a wide array of employees in other roles commonly use machine data analytics.
Security analysts, product managers and data analysts, some of whom may serve lines of business or senior executives, all appeared at the top of the list of roles using machine data analytics tools. The finding emphasizes that companies recognize the many ways that machine data can drive intelligence across the business. Customer Experience and Product Development Seen as Most Beneficial to Europeans Although security emerged as an important priority for users of machine data, improved customer experience and more efficient product development emerged as the top benefits of machine data analytics tools. Businesses are discovering that the machine data analytics tools they use to improve their security posture can also drive value in other areas, including better end-user experiences, more efficient and smarter product development, optimized cloud and infrastructure spending, and improved sales and marketing performance. Barriers Preventing Wider Usage of Machine Data The report also provided insight into the barriers preventing wider usage of machine data analytics. The number one capability that users said was lacking in their existing tools was real-time access to data (37 percent), followed by fast, ad hoc querying (34 percent). Another notable barrier to broader usage is the lack of capabilities to effectively manage different machine data analytics tools. European respondents also stated that the adoption of modern technologies makes it harder to get the data they need for speedy decision-making (47 percent). Whilst moving to microservices and container-based architectures like Docker makes it easier to deploy at scale, it is hard to effectively monitor activities over time without the right approach to logs and metrics in place. In Conclusion Europe is adopting modern tools and technologies at a slower rate than its U.S. counterparts, and fewer companies currently have a ‘software-led’ mindset in place.
Software-centric businesses are doing more than their less advanced counterparts to make the most of the intelligence available to them in machine data analytics tools. However, a desire for more continuous insights derived from machine data is there: the data show that once European organisations start using machine data analytics to gain visibility into their security operations, they start to see the value for other use cases across operations, development and the business. The combination of customer experience and compliance with security represents strong value for European users of machine data analytics tools. Users want their machine data tools to drive even more insight into the customer experience, which is increasingly important to many businesses, and at the same time help ensure compliance. Additional Resources Download the full 451 Research report for more insights Check out the Sumo Logic DockerCon Europe press release Download the Paf customer case study Read the European GDPR competitive edge blog Sign up for a free trial of Sumo Logic
The world is changing. The way we do business, the way we communicate, and the way we secure the enterprise are all vastly different today than they were 20 years ago. This natural evolution of technology innovation is powered by the cloud, which has not only freed teams from on-premises security infrastructure, but has also provided them with the resources and agility needed to automate mundane tasks. The reality is that we have to automate in the enterprise if we are to remain relevant in an increasingly competitive digital world. Automation and security are a natural pairing, and when we think about the broader cybersecurity skills gap, we really should be thinking about how we can replace simple tasks through automation to make way for teams and security practitioners to be more innovative, focused and strategic. A Dynamic Duo That’s why Sumo Logic and our partner, The Pokemon Co. International, are all in on bringing together the tech and security innovations of today and using those tools and techniques to completely redefine how we do security operations, starting with creating a new model for how a security operations center (SOC) should be structured and how it should function. So how exactly are we teaming up to build a modern-day SOC, and what does it look like in terms of techniques, talent and tooling? We’ll get into that, and more, in this blog post. Three Pillars of the Modern Day SOC Adopt Military InfoSec Techniques The first pillar is all about mindset and adopting a new level of rigor and way of thinking for security. Both the Sumo Logic and Pokemon security teams are built on the backbone of a military technique called the OODA loop, originally coined by John Boyd, a U.S. Air Force fighter pilot and Pentagon consultant of the late twentieth century. Boyd created the OODA loop to implement a change in military doctrine that focused on an air-to-air combat model.
OODA stands for observe, orient, decide and act, and Boyd’s thinking was that if you followed this model and ensured that your OODA loop was faster than that of your adversary’s, then you’d win the conflict. Applying that to today’s modern security operations, all of the decisions made by your security leadership — whether it’s around the people, process or tools you’re using — should be aimed at reducing your OODA loop to a point where, when a situation happens, or when you’re preparing for a situation, you can easily follow the protocol to observe the behavior, orient yourself, make effective and efficient decisions, and then act upon those decisions. Sound familiar? This approach is almost identical to most current incident response and security protocols, because we live in an environment where every six, 12 or 24 months we’re seeing more tactics and techniques changing. That’s why the SOC of the future is going to be dependent on a security team’s ability to break down barriers and abandon older schools of thought for faster decision making models like the OODA loop. This model is also applicable across an organization to encourage teams to be more efficient and collaborative cross-departmentally, and to move faster and with greater confidence in order to achieve mutually beneficial business goals. Build and Maintain an Agile Team But it’s not enough to have the right processes in place. You also need the right people that are collectively and transparently working towards the same shared goal. Historically, security has been full of naysayers, but it’s time to shift our mindset to that of transparency and enablement, where security teams are plugged into other departments and are able to move forward with their programs as quickly and as securely as they can without creating bottlenecks. 
This dotted-line approach is how Pokemon operates, and it’s allowed the security team to share information horizontally, which empowers development, operations, finance and other cross-functional teams to also move forward in true DevSecOps spirit. One of the main reasons this new and modern Sumo Logic security team structure has been successful is that it’s enabled each function (data protection/privacy, SOC, DevSecOps and federal) to work in unison not only with each other, but also cross-departmentally. In addition to knowing how to structure your security team, you also need to know what to look for when recruiting new talent. Here are three tips from Pokemon’s Director of Information Security and Data Protection Officer, John Visneski: Go Against the Grain. Unfortunately, there are no purple security unicorns out there. Instead of searching for the “ideal” security professional, go against the grain: find people with the attitude and aptitude to succeed, regardless of direct security experience. The threat environment is changing rapidly, and burnout can happen fast, which is why it’s more important to have someone on your team with those two qualities. Why? No one can know everything about security, and sometimes you have to adapt and throw old rules and mindsets out the window. Prioritize an Operational Mindset. QA and test engineers are good at automation and at finding gaps in seams, skills that are very applicable to security. Some of Pokemon’s best security engineers didn’t know a thing about security before joining, but they had valuable skill sets. Find talent pools that know how the sausage is made: the best and brightest security professionals often didn’t start out in security, but their value-add is that they are problem solvers first and security pros second. Think Transparency. The goal is to get your security team to a point where they’re sharing information at a rapid enough pace and integrating themselves with the rest of the business.
This allows for core functions to help solve each other’s problems and share use cases, and it can only be successful if you create a culture that is open and transparent. The bottom line: Don’t be afraid to think outside of the box when it comes to recruiting talent. It’s more important to build a team based on want, desire and rigor, which is why bringing in folks with military experience has been vital to both Sumo Logic’s and Pokemon’s security strategies. Security skills can be learned. What delivers real value to a company are people that have a desire to be there, a thirst for knowledge and the capability to execute on the job. Build a Modern Day Security Stack Now that you have your process and your people, you need your third pillar: tool sets. This is the Sumo Logic reference architecture that empowers us to be more secure and agile. You’ll notice that all of these providers are either born in the cloud or are open source. The Sumo Logic platform is at the core of this stack, but it’s these partnerships and tools that enable us to deliver our cloud-native machine data analytics as a service, and provide SIEM capabilities that easily prioritize and correlate sophisticated security threats in the most flexible way possible for our customers. We want to grow and transform with our customers’ modern application stacks and cloud architectures as they digitally transform. Pokemon has a very similar approach to its security stack: The driving force behind Pokemon’s modern toolset is the move away from the old-school customer mentality of presenting a budget and asking for services. The customer-vendor relationship needs to mirror a two-way partnership with mutually invested interests and clear benefits on both sides. Three vendors (AWS, CrowdStrike and Sumo Logic) comprise the core base of the Pokemon security platform, and the remainder of the stack is modular in nature.
This plug-and-play model is key as the security and threat environments continue to evolve, because it allows for flexibility in swapping new vendors and tools in and out as they come along. As long as the foundation of the platform is strong, the rest of the stack can evolve to match the current needs of the threat landscape. Our Ideal Model May Not Be Yours We’ve given you a peek behind the security curtain, but it’s important to remember that every organization is different, and what works for Pokemon or Sumo Logic may not work for every particular team dynamic. While you can use our respective approaches as a guide to implement your own modern-day security operations, the biggest takeaway here is to find a framework that is appropriate for your organization’s goals and that will help you build success and agility within your security team and across the business. The threat landscape is only going to grow more complex, technologies more advanced and attackers more sophisticated. If you truly want to stay ahead of those trends, then you’ve got to be progressive in how you think about your security stack, teams and operations. Because regardless of whether you run an on-premises, hybrid or cloud environment, the industry and business are going to leave you no choice but to adopt a modern application stack whether you want to or not. Additional Resources Learn about Sumo Logic's security analytics capabilities in this short video. Hear how Sumo Logic has teamed up with HackerOne to take a DevSecOps approach to bug bounties in this SnapSecChat video. Learn how Pokemon leveraged Sumo Logic to manage its data privacy and GDPR compliance program and improve its security posture.