Data-Centric Security: An In-Depth Look at Data Security Defense

Last year, organizations saw the highest average cost of a data breach in 17 years, with costs rising from $3.86 million USD in 2020 to $4.24 million USD in 2021. As a result, overburdened security teams are consistently trying to stay on top of the latest threats, vulnerabilities and hacker tactics.

Complicating matters even more is the rapid adoption of the public cloud, with spending surging from $270 billion USD in 2020 to an estimated $397 billion USD in 2022. This fast-paced digital transformation has paved the way for the rapid development of digital products and services. Unfortunately, the cloud has also blurred the security perimeter and opened up more opportunities for attackers to exploit data.

While chasing down cyber adversaries and attempting to reduce the opportunities for hackers to attack seems like the right step to take, many security teams are missing the point: the data itself.

Be Your Data’s Guardian in the Cloud

Data has gone from a commodity to a currency. As a result, it is just as valuable to attackers as it is to businesses. Having a solid understanding of the latest cyberthreats is important, but just as critical is the fact that security teams are almost blind when it comes to data residing in public cloud infrastructure, thanks to the sprawl of cloud services and the pace of DevOps-driven change.

And how can you protect what you can’t see?

There are three critical steps that security teams must put into motion if they want to maintain efficient and adequate visibility of sensitive data within public cloud environments and ultimately bolster security posture.

1. Find a Cloud-Native Security Tool

In cybersecurity, there is no one-size-fits-all solution. The cloud, however, is an ever-changing environment, which means the solutions must change too. Solutions can now be built into an organization’s public cloud infrastructure to combat data breaches by autonomously discovering data stores and continuously analyzing and remediating risks and leaks. Too often, data security professionals and leaders find themselves unable to see the full picture of their data. Ensuring your security solution integrates with your cloud infrastructure allows for seamless adoption and visibility, identifying data that resides in the shadows.

As the public cloud expands and grows ever more bloated with data, CISOs everywhere worry about their unknown and unprotected data stores. Criminals are capitalizing on this rapidly changing landscape and repeatedly breaking into these systems; our defenses must adapt.

2. Monitor and Protect Your Treasured Data

As previously mentioned, a company’s sensitive data can and will be copied and backed up. It is an organization’s responsibility to ensure that this data is properly monitored and protected. Meeting this responsibility requires understanding the data: what it is, where it lives, and where it is going. Security relies heavily on known variables; a solution without full visibility compromises the entire organization’s security posture.

Whether the cause is accidental or intentional, human actions can inflict devastating financial and reputational losses on a company. Up to 85 percent of data breaches now have a human element. Organizations must understand their data exposure at all times: who is inside their systems and why they are accessing public cloud data. Otherwise, they risk losing their treasure trove.
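
To make that idea concrete, here is a minimal sketch, assuming an AWS environment and the boto3 SDK, of how a team might check which S3 buckets are reachable by outside principals. The helper name and the two checks are illustrative assumptions, not a complete access review.

```python
# Illustrative sketch (not a full access review): flag S3 buckets that outside
# principals can reach, assuming AWS credentials and the boto3 SDK are available.
import json
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

PUBLIC_ACL_URIS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_exposure(bucket_name):
    """Return reasons a bucket may be reachable by outsiders (hypothetical helper)."""
    reasons = []

    # ACL grants to the AllUsers / AuthenticatedUsers groups mean public access.
    acl = s3.get_bucket_acl(Bucket=bucket_name)
    for grant in acl["Grants"]:
        if grant["Grantee"].get("URI") in PUBLIC_ACL_URIS:
            reasons.append(f"ACL grants {grant['Permission']} to {grant['Grantee']['URI']}")

    # A bucket policy with a wildcard principal opens the door to anyone.
    try:
        policy = json.loads(s3.get_bucket_policy(Bucket=bucket_name)["Policy"])
        for stmt in policy.get("Statement", []):
            if stmt.get("Effect") == "Allow" and stmt.get("Principal") in ("*", {"AWS": "*"}):
                reasons.append(f"policy allows {stmt.get('Action')} to any principal")
    except ClientError as err:
        # No policy attached is fine; anything else is a real error.
        if err.response["Error"]["Code"] != "NoSuchBucketPolicy":
            raise

    return reasons

for bucket in s3.list_buckets()["Buckets"]:
    findings = public_exposure(bucket["Name"])
    if findings:
        print(bucket["Name"], findings)
```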

3. Always Have A Plan

The Achilles’ heel of cybersecurity is too often a leadership team with its head in the sand. Far too many organizations believe themselves immune to the ransomware crisis looming over industries across the board. It is essential to have an incident response plan and team in place, with the plan spelling out in detail the role each core pillar of the organization plays during an ongoing crisis. Proactive monitoring of the crown jewels lets security teams be notified of abnormalities and access risks in ways that were not possible a few years ago. This zero-trust approach to data reduces human error and puts more power in the hands of security operations centers.

A Data-Centered World

Accepting that technology is fluid and ever-changing will dramatically help security teams and leaders protect an organization. The cloud is here to stay, and it is being relied upon even more as the pandemic continues and teams keep working off-site. Finding the appropriate solution for an organization’s security needs must therefore become a foundational priority moving forward. Personal and corporate data should be protected as the treasure trove it truly is. Efficient and effective public cloud solutions should be able to monitor and protect data silos, revealing what data is hiding in the shadows. And it is worth remembering how costly it is, in both revenue and reputation, when cyber adversaries sell that treasure trove of data to the highest bidder.

The Future of Data Security: Data-Centric Security

Data protection and cloud security have enterprises running on a giant hamster wheel. They know that they are practically blind when it comes to where sensitive data is in the cloud and how well it’s protected. Meanwhile, data protection teams are crying out for a way to gain a complete and accurate view of their data. It doesn’t seem like such a tall order, considering that data is at the center of cloud transformation, no matter how you slice it. Yet some companies are still living in the dark ages of cloud security, blissfully unaware of their assets in the cloud.

Setting the Scene

If innovation were a Hollywood movie, data would be the lead actor. Data is inarguably the most critical piece of the puzzle when it comes to innovation within the modern cloud-first enterprise. Most business leaders have wrapped their heads around this concept and recognize the facts; they agree that, “In order to give my developers and data scientists the tools they need to innovate, our data must be democratized, and we must be able to support new applications in the cloud.” While most businesses understand that data is important, that it’s critical to protect, and that it is a source of differentiation, they often fall short of understanding what exactly is involved in effective data security. Especially when it comes to sensitive data stored in the cloud, many security teams are still in the dark.

This misunderstanding—or possibly misinformation—leads enterprise leaders to rely on traditional methods of data security. Outdated technology hasn’t adjusted to the new cloud-native environment. This means that data security and privacy workflows, reviews, committees and assessments are all manual. Herein lies a tremendous growth opportunity.

We could discuss the problems with current approaches until we’re blue in the face. Alert fatigue, FUD, friction with developers and, of course, exposure to data exfiltration and security risks are holding organizations back from reaching their full “cloud potential.” While recent approaches, such as Cloud Security Posture Management (CSPM) tools, have brought some useful capabilities for cloud infrastructure—VMs, containers and the like—they don’t address the needs of data security teams, who have been left in the dust. Traditional data security solutions and manual processes haven’t adjusted to the new cloud-first environment, which makes the work of the modern analyst much more challenging and, most significantly, has positioned analysts as “gatekeepers” rather than “enablers” of business and innovation.

Stuck in the Past

Legacy data security suites have left enterprises ignorant of what sensitive and regulated data they have in the public cloud, and that ignorance undermines several components of a data security strategy. First, teams are left conducting manual, periodic interviews with application owners to identify sensitive data stores. Because the cloud environment is agile and dynamic and developers and data scientists can copy data any time they want, the results are out of date within days. This is all in a failed effort to determine where their sensitive data lives in the cloud. They’re stuck in a “trust but don’t verify” approach that is completely manual and unable to keep up with the speed of the cloud.

Second, when securing and controlling cloud data, they often rely on written policies with little to no enforcement. Instead of automated approaches to enforcing policies, they have to trust that developers will understand standard policies and properly implement them, and they are drawn into laborious compliance audits that can easily leave gaps. Lastly, legacy data loss prevention (DLP) solutions only cover email, endpoints and on-premises infrastructure, which means data security teams have limited to zero visibility into potentially ruinous data leaks in IaaS and PaaS environments.
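
As a contrast to written-but-unenforced policies, here is a minimal policy-as-code sketch, assuming AWS and the boto3 SDK. The two rules (block public access, require default encryption) are illustrative assumptions, not an exhaustive control set, and the function name is hypothetical.

```python
# Illustrative policy-as-code sketch: instead of a written policy that says
# "all buckets must block public access and be encrypted," the check runs itself.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def check_bucket(name):
    """Return policy violations for one bucket (hypothetical helper)."""
    violations = []

    # Rule 1: every bucket should have a public access block that blocks everything.
    try:
        cfg = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        if not all(cfg.values()):
            violations.append("public access block only partially enabled")
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            violations.append("no public access block configured")
        else:
            raise

    # Rule 2: every bucket should have default server-side encryption.
    try:
        s3.get_bucket_encryption(Bucket=name)
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            violations.append("no default encryption configured")
        else:
            raise

    return violations

for bucket in s3.list_buckets()["Buckets"]:
    problems = check_bucket(bucket["Name"])
    if problems:
        print(f"{bucket['Name']}: {', '.join(problems)}")
```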

Third, security teams may be seen by others within the organization as a hindrance to business and innovation. While leadership can preach the importance of security to stakeholders all day, there can be a disconnect between security teams and other decision-makers when it comes to acting on the platitude. Security might seem like a great idea “when we get to it,” but when it comes to enacting security best practices, leadership might not want to risk a disruption that would slow or even stop a project. Without the right tools in place, security teams can feel like they are fighting an uphill battle, which is discouraging and can lead to pertinent issues being neglected just to reduce friction with colleagues.

A Place in the Future

Where does this leave the modern enterprise that wants to gain a complete and accurate view of all assets on the cloud to move innovation forward? A cloud-native, data-centric approach will take organizations from the past to the future of data management and protection. Let’s break down the components of a forward-thinking, modern approach to cloud security.

Eat, Sleep, Breathe Cloud

There’s no arguing that the cloud is integral to most businesses today. Thus, a modern data management approach must start by integrating fully with the public cloud itself, using modern, cloud-native approaches. Within virtually every enterprise are hundreds of technologies and apps that store, use and share data in the cloud. These data stores can be managed by cloud service providers (AWS S3 buckets, Google Cloud Storage, Azure Blob Storage, etc.), by IT (AWS RDS) and even by developers or operations teams (a database that runs on an EC2 instance or a Kubernetes node), and each technology is configured and used differently from day to day. These architectures are complex, dynamic and constantly changing, which increases risk dramatically compared with legacy data management.
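
As an illustration of that spread, here is a minimal inventory sketch, assuming AWS and the boto3 SDK. It covers only the provider-managed (S3) and IT-managed (RDS) stores named above; self-hosted databases on EC2 or Kubernetes are left as a comment, since they are invisible to these control-plane APIs. The function name and output shape are assumptions for the example.

```python
# Illustrative inventory sketch: enumerate the data stores the cloud control
# plane knows about, assuming AWS credentials and the boto3 SDK.
import boto3

def inventory_data_stores(region="us-east-1"):
    stores = []

    # Provider-managed object storage: S3 buckets are account-wide.
    s3 = boto3.client("s3", region_name=region)
    for bucket in s3.list_buckets()["Buckets"]:
        stores.append({"type": "s3_bucket", "name": bucket["Name"]})

    # IT-managed databases: RDS instances in this region.
    rds = boto3.client("rds", region_name=region)
    for db in rds.describe_db_instances()["DBInstances"]:
        stores.append({
            "type": "rds_instance",
            "name": db["DBInstanceIdentifier"],
            "engine": db["Engine"],
            "publicly_accessible": db["PubliclyAccessible"],
        })

    # Developer-managed databases (self-hosted on EC2 or Kubernetes) never show
    # up in these control-plane calls; finding them is exactly the "shadow data"
    # problem this post is about.
    return stores

if __name__ == "__main__":
    for store in inventory_data_stores():
        print(store)
```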

For this reason, a cloud-native tool or application is critical for companies seeking a place in the future. A cloud-native tool is designed to capitalize on the characteristics of the cloud computing software delivery model, using the cloud service provider’s (CSP) native APIs, which are built for exactly these needs. While cloud-native data security solutions aren’t mainstream yet, they’re gaining traction among larger, established organizations that recognize their value and their unique ability to more deeply discover, classify, secure and control the data that lives in the cloud.

Full Visibility

If security teams don’t know where their sensitive data is or who has access to it, and can’t understand the risk posture associated with particular assets, how can they expect to learn about leaks and vulnerabilities in a timely manner? Gaining deep, all-encompassing visibility into every piece of organizational data stored in the cloud—whether the asset is managed by the cloud provider, whether it’s a formal data store or sits in compute, and whether it’s public or isolated—and continuously monitoring the movement and management of that data is the most effective way to stay nimble and reduce the attack surface. For companies living in the “future” of cloud data management, this means connecting security tools directly to their cloud accounts to agentlessly scan the entire cloud environment and autonomously discover all data stores. Autonomous discovery is critical because cloud environments are agile and dynamic, and security teams, and even application developers, are often unaware of the thousands of data assets that typically live in their cloud accounts. Many data management solutions will automatically scan known data stores when given the right credentials, but only autonomous solutions discover ALL resources without prior knowledge of the environment. Achieving this level of visibility without disrupting workflows goes a long way toward moving security teams from gatekeepers to business enablers.

Putting the Pieces Together

Collecting and analyzing all data assets is just the first step toward a more advanced, forward-thinking approach to data security in the cloud. Modern cloud-native solutions can also autonomously scan all of those discovered pieces of data to understand where to focus first—the most sensitive data and the most critical issues—and present that information to security analysts. Cloud-native tools can also autonomously scan audit logs, network flow logs and other data sources in order to build a profile for every data access point. This agentless approach lets data security teams detect leaks and remediate them faster by analyzing access logs for anomalous activity and monitoring unwanted data access in real time. Cloud security teams are no longer stuck in an environment of alert fatigue and burnout, because they finally have eyes on all of their sensitive data at any given moment.
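
The access-profiling idea can be boiled down to a toy example. The sketch below uses hypothetical access records (principal, data store) rather than real CloudTrail or flow logs, and simply flags any access to a data store a principal has never touched before; a production system would use a far richer behavioral baseline.

```python
# Toy sketch of "profile every access point, then flag anomalies."
# The record format and example principals are hypothetical.
from collections import defaultdict

def build_baseline(history):
    """Map each principal to the set of data stores it normally touches."""
    baseline = defaultdict(set)
    for event in history:
        baseline[event["principal"]].add(event["data_store"])
    return baseline

def detect_anomalies(baseline, new_events):
    """Flag accesses to data stores a principal has never touched before."""
    return [e for e in new_events
            if e["data_store"] not in baseline.get(e["principal"], set())]

history = [
    {"principal": "app-server-role", "data_store": "orders-db"},
    {"principal": "analyst-jane", "data_store": "reporting-bucket"},
]
incoming = [
    {"principal": "analyst-jane", "data_store": "customer-pii-bucket"},  # new for Jane
    {"principal": "app-server-role", "data_store": "orders-db"},         # business as usual
]

for alert in detect_anomalies(build_baseline(history), incoming):
    print("anomalous access:", alert)
```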

Without the right tools, today’s security professionals will continue to live in fear of the unknown, such as unknown data repositories (what we call shadow data) that can be targeted with the lowest odds of detection. Security teams are afraid of being out of the loop and susceptible to breaches, which creates tension between security teams and the rest of the enterprise. But with the right tools, security teams can champion digital transformation and innovation and truly become heroes within their organization.

Cloud Data Security: The Cost of Doing Nothing

The world has changed dramatically over the past couple of years—especially in the areas of business and technology. The COVID pandemic accelerated digital transformation and forced a shift to a remote or hybrid business model, leading to a significant spike in the adoption of public cloud services. Gartner estimates that public cloud services spending will increase by 47% from $270 billion in 2020 to a projected $397 billion in 2022.

Cloud services and data have been essential for keeping remote workers productive, but they are also a double-edged sword that can lead to complexity, excessive costs and unnecessary security risks. With more and more data stored in the cloud, it is becoming increasingly challenging for IT teams to effectively manage and protect it all. The result is a breach culture that is only going to get worse, and the cost of maintaining the status quo, of doing nothing, increases daily.

Cloud Data Challenges

Cloud platforms and services have been a lifesaver for businesses during the pandemic. Companies have embraced cloud services to provide accessibility, streamline productivity and increase operational resilience for employees working remotely.

However, for most organizations, the rapid adoption of cloud services came with consequences as well. Visibility was sacrificed and security was compromised in the name of expedience. The percentage of corporate data stored in the cloud has doubled from 30% in 2015 to 60% in 2022 and continues to grow.

This data sprawl results in unknown and unmonitored data stores. Cloud services and DevOps practices let end users self-provision applications and services and allow developers to spin up new databases at the push of a button. Our State of Public Cloud Data Security Report 2022 found that less than half (49%) of survey respondents have full visibility when developers spin up a new data repository. More than a third (35%) reported partial visibility, while 12% indicated they have no visibility at all.
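
One hedged illustration of how a team could close that visibility gap, assuming an AWS environment: poll CloudTrail for the control-plane events that create new buckets and databases, so a new repository shows up in security’s inventory within hours instead of never. The event list and function name below are assumptions for the sketch.

```python
# Illustrative sketch: surface newly created data repositories from CloudTrail,
# assuming AWS credentials and the boto3 SDK.
from datetime import datetime, timedelta
import boto3

CREATION_EVENTS = ["CreateBucket", "CreateDBInstance", "CreateTable"]

def recent_data_store_creations(hours=24):
    cloudtrail = boto3.client("cloudtrail")
    start = datetime.utcnow() - timedelta(hours=hours)
    findings = []
    for event_name in CREATION_EVENTS:
        pages = cloudtrail.get_paginator("lookup_events").paginate(
            LookupAttributes=[{"AttributeKey": "EventName", "AttributeValue": event_name}],
            StartTime=start,
        )
        for page in pages:
            for event in page["Events"]:
                findings.append({
                    "event": event["EventName"],
                    "who": event.get("Username", "unknown"),
                    "when": event["EventTime"].isoformat(),
                })
    return findings

for finding in recent_data_store_creations():
    print(finding)
```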

The Not-So-Hidden Cost of ‘Shadow Data’

This complexity and lack of visibility result in “shadow data”: test environments, cloud data store backups, remnants of cloud data migrations, data logs and other artifacts that consume resources.

Unfortunately, cloud data storage is not free. There is a real cost of unknown and unnecessary cloud storage. One of our customers acknowledged that they will save $115,000 per year by eliminating shadow data and consolidating data stores.

Shadow data also carries the cost of added risk. These unknown data stores often contain sensitive information such as customer or employee data, financial information, intellectual property, or other classified or confidential information.

Unfortunately, the mantra that “you can’t protect what you can’t see” is very true. IT and data security teams can’t possibly enforce policies, monitor access or protect data of which they are unaware. That is why more than 4 out of 5 (82%) of the senior data security professionals we surveyed revealed they are concerned or very concerned about shadow data.

The pace and scale of data breaches continue to grow—along with the cost of being breached. According to the annual Cost of a Data Breach Report from the Ponemon Institute, the average cost to remediate a breach was $4.24 million in 2021. In addition, the 2021 Data Breach Investigations Report from Verizon found that 90% of data breaches target the public cloud.

There is also a less tangible efficiency cost when cloud data visibility is lacking. With a clear understanding of where data is stored, IT and data security teams can focus on speed of access, which improves productivity and streamlines operations.

Doing Nothing Is Costly

There are significant and essential benefits from cloud services and applications. Even if everything could somehow go back to the way it was before the pandemic, few businesses would choose to do so. While the circumstances that drove much of the digital transformation and cloud adoption over the past couple of years have been tragic, they forced organizations to make changes that have resulted in improved productivity and efficiency.

However, shadow data and cloud data security are very real problems. In a best-case scenario, shadow data results in unnecessary expenses for cloud storage resources, and in a worst-case scenario, the unprotected data could lead to millions of dollars in costs to remediate and recover from a data breach.

With cloud services here to stay, what matters is that data security teams have the visibility and tools to effectively mitigate and manage cloud data risks. Organizations need cloud-native security that automatically and autonomously discovers all data across the cloud ecosystem and recognizes and classifies PII (personally identifiable information) and other sensitive data. Autonomy is important because these assets are often unknown to the data security team. Once they have full visibility, teams can prioritize data stores based on sensitivity and exposure to risk, enforce data security policies, and monitor data access and egress to detect and alert on suspicious activity.
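
The classification and prioritization steps can be sketched in a few lines. The example below is deliberately simplified: pattern-based detection of a few common PII types plus a naive sensitivity-and-exposure score used to rank stores. Real classifiers rely on much richer validation, context and machine learning; the patterns, field names and scoring here are assumptions.

```python
# Toy sketch of "classify PII, then prioritize by sensitivity and exposure."
import re

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(sample_text):
    """Return the PII types found in a sample of a data store's contents."""
    return {label for label, pattern in PII_PATTERNS.items() if pattern.search(sample_text)}

def prioritize(data_stores):
    """Rank stores: more PII types and public exposure push a store up the list."""
    def score(store):
        return len(store["pii_types"]) + (2 if store["publicly_accessible"] else 0)
    return sorted(data_stores, key=score, reverse=True)

stores = [
    {"name": "marketing-backup", "publicly_accessible": True,
     "pii_types": classify("contact: jane.doe@example.com")},
    {"name": "build-artifacts", "publicly_accessible": False,
     "pii_types": classify("no personal data here")},
]

for store in prioritize(stores):
    print(store["name"], store["pii_types"])
```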

Maintaining the status quo is a bad strategy. The cost of doing nothing is an expense few organizations can afford.

Best Practices for Effective Cloud Data Security

Digital transformation and the shift to the cloud have accelerated in the past couple of years due to COVID-19 and the remote, work-from-home business model. Gartner projects that companies will spend nearly $400 billion on public cloud platforms in 2022.

The more organizations embrace the cloud, the more data is stored in the cloud. More than 60% of corporate data is stored in the cloud today, which is double what it was just seven years ago—and that figure continues to grow.

There is enterprise data security for on-premises environments and cloud security for infrastructure, but nothing secures the data itself across everything you build and run in the cloud. While developers and data scientists have free rein to capture, copy and manipulate sensitive data in public cloud environments, security and data teams have lost visibility and have far less control.

Challenges of Protecting Data in the Cloud

Adapting to the cloud has created a number of unique pain points for organizations in terms of data protection. For one, there is a serious lack of visibility for the IT teams tasked with data security. Multiple departments can use SaaS (Software-as-a-Service) applications and cloud storage platforms, and developers can spin up new databases without the knowledge or consent of IT. The net result is that there is no consolidated view of data across the environment.

The problem is exacerbated by a lack of context, which leads to an inefficient allocation of resources. After all, not all data is created equal. Some data is more sensitive or confidential and deserves greater protection. Still, security controls are often applied uniformly across the entire environment rather than by understanding the context and prioritizing data security efforts accordingly.

Cloud computing and digital transformation have dramatically expanded the exposed attack surface that IT teams need to defend. The exposure of data across a hybrid or multi-cloud environment, combined with the lack of comprehensive visibility, makes it impossible to assess your data security posture accurately. The complexity of the environment also makes it virtually impossible to monitor for attacks in progress or detect data leaks effectively.

Protecting data across an increasingly complex web of platforms and applications is a challenge. Organizations need to find the balance and take advantage of the agility and scalability of cloud computing without sacrificing data security.

Cloud Data Security Methodology

The Laminar Cloud Data Security Methodology provides a framework and strategy for ensuring data security in the cloud. Effective data protection rests on four primary pillars: Discover, Prioritize, Secure, and Monitor (a skeletal code sketch of this loop follows the list):

  • Discover: It seems both obvious and trite, but the reality is that you cannot protect what you can’t see. Effective data security in the cloud begins with knowing what data you have, who owns it, and where it is located. Data security and data governance both require that you have a way to find, characterize and classify known data and “shadow” or unknown data across your entire environment.
  • Prioritize: Once you have comprehensive visibility of your data, you need to understand the context of the data and prioritize protection accordingly. You should analyze the data and where and how it is used and allocate data security based on a variety of factors, including the sensitivity of the data, the current security posture, governance, and compliance mandates and exposure.
  • Secure: You need to strengthen and maintain your data security posture. This means reducing and minimizing the attack surface and enforcing data security best practices and established data policies.
  • Monitor: There is no perfect defense. Attacks will still happen despite data policies and best practices. Effective cloud data security also requires vigilance—detecting new data assets or changes to existing assets. The IT teams tasked with data security should continuously monitor the environment for access anomalies and indications of data leaks or compromise.
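
As mentioned above, the four pillars fit together as a continuous loop. The skeleton below only shows the shape of that workflow; every function body is a hypothetical stub, and the data model is an assumption for the example.

```python
# Skeletal sketch of the Discover / Prioritize / Secure / Monitor loop.
from dataclasses import dataclass, field

@dataclass
class DataStore:
    name: str
    sensitivity: int = 0          # e.g., count of sensitive record types found
    exposed: bool = False         # e.g., public or overly broad access
    findings: list = field(default_factory=list)

def discover() -> list[DataStore]:
    """Enumerate every data store in the environment, known or shadow (stub)."""
    return [DataStore("customer-db", sensitivity=3), DataStore("old-migration-dump", exposed=True)]

def prioritize(stores: list[DataStore]) -> list[DataStore]:
    """Order stores so the most exposed and most sensitive come first."""
    return sorted(stores, key=lambda s: (s.exposed, s.sensitivity), reverse=True)

def secure(store: DataStore) -> None:
    """Enforce policy: tighten access, enable encryption, flag stale copies (stub)."""
    if store.exposed:
        store.findings.append("restrict public access")

def monitor(stores: list[DataStore]) -> None:
    """Watch for new assets, changed assets, and anomalous access or egress (stub)."""
    for store in stores:
        print(f"watching {store.name}: {store.findings or 'no findings'}")

if __name__ == "__main__":
    ranked = prioritize(discover())
    for store in ranked:
        secure(store)
    monitor(ranked)
```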

Effective Cloud Data Security

The cloud is not optional at this point. Organizations need to take advantage of its accessibility, agility, scalability and cost-efficiency to remain competitive. However, it is also important to effectively manage security and data protection across this expanding and increasingly complex environment.

Cloud-native data requires cloud-native protection and data-centric cloud security. Modern cloud data protection solutions must go beyond identity and access management and basic controls for accessing cloud applications and services, and address the unique challenges of protecting data in the cloud.

Organizations need complete data observability for everything in their hybrid, multi-cloud environments. Data protection teams need tools that autonomously discover and classify new data stores for complete visibility, prioritize risk based on data sensitivity and risk posture, secure data by remediating weak controls, and actively monitor for egress and access anomalies. The Cloud Data Security Methodology is a crucial component of that strategy: it enables data security teams to reduce the attack surface, detect data leaks in real time, and regain control over their data.

Learn more about how customers have successfully implemented Laminar’s cloud data security methodology in just six weeks. Schedule a demo today!

Hello World, Meet Laminar

Data protection at the speed of cloud

There are rare moments while working on a problem when a much more significant, fundamental problem arises. When this happens, you drop everything and dedicate yourself to the issue that matters most. This is the story of how we founded Laminar.

Right before the pandemic hit in 2020, I was leading the project to develop Magic Leap’s first cloud API. Our API’s input data was everything the user could hear and see while wearing our AR device: probably the most sensitive personal data one can imagine. We knew that this project would never see the light of day without best-in-class data security and privacy, so I started looking for off-the-shelf solutions. We had a standard cloud infrastructure setup and a modern application architecture, yet when trying to ensure our users’ data security and privacy, I could not find any solution for preventing data leakage and exfiltration from the cloud.

Faced with an unsolved technical problem, I went about solving it the same way I had since age 14: I asked my friend Oran (now my co-founder and CTO at Laminar) what he recommended. At the time, Oran was the first employee and Chief Architect at Medigate. They had the same problem in a completely different industry, handling health information, and there was no good solution for it.

“data protection has fallen behind in the data democratization movement”

We then spoke with over one hundred CISOs about this problem and their attempts to solve it. The universal feedback from these conversations was simple, clear and very consistent: data protection has fallen behind in the data democratization movement, and security teams must find a way to gain control without slowing down the business. The cloud infrastructure transformation amplified the problem by making development far easier and data protection far harder. Security teams have lost track of where their sensitive data is, who has access to it, and where the data is going. Without visibility, traditional data security policies could not be carried over to public cloud infrastructure, and teams had no way of detecting data leakage from this new environment.

We knew it didn’t have to be this way. We envisioned a different kind of world, where data protection could move at the speed of cloud and become an enabler for the organization. We felt that data security should ambiently fit into the day-to-day fabric of cloud development and seamlessly speak the same language developers do. We felt that data protection teams too deserve a solution that is cloud-native by nature: agentless, asynchronous, frictionless.

We envisioned a different kind of world, where data protection could move at the speed of cloud

It was time for a change. We dropped everything, left our jobs, and founded Laminar with a clear mission in mind: enabling organizations to unleash the full potential of data securely and transparently. A world where data can have laminar flow, not interrupted or held back by security measures, yet still protected. We have developed the industry’s first cloud-native data protection platform for organizations that process sensitive data in a public cloud environment. Laminar delivers complete data observability across the entire cloud stack, allowing customers to continuously Discover and Classify data for complete visibility, Secure and Control data to improve risk posture, and Detect Leaks and Remediate them without interrupting data flow. Agentless, asynchronous monitoring of data stores, compute and data egress channels allows sanctioned data movement and raises alerts when something is wrong. Data protection teams can reduce the attack surface, detect data leaks in real time and get back in control of their data.

“we founded Laminar with a clear mission in mind: enabling organizations to unleash the full potential of data securely and transparently”

I am lucky enough to have Oran – one of the world’s leading security experts and a four-time Google Capture The Flag (CTF) winner – as my lifelong friend. Together, we quickly built a fantastic team of incredibly talented individuals, and we have been innovating together for the past year in a truly magical environment, coming to the office every day simply because it is so much fun working together.

At Laminar, we are also proud to be backed by Insight Partners. Our ability to use their vast network and meet with hundreds of Global 2000 executives to get a better understanding of the problem and the market before we designed the perfect solution continues to be invaluable.

As we embark on this journey together, I look forward to sharing more about Laminar’s vision, product, market momentum, culture, and values. For now, I invite you to check out our Company Manifesto and some of the assets on our website. If we can help you protect your data in the new public cloud environment, please reach out. We’d love to talk with you.

Hello World, meet Laminar.

We are enabling data protection at the speed of cloud.

Amit Shaked