The Gartner Hype Cycle for Data Security 2022 Featured DSPM – What Does It Mean for Your Business?

Gartner’s Hype Cycle for Data Security, 2022 highlighted an area of innovation that data and security professionals should keep their eye on: data security posture management (DSPM). According to the report, DSPM is a response to the increasingly complex multi-cloud infrastructures security and IT teams are managing today. These ecosystems often consist of disparate platforms, hundreds of cloud services, varying data formats, tons of access points, and, of course, the vast volume of data generated by countless systems and users every day.

With increasing data innovation and agile cloud development, data proliferates quickly, and data security teams no longer have a holistic view of an organization’s data in the cloud. This includes: 

  • where it is stored
  • who has access
  • whether it’s secure
  • whether it’s in compliance with relevant regulations

Without a holistic view, teams are likely to face unprotected, overexposed, misplaced, redundant, and unknown shadow data, which can—and often does—include sensitive data. With that in mind, it’s no surprise that nearly half (45%) of all data breaches happen in the cloud, the average cost of which is over $9 million USD in the United States.

Gartner emphasizes the importance of DSPM because it “provides visibility as to where sensitive data is, who has access to that data, how it has been used, and what the security posture of the data store or application is.” This data visibility empowers data security professionals to assess risk, establish and enforce security policies, meet compliance requirements, remediate vulnerabilities, and more.

In this Hype Cycle, DSPM is located in the Technology Trigger phase, meaning a problem has been identified, but few understand the full scope of the problem, and fewer still know how to address it. So what does it actually mean for you and your business?

DSPM for Leadership

Achieving organization-wide buy-in on data security initiatives can be challenging. Infrastructure security is clearly in the domain of IT, while data security lies with the CISO, who is not always part of IT. Additionally, buy-in must be obtained from the Chief Data Officer (CDO), data governance, and data privacy teams. CISOs constantly walk a very fine line between business agility and risk mitigation. One only has to look at the conviction of Uber’s former CISO to comprehend the seriousness of missteps for this role. 

“Business risks resulting from privacy requirements, combined with ever-growing security
threats and accidental data disclosures, are increasing.” – Gartner

According to the Hype Cycle, DSPM’s market penetration is currently less than one percent. Security professionals can differentiate themselves and their organization by using DSPM to decouple data growth from data risk. With this strong foundation, CISOs can make risk-based decisions about data security, governance policies, compliance concerns and appropriate security controls — creating peace of mind for themselves and other stakeholders.

Regardless of where a company is in their cloud evolution, it’s never too early to look into new ways to secure data in the cloud. For example, data security governance (DSG), data risk assessment (DRA), financial data risk assessment (FinDRA), privacy impact assessments (PIA), data breach response processes, and DSPM are categories that many organizations should consider. These can help enforce more consistent policies, especially as new data and privacy laws continue to be rolled out.

On the topic of data security, it’s key to note that security professionals must carefully vet any DSPM solution, as this is a newly evolving product category. Here’s what you need to know.

Data Security Posture Management (DSPM) In Practice

The first step in any data management strategy is to obtain a concrete understanding of what data exists within your architecture. This is made even more difficult when you consider that the mean number of data sources per organization is 400.

Data Discovery, Classification, and Cataloging

As you’re building out your data catalog, remember to include not just your known data but also your shadow data—the data in your cloud environment that security and IT may not even be aware of, such as deleted data from previous “versions” of your cloud data files.

Data cataloging is often done before a large data migration, as part of an audit, or during a cleanse, but in today’s agile cloud environment it needs to be continuous, not a one-time event. Continuous data discovery and classification is core to DSPM. 
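At its simplest, continuous classification is pattern matching over sampled content. Production classifiers combine patterns with validation and context, but a regex sketch (the patterns and labels below are illustrative, not a real product's rules) shows the idea:

```python
import re

# Illustrative detectors; real classifiers add validation and context checks.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(sample):
    """Return the sensitive-data labels detected in a text sample."""
    return {label for label, rx in PATTERNS.items() if rx.search(sample)}

print(sorted(classify("contact: jane@example.com, ssn 123-45-6789")))
# ['email', 'ssn']
```

Running a function like this continuously over newly discovered stores, rather than once per audit, is what turns a one-time cleanse into an ongoing catalog.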

Another piece of groundwork that must be tackled is mapping user access. With dozens of data storage services from each Cloud Service Provider (CSP), each with multiple access control technologies, mapping access to a specific data element is extremely challenging. It’s important to note that access to the data element is entirely different from access to the infrastructure. If you don’t have visibility into who has access to what data and how that data is accessed, you can’t hope to track usage or prevent unauthorized access.
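Real access mapping has to reconcile several layers (identity policies, resource policies, ACLs, permission boundaries), but the core evaluation for a single resource policy can be sketched. The simplified statement format and helper below are hypothetical illustrations, not a real CSP API:

```python
def principals_with_data_access(statements, resource, action="s3:GetObject"):
    """Evaluate simplified IAM-style statements for one resource.

    An explicit Deny overrides any Allow, mirroring real policy evaluation.
    """
    allowed, denied = set(), set()
    for stmt in statements:
        if action in stmt.get("Action", []) and resource in stmt.get("Resource", []):
            if stmt.get("Effect") == "Allow":
                allowed.update(stmt.get("Principal", []))
            elif stmt.get("Effect") == "Deny":
                denied.update(stmt.get("Principal", []))
    return allowed - denied

policy = [
    {"Effect": "Allow", "Principal": ["analyst", "etl-role"],
     "Action": ["s3:GetObject"], "Resource": ["customers/pii.csv"]},
    {"Effect": "Deny", "Principal": ["etl-role"],
     "Action": ["s3:GetObject"], "Resource": ["customers/pii.csv"]},
]
print(principals_with_data_access(policy, "customers/pii.csv"))  # {'analyst'}
```

Note that this answers "who can read this data element," which is a different question from "who can administer the storage infrastructure" — the distinction the paragraph above draws.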

Data Risk Management and Data Hygiene

Creating and enforcing data security policies can be daunting, ambiguous, and unique to each organization. The good news? DSPM tools often have out-of-the-box policies to get an organization started, or the ability to customize policies for organizations that already have a framework in place. 

Laminar recommends beginning with a data governance framework, such as Cloud Data Management Capabilities (CDMC) or Gartner’s data security governance (DSG) framework. CDMC provides auditable processes and controls, in addition to establishing a system of protection levels for data with different risk profiles, ensuring that the most sensitive data receives the strongest protection. These best practices can be used to guide the buildout of an overarching data security policy.

Because DSPM provides such a comprehensive view of an organization’s data, Gartner posits that it “forms the basis of a data risk assessment (DRA) to evaluate the implementation of data security governance (DSG) policies.” Essentially, a framework provides guidelines for the specific processes and controls you should have in place, and DSPM gives complete visibility into where they apply, helping you enable governance and enforce compliance. 

Once policies are in place, then you can:

(1) maintain data hygiene by remediating misplaced, redundant, or obsolete data, and 

(2) manage data risk by detecting and remediating overexposed and unprotected data, and prioritizing security issues based on your business’ data risk profile.   
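Remediating redundant data starts with finding it. One simple way to surface duplicate copies is content hashing: identical objects produce the same digest wherever they live. A minimal sketch, with hypothetical object paths standing in for real cloud storage:

```python
import hashlib
from collections import defaultdict

def find_redundant(objects):
    """Group object paths by content digest; any group >1 is a duplicate copy."""
    by_digest = defaultdict(list)
    for path, content in objects.items():
        digest = hashlib.sha256(content.encode()).hexdigest()
        by_digest[digest].append(path)
    return [paths for paths in by_digest.values() if len(paths) > 1]

objects = {
    "prod/customers.csv": "alice,bob",
    "dev/customers_copy.csv": "alice,bob",  # stale copy left in a dev account
    "prod/orders.csv": "order-1",
}
print(find_redundant(objects))
# [['prod/customers.csv', 'dev/customers_copy.csv']]
```

At cloud scale you would hash samples or metadata rather than full objects, but the principle — flag identical content living in more than one place — is the same.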

Data Privacy and Compliance

DSPMs detect and remediate regulatory and industry compliance violations and generate audit-ready compliance reports. Data residency requirements are often one of the top concerns of data privacy and compliance users. That’s because, given the nature of the cloud—it’s not in owned, discrete geographic locations but dispersed across the CSP’s infrastructure—data often travels to, is stored in, or is accessed from geographies you may not be aware of. This can put an organization in violation of privacy regulations. A DSPM provides visibility into data store geolocations and data movement across cloud environments and regions.
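A residency check ultimately reduces to comparing each data store’s region against the regions allowed for its data class. A toy sketch, with hypothetical store records and an illustrative allow-list:

```python
# Illustrative policy: PII must stay in EU regions (e.g., for GDPR-scoped data).
ALLOWED_REGIONS = {"pii": {"eu-west-1", "eu-central-1"}}

def residency_violations(stores):
    """Flag stores holding a regulated data class outside its allowed regions."""
    return [
        s["name"] for s in stores
        if s["data_class"] in ALLOWED_REGIONS
        and s["region"] not in ALLOWED_REGIONS[s["data_class"]]
    ]

stores = [
    {"name": "db-eu", "region": "eu-west-1", "data_class": "pii"},
    {"name": "backup-us", "region": "us-east-1", "data_class": "pii"},
    {"name": "logs-us", "region": "us-east-1", "data_class": "telemetry"},
]
print(residency_violations(stores))  # ['backup-us']
```

The hard part in practice is populating the `stores` inventory — which is exactly the discovery and classification work described earlier.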

Data Access and Governance 

We’ve touched on access several times above because knowing who has access to what data is critical to protecting it. During discovery and classification, DSPM identifies who has access to data and how it’s accessed; for data governance teams in particular, DSPM can also identify all internal and external users, roles, and resources with access and, especially for sensitive data stores, track their privileges.

Data Context and Remediation

You’re probably noticing a trend—to fully grasp your data security posture, you need comprehensive visibility into every corner of your cloud infrastructure: who has access to which dataset, and whether it’s secure. But knowing there’s a problem is only useful if you can then fix it.

Data context tells you who owns the data. This is critical for remediation: to act on a recommendation, the owner of the data store must be identified and contacted with the details of the recommended fix.

Why DSPM Now?

DSPM made it onto Gartner’s Hype Cycle 2022 because over a decade’s worth of cloud potential has been unleashed in a few years. That’s exposed a fast-growing gap between the agility needed to safely create value with cloud data and legacy data security tools and techniques.

Security teams can fight against the dying light of command and control based data security, or embrace a new role as critical enablers of business agility. The winners will be the ones that marry the agility they need to the security they have to have, with zero compromises for anyone. 

There is a new risk environment that must be overcome: the Innovation Attack Surface, an ephemeral, non-contiguous patchwork of accidental (or negligent) risk creation by the smartest people in your business. In essence, it refers to the continuous, unintentional risk that cloud data users, such as developers and data scientists, create when using data to drive innovation. 

What’s needed is a new agile security paradigm for cloud data. Agile means autonomous, continuous and context-driven protection for cloud data. 

A cloud-native DSPM platform bridges the gaps between disparate cloud platforms, creating a comprehensive view of your entire public cloud environment. It automatically discovers, analyzes, and classifies your data; facilitates user access tracking; assesses for violations of your data security posture; and guides the remediation process for continuous protection. 

From there, a CISO can feel more confident about their security posture, including governance policies and appropriate controls, and can reduce their data attack surface without slowing down business-critical processes and innovation. 

DSPM tooling empowers security professionals to make their security processes efficient, effective, and scalable.

Laminar Delivers Agile Data Security for the Cloud

Laminar’s cloud-native DSPM platform continuously discovers, classifies, and secures all known and unknown data — including shadow data — across your cloud environments, establishing a comprehensive data catalog. With this foundation, you’ll be ready to assess adherence to the CDMC Framework, enforce data security policies, and meet compliance requirements, all without impacting cloud performance. The platform also continuously monitors your security posture, and if a policy violation is detected, it will prioritize alerts and send actionable remediation recommendations.

If you’d like to learn more about DSPM solutions, such as how they compare to other tools and how to evaluate them, check out our Guide to DSPM.


Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Laying Down Fresh Tracks as the Leader in DSPM

It’s a new year and we are busy laying down fresh tracks—with a whole new lineup of events and webinars to spread the word about Laminar in our quest to evangelize the importance of data security posture management (DSPM) far and wide. We have been stirring up quite the “hype” helping organizations protect their cloud data and reduce risk with this new and emerging technology. Being named a “transformational” sample vendor in the 2022 Gartner Hype Cycle for Data Security was certainly the highlight for us last year. 

2023 is here and we are doubling down on DSPM as cloud data continues, and maybe even strengthens, its position as the currency driving business and competitive advantage. In the new year, doesn’t every organization want the visibility to discover and classify all known and unknown data, identify the data that is not adhering to data policies, and quickly discern how to mitigate the risk it presents in hyper-complex cloud environments? Seems like the answer is a resounding “yes,” especially in a world where something like 2.5 quintillion bytes of data is created every day. That’s a lot of data that needs to be protected. Sheesh.

Part of the quest for pioneers in DSPM means the Laminar Squad needs to keep meeting folks like YOU—in-person, and at events and webinars across the globe! And boy is this squad ready. Go team, go!

Next up, the Laminar Squad will be at one of the foremost networking platforms of cybersecurity in the world, all happening in the new cybersecurity hub of Tel Aviv. This conference will bring together hundreds of key players from the cyber industry, and we can’t wait! If you are headed to CyberTech Israel and are interested in learning more about DSPM and how you can keep your cloud data secure, book an in-person meeting with us or join us for some delicious cocktails and small bites. Hope to see you there! 

Meet us there

CyberTech Israel
January 30-February 1
Book your meeting today!

Join our happy hour

Location: Alibi Bar, Frishman St 41, Tel Aviv-Yafo, Israel
Sign up here, space is limited. 

Stay awesome.

If you liked what you saw here, then be sure to share with your co-workers and friends because we want to hear from you! Follow us on Twitter @laminarsec or find us on LinkedIn. And definitely, don’t forget to @mention us when spreading the word!

Top Three Cloud Data Security Predictions for 2023

According to a recent analysis from IBM and the Ponemon Institute, the average cost of a data breach has risen to a record high of $4.35 million in 2022. This year has certainly had its fair share of data breaches, and not one industry was safe. Government agencies, school systems, healthcare organizations, and even tech giants such as Microsoft all experienced a breach, proving that there is still a lot of work to be done in the fight against attackers.

It appears there will be no slowing down for cybercrime in 2023, but it’s not all doom and gloom for data security teams. Despite the odds seemingly stacked against the “good guys,” the future shows promise. When it came to cybersecurity, 2022 looked much like each of the past five years: business and innovation agility led to data proliferation, with the unintended consequences of lost and stolen sensitive data. However, 2023 can be the year that data security professionals break that cycle.

Here’s a look at what the data security landscape will look like in 2023.

Data security professionals will be viewed as business accelerators rather than inhibitors.

Data security has traditionally been seen as a roadblock for other areas of the organization such as IT and operations. Unfortunately, it’s the nature of the job. Data security means ensuring every digital asset is kept out of the hands of adversaries and adheres to policy. With the increase in data proliferation, that has become increasingly difficult to do.

However, it is critical for data to be available in order for businesses to conduct day-to-day operations. Data security is a key component in making that happen and, when done correctly, is not a hindrance.

Luckily, in 2022, more organizations began to understand the significance of data visibility and security, particularly in public cloud environments. As a result, they began to rely more and more on data security professionals and looked at them as business accelerators. I expect this sentiment to continue in 2023 as cloud data security technologies evolve to help make data security professionals’ lives easier and advance the business.

The increase in unknown or “shadow” data will lead to more data leaks and risks for organizations.

However, it will ultimately serve as a wake up call for CISOs to prioritize investments in data visibility and protection solutions.

There is a dark side to digital transformation fueled by the public cloud. Every day, developers and data scientists create, move, modify and delete data in service of positive business outcomes. And they leave a trail of unintentional risk in their wake.

The activities that create the biggest advantages for cloud-based businesses are the same activities that introduce the most risk. As sensitive data propagates across the public cloud, risk grows.

This is the Innovation Attack Surface – a new kind of threat that most organizations unconsciously accept as the cost of doing business. Massive, decentralized, accidental risk creation by the smartest people in your business.

This unknown or “shadow” data has become a problem for 82% of security practitioners. Examples include database copies in test environments, analytics pipelines, unlisted embedded databases, unmanaged backups, and more. Because its contents are unknown, it is at extra risk of exposure.

Security teams can expect to see more instances of shadow data breaches in 2023. However, even though breaches caused by shadow data are set to increase, security teams are becoming more and more aware of the situation and committing to solving the problem. The emerging public cloud data security market proves that this is slowly becoming a problem at the forefront of CISOs’ minds, and knowing you have a problem is the first step to solving it. In 2023, CISOs will prioritize finding agile solutions that provide visibility into, and protection of, all their cloud data to discover and remediate data exposure risk.

A new data security center of excellence will report to the CISO

All security must protect data; however, not all security is focused on data. With data growing ever more important as a currency between businesses, as well as a means of innovation, organizations are storing and sharing more of it than ever (and increasingly, in the cloud). The skills gap this creates will begin to be addressed in 2023 with the rise of a new data security center of excellence, reporting to the CISO.

This center of excellence will bridge the gap between the CISO and the Chief Data Officer (CDO) to ensure an entity’s valuable data is secure. The data security center of excellence will have responsibility for the following four areas:

  1. Constantly maintaining visibility of all sensitive data
  2. Continuously protecting sensitive data
  3. Controlling who has access to sensitive data
  4. Ensuring that sensitive data adheres to the enterprise data security policy

This center of excellence, along with more data-centric, defense-in-depth security strategies will augment the important data governance and data privacy work that the Chief Data Officer typically oversees.

Data becomes more valuable every year, which makes it more important for organizations to take the proper steps to safeguard it. Traditional approaches are vanishing in this cloud-first era, where digital change is occurring quickly and complexity is high. In 2023, to stay secure and agile, organizations must be able to locate, identify, and organize every piece of data in the public cloud environment. Doing so will enable security teams to defeat shadow data, and stay ahead of adversaries for good.

Top Ways to Find & Protect Sensitive Data in the Cloud

Cloud data risk is more prevalent than ever. Laminar Labs scanned publicly facing cloud storage buckets and found personally identifiable information (PII) in 21% of these buckets – or one in five. Despite advancements in cloud infrastructure security, it’s clear that something is still amiss.

Cloud security solutions such as cloud security posture management (CSPM) can detect publicly exposed storage buckets but cannot actually identify the buckets’ content and take action to protect any sensitive data in these cloud stores.

In this post, we’ll cover some of the most common ways companies accidentally expose their sensitive data in the cloud, and how to remediate these issues by focusing on the data itself.

What are examples of sensitive data?

Most businesses interact with some form of PII. Examples of PII include contact information, social security numbers, ID numbers, and other forms of sensitive personal information. Some businesses also work with protected health information (PHI), meaning they process and store confidential medical data such as prescriptions and diagnoses. Others work with payment card industry (PCI) data — confidential data pertaining to credit or debit cards. 

Exposing PII, PHI, PCI, or any other type of confidential data can lead to enormous consequences. According to IBM’s Cost of a Data Breach study, the global average cost of a data breach is $4.35M — not to mention the priceless loss of reputation and subsequent customer attrition.

How sensitive cloud data gets exposed 

A few common missteps lead to cloud data exposure:

Accidental exposure

It’s common for a team member to accidentally leave a data asset exposed to the internet. Thanks to the fast-paced nature of the cloud, major exposure can happen quickly and unintentionally. Laminar Labs’ study of public S3 buckets uncovered several examples of sensitive data exposed to the public, including:

  • An asset containing PII from people who used a third-party chatbot service on different websites
  • An asset containing loan details – name, loan amount, credit score, interest rates, etc.
  • An asset with first names, last names, Ethereum address and Bitcoin address information, and block card email addresses

This type of exposure often happens because organizations lack proper cloud data governance. If there aren’t policies or enforcement vehicles to prevent improper data storage in the cloud, sensitive data will get exposed. After all, most team members simply don’t have the bandwidth to keep the security of this data top of mind, especially given that securing data is not their primary directive. Their job is to innovate and drive an organization’s business. They accidentally store sensitive data in insecure locations, such as an S3 bucket that’s public by design. Realistically speaking, improper cloud data storage is inevitable.
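Guardrails for this failure mode are automatable: flag any store whose grants make it world-readable while its contents are classified sensitive. A simplified sketch — the bucket records and grant strings below are hypothetical, not a real CSP API:

```python
def publicly_exposed_sensitive(buckets):
    """Return buckets that are both world-readable and hold sensitive data."""
    return [
        b["name"] for b in buckets
        if "AllUsers:READ" in b["grants"] and b["contains_sensitive"]
    ]

buckets = [
    {"name": "website-assets", "grants": ["AllUsers:READ"], "contains_sensitive": False},
    {"name": "chatbot-logs", "grants": ["AllUsers:READ"], "contains_sensitive": True},
    {"name": "loan-records", "grants": ["owner:FULL_CONTROL"], "contains_sensitive": True},
]
print(publicly_exposed_sensitive(buckets))  # ['chatbot-logs']
```

Note that a public-by-design bucket (like `website-assets` above) is fine; the violation is the intersection of public access and sensitive content, which is why classification has to come first.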

Data movement

Even if sensitive data is properly secured at first, there’s always a risk that it could be moved or copied to an unsecured environment. 

This problem is specific to the cloud because moving or copying information in an on-premises environment takes a lot of effort and requires extensive permission from gatekeepers. But in the cloud, virtually any staff member can copy or move data within seconds. Unregulated data movement leads to shadow data (i.e. unknown, unmanaged data) in the form of copied data, orphaned backups, unlisted embedded databases, or cached application logs. 

Improper access management

This security misstep happens when extraneous users or third parties are granted access to sensitive data. For example, a user with an overly permissive role or policy might copy sensitive data into an alternate location within the organization without anyone’s knowledge. If they leave the company, they’ll leave this sensitive data in an unauthorized location, and nobody will know about it. 

When it comes to third-party access, granting too much access to an external tool (e.g. a CI/CD tool or SaaS application) means that risk mitigation is completely out of your hands. If that third-party tool is compromised, your data is at risk too.

3 best practices for protecting sensitive data in the cloud

  1. Discover and classify all cloud data

    You can only protect what you know. So to combat the shadow data problem, your organization needs to precisely identify and classify sensitive data. You can keep tabs on your data by instituting a data catalog. It’s also important to set up a continuous data discovery and classification method to keep up with the dynamic nature of the cloud. 

  2. Secure and control your cloud data

    You also need to secure and control your sensitive cloud data. The best way to do this is by instituting security policies and continuously validating data with your company’s pre-determined guardrails. And as this data gets validated regularly, any instances of violations need to be prioritized for remediation based on sensitivity level, security posture, volume, and exposure. 

  3. Remediate and monitor without interrupting data flow 

    Lastly, you need to remediate sensitive data exposure without interrupting data flow. 

    This process enables your team to protect data in the cloud without compromising agility. Some of these remediation steps involve implementing data security best practices, such as enabling encryption. Other best next steps might be general data hygiene practices, such as removing unused sensitive data from your environment. 

    In addition, organizations need to set up measures for continuous data monitoring. Data security isn’t a one-and-done process. Instead, you must continuously monitor your crown jewels against stated security posture and regulations, regardless of where they move in the cloud. 
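The prioritization called for in practice 2 can be sketched as a weighted score over sensitivity, exposure, and volume. The weights and scales below are illustrative, not a standard:

```python
def priority(violation):
    """Score a policy violation; higher means fix sooner (illustrative weights)."""
    return (
        3 * violation["sensitivity"]    # e.g., 0 = public .. 3 = restricted
        + 2 * violation["exposure"]     # e.g., 0 = private .. 2 = internet-facing
        + violation["volume_gb"] / 100  # sheer amount of affected data
    )

violations = [
    {"id": "v1", "sensitivity": 3, "exposure": 2, "volume_gb": 500},
    {"id": "v2", "sensitivity": 1, "exposure": 0, "volume_gb": 50},
]
queue = sorted(violations, key=priority, reverse=True)
print([v["id"] for v in queue])  # ['v1', 'v2']
```

The point of scoring is to keep remediation effort pointed at the highest-risk data first, rather than working alerts in arrival order.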

How a DSPM helps

DSPM (data security posture management) is an emerging cloud data security strategy that makes best practices for data security in the cloud a reality. It does this by discovering all cloud data, classifying it by data type and sensitivity level, detecting and alerting on data security policy violations, prioritizing those alerts, and providing remediation playbooks.
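The last step, remediation playbooks, is often conceptually just a mapping from violation type to an ordered list of actions. A hypothetical sketch (the violation types and steps are illustrative):

```python
# Illustrative playbooks keyed by violation type.
PLAYBOOKS = {
    "public_sensitive_bucket": [
        "Block public access on the bucket",
        "Notify the data owner",
        "Verify no unauthorized reads occurred",
    ],
    "unencrypted_data_store": [
        "Enable encryption at rest",
        "Rotate any credentials stored in plaintext",
    ],
}

def remediation_steps(violation_type):
    """Look up the ordered remediation playbook for a violation type."""
    return PLAYBOOKS.get(violation_type, ["Escalate to the security team"])

print(remediation_steps("public_sensitive_bucket")[0])
# Block public access on the bucket
```

Pairing each prioritized alert with a playbook like this is what turns detection into remediation rather than a backlog of findings.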

Laminar’s multi-cloud DSPM solution provides a continuous, autonomous solution for protecting sensitive data in the cloud. We interface with your other technologies, such as APIs, data catalogs, and ITSMs, combining your cloud data security efforts into a single pane of glass. 

Want to find out more about protecting data in the cloud? Tune in to our webinars to learn more!


5 Steps to Effective Cloud Data Governance

Cloud data governance is getting a lot of attention these days, and for good reason. For most organizations, their cloud data resembles the American Wild West. Data sprawls naturally and is often unmanaged and ungoverned. It is copied, moved, and stored in various repositories and accessed by virtually anyone. There are no guardrails or policies to ensure that it is protected and stays in the right hands. As data continues to proliferate in the cloud and data breaches increase, the industry is recognizing the need for automated cloud data governance.

What is cloud data governance?

Cloud data governance collectively refers to the set of principles, processes, policies, and tools used to manage data in the cloud to ensure that risk is properly mitigated and privacy is managed in accordance with regulatory compliance requirements. Cloud data governance also ensures that data is accurate, available, and usable across the organization.

Challenges of cloud data governance

Cloud data governance is very different from traditional data governance. The cloud itself creates a new set of expectations and ways of working with data that organizations have come to embrace as a means to innovate, improve agility, fail faster, and reduce the time to market. In this way, the cloud delivers faster business value that directly impacts the bottom line. Cloud data governance cannot hinder the organization’s ability to realize any of these business benefits. In fact, cloud data governance must preserve the value organizations get from the cloud data and that requires solving the problem of data governance from a new perspective.

When it comes to cloud data governance, it is important to understand the unofficial processes by which data is accessed and used in the cloud. Previously, when developers spun up a new application, they had to request a server to store data. That meant getting a DBA involved and, since the developer had to ask permission, it was common to involve security in that process as well.

In the cloud, developers don’t need to request a server. It’s right at their fingertips. And so is the data. Developers can spin up or copy entire data stores in minutes without ever involving a DBA, security, or anyone else for that matter. This is one of the many ways the cloud facilitates data democratization and reduces time to market – benefits that businesses aren’t willing to sacrifice.

Another factor unique to cloud data governance is the sheer volume of data that resides in data stores managed by the cloud service provider. Across a multi-cloud environment, developers and data scientists have access to hundreds of data technologies, which continue to proliferate. Developers can also easily add or embed their own data storage technology on top of a compute instance. Organizations lack any formal process for keeping track of which technologies are used and the data elements stored in them. And even if they did have such a process, data governance use cases change and data moves so frequently, it would be impossible to maintain an accurate accounting.

Why do I need cloud data governance?

The lack of data governance in the cloud has several significant implications for cybersecurity and regulatory compliance.

  1. First and foremost, organizations don’t know what data they have or where it resides. This is shadow data, which by definition is cloud data that is ungoverned, has no oversight, and is not kept up to date. Data stores containing shadow data are more likely to be misconfigured, unmonitored, and violate data policies, making them particularly vulnerable to attackers who know to look for these easy targets.
  2. The lack of data governance also means that data can be copied and moved from a well-protected environment to a less secure one. For example, sensitive data may be moved from a secured production environment to a developer environment with a lower security posture. There is also the risk of regulated data, such as card holder data, being moved from a protected environment (considered in-scope for the PCI DSS) to a less secure environment where the data is now at a greater risk of exposure and is in violation of cloud compliance requirements.
  3. When cloud data is ungoverned, the risk of sensitive data exposure increases. The data attack surface itself grows as both known and shadow data proliferates. Data assets are accidentally left exposed to the internet. Users are granted excessive privileges and can access sensitive data that they don’t need. Data stores that include sensitive data are shared with third parties. The list goes on… and because the data is ungoverned, it’s also often underprotected.
  4. Finally, cloud data that is ungoverned is also at risk of regulatory compliance violations. By definition, the ability to implement controls to meet regulatory requirements is a form of governance. In other words, you can’t have compliance without some semblance of governance.

What are the benefits of cloud data governance?

When done properly, cloud data governance enables security teams to implement protective controls around data without impacting existing workflows. But it also offers several other benefits.

  • First, cloud data governance helps facilitate data democratization by making more data available to more people so they can do more analytics as quickly as possible. Cloud data governance also enables users to optimize the value they get out of the data while reducing the complexity and risk typically associated with data democratization.
  • Cloud data governance helps improve the security posture of the cloud environment by making it easier to manage risk. Continuous data visibility, monitoring and automation enable organizations to finally understand what data they have, where it’s stored, who is accessing it, and how it’s being protected — without having to understand the underlying cloud data technologies.
  • Similarly, cloud data governance reduces the overhead associated with meeting and demonstrating compliance with regulatory requirements. This saves time and effort associated with implementing controls and accelerates the audit process.
  • Finally, cloud data governance can help reduce cloud costs by eliminating the creation of new shadow data and enabling teams to delete old shadow data.

5 Steps to achieving cloud data governance

Cloud data governance consists of five main processes that should run continuously and simultaneously.

Graphic: the five steps to effective cloud data governance

  1. Data discovery

    You can’t govern what you can’t see, so the first step to achieving cloud data governance is to obtain visibility. This requires a centralized application that automatically and continuously discovers all data across the entirety of your multi-cloud environment. This includes data in managed and unmanaged assets, data embedded in virtual instances, shadow data, data caches, data pipelines, and big data.

  2. Classify and catalog data

    Next, define the type of data discovered so that it can be properly classified and cataloged. For example, sensitive data such as PII, PHI, and PCI should be identified and classified accordingly. This information is used to build a comprehensive, consistent data catalog across clouds.

  3. Define and enforce policies

    Once you understand what data you have, define and enforce data security policies and remediate issues for:

    • Compliance and audit management
    • Encryption at rest and in motion
    • Retention, archiving, and purging
    • Who is allowed to access what data
  4. Data ownership and usage

    Strive to associate all data with its owner. Continuously monitor who uses the data and where your data is going, especially with regards to third parties. Empower data consumers with self-service access, while retaining control and governance over data. Understand how the data is processed to ensure it can be used appropriately.

  5. Continuously monitor

    Continuously monitor for policy violations and anomalous behavior so you can proactively mitigate security risks. Address policy violations, block unauthorized access, and delete unused assets in a timely manner.

Repeat. Since cloud environments are highly dynamic, all of these steps must run continuously.
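Taken together, the five steps can be sketched as one continuous loop. The snippet below is an illustrative sketch only; the asset inventory, classifier patterns, and policy checks are hypothetical stand-ins, not Laminar's implementation.

```python
import re

# Hypothetical inventory of discovered cloud assets (step 1: discovery).
ASSETS = [
    {"name": "prod-db", "owner": "payments-team", "encrypted": True,
     "sample": "card=4111111111111111"},
    {"name": "dev-dump", "owner": None, "encrypted": False,
     "sample": "email=jane@example.com"},
]

# Step 2: classify with simple pattern matching (real classifiers are far richer).
CLASSIFIERS = {
    "PCI": re.compile(r"\b\d{16}\b"),
    "PII": re.compile(r"[\w.]+@[\w.]+"),
}

def classify(asset):
    return [label for label, rx in CLASSIFIERS.items() if rx.search(asset["sample"])]

# Step 3: a data-centric policy: sensitive data must be encrypted at rest.
# Step 4: every data asset should have an owner.
def violations(asset, labels):
    issues = []
    if labels and not asset["encrypted"]:
        issues.append("sensitive data not encrypted at rest")
    if asset["owner"] is None:
        issues.append("no data owner assigned")
    return issues

# Step 5: continuous monitoring would re-run this pass on every change.
def governance_pass(assets):
    report = {}
    for asset in assets:
        labels = classify(asset)
        report[asset["name"]] = {"labels": labels, "issues": violations(asset, labels)}
    return report

print(governance_pass(ASSETS))
```

In this toy run, the unowned, unencrypted dev copy surfaces two violations while the governed production store surfaces none, which is exactly the kind of signal the loop is meant to produce continuously.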

How does Laminar help with cloud data governance?

In the American Wild West, governance came in the form of a sheriff from out of town who was inevitably met with discontent and open resistance. Thankfully, times have changed – and technology has evolved. Laminar makes cloud data governance seamless. Security teams get the control they need without disrupting innovation, existing workflows, agility, or time to market.

Laminar’s cloud data security platform provides centralized cloud data governance and security across Azure, AWS, Google Cloud, and Snowflake. It enables teams to continuously and autonomously discover, classify and catalog all known and unknown “shadow” data. The platform then assesses the security posture of sensitive data against data-centric security policies, including encryption, activity logging, and retention, and alerts on security policy violations and prioritizes based on data sensitivity and risk. It also determines data ownership and offers clear guidance for remediation.

All of this from a solution that is embedded in your cloud environment and scans your assets using serverless functions that make use of the CSP’s APIs. Laminar is easy to install using cloud-native tools, is agent-less, and has no performance impact. And data never leaves your environment.

Want to see Laminar’s Cloud Data Security Platform in action? Request a demo today.

Author bio:

Mor Gozani (also goes by Mora) is the VP of Product Marketing at Laminar. She has worked in cloud security for the past seven years, most recently at Orca Security, where she built and managed the product marketing and technical evangelism team. Prior to that, she ran marketing for CloudKnox Security, which was acquired by Microsoft.


Laminar Integrates with Amazon Security Lake


Laminar integrates with Amazon Security Lake, a service announced at re:Invent 2022. Amazon Security Lake allows customers to build a security data lake from integrated cloud and on-premises data sources as well as from their private applications. With support for the Open Cybersecurity Schema Framework (OCSF) standard, Amazon Security Lake reduces the complexity and costs for customers to make their security solutions’ data accessible to address a variety of security use cases such as threat detection, investigation, and incident response.

Everything’s Better with Bacon (Data-context)

Over time, the vision for Laminar and Amazon Security Lake is to be able to enhance security events, security investigations, and risk models with data context via Laminar. Laminar is the leader in discovering, classifying and cataloging ALL your sensitive data in the cloud. Now that you have visibility to where your sensitive data is, imagine adding that data context to all your other security activities.

When a server is compromised, what data was on it? What other data did that server have access to? Data context at your fingertips is a huge aid in forensic investigations. Another example is vulnerability scanning: knowing which boxes hold sensitive data helps you prioritize which vulnerabilities to patch first. These are just a few examples. Everything's better with bacon, ahem, I mean, data context.

Spend Time Where it Matters with OCSF

Data scientists and security practitioners today spend too much time trying to normalize security data across vendors in order to gain insights. Laminar is proud to support the Open Cybersecurity Schema Framework (OCSF) for the Amazon Security Lake integration that makes it easier for customers to combine logs from disparate security solutions for more efficient threat detection, investigation, or incident response.

Our initial integration is supplying Laminar’s found security events to Amazon Security Lake, making the data available for additional analytics use cases such as security investigations.
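To make the idea concrete, here is a sketch of what mapping a finding into an OCSF-shaped event might look like. The field names follow the general shape of OCSF event classes, but the specific uid values and the finding payload are illustrative assumptions, not copied from the spec or from Laminar's integration.

```python
# Map a hypothetical Laminar finding into an OCSF-style event record.
# The uid values and the finding payload below are illustrative only.
def to_ocsf(finding: dict) -> dict:
    severity_map = {"low": 2, "medium": 3, "high": 4, "critical": 5}
    return {
        "class_uid": 2001,       # stand-in for a "Security Finding" class
        "category_uid": 2,       # stand-in for the "Findings" category
        "time": finding["detected_at"],
        "severity_id": severity_map[finding["severity"]],
        "message": finding["title"],
        "metadata": {"product": {"vendor_name": "Laminar"}},
    }

event = to_ocsf({
    "detected_at": 1670000000000,
    "severity": "high",
    "title": "Sensitive data in publicly exposed S3 bucket",
})
print(event["severity_id"])
```

The point of the exercise: once every vendor emits events in one shared shape, downstream analytics can join Laminar's data-context findings with infrastructure and identity events without per-vendor normalization code.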

Data, Data Everywhere

Amazon Security Lake is just another example of the power of data and why data democratization is enhancing the way business operates. Unfortunately, as the volume of data increases, so do the costs and the risks. Amazon Security Lake is architected to let you store data more efficiently, allowing you to keep more data around for longer for valuable insights or forensics. By applying data context to other security solutions, Laminar’s cloud data security platform helps decouple data risk from data growth, enabling data democratization to succeed safely. With the volume of data collected and processed in the cloud increasing exponentially, data-centric security is a must.


Laminar is proud to integrate with Amazon Security Lake, adding data context to security events for better risk models, effective investigations and efficient remediation. Complete cloud security involves application, infrastructure, identity and data security solutions and Amazon Security Lake will allow customers to derive combined insights across these disciplines.

Click here for more on how Laminar partners with AWS.

Using Laminar to implement the Cloud Data Management Capability (CDMC) Framework

The accelerated move to the cloud and the adoption of multi-cloud deployments is causing the rapid proliferation of cloud data and, in turn, expanding the data attack surface. Faced with the increasing risk of a data breach and a growing list of privacy compliance requirements, many organizations are considering their cloud data management practices – and finding they don’t know where to begin. Laminar and the Cloud Data Management Capability (CDMC) Framework can help organizations across all industries better manage sensitive data in the cloud.

Data is the new uranium. It fuels innovation. Proper cloud data management is critical to ensuring that organizations can protect valuable data from exposure and ease the burden of regulatory compliance efforts — both critical requirements for a successful data-driven business. Cloud data management also enables business users to get more value out of their data.

What is the CDMC Framework?

However, as many organizations have discovered, managing cloud data is very different from managing data on-prem. Rapid data growth and new workflows in the cloud demand a new approach to data management. Recognizing these challenges, the EDM Council, in collaboration with industry professionals, cloud service providers, financial institutions, technology companies, and major consultant and advisory firms, created the CDMC Framework. The Framework was designed to help the industry better manage and protect data in the cloud, and better enable organizations to realize the cloud’s benefits.

According to the EDM Council, “CDMC is a best practice assessment and certification framework for managing and controlling data in single, multiple, and hybrid cloud environments.” The CDMC provides organizations a structured framework of auditable processes and controls that is broken down into 14 capabilities and 37 sub-capabilities across six components.

CDMC Framework Implementation Challenges

Of course, with any framework there is the issue of implementation. A significant skills and/or capabilities gap often exists between an organization’s current state and its future, framework-compliant state. A framework provides a destination, but many organizations also need a roadmap to get there. The CDMC is no different.

The CDMC Framework assumes that an organization has a strong foundation in data management. Before even starting on the journey to CDMC compliance, organizations must classify their data. The Framework suggests what types of data may be considered sensitive (such as Personally Identifiable Information and client identifiable information, for example), but it is up to the organization to know what data they have and to classify it accordingly – a feat in and of itself.

CDMC Framework Implementation with Laminar

This is where Laminar comes in. As a leading data security posture management (DSPM) platform, Laminar autonomously and continuously discovers, classifies, and secures all known and unknown data across all cloud platforms. Upon connecting to the organization’s cloud environment, Laminar automatically discovers all data, including shadow data that is ungoverned and invisible to data security and management teams. Once it finds the data, Laminar identifies and classifies sensitive data such as PII, PHI, and PCI. The platform then builds a single, comprehensive catalog of all the data residing across a multi-cloud environment. Laminar does all this automatically, without impacting cloud performance or removing any data from the cloud environment.

Now the organization is ready to implement the CDMC Framework. Laminar can help here, too. Laminar prioritizes all data according to its risk profile based on sensitivity level, security posture, volume, and exposure. The platform continuously assesses the security posture status of sensitive data against an extensive set of pre-built security policies and compliance requirements. When Laminar detects a policy violation, the platform issues an alert and provides streamlined, actionable remediation recommendations via integrations with existing ticketing workflows.

Laminar facilitates or supports the majority of capabilities that comprise the CDMC Framework, significantly reducing the legwork and cloud expertise required. Let’s look at a few examples:

CDMC 1.4, Data Sovereignty and Cross-Border Movement, states, “The data sovereignty and cross-border movement of sensitive data must be recorded, auditable, and controlled according to defined policy.” Laminar supports this requirement by:

  • Automatically finding the location of each asset with sensitive or regulated data
  • Automatically triggering a violation if data appears in disallowed locations
  • Automatically triggering a violation if data is being accessed from disallowed locations
  • Notifying data security teams and data owners of violations
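As a rough illustration of the bullets above, a data-sovereignty check can be reduced to comparing an asset's storage and access regions against an allowed list. The asset fields and region list below are hypothetical:

```python
# Sketch of a data-sovereignty check (in the spirit of CDMC 1.4): flag
# sensitive assets stored in, or accessed from, regions outside an allowed
# list. The asset shape and region list are hypothetical stand-ins.
ALLOWED_REGIONS = {"eu-west-1", "eu-central-1"}

def sovereignty_violations(asset: dict) -> list:
    issues = []
    if asset["sensitive"] and asset["region"] not in ALLOWED_REGIONS:
        issues.append(f"stored in disallowed region {asset['region']}")
    for region in asset.get("accessed_from", []):
        if asset["sensitive"] and region not in ALLOWED_REGIONS:
            issues.append(f"accessed from disallowed region {region}")
    return issues

asset = {"name": "customer-db", "sensitive": True,
         "region": "us-east-1", "accessed_from": ["eu-west-1", "ap-south-1"]}
print(sovereignty_violations(asset))
```

Here the hypothetical EU-only policy flags both the US storage location and the access from an Asia-Pacific region, while the in-policy EU access passes silently.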



CDMC 2.2, Classification, states, “Classification must be automated for all data at the point of creation or ingestion, and must be always on.” Laminar supports this requirement by:

  • Autonomously discovering new assets and data in the cloud environments
  • Automatically and continuously classifying data assets in the cloud
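The "always on, at the point of creation or ingestion" requirement can be illustrated with a write path that classifies content before it is stored. The patterns and the in-memory store are simplified stand-ins:

```python
import re

# Sketch of classification at the point of ingestion (in the spirit of
# CDMC 2.2): every write passes through a classifier before the object
# lands. The patterns, catalog, and store are simplified stand-ins.
PATTERNS = {
    "PII/email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PII/ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

CATALOG = {}   # object key -> classification labels

def ingest(key: str, content: str, store: dict):
    # Classification happens inline, so the catalog is never stale.
    labels = sorted(label for label, rx in PATTERNS.items() if rx.search(content))
    CATALOG[key] = labels
    store[key] = content

store = {}
ingest("users.csv", "jane@example.com,123-45-6789", store)
print(CATALOG["users.csv"])
```

Because the classifier sits on the write path rather than running as a periodic scan, no object ever exists in the store without a catalog entry, which is the essence of "always on" classification.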

CDMC 3.1, Entitlements and Access for Sensitive Data, requires, “Entitlements and access for sensitive data must default to creator and owner until explicitly and authoritatively granted,” and, “access must be tracked for all sensitive data.” Laminar supports these requirements with the following policies:

  • Overexposure policies ensure no over-privileged access to sensitive data, such as public access or unauthorized access to third parties and users.
  • Activity logging policy ensures access is tracked for sensitive data.




CDMC 4.1, Security Controls, states, “Appropriate security controls must be enabled for sensitive data,” and, “Security control evidence must be recorded in the data catalog for all sensitive data.” Laminar supports these requirements via:

  • Extensive built-in data security policies that ensure security controls such as encryption, masking, and data retention are in place
  • One-click remediation actions to fix policy violations
  • A comprehensive admin console to review the current and historical security posture of sensitive data



The capabilities that comprise the CDMC are also best practices that are incorporated into Laminar. Using Laminar to implement data security controls and guardrails that detect violations and risks puts organizations on the right path to fully implementing the Framework. If you’d like to learn more about how Laminar facilitates the implementation of the CDMC Framework, contact us today.

New Research Finds 21% of Publicly Facing Cloud Storage Buckets Contain Sensitive PII Data

Public cloud storage services run under a shared responsibility model for data protection and cybersecurity. The cloud service provider (like Amazon or Microsoft) is responsible for the underlying “security of the cloud” and its infrastructure, while users are responsible for securing their own AWS S3 buckets and the data inside, known as the “security in the cloud.”

Personally Identifiable Information (PII) in the Cloud: What We Found

In an effort to understand the prevalence of publicly exposed sensitive data, Laminar Labs scanned publicly facing cloud storage buckets and was able to detect personally identifiable information (PII) in 21% of these buckets – or one in five. Information uncovered included addresses, email addresses, phone numbers, driver’s license numbers, names, loan details, credit scores, and more.

Our original hypothesis was that this publicly available data consisted of public datasets or public files, things that were meant to be online. But what we learned was that the majority of this data was actually misplaced: data mistakenly placed into a publicly exposed bucket where it became unintentionally exposed. Additionally, in some cases, the S3 bucket may have been misconfigured to be public when it should not have been. Both are prime examples of “shadow data.” Shadow data is any sensitive data that is not subject to an organization’s centralized data management framework and is not visible to data protection teams. Examples include snapshots that are no longer relevant, forgotten backups, misplaced data, and log files containing sensitive data that are not properly encrypted or stored.

Here is a summary of some of the sensitive data that we found:

  • A file containing PII of people who used a third-party chatbot service on different websites – including names, phone numbers, emails – and the messages sent to the bot (for example – people seeking unemployment benefits and more)
  • A file containing loan details – name, loan amount, credit score, interest rates and more
  • A participant report for an athletic competition, including PII (name, address, zip code, email and more) and medical info
  • A VIP invite list including names, email, and address information
  • A file with first names, last names, ethereum address and bitcoin address information, and block card email addresses. 

The Risks of PII data in the Public Cloud: Why You Should Care

Because this data contains such highly sensitive information as loan details, bitcoin addresses, and conversations about unemployment benefits, we believe it has the potential to put the organizations to which the information belongs at risk. Organizations cannot properly protect data they do not know is exposed. And in the shared responsibility model, keeping this data secure is the responsibility of the organization that owns the buckets in which the data resides. Fortunately, there are ways to uncover and address this risk.

How to Mitigate & Protect PII in the Cloud:

PII Data Discovery & Monitoring

The first step in addressing the problem is understanding what publicly exposed sensitive data exists in your environment. However, doing this in the cloud is not as simple as it may seem. S3 buckets that are not public can often contain specific files and objects that are public, leaving security teams unaware of the risks. On the other hand, many buckets are supposed to be publicly exposed, for example, hosted websites, and unseen shadow data can be misplaced in these intentionally exposed buckets. These misplaced files are often hard to locate among the many legitimate files housed inside those buckets.

In other words, what is needed is a data-centric view, not an infrastructure-centric one. A way to catalog all data in a cloud environment, figure out which files and objects contain sensitive information and make sure these objects aren’t publicly available without hindering the availability of other files that are safe.
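That data-centric view can be illustrated with a toy model: rather than asking whether a bucket is public, ask whether any sensitive object is publicly reachable. The bucket layout and sensitivity labels below are hypothetical:

```python
# Data-centric exposure check: an object is at risk if it is sensitive AND
# publicly reachable, whether via a public bucket or a public object ACL.
# The bucket layout and sensitivity labels are hypothetical.
def exposed_sensitive_objects(bucket: dict) -> list:
    exposed = []
    for obj in bucket["objects"]:
        publicly_reachable = bucket["public"] or obj.get("public_acl", False)
        if publicly_reachable and obj["sensitive"]:
            exposed.append(obj["key"])
    return exposed

# An intentionally public website bucket with one misplaced sensitive file...
website = {"public": True, "objects": [
    {"key": "index.html", "sensitive": False},
    {"key": "backup/users.csv", "sensitive": True},
]}
# ...and a private bucket containing one object with a public ACL.
private = {"public": False, "objects": [
    {"key": "report.pdf", "sensitive": True, "public_acl": True},
]}
print(exposed_sensitive_objects(website))
print(exposed_sensitive_objects(private))
```

Note that an infrastructure-centric check would flag the entire website bucket (mostly noise) and miss the private bucket entirely; the data-centric check surfaces exactly the two files that matter.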

Third Party Data Access Control 

Another needed step is making sure that third parties that need access to your data have access only to what they must, as handing your data over to a third party introduces a whole new layer of security threats. We will dive into this topic in a separate post.


Many organizations focus on protecting their cloud infrastructure first. This is an important component of a comprehensive cloud security solution set, but given that an organization’s most valuable asset is its data, infrastructure security alone is not enough. True protection of sensitive data requires a dedicated data security posture management solution that can autonomously discover all data, known and shadow, whether it’s a bucket that is unintentionally set to public or the much harder to find sensitive data misplaced in buckets that are intended to be public. 

To keep up with Laminar research, see my colleague’s post on Versioning and watch our blog channel.

Three Key Takeaways in the Era of the Zettabyte (How Much Data Is That Exactly?)

The Era of the Zettabyte

Sounds so ominous, the era of the zettabyte… I mean, how big is a zettabyte anyway? A zettabyte is approximately equal to a thousand exabytes, or a billion terabytes. If you consider that a large physical library holds the equivalent of 2 terabytes of data, you can start to see how enormous a zettabyte is. It’s actually pretty incredible to think that by 2025 there will be over 100 zettabytes of data stored in the cloud. (Mind blown.)
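For the skeptical, the arithmetic checks out in a few lines of Python (unit sizes use decimal SI prefixes):

```python
# Back-of-the-envelope check on the zettabyte numbers above.
TB = 10**12                 # terabyte, in bytes
ZB = 10**21                 # zettabyte, in bytes

print(ZB // TB)             # terabytes per zettabyte: a billion

library = 2 * TB            # a large physical library ~ 2 TB of data
print(100 * ZB // library)  # libraries' worth of data in 100 zettabytes
```

So 100 zettabytes is the equivalent of roughly 50 billion large libraries, which puts "enormous" in perspective.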

Modern society continues to create and store more and more data in the cloud, leading to an uncontrolled expansion of the data attack surface. With the way the cloud has removed the traditional perimeter and created a new data attack surface, the job of securing this data becomes increasingly challenging, if not next to impossible, with legacy data security or manual methods. Security teams don’t have a big picture view of the data that exists in various cloud services, which continues to grow exponentially every second of every day.

The Era of a Cloud-Native, Data-Centric Security Model

You want to leap forward into the future? It means using modern, cloud-native approaches for data security. Legacy data security suites aren’t going to cut it because they just don’t give visibility into what sensitive and regulated data you have in the public cloud. Yikes!

As the year is quickly coming to an end and another year is almost upon us, and as data continues to proliferate exponentially in the cloud, isn’t it high time we learn how to protect our data in the cloud? 

Seeing as we just had a webinar about learning how to protect your data attack surface, I thought it would be fun to share some takeaways. Check out our 3 key takeaways from our latest webinar, “Cloud Data Security 101: Learn how to Protect Your Data in the Cloud”. 

1. Data Democratization and Risk

Data is the new fuel or, as we like to call it, the new uranium, because it is a critical element of growth. We all understand how important data is for innovation, and data democratization makes data more accessible so that everyone in an organization can work with it and run more analytics as quickly as possible.

It increases agility, promotes innovation, and speeds up delivery of products to market, but it also contributes to the rapid growth of the “data attack surface.” Developers can spin up or copy entire datastores in minutes, without ever involving security, adding immense amounts of complexity and risk.

2. Shadow Data Remediation and Reduction

Shadow data. Everyone has it. It’s the data that security teams are unaware of, hanging out and creating risk entirely unbeknownst to them. At Laminar, we’ve discovered shadow data in 100% of our customers’ object storage services.

So we all have shadow data. Now what? This is where a Data Security Posture Management (DSPM) solution comes into play. (DSPM is an emerging security category named by Gartner in its 2022 Hype Cycle for Data Security.) This category has emerged out of a direct need to discover, classify, prioritize, monitor, and protect sensitive data in the cloud without the limitations of existing approaches. A DSPM attacks the problem by first identifying all the data, known and shadow, that an organization has in the cloud.

3. Solutions to Address the Sprawl of Data We All Face: The Ominous “Data Attack Surface”

DSPM solutions, enter stage left. DSPM solutions reduce the data attack surface by discovering all of your cloud data, classifying it by sensitivity, and then providing plans to remediate or mitigate data exposure. It is an approach to cloud data security that focuses on finding and securing sensitive data rather than on cloud infrastructure, as CSPMs and CNAPPs do.

The data attack surface is a huge attack vector when it comes to the cloud. Reducing the data attack surface by focusing on the data first will allow for your company’s valuable data to be kept safe. 

Ready to learn more?

Interested in learning more about how you can protect your data in the cloud and about new and evolving security risks associated with cloud data? Then check out our latest webinar or read our ebook, Top 3 Reasons Cloud Data Security Belongs in 2023 Security Budgets!

An Introduction to DSPM: Data Security Posture Management

Why Are DSPM Solutions Important?

As the world becomes more technologically advanced, digital transformation affects how we work, communicate, entertain ourselves, socialize, buy, trade information, and so much more. Organizations have had to transform and migrate both systems and data to the cloud, so that their developers and data scientists can innovate at the speed with which users expect new services. But as data proliferates across the cloud, so too does the risk of a breach of this data. Decoupling data growth from data risk requires data security posture management (DSPM).

Legacy on-premises data security solutions can’t keep up with the speed of change and the scope of data proliferation in the cloud. If innovative organizations want to keep their data secure, a new breed of cloud-native data security solutions such as DSPM must be considered as a fundamental part of a larger comprehensive cloud data security program.

With DSPM, organizations can now see and classify all of their cloud data, known and unknown; understand what data is not adhering to their data policies and therefore presents a risk to the organization; and quickly discern how to mitigate or remove the risk presented by that data. Understanding, analyzing, and managing your risk at that level, particularly risk to your organization’s most sensitive information, will result in a considerably stronger security posture and better compliance.

What Is Data Security Posture Management (DSPM)?

In July 2022, Gartner’s Market Guide for Data Loss Prevention mentioned “DSPM” for the first time. The analyst firm subsequently included it in their 2022 Hype Cycle for Data Security. In this article, we aim to unpack the fairly new idea of DSPM, how it protects data in hyper-complex cloud environments and differs from cloud security posture management (CSPM), and what characteristics organizations should look for in a DSPM solution.

Data security posture management, sometimes known as cloud data security posture management (CDSPM), is a framework for protecting data in the cloud that gives an organization’s security and data team members enhanced insight into and the ability to manage the security posture of their cloud data. At the core of DSPM is a data-centric policy enforcement engine that:

  1. Automatically and autonomously discovers, classifies, and catalogs all data in the cloud.
  2. Analyzes data against a set of security policies and best practices (e.g., encryption, retention, etc.) and continuously monitors and alerts on policy violations.
  3. Prioritizes security and compliance issues (e.g., overexposed data, underprotected data, misplaced data, etc.) and proposes remediation recommendations for implementation.
  4. Continuously monitors cloud environments for changes, immediately discovering new data assets or posture changes to existing assets.

Thus, a DSPM solution provides continuous visibility into data, assesses data risks, determines what data violates security policies and how, prioritizes data risks, and remediates them accordingly. As a result, a DSPM can help reduce your data attack surface and facilitate compliance with regulations such as GDPR and PCI.
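The four stages of the policy enforcement engine can be sketched as a small prioritization pipeline. The scoring weights and asset fields are illustrative assumptions, not how any particular DSPM scores risk:

```python
# Sketch of a DSPM-style pipeline: find policy violations, then prioritize
# them by the sensitivity and exposure of the data they touch. Weights and
# asset fields are illustrative stand-ins.
SENSITIVITY = {"public": 0, "internal": 1, "confidential": 2, "regulated": 3}

def find_violations(asset: dict) -> list:
    issues = []
    if not asset["encrypted"]:
        issues.append({"asset": asset["name"], "issue": "unencrypted"})
    if asset["internet_exposed"]:
        issues.append({"asset": asset["name"], "issue": "overexposed"})
    # Each violation is weighted by the sensitivity and exposure of the data.
    for v in issues:
        v["score"] = SENSITIVITY[asset["sensitivity"]] * (2 if asset["internet_exposed"] else 1)
    return issues

def prioritize(assets):
    found = [v for a in assets for v in find_violations(a)]
    return sorted(found, key=lambda v: v["score"], reverse=True)

queue = prioritize([
    {"name": "logs", "sensitivity": "internal", "encrypted": False, "internet_exposed": False},
    {"name": "pci-db", "sensitivity": "regulated", "encrypted": True, "internet_exposed": True},
])
print([v["asset"] for v in queue])
```

The internet-exposed regulated store jumps to the top of the queue even though the internal log store has more basic hygiene problems, which is the data-centric prioritization the four steps describe.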

DSPM Solutions: Solving for the Complexity of Cloud

DSPM solutions address one of the core challenges of cloud: complexity.

Graphic: the proliferation of cloud data technologies

There are a multitude of cloud storage technologies available to developers and data scientists today. Many are configured differently, creating multiple architectures that constantly change. Developers can spin up or copy entire datastores in seconds, at the push of a button. Data is copied here, there, and everywhere.

The security team must ensure that data security controls are tight but don’t infringe on the free, unfettered use of data by developers and data scientists. Those security controls must also travel with the data, so that data has the proper level of protections regardless of where it goes. The answer: a DSPM solution.

DSPM enacts a set of data-centric security policies such as encryption, activity logging, environment restrictions or retention period that travel with the data. So when data moves from a production environment, which is typically the most protected, to a test or dev environment, it has the proper protections.

The DSPM solution protects the data, regardless of what infrastructure it is on. It supplies data-centric, infrastructure agnostic policies that then get automatically verified wherever that data resides. For instance, say you have Social Security numbers publicly exposed in a database hosted on an Azure VM. The data security person doesn’t even need to be aware that the VM exists. The DSPM discovers the asset, finds the sensitive data in it, and determines there is a data security policy violation. It then prioritizes the violation based on several factors, including sensitivity and risk, and engages the relevant team members to help in remediation.

How Does DSPM Differ From CSPM?

It’s important to understand the differences between the data-centric and the infrastructure-centric solutions for security posture management. CSPM technologies are designed to protect cloud infrastructures. A CSPM detects misconfigurations, vulnerabilities, and compliance violations across an organization’s cloud infrastructure and issues alerts for security teams to manage and fix.

CSPM solutions are great at protecting cloud infrastructure, but they lack data context. CSPMs don’t understand the value of the data or risk to the data that resides inside data stores. So while a CSPM can detect a critical vulnerability in an S3 storage bucket, it will lack insight as to whether that bucket contains any sensitive data, the loss of which could affect the business. Meanwhile, a less severe vulnerability that the CSPM has deprioritized may affect data stores with sensitive financial data such as credit cards, for example.
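A toy calculation shows why data context changes prioritization. The severity numbers and the "sensitive data doubles risk" weighting are illustrative assumptions:

```python
# Illustration of the point above: the same infrastructure finding ranks
# very differently once data context is added. The CVSS-style severities
# and the sensitivity weighting are illustrative stand-ins.
def data_aware_priority(infra_severity: float, contains_sensitive: bool) -> float:
    # Hypothetical blend: sensitive data doubles effective risk,
    # non-sensitive data halves it.
    return infra_severity * (2.0 if contains_sensitive else 0.5)

# A critical bug on a bucket of public marketing images...
low_value = data_aware_priority(9.8, contains_sensitive=False)
# ...versus a moderate bug on a store holding credit card data.
high_value = data_aware_priority(5.0, contains_sensitive=True)
print(low_value < high_value)
```

With data context applied, the "lesser" vulnerability on the cardholder data store outranks the critical vulnerability on the bucket of marketing images, which is exactly the reordering a CSPM alone cannot make.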

A DSPM with its data-centric view is needed to properly assess data risk and provide recommendations on the best way to protect that data. A DSPM manages the security of the data, across clouds, regardless of the infrastructure it resides on. A DSPM protects the data plane which has become overwhelmingly complex and dynamic in ever-expanding cloud environments. Data is the new currency for business and requires its own security context.

Organizations need both CSPM and DSPM solutions. They are separate but complementary technologies. When a CSPM leverages the rich data context from the DSPM, the security teams can focus on those alerts that impact highly sensitive data, thereby gaining a higher return on remediation efforts.

The two technologies cover different perspectives that are needed to effectively secure multi-cloud environments. One provides an infrastructure-centric perspective. The other provides a data-centric perspective. Both are important parts of a defense-in-depth strategy: CSPM to strengthen your infrastructure and DSPM to protect the data and reduce the blast radius of an attack.

Evaluating Data Security Posture Management (DSPM) Solutions

When evaluating tools that provide DSPM capabilities, there are several things to consider:

  • Plug and Play: there should be no need to provide a list of every data asset, its location, access credentials, or data owner. The system should discover all data autonomously; it is designed to find the unknown.
  • Continuous Automation: finding policy violations and evaluating their risk relative to other violations is not a one-time process. Nor is it something that can be done manually in the cloud. Developers and data scientists are constantly shifting the landscape of what data is where, so a DSPM needs to work continuously.
  • Comprehensive, adjustable policies: a DSPM should provide robust coverage in both the risks it identifies and the asset types it scans. It should find and alert on the full breadth of risk (overexposed data, unprotected or underprotected data, misplaced data, and unmanaged data) across a broad spectrum of CSPs, asset types, and object types. Its policies should also be customizable to meet an organization’s particular needs.
  • Guided remediation: security and IT teams have more than enough alerts, so a DSPM should not just find exposures, but also provide a full analysis of why the violation exists, evidence for its existence, and technical recommendations on how to fix it. The DSPM should also connect to your existing workflows to make this process as seamless as possible.
  • Agentless: to avoid performance impacts, and to get a complete view of all your data, look for an agentless, connector-less architecture that operates asynchronously.
  • Risk-free: a DSPM should scan your environment using serverless functions that call the cloud provider’s APIs, so your data never leaves your cloud environment.
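To make the policy criterion above concrete, here is a minimal, hypothetical sketch of how customizable policies could be evaluated against discovered asset metadata. The asset fields and policy names mirror the risk categories above but are otherwise invented.

```python
# Each record is the kind of metadata a DSPM's autonomous discovery might
# collect about a data asset; the field names here are illustrative.
ASSETS = [
    {"name": "backup-bucket", "public": True,  "encrypted": False,
     "owner": None,  "sensitive": True},
    {"name": "app-logs",      "public": False, "encrypted": True,
     "owner": "ops", "sensitive": False},
]

# Customizable policies: each maps a violation name to a predicate.
# An organization could add, remove, or tune these to its own needs.
POLICIES = {
    "overexposed":    lambda a: a["public"] and a["sensitive"],
    "underprotected": lambda a: a["sensitive"] and not a["encrypted"],
    "unmanaged":      lambda a: a["owner"] is None,
}

def evaluate(assets, policies):
    """Return {asset name: [violated policy names]} for assets with findings."""
    findings = {}
    for asset in assets:
        violated = [name for name, check in policies.items() if check(asset)]
        if violated:
            findings[asset["name"]] = violated
    return findings
```

Because the policies are plain predicates over metadata, re-running the evaluation as discovery updates the asset inventory gives the continuous, automated checking described above.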

Laminar’s Data Security Posture Management Capabilities

Laminar’s data security posture management platform provides consistent data security across Azure, AWS, Google Cloud, and Snowflake, offering uniform security and governance for multi-cloud environments.

When you choose DSPM from Laminar, you can customize your own data security policies or rely on our robust, pre-written policies covering overexposed data, underprotected or unprotected data, misplaced data, and unmanaged data. Our solution then continuously prioritizes policy violations with no effort on your part and displays them in easy-to-read dashboards. Laminar also provides technical remediation guidance, connects with ticketing and workflow systems for a seamless remediation experience, and sends alerts to SIEM and CSPM tools for advanced correlation.

Laminar’s industry-leading technology integrates into your cloud infrastructure. Because Laminar’s discovery engine is embedded within your cloud environment, data never leaves it. Laminar scans your organization’s cloud using serverless functions that make use of the cloud service provider’s APIs. Furthermore, Laminar is easy to install using cloud-native tools, does not need an agent, and has no effect on performance.

Want to see Laminar’s Cloud Data Security Platform in Action? Request a demo today.

Author Bio: 

Andy Smith is the Chief Marketing Officer at Laminar with over 30 years of experience in Silicon Valley and over 20 years in identity management and solutions for cybersecurity.