Cloud Data Security Requires 20/20 Vision

No reasonable business leader would ever dream of leaving their logistics software unmanaged or their sales departments to their own devices. Visibility into every aspect of a business—every corner, no matter how large or small—is critical to the success of any operation. A lack of visibility leaves businesses open to risks in the form of theft, inefficiency, customer dissatisfaction and more.

As the business world continues to thrive on big data—and with more of that data stored in the cloud—visibility into a company’s data is undeniably important.

Data visibility into a self-contained, on-prem system is one thing, but that structure is hard to come by these days. Most modern businesses rely on the cloud to improve flexibility, to increase scalability, and to execute tasks quickly and effortlessly.

The cloud allows businesses to work efficiently from anywhere at any time, but that greater access also drives higher levels of risk. With the increased pace of change and the sprawl of new cloud technology, an organization’s data ends up spread across many places, leaving some of it more or less invisible in a “dark corner.”

Many large brands have already come face to face with this reality. Earlier this year, SEGA Europe sustained a massive data breach after someone inadvertently stored sensitive files in a publicly accessible AWS S3 bucket. Similarly, a “glitch” caused some Twitter users’ personal information and passwords to be stored in readable text on the company’s internal systems rather than masked by its hashing process. The breaches of these two shadow environments show how a small mistake can lead to public scrutiny and damage a brand.

Ignorance is (Not) Bliss

Some may argue that data visibility before the cloud was mediocre at best, often undermined by poor employee security awareness and inconsistent data protection policies. The introduction of cloud technology magnified that problem and helped create the ever-growing data breach culture we see today.

One of the biggest factors contributing to data breach culture is the sheer absence of comprehensive data visibility. It has almost become an inevitable outcome—the price of admission, so to speak—that an organization can’t know what’s going on with every piece of data. Many professionals have accepted that conclusion as fact.

Often referred to as “shadow data,” these hidden sensitive files and programs arise when data is copied, backed up or housed in a data store that is neither governed by the same security controls nor kept up to date. What some have simply accepted as the cost of doing business is turning out to be one of the largest threats to data security.

Shadow data is primarily the result of four main changes to data culture: the proliferation of technology and its associated complexity, the limited bandwidth of data protection teams struggling to keep up, the democratization of data and the removal of on-prem perimeters.

What Lurks in the Shadows?

While hidden data can result from several different situations, it typically occurs when sensitive data—customer information, employee information, financial data, applications, intellectual property, etc.—is copied in an unsanctioned way. When data is copied and stored in a way that makes the files or programs invisible to a data protection team, those assets are unsecured and unmanageable with most modern security tools. Below are a few examples of how shadow data comes about:

  • S3 Backups: Almost every modern business has at least one backup data store that serves as a contingency plan in case of a breach or damage to its production environment. The backup data store is meant to keep exact copies of production data for emergencies. However, these stores are often left unmonitored and can mistakenly expose large amounts of data to the public, as in the SEGA Europe example (a minimal audit sketch follows this list).
  • Leftover Data from Cloud Migration: As organizations move to the cloud, they often run “lift and shift” data migration projects, but too often the original data never gets deleted. This lingering data remains unmanaged, unmaintained and often forgotten, which can lead to vulnerabilities down the line.
  • Test Environment: Most organizations have a partial copy of their production or RDS database in a development or test environment where developers are building applications and testing programs. Often, developers need to move quickly and may take a snapshot of some data but fail to properly remove or secure the copied data—or they simply forget about it.
  • Toxic Data Logs: When developers or logging frameworks mistakenly write actual sensitive data into log files, the result is a “toxic” data log. For example, naming log files with a user’s email address exposes PII in violation of policy.
  • Analytics Pipeline: Many companies store data in some type of analytics pipeline, using tools like Snowflake, because it improves retrieval speed and makes the data easier to manipulate and analyze. However, analytics pipelines are typically unmonitored by most security solutions today.
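To make the S3 backup scenario concrete, below is a minimal audit sketch that checks whether the buckets in an account have a fully enabled public access block, the failure mode behind exposures like the SEGA Europe example. It assumes boto3 and read-only AWS credentials, and the report format is illustrative rather than any particular vendor's tooling.

```python
# A minimal audit sketch, assuming boto3 and read-only AWS credentials.
# The report format is illustrative, not any vendor's actual tooling.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def bucket_may_be_public(bucket_name: str) -> bool:
    """Return True if the bucket lacks a fully enabled Public Access Block."""
    try:
        config = s3.get_public_access_block(Bucket=bucket_name)[
            "PublicAccessBlockConfiguration"
        ]
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            return True  # No block configured at all -- worth a closer look.
        raise
    # All four settings must be True for the bucket to be considered locked down.
    return not all(config.values())

if __name__ == "__main__":
    for bucket in s3.list_buckets()["Buckets"]:
        if bucket_may_be_public(bucket["Name"]):
            print(f"Review needed: {bucket['Name']} may allow public access")
```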

Turning the Lights On

Shining a light into these “dark corners” of a business’s data stores can help thwart data breaches and other inadvertent vulnerabilities. Yes, it’s necessary for modern organizations to enable their employees to move at the speed of the cloud, but that doesn’t mean security has to play second fiddle. Shadow data will occur, but the beauty of modern technology is that new solutions and approaches to decades-old challenges emerge every day.

These solutions work continuously to discover and classify data, automatically detecting data stores and assets by scanning the entire cloud environment and revealing what lies in the shadows. Once the data is scanned, they categorize and classify files and programs and apply sanctioned data security policies, giving security teams the complete visibility and automated monitoring they need to manage all of a company’s assets effectively.
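As a rough illustration of the classification step, here is a minimal sketch that scans a blob of text for a few common sensitive-data patterns. The patterns and labels are illustrative only; real classifiers handle far more data types and file formats.

```python
# A minimal classification sketch, assuming text content has already been
# pulled from a discovered data store. Patterns and labels are illustrative;
# production classifiers cover far more data types and file formats.
import re

PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(text: str) -> set[str]:
    """Return the set of sensitive-data labels detected in a blob of text."""
    return {label for label, pattern in PATTERNS.items() if pattern.search(text)}

sample = "Contact jane.doe@example.com, SSN 123-45-6789"
print(classify(sample))  # {'email', 'us_ssn'} (set order may vary)
```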

The number of breaches occurring “in the shadows” today should be enough for any business leader to reevaluate their approach to cloud security. Do they know where their sensitive data lives, and do they have the tools and resources to manage it? Full data observability lets businesses understand where their shadow data stores are, what their security posture is and who owns them. That, in turn, lets data flow smoothly and safely and positions the business to thrive in a fast-moving, cloud-first world.


Public Cloud Data Protection Needs a New Approach. Here’s Why

When Amit and I began our Laminar journey together, we asked ourselves: What is the biggest problem in the data security space today? What will hold back valuable data innovations? The answer was clear: data breaches. We immediately knew that if we could solve this, it would make a major impact on CISOs and data protection teams — and on companies’ success overall. Data is no longer a commodity; it is a currency, and it is as valuable to attackers as it is to the business. The result: the data breach culture.

To solve this, we went back to first principles. How do data breaches occur today? We compiled a list of dozens of recent, major breaches, and a pattern immediately emerged – they nearly all originated from public cloud infrastructure.

We knew there was a problem, and so we knew there must be a solution. But what are the basic requirements for data protection in this new cloud infrastructure environment?

First, we have to be cloud-native. And by “we,” we mean the company and its culture, not only the solution. If we are to solve problems that are cloud-native, we must be cloud-native ourselves.

Second, data protection teams are almost blind when it comes to data residing in public cloud infrastructure. Therefore, our solution must start by integrating with the public cloud itself, using a modern, agentless approach, and identify where data resides and which types of data live there. This way, you can focus on what matters most.

Third, we knew that when it comes to cloud infrastructure that drives and powers the business, you simply cannot slow it down or disrupt it. The solution must not impact performance in any way, and it must not be in-line. Proxies are a big no-go, as are solutions that use the same data plane as the workloads themselves.

Finally, we must take a zero-bullshit approach. We despise FUD. Everything we do must tie back to real-life scenarios and to real, important actions that data security teams can take to better protect their data.

So with that in mind, we assembled not just a team, but the perfect team. We’re a bunch of Capture the Flag (CTF) players, kernel hackers, vulnerability researchers and experienced engineers. What unifies us is that we truly believe anything is possible.

The solution: Discover and Classify, Secure and Control, Detect Leaks and Remediate

Our solution is easily deployed by integrating with the cloud provider account. Once this is done, our platform immediately starts looking for data assets, weak spots and potential risks. Actionable, crisp and detailed information is delivered to you within minutes.

Discover and Classify: understand your data

Our technology first identifies resources that potentially store sensitive data. Then it inspects an ephemeral snapshot of the data in order to understand and classify what’s inside. Here, the devil is in the details – we developed a novel technology that allows us to do this whether the data asset is managed by the cloud provider or not, whether it is a “formal” data store or only a shadow copy, and whether it’s public or isolated. Customers told us they need a complete and accurate view. They didn’t want us to stop at just managed data stores like S3 buckets. We needed to enable observability for an entire cloud account.
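To give a feel for the general shape of agentless, API-driven discovery (this is an illustration of the approach, not our actual implementation), the sketch below enumerates a few common AWS data stores with read-only calls. Coverage is deliberately limited to S3, RDS and EBS for brevity.

```python
# A rough illustration of agentless discovery via read-only cloud APIs,
# limited to S3, RDS and EBS for brevity. This is a sketch of the general
# approach, not Laminar's actual implementation.
import boto3

def discover_data_assets(region: str = "us-east-1") -> dict:
    """Collect identifiers of resources in one region that may hold data."""
    session = boto3.Session(region_name=region)
    s3 = session.client("s3")
    rds = session.client("rds")
    ec2 = session.client("ec2")
    return {
        "s3_buckets": [b["Name"] for b in s3.list_buckets()["Buckets"]],
        "rds_instances": [
            db["DBInstanceIdentifier"]
            for db in rds.describe_db_instances()["DBInstances"]
        ],
        "ebs_volumes": [v["VolumeId"] for v in ec2.describe_volumes()["Volumes"]],
    }

if __name__ == "__main__":
    for kind, assets in discover_data_assets().items():
        print(f"{kind}: {len(assets)} found")
```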

Secure and Control: resolve issues around controls, blast radius and data segmentation

Once our platform identifies the data that is at risk, it gathers all the metadata about it: How is it configured? What are the access controls? How segmented is the data? Is there a shadow copy of it anywhere? Where does the data originate, and where is it flowing? And so on.
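As a simplified example of the kind of metadata in question, the sketch below pulls a tiny posture summary for a single S3 bucket (encryption at rest and public-policy status) using read-only boto3 calls. The bucket name is hypothetical, and this is an illustration of the idea rather than our production logic.

```python
# A simplified posture check for a single S3 bucket, assuming boto3 and
# read-only credentials. It covers only encryption at rest and public-policy
# status, a small subset of a real review; the bucket name is hypothetical.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def bucket_posture(bucket: str) -> dict:
    """Return a small posture summary: encryption at rest and public status."""
    try:
        s3.get_bucket_encryption(Bucket=bucket)
        encrypted = True
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            encrypted = False
        else:
            raise
    try:
        is_public = s3.get_bucket_policy_status(Bucket=bucket)["PolicyStatus"]["IsPublic"]
    except ClientError:
        is_public = False  # No bucket policy (or not readable); not public via policy.
    return {"bucket": bucket, "encrypted_at_rest": encrypted, "public_policy": is_public}

print(bucket_posture("example-backup-bucket"))
```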

By analyzing the metadata gathered this way and by following rules and policies built by our Laminar Labs team, our technology determines the most critical issues to tackle and delivers them to the user in a precise, actionable way. Every issue is resolvable, period.

Detect Leaks and Remediate: monitor and block unwanted data access

The last line of defense is to monitor data access and block anything unsanctioned. Our platform consumes audit logs, network flow logs and other data sources in order to build a profile for each and every data access. If and when an unsanctioned access is identified, the platform blocks it through our cloud provider integration.
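As a simplified illustration of the monitoring idea (not our detection or blocking logic), the sketch below scans recent CloudTrail management events for S3 activity and flags principals that are not on a sanctioned list. The allowlist is hypothetical and alerting is just a print; actual blocking would involve further policy changes through the cloud provider.

```python
# A simplified monitoring sketch: scan recent CloudTrail management events
# for S3 activity and flag principals that are not on a sanctioned list.
# The allowlist is hypothetical, and alerting is just a print.
from datetime import datetime, timedelta, timezone

import boto3

SANCTIONED_PRINCIPALS = {"backup-service", "analytics-pipeline"}  # hypothetical

cloudtrail = boto3.client("cloudtrail")

def flag_unsanctioned_s3_activity(hours: int = 24) -> None:
    """Print any recent S3-related CloudTrail events from unexpected principals."""
    start = datetime.now(timezone.utc) - timedelta(hours=hours)
    events = cloudtrail.lookup_events(
        LookupAttributes=[
            {"AttributeKey": "EventSource", "AttributeValue": "s3.amazonaws.com"}
        ],
        StartTime=start,
    )["Events"]
    for event in events:
        user = event.get("Username", "unknown")
        if user not in SANCTIONED_PRINCIPALS:
            print(f"Unsanctioned S3 activity: {event['EventName']} by {user} at {event['EventTime']}")

if __name__ == "__main__":
    flag_unsanctioned_s3_activity()
```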

The journey is only beginning

At Laminar, we’re on a mission to make data safe in the cloud.

We have built novel technology for a difficult problem, enabling continuous monitoring and control of your sensitive data in the public cloud. But that’s only the start: from here, we will be constantly listening to and learning from our customers to make our product even better.

If you’re an awesome engineer, and you’re inspired by solving the biggest problem in cybersecurity today, you should join us! Visit our careers page!