The past couple of years have been tragic and challenging as the world responded to COVID-19. One positive side effect of the pandemic, however, has been the momentum it gave digital transformation, and the shift it has enabled for data security teams to move from being gatekeepers of data to enablers.
Digital transformation was already in motion for many organizations, but the overnight shift to a completely remote business model in response to the pandemic forced organizations to adapt quickly. Those that were already engaged in digital transformation projects drastically accelerated their efforts, and companies that had not yet embarked on the journey quickly found it to be imperative.
When everyone is remote and accessing company applications and data over the Internet from their living rooms, automation and SaaS (software-as-a-service) applications become essential for staying connected and productive. Organizations that succeed in embracing digital transformation can operate at the speed of the cloud.
The real advantage, however, goes to the businesses that succeed in maximizing the value of their data. That is easier said than done: protecting data often conflicts with fully leveraging it. The trick is to provide effective data security without impeding access to the data.
Legacy Data Protection
Legacy approaches haven’t always balanced data access with security. Historically, most data security teams had to enforce governance. As gatekeepers, their primary job was to restrict access and prevent unauthorized use of data.
Data governance has largely been implemented through the creation of data-specific policies. The data security teams crafted the policies that the rest of the organization was expected to follow, essentially specifying what type of security should be in place and how it should be implemented. For example, policies would ensure that sensitive or restricted data is encrypted and that access is limited to those with a verified need.
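A policy like the one described, encryption required for restricted data plus role-limited access, can be expressed as a simple access check. The sketch below is illustrative only: the classification levels, role names, and policy table are hypothetical, not drawn from any particular product.

```python
from dataclasses import dataclass

@dataclass
class DataStore:
    name: str
    classification: str  # hypothetical levels: "public", "internal", "restricted"
    encrypted: bool

# Hypothetical policy table: restricted data must be encrypted and is
# limited to roles with a verified need.
POLICY = {
    "restricted": {"require_encryption": True, "allowed_roles": {"analyst", "dba"}},
    "internal": {"require_encryption": False, "allowed_roles": {"analyst", "dba", "developer"}},
    "public": {"require_encryption": False, "allowed_roles": None},  # None = anyone
}

def access_allowed(store: DataStore, role: str) -> bool:
    """Allow access only if the store satisfies policy and the role is permitted."""
    rule = POLICY[store.classification]
    if rule["require_encryption"] and not store.encrypted:
        return False
    allowed = rule["allowed_roles"]
    return allowed is None or role in allowed

crm = DataStore("crm_db", "restricted", encrypted=True)
print(access_allowed(crm, "analyst"))    # True
print(access_allowed(crm, "developer"))  # False
```

In the legacy model, checks like this were enforced by people in review meetings rather than by code, which is exactly why they were easy to circumvent.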
The data security team would then act as the data "police," ensuring that everyone adhered to the policies. This process often created friction: users would ignore or circumvent data policies that got in the way of getting things done, and the data security teams grew frustrated that nobody seemed to care about the policies.
This legacy approach to managing data leaned too far in the direction of security for the sake of security, while getting in the way of putting the data to actual use.
Cloud Data Security is a Tipping Point
The traditional approach to data security has not kept pace with the scale of the attack surface or the speed of the cloud threat landscape. Security teams had more control in legacy environments. Developers who wanted to create a new data store for an application had to ask permission to do so. There would be a design review meeting to approve the request, and a DBA would have to set up and configure the database. This allowed the security team to establish policies and provided a natural opportunity to act as gatekeepers of the data.
That won’t work in a cloud-native world. With cloud platforms and DevOps principles, developers have all the power now. They can spin up a database in a heartbeat and don’t have to ask permission to do it, so the security team no longer has the chance to ask important questions and implement policies that can protect the data.
Digital transformation both enables and requires a cloud-first approach to managing data, and in the cloud, data security versus data utility is not a binary decision. Without the right automated, intelligent controls, data security will be circumvented more often than not. The cloud has brought the tension between security and utility to a tipping point.
Gatekeepers to Gate-Openers
Digital transformation and cloud-native applications are huge advantages for the business, but the approach to data security has to adapt as well. The question now is, “How can you enable developers to move at the speed of the cloud, but do it with guardrails that protect the data?”
Cloud data security is focused on protecting data from breaches and compromises while also empowering users to get things done. The goal is to transform your data security team from gatekeepers to enablers who are capable of performing at the speed of the cloud.
To make this approach work, it's imperative for cloud data security teams to understand where the sensitive data is and who has access to it. The cloud data security team also needs to know the overall security posture of that data, where the data "flows," and how it is being used. These capabilities are the baseline for a new security concept: cloud-native data security.
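The two baseline questions, where is the sensitive data and who can reach it, amount to maintaining an inventory and querying it. A minimal sketch, with entirely hypothetical store and accessor names, might look like this:

```python
# Hypothetical inventory of data stores, their sensitivity, and their accessors.
# In practice this would be populated by automated discovery, not by hand.
inventory = [
    {"store": "orders_db", "contains_pii": True, "accessors": {"billing-svc", "analyst"}},
    {"store": "web_logs", "contains_pii": False, "accessors": {"ops"}},
    {"store": "hr_bucket", "contains_pii": True, "accessors": {"hr-app", "backup-svc", "contractor"}},
]

def sensitive_exposure(inv):
    """Map each store holding sensitive data to the set of identities that can access it."""
    return {item["store"]: item["accessors"] for item in inv if item["contains_pii"]}

print(sorted(sensitive_exposure(inventory)))  # ['hr_bucket', 'orders_db']
```

The value is not in the data structure but in keeping it current automatically, since in the cloud, stores like the hypothetical `hr_bucket` appear and change without a review meeting.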
Cloud Data Security Unleashes Data
Cloud-native data security has to accommodate the dynamic nature of the cloud. For example, data security has to shift to a “trust but verify” model that allows for increased agility and enables access to data while still protecting it. A “trust but verify” approach allows the security team to maintain visibility and control of data, while simultaneously unleashing the data and allowing business to move as fast as it needs to.
Cloud-native data security must include automated, continuous monitoring to autonomously discover and track where your managed and unmanaged data is stored. It must automate the classification of sensitive data to enable granular security policies. Finally, it needs cloud monitoring to verify that policies are being followed.
Data security managers never wanted to "police" data policy and access. They have always strived to make data available, enabling users to access and get value from it while safeguarding its security and integrity at the same time. Digital transformation, and the advent of cloud-native security solutions that can continuously and autonomously verify policy and access, finally gives them the tools to be data enablers rather than gatekeepers.