Data Classification has been a linchpin of Data Governance. After all, the US DoD learned that "protect everything at the maximum level" is cost-prohibitive and can also slow "time to mission". Industry can (and has) drawn many lessons from that scenario. That said, a good Data Governance model should start with the corporate mission, goals, and objectives, and flow down from there to policy. This survey report suggests that organizations need to review their policies; that is, add Data Classification to their Data Governance policy. You have to find ALL of the data, and that takes time (and iteration). After all, how can you protect data you haven't found? Find the data, then classify it, basing the classification on the model that best aligns with your corporate mission. While you are at it, figure out the likelihood and impact if a given piece of data were to be leaked, locked (by ransomware), or lost (the worst case). "Asset tagging" is another term used loosely in these Circles (pardon the pun).
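The find → classify → assess loop above can be sketched in a few lines of code. This is a minimal illustration only; the asset names, classification tiers, keyword rules, and likelihood-times-impact scoring below are my own assumptions, not any particular classification model or product.

```python
# Minimal sketch of the find -> classify -> assess workflow.
# Tiers, keyword rules, and scoring are illustrative assumptions.

LEVELS = ["Public", "Internal", "Confidential", "Restricted"]

def classify(asset):
    """Assign a classification tier from simple keyword rules on the name."""
    name = asset["name"].lower()
    if "financial" in name or "pii" in name:
        return "Restricted"
    if "internal" in name:
        return "Internal"
    return "Public"

def risk_score(likelihood, impact):
    """Likelihood and impact on a 1-5 scale; score is their product."""
    return likelihood * impact

# The "find the data" step is represented here by a discovered inventory.
inventory = [
    {"name": "Q3-financials.xlsx", "likelihood": 2, "impact": 5},
    {"name": "internal-wiki-dump.zip", "likelihood": 3, "impact": 2},
]

for asset in inventory:
    asset["classification"] = classify(asset)
    asset["risk"] = risk_score(asset["likelihood"], asset["impact"])
```

In practice the inventory comes from automated discovery scans rather than a hand-written list, and classification rules would align with the organization's own model, but the iterate-and-tag shape stays the same.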
A proper Data Classification flows right into Data Loss Prevention (DLP), IMHO.
Data loss has been an issue since the advent of distributed (client-server/multi-tier) computing. The move away from centralized computing (read: System/36 to OS/390) was initially driven by perceived cost savings. Those savings may have been realized, but at the cost of data security.
We are coming full circle as we return to the "mainframe in the sky", hereafter referred to as "the Cloud". Yes, Cloud is the third wave of computing (first wave: mainframe; second: client/server). The promise of efficiency (read: cost savings) pushed industry to each new wave; data security was an afterthought.
Unlike the mainframes of lore, our initial migrations to Cloud were not driven with data protection in mind.
The findings of this survey confirm that DLP is a major concern for enterprises.
Modern-day DLP mechanisms aren't without their foibles either: they can easily cripple data flow. EVERY DLP implementation (whether on-prem or Cloud-based) needs to be deployed methodically. These DLPs are usually installed with a default "Allow All" policy so they don't disrupt business operations; the key lies in tightening down, first at the policy level and then at enforcement. To wit, if an enterprise's InfoSec crew tightens controls on the company's financials "too much too soon", the accounting department won't be able to complete its tasks in a timely manner (read: pay employees, pay vendors, or provide charting for the C-suite). These are just a few financial examples of DLP over-tightening.
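The staged tightening described above can be sketched as follows. The `Rule` and `DLPPolicy` classes, the rule names, and the monitor/block modes are hypothetical stand-ins, not any real DLP product's API; the point is the pattern of promoting one rule at a time from monitor-only to enforcement.

```python
# Sketch of staged DLP tightening: start monitor-only ("Allow All" in effect),
# then promote individual rules to blocking after reviewing their match logs.
# Class and rule names are hypothetical, not a real product's API.

from dataclasses import dataclass, field

@dataclass
class Rule:
    name: str
    mode: str = "monitor"   # "monitor" only logs matches; "block" enforces

@dataclass
class DLPPolicy:
    rules: list = field(default_factory=list)

    def tighten(self, rule_name):
        """Promote one rule from monitor to block, ideally only after its
        match logs show no false positives against legitimate workflows."""
        for rule in self.rules:
            if rule.name == rule_name:
                rule.mode = "block"

policy = DLPPolicy(rules=[Rule("financial-exports"), Rule("pii-uploads")])
policy.tighten("financial-exports")   # tightened only after log review
```

Starting every rule in monitor mode is what keeps accounting (and everyone else) working while the InfoSec crew learns what normal data flow actually looks like.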
That most organizations currently in the Cloud worry about data loss only amplifies the need for the methodical approach to DLP tightening described above. Why Data Governance isn't a key pillar of every organization's Risk Management model (and policy) remains a mystery.
Thus, any org's journey toward an effective DLP is never-ending (I prefer the term "ever evolving").
As many of our colleagues know, DLP is Key.
Shamun Mahmud CCSKv3
Regional Sales Manager - US West
Tego Cyber Inc
Sent: Jun 22, 2022 04:52:34 AM
From: Orbert Reavis
Subject: Risk Management with Google
New survey report released today! In collaboration with Google Cloud, the survey report, Measuring Risk and Risk Governance, provides a deeper understanding of public cloud adoption and risk management practices within the enterprise.
Among the survey's key findings:
- There is no consistency of data classification across the use of cloud platforms and services - only 21 percent of users are utilizing cloud service data classification.
- More than half (52%) of organizations reported that they did not evaluate the risk of their cloud services being used after procurement as product features or business environments changed.
Download the survey report here → https://cloudsecurityalliance.org/artifacts/measuring-risk-and-risk-governance/