Data Security – Scale and Bees
AUG, 2019
by Niall Crawford.
Niall is the Co-Founder and CIO of Enov8. He has 25 years of experience across the IT industry, spanning Software Engineering, Architecture, IT & Test Environment Management, and Executive Leadership. Niall has worked with, and advised, many global organisations across verticals such as Banking, Defence, Telecom and Information Technology Services.
Data security: the problem is scale & a lack of bees
One of the biggest challenges of securing one’s enterprise data is the sheer volume.
Think about it. Hundreds (perhaps thousands) of applications, thousands (perhaps tens of thousands) of instances across development and test, and within each, millions of data points, many of which contain PII (Personally Identifiable Information).
Enov8 Test Data Manager
*aka ‘Data Compliance Suite’
The Data Securitization and Test Data Management platform. DevSecOps your Test Data & Privacy Risks.
Sounds scary, huh?
And then, even if you know what to secure (which is a rather big “if”), and independent of which expensive masking tools you have (IBM Optim, Informatica, Compuware, CA, etc.), there is the task of building the remediation scripts, which typically takes months (8-12 weeks) per platform and is often prone to errors and omissions, and finally the task of executing them.
These tasks are usually done by a centralized team of data “experts”, with a single TDM tool, and delivered in a sequential fashion.
Do the Maths!
The “Small Bank of Narnia” with 100 key platforms would take 16* years to be compliant.
*100 Platforms x 2 months / 12 (months in a year)
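That back-of-the-envelope figure is easy to check. (The 100-platform count and 2-months-per-platform effort are the illustrative numbers above, not measured data.)

```python
# Back-of-the-envelope: serial masking effort for the "Small Bank of Narnia".
platforms = 100          # key platforms to remediate
months_per_platform = 2  # a typical 8-12 week scripting effort per platform

total_months = platforms * months_per_platform
total_years = total_months / 12

print(f"{total_months} months, roughly {total_years:.1f} years")  # 200 months, roughly 16.7 years
```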
Or, more likely, simply due to a lack of “do-ability”, the organization will just do half a dozen important platforms and hope audit, compliance and/or the regulators don’t notice.
Centralization is Bad
However, the problem here is not just scale.
The biggest issue is the inability to parallelize (federate) the effort.
Imagine each of the 100 platform teams/tribes could do the masking themselves, each having:
- The skills & method to Understand Data
- The skills & method to accurately remediate the Data
- The technology to execute these exercises in Parallel
Well then, one might say, optimistically, that the task would go from 16 years to, say, 6 months.
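A sketch of where a six-month figure could come from (illustrative numbers only, not benchmarks): once the work is federated, elapsed time is bounded by the slowest single team, not by the sum of all platforms.

```python
# Serial vs federated elapsed time for the same 100 platforms.
platforms = 100
months_per_platform = 2   # centralized expert-team estimate, per platform

# Serial: one platform at a time, so elapsed time is the sum of all of them.
serial_years = platforms * months_per_platform / 12

# Federated: all teams work concurrently, so elapsed time is roughly one
# platform's effort plus ramp-up, reviews and stragglers (assumed overhead).
federated_months = months_per_platform + 4

print(f"serial: ~{serial_years:.1f} years, federated: ~{federated_months} months")
```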
Our Eureka Moment
These somewhat “obvious” observations led to our Eureka moment and the design of Enov8 TDM (aka the Data Compliance Suite).
DCS was designed and built to go “against the grain” of traditional TDM tools and methods, delivering four key things:
- Simplicity of Use
No need for experts. Promoting the opportunity for tribes/teams to “do it themselves”.
- Hands-Off
Encouraging automation of historically manual (or semi-manual) data security tasks.
- Parallel Data Ops
Promoting the ability to do profiling, masking and validation in parallel.
- Enterprise Visibility
Providing an enterprise view of coverage & compliance (as opposed to traditional blind spots).
Our Architecture
Enov8 DCS is a new generation Test Data Management / Data Compliance Solution that was built from the ground up to address the needs of both Technical (engineering) & Non-Technical (audit & compliance) staff alike.
Designed with a pleasant front-end and “guard-rail like” navigation, DCS takes users through a best-practice data securitization journey, which includes:
- Use of “automated intelligence” to understand your Data & Identify Risks.
- Automatic (on the fly) build of masking or encryption scripts.
Yep, no more centralized team taking 8 weeks to engineer often error-prone solutions.
- Ease of execution, both just-in-time & scheduled.
- Automatic testing (validation) that data is compliant and void of production PII.
- Delivery of Compliance Dashboards & Reporting showing coverage and status.
Giving Compliance, Security & Audit comfort that IT is moving in the right direction.
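To make the masking-script idea concrete, here is a minimal sketch of deterministic field-level masking in Python. This is illustrative only, not Enov8’s implementation; the column names, salt and hashing scheme are all assumptions.

```python
import hashlib

# Columns assumed to contain PII. In a real tool these would be discovered
# by automated profiling rather than hard-coded.
PII_COLUMNS = {"name", "email", "phone"}

def mask_value(value: str, salt: str = "demo-salt") -> str:
    """Deterministically mask a value: the same input always yields the
    same token, so referential integrity across tables is preserved."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return f"MASKED_{digest[:10]}"

def mask_row(row: dict) -> dict:
    """Mask only the PII columns, leaving other fields untouched."""
    return {k: mask_value(v) if k in PII_COLUMNS else v for k, v in row.items()}

row = {"id": "42", "name": "Ada Lovelace", "email": "ada@example.com"}
masked = mask_row(row)
print(masked["id"])    # unchanged: 42
print(masked["name"])  # a deterministic MASKED_... token
```

Determinism matters here: if the same customer appears in two tables, both copies mask to the same token, so joins in test environments still work.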
And for the more technically minded:
- Use of “Worker Bees” to spread DataOps load across the network
No need to wait for one application to finish masking before you start the next. The Enov8 Worker Bees (Battle Bees) can execute hundreds of Data Operations in parallel. Worker Bees can be placed anywhere on your Network (e.g. across platforms, subnets and clouds) to leverage parallel processing and reduce latency and storage transfer costs.
- Provision of a REST API & webhooks so compliance can be added to your delivery toolchain.
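The “Worker Bee” pattern of spreading independent data operations across many workers can be sketched with a simple thread pool. The platform list and `mask_platform` function below are hypothetical placeholders for illustration, not Enov8’s actual API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical masking jobs, one per platform. In a real deployment these
# would be dispatched to worker agents sitting near the data (across
# subnets and clouds), not run as local threads.
PLATFORMS = [f"platform-{i:02d}" for i in range(1, 9)]

def mask_platform(name: str) -> str:
    # Placeholder for a profile -> mask -> validate pipeline per platform.
    return f"{name}: masked and validated"

# Run jobs concurrently instead of one platform at a time.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(mask_platform, PLATFORMS))

for line in results:
    print(line)
```

`pool.map` preserves input order, so the report reads platform by platform even though the jobs overlap in time; the elapsed time approaches that of the slowest job rather than the sum of all jobs.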
To Summarize
In the “good old days” we all had a single team of “subject matter experts” to mask data. And in a company with a handful of platforms, that would probably still work. However, organizations’ IT & test environments are complicated nowadays. Today, even medium-sized organizations can have hundreds of data platforms holding gigabytes or terabytes of data. If your organization wants to be “truly” compliant, you need to move away from traditionally centralized and serial methods. It is time to automate, federate and parallelize your Data Ops.
Learn more about Enov8 TDM – Datasheets.
Other TDM Reading
Enjoy what you read? Here are a few more TDM articles that you might find interesting.
Enov8 Blog: What’s Data Friction from the Perspective of TDM
Enov8 Blog: A DevOps Approach to Test Data Management
Enov8 Blog: What is Data Masking? And how do we do it?