Data Security (Scale and Bees)

25 August 2019, by Jane Temov

Data security: the problem is scale & a lack of bees

One of the biggest challenges in securing enterprise data is the sheer volume. Think about it: hundreds (perhaps thousands) of applications, thousands (perhaps tens of thousands) of instances across development and test, and within each, millions of data points, many of which contain PII (Personally Identifiable Information).

Sounds scary, huh?

And then, even if you know what to secure (which is a rather big “if”), and independent of which expensive masking tools you have (IBM Optim, Informatica, Compuware, CA, etc.), there is the task of building the remediation scripts, and finally executing them. Building those scripts typically takes months (8 to 12 weeks) per platform and is often prone to errors and omissions. And it is a set of tasks usually done by a centralized team of data “experts”, with a single TDM tool, delivered in a sequential fashion.

Do the Maths!

The “Small Bank of Narnia”, with 100 key platforms, would take roughly 16 years to be compliant: 100 platforms x 2 months per platform / 12 (months in a year). Or, more likely, simply due to “do-ability” (or the lack of it), the organization will just do the half-dozen most important platforms and hope audit, compliance and/or the regulators don’t notice.
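
A quick back-of-the-envelope check of that figure, using only the numbers from the article (roughly 2 months of remediation scripting per platform, one central team working serially):

```python
# Back-of-the-envelope: serial remediation timeline for the "Small Bank of Narnia".
platforms = 100          # key platforms to secure
months_per_platform = 2  # roughly the 8-12 weeks of scripting cited above

serial_years = platforms * months_per_platform / 12  # one team, one platform at a time
print(f"Serial effort: ~{serial_years:.1f} years")   # -> Serial effort: ~16.7 years
```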

Centralization is Bad

However, the problem here is not just scale. The biggest issue is the inability to parallelize (federate) the effort. Imagine each of the 100 platform teams/tribes had what they needed to do the masking themselves:
  • The skills & method to understand the data
  • The skills & method to accurately remediate the data
  • The technology to execute these exercises in parallel
Then one might say, optimistically, that the task would go from 16 years to, say, 6 months.
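
Extending the same maths to the federated case: with one team per platform working in parallel, the wall-clock time collapses to roughly the duration of a single platform. (Reading the article’s 6-month figure as that duration plus a buffer for ramp-up and coordination is our assumption.)

```python
import math

# Federated estimate: every platform team masks its own data in parallel.
platforms = 100
months_per_platform = 2
teams = 100  # one team/tribe per platform

waves = math.ceil(platforms / teams)             # rounds of work needed
wall_clock_months = waves * months_per_platform  # -> 2 months of pure effort
print(f"Parallel wall-clock: ~{wall_clock_months} months")
# The article's ~6 months leaves headroom for team ramp-up and coordination
# (our assumption; the source just says "optimistically ... 6 months").
```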

Our Eureka Moment

These somewhat “obvious” observations led to our Eureka moment and the design of our Data Compliance Suite (DCS). DCS was designed and built to go “against the grain” of traditional TDM tools and methods and deliver four key things:
  1. Simplicity of Use
No need for experts, promoting the opportunity for tribes/teams to “do it themselves”.
  2. Hands-Off
Encouraging automation of historically manual (or semi-manual) data security tasks.
  3. Parallel Data Ops
Promoting the ability to do profiling, masking and validation in parallel.
  4. Enterprise Visibility
Providing an enterprise view of coverage & compliance (as opposed to traditional blind spots).

Our Architecture

Enov8 DCS is a new-generation Test Data Management / Data Compliance solution that was built from the ground up to address the needs of both technical (engineering) and non-technical (audit & compliance) staff alike. Designed with a pleasant front end and “guard-rail-like” navigation, DCS takes users through a best-practice data securitization journey, which includes:
  • Use of “automated intelligence” to understand your data & identify risks.
  • Automatic (on-the-fly) generation of masking or encryption scripts.
Yep, no more centralized team taking 8 weeks to engineer “often error-prone” solutions.
  • Ease of execution, both just-in-time & scheduled.
  • Automatic validation (testing) that data is compliant and void of PII.
  • Delivery of compliance dashboards & reporting showing coverage and status.
Giving Compliance, Security & Audit comfort that IT is moving in the right direction. And for the more technically minded:
  • Use of “Worker Bees” to spread DataOps load across the network (see the sketch after this list).
No need to wait for one application to finish masking before you start the next. The Enov8 Worker Bees (Battle Bees) can execute hundreds of data operations in parallel, and can be placed anywhere on your network (e.g. across platforms, subnets and clouds) to leverage parallel processing and reduce latency and storage-transfer costs.
  • Provision of a REST API & webhooks so compliance can be added to your delivery toolchain.
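
To make the “Worker Bee” idea concrete, here is a minimal sketch of the general pattern: fan masking jobs out to a pool of workers, then validate that each result is void of (one kind of) PII. This illustrates parallel DataOps in principle only; the platform names, the mask_platform helper and the regex rule are hypothetical examples, not Enov8 DCS internals.

```python
import re
from concurrent.futures import ThreadPoolExecutor

# Naive email detector, standing in for a real PII ruleset (hypothetical).
PII_EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask_platform(name, rows):
    """Mask one platform's rows, then check the output for residual PII."""
    masked = [PII_EMAIL.sub("****@masked.example", row) for row in rows]
    residual = sum(bool(PII_EMAIL.search(row)) for row in masked)
    return {"platform": name, "rows": len(masked), "residual_pii": residual}

# Hypothetical workloads: one extract per platform, each masked by its own worker.
workloads = {
    "crm":     ["alice@example.com placed order 42"],
    "billing": ["invoice 7 sent to bob@example.com"],
}

with ThreadPoolExecutor(max_workers=len(workloads)) as pool:
    results = list(pool.map(lambda item: mask_platform(*item), workloads.items()))

for r in results:
    print(r)  # e.g. {'platform': 'crm', 'rows': 1, 'residual_pii': 0}
```

In a real deployment the workers would run next to the data (per platform, subnet or cloud) rather than as threads in one process, which is exactly the latency and transfer-cost point the Worker Bees address.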

To Summarize

In the “good old days” we all had a single team of “subject matter experts” to mask data. And in a company with a handful of platforms, that would probably still work. However, organizations’ IT & test environments are complicated nowadays; even medium-sized organizations can have hundreds of data platforms holding gigabytes or terabytes of data. If your organization wants to be “truly” compliant, there is a need to move away from the traditional centralist and serial methods. It is time to automate, federate and parallelize your Data Ops. Learn more about DCS.
Jane Temov
Jane is an experienced IT Environments Management & Data Evangelist. Areas of specialism include IT & Test Environment Management, Data Securitization, Release Management, Service Resilience, Configuration Management, DevOps & Infra/Cloud Migration.
