Obstruction On Road Ahead: A GPS To Modern IT Infrastructure

By Glenn Dekhayser

Do you remember the day you went live with your brand-new data protection software, how it was perfectly architected for your production infrastructure, how you were able to recover your servers and data reliably and easily, and how you were able to keep your backup jobs within your window?

Yeah, those were the days.

A few years have passed now and it just seems that you need to think about your backup solution way too much. Things seem to take longer than they used to, and maybe you had to add another piece of software to protect a workload that the formerly-new backup solution didn't cover yet. Also, that dedupe target appliance you bought is coming up for support renewal in a year and WOW, that renewal is a lot more money than you thought it was going to be. The vendor gave you a great deal on the newer model, but moving everything over is going to be a major project.

Oh, and the cloud. You've got stuff up there now, and you just read the contracts and you can't BELIEVE that, for as much as they're charging, they don't protect the data.

If you're like 56% of enterprise respondents to a recent ESG survey, you're either likely or highly likely to be swapping out your backup solution. AGAIN.

So, what went wrong?

First, let’s discuss expectations.

Back in the early days of the commercial internet, it wasn't a foregone conclusion that your company would have a firewall (imagine that). Some just ran NAT software and left it at that. Once security became a real concern, I remember CIOs telling me, "We've got a firewall. Now we're secure." Like security was DONE.

This is still the way many feel about data protection. They think that once they invest in a backup solution, they never have to think about it again.

This way of thinking is VERY dangerous. It’s also incorrect and counterproductive.

Think of your production environments 10 years ago. Where were they? How many production machines did you have, both virtual and physical? How many different structured data sources did you have? How large were your structured and unstructured data sets? How many unstructured objects (files/directories) were you managing? Most importantly, what restore service levels did your business expect for the different workloads?

I'm willing to bet that if you compare those answers to the answers for today's production environments, they're completely different in scale, location, and variety.

Why would you expect the same backup architecture that was so good in the former environment to be valid in the current one? Note that I didn't say solution; I'm saying ARCHITECTURE. I'll get to that in a minute.

So, here's the challenge: our production environments constantly change in scale, speed, and variety (at a rate exponentially higher than they did 10 years ago, per Gartner), but swapping out data protection solutions is highly disruptive and expensive. What's an architect to do?

Just as IT architects strive to create production platforms that provide reliable, homogeneous experiences for application and data workloads, they must also strive to create data management platforms (which include data protection) that are BUILT FOR CHANGE. These platforms must accommodate legacy and new workloads equally and have a foundational architecture that allows new data protection sources and techniques to be added without changing the underlying management paradigms.

Metadata across all of the disparate sources, despite having varied formats and use cases, must be equally visible, searchable, and actionable so that compliance, security, and business requirements can be met.
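To make that concrete, here's a minimal sketch, purely my own illustration and not any particular product's design or API, of what "equally visible, searchable, and actionable" metadata can look like: every source, whatever its native format, normalizes its records into one shape so a single query can serve compliance, security, and the business. All class and field names here are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable

@dataclass
class ProtectedItem:
    source: str                    # e.g. "vsphere", "m365", "s3" (labels are made up)
    item_id: str                   # source-native identifier
    path: str                      # normalized logical location
    size_bytes: int
    last_backup: datetime
    tags: dict = field(default_factory=dict)   # classification, legal hold, owner, etc.

class MetadataCatalog:
    """One searchable index spanning every source, instead of one index per product."""
    def __init__(self):
        self._items: list[ProtectedItem] = []

    def ingest(self, records: list[ProtectedItem]) -> None:
        # Each source's connector normalizes its own metadata before handing it here.
        self._items.extend(records)

    def search(self, predicate: Callable[[ProtectedItem], bool]) -> list[ProtectedItem]:
        return [item for item in self._items if predicate(item)]

# One question, asked once, answered across all sources -- for example, anything on
# legal hold that hasn't been backed up in the last 7 days:
# catalog.search(lambda i: i.tags.get("legal_hold")
#                and (datetime.utcnow() - i.last_backup).days > 7)
```

The point isn't the code; it's that one catalog can answer a compliance or security question across every source at once, rather than asking it separately of each product.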

Be careful of solutions that are actually combinations of multiple acquired products that weren't developed to work together. True integration is VERY difficult, and the nature of acquisitions is such that there's usually a rush to get revenue from them, so you'll end up with a split brain across the different parts of your environment that the separate products handle. I see this today with vendors that originally had a great vSphere backup product but then acquired someone else to deal with cloud backup, for instance. For me, this is a red flag.

I prefer the platform approach: the data sources may have their own best practices for acquiring the data that is to be protected and managed, but the data movement, storage, and metadata management are homogeneous and infinitely scalable. With this approach, I'm confident that whatever change is thrust upon the data management infrastructure, I won't need an additional solution or dedicated hardware/software to deal with any corner case that comes up.
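Here's a rough sketch of what I mean, again purely illustrative rather than any vendor's implementation: acquisition is the only source-specific step, while movement, storage, and metadata management run through one shared path. The adapter and class names are made up.

```python
from abc import ABC, abstractmethod
from typing import Iterator

class SourceAdapter(ABC):
    """Acquisition is the only source-specific step; each adapter follows its
    source's own best practices for taking a consistent copy."""
    @abstractmethod
    def snapshot(self) -> Iterator[bytes]: ...

class VSphereAdapter(SourceAdapter):          # hypothetical adapter
    def snapshot(self) -> Iterator[bytes]:
        yield b"...VM disk extents captured via the hypervisor's native APIs..."

class SaaSAdapter(SourceAdapter):             # hypothetical adapter
    def snapshot(self) -> Iterator[bytes]:
        yield b"...objects exported from a SaaS application..."

class Platform:
    """Data movement, storage, and cataloging are identical for every adapter."""
    def __init__(self) -> None:
        self.store: dict[str, bytes] = {}     # stand-in for the shared backing store
        self.catalog: list[str] = []          # stand-in for the shared metadata index

    def protect(self, name: str, adapter: SourceAdapter) -> None:
        self.store[name] = b"".join(adapter.snapshot())   # shared movement and storage
        self.catalog.append(name)                         # shared metadata management

platform = Platform()
platform.protect("prod-vm-cluster", VSphereAdapter())
platform.protect("crm-tenant", SaaSAdapter())
```

Adding a new workload type then means writing one more adapter, not deploying and managing yet another product.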

Once you have a reliable platform, you're not done. Data management is NOT just an IT problem! The business needs a committee that includes the IT architects, legal, compliance, and application owners so that enterprise needs are taken into account, and so that a window into upcoming projects can be opened, letting the data management environment evolve ALONG WITH the production environment. Changes to the data management environment need to be made at least quarterly to avoid the problem I described at the beginning of this article. This isn't a technology problem, it's a people problem, and it needs solving.

People just want backup to work. Executives just want their data secure and searchable. Compliance has its own set of rapidly evolving data requirements.

This isn’t “just” going to happen. Without a reliable, flexible, and scalable data management platform, it’s NEVER going to happen.