
5 Strategies for Modern Data Protection




It’s no secret that today’s unprecedented data growth, datacenter consolidation and server virtualization are wreaking havoc with conventional approaches to backup and recovery. Here are five strategies for modern data protection that will not only help solve your current data management challenges but also ensure that you’re poised to meet future demands.


Most enterprise IT organizations today struggle simply to finish their backups. Backup jobs often cannot completely process full, or even incremental, file sets in the time window available. It can be shocking to learn how much critical corporate data is assumed to be backed up but actually is not.

It’s clear to those on the ground what’s going on. The amount of data to be backed up is growing exponentially, while the resources devoted to backup (storage, network, servers) are constrained. Beyond sheer volume, the locations and types of data have become more complex, spanning enterprise applications, cloud data and virtualized servers. Even if it were possible to throw more hardware at the problem, that would not solve it. IT staff valiantly keep legacy backup systems going with manual work and scripting, which lowers their productivity; worse, the resulting backups are extremely hard to use for data recovery. Legacy backup tools just aren’t up to the job.

New professional-grade data backup and recovery solutions, like Commvault® software, do a much better job of getting all of your data backed up, in far less time and with far fewer resources. Modern tools should understand current enterprise applications, physical and virtual environments, and file systems, enabling rapid, consistent copies of that data. Also of key importance is modern storage array snapshot technology, which creates nearly instantaneous, application-aware copies, drastically improving backup and, even more importantly, speeding recovery. By implementing snapshots, you can execute against ever-tighter SLAs, and when the snapshot process is coordinated with backup, the protection copy is created apart from production servers, reducing the impact on those resources. Managing those snapshot operations has historically been the hard part, with each storage array using its own tools and processes, manual scripting and no application awareness; Commvault software solves that management challenge. All of this means that backups actually get done and can actually be used for recovery, which, after all, is the reason to back up in the first place.
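To make that coordination concrete, the sketch below shows the pattern in rough form: quiesce the application, take a near-instant array snapshot, resume writes immediately, then run the slow copy from a proxy host rather than the production server. It is an illustration only, not Commvault’s implementation; the appctl, arraycli and backup-agent commands are hypothetical stand-ins for whatever tools your application and storage vendors provide.

```python
import subprocess
import time

def quiesce_application(app: str) -> None:
    """Ask the application to flush and pause writes so the
    snapshot is application-consistent (hypothetical command)."""
    subprocess.run(["appctl", "quiesce", app], check=True)

def resume_application(app: str) -> None:
    """Release the application as soon as the snapshot exists."""
    subprocess.run(["appctl", "resume", app], check=True)

def create_array_snapshot(volume: str) -> str:
    """Trigger a near-instantaneous snapshot on the storage array
    (hypothetical vendor CLI); returns the snapshot identifier."""
    snap_id = f"{volume}-snap-{int(time.time())}"
    subprocess.run(["arraycli", "snapshot", "create", volume, snap_id],
                   check=True)
    return snap_id

def protect(app: str, volume: str, proxy: str) -> None:
    """Coordinate an application-aware snapshot, then back it up
    from a proxy host so production servers are barely touched."""
    quiesce_application(app)
    try:
        snap_id = create_array_snapshot(volume)  # seconds, not hours
    finally:
        resume_application(app)                  # writes resume immediately
    # The slow copy-to-backup-storage step runs off-host, against the
    # snapshot rather than the live production volume.
    subprocess.run(["arraycli", "snapshot", "mount", snap_id,
                    "--host", proxy], check=True)
    subprocess.run(["backup-agent", "run", "--host", proxy,
                    "--source", snap_id], check=True)

if __name__ == "__main__":
    protect(app="orders-db", volume="vol-orders", proxy="backup-proxy-01")
```

Note how the production application is paused only for the seconds it takes the array to create the snapshot; the hours-long copy happens elsewhere, which is what makes the tighter SLAs achievable.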


The truth is that legacy backup systems do not minimize data redundancy well at all. Data is allowed to proliferate in multiple copies because that has traditionally been seen as safer and simpler in terms of backup logistics, and because storage just keeps getting cheaper.

However, this leads to excessive demands on network, storage and management resources, especially in the hyper-data-growth environment you face today. Target-based deduplication appliances were seen as a way to get rid of some of the extra copies, but they do nothing to solve the network problem. The modern solution is to eliminate redundant data at the source, so it is never transmitted over the network at all. Another area where deduplication adds value is moving data to other locations for disaster recovery. Legacy systems may offer replication, but typically as a costly extra that is resource-intensive or tied to specific hardware. Worse, there is no granularity into the data you are moving, which again leads to waste and inefficiency. By moving only the changed data to other locations, you can reduce the cost, time and resources required to replicate data and meet your recovery needs. Finally, legacy deduplication was limited in scale and siloed, so it could not be managed at a global level.
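The source-side principle is simple enough to sketch: chunk the data, hash each chunk locally, and transmit only the chunks the backup target has never seen. The example below is a minimal illustration assuming fixed-size chunks and an in-memory hash index; production deduplication engines use variable-size chunking and a persistent, global index, but the network saving comes from the same check-before-send step. The function and variable names here are illustrative, not any product’s API.

```python
import hashlib

CHUNK_SIZE = 128 * 1024  # fixed-size chunks keep the example simple

def chunk_file(path: str):
    """Yield fixed-size chunks of a file."""
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            yield chunk

def backup_file(path: str, known_hashes: set[str], send_chunk) -> None:
    """Source-side deduplication: hash each chunk locally and only
    transmit chunks the backup target has never seen."""
    for chunk in chunk_file(path):
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in known_hashes:
            continue                 # redundant data never crosses the network
        send_chunk(digest, chunk)    # only new, unique chunks are sent
        known_hashes.add(digest)

if __name__ == "__main__":
    # Demo: back up the same file twice; the second pass sends nothing,
    # because every chunk hash is already in the index.
    with open("data.bin", "wb") as f:
        f.write(b"hello" * 100_000)  # sample data
    seen: set[str] = set()
    sent: list[str] = []
    backup_file("data.bin", seen, lambda digest, chunk: sent.append(digest))
    backup_file("data.bin", seen, lambda digest, chunk: sent.append(digest))
    print(f"unique chunks transmitted across both passes: {len(sent)}")
```

The same index-lookup logic is what makes replication granular: a remote site that already holds a chunk never receives it again, so only changed data travels over the wire.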