It is significant in many ways. It addresses fundamental infrastructure and operational issues, both current and future, for its customer base and the market in general. It also demonstrates Commvault’s commitment to evolving its platform stack beyond traditional backup and recovery, tackling the technical and operational complexities introduced by hybrid and multi-cloud environments.
There is a significant strain placed on enterprise IT to store, manage and protect all the data organizations need to deal with. Let’s face it: enterprise IT is only getting more complex and costly to manage as organizations become more and more data-centric. Remember, there is no business without data, and no lasting business without solid data protection. Our research is very clear about this.
Zooming in on the management of data protection workloads, the complex interactions between the compute and storage layers – combined with the multiplicity of “destinations” (on-premises, in the cloud, in multiple clouds) – make it extremely difficult to deliver data protection/management and operations efficiently at scale. This complexity potentially negates the benefits of an elastic and flexible cloud infrastructure – which is what organizations were trying to achieve in the first place. You can’t control or optimize what you can’t manage.
In addition, let’s not forget that not all data is equal. To optimize the cost and performance of data protection or data management operations, a “perfect” fit needs to exist between the compute, networking and storage layers – a task made more complex by the “cloudification” of infrastructure. Taking on the complexity of storage across the infrastructure is a must-have to achieve scale, optimize cost and deliver portability and access to data across the hybrid ecosystem. It’s not just a data placement issue; it’s an operational efficiency question, and a business imperative for data-centric organizations – and what isn’t today?
Backup and archive data tend to be “passive” and hard to leverage for other uses. That data is also tightly tied to the storage it lives on. Removing the obstacles of storage format and dependencies makes the data and its management “active.” In time, with automation and AI, it can become “pro-active.”
To deliver maximum operational and business efficiency, management of data must be active, dynamic, automated (to a point) and complete. This means a comprehensive protocol consolidation (block, file and object storage) on a single platform “under the covers.”
In a perfect world, the data management application would figure all of this out in a hardware-agnostic fashion, with enough context that resiliency and performance would be taken into account by design (or through easily defined policies).
That’s why we should pay close attention to the combination of Hedvig and Commvault, as it mitigates challenges that many organizations face today. The combined stack is a software-defined solution that we expect will be easily deployed where it is needed by IT generalists, in a way that leverages compute and storage resources to align with the need, then scales as needed. Forget complex controller replacements and firmware upgrades. Operational efficiency is achieved via complete protocol consolidation (block, file and object storage) on a single platform.
By combining the technologies, in time Commvault will be able to more easily and actively manage data sets wherever they live and, more importantly, wherever they should live based on their business purpose. We expect this will give rise to improved and enhanced disaster recovery capabilities, and also add a “multiplier” effect to the existing Commvault stack and its data management capabilities. It also places Commvault in a good position to keep adding value-added data-centric services to further leverage data assets for new business outcomes. The topic of compliance comes immediately to mind.
With strong and deep integration, we think this software-defined intelligent data management platform will compete favorably with loosely coupled “solutions” or “blueprints.”
Senior Analyst Christophe Bertrand covers data protection at ESG. Christophe has 25 years of experience in services, software and high-end storage systems, as well as a passion for the product marketing discipline and product launches.