
The Cloud As A Path To Infrastructure Resiliency

Cyber security and IT risks are attracting more attention from business leaders – PwC’s CEO survey released in January 2016 found that 74 per cent of CEOs rated security as one of the top three challenges that could affect the success of their businesses over the next year. However, this risk is often not reflected in how businesses plan, fund and maintain their data protection over time.

Using public cloud services for disaster recovery (DR) and data protection can provide an agile way to solve some of these challenges. With many global companies now facing compliance obligations under new European legislation alongside increased IT risk, simplifying data protection has become a management requirement. At the same time, cloud solutions can deliver significant cost savings through smarter management of data over time.

Getting DR ready for a converged approach

In the world of business continuity and DR, it can be difficult to know where to start the process of bringing different areas together. Business continuity refers to the approaches and technologies used to prevent IT issues from taking down a service, while DR covers the processes and IT solutions used to return systems to normal as quickly as possible after a failure. Alongside both continuity and DR projects, companies must also implement archiving for their data and records to meet compliance requirements.

All these different strands have to be applied to the existing IT resources that support business applications. What results is a Spaghetti Junction of tools, processes and platforms used to protect data: at best, there is a lot of overlap, with data sets protected by multiple tools and additional infrastructure, adding further expense to the storage bill. At worst, this drives up cost while also leaving islands of data under-protected against the risk of failure.

Consolidating DR strategies is therefore a sound approach, not just to help IT leadership manage data resiliency, but also to reduce the costs associated with data protection. By removing unnecessary tools, companies can save on ongoing licensing and management overheads while also reducing potential risk.

Public cloud now offers a worthy option for DR planning. Despite the suspicion that greeted cloud deployments when they first launched around five years ago, public cloud services are today perceived as more secure and more trusted than internal IT, according to research by Gartner. The fact that providers such as Amazon Web Services and Microsoft Azure depend on their security and availability to stay in business is not lost on IT professionals, while the ability to improve recovery and reduce costs is also attractive when budgets are tight.

However, the biggest attraction here is the potential to automate DR and data protection processes so that information can be copied and made secure more efficiently. Rather than running multiple tools to cover data backup, disaster recovery and archival, these services can be consolidated by using a single secondary copy of the data. This not only reduces costs and improves efficiency, but can also enable faster data recovery.
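The single-secondary-copy idea can be sketched in a few lines of Python. This is an illustrative in-memory model only – the class and method names are assumptions for the example, not any vendor's API – but it shows how one consolidated copy can serve backup restores, DR failover and archive retrieval alike:

```python
# Minimal sketch: one secondary copy of the data, read by all three
# protection use cases, instead of three tools each keeping its own copy.
# The in-memory store and method names are illustrative, not a product API.

class SecondaryCopy:
    def __init__(self):
        self._snapshots = {}  # the single copy, keyed by item name

    def protect(self, name, data):
        """Capture (or refresh) the secondary copy of an item."""
        self._snapshots[name] = data

    # All three use cases read from the same stored copy:
    def restore(self, name):
        """Backup use case: recover a lost or corrupted item."""
        return self._snapshots[name]

    def failover_set(self):
        """DR use case: everything needed to restart operations."""
        return dict(self._snapshots)

    def archive_lookup(self, name):
        """Archive use case: compliance retrieval of an old record."""
        return self._snapshots.get(name)

store = SecondaryCopy()
store.protect("orders.db", b"...order records...")
```

Because every use case reads from the same copy, there is only one data set to manage, license and pay storage for – the consolidation the article describes.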

Managing hot, warm and cold data in the cloud

Central to this approach is how data is managed across its lifecycle, from initial creation through to long-term storage and archival. Data can be divided into three overall categories:

  • Hot data – this describes data that has been newly created across the business. Hot data includes everything from the most recent files that individuals have been working on through to central application data. This hot data covers the last thirty days of activities, and can be cached locally. In the event of a failure, this information will be what the business needs recovered fastest.
  • Warm data – this covers data that is still fairly recent, but is less essential than hot data. Typically, warm data covers files and application data that are more than thirty but less than ninety days old. In traditional DR planning terms, data in this category will be necessary for recovery after a disaster but has an acceptable recovery time objective that is longer than the hot data’s.
  • Cold data – this includes all information suitable for long-term storage and archiving. While this information is still necessary for the business to store, it is not referenced regularly. In the event of a file or record being required, restore time can be slow.

In the course of normal operations, data should flow smoothly from hot to warm, and from warm to cold states. Automating this process can help reduce the costs involved in data management for recovery purposes.
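The hot/warm/cold flow above can be expressed as a simple age-based policy. The sketch below uses the thirty- and ninety-day thresholds from the categories described; the function and catalogue names are illustrative, but a nightly job built along these lines could move each data set to the tier its age dictates:

```python
from datetime import datetime, timedelta, timezone

# Age thresholds taken from the categories above: hot covers the last
# thirty days, warm covers thirty to ninety days, cold is anything older.
HOT_DAYS = 30
WARM_DAYS = 90

def classify(last_modified, now=None):
    """Return the storage tier ('hot', 'warm' or 'cold') for a data set,
    based solely on its age."""
    now = now or datetime.now(timezone.utc)
    age = now - last_modified
    if age <= timedelta(days=HOT_DAYS):
        return "hot"
    if age <= timedelta(days=WARM_DAYS):
        return "warm"
    return "cold"

# A scheduled job could walk the catalogue and move each item to the tier
# its age dictates, automating the hot -> warm -> cold flow.
now = datetime.now(timezone.utc)
catalogue = {
    "sales-report.xlsx": now - timedelta(days=3),    # recent work
    "q1-app-snapshot":   now - timedelta(days=45),   # last quarter
    "2014-archive.tar":  now - timedelta(days=400),  # long-term record
}
tiers = {name: classify(ts, now) for name, ts in catalogue.items()}
```

Automating this classification is what removes the manual overhead: no one has to decide when a data set stops being "hot" – the policy does.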

Alongside this, it is worth thinking about how recovery operations can be run in the public cloud. For example, AWS offers a service called Virtual Private Cloud (VPC), a logically isolated section of the AWS cloud dedicated to a customer’s account. Using their own VPC, companies can treat AWS resources as an extension of their internal IT network. With this platform, companies can shift DR workloads over to AWS and run operations in the cloud as required, while maintaining full control over the security of their data.

This approach can help when companies are weighing up the value of traditional DR implementations, which typically rely on secondary sites and warm failovers to operate successfully. Using public cloud, companies can cut their facilities spend and pay for established services only as they are required.

For companies that have compliance requirements around their data to consider, public cloud services have developed to provide management control over locations where data may be stored. This expansion of services – coupled with dedicated locations in areas with stricter local data privacy rules such as Germany – means that companies can centrally administer global operations while adhering to local regulations through the public cloud’s geographically dispersed data regions.
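The regional placement control described above amounts to a policy mapping residency requirements onto allowed regions. The sketch below is an assumption-laden illustration – the residency tags and AWS-style region names are made up for the example, not a description of any provider's actual compliance catalogue – but it shows the shape of such a central policy:

```python
# Illustrative placement policy: which regions may hold data carrying a
# given residency tag. Tags and region names are assumptions for this
# example, not any provider's real compliance offering.
REGION_POLICY = {
    "de-strict":    ["eu-central-1"],                  # e.g. German privacy rules
    "eu-general":   ["eu-central-1", "eu-west-1"],
    "unrestricted": ["us-east-1", "eu-west-1", "ap-southeast-1"],
}

def placement_ok(residency_tag, region):
    """True if data with this residency tag may be stored in this region."""
    return region in REGION_POLICY.get(residency_tag, [])

def choose_region(residency_tag, preferred):
    """Pick the preferred region when the policy allows it, otherwise
    fall back to the first compliant region for that tag."""
    allowed = REGION_POLICY.get(residency_tag, [])
    if preferred in allowed:
        return preferred
    if not allowed:
        raise ValueError(f"no region satisfies tag {residency_tag!r}")
    return allowed[0]
```

Centralising this decision in one policy is what lets a global IT team administer data placement from one place while still honouring local rules.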

As companies seek to streamline their data protection requirements, taking a converged approach can help to improve service back to the business. Simplifying DR and data protection does not mean ignoring the specific needs of data archiving – or, worse still, assuming that data backup alone is good enough to cover all use cases. Instead, it is about bringing the multiple uses of that data for recovery together into one overall process.

By using the public cloud to automate and manage data protection operations, company IT teams can improve their DR processes while also cutting costs.

This article was originally published on IT Portal.