The future of work is a big concern for companies of all sizes. The 9-to-5 office workday has given way to employees working remotely from multiple locations, and while there’s been plenty of discussion of work-from-home employees, the remote office/branch office (ROBO) often presents an even greater IT challenge.
According to Nemertes Research, approximately 90% of new hires today work in branch offices, yet only 20% of those locations have IT staff on site to manage the significant IT assets required to run them. Because remote IT staff handle data protection as one part of a portfolio that amounts to supporting “everything with a plug in it,” data protection often takes a back seat, putting the organization’s data at serious risk.
Issues with ROBO data aren’t uncommon: According to research by Enterprise Strategy Group (ESG), 37% of companies reported local data backup as a significant challenge. The problem has many facets, and any one of them can put an organization at serious risk.
While organizations may have well-defined data protection policies that are enacted centrally, implementation at ROBO locations is often lacking. The unique challenges of managing data security at these locations too often result in inadequate backup procedures that put the organization at risk of data loss and regulatory non-compliance.
To solve these problems, organizations must reconsider the entire environment, not just individual parts. For example, individual server backup may work well for a small organization in a single office, but it becomes inefficient when spread across multiple sites. Similarly, using tape-based backup is challenging when you don’t have the highly skilled staff resources to run this process correctly from end to end.
For many organizations, it makes sense to consider a cloud-based service instead. As wide area networks (WANs) have become more reliable and bandwidth less expensive, cloud solutions have emerged as a scalable, cost-effective backup method. However, effectively protecting the growing amount of data that exists at remote and branch offices requires a shift in strategy.
Rather than replicating older disaster recovery models on the cloud, IT should consider systems that are purpose-built to take advantage of how cloud services are designed. For example, a true cloud-first approach will avoid proprietary cloud infrastructures where data can only be saved to a few specific locations. Similarly, IT shouldn’t have to install multiple systems or appliances in each new location, but should instead leverage the simplified remote administration the cloud makes possible.
There are many important considerations when approaching a transition from local data backup to a cloud-first solution. IT should examine how a solution, whether built or bought, makes effective use of multi-tenant architecture alongside microservices. Scalability is another important factor, as backup tends to create ever-multiplying volumes of data. Finally, any new approach designed to eliminate the risk of human error or local storage failure should be vetted carefully to ensure it doesn’t introduce a new single point of failure.
Cost is an important consideration in any data backup strategy, and cloud-based backup makes it possible to tier data by priority to reduce overall storage costs. A good example is Amazon Glacier, which provides cheaper data storage in exchange for less immediate availability. New data is stored in a local cache or high-performance storage while it is likely to be retrieved frequently. After 90 days, the data is moved to long-term storage for archiving, reducing overall storage costs while still preserving the massive amounts of data necessary for legal compliance.
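The tiering policy described above can be sketched in a few lines. This is an illustrative example only: the 90-day threshold comes from the text, but the function and tier names are hypothetical, not any vendor’s API. In practice, a policy like this is usually expressed declaratively, for example as an Amazon S3 lifecycle rule that transitions objects to a Glacier storage class after 90 days.

```python
from datetime import datetime, timedelta, timezone

# Assumed threshold from the tiering policy described in the text:
# after 90 days, backup data moves from fast storage to cold archive.
ARCHIVE_AFTER = timedelta(days=90)

def storage_tier(last_backup_time, now=None):
    """Return the tier a backup object should occupy based on its age."""
    now = now or datetime.now(timezone.utc)
    age = now - last_backup_time
    # Fresh data stays on high-performance storage where frequent
    # restores are fast; older data is archived to cheaper cold storage
    # (e.g. Amazon Glacier) while remaining available for compliance.
    return "archive" if age >= ARCHIVE_AFTER else "high-performance"
```

The point of the sketch is simply that the tiering decision is mechanical once the retention threshold is set, which is why cloud providers let you encode it as a lifecycle rule rather than a manual process.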
This tiered approach requires a thoughtful plan for handling versioning, so that organizations can roll back to specific versions of a file even after a long period of time has passed. Storage of metadata should be taken into account to simplify the management of information and files over time. Backup solutions with the ability to deduplicate data before it crosses into the cloud can greatly reduce the burden on both the network and the storage location, making deduplication an important feature for large-scale cloud backup implementation.
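Deduplication before data crosses into the cloud typically works by splitting data into chunks, fingerprinting each chunk, and sending only chunks the backup store has not already seen. The sketch below is a minimal, hypothetical illustration of that idea; the chunk size and function names are assumptions, not a description of any specific product.

```python
import hashlib

# Assumed fixed chunk size; real products often use variable-size,
# content-defined chunking for better dedup ratios.
CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB

def dedupe_chunks(data: bytes, known_hashes: set):
    """Yield (digest, chunk) pairs only for chunks not already stored.

    known_hashes is the set of chunk fingerprints the backup target
    already holds; it is updated in place as new chunks are emitted.
    """
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in known_hashes:
            known_hashes.add(digest)
            yield digest, chunk  # only new chunks cross the WAN
```

Because unchanged chunks are never retransmitted, a second backup of mostly identical data sends almost nothing over the network, which is exactly why deduplication matters so much at bandwidth-constrained branch offices.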
In the past year, around 40% of data was stored outside of companies’ data centers, more than ever before. As organizations continue to increase their use of remote offices, mobile devices, and cloud applications, this trend will continue for the foreseeable future. Data protection strategies need to adapt in order to keep up.
Cloud-first data protection methods provide a natural solution to this challenge. As more data shifts out to the edge of the business, cloud-based data protection services can ensure that important data is spotted, analyzed, and stored in a compliant, cost-effective manner.
Help your organization take the next step and access this white paper: Shifting Data Protection Strategies to the Cloud for Remote and Branch Offices