Much has been written about the exponential growth of unstructured data. Much less has been said about the ways in which legacy approaches to file storage and management contribute to that increase.
Legacy file storage was designed for single sites. Enabling users to collaborate on files across sites, or meeting business Recovery Point Objectives (RPOs), is only possible through scheduled replication of files between those sites.
Traditional network-attached storage deployments can create additional cost and complexity for even a simple two-site deployment. Take a look at the diagram below, for example.
Figure (1a) shows a primary datacentre housing user files and replicating them to a secondary site, so users at both sites have access to the same files (figure 1b). In this scenario, the company's investment is already 2x the primary storage cost, just to satisfy the RPO and remote collaboration requirements.
In addition to the secondary copy of data, corporates are required to back up their data (figure 2a) in case of data corruption or a malicious attack. The cost of corporate backup and offsite storage (figure 2b) equals that of the primary investment, bringing the total to 3x the primary cost simply to protect one data source.
In addition to collaboration, business units are often required to integrate with cloud partners. This is a burden that legacy storage vendors did not anticipate in their original designs, so we end up with integrations that do not meet cloud-native requirements for modern applications. Applications designed to work with files do not understand object storage, which limits any ability to work with data in cloud storage unless those applications are rewritten.
Instead, S3-compatible storage buckets are frequently used as an archive tier, as shown in figure (3a). There is no way of consuming cloud-native services using this type of deployment, which limits the business’ ability to migrate critical file storage and associated workflows to the cloud. In short, legacy storage requires corporates to make multiple copies of every file to achieve critical business outcomes, paying multiple times to make and keep those copies.
Panzura’s intelligent hybrid cloud approach allows corporates to make files immediately consistent across sites, and provides enterprise-grade durability without replicating files for backup and disaster recovery.
Figure (4) shows a global file system that, instead of replicating files across locations, uses public, private or dark cloud storage as a single authoritative data source. Virtual machines at the edge (on-prem or in cloud regions) overcome latency by holding the file system’s metadata as well as intelligently caching the most frequently used files to achieve local-feeling performance.
User changes made at the edge are then synced with the cloud store and with every other location (after being de-duplicated, compressed and encrypted). All locations sync simultaneously.
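To make the sync step concrete, here is a minimal sketch of how changed data might be deduplicated by content hash and compressed before upload. This is an illustrative model, not Panzura's actual implementation; the `prepare_blocks` function and the 4 KB block size are assumptions, and the encryption step is stubbed out because the Python standard library has no AES support.

```python
import hashlib
import zlib

def prepare_blocks(data, block_size=4096, seen=None):
    """Split changed data into fixed-size blocks, skip blocks whose
    content hash has already been stored (deduplication), and compress
    each new block before it is synced to the object store.
    A real system would also encrypt each block (e.g. AES-GCM);
    that step is omitted to keep this sketch stdlib-only."""
    seen = set() if seen is None else seen
    new_blocks = {}
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest in seen:        # identical block already stored: deduplicated
            continue
        seen.add(digest)
        new_blocks[digest] = zlib.compress(block)  # compress before upload
    return new_blocks

# Two writes sharing identical content: only genuinely new blocks travel.
store = set()
first = prepare_blocks(b"A" * 8192, seen=store)                # 1 unique block
second = prepare_blocks(b"A" * 4096 + b"B" * 4096, seen=store)  # only "B" is new
```

Because blocks are identified by their content hash, a second location that already holds a block never receives it again, which is what keeps the cross-site sync traffic small.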
This ensures up-to-the-minute file consistency for all locations in the file system, with immediate consistency achieved through peer-to-peer connections whenever users open files.
Panzura uses a Write Once, Read Many approach to storing unstructured data in object storage as immutable data blocks. Once written, that data cannot be modified, encrypted or overwritten.*
As users and processes make changes to files at the edge, those changes are stored as new immutable data blocks. The file system pointers are updated with every change, to reflect which data blocks are required to form files at any given point in time.
Panzura provides a granular ability to restore data by taking lightweight snapshots of the file system at configurable intervals (figure 5). These snapshots provide point-in-time captures of the data blocks used by every file.
The data blocks themselves cannot be overwritten, so snapshots allow single files, folders or the complete file system to be restored, with a near zero RTO and a last-change RPO.
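The mechanism described above can be sketched as a small model: a Write Once, Read Many block store keyed by content hash, a pointer table mapping each file to its blocks, and snapshots that are simply frozen copies of that table. The class names and structure here are illustrative assumptions, not Panzura's implementation.

```python
import hashlib

class BlockStore:
    """Write Once, Read Many: blocks are stored under their content
    hash and are never modified or overwritten after the first write."""
    def __init__(self):
        self._blocks = {}

    def put(self, block):
        digest = hashlib.sha256(block).hexdigest()
        self._blocks.setdefault(digest, block)  # existing blocks never replaced
        return digest

    def get(self, digest):
        return self._blocks[digest]

class FileSystem:
    """Pointers map each path to the ordered block hashes that form it;
    a snapshot is a lightweight frozen copy of the pointer table."""
    def __init__(self, store):
        self.store = store
        self.pointers = {}   # path -> list of block digests
        self.snapshots = []  # list of (label, pointer-table copy)

    def write(self, path, blocks):
        # Every change lands as new immutable blocks; pointers are updated.
        self.pointers[path] = [self.store.put(b) for b in blocks]

    def snapshot(self, label):
        self.snapshots.append(
            (label, {p: list(d) for p, d in self.pointers.items()}))

    def restore(self, label):
        # Old blocks were never overwritten, so reinstating old pointers
        # recovers the exact point-in-time contents: near-zero RTO.
        for name, table in self.snapshots:
            if name == label:
                self.pointers = {p: list(d) for p, d in table.items()}
                return

    def read(self, path):
        return b"".join(self.store.get(d) for d in self.pointers[path])

# A ransomware-style overwrite cannot touch the original blocks:
fs = FileSystem(BlockStore())
fs.write("report.docx", [b"v1 contents"])
fs.snapshot("hourly-09:00")
fs.write("report.docx", [b"contents encrypted by ransomware"])
fs.restore("hourly-09:00")
```

Note that the restore is only a pointer change; no data is copied, which is why recovery is near-instant regardless of file size.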
Not only is this process faster and far more precise than restoring from traditional backups, but cloud providers themselves also replicate data across cloud regions or buckets to provide up to 13 9s of durability.
This exceeds the durability that many corporates can achieve using even multiple copies of data, and means that IT teams no longer need to maintain separate backup processes to replicate data. Instead, they can rely on the inherent durability provided by Panzura and the cloud object store itself.
Snapshot frequency and retention can be configured as required. For example, a company may choose to capture hourly snapshots and keep them for a period of 30 days, along with weekly snapshots kept for 90 days, and monthly snapshots kept indefinitely.
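A retention schedule like the example above can be expressed as a simple policy table and a pruning pass. The tier names and the `prune` helper below are hypothetical, used only to illustrate the configurable-retention idea.

```python
from datetime import datetime, timedelta

# Retention windows mirroring the example in the text:
# hourly kept 30 days, weekly kept 90 days, monthly kept indefinitely.
RETENTION = {
    "hourly": timedelta(days=30),
    "weekly": timedelta(days=90),
    "monthly": None,  # None means keep forever
}

def prune(snapshots, now):
    """Return only the snapshots still inside their retention window.
    Each snapshot is a (tier, taken_at) tuple."""
    kept = []
    for tier, taken_at in snapshots:
        window = RETENTION[tier]
        if window is None or now - taken_at <= window:
            kept.append((tier, taken_at))
    return kept

now = datetime(2024, 6, 1)
snaps = [
    ("hourly", now - timedelta(days=2)),     # within 30 days: kept
    ("hourly", now - timedelta(days=45)),    # older than 30 days: pruned
    ("weekly", now - timedelta(days=60)),    # within 90 days: kept
    ("monthly", now - timedelta(days=400)),  # kept indefinitely
]
```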
The quantity and calibre of organisations willing to admit to being hit by ransomware and other cyber threats suggest that there is, as yet, no complete defence.
By its nature, Panzura’s intelligent approach to data management makes companies resilient to ransomware and other malware by preventing stored data from being encrypted, and being able to restore files, folders or the entire file system to a point in time whenever required.
Additionally, the system slows ransomware attacks. Only frequently used files are cached at the edge; files and directories that are not cached must be retrieved from cloud storage when accessed, and that takes time.
In the event of a malware attack, data is written to object storage as new objects. These spikes in cloud ingress and egress can trigger alerts to allow early detection of attacks, reducing contamination and allowing for faster recovery.
*File deletion is subject to a secure erasure process, which cannot be run accidentally.
The cloud offers tremendous potential for enterprises to reduce storage costs, improve productivity, and reduce data availability risk. Tapping that potential fully and effectively can provide significant competitive advantage while reducing both business and technological risk.
To date, companies attempting to fully integrate the cloud as a storage tier have been faced with building their own limited solutions by kludging together different technologies from various vendors, many of which were never designed to be used with cloud storage. This approach to implementation fails to realize the full benefits of cloud storage while consuming precious IT resources in implementation and management.
Panzura allows companies to unify corporate data without incurring the cost of replicating it three times for durability. Both data and locations can scale as required by deploying cloud storage with confidence and ease, without sacrificing productivity or compromising existing workflows.
With Panzura, you can break the unending cycle of on-site storage expansion, eliminate islands of storage that make it difficult for people in different sites to work together, increase your productivity, and enjoy real-time data protection.