When you’re looking to replace your existing enterprise NAS or legacy storage system by consolidating your unstructured data in the cloud, you don’t expect to lose key features. However, features you’ve taken for granted because they’re baked into legacy storage aren’t standard with cloud storage.
What Key Features Should You Look For?
A recent DCIG report highlights a few key areas that put the top cloud-based NAS consolidation solutions above the rest. When weighing your options, keep an eye out for the following core aspects:
Limitless adaptive capacity is essential for meeting your organization’s dynamic storage needs. From terabytes to petabytes, the data explosion shows no signs of slowing down. Your solution should be prepared to protect, back up, and recover in a future of continuous, explosive unstructured data growth.
Public cloud integration should let you use the provider of your choice as your central global storage. The flexibility to offload some maintenance and upkeep from your own IT team is invaluable, whether you choose AWS, Azure, Google Cloud, or another provider.
A global namespace keeps all your file system data in one consistent view for all users. Regardless of which site is browsing your file directories, every site should see the same unified view instead of struggling with isolated, siloed directories.
Support should be broad and highly available. Top solutions tend to offer near-immediate response times from their tech support. From live agents to community forums and self-service knowledge bases, help should be available around the clock, through multiple channels.
Here are some things to watch for:
Global File Locking
Does your solution only allow one user at a time to access and edit files? If it offers file locking, does it work in real time, without manual intervention?
For remote collaboration, file (or even byte-range) locking is critical to keeping your files consistent across all locations in real time – that’s a polite way of saying uncorrupted – and under control. Without it, file versions splinter off from the master copy and render your file system an unruly mess. Or worse, one user overwrites another’s work.
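To illustrate why locking matters, here is a minimal, hypothetical sketch (in Python, not any vendor's actual implementation) of a global advisory lock table in which only one site can hold the write lock on a given file at a time:

```python
import threading

class FileLockCoordinator:
    """Toy global lock table: one writing site per file path at a time."""
    def __init__(self):
        self._locks = {}          # path -> site currently holding the lock
        self._mutex = threading.Lock()

    def acquire(self, path, site):
        with self._mutex:
            owner = self._locks.get(path)
            if owner is None:
                self._locks[path] = site
                return True       # lock granted; site may write
            return owner == site  # re-entrant for the holder, denied otherwise

    def release(self, path, site):
        with self._mutex:
            if self._locks.get(path) == site:
                del self._locks[path]

coord = FileLockCoordinator()
print(coord.acquire("/projects/model.dwg", "London"))   # True: lock granted
print(coord.acquire("/projects/model.dwg", "Sydney"))   # False: must wait
coord.release("/projects/model.dwg", "London")
print(coord.acquire("/projects/model.dwg", "Sydney"))   # True
```

In a real distributed file system the lock table itself must be replicated and fault-tolerant; the point here is simply that a write is refused unless the requesting site holds the lock, which is what prevents one user from silently overwriting another's work.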
Cloud Mirroring
Does your solution offer cloud-based mirroring of files and directory metadata?
Without this key feature, you may be vulnerable to storage failure at any given site. The goal of moving to the cloud is uninterrupted access, yet some solutions leave you exposed to cloud outages, or even accidental cloud bucket deletions.
Dedicated High Availability (HA)
Does your solution carry a high availability rating, and how much are you paying for it?
Uptime can make or break your organization’s productivity. If you’re not backed by a fully reliable solution, you’re taking on a costly risk you may not be able to shake in today’s economic climate. With features like cloud mirroring (included at no cost by Panzura), you can achieve durability ratings as high as 13 9s, or 99.99999999999%.
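To see where figures like 13 9s come from, here is a back-of-envelope durability calculation. It assumes fully independent failures across mirrored stores, which real clouds only approximate, so treat it as an upper bound rather than a guarantee:

```python
import math

def combined_nines(nines_per_copy: float, copies: int) -> float:
    """Durability 'nines' of `copies` independent mirrors, each rated at
    `nines_per_copy` nines (e.g. 11 nines = 99.999999999% durability)."""
    annual_loss = 10 ** -nines_per_copy     # P(losing one copy in a year)
    combined_loss = annual_loss ** copies   # all mirrors lost together
    return -math.log10(combined_loss)

# Two mirrored stores, each rated at 11 nines of durability:
print(round(combined_nines(11, 2)))   # 22
```

Because independent loss probabilities multiply, mirroring a store in a second location roughly doubles the number of nines, which is why cloud mirroring is such an effective durability lever.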
Regulatory Compliance
Is your solution rated compliant with your organization’s required regulations?
While some solutions may be compliant with some regulatory requirements, not all are made equal. For instance, the stringent FIPS 140-2 certification takes a careful look at a solution’s secure deletion and data protections. Few solutions meet its requirements, even among those that satisfy HIPAA and other regulatory bodies.
It’s wise to check these, and your many other compliance needs, before you make your final decision.
Performance at Scale
Does your solution maintain strong performance even as it expands to more data, locations, or users?
Speed and cost are tricky to balance alongside productivity. Naturally, some solutions don’t handle this elasticity well, especially when the cloud is involved. Public cloud integration helps you stay agile but may introduce latency. Your solution should focus on keeping local performance at global scale, regardless of your number of locations or users.
Search and Audit Visibility
Is there a dashboard available to dissect and comb through your entire file network?
Clear visibility into the wilds of your data landscape is of little use without the tools to navigate it. Even if you know the value of integrated search and auditing, some solutions omit these features entirely. Be sure to seek out a solution that fully equips your IT team to keep pace with your data growth.
What Other Features Should You Watch For?
As you finish checking these highlights, keep in mind there are likely other features worth investigating.
Additional features to compare may include:
File & user backup snapshots: Point-in-time restore points for file recovery.
Dark site security support: For maintaining “air-gapped” domains — i.e. no inbound or outbound traffic.
Encryption: Expect no less than military-grade AES-256 encryption.
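The point-in-time snapshot idea above can be sketched with a toy model (hypothetical names, in Python, purely for illustration): each snapshot freezes the file-to-content mapping, so any earlier state can be restored after an accidental change or a ransomware hit:

```python
class SnapshotStore:
    """Toy point-in-time snapshot model for a small file namespace."""
    def __init__(self):
        self.live = {}        # filename -> current content
        self.snapshots = []   # frozen point-in-time states

    def write(self, name, content):
        self.live[name] = content

    def snapshot(self):
        self.snapshots.append(dict(self.live))   # freeze current state
        return len(self.snapshots) - 1           # snapshot id

    def restore(self, snap_id):
        self.live = dict(self.snapshots[snap_id])

store = SnapshotStore()
store.write("report.docx", "v1")
snap = store.snapshot()
store.write("report.docx", "ENCRYPTED-BY-RANSOMWARE")
store.restore(snap)
print(store.live["report.docx"])   # v1
```

Production systems implement this far more efficiently with copy-on-write blocks, but the restore semantics are the same: snapshots are immutable, so a compromised live state never damages them.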
Forward-thinking features you might want to consider may include:
Immediate global data consistency, so that every user at every location sees the most up-to-date version of every file whenever they need it, just as they would when working off local network-attached storage. Achieving this requires real-time distributed file locking as well as peer-to-peer sync to keep all locations immediately up to date, with no risk of file overwrites.
Ransomware and malware mitigation to prevent malicious lockdowns or disruptions to your data, via an immutable data architecture and global file snapshots that prevent data from being damaged and create a reversible breadcrumb trail of restore points.
Global deduplication & compression to lighten the storage load by removing redundant data blocks before they’re stored. Using lightweight metadata file pointers to record which data blocks comprise a file at any given time allows blocks that would otherwise have been duplicated to be stored once and shared by multiple files.
Global file collaboration features to take global locking further, such as distributed file locking and byte-range locking to keep master copies of collaborative files splinter-free, and local filer caching to keep them rapidly accessible.
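The block-pointer deduplication described above can be illustrated with a short sketch (hypothetical structures, in Python; real engines use variable-size chunking and compression on top): files are recorded as lists of content hashes, and identical blocks are stored only once:

```python
import hashlib

BLOCK_SIZE = 4     # tiny block size, for illustration only

block_store = {}   # sha256 hex digest -> block bytes (each stored once)
file_table = {}    # filename -> ordered list of block digests ("pointers")

def store_file(name: str, data: bytes) -> None:
    digests = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        block_store.setdefault(digest, block)   # duplicate blocks are skipped
        digests.append(digest)
    file_table[name] = digests

def read_file(name: str) -> bytes:
    # Reassemble the file by following its metadata pointers.
    return b"".join(block_store[d] for d in file_table[name])

store_file("a.txt", b"AAAABBBBAAAA")   # the "AAAA" block repeats
store_file("b.txt", b"AAAACCCC")       # shares "AAAA" with a.txt
print(len(block_store))                # 3 unique blocks stored, not 5
print(read_file("a.txt"))              # b'AAAABBBBAAAA'
```

Because each file is just a list of pointers, two files (or two versions of one file) that share content share the underlying blocks, which is where the storage savings come from.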