As multi-media file sizes continue to grow at a phenomenal rate, editing, collaborating, and transferring them between remote sites can be frustratingly slow, painfully expensive, and worryingly insecure.
Media and entertainment content creation is a storage- and time-intensive process, as media workloads drawn from content repositories can create millions of files. Many of these are replicated at remote sites, often consuming hundreds of petabytes of storage.
These siloed repositories are constraining production environments, as users in remote sites cannot access the files they need fast enough. Consequently, explosive data growth has forced technology teams to re-architect their on-premises infrastructure to include the scalable storage capabilities of the public cloud.
Even though the public cloud offers substantial storage capacity, it places further strain on file accessibility. Without high-performance access to massive media files, remote sites continue to find file access and file operations slow, to the extent that they can delay product releases.
Bring products to market faster while reducing storage costs
- Consolidate data into a single, deduplicated view across the organization
- Empower real-time, high-performance collaboration on Adobe and AutoCAD files
- Avoid overwrites with immediate file locking that automatically works everywhere
- Boost performance with file changes that are immediately visible to every user
- Protect data against damage and deletion with ransomware resilience
When productivity for your remote teams depends upon a high degree of collaboration on media assets, overcoming the performance challenges posed by cloud storage becomes imperative.
When people can't work together across locations, or move large media files fast enough, it becomes difficult to fully tap the potential of your talent pool, hampering your growth and even your ability to meet deadlines.
As if the demands of massive data growth weren't enough for technology teams, threats of media assets being stolen or held for ransom have become problematic. Assuring resilience against these attacks has become the highest priority for IT operations, which adds yet another layer of complexity for those struggling to find a complete data protection solution.
Although migrating your content repository to the cloud initially appears to be the cost-effective answer to silo constraints and media data growth challenges, this strategy simply replaces one problem with more issues – performance and unexpected cloud egress charges. With every interaction impacting your potential revenue, the speed and capacity of your file system to keep up with content creation demands becomes crucial.
This solution brief describes how Panzura’s next-generation cloud data management solution CloudFS future-proofs you against data growth and data protection challenges with global scalability, performance, and resiliency.
Every single country has Panzura running, and they are globally working together. It's fundamental because, for the majority of the time over the last 20 years, those regions have been in silos and we had to shuffle content back and forth. - Disney
Consolidating Distributed Mega-Sized Media Files
The ever-increasing size of the files you work with means there's a constant need for more capacity. Traditionally, on-premises media storage was deployed and dedicated as a repository for specific applications such as editing, compositing, graphic creation, playout and so on. As applications and users multiplied, storage silos proliferated as well.
Although each site would function individually, data efficiencies were not addressed at the global level, meaning duplicated files could reside at each site and – depending on the type of deduplication process in use – within each storage volume.
Not all solutions deduplicate data in the same way. This really matters, as the outcomes of the different approaches differ dramatically.
With volume deduplication, for example, duplicate data is identified and removed within each volume. However, that still means a storage repository with 100 volumes can hold the same single "deduped" file in every one of those volumes.
This can consume an enormous amount of unnecessary storage space. For example, if a 10GB file exists in each of the 100 volumes, that single file occupies 1TB of total capacity.
By contrast, global deduplication looks across an entire file system, removing redundant copies and leaving just one authoritative copy of that file. With global deduplication, that same 10GB file will occupy just 10GB of storage.
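To make the arithmetic concrete, the two approaches can be compared with a minimal sketch (illustrative only; the function names are invented for this example):

```python
# Illustrative comparison of volume-level vs. global deduplication
# for one 10GB file replicated across 100 volumes.

FILE_SIZE_GB = 10
NUM_VOLUMES = 100

def volume_dedup_footprint(file_size_gb: int, num_volumes: int) -> int:
    """Each volume keeps its own 'deduped' copy, so the file repeats per volume."""
    return file_size_gb * num_volumes

def global_dedup_footprint(file_size_gb: int, num_volumes: int) -> int:
    """One authoritative copy exists across the entire file system."""
    return file_size_gb

print(volume_dedup_footprint(FILE_SIZE_GB, NUM_VOLUMES))  # 1000 GB, i.e. ~1TB
print(global_dedup_footprint(FILE_SIZE_GB, NUM_VOLUMES))  # 10 GB
```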
Panzura’s global file system CloudFS consolidates distributed multi-media data into a single, authoritative data set that is visible, and accessible, across the organization.
CloudFS deduplicates redundant data before moving it to your chosen cloud or object store, using global deduplication. Panzura doesn’t simply deduplicate files though. Instead, CloudFS looks at the data blocks that comprise files, and deduplicates at the block level. That means that files that contain identical elements also benefit from deduplication, even though the files themselves are not identical.
This can allow you to realize a very significant reduction in your overall data footprint. CloudFS maintains this globally deduplicated data set at all times, checking for redundancies every time it moves data into your cloud storage.
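The block-level approach described above can be sketched as a content-addressed store (an illustrative model with a toy block size; Panzura's actual block sizes and hashing choices are not specified here):

```python
import hashlib

BLOCK_SIZE = 4  # toy block size for illustration; real systems use KB/MB blocks

def dedup_store(files: dict) -> tuple:
    """Store each unique block once, keyed by its content hash;
    each file becomes a manifest of block references."""
    blocks = {}     # hash -> block bytes, stored exactly once
    manifests = {}  # filename -> ordered list of block hashes
    for name, data in files.items():
        refs = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            h = hashlib.sha256(block).hexdigest()
            blocks.setdefault(h, block)  # identical blocks dedupe here
            refs.append(h)
        manifests[name] = refs
    return blocks, manifests

# Two different files sharing identical blocks still benefit:
files = {"a.mov": b"AAAABBBBCCCC", "b.mov": b"AAAABBBBDDDD"}
blocks, manifests = dedup_store(files)
print(len(blocks))  # 4 unique blocks stored instead of 6
```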
Accessing Data in Real Time
With CloudFS, all content creators in your organization work from the authoritative data set stored in your cloud or object store. No changes in workflows, or user behaviors are required – users interact with media files in the same way they always have, and CloudFS provides them the same performance as the on-premises file experience. That means that files open and save as quickly as they did when stored locally.
Maintaining Immediate File Consistency
CloudFS is the fastest global file system on the planet, making even mega-sized media file edits visible to users as soon as they need to see them, regardless of the number of locations you have, or how far apart they are.
No more scheduled file replication across sites – Panzura’s real time file consistency ensures that users can rely on working the authoritative file, complete with any changes, at all times.
CloudFS enables cross-site collaboration on files in a way nobody else can. For applications that allow it, Panzura automatically locks a file for editing the moment it's opened, so only one user can write to a file at a time. Where applications support byte-range locking, Panzura enforces that too, allowing multiple users to work in the same file while locking only the elements they are working on. It's the file experience users have when they're sitting right next to each other, even if they're a world apart.
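The two locking modes can be sketched as a simple in-memory model (hypothetical names and structure; this illustrates the semantics described above, not Panzura's implementation):

```python
class LockManager:
    """Sketch of whole-file write locks plus byte-range locks."""

    def __init__(self):
        self.file_locks = {}   # path -> user holding the whole-file lock
        self.range_locks = {}  # path -> list of (start, end, user)

    def lock_file(self, path, user):
        """Grant an exclusive whole-file lock to the first opener."""
        if path in self.file_locks:
            return False  # someone else is already editing this file
        self.file_locks[path] = user
        return True

    def lock_range(self, path, start, end, user):
        """Grant a byte-range lock unless it overlaps a held range."""
        held = self.range_locks.setdefault(path, [])
        for s, e, _ in held:
            if start < e and s < end:  # requested range overlaps a held one
                return False
        held.append((start, end, user))
        return True
```

With whole-file locking, a second user's open is refused; with byte-range locking, two users can hold non-overlapping regions of the same file simultaneously.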
Transferring Large Files, Fast
Even the largest video files can be rapidly made available across all locations using CloudFS. Panzura’s compression techniques minimize the amount of data to transfer at any time, without disrupting video quality.
Keeping Data Protected
CloudFS provides built-in protection against accidental data deletion, or damage caused by malware or ransomware attacks, with a resilient data architecture. Data managed by CloudFS is stored in an immutable – Write Once, Read Many – format so that once it's in your object storage, it cannot be changed.
New data created by file edits or new file creation are stored as new data blocks. No data is ever overwritten, so any malware or ransomware does not damage files in CloudFS.
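The write-once behavior can be modeled as an append-only block log (a hypothetical sketch of the semantics, not CloudFS internals):

```python
class ImmutableStore:
    """Sketch of write-once semantics: edits append new blocks;
    existing blocks are never overwritten."""

    def __init__(self):
        self.blocks = []  # append-only log of data blocks
        self.files = {}   # filename -> list of block indices

    def write(self, name, data_blocks):
        """Store a file as references into the append-only log."""
        refs = []
        for b in data_blocks:
            self.blocks.append(b)  # always appended, never overwritten
            refs.append(len(self.blocks) - 1)
        self.files[name] = refs

    def edit(self, name, block_pos, new_data):
        """An edit writes a new block; the original block stays intact."""
        self.blocks.append(new_data)
        refs = list(self.files[name])
        refs[block_pos] = len(self.blocks) - 1
        self.files[name] = refs
```

Because no block is ever modified in place, data written before an attack remains untouched even if later writes are malicious.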
Restoring Damaged or Lost Data
Read-only system snapshots are taken on a scheduled basis, and these record the file system at that point in time. Additionally, snapshots are taken at every location in the CloudFS every 60 seconds. These provide the ability to restore any file to any point in time as required.
In the event of any file damage – whether caused accidentally or as part of a wider encryption attack such as a ransomware event – individual files, folders or the entire file system can be restored to a pristine state with no data loss, and minimal disruption.
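Point-in-time restore from scheduled snapshots can be sketched like this (illustrative model; integer timestamps stand in for the 60-second snapshot cadence):

```python
import copy

class SnapshotFS:
    """Sketch of read-only snapshots and point-in-time restore."""

    def __init__(self):
        self.files = {}      # live file map
        self.snapshots = []  # list of (timestamp, frozen file map)

    def take_snapshot(self, timestamp):
        """Record a read-only view of the file system at this moment."""
        self.snapshots.append((timestamp, copy.deepcopy(self.files)))

    def restore(self, timestamp):
        """Roll the live file map back to the most recent snapshot
        at or before the given time."""
        for ts, frozen in reversed(self.snapshots):
            if ts <= timestamp:
                self.files = copy.deepcopy(frozen)
                return True
        return False
```

A file encrypted by ransomware after a snapshot was taken can be rolled back to its clean, pre-attack state.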
Ensuring Data Complies With Your Security Requirements
Streamlining data management through intelligent deduplication and a single authoritative data source allows IT teams to more easily control where data is stored, who has access, and to monitor that access. Data is securely encrypted in flight and at rest, so it cannot be read even if it is intercepted.
Empowering High Availability
CloudFS gives you the level of high availability you need to maintain a productive workforce. Because data is stored securely in the cloud, every location in the global file system always has read access to data from every other location. In the event of a disaster in one location, every other location already has access to the data for immediate recovery.
Panzura Lets You Work With Your Data, The Way That Works For You
Every part of Panzura's data management solution has been specifically and intentionally designed to let you manage, protect and work with enormous volumes of the files that matter most to you, so you can reach your goals faster. You choose the cloud storage provider that works for you, and we'll help you get there.
Proven in the Most Demanding Environments
CloudFS is designed for high performance at scale, and is used by some of the largest and most respected media and entertainment firms on the planet. Should we talk?