Think about how much data your company created in 2016. According to IDC, the world collectively created about 16.3 zettabytes that year (yes, a zettabyte is a billion terabytes). IDC also predicts that the world will create ten times that amount by 2025. Approximately 90% of that data will be stored in file and object storage. While consumers have traditionally created the bulk of the world's data, enterprises will create 60% of it in 2025, and ten percent of all data will be created by IoT. This exponential growth brings a number of challenges:
- The complexity of managing the ever expanding scale of data storage infrastructure
- The challenge of accessing the immense scale of data
- The challenge of dealing with different types of data
Companies have kept pace with the explosive growth of file-based data by purchasing more storage. This, of course, only adds to management complexity. Recently, companies have gravitated to flash storage to increase the efficiency and speed of writing and accessing data. This, however, creates a conundrum, because not all data is created or used equally. While high-value data can justify the cost of permanent flash storage, unstructured data may not. According to Qumulo, “Unstructured, file-based data is the crown jewel of the modern day enterprise and petabyte scale data storage is the new normal.”
Fortunately, there are pioneers paving the way to the next generation of data storage solutions. One of these trailblazers is Qumulo, a leader in data-aware scale-out NAS solutions. Many data-intensive organizations are turning to scale-out storage systems to regain control of their data. In fact, Gartner estimates that more than 80% of enterprise data will be stored in scale-out storage systems in enterprise and cloud data centers, up from 30% today.
Qumulo’s foundational solution is Qumulo Core: a new generation of scale-out storage engineered to store and manage unstructured, file-based data at web scale. This scale-out file and object storage is a software-defined solution that runs on commodity hardware, providing increased scalability and cost effectiveness. Because Qumulo Core uses no custom device drivers or proprietary hardware, it has the flexibility to run on a variety of standard platforms. Two popular options today are a pre-engineered appliance from HPE, which now offers Qumulo Core on its Apollo servers through its partnership with Qumulo, and virtual machines, both on-premises and in the cloud.
Qumulo Core utilizes a flash-first hybrid design that uses both solid-state drives (SSD) and high-density hard disk drives (HDD) in separate tiers to achieve the optimal balance between capacity, performance, and cost. All data is initially written to SSD, and these SSDs also serve as a cache for recently written data. As your storage capacity grows, data that is accessed less frequently is moved to HDDs. This ensures that the right data is matched with the right media.
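To make the flash-first idea concrete, here is a minimal sketch of this style of tiering policy. This is an illustration of the general technique only, not Qumulo's actual implementation: the class name, the block-level granularity, and the seven-day demotion threshold are all hypothetical.

```python
import time

# Hypothetical demotion threshold: data untouched this long is "cold".
COLD_AFTER_SECONDS = 7 * 24 * 3600

class TieredStore:
    """Toy flash-first tiering: writes land on SSD, cold blocks sink to HDD."""

    def __init__(self):
        self.ssd = {}   # block_id -> (data, last_access_time)
        self.hdd = {}   # block_id -> data

    def write(self, block_id, data):
        # All writes land on flash first.
        self.ssd[block_id] = (data, time.time())

    def read(self, block_id):
        if block_id in self.ssd:
            data, _ = self.ssd[block_id]
            self.ssd[block_id] = (data, time.time())  # refresh recency
            return data
        # Promote on read so data that turns hot again returns to flash.
        data = self.hdd.pop(block_id)
        self.ssd[block_id] = (data, time.time())
        return data

    def demote_cold(self, now=None):
        # Background step: move blocks not accessed recently down to HDD.
        now = time.time() if now is None else now
        cold = [b for b, (_, t) in self.ssd.items()
                if now - t > COLD_AFTER_SECONDS]
        for block_id in cold:
            data, _ = self.ssd.pop(block_id)
            self.hdd[block_id] = data
```

The key design point the sketch shows is that tier placement is driven by access recency rather than by the data's location in the namespace, which is how a hybrid system keeps hot data on flash without administrator intervention.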
Another dilemma for companies is how to gain visibility into their data in order to determine what it is and how it is stored and used. Qumulo Core’s data-aware design is built to handle tens of billions of files and provides visibility and control for file systems at petabyte scale. It offers real-time analytics built into the file system itself, fostering greater visibility into which data is most valuable and where it is stored. With Qumulo Core, storage administrators know which users or applications are accessing which files and what data is backed up. Just as important, it provides insight into why data is growing. Because administrators can see usage, activity, and throughput at any level of the unified directory structure, they can better pinpoint problems and effectively manage how storage is used. Its signature Qumulo Scalable File System (QSFS) captures information about the data being stored, creating a footprint of that data. Qumulo leverages this information through a patented database built directly into the nodes of the file system and designed to answer queries against that data.
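To illustrate why analytics built into the file system can answer "how big is this directory tree?" in real time, here is a minimal sketch of the underlying aggregation idea, assuming each directory node keeps a running subtree total that is updated as files change. The names and structure are hypothetical and much simpler than Qumulo's actual patented design; the point is that a usage query at any level becomes an O(1) lookup instead of a full tree walk.

```python
class DirNode:
    """Toy directory node that maintains a real-time subtree-size aggregate."""

    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = {}
        self.total_bytes = 0   # aggregate over this entire subtree

    def subdir(self, name):
        # Get or create a child directory.
        if name not in self.children:
            self.children[name] = DirNode(name, parent=self)
        return self.children[name]

    def add_file(self, size):
        # Propagate the size change up to the root so every ancestor's
        # aggregate stays current the moment the file is written.
        node = self
        while node is not None:
            node.total_bytes += size
            node = node.parent

root = DirNode("/")
root.subdir("projects").subdir("genomics").add_file(4_000_000)
root.subdir("projects").add_file(1_000_000)
print(root.total_bytes)                     # 5000000
print(root.subdir("projects").total_bytes)  # 5000000
```

The same propagation trick extends to other per-directory metrics such as file counts or recent activity, which is the kind of rollup that lets an administrator drill into usage at any level of the namespace without scanning billions of files.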
Flexible and Dedicated Support
Because Qumulo is software based, it constantly delivers innovation through continual software updates. Qumulo also monitors the health of your data cluster, analyzing performance and reporting failures in real time. For customers running Qumulo on HPE Apollo servers, hardware issues are reported to HPE’s support organization. Customers that enroll in the Qumulo Customer Care Program are even assigned their own customer success manager, who will guide them throughout the product lifecycle.
Curious to learn more about Qumulo and its scale-out storage systems and solutions? Talk to WEI about our experience with Qumulo and how our customers are utilizing Qumulo to manage enterprise data storage. Contact us today.