CIOs should buy less storage. Everybody talks about the data explosion and the need to buy more storage, but the truth is that adding more capacity is not the solution.
In the region, as in the rest of the world, CIOs often go to their storage vendors and say "we need another 100 terabytes", and most vendors oblige by selling them exactly that. They are trying to solve their storage problems by throwing capacity at them. However, with storage utilization rates typically at only 30 to 40 percent, the problem is not a lack of capacity but low utilization.
According to a report by The Bloor Group, data storage requirements are rising at an average rate of 55 percent per annum, while costs are falling at a far slower rate. Data storage is therefore consuming a growing share of the IT budget – The Bloor Group estimates it could reach 40 percent of data center costs. Unfortunately, that budget is not growing as fast as the expectations of CEOs and CMOs, who increasingly demand the ability to mine data for customer insights in real time.
The cost of acquisition usually makes up only around 20 percent of the TCO of storage infrastructure. What will blow their budgets are the other attendant costs – data center floor space costs, power consumption and cooling costs, and labor costs. For this reason alone, CIOs need to think long and hard before adding more capacity.
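To make the arithmetic concrete, here is a minimal sketch of what the 20 percent figure implies for the full bill. The dollar amounts are hypothetical example values, not IBM pricing; only the 20 percent acquisition share comes from the text above.

```python
# Illustrative TCO arithmetic: if acquisition is only ~20 percent of
# total cost of ownership, the attendant costs (floor space, power,
# cooling, labor) are four times the purchase price.

def total_cost_of_ownership(acquisition_cost, acquisition_share=0.20):
    """Estimate full TCO when acquisition is a known share of it."""
    return acquisition_cost / acquisition_share

acquisition = 100_000  # hypothetical hardware purchase price
tco = total_cost_of_ownership(acquisition)
attendant = tco - acquisition  # floor space, power, cooling, labor
print(tco, attendant)  # 500000.0 400000.0
```

In other words, every dollar saved on capacity avoids roughly four more dollars of attendant cost over the life of the system.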
Choosing the smarter storage solution
The only way to bring storage growth under control – while delivering the service levels required – is to build capabilities and automation into the storage architecture at every level, reducing the complexity of managing the storage and, in turn, the cost.
Take a closer look at how smarter, more intelligent features – often delivered by storage software embedded within the storage architecture – can help get the most out of every single byte. Smarter storage architectures share three key characteristics. First, they are inherently efficient by design. Second, they are self-optimizing: they analyze data patterns and automatically make adjustments to balance performance and cost. Finally, they are cloud-ready, designed to adapt to rapidly changing conditions in a highly virtualized environment and to manage storage from any and every storage vendor. Things to look out for when evaluating storage solutions include:
For most data centers, the escalating labor cost of managing storage is a primary concern. Look for management interfaces that can handle multiple storage systems, including legacy devices from any vendor. A unified interface significantly cuts down on complexity, simplifying the entire storage management process.
With data consolidation and storage virtualization, data can be stored anywhere without affecting the user experience. Right-tiering technologies – such as Easy-Tier from IBM – automatically migrate highly active data from slower to faster storage devices, for example from hard disk drives to flash-based solid state drives. In an SPC-1 benchmark with Easy-Tier, IBM showed it could achieve a 300 percent performance increase with less than 3 percent of the storage architecture on SSD. Right-tiering technologies should be self-learning and automatic, without requiring the storage administrator to manually manage policies for what data to place where.
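The idea behind right-tiering can be sketched in a few lines: rank data by how hot it is and promote only the hottest slice to the small, fast tier. This is a toy illustration of the general technique, not IBM Easy-Tier's actual algorithm; all names and numbers are invented.

```python
# Minimal right-tiering sketch: promote the most-accessed extents to a
# small SSD tier, leave everything else on HDD. Real products track
# access heat continuously and migrate in the background.

def place_extents(access_counts, ssd_capacity):
    """access_counts: {extent_id: I/O count}; returns (ssd, hdd) sets."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    ssd = set(ranked[:ssd_capacity])   # hottest extents go to SSD
    hdd = set(ranked[ssd_capacity:])   # the long cold tail stays on HDD
    return ssd, hdd

counts = {"a": 900, "b": 15, "c": 340, "d": 2}
ssd, hdd = place_extents(counts, ssd_capacity=1)  # tiny SSD tier
print(ssd, hdd)  # {'a'} carries most of the I/O
```

The point the benchmark makes is visible even in this toy: a small fraction of the data ("a" here) often absorbs most of the I/O, so a small fast tier captures most of the benefit.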
Real-time compression, a technology from IBM, compresses data as it is written to a storage device, which means much less capacity is required from the outset. The most attractive part of this technology is that it does not compromise performance.
As an example, IBM ran a test on an Oracle OLTP workload with 700 users and found no difference in transaction counts or response times with or without real-time compression. In another test, with an Exchange workload, performance actually improved by about 80 percent when real-time compression was turned on.
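The write-path principle is simple to illustrate: compress the bytes before they reach the device, decompress transparently on read. This sketch uses Python's standard zlib purely for illustration; IBM's product uses its own algorithms and is implemented inside the storage controller.

```python
# Toy illustration of inline (write-path) compression: the device
# only ever stores the compressed bytes, so less capacity is needed
# from the outset. Reads decompress transparently.
import zlib

def write_compressed(store, key, data):
    store[key] = zlib.compress(data)    # compress as it is written

def read_decompressed(store, key):
    return zlib.decompress(store[key])  # transparent on the read path

device = {}
payload = b"log line repeated " * 500
write_compressed(device, "blk0", payload)
assert read_decompressed(device, "blk0") == payload
print(len(payload), len(device["blk0"]))  # stored size is far smaller
```

Repetitive workloads such as logs and databases compress especially well, which is why the capacity savings show up immediately rather than as a later cleanup.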
Data deduplication – which can work well alongside real-time compression – is often confused with it. While real-time compression compresses data as it is written, deduplication occurs after the data has been saved to the storage device. Both have a role to play in reducing data storage requirements, but they are used in different ways.
Storage virtualization is an important first step for enterprises looking to increase the efficiency of their existing storage infrastructure. With storage virtualization you will get greater flexibility and speed in adapting the storage infrastructure to changing business requirements as well as to the introduction of new applications.
Does it solve my storage problem?
Whether it is virtualization, real-time compression or right-tiering, the question CIOs still want answered is this: how do I bring storage costs under control while delivering on my business objectives and retaining the flexibility to deal with new requirements?
CIOs are increasingly coming to the conclusion that buying more and more capacity is not the way to go. A smarter storage architecture – one that increases efficiency and optimizes existing storage investments – is the better answer to your data explosion woes.
Ong Chee Beng, Storage Business Unit Executive, ASEAN, IBM