Playing the Flash Wars

Not all flash is created equal! That is the common message from three vendors we approached recently: Dell, Pure Storage and Tegile Systems. Dell is what you’d call a traditional storage vendor, one that until recently used spinning drives, or HDDs, exclusively in its arrays. The latter two – Pure and Tegile – belong to the solid-state disk, or SSD, camp. If you read a previous posting – The paths that lead to flash – you will surmise that even within the pure flash camp there is infighting over which path is the best.
According to Dell’s storage solutions director for South Asia, Harmeet Malhotra, flash is hot, and many companies believe that once they get flash, their problems will go away. Many of them are probably wrong!
Pure Storage hinted in an interview that some NetApp customers have experimented with replacing HDDs with SSDs in the hope of achieving the performance they desire without the expense of buying new purpose-built flash arrays.
Rob Commins, VP of Marketing at Tegile Systems, explained that such an approach won’t work: legacy arrays built around disk-drive performance dynamics do not fully leverage the characteristics of flash, nor do they deliver the efficiency and performance multipliers that in-line data de-duplication and compression offer. “There are significant differences in the capacity, performance and data management characteristics of flash-based solutions from startups as well,” he explains.
“The reality is that there are a few other considerations customers need to look at before deciding the best flash solution for their immediate business challenges as well as their long-term plans,” adds Malhotra.
Not all flash solutions perform similarly against varying business needs. Here are some considerations from Dell:

  • Does their workload primarily need read caching? That is the easiest solution and is available in many forms. Write caching is harder to deliver.
  • Do they need data security and redundancy? Server-based cache is the fastest way to get the highest performance, but the server then becomes a single point of failure, so smarter options need to be evaluated in that scenario.
  • How important is latency? The latency an application sees and what the storage delivers are not the same, as there are other steps between the storage and the application. So end-to-end optimization and expertise are key.
  • Which type of flash is best suited? There is SLC (single-level cell) – very fast and very expensive. And there is MLC (multi-level cell) – fast at reads and okay for writes; though less expensive than SLC, it's still more expensive than spinning disks.
  • Are there other helpful technologies? Dell offers "Data Progression", which allows simultaneous use of SLC and MLC alongside disks. The technology keeps moving data from SLC to MLC and, if needed, to disks. This can reduce the number of SSD drives needed dramatically, delivering almost flash-like performance at almost the price of disk.

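The "Data Progression" idea in that last bullet – placing data on SLC, MLC or disk according to how hot it is – boils down to an access-frequency policy. The sketch below is purely illustrative; the thresholds and tier assignments are assumptions of ours, not Dell's actual implementation.

```python
# Hypothetical illustration of access-frequency tiering, NOT Dell's
# actual Data Progression logic; thresholds are invented for clarity.
def assign_tier(reads_per_day, hot=1000, warm=100):
    """Place a data block on the cheapest tier its access rate allows."""
    if reads_per_day >= hot:
        return "SLC"   # hot data: fastest, most expensive flash
    if reads_per_day >= warm:
        return "MLC"   # warm data: cheaper flash, fine for reads
    return "HDD"       # cold data: lowest cost per TB

print(assign_tier(5000), assign_tier(500), assign_tier(10))
```

Because most data cools quickly after it is written, a policy like this keeps only a small working set on the expensive SLC tier, which is where the "flash performance at near-disk prices" claim comes from.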
If you are a business like Facebook, processing more than 300PB of data from your 1.19 billion monthly active users, you probably want the fastest storage in the world, right? But is flash the only answer?
Malhotra said that if the performance bottleneck is coming from a slower disk subsystem, then yes, flash must be evaluated. But it's important to remember that flash will be one part of the big picture, not the only one.
Commins is more cautious, saying that the decision of where and how to implement flash is tricky. He recommends going over use cases, highlighting requirements such as application mix, latency sensitivity, cost constraints and several other factors as part of the decision-making process. While such diligence may sound like a deterrent to moving to flash, it is aimed at ensuring there are no surprises later on.
An organization needs to view the business problem holistically and find the exact bottleneck before deciding whether flash will really solve it. Dell offers its customers a tool called DPACK that looks at the big picture – evaluating everything from CPU to memory to network and disk I/O – to help determine where the exact bottleneck is. On top of that, there are OS and application tools that help isolate whether specific parts of an application need optimization.
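To make the "find the exact bottleneck first" advice concrete, here is a toy sketch of the kind of whole-stack check a tool like DPACK automates. The metric names and the 85% saturation threshold are our own invented placeholders; a real assessment would sample these figures continuously over a representative workload period.

```python
# Toy illustration of whole-system bottleneck hunting; metric names
# and the 85% threshold are invented, not DPACK's actual logic.
def find_bottlenecks(metrics, threshold=0.85):
    """Return the resources whose utilization exceeds the threshold."""
    return [name for name, util in metrics.items() if util > threshold]

# Sampled utilization (as fractions of capacity) across the stack:
sampled = {"cpu": 0.40, "memory": 0.55, "network": 0.30, "disk_io": 0.95}

print(find_bottlenecks(sampled))  # only the disk subsystem is saturated
```

If disk I/O is the only saturated resource, as in the sample above, flash is worth evaluating; if CPU or memory turns out to be the culprit, flash alone will not help.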
If it is determined that the disk I/O is causing slow performance, evaluating the right flash solution is the way to go.
It should be remembered that flash is not meant to replace all of an organization’s existing storage infrastructure. Both Dell and Tegile agree that flash is an add-on to existing storage infrastructure and meant to support applications that demand higher performance.
Flash can be used to augment performance of existing arrays that still have a long service life, can be added as a separate storage system adjacent to an existing array, or replace an array entirely when a tech refresh is in order.  All are viable strategies, depending on the organization's circumstances.
Dell's point of view is that every scenario requires a thorough evaluation. There could be scenarios where the old equipment is simply too slow and is best replaced, but for most business needs, mixing the two together and following a time-bound roadmap will be the better fit.
Disks still provide the lowest cost/TB and for high capacity requirements, they will still make business sense in many cases. Likewise, for many organizations tape media will continue to be the lowest cost storage media for long-term or deep archive. It’s just a matter of applications and priorities.
What is clear, though, is that when it comes to flash we have not seen the end of it. The landscape for flash-based solutions is just beginning to heat up, and as new customer dynamics emerge, competitive pressure will mount on vendors – startups and legacy alike – to come up with solutions that meet both the business and the technical requirements.
Who will win the war? For now, startups probably offer the best when it comes to new techniques and innovation. Of course, legacy vendors have deep pockets and the capability to put feet on the ground to solve implementation and support challenges that will come up as a result of mixing new technologies with old infrastructure. It’s no secret that many startups are just looking to be acquired. It’s just a matter of price and timing.

Gartner recently unveiled its Magic Quadrant of flash storage vendors. It’s interesting to see who made the cut and who didn’t. It’s also interesting to note the criteria in the selection process (why are Cisco and Huawei on the list, for instance?). While there will be arguments as to why certain vendors did not make the list, it is worth remembering that this is just one analyst firm’s view. At the end of the day, the customer will have to decide, based on multiple factors, what solution is right for their organization. And it may not even have to be a flash-based one.


1 comment – jallantan@hotmail.com:
It is interesting that visionaries are usually led by startups. In the case of IBM and EMC, their claim to visionary fame is likely from acquisitions. Can we say this is because most traditional vendors lack vision and therefore creativity in solving business problems with new technology?