IN 1991 Geoffrey Moore published ‘Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers’.
This seminal marketing textbook explored the ‘diffusion curve’ – the rate at which new technology products and services are adopted across any given market. The book is still in print, still a bestseller and, as Tom Byers, Faculty Director of the Stanford Technology Ventures Program, Professor of Engineering and a leading light in the US tech venture capital scene, put it a few years back, ‘still the bible for entrepreneurial marketing’. The reason is simple: Moore’s description of the way new technologies are adopted in enterprises illustrates a clear truth or, as ordinary mortals like me might put it, ‘he nailed it’.
Even if you haven’t read the book (essential reading for marketing professionals is not necessarily essential reading for subscribers to Storage Magazine) you will be familiar with his terms: the ‘innovators’, ‘early adopters’, ‘early majority’, ‘late majority’ and ‘laggards’.
(insert illustration of adoption bell curve).
Innovators are the IT people who take up products with exciting potential whilst they are still in their infancy. In many cases this is unproven ‘beta’ product, best described as a working test with a live customer base; it’s a huge risk compared with proven ‘off the shelf’ approaches, but it generally offers something new and unique. Next come the early adopters, the visionary first movers who see the potential in new technology as it becomes workable and rapidly adopt it for their own advantage.
When a technology succeeds with the early adopters, it can then ‘cross the chasm’ into early and late ‘majority’ territory: mainstream acceptance and sales nirvana for the vendor.
Finally, we have the laggards, the uber-cautious who take on ‘new’ technology only after everyone else has. But it’s not called a chasm without good reason: for every vendor who successfully bridges the gap, there are legions who fall into the depths and become footnotes in tech history. This happens even when the technology is demonstrably superior (Betamax vs VHS is the classic example: Betamax offered better picture quality and a smaller form factor than VHS, but a small price difference was enough to kill it – early adoption did not give rise to mainstream acceptance).
Moore’s description of technology adoption works in nearly every circumstance, especially in ‘me too’ markets where there is little concrete differentiation between products. Nowhere is this more true than amongst the old giants of the enterprise storage market; take a look at proprietary solutions from any of the 20th century’s household names – be that EMC, Dell, HP, IBM – whatever – it’s a struggle to find something that makes a concrete, provable difference. They all have robust, broadly proprietary management software, broadly comparable hardware, broadly comparable prices and broadly comparable performance. It’s the choice between a BMW, a Jag, a Merc or a Lexus: everyone has their preference, but much of that comes down to familiarity and personal taste – no model costs half as much as another at retail, travels twice as fast, or costs half as much to run.
Yet, every so often, Moore’s take on the diffusion curve is wrong. Circumstances combine to create a situation where a technology doesn’t just cross the chasm, it builds the equivalent of an eight-lane motorway on a suspension bridge and charges across at speeds which exceed normal limits. For this to occur, two things need to happen:
1. The new technology needs to deliver a genuine step change: real benefits and obvious advantages, as opposed to marginal brand differences. It’s not for ‘me too’ launches.
2. The problem the technology solves, or the advantage it delivers, has to be big enough to offset the reservations customers on the early adoption side of the chasm feel. The ‘need’ of the buyer has to be genuine and pressing.
Both of these circumstances are now present in the enterprise storage market, and accordingly the market is going to change quickly and permanently. Traditional enterprise storage vendors face inevitable decline in their current business models and a ‘change or die’ moment, because the advantages of software defined storage are so great as to represent an end-of-an-era step change: when the Iron Age arrives, your highly optimised manufacturing processes, slick marketing and motivated sales channel count for nothing if you are working in bronze.
So what’s the step change advantage in software defined storage? Firstly, the cost advantage is huge. Software defined storage separates the physical storage layer (the data plane) from the data storage logic (the control plane). This approach eliminates the need for proprietary hardware and software; freed from both, IT teams can work on commodity x86 hardware and disks in cheap racks, generating as much as a 50% cost saving. That’s the first of our necessary market conditions for rapid adoption dealt with: real advantage in the form of huge cost savings, coupled with the additional advantage of avoiding vendor lock-in.
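For the technically minded, the separation of control plane from data plane can be sketched in a few lines of Python. This is a toy illustration only, not any vendor’s actual implementation: the class names, the trivial hash-based placement and the two-way replication are all hypothetical, chosen simply to show that once placement logic lives in software, the hardware underneath can be any pile of commodity boxes.

```python
# Toy sketch of software defined storage (illustrative only; all names
# are hypothetical). The control plane is pure software; the data plane
# is a set of interchangeable commodity nodes.

class CommodityNode:
    """Stands in for a plain x86 server with local disks."""
    def __init__(self, name):
        self.name = name
        self.blocks = {}          # key -> stored bytes

    def write(self, key, data):
        self.blocks[key] = data

    def read(self, key):
        return self.blocks[key]


class ControlPlane:
    """Placement and replication policy, decoupled from the hardware.
    Swapping nodes in or out changes nothing in this policy code."""
    def __init__(self, nodes, replicas=2):
        self.nodes = nodes
        self.replicas = replicas

    def targets(self, key):
        # Trivial deterministic placement: hash the key to pick
        # `replicas` consecutive nodes in the pool.
        start = hash(key) % len(self.nodes)
        return [self.nodes[(start + i) % len(self.nodes)]
                for i in range(self.replicas)]

    def put(self, key, data):
        for node in self.targets(key):   # replicate across nodes
            node.write(key, data)

    def get(self, key):
        return self.targets(key)[0].read(key)


# Usage: four cheap boxes behave as one logical storage pool.
cluster = ControlPlane([CommodityNode(f"x86-{i}") for i in range(4)])
cluster.put("invoice-2024.pdf", b"...")
assert cluster.get("invoice-2024.pdf") == b"..."
```

The point of the sketch is the shape, not the detail: a real system replaces the trivial placement function with something far cleverer, but the hardware below it stays commodity.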
So what about the second condition, the pressing problem or advantage? The key problems with storage are so often stated that everyone knows them: we are storing more and more data, much of it is unstructured, and we are keeping it for indefinite periods. Here are the top seven pain points as measured by 451 Research.
The biggest problem is data growth, a major issue for more than half of all enterprises, both large and small. The next biggest is managing the cost – which harks back to my previous point – and the third is capacity forecasting. Put these three together and you have a serious budget and management headache: data volumes growing at a fierce rate, real difficulty containing the cost, and little ability to forecast how much capacity you will need beyond ‘a lot more!’.
In these circumstances, given the clear advantages of software defined storage and the serious need for it driven by the problems IT faces, is it any wonder that analysts from Gartner to IDC are united in forecasting its unstoppable rise? In a word, no: open source based software defined storage marks the beginning of a new era of more agile, scalable and cost-effective storage, and will emerge as the dominant storage architecture. And the traditional storage vendors? They either get with the open source mammals on storage’s evolutionary path, or they go the way of the dinosaurs.