The story behind Software Defined Storage

By David Ellis, CTO and Senior Director Services – EMEA at Arrow ECS EMEA.


It’s no secret that both structured and unstructured data are growing at an immense rate. Data, in some way or another, is part of our everyday life. Although we’re aware that it’s growing exponentially, many aren’t aware of just how vast the volumes are. Add the rise of social media, mobile and, now, Machine to Machine (M2M) and the Internet of Things (IoT), and the amount of data available is only set to keep increasing.

But where’s it all going to go? After all, data needs to be stored somewhere.

This data tsunami is continuing to put pressure on the data centre and with no signs of abating any time soon, organisations need to look at the best options available for keeping their valuable information housed, managed, maintained and protected. It’s vital, therefore, that data centres are built so that they can scale and grow in-line with this anticipated usage, particularly if they’re to provide the highest quality of service through an ‘always on’ environment.

As the data evolves, so must the solutions. One of these solutions is the Software Defined Data Centre (SDDC), which can provide service providers and enterprises with improved agility, automation and flexibility for their users. This, in turn, will drive cost reductions – a win-win situation for all involved.

The SDDC has, in fact, been discussed for several years, albeit largely in theory, primarily because implementing it is considered a complex and potentially onerous task. Making the transition from a physical data centre to an SDDC means thinking about the underlying physical infrastructure. However, it needn’t be a drastic and radical transition; rather, it can be implemented step by step to address the changing needs of the business or organisation as they occur.

The first step for data centre managers in making the transition to an SDDC will be to consider whether their networks are able to handle the large increase in bandwidth requirements caused by this data explosion. They’ll also need to think about how their server technologies will deal with the analytics of such vast amounts of data, which will surely demand high compute power. As more data is created, storage capacity will need to be increased alongside backup and archiving provisions. To help with this, technologies like flash and hybrid storage are becoming far more widely deployed to ensure performance requirements are met.
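To see why capacity planning matters here, it helps to look at how quickly exponential growth compounds. The sketch below uses entirely hypothetical figures (a 100 TB estate growing 40% a year) purely to illustrate the arithmetic; the article gives no specific numbers.

```python
# Illustrative capacity projection under steady compound growth.
# All figures are assumptions chosen for illustration, not from the article.

def projected_capacity(current_tb: float, annual_growth: float, years: int) -> float:
    """Project storage demand assuming a fixed compound annual growth rate."""
    return current_tb * (1 + annual_growth) ** years

# A hypothetical 100 TB estate growing 40% per year:
for year in (1, 3, 5):
    print(f"Year {year}: {projected_capacity(100, 0.40, year):.0f} TB")
```

Even at this modest assumed rate, demand more than quintuples in five years, which is why provisioning, backup and archiving have to scale together rather than being sized once.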

As the demand for compute power fluctuates in response to changes in the business environment, it can become impossible or uneconomical to manage it all in a central location. Co-location is an option to consider in order to aggregate and process the data, but operating a distributed model (central and remote aggregation data centres) can create management challenges for data centre staff. This only strengthens the case for an automated, software-defined infrastructure. The beauty of the SDDC is that it allows many functions to be automated, reducing data centre management overheads.
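The automation described above is, at its core, policy-driven: a control plane inspects the state of the infrastructure and acts without an administrator in the loop. The sketch below is a minimal, hypothetical illustration of that idea — the `StoragePool` class, the 80% threshold and the 50 TB expansion step are all assumptions, not part of any real SDDC product.

```python
# A minimal sketch of policy-driven automation in a software-defined
# infrastructure. Everything here (class, threshold, step size) is a
# hypothetical example, not a real control-plane API.

from dataclasses import dataclass

@dataclass
class StoragePool:
    name: str
    capacity_tb: float
    used_tb: float

    @property
    def utilisation(self) -> float:
        return self.used_tb / self.capacity_tb

EXPAND_THRESHOLD = 0.80  # assumed policy: grow a pool once it is 80% full

def plan_expansions(pools, step_tb=50.0):
    """Return (pool name, new capacity) for each pool the policy would grow."""
    return [(p.name, p.capacity_tb + step_tb)
            for p in pools
            if p.utilisation >= EXPAND_THRESHOLD]

pools = [StoragePool("central", 500, 420),   # 84% full -> flagged
         StoragePool("remote-1", 200, 90)]   # 45% full -> left alone
print(plan_expansions(pools))
```

The point is the shape of the loop, not the specifics: the same observe-decide-act pattern is what lets a distributed estate of central and remote data centres be managed without proportionally more staff.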

SDDCs will also help companies drive more value from the data through business analytics and intelligence. This is particularly important with the rise in unstructured data from social applications and networks such as Facebook, LinkedIn and Twitter. The use of predictive analytics, in particular, will help companies derive real ROI from these new technologies.

Another important aspect to bear in mind with the SDDC is security. Many organisations are moving to all-wireless workplaces, delivering tailored content dependent upon location and user. With factors such as the increased ‘east-west’ traffic in the virtualised data centre and the sharp increase in the number of connected devices, security risks will only continue to grow.

The changes may also lead more organisations to adopt a hybrid cloud strategy and place less critical applications into the public cloud.

With an explosive amount of data expected to be created in the 3rd Platform era and even greater demands being put on IT managers and data centres, now is the time to review current infrastructures and take the first steps in tackling the inevitable future of unstructured data.
