AS REPORTED a couple of issues ago, the DCA is one of eight beneficiaries, and the technical lead, of the EURECA project. This is the second European Commission-funded project in which the DCA has played a leading role. This time, the call came from the Horizon 2020 research and innovation programme, dealing with energy efficiency and market uptake, and relates specifically to data centres and the public sector.
At a recent DCA Board of Governors meeting, almost everyone around the table had stories to tell about tenders received that, at the very best, were based on misunderstood information, so this is clearly a call that sits right at the centre of many DCA members' daily issues.
The project, therefore, is focussed on providing coordination and support for the uptake of high energy performance data centre-related products and services within Europe's Public Sector organisations. In a very competitive field involving 27 rival bids, our winning proposal is based on providing Public Sector procurement teams with access to a tool which incorporates all the industry's best practices, performance indicators and metrics.
The DCA is joined by experts from CBRE/Norland, Telecity, Carbon3IT, Green IT Amsterdam, Certios and Maki Consulting within the consortium, which is led by the University of East London as Project Coordinator. The total budget for the project is €1.5M. It's a simple concept: we will build a tool that can support a non-expert (say, a procurement officer), who can input data about the format and set-up of the data centre along with which best practices have been deployed. We can then help them identify the opportunities for energy savings, and gain a broad overview of the procurement actions that are needed to improve the environmental performance of the facility in question.
We also intend to include key resources for these procurement actions, such as market navigation and benchmarking information about the whole range of data centre products and services offered by the market, whether that be data centre equipment, hardware or software through to services such as hosting and training. Clearly this project has the potential to open up many new market opportunities for the industry sector that can fulfil these requirements. It may also attract the backing of EU or national policies in supporting its use. One thing that has struck me across the length and breadth of the DCA membership is what they all have in common – obviously they all have business interests in building and operating data centres, but no matter what the product or service on offer, to a lesser or greater degree, ALL are selling energy efficiency.
So joining the DCA and being listed as an industry supplier is more important now than ever. If we are successful, the scenario is a win-win for both the industry and the public sector – data centres will realise the benefits of the innovations the data centre industry has developed in recent times by saving money and energy, and the supply and service sector gets access to a largely untapped market across Europe.
The website is now live and provides more details; it can be found at www.EURECA-project.eu
Business rates advice for Data Centre operators
What you need to know ahead of the 2017 Revaluation
By Phil Ray, Director, Altus Edwin Hill.
THE SUBJECT OF BUSINESS RATES is rarely out of the press these days. The Government recently announced a wide scale review of the business rates system in England in response to concerns from occupiers over whether it is fit for purpose in the 21st Century.
Despite this review, the Valuation Office, which sets the level of rateable values in England and Wales, has already begun the next revaluation, which will result in new rateable values from 1 April 2017. But what does all this mean for Data Centres, and what can owners and occupiers do to ensure they are not paying too much in the way of business rates now and in the future? Data Centres are a unique class of property, valued differently from other property types, and the valuation process requires great skill and expertise.
These buildings may look like an office or more commonly a warehouse from the outside, but their valuation takes account of structural differences, internal fit-out and the considerable elements of plant and machinery. Typically this results in a considerably higher valuation figure than a similarly sized office or warehouse building would attract.
Protect your position
So what can be done now to protect a company's position and to ensure it is not paying more in business rates than it needs to?
Existing rateable values remain in force for another two years, up to 31 March 2017, following which new rateable values will come into effect. The Government recently imposed a deadline limiting the financial impact of business rates appeals raised after 31 March 2015: prior to this date, the financial benefit of a reduction would go back to the start of the rating list in April 2010; after this date, any savings achieved can generally only be backdated to 1 April 2015. It is, however, still possible to backdate these reductions to before 1 April 2015, as the Valuation Office has powers to backdate the reduction to April 2010, but it must do so before 1 April 2016. Ratepayers must act very quickly to take advantage of this.
Even if appeal savings are restricted to the final years, they can be substantial and also have an impact on future liability, as factual changes made now should also be reflected in the calculation of a 2017 rating assessment. A lower 2010 figure can produce savings at the start of the next rating list due to the Government's transitional relief scheme, which limits the percentage a business rates bill can be increased or decreased each year following a revaluation.
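To make the transitional relief point concrete, the sketch below (Python, using entirely hypothetical rateable values, multiplier and cap; the real scheme's rates and thresholds are set by the Government for each revaluation) shows how a capped year-on-year increase is calculated from the previous year's bill.

```python
# Illustrative only: hypothetical rateable values, multiplier and cap.
# The actual transitional relief percentages are set by the Government
# for each revaluation; these figures are placeholders.

def rates_bill(rateable_value, multiplier):
    """Basic liability: rateable value x uniform business rates multiplier."""
    return rateable_value * multiplier

def capped_bill(previous_bill, uncapped_bill, max_increase=0.10):
    """Transitional relief limits how far a bill can rise in a single year."""
    ceiling = previous_bill * (1 + max_increase)
    return min(uncapped_bill, ceiling)

multiplier = 0.48                                   # hypothetical multiplier
bill_2016 = rates_bill(800_000, multiplier)         # final year of the 2010 list
uncapped_2017 = rates_bill(1_000_000, multiplier)   # new 2017 assessment

print(f"2016/17 bill: £{bill_2016:,.0f}")
print(f"Uncapped 2017/18 bill: £{uncapped_2017:,.0f}")
print(f"With a 10% transitional cap: £{capped_bill(bill_2016, uncapped_2017):,.0f}")
```

Because the cap is applied to the previous year's bill, a successful appeal that lowers the 2010-list figure also lowers the base from which every subsequent capped increase is calculated.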
Occupiers should however be cautious of automatically appealing against the rating assessment of a Data Centre. Historically, plant and machinery ancillary to Data Centres could have been undervalued by the Valuation Office. Unless checked fully by a specialist surveyor, a speculative appeal without the right professional knowledge and insight could result in a higher assessment and an increased liability for your business rates.
Relief for partly-occupied properties
When new or newly converted, Data Centres are typically only 50% occupied, leaving room for expansion. In these circumstances there are often grounds to apply for empty rates relief. For new properties, the rateable value is often first entered into the rating list incorrectly. There are strict rules governing this initial assessment and these are regularly ignored, resulting in incorrectly billed charges. We estimate that in over 80% of cases the adopted date of completion is premature and can be challenged.
Additionally, if a property is undergoing significant alteration or improvement there could be grounds for the business rates liability to be removed or reduced over the duration of the works.
The 2017 business rates revaluation – why act now?
With the span of the last revaluation having been increased from five to seven years, 2017 will be the most significant revaluation since 1990. It will impact considerably on many Data Centres across the country as the Valuation Office seeks to increase assessments in line with rental evidence.
The valuation date (or Antecedent Valuation Date or AVD as it is technically known) for the 2017 Rating Revaluation is 1 April 2015. This is the date around which the prevailing market values and evidence will be considered when setting the next levels of assessment. In order to collect this evidence the Valuation Office is already issuing large numbers of their ‘Request for Information’ forms. These forms are designed to establish rental values and market trends by analysing transactions approximately 12 months either side of this valuation date.
It is imperative that these forms are completed correctly as a response to these questions could have a significant impact on a 2017 rateable value and its resultant liability. A specialist chartered rating surveyor would be able to assist in completing these as part of their service.
To further complicate matters, the Government has committed to a comprehensive review of business rates and there is already the likelihood of a "two stage" appeal process. This will require much greater emphasis than ever before on providing detailed information and comparable evidence at the outset. This is particularly important for Data Centres because comparable evidence for this specific type of property is very limited and relatively scarce compared with other types of property. Despite the recession, this sector has shown considerable growth in recent years, whilst rental levels for offices and warehouses may have fallen in some areas. This has the potential to translate into increased rateable values for Data Centres from 2017. There will however be instances where certain properties have become obsolete due to their age and the speed at which Data Centres have evolved in recent years. The complexity of Data Centres and the lack of market exposure mean that a Valuation Officer can struggle to establish or accept that degree of obsolescence at these specialist properties.
Rating is a complex business at the best of times, and the further complexities of Data Centres require expert advice with experience and knowledge of this very specific property type. With the right approach, owners and occupiers will be able to take a proactive stance to ensure that they only pay the correct amount of business rates, and, with correct management and mitigation, that this remains the case in the future. Advice should be taken now to protect a company's position, minimise current liability and prepare for the forthcoming 2017 revaluation.
Phil Ray is a Director at business rates specialists Altus Edwin Hill Chartered Surveyors. Phil has over 40 years' experience in specialist valuations, having previously been employed at the Valuation Office with responsibility for Data Centres, and has been involved in a number of key data centre valuations and appeals across the country. Altus Edwin Hill has specialised in managing clients' business rates for 60 years.
Buyer Beware: Can you count on PUE?
By Jason Liggins, CTO, Ark Data Centres.
SINCE THE GREEN GRID first introduced the power usage effectiveness (PUE) method of calculating energy efficiency back in 2007, it’s become the most frequently used metric for determining how efficient data centre buildings are.
Developed to focus industry attention on the inherent inefficiencies of the mechanical and electrical (M&E) infrastructures employed within legacy data centres, the PUE metric was designed to give data centre operators a way of isolating where energy wastage occurs – and of working towards minimising or eliminating these losses.
Today PUE has become enshrined across the industry as the de-facto standard for tracking data centre energy efficiency – everyone uses PUE as the benchmark of choice, including customers who typically look to compare providers based on their PUE ratings. But in recent years the ‘hijacking’ of PUE for PR purposes has made it increasingly difficult for customers to identify, with confidence, which data centre providers top the efficiency performance league.
In part this is due to the variety of reporting approaches employed to calculate PUE. It’s an issue the Green Grid attempted to address in 2011 with the release of PUE version 2 and its subsequent incorporation into the draft ISO 30134-2 standard. However, some vendors continue to use version 1 methodologies or selective counting in a bid to promote good PUE figures for their data centres. As a result, customers need to be incredibly savvy – asking some highly pertinent questions – if they’re to establish with certainty that the numbers they’re presented with tell the whole story.
The marketing misuse of PUE – it’s a matter of timing and what you count
The PUE metric has some industry-acknowledged shortcomings, especially when it comes to accuracy of reporting. The quest for data centre energy efficiency, driven in response to the cost of energy, climate regulations and carbon tax regimes, has motivated some companies to manipulate the system in a bid to minimise their PUE rating – and acquire a marketing edge.
For instance, the design PUE of a data centre (calculated as if it were full of customer IT equipment and running at its maximum potential) can be much lower than the figure produced by 12 months of measurements taken while the facility is running at 25% of its design potential.
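To illustrate why utilisation matters, here is a minimal sketch (Python, with made-up figures for the fixed and variable overheads) showing how the same facility reports a far better PUE at design load than at 25% utilisation, because part of the M&E overhead does not scale down with the IT load.

```python
# Minimal sketch with hypothetical numbers: how the same facility reports a
# much better PUE at design load than at 25% utilisation, because part of the
# M&E overhead (no-load UPS losses, baseline cooling, lighting) is fixed.

DESIGN_IT_LOAD_KW = 1000          # hypothetical design IT load
FIXED_OVERHEAD_KW = 150           # lighting, standby plant, no-load UPS losses
VARIABLE_OVERHEAD_RATIO = 0.25    # overhead that scales with IT load

def pue(it_load_kw):
    """PUE = total facility energy / IT equipment energy."""
    overhead = FIXED_OVERHEAD_KW + VARIABLE_OVERHEAD_RATIO * it_load_kw
    return (it_load_kw + overhead) / it_load_kw

print(f"PUE at design load    : {pue(DESIGN_IT_LOAD_KW):.2f}")          # ~1.40
print(f"PUE at 25% utilisation: {pue(0.25 * DESIGN_IT_LOAD_KW):.2f}")   # ~1.85
```

The fixed 150 kW of overhead in this toy model is why a lightly loaded facility looks much worse than its design figure, and why a design PUE and a 12-month measured PUE are not directly comparable numbers.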
Another opportunity for introducing creativity into what data gets captured, and when, is highlighted by the fact that the data centre industry continues to debate the need for taking measurements all year round and averaging these to take account of extreme ‘highs’ and ‘lows’ in demand. It’s a question that’s complicated by a number of other ‘accounting’ challenges that creep in when calculating PUE.
For example, if a PUE measurement is taken at a single moment in time rather than as an overall calculation of 12 months' usage, and that single point of time is when little or nothing is turned on – say when the minimal number of UPS systems are up and running or cooling demand is at its lowest point – then an advantageous PUE can be obtained.
The problem of partial PUE
But it doesn’t end there. Another typical ploy is the use of partial PUE (pPUE) figures, a scenario in which the selective sampling of one area – say, the best performing data room on site – is promoted as being relevant for the entire facility.
A variation on this approach is the failure to measure elements of the total facility energy usage in order to achieve a lower PUE result. Examples include not counting the energy consumption of the lights, or failing to include distribution losses to external IT equipment because it’s difficult to measure or there’s a lack of instrumentation in place to capture this data. This enables a data centre to record a PUE of 1.5 when its true value should be 1.6. Other operators fall back on the creative interpretation of PUE calculation guidelines to improve their PUE rating by exploiting a lack of clarity as to whether the energy a device uses should be counted in the overall energy calculation. This is especially true if sub systems are shared with other non-data centre functions.
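The arithmetic behind the 1.5 versus 1.6 example above is simple to reproduce; the hypothetical meter readings below show how quietly dropping lighting and external distribution losses from the "total facility energy" figure shifts the result.

```python
# Hypothetical annual meter readings (kWh) for a single facility.
it_equipment  = 5_000_000
cooling       = 2_200_000
ups_losses    =   300_000
lighting      =   200_000
external_dist =   300_000   # distribution losses to external IT equipment

honest_total    = it_equipment + cooling + ups_losses + lighting + external_dist
selective_total = it_equipment + cooling + ups_losses   # lights and losses "forgotten"

print(f"PUE counting everything : {honest_total / it_equipment:.1f}")    # 1.6
print(f"PUE with exclusions     : {selective_total / it_equipment:.1f}") # 1.5
```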
Mixed use facilities add to the PUE challenge
These accuracy issues are particularly relevant when you consider that most data centres are mixed use facilities, housing a number of functions: data centres, labs and offices. This was an area specifically targeted under PUE v2, but is still where the opportunities for some truly imaginative accounting practices can creep in.
Buyer beware – check what’s in the PUE
The relentless pursuit of ever-lower PUE ratings has resulted in some unscrupulous companies ‘cooking the books’ in an attempt to make their facilities appear more efficient than they really are. And that’s a disappointment, not least because it makes it difficult for customers to identify with accuracy how data centres truly measure up to one another.
Being up front about what's counted in – and what's not – is the key to tackling efficiency and managing a data centre well to reduce costs while boosting performance. If PUE is measured consistently on a single facility, or group of facilities, it can provide valuable insight into how these perform on a year-on-year basis in terms of energy efficiency – and it is this metric that is being used as part of the Climate Change Agreements (CCA) for Data Centres.
When it comes to measuring efficiency and reporting PUE claims, it’s time for the industry to demonstrate complete integrity rather than massaging figures to boost their sales pipeline. Transparency on PUE and adhering to the spirit and intent of the Green Grid’s PUE metric, as measured for the CCA, is the only way to generate real efficiency savings that translate to lower costs and carbon emissions for customers.
Internet of Things, virtual workspaces and the new IP set to dominate business trends
By Joy Gardham, Regional Director, EMEA West at Brocade.
WEARABLE TECHNOLOGIES, drones and driverless cars are just some of the latest technological innovations that have blurred the lines between science fiction and reality in the past few years, setting the stage for yet more technological innovation over the years to come.
In an attempt to cut through the hype, we’ve predicted where we’re likely to see fundamental changes in the IT industry.
The Internet of Things
The past year has seen the Internet of Things begin to truly take shape, and this trend is rapidly gaining momentum. With Gartner predicting that 26 billion connected units – from watches to refrigerators and treadmills to cars – will be online by 2020, businesses will need to cope with an ever-growing network of connected devices generating and accessing data. Many of these connected technologies will be brought on to existing corporate or public networks, making investment in the underlying infrastructure that supports these advancements absolutely essential.
The potential for businesses to generate new revenue opportunities through IoT products and services is vast. Interoperability and open standards will be vital to making this possible, with initiatives such as the IoT Open Platforms initiative critical to the long-term success of these solutions.
The initiative has created a portal to open technologies that can be used to create Internet of Things applications but also to document their interoperability, relationships, and reference to existing use cases, infrastructures and deployments.
The virtual workspace will become a necessity
Employees are demanding greater flexibility and, with budgets set to remain tight for the foreseeable future, companies are looking for ways to meet user demands and boost productivity. Virtual working is therefore set to become increasingly mainstream as companies look to meet these conflicting pressures.
IDC has predicted that 1.3 billion people will work remotely using mobile technology by the end of 2015, equivalent to 37 percent of the entire global workforce. Virtualised workspaces offer an opportunity for employees to have the same working experience regardless of their location or device, giving them greater freedom that, in turn, improves productivity for the business.
However, this added flexibility will inevitably put greater strain on core IT infrastructure, and companies need to ensure they have the right network in place in order to deliver this level of user flexibility and freedom. The Brocade Global CIO Survey, an examination of over 200 CIOs worldwide, found that over half of the respondents spend more than 50 percent of their time working reactively, citing network downtime or availability. With CIOs already spending such a significant amount of time just keeping the lights on, it's vital that companies invest in and prepare their network infrastructure if virtual workspaces are on the horizon.
A new approach to privacy
Data privacy continues to be a hot topic and, with “privacy-first” services such as Snapchat, Whisper, and Ello growing in popularity, it seems that this trend is here to stay. With public awareness of data security and privacy increasing, organisations are now dedicating resources to hone cyber security initiatives and retain customer confidence that their private information is safe.
Data breaches or leaks are no longer simply security issues; they are potentially hugely damaging to a company’s brand and reputation, a tough lesson learned by Xbox and Sony amongst others. As a result, organisations need to continue to pay much closer attention to how and where they store sensitive commercial or customer data.
As cyber threats continue to evolve, investment in education for employees and customers will be particularly critical as access to, and use of, data becomes a fundamental issue for every business.
Openness helps SDN and NFV take big leaps forward
As technologies, SDN and NFV are still in their infancy, but it is already clear that they have the potential to radically change the way that enterprise networks and data centres are built, managed and provisioned. The primary benefits of these technologies – increased automation, greater agility and faster roll-out of new services – are hugely compelling, and it is surely only a matter of time before they become mainstream tools for service providers and enterprises. The adoption of SDN and NFV will be aided by growing momentum around open standards, as the industry recognises that in order to truly align an enterprise's infrastructure strategy with its business requirements, customers must be free to choose the solutions that best meet their specific needs, regardless of vendor. Truly open, interoperable standards are likely to rise in popularity throughout the remainder of 2015, as customers opt for greater choice and flexibility.
The rise of the New IP
All of these new technologies are a clear sign that our appetite for innovation remains unsated – but they also highlight the way in which our world is becoming increasingly dependent on the networks that underpin these technologies.
Having access to networks that are equipped to cope with fast-accelerating trends such as mobile and big data, using technologies based on open architectures and open ecosystems, will be critical to unlocking the potential benefits of these new solutions. Legacy networks, built on closed proprietary protocols and designed for non-mission critical applications, will have to adapt if they are to support these new technology trends.
We are therefore on the cusp of a new era for networking. The rise of the New IP – a new networking paradigm that is better aligned with the evolution of the rest of IT, and is increasingly open, software-driven and user-centric – is likely to be a major trend for organisations and businesses of all sizes.
We have already seen this begin in earnest, with large brands such as the ABC Television Network already coming on board: the broadcaster has confirmed that, in the autumn, it aims to distribute its programming via a hybrid IP-baseband master control and playout system. This sees traditional equipment being replaced by its virtualised equivalent, all running in the cloud.
The cloud reigns supreme over the data centre industry
By Leo Craig, General Manager of Riello UPS
CONSUMERS are becoming more ‘information hungry’ than ever - devouring cloud services, social media applications, shopping, banking and working online. As we move towards a digital Europe, opening up new opportunities for people and businesses, the role of the data centre in storing, managing and processing this data will become progressively significant. This ultimately means that today’s data centre must provide a sturdy infrastructure to handle this information and must also be flexible, agile and nimble to cope with demand.
The big data (centre) squeeze
As the reliance on the ‘cloud’ shows no signs of abating, the technologies and methods to store this data continue to evolve. For example, Facebook has set up ‘cold storage’ facilities to store users’ old photographs. With more than two billion photographs shared every single day, efficient storage of older, less accessed data is vital to keep the website running effectively. The new ‘cold storage’ facilities only take up a sixth of the power of a normal data centre and use less expensive equipment for storage.
As the modern data centre looks at new routes to reduce power, another key challenge is how to reduce floor space. In the past, future-proofing meant that data centre owners were forced to install a much bigger UPS than needed to meet capacity for the years to come. But now owners can choose the 'modular' approach, which allows for flexibility and expansion in data centres ranging in size from micro to large scale.
Rightsizing a UPS now makes it possible to invest only in the capacity required for the current load, minimising up-front costs for capital equipment and maximising efficiency. A modular approach allows the system to be scaled up or down to meet future demands.
Modular design offers the maximum in availability, scalability, reliability and serviceability whilst also offering high efficiency, low cost of ownership and a high power density. And this trend is only going to increase: according to global research body Frost & Sullivan, the market for modular UPS is expected to grow to £260m (US$400m) by 2017. The 2012 Best Practice Guide for the EU Code of Conduct on Data Centres also recommends a modular approach to data centre design.
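As a rough illustration of the rightsizing idea, the sketch below (Python, with a hypothetical 50 kW module size and made-up load figures) counts how many modules an operator might deploy, with one redundant module, as the IT load grows, instead of buying the full end-state capacity on day one.

```python
import math

# Illustrative rightsizing sketch with hypothetical figures: modules are added
# as the IT load grows, rather than installing a monolithic UPS sized for the
# day-one forecast of the final load.

MODULE_KW = 50  # hypothetical module rating

def modules_needed(load_kw, redundant_modules=1):
    """Modules required to carry the load, plus N+x redundancy."""
    return math.ceil(load_kw / MODULE_KW) + redundant_modules

for load in (120, 250, 400):  # load growth over the facility's life
    n = modules_needed(load)
    print(f"{load} kW load -> {n} x {MODULE_KW} kW modules (N+1), "
          f"{n * MODULE_KW} kW installed")
```

Keeping the installed capacity closer to the actual load also tends to keep each module operating nearer the load range where UPS efficiency is highest, which is part of the efficiency argument for the modular approach.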
UPS and energy storage
Where UPS technology has established benefits in terms of power quality, business continuity and guarantee of power supply, in the future its role as a form of energy storage will become ever more critical. The emerging technology of Lithium Ion (Li-ion) batteries has opened up new doors when used in conjunction with a UPS. Li-ion batteries give data centres the ability to store sufficient power capacity to keep the data centre running for 30 to 60 minutes or more without having to run the generator. Li-ion batteries also have far greater cyclic properties (around 10,000 cycles, compared with roughly 500 for valve-regulated lead acid (VRLA) batteries) and a faster rate of recharge, which makes them suitable for energy storage applications.
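To put the 30-to-60-minute figure in context, here is a back-of-envelope sizing sketch (Python, with a hypothetical 500 kW critical load and an assumed usable depth of discharge) showing roughly how much stored energy that autonomy implies.

```python
# Rough sizing sketch with hypothetical figures: stored energy needed to carry
# a data centre for 30-60 minutes without starting the generator.

LOAD_KW = 500          # hypothetical critical load
USABLE_FRACTION = 0.8  # assumed usable depth of discharge for the battery system

for minutes in (30, 60):
    delivered_kwh = LOAD_KW * minutes / 60
    installed_kwh = delivered_kwh / USABLE_FRACTION
    print(f"{minutes} min autonomy at {LOAD_KW} kW: "
          f"~{delivered_kwh:.0f} kWh delivered, ~{installed_kwh:.0f} kWh installed")
```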
From an environmental point of view, Li-ion can work in temperatures from 0-40°C without affecting the battery life, hence removing the need to keep the battery room at 20°C as demanded by valve-regulated lead acid (VRLA) batteries, thereby offering a saving on cooling.
It is not only data centres which are realising the benefits of Li-ion batteries - in the US, electric car giant Tesla has announced its commitment to the energy sector by unveiling a suite of new solar batteries for homes, businesses and utilities. When revealing the new Li-ion battery, Tesla stated that about two billion of the Powerpacks could ‘store enough electricity to meet the entire world’s needs.’ These are bold words but they clearly demonstrate the growing interest in using energy storage in the transition to a more sustainable world.
Our energy future
Underlying all the new trends coming to market is the ultimate need to be more efficient as tough EU targets aim for a reduction in greenhouse gases. Data centre power consumption is on the increase and likely to double by 2020. Various voluntary schemes have been set up responding directly to this increased energy consumption, such as the European Commission's 'Code of Conduct for Energy Efficiency in a Data Centre' and of course the Data Centre Alliance's own certification scheme. But before long, businesses will not have a choice when it comes to being environmentally responsible, as it will become legally binding. For example, one mandatory scheme which will take effect from December this year is the Energy Savings Opportunity Scheme (ESOS), which obligates large companies to have four-yearly energy audits.
From these, detailed reports on energy use and the efficiency of the organisation are produced. ESOS will affect companies with more than 250 employees, or companies with fewer than 250 employees but with an annual turnover exceeding €50m/£40 million. As more and more businesses sit up and take notice of energy efficiency measures, data centres will become more effective overall and provide that essential infrastructure to back up the increasing amount of data in the world.
Understanding data compliance
By Campbell Williams, Group Strategy and Marketing Director, Six Degrees Group.
FROM PCI DSS to CDE standards, the data market today is rife with myths, jargon and acronyms when it comes to compliance. This is complicated further by data protection and compliance policies involving codes of conduct for IT decision makers throughout the UK. From payments to data sovereignty, there is a rule or best practice guide for everything, meaning that finding a place to start is challenging. Every UK organisation must comply with the regulations or they could face hefty penalties and suspension of service. Non-compliance is no longer an option.
A recent survey by 6DG has unearthed the fact that almost half (43%) of IT professionals don't currently understand the compliance legislation when it comes to managing data. It's no wonder why. From the UK's Data Protection Act to individual (and varied) company privacy policies, IT professionals could get lost in a sea of paperwork. In fact, over half (52%) of the IT industry specialists surveyed said that they would rather use a third party to manage their data compliance than make sense of it themselves. The cost of non-compliance can be substantial. Demonstrating how eager it is to enforce the Cabinet Office's 'zero-tolerance' approach to non-compliance, the Information Commissioner's Office (ICO) issued a fine of £325,000 to an NHS University Hospital Trust after a serious data breach in 2012.
Data sovereignty (where the data is stored) is a key component when it comes to compliance. For some organisations it's essential that data is stored within the UK or EU, as prescribed either by law or by internal governance policies. We were pleased to see that a large majority (86%) of those questioned believe data sovereignty is a concern.
However, we were surprised to learn that in cases where an organisation outsources to Managed Services Providers (MSPs), there was often a lower level of in-house knowledge when it comes to compliance.
Rather than managing and monitoring the MSP closely, businesses are blindly assuming that their MSP is complying with the relevant regulations. A shockingly high proportion (35%) of those outsourcing to an MSP admitted to not even knowing where their data is housed. When a third of IT professionals using an MSP aren’t checking where their data is stored, how can they be sure that the solution is compliant and correct? With businesses relying on cloud providers that might be operating anywhere in the world, it’s time to start taking responsibility and make compliance and sovereignty a business priority.
Organisations need to manage vital financial information, customer details and intellectual property correctly in order to comply with the latest regulations. It is troubling that the majority of IT professionals surveyed have an insufficient understanding of how to make sure they are compliant. There’s clearly been a breakdown in communications between the ICO and the UK’s IT departments, but considering the number of rules out there perhaps it’s not surprising. Something needs to be done to help UK industries make sense of this maze of legislation.
Whilst we’re waiting for this to happen, here are my top tips for becoming data compliant:
Ask your Managed Services Provider how they deal with your data.
Keep up to date with the latest legislation and changes, trying to understand how they impact the way you do business.
Manage your MSP: keep asking what improvements they are making and how this will impact you and your data.
Always know where your data is being stored. Always.
These tips simply scrape the surface of a complicated environment. Whether you’re a customer or a provider, everyone has the responsibility to ensure they are complying with the latest regulations. After all, compliance regulations exist for a reason.
Liquids flow downhill!
By John Booth, Managing Director, Carbon3IT
THE USE OF LIQUID COOLING technologies in the ICT industry is very small, but is this about to change? First, we need to understand why liquid cooling is gaining ground in a traditionally air-cooled technology area. Rack power densities are projected to rise: a recent Emerson Network Power survey (Data Centre 2025) indicated that 70% of 800 respondents expected rack power to be at or above 40kW by 2025. High rack power drives increased power distribution losses, increased air flow and power consumption, lower server inlet air temperatures and increased chiller power consumption.
Energy costs will rise globally sooner rather than later if fossil fuels continue to be the primary source, due to scarcity of supply; and even if alternative energy solutions become more widely adopted, integrating the new plant will require new infrastructure, all of which demands massive investment globally. These costs will eventually be passed on to the consumer. Thermal management options will dictate rack power density and will have an impact on energy efficiency. Finally, users are having an impact: high performance computing has always been drawn to liquid cooling options, but non-academic uses such as bitcoin mining, pharma and big data, and social media workloads such as AI and face recognition are all driving a need for energy-efficient compute.
The Emerson Network Power survey suggested that 41% of respondents felt that in 2025 a mix of air and liquid would be prevalent, 20% ambient air (free cooling), 19% cool air, 11% liquid and 9% immersive. It is the 20% comprising liquid and immersive technologies that is of real interest. Liquid cooling can be split into four main types. The first is 'on chip': essentially liquid-cooled chipsets that use heat sinks to dissipate the heat generated, with the hot air expelled in a similar fashion to conventional air cooling.
The second is single-phase immersion, where all thermal interface materials are replaced with oil-compatible versions, fans are removed, optical connections are positioned outside of the bath and the servers are immersed in an oil bath. The third is two-phase immersion, where thermal interface materials are replaced with dielectric-compatible versions, fans are removed, rotating media (storage) are replaced with solid state drives, optical connections are located outside the bath and the servers are immersed in a dielectric solution. The final method is dry disconnect, where heat pipes transfer heat to the edge of the server, the server connects to a water-cooled wall via a dry interface and the rack itself is connected to a coolant distribution system.
PUEs of 1.1 or lower have been cited for pure liquid-cooled solutions, but these have largely been measured on single-rack installations and may not have considered the additional water pumping required when installed at scale. That said, considerable savings can be made by switching to liquid cooling: a recent Green Grid liquid cooling webinar cited figures of an 88% reduction in blower power and a 97% reduction in chiller power. Liquid cooling also has another benefit in that inlet temperatures can be higher (up to 30-40°C) and useful heat in liquid form (up to 65°C) is the output. This means that the data centre can now be connected to CHP systems or the heat sold for other purposes.
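As a back-of-envelope illustration of what those reduction figures could mean for facility overhead, the sketch below (Python) applies the quoted 88% and 97% reductions to hypothetical baseline blower and chiller loads; the pump power added for the liquid loop is purely an assumption.

```python
# Back-of-envelope sketch using the reduction figures quoted above (88% blower,
# 97% chiller) against hypothetical baseline loads, to show how the facility
# overhead, and hence PUE, shifts. Pump power for the liquid loop is assumed.

it_load_kw     = 1000
blower_kw_air  = 120   # hypothetical air-cooled fan/blower power
chiller_kw_air = 280   # hypothetical chiller power
other_kw       = 100   # UPS losses, lighting, etc. (unchanged)
pump_kw_liquid = 30    # assumed additional pumping power for the liquid loop

air_overhead    = blower_kw_air + chiller_kw_air + other_kw
liquid_overhead = (blower_kw_air * (1 - 0.88)
                   + chiller_kw_air * (1 - 0.97)
                   + other_kw + pump_kw_liquid)

print(f"Air-cooled PUE   : {(it_load_kw + air_overhead) / it_load_kw:.2f}")     # ~1.50
print(f"Liquid-cooled PUE: {(it_load_kw + liquid_overhead) / it_load_kw:.2f}")  # ~1.15
```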
This of course means that the data centre will need to become part of an ecosystem, one where waste water from an industrial process can be used for cooling and where the waste heat from the data centre can be used for another purpose.
So what is the outlook for liquid cooling in the data centre arena? Clearly, adoption is on the up: in the Emerson survey, many of the respondents expected to see a lot more liquid-cooled solutions in the space come 2025, which after all is only 10 years away. But, and it is a big but, a lot of investment has been made globally in the construction of air-cooled data centres and these are not well suited to the wide-scale adoption of liquid cooling technologies. Add the "ecosystem" element, which will no doubt cause some problems for managers and designers, and you have another set of reasons not to adopt.
It is difficult to see past the clout of the vendors with air-cooled solutions, who ultimately determine what technology is installed, unless they either develop their own liquid cooling technology or buy one of the current players. Many of the liquid-cooled solutions are the domain of smaller, dare I say, niche players, and there may be difficulties in a rapid adoption of liquid cooling through a lack of equipment, skills and infrastructure. This of course will be negated if a big giant comes calling. That said, liquid cooling in my mind is a disruptive technology and we await developments with interest.