Liquid cooling vs air cooling | Air cooling will never go away

 

Air cooling will never go away

 

Peter Judge

 

Translator's note

There are still a large number of low-density server rooms (around 4kW per rack) in operation, and they are cooled almost entirely with air. Converting all of them to liquid cooling would be enormously expensive and would waste a great deal of existing equipment. At the same time, demand for high-density racks keeps growing, air cooling really is reaching its limit, efficient cooling solutions are urgently needed, and liquid cooling is a natural fit.

Liquid cooling will emerge and develop alongside high-density racks. Weighing cost, time, operations, and return on investment, cold-plate liquid cooling will be deployed quickly, while immersion cooling will need more time before it reaches large-scale adoption.

Overall, every technology has strengths and weaknesses, and no single one can meet the combined demands of efficiency, economy, and practicality on its own. A sensible mix of technologies for each specific scenario is more realistic, and air cooling and liquid cooling will continue to coexist for some time to come.

 

 

We need liquid cooling, but it won’t replace air

  

Liquid cooling is supposed to be driving a revolution in heat management within data centers. The old way, air cooling, is on the way out, we are told. It must go, to make way for a world where servers are cooled with water, dielectrics, and other fluids. In real life, revolutions are rarely so neat and tidy.

 

There is no doubt that the densities of servers in racks are reaching the point where some of them can no longer be cooled efficiently with air. And liquid cooling has a vast set of benefits, including increased efficiency, improved exclusion of dust and dirt, and quieter operation - and it delivers waste heat in a form where it can be used elsewhere.

 

But still, air cooling vendors have a backlog of orders that show no sign of diminishing, and new data centers are still being designed around chillers, HVACs, and other air-cooled equipment.

 

How do we explain this? And how will today’s air cooled environments coexist with tomorrow’s liquid cooled systems?

 

Palette of cooling

 

The story that air will give way to liquid cooling is wrong on two counts, says specialist cooling consultant Rolf Brink, the Open Compute Project lead for liquid cooling: “Air cooling will never disappear. And it is also incorrect to say they've always been air cooled. It's not a battle about which technology will be left at the end of the road.

 

“You have to look at the IT equipment and see what it needs,” says Brink. “IT equipment has various requirements for cooling, and this is where the palette of cooling technologies that you should be considering is greatly enriched these days.

 

“Cold-plate is becoming mainstream this year or next,” says Brink. “Immersion is going to take a few more years before it becomes mainstream. But not all IT equipment is suitable for immersion or cold plate or air cooling alone.

 

“That is the big paradigm shift,” he says. “We're going to see more hybrid environments where the underlying infrastructure and facilities can cater to both air and liquid cooling. And that is what the industry needs to get prepared for.”

 

“We're in this transition phase, where we see both extended demand for air cooling, and a lot of newer liquid cooling requirements coming in,” says Stuart Lawrence, VP of product innovation and sustainability at Stream Data Centers. “So we find configurability is the most important thing right now.”

 

As a data center operator, Lawrence has to deal with what his customers - mostly large players taking a whole building at a time - are ready for: “We're seeing some customers playing around with some liquid cooling direct to chip, either single phase fluids, or phase changing fluids or cold plates. We aren't seeing a lot of immersion.”

 

 


 

The air perspective

 

Air-conditioning vendors admit that things must change. “At some point, air cooling has its limitations,” says Mukul Anand, global director of business development for applied HVAC products at Johnson Controls.

 

“There's only so much heat you can remove using air.” As he explains, it takes a lot of air to cool a high-energy chip: “The velocity of air becomes very high, noise in the white space becomes a challenge and the server fan power consumption increases - which does not show itself in the PUE calculation.” He sees direct-to-chip, immersion, and two-phase cooling growing, and notes that air-cooled systems often have a water circuit, as well as using water in evaporative systems. Data centers are trying to minimize water consumption while switching off compressors when possible, and water cooling inside the white space can make their job easier.
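A rough sensible-heat calculation makes Anand's point concrete. The sketch below is only illustrative - the rack powers and the 12K air temperature rise are assumptions, not figures from the article - and estimates the airflow an air-cooled rack needs from the relation Q = ρ·V·cp·ΔT.

```python
# Illustrative airflow estimate for an air-cooled rack (assumed numbers, not from the article).
RHO_AIR = 1.2   # kg/m^3, approximate air density at room temperature
CP_AIR = 1.006  # kJ/(kg*K), specific heat of air

def airflow_m3s(rack_kw: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to carry away rack_kw of heat
    at a given supply-to-return temperature rise (Q = rho * V * cp * dT)."""
    return rack_kw / (RHO_AIR * CP_AIR * delta_t_k)

for rack_kw in (10, 20, 40):                     # hypothetical rack densities
    flow = airflow_m3s(rack_kw, delta_t_k=12.0)  # assume a 12 K air temperature rise
    print(f"{rack_kw:>2} kW rack: {flow:.2f} m^3/s (~{flow * 2119:.0f} CFM)")  # 1 m^3/s ~= 2119 CFM
```

Holding the temperature rise constant, a 40kW rack needs four times the airflow of a 10kW rack; pushing that much air through the same rack face is what drives up velocity, noise, and the server fan power that, as Anand notes, never shows up in the PUE calculation.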

 

“We've seen a distinct shift of municipalities and communities away from using water for data center cooling,” says Anand. “A shift from direct evaporative cooling technologies towards either air cooled chillers or water cooled chillers and dry coolers.” As liquid cooling comes inside the white space, he says: “We have to make sure we completely understand the fluids that will be used (water, glycol, etc.) and make sure that we converge on an agreed liquid cooled server technology, and use economization as much as possible.”

 

“One of the direct consequences is to use the chilled fluid temperature as high as the IT equipment will allow. 30°C (86°F) is being looked at as a median number. That is certainly higher than the chilled water fluid used in data center air cooling systems today.” Air cooling systems will have to adapt, he says: “We must launch and use products that are as comfortable and efficient providing chilled fluid at 30°C.”

 

With that temperature in their cooling systems, data centers can spend more time on free cooling using outside air. “That allows for a whole lot of hours in the free cooling method where the compressors do not consume any significant amount of power. In places like Loudoun County, Virginia, and in Silicon Valley, we're using as much economization as possible.”

 

In this world, 10 percent of the racks in a data center can move to liquid. “You have a cooling architecture that can cool 90 percent air cooled servers, and gradually convert this data center to more and more liquid cooled.”

 

In a best-case scenario, many of the liquid cooling scenarios defined by ASHRAE rarely need chillers and mechanical cooling, and those chillers will become a backup system for emergencies, says Anand: “It is for those warm afternoons. You must have a generator for the times when you don't have power. You must have chillers for the few hours in a year that you cannot get economization to do the cooling job for you.”

 

Those chillers could still be challenged, he says, because as well as running denser white space, “owners and operators are leaning towards multi-story data centers.” These chillers will need to be built with a greater concern for the carbon embodied, both physically and in their supply chain: “If you're using less metal and lighter pieces of equipment, the carbon generated through the fabrication and shipping processes is lower.” Chillers are placed on the roof of the building, and this means they are packed together tighter, causing a “heat island” problem: “When condensing units or chillers with condensers are spread far apart on a roof, one is not influenced by the other.

 

“When you have 32 or 64 chillers close together on a rooftop space, the discharge air from one goes into the condenser of the next one, adversely impacting its efficiency and capacity.”

 

 

 

Extending air cooling

 

Back inside the white space, Lawrence sees a lot of liquid cooling implementations as simply extending the air cooling provided in the building: “It's direct liquid to chip, but the liquid goes to a rear door heat exchanger or a sidecar heat exchanger.”

 

Precision cooling from companies like Iceotope, where servers remain in regular racks and liquid gets to the specific parts which need cooling, is a mid-point between direct-to-chip or cold plate and the more extreme idea of total immersion in tanks sold by the likes of GRC and Asperitas.

Translator's note: Green Revolution Cooling (GRC) supplies immersion liquid cooling solutions for data centers, typically submerging servers in a non-conductive fluid to cool the hardware and improve energy efficiency. Asperitas B.V. focuses on developing immersion cooling technology to improve data center energy efficiency.

 

Direct-to-chip and precision liquid cooling products can be installed in an air cooled environment, says Lawrence: “They reject heat by means of an air-to-liquid heat exchange system within an air cooled data center.”

 

That may be disappointing to liquid cooling revolutionaries, but there’s a reason, says Lawrence: “Most colocation facilities aren't really ready to go direct liquid.”

 

He sees liquid cooling as additive, where it is required: “I think we will get this extension of air cooling where they will take 10kW racks and make four rack positions into 40kW racks.” Those high-density racks have an extra heat exchanger or “sidecar.”

 

“In the last 10 years, the majority of the products that I've deployed are air cooled with an internal liquid cooling loop,” says Dustin Demetriou, IBM Systems leader for sustainability and data center innovation. “As far back as 2016 we were doing this in a financial services company because they had basically DX chiller systems with no chilled water, but they needed a high power rack.”

 

“The great part about direct-to-chip liquid cooling is that it uses the same IT architecture and the same rack form factor as air-cooled servers,” says Anand. “The cooling distribution units can be in the white space, or sometimes in the racks themselves. Using this technology, transitioning at least a portion of the data center for the intense compute load can be done relatively quickly.”

 

When things move to immersion cooling tanks, there may be a division. Expelling the heat from an immersion tank into an air-cooled system might require the compressors to be turned on, or changes to the immersion system, says Anand.

 

He explains: “The power that's consumed by the servers in the immersion tub gets converted to heat and that heat has to be removed. In a bath, we can probably remove that heat using warmer temperature fluid. And the lower temperatures that mandate the operation of a compressor are probably not needed.”

 

 

Losing the benefit

 

There’s one obvious downside to this hybrid approach. One of the most vaunted benefits of liquid cooling is the provision of waste heat in the concentrated form of higher-temperature water.

 

If the heat gets rejected to the air-cooling system, then it is lost, just as before. Running the liquid bath at this lower temperature removes the benefit of useful waste heat. It’s like a re-run of the bad practice of overcooled air-conditioned data centers.

 

“The sad part about it from a sustainability perspective is you are not raising any temperatures,” says Lawrence. “So we're not getting the real sustainability benefits out of liquid cooling by utilizing this air extension technology.” Demetriou points out that there are still sustainability benefits: “If you look at it in terms of performance per watt, a product with 5GHz chips, if it was strictly air cooled, would have probably given half the performance. So you would need fewer servers to do the work. You're not getting all of the benefits of liquid but I think you're getting a lot.”
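As a toy illustration of that performance-per-watt argument (all numbers assumed, none from the article): if a strictly air-cooled build delivers roughly half the per-server throughput of the liquid-cooled one, the same workload needs about twice the servers.

```python
import math

# Toy server-count comparison for a fixed workload (assumed figures, purely illustrative).
WORKLOAD = 1000.0                                  # arbitrary units of work to deliver
liquid_perf_per_server = 10.0                      # assumed throughput of a liquid-cooled server
air_perf_per_server = liquid_perf_per_server / 2   # "probably ... half the performance" when air cooled

def servers_needed(perf_per_server: float) -> int:
    return math.ceil(WORKLOAD / perf_per_server)

print("liquid cooled:", servers_needed(liquid_perf_per_server), "servers")  # 100
print("air cooled:   ", servers_needed(air_perf_per_server), "servers")     # 200
```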

 

Demetriou also sits on the ASHRAE 9.9 technical committee, a key developer of cooling guidelines and standards: “This is an area we spend a lot of time on, because it's not all liquid or all air. There are intermediate steps.”

 

Funneling

 

Another reason that all-liquid data centers are complex to imagine is the issue of “funneling,” getting enough power into the racks, says Lawrence.

 

“If I take a 40MW, 400,000 sq ft data center, made up of 25,000 sq ft data halls, I can get all my electrical lineups to deliver 2.6MW to each data hall. If I start doubling the density to make that 400,000 sq ft data center 200,000 sq ft or 100,000 sq ft, then I have a really big challenge.
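Lawrence's “funneling” arithmetic is easy to reproduce. The sketch below uses the figures he quotes and assumes the data-hall size stays fixed at 25,000 sq ft: shrinking the same 40MW footprint doubles both the megawatts each hall must accept and the watts per square foot at every step, which is what makes routing the electrical lineups hard.

```python
# Power-funneling arithmetic from Lawrence's example (hall size held constant as an assumption).
TOTAL_MW = 40
HALL_SQFT = 25_000

for building_sqft in (400_000, 200_000, 100_000):
    halls = building_sqft // HALL_SQFT                # data halls that fit in the building
    mw_per_hall = TOTAL_MW / halls                    # power each hall must accept
    w_per_sqft = TOTAL_MW * 1_000_000 / building_sqft # overall power density
    print(f"{building_sqft:>7,} sq ft: {halls:2d} halls, "
          f"{mw_per_hall:4.1f} MW per hall, {w_per_sqft:3.0f} W/sq ft")
```

At 400,000 sq ft this works out to 16 halls at 2.5MW each, close to the 2.6MW Lawrence cites; at 100,000 sq ft each hall would have to accept 10MW.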

 

“I have to make that building really long and thin to actually get all the electrical lineups to funnel correctly. If I make it small and square I end up having really big problems getting the actual electrical power into the space. The funneling becomes too much of a challenge.

 

“Not a lot of people are talking about that right now, but I think it's going to be a pretty big problem. The challenge with liquid cooling is to design the facility in such a way that you don't run into funneling issues to get the power into the space.” Placing small quantities of high density racks within an air-cooled facility actually avoids this problem, he says: “If you're working with an air cooled space, you've got a lot of space to route your power around.

 

“When you make the building appropriately sized for liquid cooling, you run into all sorts of electrical funneling issues that you hadn't had to even think about before.”

 

Equipment lifecycles

 

One major reason why air-cooled systems will remain is because they are very rugged and enduring pieces of equipment. A chiller system placed on the roof of a data center is expected to last for 20 to 25 years, a period that could see four different generations of chip hardware, all with different cooling needs.

 

Johnson’s Anand says this is possible: “If your HVAC architecture is designed to provide cooling required by liquid cooled servers, we will not have to change the cooling architecture through the life of the data center.

 

“The time period from when a data center is designed in one part of the world to when it is built and brought online in another part of the world might be several years,” he says.

 

“We do not want to wait for liquid cooling technology to be adopted all across the world for the next architectural design of the building to materialize it in construction.”

 

It’s not just the equipment, it’s the building, says Lawrence: “Hyperscalers are signing leases for 15 and 20 years, and we are seeing IT refreshes in the four to five year range. That boggles my mind. If you're signing a lease today, it’s going to last three IT refreshes. That IT equipment that you're putting in is either going to be air cooled for that 15 year period, or you're going to have some form of liquid-to-air system in the rack or in the white space.”

 

Server makers like Dell and HP are producing liquid cooled versions of their hardware, and are predicting that in 10 years' time data centers will be 50 percent liquid cooled. Not every application has such high demands for cooling, and this means that half the servers can still be air cooled.

 

It can also get complicated because of demarcation. If the building owner provides overall cooling with air, and tenants want to add liquid cooling, Lawrence explains: “It gets complicated if you bring liquid straight to a CDU (cooling distribution unit) on rack or an in-row liquid cooler.”

 

Forcing the issue

 

Rolf Brink thinks that it may take education, and even regulation, to push data center designs more quickly to liquid: “It still happens too often that new facilities are not yet designed for the future ecosystem. This is one of the core problems in the industry. And this is where regulation can really be beneficial - to require facilities to at least be prepared for liquid infrastructures in the white space.”

 

Brink says: “As soon as the data center is built and becomes operational, you're never going to rebuild the whitespace. You are not going to put water pipes into the whitespace in an operational environment. It is just impossible.”

 

Because liquid is not included in the design phase, this creates “resistance” from the industry to adding it later, he says: “People have neglected to make the necessary investments to make sure that they are future-proofed.”

 

This may be due to the way that facilities are financed and refinanced at various times during the build phase, he says, “or it may be lack of ambition or not believing in the evolution of liquid cooling.”

 

The problem is that it's creating an environment in which it's still going to be very difficult to become more sustainable. Data centers won’t take a risk and spend a bit more “just in case,” says Brink.

 

Some of this can be changed by education. ASHRAE has brought out papers describing different stages of using liquid cooling (see Box), and OCP has also done educational work, but in the end he says “legislation can really make a significant difference in the industry by requiring the preparation for liquid.”

Translator's note: OCP is the Open Compute Project, an open-source hardware initiative for sharing innovation and best practices in data center and server design. Its goal is to improve the efficiency and flexibility of data center hardware through open standards and designs, and it was founded by large technology companies including Facebook, Microsoft, and Intel.

 

At this stage, there’s no prospect of a law to require new data centers to include pipes in the white space, although the German Energy Efficiency Act does attempt to encourage more waste heat reuse.

 

Early in its development, the Act tried to mandate that 30 percent of the heat from new data centers should be reused elsewhere. This was pushed back, because Germany doesn’t have sufficient district heating systems in the right place to make use of that heat.

 

But the requirement to at least consider waste heat reuse could mean that more data centers in Germany are built with heat outlets, and it is a logical step to connect those up with more efficient heat collection systems inside the white space.

 

Across Europe, the Energy Efficiency Directive will require data centers to report data on their energy consumption and efficiency in 2024, and the European Union will consider what efficiency measures are reasonable to impose in 2025.

 

Whatever intervention is imposed could have a big impact on the hand-over between air and liquid cooling.

 

 
 
 
 

 

Translated by: Wang Lifeng, member of the DKV (DeepKnowledge Volunteer) program

Proofread by: Jia Menglin, HVAC engineer at Alibaba Cloud, DKV (DeepKnowledge Volunteer) elite member

 


 
