ChatGPT Is Just The Start: AI Is Rewriting The Data Center Real Estate Map
Artificial intelligence is finally having its moment, and it is dramatically transforming the data center real estate landscape.
January 26, 2023 Dan Rabb, Bisnow Data Centers Reporter
With ChatGPT the talk of the town, the heavy computing power behind artificial intelligence has also become a key focus of innovation. The era of high-performance computing means that data center infrastructure is headed for major change.
On February 8, Beijing time, Microsoft announced the latest versions of its Bing search engine and Edge browser, powered by ChatGPT. Microsoft's CEO declared that "a new era has arrived for search." Microsoft's stock rose 4.2%, adding roughly $80B (about RMB 540B) in market value.
ChatGPT — the web-based AI interface that responds to user questions with often strikingly human written responses — has captured the public imagination since its release by OpenAI in November. Indeed, access to a technology that can instantly create both real estate marketing emails and a biblical verse explaining how to remove a peanut butter sandwich from a VCR seems like a massive leap toward a future envisioned in science fiction novels.
While “generative” AI like ChatGPT and OpenAI’s image-creating DALL-E may be little more than high-tech toys in their current forms, they are the leading edge of a wave of AI-centered digital transformation that data center industry leaders say is already well underway. The widespread adoption of AI may be the most significant force reshaping the digital infrastructure map in the months and years ahead, transforming both where and how data centers are built.
“You look back, the last 20 or 30 years, data centers were built out around the needs of the video streaming applications driving demand, that’s what all the infrastructure was built for,” said Wes Cummins, CEO of specialty data center provider Applied Digital. “The apps driving demand going forward are going to be around artificial intelligence. This kind of compute is going to be where the demand will be for the next 10 or 20 years.”
The world’s largest tech companies are betting their digital infrastructure dollars on an AI-driven future. Earlier this month, Microsoft announced a $10B investment in OpenAI for the continued evolution and expansion of ChatGPT, on top of a previous $1B investment in the company. Google has also made big investments in AI development, while Meta last month paused data center build-outs around the world in order to redesign them to be optimized for AI applications.
Industry insiders say the past few months have been a key inflection point for AI innovation, with the adoption curve for these technologies poised to get much steeper in the near future. Developments like the addition of OpenAI’s tools and services to Microsoft’s Azure platform are lowering cost barriers to creating new AI products. Experts say this is driving a wave of innovation and creating demand for the massive amounts of computing power needed to build and operate these technologies.
“You’re going to see a lot of applications, and from a venture standpoint you’re already seeing this new landscape evolve in terms of these more generative-type technologies,” said Ari Kamlani, senior AI technology strategist and systems architect at Beyond Limits, a developer of commercial AI applications. “There’s an explosion of the generative landscape, as well as in how people interact with these technologies.”
More AI means a growing need for computing power, and that means more data centers. But the equipment and infrastructure required for most AI applications differs significantly from the servers most data center providers have been hosting for decades.
In general, AI requires faster, more powerful processors than those found in most traditional data center servers. Indeed, it has been the development of these high-performance chips that has largely enabled this wave of AI innovation. But while these so-called GPU processors are better at performing the required calculations quickly, they use much more power and create significantly more heat than the equipment for which most data centers are designed.
As a result, many providers are already changing their facility designs to get out in front of the growing wave of demand for AI computing capacity. According to Flexential Chief Innovation Officer Jason Carolan, this means redesigning everything from backup power systems and generators to liquid-based cooling systems, and changing building locations to be closer to substations. “It’s starting to really change design requirements for the next-generation data center,” Carolan said.
AI is not only changing how data centers are built, it’s also changing where they’re built. Among the data center experts who spoke with Bisnow, there was a consensus: widespread adoption of AI will accelerate the decentralization of the data center landscape, driving development in new markets and away from the industry’s traditional hubs like Silicon Valley and Northern Virginia.
One significant factor is the massive power requirements for AI computing. Rising energy costs and shortages of power in the industry’s traditional hubs are already driving data centers into new territories, with developers increasingly prioritizing locations where they have access to cheap and ideally renewable electricity. AI may well accelerate this trend as its adoption increases how much power each facility needs.
At the same time, experts say driving down the price of compute for AI through lower power costs is imperative if AI applications are going to be financially viable. At present, the enormous cost to access needed computing power through the cloud or from a data center directly is broadly prohibitive. ChatGPT operates at a significant loss, spending over $100K daily on compute alone.
According to multiple sources, Microsoft’s Azure cloud hosts ChatGPT so that OpenAI does not have to invest in a physical server room. At Microsoft’s current rates, a single A100 GPU costs $3 per hour to run, and each word generated on ChatGPT costs $0.0003. At least eight GPUs are used to serve a single ChatGPT instance. So when ChatGPT generates an average response of 30 words, it costs the company nearly 1 cent. By this estimate, OpenAI could be spending at least $100K per day, or $3M per month, on running costs.
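The per-response cost math above can be reproduced as a quick back-of-envelope calculation. This is a minimal sketch using only the figures quoted in the article; the per-word rate and daily spend are rough third-party estimates, not official OpenAI or Microsoft numbers.

```python
# Back-of-envelope ChatGPT compute-cost estimate, using the article's
# quoted figures. All values are rough public estimates, not official numbers.

COST_PER_WORD = 0.0003       # USD per generated word (estimated from Azure A100 rates)
WORDS_PER_RESPONSE = 30      # average response length cited in the article

cost_per_response = COST_PER_WORD * WORDS_PER_RESPONSE
print(f"Cost per response: ${cost_per_response:.3f}")  # ~$0.009, i.e. almost 1 cent

# The article's "$100K per day" figure then implies a rough daily volume:
DAILY_COMPUTE_SPEND = 100_000  # USD spent on compute per day
responses_per_day = DAILY_COMPUTE_SPEND / cost_per_response
print(f"Implied responses per day: {responses_per_day:,.0f}")  # ~11 million
```

At roughly a cent per response, the economics only work at enormous scale, which is why the experts quoted here treat cheap power as a precondition for financially viable AI services.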
But AI’s transformation of the data center landscape is more complicated than just the quest for cheap power. Experts say the architecture of AI systems will create two very different sources of demand for data center space.
An image created by Bisnow using OpenAI's DALL-E with the prompt "Data Center On Plains Of North Dakota."
The interconnected computing systems making up an AI application can generally be divided into two parts with different infrastructure requirements — effectively two hemispheres of a single brain. In one part, a massive amount of computing power is used for what is called “training”: giving an AI model access to a massive amount of information — in the case of ChatGPT, the whole internet — from which it can develop a decision-making framework that can be applied elsewhere.
Once this decision-making framework has been created, it can be run on a different set of infrastructure for users to interact with in real time. This latter stage, where the AI is actually applied, is known as “inference.” “You’re taking 300 billion data points, you crunch them, and you train a model that then makes decisions based on all the data points, and then you constantly refine that model,” Applied Digital’s Cummins said. “You train the model in one environment, and then you can upload that model into a real-time operating environment for the decision-making.”
The need for data center space to host both of these computing environments is reshaping the data center map, experts say, but in very different ways. AI training — requiring large amounts of computational power but little need for lightning-fast connectivity with end users or other facilities — will drive demand in markets with low power costs and renewable energy, even if they lack the kind of fiber network connections a traditional data center would require.
Applied Digital specializes in exactly this kind of facility, with its main AI-focused facility under development in North Dakota. The primary consideration, Applied's Cummins said, is providing the lowest possible cost of compute. While this primarily means cheap power, costs can also be kept down by building facilities without the redundancies needed in traditional mission critical data centers, and by locating them in cold locations where less power has to be used on cooling.
“Most data centers are kind of Swiss Army knives — they were built for everything, whether you're running video, Microsoft apps for your business or whatever it might be,” Cummins said. “We're building it specifically for these use cases, so it makes it much cheaper.”
While AI training is driving demand for data centers in far-flung markets, the need for computing power to deploy these technologies for end users is helping drive data center demand in close proximity to major population and business centers at what the data center industry calls “the edge.”
Flexential’s Carolan points to AI use cases like self-driving cars or a customer service call center that uses language processing AI in real time. That kind of application requires a ton of computing for inference, and it needs to happen almost instantly. Achieving that kind of latency requires placing that computing capacity — and the data centers it lives in — near the end user. It also means building out the dense fiber networks needed to make sure data can be moved quickly.
Carolan said hyperscalers are already making significant investments in both edge AI computing infrastructure and the networks needed to support it. He said the creation of these fiber access points means other data center operators will follow close behind, as AI leads to the creation of new mini digital infrastructure hubs away from traditional locations like Northern Virginia.
“It’s growing the data center business not only because the hyperscalers want to be in new regions, but that’s also causing other data centers to be close to the hyperscalers for the connectivity,” Carolan said. “There’s this growing architecture of edge to medium to large processing centers where they have a tendency to attract other data centers around them, because people want to be close to those facilities, largely because they want a super-fast pipe between their infrastructure in the cloud to be able to move processing around.”
Mega-cap tech companies have been declaring their plans for AI investment in recent years. Particularly notable is Meta’s capital expenditure guidance of more than $35B for 2023, largely driven by further investment in AI/machine learning and higher-end GPUs to provide more analytics and computing power for its algorithms.
Driven by a weaker economic backdrop and revenue outlook over the next three months, we have witnessed the birth of cost consciousness within large-cap tech companies. Despite headcount reductions at some of these companies, we believe spending will continue to be directed toward AI/ML applications viewed as having higher monetization opportunities, with ChatGPT adding fuel to an AI arms race among hyperscalers.
Translator:
Seaman
Elite member of the DKV (DeepKnowledge Volunteer) program
Publisher's note:
This article is not an officially endorsed Chinese translation and is provided for readers' reference only. It may not be used for any commercial purpose; the original English text is authoritative, and this article does not represent the views of DeepKnowledge. Please do not repost the Chinese version without written authorization from the DeepKnowledge public account.