Ask HN: How expensive is it, really, to query LLMs?

Posted by teach, 6 days ago
I'm starting to see things pop up from well-meaning people worried about the environmental cost of large language models. Just yesterday I saw a meme on social media claiming that "ChatGPT uses 1-3 bottles of water for cooling for every query you put into it."

This seems unlikely to me, but what is the truth?

I understand that _training_ an LLM is very, very expensive. (Although so is spinning up a fab for a new CPU.) But it seems to me the incremental cost of querying a model should be relatively low.

I'd love to see your back-of-the-envelope calculations for how much water, and especially how much electricity, it takes to answer a single query from, say, ChatGPT, Claude-3.7-Sonnet or Gemini Flash. Bonus points if you compare it to watching five minutes of a YouTube video or doing a Google search.

Links to sources would also be appreciated.
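To make it concrete, here's the shape of the estimate I'm hoping for: a minimal sketch where every input (GPU power draw, number of GPUs, generation time, concurrent requests, PUE, WUE) is a made-up placeholder I picked for illustration, not a sourced figure. I'd love to see the same structure filled in with real numbers.

    # Rough per-query estimate: all inputs below are assumptions, not measurements.
    gpu_power_kw = 0.7        # assumed draw of one H100-class GPU under load, kW
    gpus_serving = 8          # assumed GPUs in the serving node
    seconds_per_query = 2.0   # assumed wall-clock time to generate one response
    queries_in_flight = 32    # assumed concurrent requests sharing the node (batching)
    pue = 1.2                 # assumed datacenter Power Usage Effectiveness overhead
    wue_l_per_kwh = 0.5       # assumed Water Usage Effectiveness, litres per kWh

    # Energy: node power * time, amortized over concurrent queries, scaled by PUE.
    node_kw = gpu_power_kw * gpus_serving
    energy_kwh = node_kw * (seconds_per_query / 3600) / queries_in_flight * pue

    # Water: facility cooling water per kWh of IT energy; 500 mL = one "bottle".
    water_l = energy_kwh * wue_l_per_kwh
    bottles = water_l / 0.5

    print(f"Energy per query: {energy_kwh * 1000:.2f} Wh")
    print(f"Water per query : {water_l * 1000:.2f} mL (~{bottles:.4f} bottles)")

With those placeholder inputs it works out to roughly a tenth of a watt-hour and a tiny fraction of a bottle per query, but the whole point is that the answer swings by orders of magnitude depending on the real values, which is why I'm asking for sourced numbers.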