Why are we building superintelligence?

2 · Author: Taikhoom10 · 22 days ago · Original post
Hey HN, tomorrow, or maybe this weekend, GPT-5 releases. I remember back in 2022 when ChatGPT, based on GPT-3, was released. The world went berserk over an LLM, and three years later the world is obsessed with AI. I cannot tell you whether this is a good thing.

Before I move on: I am all for the advancement of technology for the betterment of humanity. Who isn't? And personally, I am not at all worried about "AI" taking jobs. New jobs will be created, I am sure of it.

What I am worried about is superintelligence. You might think I am stupid or silly for worrying, but if I understand the concept correctly, it is in essence a thinking computer that can make decisions on its own and is smarter than all of humanity. That may sound a bit sci-fi, so here is the technical description: an AI that surpasses human intelligence in all domains.

Now, one might point out that superintelligence is not even here yet, so why worry about something that doesn't exist? While that is true, I find it hard to believe that, at the pace AI is going, people do not expect superintelligence to be created sometime in the next decade or so, maybe longer.

My problem with this is: why are we purposely building something that surpasses us? If the answer is China, I just do not think that is good enough. And what is a human's role at that point? Until now we have been by far the dominant species; why create something more intelligent that could very well end humanity as we know it? Why take that risk? Why?

I know that in the end I have no control over this, but I just want to know why anyone should pursue creating superintelligence.