Show HN: Execute local LLM prompts in remote SSH shell sessions
Hi HN,

This is a tool I've worked on for the past few months.

Instead of giving LLM tools SSH access or installing them on a server, the following command:
```
$ promptctl ssh user@server
```
makes a set of locally defined prompts "magically" appear within the remote shell as executable command-line programs.

For example, I have locally defined prompts for `llm-analyze-config` and `askai`. Then on (any) remote host I can:
```
$ promptctl ssh user@host
# Now on the remote host
$ llm-analyze-config /etc/nginx.conf
$ cat docker-compose.yml | askai "add a load balancer"
```
The prompts behind `llm-analyze-config` and `askai` execute on my local machine (even though they're invoked remotely), via the LLM of my choosing.

This way, LLM tools are never granted SSH access to the server, and nothing needs to be installed on the server. In fact, the server does not even need outbound internet access.
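For readers curious how a remotely invoked command can execute locally, the forwarding pattern can be sketched in plain shell. This is a hypothetical illustration, not promptcmd's actual implementation: a pair of FIFOs stands in for the SSH back-channel, and `tr` stands in for the LLM call.

```shell
#!/bin/sh
# Sketch of the forwarding idea (hypothetical): the "remote" command only
# writes its input to a channel and reads the reply; the prompt itself
# runs on the local side. Two FIFOs stand in for the SSH connection.

dir=$(mktemp -d)
mkfifo "$dir/req" "$dir/resp"

# Local side: serve one request through the "LLM" (here just upper-casing).
( tr 'a-z' 'A-Z' < "$dir/req" > "$dir/resp" ) &

# Remote side: what an injected command like `askai` would do --
# forward its stdin over the channel, then print the reply.
askai() { cat > "$dir/req"; cat "$dir/resp"; }

reply=$(printf '%s\n' "add a load balancer" | askai)
echo "$reply"

rm -rf "$dir"
```

In the real tool the channel would ride on the existing SSH connection, so the server needs no extra ports, agents, or outbound access.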
Eager to get feedback!
GitHub: [https://github.com/tgalal/promptcmd/](https://github.com/tgalal/promptcmd/)