Experimental config-driven Rust LLM client library
I couldn't find any good libraries that let me create a runtime-configurable LLM client for use in agents, so I built this one. It is configurable at runtime and can discover the available models and services.
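To make the idea concrete, here is a minimal, hypothetical sketch of what "runtime configuration plus model discovery" can look like. The names used below (`ClientConfig`, `LlmClient`, `discover_models`) are illustrative only and are not taken from the actual crate, and the discovery call is stubbed rather than making a real HTTP request.

```rust
// Hypothetical sketch: all names here are illustrative, not the crate's real API.
use std::collections::HashMap;

/// Configuration supplied at runtime, e.g. parsed from a config file or env vars.
#[derive(Debug, Clone)]
struct ClientConfig {
    base_url: String,
    api_key: Option<String>,
    default_model: Option<String>,
    extra: HashMap<String, String>,
}

/// A client constructed from configuration at runtime rather than compile time.
struct LlmClient {
    config: ClientConfig,
}

impl LlmClient {
    fn from_config(config: ClientConfig) -> Self {
        Self { config }
    }

    /// Ask the configured service which models it exposes.
    /// A real implementation would call an endpoint such as GET /v1/models;
    /// here it is stubbed with a placeholder list.
    fn discover_models(&self) -> Vec<String> {
        let _ = &self.config.base_url;
        vec!["example-model-a".to_string(), "example-model-b".to_string()]
    }
}

fn main() {
    let config = ClientConfig {
        base_url: "http://localhost:11434".to_string(),
        api_key: None,
        default_model: None,
        extra: HashMap::new(),
    };
    let client = LlmClient::from_config(config);
    for model in client.discover_models() {
        println!("available model: {model}");
    }
}
```

The point of this shape is that an agent can be pointed at a different provider, model, or endpoint purely through configuration, without recompiling.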