Show HN: Omni-NLI – A multi-interface natural language inference server

Author: habedi, about 18 hours ago
Hi everyone,

I've made an open-source tool (called Omni-NLI) for natural language inference. It can use different models to check whether a piece of text (called a premise) supports another piece of text (a hypothesis). The main application of a tool like this is soft fact-checking and consistency checking between pieces of text, such as sentences.

Currently, Omni-NLI has the following features:

- Can be installed as a Python package with `pip install omni-nli[huggingface]`.
- Can be used on your own computer, so your data stays local and private.
- Has an MCP interface (for agents) and a REST API for conventional use as a microservice (a rough usage sketch follows at the end of this post).
- Supports using models from different sources (Ollama, OpenRouter, and HuggingFace).
- Can be used to check whether a model is contradicting itself.
- Supports showing the reasoning, so you can see why it thinks a claim is wrong.

If you are interested in knowing more, there is more information in the links below:

Project's GitHub repo: https://github.com/CogitatorTech/omni-nli

Project's documentation: https://cogitatortech.github.io/omni-nli/
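To make the REST API point a bit more concrete, here is a minimal sketch of what a premise/hypothesis check against a locally running server could look like. The port, endpoint path, and JSON field names used here (`/api/v1/inference`, `premise`, `hypothesis`) are placeholders assumed for illustration, not the project's actual API; see the documentation linked above for the real request and response format.

```python
# Hypothetical sketch of calling a locally running Omni-NLI REST API.
# The base URL, endpoint path, and JSON field names below are assumptions
# for illustration only -- check the project docs for the actual API shape.
import requests

BASE_URL = "http://localhost:8000"  # assumed address of the local server

payload = {
    "premise": "The package was delivered on Tuesday morning.",
    "hypothesis": "The package arrived before Wednesday.",
}

# Assumed endpoint name; the real route may differ.
resp = requests.post(f"{BASE_URL}/api/v1/inference", json=payload, timeout=30)
resp.raise_for_status()
result = resp.json()

# One would expect something like a label (entailment / contradiction / neutral),
# a confidence score, and optionally the model's reasoning in the response.
print(result)
```

The same premise/hypothesis payload could presumably be reused for the self-consistency check described above, by treating two of a model's own statements as the premise and hypothesis and looking for a contradiction label.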