Show HN: OpenEvolve – An open-source implementation of DeepMind's AlphaEvolve

Author: codelion · 9 months ago

I've built an open-source implementation of Google DeepMind's AlphaEvolve system called OpenEvolve. It's an evolutionary coding agent that uses LLMs to discover and optimize algorithms through iterative evolution.

Try it out: https://github.com/codelion/openevolve

What is this?

OpenEvolve evolves entire codebases (not just single functions) by combining an ensemble of LLMs with automated evaluation. It follows the evolutionary approach described in the AlphaEvolve paper, but it is fully open source and configurable.

I built this because I wanted to experiment with evolutionary code generation and see whether I could replicate DeepMind's results. The original system improved Google's data centers and discovered new mathematical algorithms, but no implementation was released.

How it works

The system has four main components that work together in an evolutionary loop:

1. Program Database: stores programs and their metrics in a MAP-Elites-inspired structure
2. Prompt Sampler: builds context-rich prompts from past solutions
3. LLM Ensemble: generates code modifications using multiple models
4. Evaluator Pool: tests programs and provides feedback metrics
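To make the loop concrete, here is a minimal sketch of how those four pieces fit together. This is not OpenEvolve's actual API: the function signatures, the metric keys ("score", "complexity"), and the prompt format are illustrative assumptions, and the real implementation is in the repo.

  import random

  def evolve(initial_program, evaluate, llm_ensemble, iterations=50):
      # Program database, MAP-Elites style: one slot per feature bin, so diverse
      # solutions survive rather than only the single best program.
      database = {}

      def admit(program, metrics):
          key = round(metrics["complexity"], 1)      # toy feature descriptor
          incumbent = database.get(key)
          if incumbent is None or metrics["score"] > incumbent[1]["score"]:
              database[key] = (program, metrics)

      admit(initial_program, evaluate(initial_program))

      for _ in range(iterations):
          # Prompt sampler: pick a parent and add a few past solutions as context.
          parent, parent_metrics = random.choice(list(database.values()))
          inspirations = "\n\n".join(p for p, _ in list(database.values())[:3])
          prompt = (f"Improve this program (current score {parent_metrics['score']}):\n"
                    f"{parent}\n\nPrevious solutions:\n{inspirations}")

          # LLM ensemble: each model proposes a modified program; the evaluator
          # scores it, and the database keeps it if it improves its feature bin.
          for llm in llm_ensemble:
              child = llm(prompt)                    # candidate source code as text
              admit(child, evaluate(child))

      # Return the best program found across all bins.
      return max(database.values(), key=lambda entry: entry[1]["score"])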
What you can do with it

- Run the existing examples to see evolution in action
- Define your own problems with custom evaluation functions
- Configure LLM backends (works with any OpenAI-compatible API)
- Use multiple LLMs in an ensemble for better results
- Optimize algorithms against multiple objectives

Two examples I've replicated from the AlphaEvolve paper:

- Circle Packing: evolved from simple geometric patterns to sophisticated mathematical optimization, reaching 99.97% of DeepMind's reported result (a sum of radii of 2.634 vs. 2.635 for n=26).
- Function Minimization: transformed a random search into a complete simulated annealing algorithm with a cooling schedule and adaptive step sizes.
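To give a feel for the function-minimization result: the evolved program itself is in the repo's examples, but the general shape it converged on (a simulated-annealing loop with a cooling schedule and an adaptive step size) looks roughly like the generic sketch below. This is my own illustration, not the evolved code.

  import math
  import random

  def simulated_annealing(f, x0, iters=10_000, t0=1.0, cooling=0.999):
      # Generic simulated annealing for minimizing f over a single variable.
      x, fx = x0, f(x0)
      best_x, best_fx = x, fx
      temp, step = t0, 1.0
      for _ in range(iters):
          candidate = x + random.uniform(-step, step)
          fc = f(candidate)
          # Always accept improvements; accept worse moves with Boltzmann probability.
          accepted = fc < fx or random.random() < math.exp((fx - fc) / max(temp, 1e-12))
          if accepted:
              x, fx = candidate, fc
              if fx < best_fx:
                  best_x, best_fx = x, fx
          temp *= cooling          # cooling schedule
          step = min(step * 1.05, 5.0) if accepted else max(step * 0.95, 1e-3)  # adaptive step size
      return best_x, best_fx

  # Example: minimize a bumpy one-dimensional function starting from x = 5.
  print(simulated_annealing(lambda x: (x - 2) ** 2 + math.sin(5 * x), 5.0))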
Technical insights

- Low-latency LLMs are critical for rapid generation cycles
- Best results came from using Gemini-Flash-2.0-lite + Gemini-Flash-2.0 as the ensemble
- For the circle packing problem, Gemini-Flash-2.0 + Claude-Sonnet-3.7 performed best
- Cerebras AI's API provided the fastest inference speeds
- A two-phase approach (exploration, then exploitation) worked best for complex problems

Getting started (takes < 2 minutes)

  # Clone and install
  git clone https://github.com/codelion/openevolve.git
  cd openevolve
  pip install -e .

  # Run the function minimization example
  python openevolve-run.py examples/function_minimization/initial_program.py \
    examples/function_minimization/evaluator.py \
    --config examples/function_minimization/config.yaml \
    --iterations 50

All you need is Python 3.9+ and an API key for an LLM service. Configuration is done through simple YAML files.

I'll be around to answer questions and discuss!
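One last note for anyone who wants to try OpenEvolve on their own problem: an evaluator is just a script that runs a candidate program and returns metrics. The exact contract openevolve-run.py expects is defined by the files under examples/ in the repo, so treat the function name, signature, and return keys below as assumptions; this is a sketch of the idea, not the real interface.

  # evaluator_sketch.py - hypothetical custom evaluator (names and contract are
  # assumptions; see examples/function_minimization/evaluator.py for the real one).
  import importlib.util

  def evaluate(program_path):
      # Load the candidate program from its file path.
      spec = importlib.util.spec_from_file_location("candidate", program_path)
      candidate = importlib.util.module_from_spec(spec)
      spec.loader.exec_module(candidate)

      # Assume the candidate exposes search(f, bounds); score it on a test function.
      def test_fn(x):
          return (x - 2) ** 2

      best_x, best_value = candidate.search(test_fn, bounds=(-10, 10))

      return {
          "score": -best_value,                      # higher is better
          "distance_to_optimum": abs(best_x - 2.0),  # extra feedback metric
      }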