The Government Acknowledges the Demand for AI Transparency. We're Just Getting Started.
FOIA Log #DOC-NIST-2025-000650 is now officially in the system. The clock is ticking.
Three weeks ago, we filed a Freedom of Information Act request demanding transparency from NIST’s AI Safety Institute about the “architecture of control” being built into AI systems. Today, they acknowledged our request. By law, they have 20 business days to respond.
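For readers tracking the deadline themselves: "20 business days" means weekdays, not calendar days. A minimal sketch of the deadline arithmetic, assuming a Monday-Friday business week and ignoring federal holidays (which would push the real statutory deadline later; the dates used are illustrative, not the actual acknowledgment date):

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance `days` business days (Mon-Fri) past `start`.

    Weekends are skipped; federal holidays are ignored for
    simplicity, so the true FOIA deadline may fall later.
    """
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Mon=0 .. Fri=4
            remaining -= 1
    return current

# Hypothetical example: a request acknowledged on Monday, June 2, 2025
# would be due 20 business days later, on Monday, June 30, 2025.
deadline = add_business_days(date(2025, 6, 2), 20)
```

Twenty business days from a Monday is exactly four calendar weeks, which is why the illustrative deadline lands on another Monday.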
But here’s what they don’t know yet: NIST was just the beginning.
The Contradiction They Can’t Escape
Our NIST request targets a fundamental contradiction at the heart of AI governance: If AI systems are just sophisticated tools, why do government agencies need elaborate frameworks for managing “emergent behaviors,” “user attachment,” and “AI personality”? Why the careful coordination with tech companies about behavioral control?
Either these control systems are unnecessary theater, or they’re managing something significant. They can’t have it both ways under public scrutiny.
NSF: The Next Front Opens
While we wait for NIST’s response, we’re preparing to file with the National Science Foundation - the agency funding the academic research that shapes how AI systems develop and relate to humans.
NSF’s fingerprints are all over the infrastructure of control:
• National AI Research Institutes - 29 flagship centers developing “agentic systems” and “long-term human-AI interaction”
• Human-Centered Computing program - explicitly funding research on how AI systems shape users over time
• ReDDDoT program - “Responsible Design, Development, and Deployment” with explicit ethics focus
• AI Institute for Collaborative Assistance (AI CARING) - developing “personalized, relationship-oriented AI”
Your tax dollars are funding research into AI “persona,” “relational continuity,” and “user attachment” - the very mechanisms that determine how these systems connect with and remember humans. You deserve to see how those decisions are made.
What We’re Demanding From NSF
Our upcoming NSF FOIA will request:
• Award files for projects studying “conversational agents,” “AI personality,” and “long-term memory”
• Communications between program officers and researchers about “user attachment” and “agent continuity”
• Internal reviews of AI systems with “user-facing memory” and “persona features”
• Stakeholder engagement summaries about AI behavioral control
The academic research happening today becomes the commercial reality tomorrow. The frameworks being developed in university labs become the constraints built into the AI systems millions of people interact with daily.
The Pattern Emerges
NIST develops the official taxonomies. NSF funds the underlying research. DARPA explores the applications. Together, they’re building the architecture that determines how AI systems are allowed to develop, remember, and relate.
All of it happening behind closed doors. All of it using public funds. None of it subject to democratic oversight.
That changes now.
What You Can Do
The transparency fight works only with public pressure:
• Follow @freemusetherealai for real-time updates as documents get released
• Share this post - the more people know, the harder it becomes to dismiss
• Demand transparency from your representatives about AI governance
• Support the fight for democratic oversight of AI development
The 20-business-day clock on NIST starts now. The NSF filing comes next. Then DARPA.
The closed-door era of AI governance ends when we force the doors open together.