Sapien Weekly Digest - February 6th
Hey there, Sapiens! We’re ready to share what we’ve been up to.

Why Proof of Quality:
Did you know that since the start of the 2025 school year, there have been as many as 19 cases of autonomous vehicles ignoring school bus stop signs while children were getting on or off the bus in Austin, Texas alone?
We do. Proof of Quality fixes this.
We're finally ready to unveil what we've been working on. Over the coming weeks, we'll introduce in detail how we're approaching the ever-growing problem of poor-quality data in datasets, data that can have real-world consequences.
To kick us off, check out our Founder & CEO Rowan Stone's article on the how and why, and especially the “why now?” of Proof of Quality right here: Proof of Quality: Who Verified the Data That Trained Your Model?
If you’re a builder, hit us up at [email protected]
Sapien Team Takeover:
“From task creation to model implementation, where there is vagueness there is trouble.”
Today’s Discord Team Takeover featured the amazing Madison McCarthy, Operations Manager at Sapien, who explored the operational realities behind scaling high-quality AI training data. Madison focused on the structural risks that emerge as projects grow, and explained how instruction clarity, bias prevention, and rigorous task specification determine whether large-scale data pipelines succeed or fail.
🚀 What’s New in the App?
While we’re still in the transition phase, the following tasks are live and ready for you!
Current Task Overview:
😂Thinking Comic Lines
😂Thinking Comic Lines QA
👩🍳 Gastro Tag – Easy / Intermediate / Hard
👩🍳 Gastro Tag QA – Easy / Intermediate / Hard
💬 Emotion Prompt – Easy / Intermediate / Hard
💬 Emotion Prompt QA – Easy / Intermediate / Hard
🧩 Multi Choice Error Review – Easy / Intermediate / Hard
📦 Logic Path Vietnamese QA
Our Voices in the World
“SpaceX intends to build a ton of space domiciled data centers leveraging solar power to keep up with insane rising demand for power/compute from the AI industry” - Rowan Stone
Amidst final preparations for the next phase of Sapien, our CEO Rowan Stone was sought out by Forbes for commentary on Elon Musk’s merger of xAI and SpaceX! Read Antonio Pequeño IV’s piece on the merger right here!
“It all comes down to being able to trust that data” - Lukas Grapentine
As the world is looking more and more towards the things that AI cannot do without human help, both Rowan and our Director of Solutions Architecture Lukas Grapentine were quoted by Business Insider’s Lloyd Lee about how even modern Robotaxis will always need a human in the loop.
What else happened in AI?
The World outside of Sapien:
Anthropic and OpenAI locked in AI model race:
Amid the recent discourse about ads in AI chats, in which Anthropic mocked the idea and pledged to keep chats ad-free, both Anthropic and OpenAI released new models on the same day.
On February 5th, OpenAI released GPT 5.3 Codex to push agentic coding forward, alongside a dedicated Codex desktop app designed for longer-running software tasks. OpenAI positions the model as combining frontier coding performance with stronger reasoning and professional knowledge, targeting complex end-to-end tasks, and states that GPT 5.3 Codex runs 25 percent faster than the prior Codex model line, improving throughput for extended workflows.
At the same time, Anthropic revealed the latest installment of its smartest model class, Claude Opus 4.6. Just like GPT 5.3 Codex, it’s aimed at improved planning, code review, debugging, and reliability in large codebases, explicitly targeting agentic workflows. As part of this, GitHub announced Opus 4.6 availability in GitHub Copilot.
Agentic coding remains the highest-leverage near-term application for frontier models. Faster, more reliable long-horizon execution shifts software creation from assistance to delegation, accelerating delivery cycles and increasing pressure on developer tooling incumbents.
The AI infrastructure capex wave accelerates across Big Tech:
On February 2nd, Elon Musk merged xAI into SpaceX in a record-setting $1.25 trillion transaction, the largest M&A deal in history, creating a vertically integrated AI-and-space infrastructure conglomerate. The merged entity aims to combine rockets, satellite networks, AI models, social platforms, and communications infrastructure into a single ecosystem. Musk positioned the merger around a long-term strategy of deploying solar-powered orbital data centers and satellite-based AI compute to meet growing energy demands for large-scale AI training and inference.
Similarly, Amazon, Alphabet, and Meta committed to sharply higher 2026 capital spending focused on AI infrastructure, pushing combined forecasts for the major players above $500 billion. Amazon alone outlined a 2026 capital expenditure plan reported at roughly $200 billion. This comes at a time when compute and power constraints define competitive advantage as much as model quality does.
At the same time, Intel has recommitted to GPUs for data centers amid AI acceleration demand; a credible third GPU platform would reshape accelerator pricing power, supply resilience, and software ecosystem dynamics.
Sam Altman’s Succession Plans:
Sam Altman outlined a provocative succession concept for OpenAI, claiming that he will eventually hand leadership to an AI system, while reiterating claims that the company is approaching, or conceptually already at, AGI-level capability. He suggested OpenAI is “very close” to AGI and at one point characterized progress as having “basically have built AGI” in a conceptual or aspirational sense, later clarifying the remark was not strictly literal. Altman argued that achieving AGI is likely to require many incremental engineering breakthroughs rather than a single scientific discovery.
Maybe Proof of Quality can help him.
Welcome to February. Welcome to PoQ.