Greg Brockman Just Told Us Exactly Where OpenAI Is Heading

OpenAI president Greg Brockman gave an interview this week and said a lot more than the usual corporate talking points. On AGI: OpenAI believes they have a definitive line of sight to it. The debate about how far text intelligence can go has, in his words, been answered. They see it. The applications people have always dreamed about are actually starting to come within reach.
On Sora getting dropped: the honest reason is that compute allocation at this scale has become genuinely painful. Better models are coming this year, and the internal pressure over where resources go has gone up, not down. Something had to give.
The part worth paying attention to is what he said about Spud.
This isn't an incremental update. It's an entirely new base model built on a fresh pre-training run, with two years of research behind it. Brockman spent the last 18 months focused almost entirely on the GPU infrastructure needed to support those training runs, and he describes Spud as that research finally coming to fruition.
On top of Spud, OpenAI is building an autonomous AI system designed to handle the day-to-day work of a research scientist. Brockman's framing is clear about the hierarchy: the AI acts as the junior researcher running tasks on its own, and you're the senior researcher reviewing outputs, giving feedback, and setting the direction. You don't even need the mechanical skills yourself. The goal is to massively accelerate how fast they can produce new models and make research breakthroughs happen. The junior researcher is still on track for September, with Spud as the foundation.
Anthropic Is Building An Always-On Agent And It Just Got Leaked!

Anthropic is quietly testing something called Conway inside Claude, and Testing Catalog got an exclusive look at it before any official announcement. It surfaces as a separate sidebar option alongside the existing interface, and selecting it opens a dedicated environment that runs completely independently from a standard chat session.
There's a lot inside. Conway can run Claude Code, support external webhooks, work with Chrome, and send notifications. The System section has an Extensions area where you'll be able to install custom tools, UI tabs, and context handlers using a new .cnw.zip file format, which suggests Anthropic is building out a full third-party ecosystem around this thing. There's also a Connectors section showing which tools are linked to the instance, and Claude in Chrome can connect directly to Conway so everything stays live and in sync.
The webhooks piece is probably the most significant detail. Public URLs can wake the Conway instance when an external service calls it, meaning this agent doesn't need you to open a tab or start a conversation to do something. It can sit in the background, stay connected to your tools and services, and respond to triggers on its own. That's a fundamentally different product than a chatbot.
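To make the wake-on-webhook pattern concrete, here's a minimal sketch of the general mechanism: an external service POSTs to a public URL, and the receiving process hands the event to a background agent without any user session involved. Nothing here comes from Anthropic's actual implementation; the endpoint, the payload shape, and `wake_agent()` are all illustrative stand-ins.

```python
# Hypothetical sketch of the "webhook wakes the agent" pattern.
# None of these names are Anthropic's; the payload fields and
# wake_agent() are placeholders for whatever the real agent does.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def wake_agent(event: dict) -> str:
    # Stand-in for handing the trigger to a persistent agent loop.
    source = event.get("source", "unknown")
    action = event.get("action", "noop")
    return f"agent woken by {source}: {action}"

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # An external service calls the public URL; the agent wakes
        # without anyone opening a tab or starting a conversation.
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length) or b"{}")
        result = wake_agent(event)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(result.encode())

# To run locally (a real deployment would expose a public URL):
# HTTPServer(("127.0.0.1", 8080), WebhookHandler).serve_forever()
```

The point of the pattern is that the trigger, not the user, starts the work: the process idles until the URL is hit, then dispatches on the event it receives.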
This lines up directly with what leaked from the Claude Code source code a few days ago. The KAIROS feature buried inside that leak described an always-on Claude running persistently in the background, watching your workflow, and taking action without being asked. Conway looks like the product version of that idea. If Anthropic ships this anywhere close to what's been spotted, it would be one of their biggest product moves yet.
Gemma 4 Is Dropping Thursday And Google’s Own People Are Hyping It
Logan Kilpatrick, who leads Google AI Studio, posted a single word on X today: "Gemma." That's it. No context, no announcement, no press release. Just the name. Testing Catalog picked it up immediately and reported that Gemma 4 is likely dropping this Thursday.
When someone that close to the product posts something that cryptic, it's not an accident. Logan is Google AI Studio's product lead and he knows exactly what a one-word post does to the timeline. The reply from Testing Catalog was just a string of eyes emojis. Pretty much sums it up.
Gemma is Google's open model family, meaning this would be available for local use, fine-tuning, and self-hosting. If the release lands Thursday, it'll be one of the more significant open model drops of the year. We'll be covering it as soon as it's out.
