We're Interviewing OpenAI Leadership All Day
Greg Brockman, Mark Chen, Brad Lightcap, and other company power players join the stream today
Good morning, we’re live on TBPN. Let’s get to Coogan’s run of show for August 7, 2025.
Secret Superintelligence
Today we’re interviewing Mark Chen, Greg Brockman, and several other members of OpenAI’s leadership team about GPT-5. There’s no question the model is a huge milestone for the company and for technology broadly, but it has me thinking about the nature of superintelligence.
The other day, I was joking that my first question for a superintelligent AI would be: “Teach me how to build an open source, superintelligent AI.” The riff being that then I could go to Meta and earn $100 million a year as an AI researcher because I’d have all the information necessary to build frontier AI models for Meta.
Obviously, no one is expecting GPT-5 to answer rigorous questions about how GPT-5 itself was built. That’s intellectual property, tightly controlled by OpenAI’s inner circle. In fact, just knowing which training strategies are most effective is so valuable that when Zuckerberg started poaching from OpenAI, the company’s Chief Research Officer Mark Chen told his team he felt something had been stolen from them. He was, in part, referring to trade secrets.
But aside from key intellectual property — like how to design frontier AI models — I wonder how much of the economy really hinges on secrets. Peter Thiel writes extensively about secrets in Zero to One, identifying them as one of the major keys to building a successful company.
I wonder how important secrets are in the broader economy. Even in lower-level jobs, there are little tricks of the trade, small process optimizations that drive outperformance (which the outperformer might not even be aware they’re doing). Perhaps those stay hidden, not just from the open web (and training data sets, and LLMs), but even from the outperformer’s own manager, for example.
Dwarkesh often discusses continual learning as a roadblock to superintelligence, but it feels like a more tractable challenge — at least as a research frontier. Though I do wonder about the nature of secrets, because if you can keep something out of the training data forever, and it’s impossible to derive it from the laws of physics, it could be very, very difficult to ever create a system truly able to answer every single question.
Can God create a stone so heavy that he can’t lift it?
TBPN’s top stories
RUMOR: Altman announces $1.5M bonus for all OpenAI employees distributed over next 2 years…
Anduril inks weaponry contract with Taiwan…
Meta snags product designer Joshua Pekera from Airbnb…
duPont Registry: 2008 Koenigsegg CCXR estimated at $2,750,000 – $3,500,000 heads to auction…
Supercars publishes list of celebs blacklisted by Ferrari…
Derek Thompson: How AI Conquered the US Economy…
Microsoft drops limited edition Windows XP Crocs…
OpenAI to give the entire U.S. federal government access to ChatGPT Enterprise for $1 per agency…
Bloomberg: How Doodles Became a Cutthroat Billion-Dollar Dog Industry…
Newsletter CEO accused of inflating subscriber numbers to advertisers…
East coast convenience chain Sheetz offers 50% discount for payments made in BTC…
Timeline bangers
Gongworthy
Palantir CTO Shyam Sankar’s net worth swells to over a billion as stock rips
Today’s lineup
OpenAI Chief Research Officer Mark Chen at 11:30
OpenAI President Greg Brockman at 12:00
OpenAI CFO Sarah Friar at 12:50
Warp founder Zach Lloyd at 1:00
Charlie founder Riley Tomasek at 1:10
Vercel CEO Guillermo Rauch at 1:20
Factory CTO and Co-Founder Eno Reyes at 1:30
Augment Co-Founder Guy Gur-Ari at 1:40
CodeRabbit CEO and Co-Founder Harjot Gill at 1:50
OpenAI VP of Research, Post-Training Max Schwarzer at 2:00
Raindrop CTO Ben Hylak at 2:15
OpenAI COO Brad Lightcap at 2:35
What you missed yesterday on TBPN
Watch yesterday’s episode here.
[01:33:46] Andreessen was on the show yesterday. A few standout moments: (1) he uses GPT to generate 30-page “novels” to deep dive on topics, and (2) he’s updated his thinking on open source since the release of DeepSeek, GPT-OSS, and open weights.
Other topics included: the pace of technology this summer; Apple sitting on the sidelines of the AI Olympics; how “tech products become obsolete the moment they become perfect”; the value of the internet advertising model; why model training isn’t copyright violation; and Lina Khan’s FIG victory lap.
[01:02:06] Avi Schiffmann’s AI pendant Friend has been forcing some interesting questions on the timeline. Reggie James writes:
Sitting down at breakfast, Yatu reveals he’s wearing an AI pendant product (Omi). Sean and Jackson freeze.
A couple of days later, I’m wearing a Friend necklace and talking to my wife, and I stop to look at it. Wondering if it should be included in the goings-on of my home life.
I think it’s clear that we will start to create a set of social norms around AI that are extremely explicit. We will state things like “can all AIs leave the room.” Spaces will have machinery running that renders connected electronics useless…
Jackson Dahl’s review of Friend is also worth reading.
[01:12:44] Roblox, Anduril, CrowdStrike, and Apple fit into a16z GP David George’s concept for “modelbusters,” companies that “either reveal a market that’s much larger than anticipated, or expand into new product lines so effectively that they break out of their original category entirely.”
But is a modelbuster un-modelable? Can there truly be a model for finding a modelbuster?
[01:06:26] General Matter is sprinting toward restarting uranium enrichment in the U.S. It was obvious this company needed to exist, but no one was pitching it. Then the timeline accelerated — the government contract came through much faster than expected.
[00:07:00] “I can’t say much, but I left extremely bullish on SaaS, even in the age of AI.” Jordi reported back after hanging out with “the most powerful people” in the Valley on Tuesday night. What did he see there?
[00:11:20] Google’s prompt-based world simulator Genie 3 is obviously amazing, but where will it ultimately land in terms of use cases? A meditative hangout space? Something you put on in the background for ambiance? Live lucid dreaming? Creating synthetic memories with old family photos?
[00:08:25] Legendary poster and new-ish X engineer Nikita Bier has been signaling that he’s working on improving X’s algo. But is it actually getting better, or is the slop increasing unabated?
[01:10:32] We know Social Network 2 is in development. If anyone has leads, we’ve got to get John and Jordi in the background as extras — it has to be doable.
Other guests: Shopify president Harley Finkelstein on the company’s blockbuster earnings and tariffs [02:09:20]; Lovable CEO Anton Osicka on how he got to $10M ARR in 2 months with a 15-person team (GONG) [02:40:47]; GitHub CEO Thomas Dohmke on the widespread adoption of Copilot [02:49:42]; 137 Ventures cofounder Alex Jacobson [02:59:06]; and Rillet CEO Nicolas Kopp [03:17:47].