Welcome, humans.
Scott Guthrie has been at Microsoft for 28.5 years, since back when the internet was just starting to get big.
Today, he oversees Microsoft's entire Cloud + AI infrastructure. And at Microsoft Ignite 2025, he gave us an exclusive interview about the new Fairwater AI data centers that are rewriting the rules of what's possible with AI infrastructure.
These aren't just big data centers. They're the densest concentration of GPU power on the planet, featuring 5 million individual cables, Grace Blackwell GPUs that are 12x more powerful than AWS's Trainium2 chips, and a cooling system so efficient it won't need new water for six years.
In the episode, Corey sits down with Scott to explore:
- (1:11) Microsoft's multi-model strategy: Why they're partnering with both OpenAI and Anthropic (Claude now runs natively in Azure, GitHub, and M365).
- (8:33) The observability breakthrough: How Microsoft discovered they had 10x more autonomous agents running internally than they thought.
- (15:29) Building the world's fastest data center: Why the double-decker Fairwater design keeps every cable under 230 meters.
- (17:49) The sustainability play: How liquid cooling uses just 20 houses' worth of water, and doesn't replace it for 6+ years.
- (19:34) 100% renewable energy commitment: No carbon credits, no accounting tricks, just real solar, wind, hydro, and nuclear.
- (21:09) Creating 6,000+ skilled jobs: Why these facilities need master electricians and pipe fitters, not just engineers.
- (23:26) The future nobody's talking about: Scott's prediction on what AI will enable that we can't even conceptualize today (like how e-commerce was science fiction in the early internet days).
Bottom line: Microsoft is building infrastructure at a scale that makes science fiction look modest. Grace Blackwell 300 GPUs deliver 12x the throughput in flat networks with zero oversubscription. And they're doing it while prioritizing sustainability, safety, and community impact.
Listen now: YouTube | Spotify | Apple Podcasts
Dive deeper with these resources:
- Microsoft connects datacenters to build its first AI superfactory
- Microsoft's new Agent 365 platform
- Azure AI services
- More news from Microsoft Ignite 2025
P.S. Scott made a fascinating point about AI following the same pattern as the internet and smartphones: we're still in the "dial-up modem" phase. The apps we can't even conceptualize yet? Those are coming. And they'll need infrastructure like this to run on.

THIS EPISODE WAS MADE POSSIBLE BY OUR PARTNER…
Dell Technologies Is First to Ship NVIDIA's Next-Gen AI Chip, and You Can Buy It Now

Remember waiting months for GPUs during the AI boom? While most companies are still waiting to get their hands on NVIDIA's Blackwell architecture, Dell Technologies just started shipping it.
The Dell Pro Max with GB10 is the first desktop to pack NVIDIA's GB10 Grace Blackwell Superchip, the same next-gen architecture powering the AI labs building tomorrow's models. It's not a server rack. It's a desktop that sits on your desk.
Here's what you get:
- 128GB of LPDDR5X memory for running large models locally.
- 4TB SSD storage for massive datasets.
- NVIDIA DGX OS pre-installed: the same software stack used by AI research teams.
- 20 CPU cores (10 Cortex-X925 + 10 Cortex-A725) optimized for AI workloads.
This isn't for casual ChatGPT users. It's for teams training custom models, running inference at scale, or doing serious data science work, who are tired of waiting for cloud compute.
If you burn through thousands of GPU cloud credits in a few months, you'll definitely want to check this out. It's a surprisingly affordable way to bring Blackwell performance in-house. Check out the Dell Pro Max with GB10 here.

Tuesday, Nov 25: LIVE with Micah Hill-Smith of Artificial Analysis (10 AM PST | 12 PM CST)

Click the image above, go to YouTube, and click "notify me" to get an alert when we go live!
Don't miss this upcoming LIVE episode with Artificial Analysis' Micah Hill-Smith, where we dive deep into how to pick the right AI model for your needs.

ICYMI: Three More Episodes You Should Watch
Here are three more episodes we released recently that we think you'll love.
1. Live Gemini 3.0 Demo with Google DeepMind's Logan Kilpatrick
On Friday, we went hands-on with Gemini 3.0, joined by Logan Kilpatrick from Google DeepMind, to show you what the world's new best AI model can do:

Here are our favorite moments from the livestream:
- (10:02) The launch week reality: Why shipping relieves organizational pressure but creates new challenges around scaling for millions of users.
- (11:20) The benchmark problem nobody talks about: Why SWE-Bench Verified results can't be fairly compared across labs, since each uses custom agent harnesses they don't disclose.
- (13:26) Logan's personal benchmark philosophy: He prefers the breadth of custom benchmarks over the "legacy benchmarks" as a signal of real progress, but both are important.
- (18:03) Why Nano Banana became a phenomenon: The quirky naming strategy that helped differentiate from the broader "Gemini" brand.
- (19:46) The iOS clone that blew Logan's mind: Someone live-coded a fully functional iOS lookalike using the new Antigravity developer environment.
- (20:10) Inside AI Studio's gallery: Logan walks through the breadth of what's possible, from interactive landing pages to studio-quality image editing.
- (22:43) Why games changed everything for Logan: The massive gap between games you enjoy and games you can build just got way smaller.
- (27:00) The kinetic shapes breakthrough: 20 simultaneous physics simulations with different properties, in one prompt. Six months ago, models struggled with a single bouncing ball.
- (29:09) Prompting advice that matters: "You don't have to be a prompt magician." Logan shows single-sentence prompts that create complex apps.
- (31:22) The "I'm feeling lucky" button: AI Studio will literally come up with app ideas for you if you can't think of what to build.
- (45:31) Why betting against AI scaling is confusing: Logan's take on the bearish crowd is that "if you just follow the chart of progress, it continues to be true that models keep getting better."
- (46:37) TPUs as Google's secret weapon: The full-stack approach matters; co-designing model architecture with custom hardware gives Google a differentiated edge.
Practical takeaway: If you're choosing between frontier models or learning to vibe code, watch Logan demonstrate AI Studio's workflow. The design aesthetics alone are "incredible", and the fact that you can create multiplayer games or interactive experiences in under 100 seconds shows where vibe coding is headed.
Plus, our custom Cat Doom benchmark (yes, we're making this a thing) proves that consistent 3D world generation still needs work, but we're getting close!
2. David Hsu, CEO of Retool, on how 48% of non-engineers now build and ship internal production code

What you'll learn: Why nearly half of all software builders at major companies now have zero engineering background, and what that means for your career. Five years ago, that number was 5%. David's bold prediction: in 18-24 months, engineers will completely stop hand-coding internal applications.
You'll also learn:
- (1:54) Who the new 48% are: Revenue ops managers, finance teams, and people like Charlotte at Coursera who got promoted four times in five years by building tools
- (4:10) The AI breakthrough that enabled "vibe coding": How LLMs removed the need for CS degrees to build functional software
- (11:16) Why ChatGPT isn't delivering ROI: "You're basically the API, shuffling data back and forth" instead of automating the actual work
- (14:48) Agents vs chat interfaces: The difference between software that waits for instructions and software that actually does the job
- (24:14) The new role of engineers: Building guardrails so non-engineers don't accidentally drop production databases
- (31:27) Excel's billion users are next: Why every complicated spreadsheet should become an application
- (43:54) David's prediction: "100% of internal apps will be built by non-engineers, no doubt"
Bottom line: If you're still copying between ChatGPT and Google Docs, you're already behind. Watch David explain how high-agency people are using AI tools to build actual solutions, not just generate text. The future belongs to builders who can wield AI, regardless of their technical background.
Watch and/or listen now on: YouTube | Spotify | Apple Podcasts
3. Alex Wiltschko, CEO of Osmo, on how his company is teaching computers to smell

What you'll learn: How AI cracked the code on digitizing smell, the sense 100x more complex than vision, and why this changes everything from perfume to disease detection.
The biggest reveal? Osmo achieved "scent teleportation" by reading molecules in one room, uploading the data to the cloud, and recreating that exact smell in another room. Alex describes teleporting a plum as "one of the wildest days of my professional life." This isn't science fiction; it's production technology.
You'll also learn:
- (3:32) The complexity gap: Vision uses 3 receptors (RGB), but smell uses 300+ olfactory receptors; Osmo built the world's first "Primary Odor Map" using AI to crack this
- (19:46) Osmo Studio's speed advantage: Design custom fragrances in one week instead of 18-24 months using traditional perfumery methods
- (21:54) Real applications today: How Osmo created a signature scent for the Museum of Pop Culture from just a photo
- (24:57) Why smell = emotion: Your brain literally touches the world when you smell something; it's wired directly to memory in ways vision and sound aren't
- (29:37) AI-designed molecules in production: Three molecules that have never existed in nature are already being used by major fragrance brands
- (35:16) The disease detection vision: Dogs can smell cancer and Parkinson's before symptoms appear; Osmo is building sensors to do the same (eventually iPhone-sized)
Practical takeaway: This interview is genuinely educational; we learned more about the science of smell than we expected. Alex also shares a hot take on why AI progress follows S-curves, not exponential growth. If you're curious about AI applications beyond text and images, this is a must-watch.
Stay curious,
The Neuron Team

That's all for today. For more AI treats, check out our website.
ICYMI: check out our most recent episodes below!
- Invisible Technologies' Caspar Eliot on how humans train AI
- (YouTube | Spotify | Apple)
- OpenAI's Ahmed El-Kishky on where AI coding goes next
- (YouTube | Spotify | Apple)

What'd you think of this podcast episode?
Pick an answer below, then tell us why with the "additional feedback" option.
P.P.S. Love the newsletter, but don't want to receive these podcast announcement emails? Don't unsubscribe; adjust your preferences to opt out of them here instead.
