GM. This is Milk Road AI, the newsletter that tracks the AI wars like it’s fantasy football for finance nerds.

This is the first ever AI edition!

My name is Melvin, and I’m here to kick butt and capitalize on the future - and trust me, the robots aren’t waiting.

Twice a week (Monday and Wednesday), I’ll hit your inbox with the biggest moves in the space and why they matter for your bags.

Here’s what we got for you today:

  • ✍️ The Google vs. Nvidia war

  • 🍪 ChatGPT is officially selling out

RAD Intel is helping companies double their sales using AI. Check out the other ways RAD Intel’s AI can help your business.

Prices as of 10:00 AM ET.

THE DUST SETTLES ON THE PRE-THANKSGIVING BOMBSHELL

Did you enjoy that turkey? Good. 

Because the holidays are over, and there are no warm meals in wartime.

And make no mistake: there is a war in the AI industry.

You can see it in the bears vs. bulls battle for the S&P 500. 

You can see it in the crypto vs. AI fight for capital.

But right now, a new front has opened, and it might be the most important battle of the decade: 

The Google vs. Nvidia war.

And for the first time in this entire AI race, Nvidia might be the one getting cooked.

Right before the holiday break, reports surfaced that Meta (Facebook) had entered advanced negotiations to buy Google’s custom chips (TPUs).

Or, as Trump would call it, “a YUGE DEAL.”

For nearly a decade, the AI hardware market has been a one-party state. 

It was Nvidia’s world. We were just living in it.

Here is how Nvidia held the crown:

  • They had the fastest chips (GPUs) on earth.

  • They had software (CUDA) that was impossible to quit.

This combo let them basically hold Big Tech hostage. 

If you wanted to survive the AI arms race, you didn't have a choice: you paid the Nvidia Tax and you said thank you. 

But the news from last week has fundamentally fractured this status quo.

The headlines focused on the who, but the real story is the why.

Why would Mark Zuckerberg, who has arguably gathered one of the best AI teams in the world, risk his infrastructure on Google's chips?

The answer: it’s an efficiency war.

You see, the industry is witnessing a clash between two philosophies. 

To understand why this deal is happening and how to pick a winner in this war, you have to understand the hardware.

Nerd corner: the "army" vs. the "assembly line" 🤓 

You’re going to hear a lot of alphabet soup this week (CPU, GPU, TPU), but it really just comes down to how the “workers” are organized.

Let’s start with GPUs.

GPUs are an Army of General Workers: thousands of flexible soldiers who can handle all kinds of tasks. 

Great for broad AI, but pricey to manage and power.

TPUs are an Automated Assembly Line: a factory built for one job.

They can’t do everything, but for this task, they’re faster and much cheaper.

The scorecard is brutal for the generalists: the Assembly Line is just as fast as the Army at raw math, but it uses way less power by cutting out all that management overhead. 

That’s why reports show TPUs running up to 30x more energy-efficient and 4–10x more cost-effective than GPUs. 

One estimate puts 1 GW of compute at ~$700M per year on TPUs, roughly 30% less than GPUs.
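
Want to sanity-check that math yourself? Here’s a minimal back-of-envelope sketch in Python. The ~$700M figure and the ~30% savings are the rough estimates cited above (not official pricing), so treat the output as directional, not a quote.

```python
# Back-of-envelope: what 1 GW of compute might cost per year on TPUs vs. GPUs.
# All inputs are rough estimates pulled from the reports cited above.

TPU_COST_PER_GW_PER_YEAR = 700e6  # ~$700M/year for 1 GW on TPUs (estimate)
TPU_SAVINGS_VS_GPU = 0.30         # TPUs cited as roughly 30% cheaper than GPUs

# If TPUs come in ~30% cheaper, the implied GPU bill for the same 1 GW is:
gpu_cost_per_gw_per_year = TPU_COST_PER_GW_PER_YEAR / (1 - TPU_SAVINGS_VS_GPU)
annual_savings = gpu_cost_per_gw_per_year - TPU_COST_PER_GW_PER_YEAR

print(f"GPU cost for 1 GW/year: ~${gpu_cost_per_gw_per_year / 1e9:.2f}B")
print(f"TPU cost for 1 GW/year: ~${TPU_COST_PER_GW_PER_YEAR / 1e9:.2f}B")
print(f"Implied savings:        ~${annual_savings / 1e9:.2f}B per gigawatt, per year")
```

That’s roughly $300M saved per gigawatt, per year. Multiply it across the multi-gigawatt buildouts the hyperscalers are planning, and you can see why Zuck is shopping around.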

This efficiency is critical when you consider that JPMorgan research estimates the 5 biggest AI hyperscalers will collectively spend over $1.2 trillion by 2027. 

So, with the world’s biggest AI players preparing to pour roughly a trillion dollars into raw compute, from silicon to electricity, you’d think Nvidia would feel the pressure. 

But the truth is, they’re still the main beneficiary: their chips power most of this buildout, and Google is trying to change that through their TPUs.

But Jensen Huang isn’t worried. 

He’s playing offense, and spoiler alert: he’s got plenty of tricks up his sleeve.

If you’re running a business without AI, you’re leaving money on the table.

RAD Intel has built an AI engine that helps companies:

  • Double their sales growth

  • Land seven-figure partnerships

Like the sound of that? Here’s what else it can do for your business:

  • Pinpoint the key traits of your target market

  • Create content your audience actually cares about

  • Identify the right influencers for your brand

Long story short: let RAD Intel’s AI do the heavy lifting for you.

Nvidia’s counter-move: the "design war"

If you thought Jensen Huang was going to sit there and let Mark Zuckerberg ruin his margins, you don’t know Jensen.

While the market was obsessing over the Meta deal last week, Nvidia made a lethal move.

A $2 billion investment and partnership with Synopsys.

Nearly every chip in the world, whether it’s from Apple, AMD, or Google, is designed using software from Synopsys.

It is the Adobe Photoshop of chip design. You basically cannot build a modern chip without it.

By investing heavily here, Nvidia is integrating its own AI directly into the design tools.

This creates a wild circular trap.

If Google wants to design the next generation of TPUs to kill Nvidia, they might have to use Nvidia-powered software to do it.

Nvidia is trying to lock up the design layer before they lose their grip on the hardware layer.

And he didn’t just let his wallet do the talking. 

He went on national TV to make sure everyone heard the message loud and clear.

Jensen claps back: Google is... cute.

I wasn’t kidding: this guy literally appeared on CNBC to announce his deal with Synopsys, and he dropped a masterclass in corporate shade.

When asked about Google and Meta, he didn't panic. He smiled and brought up the ASIC problem.

What’s an ASIC? It stands for Application-Specific Integrated Circuit.

It’s a fancy way of saying a chip designed to do exactly one thing. 

It’s like a toaster. A toaster is amazing at toasting bread (efficient, cheap, fast), but if you try to cook a steak in it, you’re going to have a bad time. 

Google’s TPU is an ASIC.

Jensen’s argument? He sells the oven.

"You know, listen, we’ve been -- we’ve been competing with ASICs now for quite a long time, and Google has had ASICs for a long time... and they did a great job... But what NVIDIA does is much more versatile. Our technology is much more fungible."

Ouch. Translation: Good job on your little toaster. We are building the kitchen.

He then pivoted to the "Generational Gap" argument, reminding viewers that while Google's ASICs are great for specific tasks (like chatbots), they are too rigid for the next phase of AI.

"Accelerating Synopsys design tools... requires a computer architecture like CUDA. It’s not available for ASICs. And so NVIDIA can address markets that are much, much broader, not just chatbots."

He’s betting that if this deal happens, by the time Meta finally migrates to Google’s rigid chips, the AI models will have changed so much that those chips will be obsolete. 

As Jensen put it, the future isn't just text on a screen, it's "Physical AI".

Wrapping up

So, is the Nvidia Era officially over? Not exactly, but the era of easy mode is over.

For the last three years, owning Nvidia stock was like being the only umbrella salesman in a hurricane: you didn't have to sell; people were begging to buy. 

But now, the whales are finally learning how to swim on their own.

In other words: the Nvidia Tax just got audited. 

Meta buying Google chips is a massive signal that the unlimited pricing power thesis is cracking. 

It’s like your most loyal subscriber cancelling their plan to build their own streaming service. 

However, Jensen’s move with Synopsys proves he is playing 4D chess while everyone else is playing checkers. 

By focusing on the design software, he just pulled a landlord move.

Google can build their own house, sure, but they’re going to have to pay Nvidia rent for the land they build it on.

That’s it for the first ever Milk Road AI edition - we’d love to know what you think. Reply to this email with:

  1. Brilliant

  2. It was decent

  3. I expected more

Incogni helps remove your private information from the internet. Sign up with code MILKROAD for 55% off.

Bridge helps businesses send, store, accept, and launch stablecoins instantly. Learn how Bridge is powering stablecoin-backed money movement.

Wanna get in on the next iteration of AI domains, before everyone else? The Singulant just launched .AI4 domain names – check them out here.

BITE-SIZED COOKIES FOR THE ROAD 🍪

Your WhatsApp is getting hacked by AI. A new AI-powered malware wave called “Water Saci” is sweeping Brazil, rewriting its own code to dodge scanners and impersonate trusted contacts. Stay safe out there, Roadies.

ChatGPT is officially selling out. Leaked code in the latest Android beta shows OpenAI is building its own ad network, meaning sponsored results could start showing up inside your chats.

Amazon just dropped a nuclear weapon on Nvidia. AWS announced the new Trainium3 chip today, and the specs are terrifying for Jensen Huang. It offers 4x the performance and 40% better energy efficiency than the previous version. 

MILKY MEMES 🤣
