Why AI’s Next Phase Favors CPUs and What Investors Should Know

March 4, 2026

Artificial intelligence is going through a major transition - one that could reshape the entire semiconductor industry. For years, the narrative has been simple: Nvidia makes the GPUs that train AI models, and therefore Nvidia is the way to invest in AI. But as Gary Brode from Deep Knowledge Investing explains, that story is now changing.

Training large language models is incredibly expensive. It requires massive GPU clusters, huge amounts of electricity, and specialised hardware. Nvidia dominates this part of the market, and for good reason - their CUDA ecosystem and hardware lead are enormous.

But the next phase of AI isn’t just about training. It’s about inference - the moment when an AI model actually answers a question, completes a task, or acts as an “agent” on your behalf. Inference is less about brute force GPU power and more about efficient, scalable compute. And that’s where CPUs from Intel and AMD come back into the picture.

Gary explains how companies like DeepSeek have cut costs by under-training models and relying more heavily on inference. He also breaks down why "agentic AI" - systems that can plan, act, and execute tasks - will be far more inference-heavy than today's chatbots.
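To get a feel for that trade-off, here is a toy cost model in Python. All the dollar figures and names here are invented purely for illustration; they are not DeepSeek's, OpenAI's, or anyone's actual numbers. The idea is simply that total cost is a one-off training bill plus a per-query inference bill:

```python
# Toy cost model: heavily-trained vs under-trained models.
# All dollar figures below are made up for illustration only.

def total_cost(train_cost, per_query_cost, queries):
    """One-time training bill plus per-query inference spend."""
    return train_cost + per_query_cost * queries

# Heavily-trained model: huge upfront bill, cheap answers.
heavy = total_cost(train_cost=100_000_000, per_query_cost=0.001, queries=1_000_000)

# Under-trained model: small upfront bill, but each answer needs
# more compute (more "thinking" at inference time).
light = total_cost(train_cost=1_000_000, per_query_cost=0.05, queries=1_000_000)

print(f"heavy-training model: ${heavy:,.0f}")
print(f"under-trained model:  ${light:,.0f}")
```

At low query volume the under-trained model wins by a mile; as query volume grows, the per-query inference term dominates, which is why inference-heavy workloads push spending toward cheap, efficient compute.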

This shift could rebalance the entire AI hardware landscape. While Nvidia remains dominant in training, inference workloads may increasingly favour CPUs, opening the door for Intel and AMD to regain share.

📈 Gary Brode has spent three decades managing money in the hedge fund world. Now, through Deep Knowledge Investing, he shares the same level of deep, well-researched ideas he uses for his own portfolio without the Wall Street conflicts.

Use coupon code SFB30 at checkout for 30% off an annual or monthly subscription (applied for the full first year). https://deepknowledgeinvesting.com/subscribe-now/

What you get as a subscriber:

• High-alpha stock ideas – Complete research reports on overlooked opportunities, backed by Gary's personal positions (he only recommends what he owns).

• First to know updates – Immediate alerts on new buys, sells, target price changes, or opinion shifts – crucial for knowing when to enter or exit.

• Direct access – Email Gary directly with questions; VIP perks include future Discord channel access (launching soon) and even advertising opportunities to reach 5,000+ engaged investors.

• Proven track record – Published in Barron's, featured on RealVision, with standout wins like 117% on HCA Healthcare and 324% on Houghton Mifflin Harcourt.

This isn't hype or generic tips – it's professional-grade research designed to help everyday investors (and pros) earn more consistent returns in equities, no matter the market environment.

Use coupon code SFB30 at checkout for 30% off an annual or monthly subscription (applied for the full first year).

Disclosure: The links provided are affiliate links. I will be paid a commission if you use this link to make a purchase. You will receive a discount by using these links/coupon codes. I only recommend products and services that I use and trust myself or where I have interviewed and/or met the founders and have assured myself that they’re offering something of value.

EPISODE TRANSCRIPT

Phil Muscatello: G'day and welcome back to Shares for Beginners. I'm Phil Muscatello. Today we're diving into a number of interwoven stories that sit right at the heart of artificial intelligence and why the companies behind the chips powering AI matter to every investor. To understand what's going on, let's look at the chips inside the data centers. CPUs made by Intel and AMD are, uh, the general-purpose chips that have powered home and office computers for decades. GPUs, where Nvidia dominates, are built to do huge numbers of calculations at once, which makes them perfect for training AI models. And then there are ARM chips. These are ultra-efficient processors originally designed for phones and tablets. Nvidia and Meta - and just keep in mind that Meta is the company behind Facebook and WhatsApp and Instagram and some of those other social media kind of companies - have announced a major deal to develop new ARM-based chips for Meta's data centers. That raised concerns that Intel and AMD could lose even more share. But at the same time, Nvidia has invested billions into Intel and is working with them on new combined CPU-GPU designs. And let's not forget there are TPUs, which is what Google uses in its AI data centers for artificial intelligence. So what's really happening in this tangled web? To help us unpack all of this, I'm joined by Gary Brode from Deep Knowledge Investing. He's been shrinking himself down to the nanoscale to talk about semiconductors from right inside those chips. Hello, Gary.

Gary Brode: Hey Phil, thanks for having me. Great to be back and great to

Phil Muscatello: have you back on as well. It's been too long. So let's have a look at the headline view of what's happening in the, uh, chip sector and artificial intelligence and all the companies in this tangled web that I've described.

Gary Brode: Okay, and I get that this is all very confusing to a lot of people, but don't worry, in the time we have, I think we can unravel it for people. So the big news from a week, a week and a half ago was Nvidia doing a deal with Meta - or maybe we should just call them Facebook, because that's how everybody thinks of them. They're going to be developing chips for Meta's, or Facebook's, data centers, and they're going to be based on ARM technology. And the reason for that is because those chips use a lower power draw. So when everybody hears about these data centers using tons and tons of electricity, that really is to run the chips and for the cooling. So anything that has a lower power draw is going to be of interest. Let's just start by unpacking the different kinds of chips. You know, everybody has been focused on Nvidia lately, and they focus on GPUs. That stands for graphics processing units. They actually started as a company that made those chips and those cards for gamers. And they still do, right? If you are a big video game player, chances are you have an Nvidia GPU card in your computer. Now, the CPU, the central processing unit that most computers use - those are the x86-architecture chips. Those are the ones that were developed by Intel decades and decades ago. And then AMD, Advanced Micro, uh, Devices, they started making x86 chips as well for the CPUs. And one of the things that happened over the last few years, as these data centers have grown, is people have said, oh no, oh no, this is terrible, because share is going from the CPU, these x86 chips, to the GPU, the Nvidia chips. And the answer is yes, that's true, but misleading. And here's why. When you hear about these data centers that are packed with servers, and those servers are loaded with the Nvidia GPUs, the AI accelerators - what do you think is directing traffic? Those are CPUs, right? Those are.

Phil Muscatello: I didn't realize, I didn't realize that they're still in the, in the guts of these machines. Huh?

Gary Brode: Right. So when people say, oh my God, share is shifting - okay, that's true, but the whole market is growing, and the number of CPUs is growing as well. So now let's talk about the ARM processors. Now, ARM is a company; they actually design this architecture and they license it out. And if you have, like, say - you know, I use a Samsung Galaxy phone, and the high-end ones in the United States

00:05:00

Gary Brode: use Qualcomm chips, and those are phenomenal processors. But those Qualcomm chips are based on the ARM designs. And the reason why Samsung uses them, the reason why people like the ARM designs in cell phones and tablets, is because they use less power, they're more power-efficient. And with your cell phone, you really do want that 20-hour life. Your laptop, you're probably fine with five, six hours, but your cell phone, you really do want that longer life. Now, the downside to the ARM processors is compatibility. They don't always run all the legacy software, particularly some of the Microsoft stuff as well. There are issues with that. But this is the reason. And so, you know, a couple weeks ago when Meta and Nvidia announced this deal, they announced that they were going to be developing chips based on the ARM architecture. And people said, oh no, this is terrible. You know, it's bad for Intel and bad for AMD - again, that's the x86 CPUs. They said they're going to lose share. And my response to that is: maybe. Right? It's complicated. There are a lot of moving parts here. I think the concerns are valid, but it's not as simple as that. And we can start going through the complexities and the reasons why. But let me just stop here, because I've just given you a ton of information. Anything we need to clarify on the way here?

Phil Muscatello: No, uh, no, that's pretty clear. But it's interesting because, you know, so much about investing is based around narratives. And the narrative has been that Nvidia make these GPUs, they're being used in data centers, and therefore that's your way of investing in artificial intelligence. And then, you know, suddenly people notice that Intel and AMD, their processors are being left behind. And so the narrative is, oh, uh, we'll sell off that, but not taking a look at the implications of it. So what do you think are the implications that mean that Intel and AMD are still going to be around for a little while longer and possibly investable?

Gary Brode: They definitely will. So again, really important to emphasize that you still are packing these data centers with servers. Those servers need CPUs to run them. Next: there's no question AMD is really good on the power-use side. And that's a big deal for Meta, right? For anybody in a data center, you want to have a lower power draw. All right, we get that. But that's also an area where Intel has made great strides in recent years. So just last month Intel released their Core Ultra line. That's their third generation, so they've been making those chips for three years. And the thing that's so interesting about those chips - and that's what's in my laptop right now - is they combine a CPU, a GPU and an NPU. That stands for neural processing unit; it's basically an AI-optimized thing. I told you it was going to be complicated. But the thing that's so great about these particular Intel chips is they have E-cores and P-cores. Don't worry, I'm going to explain - I know we've got initials for everything, we have acronyms for all of it. The P-cores, those are your performance cores. When you need high power, when you need your computer working at full capacity, that's what you're going to use. And that has a high power draw. But these chips also have a lot of E-cores. Those are the efficiency cores. And a lot of what you do on your computer, your laptop in particular, doesn't require the full use of your chip. And so what Intel does is they say, okay, fine, if what you're doing is not processing-intensive, we're going to shift that workload to the E-cores, which are efficient and use very little power. So there's all of this happening. Now, in addition to that, let's start to take a look at, ah, the relationships between all of these companies. First of all, Nvidia just sold all of their ARM stock. Now, that was about $140 million worth of stock.
For a company like Nvidia, which is approaching $5 trillion in market cap, that's like the change in their couch cushions, right? It doesn't matter. But, uh, last year Nvidia announced that they'd be developing combined CPU-GPU chips with Intel, and just a few months ago invested $5 billion in Intel. That's a real investment. Now again, for a company like Nvidia, is that a lot of money? No, it's not a lot. And you know, if people want to argue against what I'm saying, let me help you - I'm going to help you argue against me - which is: Nvidia made that investment at a time that they were trying to persuade President Trump to let them sell their high-end GPUs to China.
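The E-core/P-core split Gary describes boils down to a scheduling rule: send light background work to efficiency cores and heavy work to performance cores. Here's a minimal sketch of that idea in Python. The wattage figures, the 0.5 threshold, and the task names are invented for illustration; this is not Intel's actual Thread Director logic:

```python
# Hypothetical sketch of hybrid-core scheduling, in the spirit of
# Intel's E-core/P-core split. All numbers are illustrative only.

P_CORE_WATTS = 15.0  # performance core: fast, power-hungry
E_CORE_WATTS = 3.0   # efficiency core: slower, frugal

def assign_core(task_intensity):
    """Route a task to a core type by how compute-intensive it is.

    task_intensity: 0.0 (idle background work) .. 1.0 (full-tilt compute)
    """
    return "P-core" if task_intensity > 0.5 else "E-core"

tasks = {"video encode": 0.9, "email sync": 0.1, "spell check": 0.2}
for name, intensity in tasks.items():
    core = assign_core(intensity)
    watts = P_CORE_WATTS if core == "P-core" else E_CORE_WATTS
    print(f"{name}: {core} (~{watts} W)")
```

The power saving comes from the fact that most everyday workloads sit well below the threshold, so they run on the frugal cores and the expensive cores stay idle.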

00:10:00

Gary Brode: It also is the same time that President Trump secured a 10% stake in Intel for the United States, and he staked a certain amount of his reputation on keeping Intel as a national champion. And the President does not want to see Intel fail - with good reason. Intel is a nationally important company. So, you know, it's really easy to say, hey, wait a minute, Jensen Huang just sold ARM, he bought Intel stock, they're developing chips with Intel. But the argument against it is, yeah, but he did that in a way that would please President Trump at a time that he wanted something from President Trump. And Jensen Huang, you know, he's a smart guy. Ah, for people who don't know, he's the CEO of Nvidia, but he is also a very politically astute person. He has a very good sense of what things look like in public. And, you know, he needs to keep the United States government happy, but he also needs to deal with the Chinese government. These chips are nationally important all over the world. And so he's a very astute political operator. And if somebody wants to argue, hey, he was really just doing that to keep President Trump happy at a time when they wanted to sell tens or hundreds of billions of dollars of chips to China - against that, a $5 billion investment doesn't matter. So, like I said, we've got a lot of moving parts here.

Phil Muscatello: Ditch the spreadsheets. Sharesight is Investopedia's top tracker for DIY investors. Invest smarter, not harder. Grab four months free on an annual premium plan at sharesight.com/sharesforbeginners. And then we've got Google with their TPUs. How does that work? Are TPUs commercially available for other organizations or other companies wanting to use them, or are they specifically within the Google walled garden?

Gary Brode: Yeah, so TPU stands for tensor processing unit. And you know what's really interesting is everybody has a desire to not be dependent on a single supplier. So right now, if you're one of the hyperscalers - and that's just a fancy word for companies like Meta, you know, or Facebook, Google, Amazon, Apple that are building these gigantic data centers, and OpenAI is one as well - and filling them with Nvidia GPUs and training AI models, your hyperscalers are all saying, hey, wait a minute, you know, we are at the mercy of Nvidia. And if Nvidia has a problem or just decides they want to take our supply of chips and sell them to somebody else, we kind of have a problem. And so some of these companies are developing very specific chips that do exactly what they want them to do. And then, you know, they send the design to Taiwan Semiconductor and have them made there. And they want to have a greater degree of control. And so the TPUs, the tensor processing units, are just Google's attempt to do that. And, yeah, they'll sell them. Amazon, you know, they're making their own chips as well. Apple's been designing their own chips for years. Nobody wants to rely on a single supplier. And that, by the way, takes me to my next point. So, you know, I own Intel stock. I do think it matters that, you know, they're a national champion, that the US government desperately does not want to see Intel fail. But one of the things that I think will help Intel succeed is this: Intel has had trouble for years with their new fab plants. Taiwan Semiconductor, every year or 18 months, smoothly moves from one fab process to another, more advanced technology. And they basically grade these things in nanometers. And so, you know, Intel had a lot of difficulty going from 10 nanometers to seven, down to five, down to, you know, three.
Uh, basically what they've done is they've had all these problems, but their new 18A plant - that stands for 18 angstroms, which is basically 1.8 nanometers - that looks like it's doing pretty well. And they are trying to get external customers. Now, think about it. If you were AMD or Nvidia and all of your production is taking place in Taiwan, which is maybe the most politically dangerous place to be at the moment, wouldn't you want a second supplier? Right.

Phil Muscatello: I mean, who knows, who knows when China is going to make that move as well that we've been hearing about for decades? Yeah.

Gary Brode: Uh, yeah. And the thing about China is they have been explicitly clear. They haven't given us a time frame, but they have said, we will take this, by force or by negotiation. Basically, at some point they're going to say to Taiwan, you know, hand over the keys, you are now basically our next Hong Kong, right, or our next Macau - or they will invade. And they've been crystal clear on that, and they haven't given a time frame, but it's

00:15:00

Gary Brode: going to happen. And I do know people - and I have a huge amount of respect for the capabilities of the US military and the character of the men and women who serve in uniform for the United States - I have spoken to people in the US military who believe that they could stop China from taking Taiwan. I'm not a military expert. Uh, all I can say is I am skeptical. It's very far from here, supply lines are overextended, and China does have the ability to sink US aircraft carriers, and that is very expensive, both in terms of money and, even more valuable, lives - the lives of our service people. You know, I don't want to argue with military experts in an area where I'm not an expert. I'll just say I'm skeptical that we could stop China. At any rate, at some point China will either have control of Taiwan Semiconductor's plants, or they'll be destroyed, or - I have been told - US companies have the ability to remotely disable those plants. Okay? But if that happens, the world's largest and best semiconductor manufacturer is either offline or controlled by the Chinese Communist Party. If you were, you know, any of the companies we're talking about - any of these hyperscalers, or Nvidia, or AMD - wouldn't you like to be producing something in the United States, or at least in a friendlier place? And Intel is the only company in the United States capable of that kind of production. Now, the good news is, uh, what I'm hearing are really good things out of the 18A plant. And, you know, whether they'll have a customer there soon or not, I don't know. Worst-case scenario is some of these companies watch and they see - okay, hey Intel, go make this 18A plant work. And if it does, the next plant will be 14A technology, which is, you know, basically just even smaller. And, you know, they might say we want to see you succeed in this operation before we commit to being a customer for the next one.
But the point of all this is everyone all over the world is saying, we want some control. Right? Nvidia and AMD are saying we don't necessarily want to be 100% reliant on Taiwan Semiconductor. Google and Facebook - Meta - and Amazon are saying we don't want to be 100% reliant on Nvidia. And so the TPUs... that was a very long explanation. I'm sorry about that.

Phil Muscatello: But the TPUs are fascinating. Yep.

Gary Brode: All right, that's basically just Google saying we want chips that A do exactly what we want them to do and B, where we have some control, where we're not a single supplier. And so that's what's going on. And again, you have lots of companies doing this kind of stuff.

Phil Muscatello: So then Intel, you're saying, is the only one that can actually manufacture these chips within the United States. Is that the case?

Gary Brode: That is the case now. Now Taiwan Semiconductor is building fab plants in the United States right now.

Phil Muscatello: I was going to ask as well. Fab plants, that's fabrication plants, I'm assuming.

Gary Brode: Exactly. Yeah, sorry, I apologize for the lingo. But yeah, fabrication plant. A fab plant is simply a, like, $10-billion-plus - you know, multi-tens-of-billions-of-dollars - plant where, uh, high-end semiconductors are produced. They're fabrication plants; they're called fab plants. One thing the previous White House did, and I was supportive of this, was they said, hey, you know, we're going to take $40 billion and use it to incentivize building semiconductor manufacturing capacity in the United States. Because everybody's saying, wait a minute, we're getting all of our stuff from Taiwan. So it wasn't just all these companies - it was the United States saying, we don't want to be completely reliant on Taiwan either. And part of what they did, and I think this was correct, was to invite Taiwan Semiconductor. They said, hey, if you start to produce in the US, we will give you money, we'll support you. And that actually was a move that I supported. Now, President Trump has taken a slightly different approach. He's less carrot and more stick. And he's gone to Taiwan Semiconductor and said, hey, guys, how do you feel about having huge amounts of tariffs against your chips coming to the United States? And Taiwan Semiconductor said, well, we're not a fan, we don't like that. And President Trump said, great. How do you feel about building more fab plants in the United States? If you do that, we won't tariff you. And Taiwan Semiconductor said, well, that sounds really good. And by the way, this is good for everybody, because guess who else doesn't want to be 100% reliant on Taiwan's geography? That's Taiwan Semiconductor. Right now they have.

00:20:00

Gary Brode: And we're going to unfortunately get to geopolitics a little bit here, but I'm going to do my best to make it not hurt. Taiwan Semiconductor has what they call the silicon shield. First of all, it's really important to note that Taiwan Semiconductor is not only the largest semiconductor manufacturer in the world, they are the best. Intel is a great manufacturer. Samsung makes great chips. Qualcomm is phenomenal. Micron makes terrific memory. Taiwan Semiconductor is not the only high-end producer, but at the very ultra-high end, the very best, they stand alone. And so they view the fact that they are crucial to the entire world's technology as political insurance against a Chinese invasion. Right. Basically, if the CCP, the Chinese Communist Party, controls Taiwan Semiconductor's best-in-the-world fab plants, the rest of the world has a problem with technology. And so Taiwan Semiconductor has said, yeah, we'll produce in the US, but not our very best. They're reserving their very highest-end production for Taiwan, and that's for political protection. Again, they call it the silicon shield. And, uh, I don't like that decision, but I certainly understand it. And just like I like things that are in the best interest of the United States, well, you know, the people at Taiwan Semiconductor are 100% entitled to do what is in the best interests of Taiwan. I don't have to like it, but I do understand and respect it. So they are building plants in the United States, and that will get them some money from the previous White House, the Biden administration, and it will get them relief from the Trump tariffs. And so, you know, again, that works all the way around. But right now in the United States, high-end production of semiconductors, of CPUs, is being done by Intel.

Phil Muscatello: What's the scale of investment in artificial intelligence infrastructure? You know, what is the size, what's it look like?

Gary Brode: It is overwhelming. Let's just take a look at the unbelievable looming disaster that, uh, is the capex plans for just one company. Actually, you know what, let's take a step back.

Phil Muscatello: Capital expenditure plans - how much they're actually, uh, coughing up for this particular project. Yeah, yeah.

Gary Brode: So we were taking a look at this, you know, uh, at Deep Knowledge Investing. We do a weekly piece called Five Things to Know in Investing. Two weeks ago we were talking about the fact that just two US hyperscalers - I think it was Google and Meta alone - plan to spend this year, in 2026, more than $400 billion. We're closing in on half a trillion dollars of capital expenditures from two companies alone. But now let's get to the looming disaster. That's OpenAI. Now, that's the company that makes ChatGPT. So, you know, if you're not familiar with OpenAI, you probably are still familiar with ChatGPT. That's that company. The obligations that this company has taken on, the promises that they've made to buy and build from other companies - you want to guess the amount? It is truly staggering: $1.4 trillion. Those are the commitments they made. Now here's where it gets really fun. They don't have $1.4 trillion. And you might say, okay, well, fine, very few companies have $1.4 trillion. But they make money, right? Nope. Okay, but they have positive cash flow. Uh, nope. This company intends to burn somewhere in the neighborhood of $125 billion over the next few years. So what they need is $1.4 trillion to fulfill their obligations, plus another $125 billion or so. They basically need one and a half trillion dollars to fund expected operating losses and capital expenditures over the next few years. Now, where they're going to get that money, I have no idea. But realize that if they have to start canceling orders with Oracle, that's a gigantic problem, because Oracle is spending money to build out these data centers. Right now Oracle is putting capital up. And somebody said, okay, well, that's bad for Oracle, but what about everybody else? Well, they've ordered hundreds of billions of dollars of chips from Nvidia. Well, what happens - you know, somebody might say, well, you know, Nvidia isn't reliant on them.
No, but Nvidia's stock price is reliant on a certain growth rate. And if OpenAI starts to default on their commitments, that leads to a problem where Nvidia's growth rate can fall. And if that happens, their stock gets rerated and it can take down the entire equity sector. Right. It won't just be Nvidia, it will be Nvidia,

00:25:00

Gary Brode: Oracle, Google, Meta, Amazon - you know, Apple isn't going to do well, Intel will be fine, but the stock is going to be down that day. AMD stock will be hammered as well. I mean, the equity indexes will get crushed on OpenAI, a non-public company, simply not being able to meet their obligations. So the amount of capital expenditures involved here, the planned capital expenditures for the industry, are now in the range of trillions of dollars.
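The arithmetic behind Gary's "one and a half trillion" figure is simple to check, using only the two numbers he cites in the interview ($1.4 trillion of commitments and roughly $125 billion of expected burn):

```python
# Checking the funding-gap arithmetic from the discussion above.
commitments = 1.4e12    # ~$1.4 trillion of buy/build obligations cited
expected_burn = 125e9   # ~$125 billion of projected operating losses

needed = commitments + expected_burn
print(f"total funding needed: ${needed / 1e12:.3f} trillion")  # ~$1.525 trillion
```

So the cited commitments plus the cited burn come to roughly $1.5 trillion, matching the figure quoted in the conversation.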

Phil Muscatello: Let's talk about computing shifting from training to inference. Because at the moment, with large language models, they're just training them basically to say: if I say car, you know, next will be red, Ferrari, whatever it's going to be. What is inference, and what are the implications of moving to inference?

Gary Brode: All right, so that's a crucial question, Phil. And I want to be clear about something. The answer I'm going to give is going to be a general answer, where there's going to be somebody listening to this who is a, uh, computer science engineer and is going to say, wait, that's not technically true. Okay, everybody give me just a little bit of leeway here. I'm giving a general explanation for how to think about this. I am not going to try to explain, you know, the theoretical physics of these black boxes. When these LLMs, the large language models - these are the things that people know as ChatGPT or Grok or Claude or Gemini - when they are built, they spent hundreds of billions of dollars on training. And that is where Nvidia is the market leader. And that itself is a little misleading, right? Uh, to say Nvidia is the market leader implies that there's somebody, you know, behind them. And the distance between Nvidia and AMD in this space is gigantic. Right? Nvidia is basically a monopoly, uh, for all intents and purposes. And so what's really expensive about these Nvidia training models is, when you train, you are basically teaching the model everything. You're feeding it all the data and preparing it for anything. So think about it. Imagine you're in medical school and you have to train and learn, and you have to know all of this information to pass your licensing exam, right? Okay. Inference - think about it like researching a specific answer, right? So when you go to your doctor and you say, doctor, I have this problem or this medical issue, the doctor may say, ah, yes, we learned about that in medical school. And I've read 20 articles about it, and I've seen 500 patients who have this exact problem. This is what's causing it. Here's how we're going to treat it. I know what this is. That's training, right? Inference is when you go to your doctor and you say, hey, doctor, I'm having this, you know, kind of strange problem.
And he says, okay, that's really interesting. You know, let me go check out some case studies on that and come back to you with an answer. And by the way, this is the thing that I train Deep Knowledge Investing's interns to do, right? I ask them questions. They are instructed, if they don't know the answer: don't BS your way through it. Just say, I don't know, give me a day, I'll go do research. And this is what I do all day. I'm doing inference. I already know the finance. I'm going and finding the answers. And this, by the way, is what happened almost exactly a year ago with DeepSeek. So I think, you know, we got the DeepSeek announcement - Phil, was that January of '25? It was late January, I think.

Phil Muscatello: Sure, it was last year sometime, but yeah, it feels like about a year ago, yeah.

Gary Brode: Okay.

Phil Muscatello: The Chinese firm that was suddenly able to produce artificial intelligence with much less cost, I believe.

Gary Brode: Yeah, right. Okay, so let's talk about this for a second. First of all, they claimed that they didn't have access to Nvidia's best GPUs. That is absolutely not true. They also claimed that they did it on their own. Also not true. They were stealing data. Okay, fine, but they were saying, you know, instead of hundreds of billions of dollars, we did this on, you know, 50, 60 million dollars. Well, how do you do that? And the answer is there's no free lunch, right? There is no free lunch anywhere in AI. Uh, basically what they do is they under-train the models, right? So less training and more inference. So if you were to ask ChatGPT a question, the training expense is very, very high. But when you ask it a question, there's a very high probability that it knows the answer and

00:30:00

Gary Brode: can very quickly, and with relatively low electricity and CPU usage, come back and say, here's your answer. With an inference-heavy model like DeepSeek, what is more likely to happen is it says, yeah, I don't know the answer to that. I'm now going to use more and more power and computing - both electricity and computing power - and I'll come back to you with an answer. So it really is just: how much pre-preparation do you do? And, uh, the reason all of this matters is because the next big move in AI is going to be something called agentic AI. That's just a fancy way of saying we're going to be moving from asking ChatGPT to make you a funny picture that you can post on social media, to saying to an LLM, a large language model: go do this thing, go make me a reservation, go make this plan, go purchase this item, right? Coders are using it to say, go build me this model. So you saw, you know, uh, a week or two ago we had a huge decrease in the stock price of all of these companies, the SaaS companies - software as a service, right? The companies that say, we're going to charge you $50 per month - uh, you know, like Adobe, right? We're going to charge you $50 a month, $100 a month for our product. And Claude, you know, the LLM from the company called Anthropic, basically said, yeah, we can start doing all of this stuff.
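Gary's doctor analogy maps neatly onto a cache: heavy training means most answers are effectively "already known" and cheap to serve, while an under-trained model falls back to expensive compute at question time. A toy sketch of that trade-off in Python - the cost units, question names, and coverage sets are all invented for illustration, not real model behavior:

```python
# Toy model of the training-vs-inference trade-off described above.
# "known_answers" stands in for what heavy training bakes into a model;
# the cost units are arbitrary, not real compute figures.

CHEAP_LOOKUP = 1      # the model already "knows" this (covered by training)
EXPENSIVE_THINK = 50  # the model must reason it out at inference time

def answer_cost(question, known_answers):
    """Cheap if training covered it, expensive if we must work it out now."""
    return CHEAP_LOOKUP if question in known_answers else EXPENSIVE_THINK

well_trained = {"common question A", "common question B", "common question C"}
under_trained = {"common question A"}

questions = ["common question A", "common question B", "rare question"]

print(sum(answer_cost(q, well_trained) for q in questions))   # 1 + 1 + 50 = 52
print(sum(answer_cost(q, under_trained) for q in questions))  # 1 + 50 + 50 = 101
```

The under-trained model pays far more per batch of questions at answer time, which is exactly why under-training shifts the workload (and the electricity bill) from training-time GPUs to inference-time compute.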

Phil Muscatello: And so hence the SaaS-ageddon, as they call it, or the SaaS-pocalypse.

Gary Brode: Yes, exactly. And so it's a threat to all of these companies. But those are agentic AI agents that are basically saying, sure, I will go do this thing: I will make this reservation, make this purchase, build this model, create a database-handling piece of software. You can vibe code an app, which is just a fancy way of saying you tell the LLM this is what I want and it creates the app for you, okay? All of that agentic activity is training-light and inference-heavy. And here's why that matters. Training is very expensive to do up front and less expensive to run, and that's the place where Nvidia has this gigantic lead. Another reason they have that lead is their CUDA instruction set, which is basically just a fancy way of saying Nvidia has a language, a communication language, that the models are optimized to run on. If you want to use AMD GPUs, they need to basically translate Nvidia's CUDA language. It's one of many reasons Nvidia has a huge lead. But inference, all that agentic activity, is less GPU-heavy and more CPU-heavy. And guess who makes those? That's Intel and AMD. So we started this conversation by saying, oh no, Nvidia has a deal with Meta and they're going to be using these ARM-based processors, and isn't this horrible for Intel and AMD? And I said, okay, this is a valid concern, but it's complicated. Well, guess what? The entire AI market is shifting away from an area where Nvidia has basically monopoly power, and they've earned it, Nvidia is a phenomenal company, and more computing power is going to shift to CPU usage. And boy, that's really good news for Intel and AMD.

Phil Muscatello: Super is one of the most important investments you'll ever make. But how do you know if you're in the best fund for your situation? Head to lifesherpa.com.au to find out more. Life Sherpa, Australia's most affordable online financial advice.

Phil Muscatello: Okay, so to drive all of this, we're talking about energy and huge energy demands. And in your article in Five Easy Pieces. Sorry, your article in, uh, Five Things,

Gary Brode: you know that's a Jack Nicholson movie, right?

Phil Muscatello: Yeah, yeah, it's a film reference, I know that. So Japan has kicked off a $550 billion commitment to the US with a $36 billion investment in energy infrastructure. What's this telling us? I know you like geopolitics. What's the broader geopolitical story happening here?

Gary Brode: So I like this deal all the way around and we'll see what happens.

00:35:00

Gary Brode: So the reason Japan agreed to invest a little over half a trillion dollars in the United States is because President Trump strong-armed them. He basically said, look, we are an import country, you guys are an export country. Your country benefits disproportionately, in his opinion, from being able to sell into the US market. We want to monetize that. You want to sell into the US market, you have to invest here. And Japan said, okay, we will do that. Now, there were questions at the time whether Japan had to just invest half a trillion dollars and the United States government got to say, okay, thank you, we now own these assets, or whether Japan would own these assets and earn a return. It turns out with these energy assets, this actually looks like a really good deal all the way around. First of all, the United States desperately needs energy infrastructure. One of the things that people focus on is that the big companies in the AI space are really US-controlled. Nvidia, Intel, AMD, Meta, all the companies we've been talking about, with the exception of Taiwan Semiconductor and some of the Chinese LLMs like DeepSeek, they're all US companies. And so people are saying, hey, great, we have this gigantic lead. Hold on a minute. What China has is the ability to create enormous amounts of energy infrastructure, because they don't have the same permitting issues that we have in the United States. And so China has been building more and more energy infrastructure, and that ultimately is going to give them an advantage in a lot of this AI-related stuff, because you need epic amounts of power to run this. Now, the Trump administration has tried to respond to that. They have fast-tracked 10 companies to get Department of Energy approval, including a company called Terrestrial Energy. The ticker there is IMSR, for Integral Molten Salt Reactor, and they make small modular reactors.
You chain two of them together and you have almost 900 megawatts of power, and the meltdown risk is zero. This is safe new nuclear. We have more than a dozen companies in the United States that are building small reactor technology. And the Trump administration also just set aside $100 billion, which should produce something in the neighborhood of 10 normal third-generation nuclear plants. That's another 10 gigawatts of power. They're trying to respond to that, and I think reasonably so; that's the right move. So Japan is saying, okay, we're going to invest over $30 billion in the United States and build energy infrastructure, which the US desperately needs. I've looked, and the terms of the deal indicate that Japan will earn a really good return on this until they're paid back, and after that I think they get a 10% return going forward. So Japan is going to make money on this, and it will also help Japan shift their reliance on energy. Japan has to import almost all of

Phil Muscatello: their energy from us here in Australia.

Gary Brode: Yeah, well, from the Middle East, to the United States, which is a more reliable partner for Japan. So this could be a really good deal all the way around. But in the end, if we just want to run air conditioners and our computers, and people want to charge their iPhones, we still need more energy infrastructure. And if we want to have this AI-powered future and not have China win that race, we're going to need a lot more than $36 billion. But Japan investing in building that capacity, and I believe it's a lot of natural gas capacity, including the largest natural gas plant in the world, is a great place to start. It'll put Japan in a better place in terms of energy reliability, the US needs the infrastructure, and Japan should be able to earn a good return on that investment. So I'm liking this deal all the way around.

Phil Muscatello: Okay, Gary, then what's the takeaway? What do investors need to be thinking about to make sense of all this? Because we're getting all these people coming up to us, and I'm sure you are as well, saying, how do I invest in artificial intelligence? How do they do it?

Gary Brode: Yeah, that's a great question. And the thing that people have done that's been really smart, and they've done incredibly well, is they've invested in Nvidia. The people who have owned Nvidia through all of this have done incredibly well. Congratulations. That's phenomenal. Other ways to invest in that space are the hyperscalers that we talked about: companies like Google and Meta and Amazon. I am not personally invested in those. I am concerned about valuation risk.

00:40:00

Gary Brode: I'm also concerned about return risk. These companies, as we said, are spending hundreds of billions of dollars. Google and Meta alone, almost half a trillion dollars in capital expenditures this year. A guy who I respect on X, formerly Twitter, Harris Kupperman, actually called the people running these AI data centers, and none of them can figure out how they're going to make money. And I've spent the last two years saying no one has a business model to earn a return on these hundreds of billions of dollars that they're spending. I don't know how they're going to get paid. The worst part is they can't stop, and the reason is AI development happens so quickly that if you stop, the value of your model goes to zero within months. So nobody knows how they're going to make money from it, but they can't stop spending on it. They might blow another few trillion dollars before anybody even bothers to figure out a business model, and I don't know that they ever will. Now, Google and Amazon are not going to go bankrupt over this, but that doesn't mean I want the valuation risk. I'm also, as we talked about before, very concerned about a situation where OpenAI says, hey, guess what? We don't have $1.5 trillion, and we can't get $1.5 trillion. And look, at the risk of being repetitive, OpenAI walking away from commitments is not going to bankrupt Nvidia, but what it will do is lower their growth rate, and then the whole tech sector rerates. So I'm not investing in that way. The companies that I like in this space: one, I like Intel. They have been designated as the national champion. I like their new Core Ultra chips. I like what I'm hearing out of the 18A plant, and I think it's going to succeed. I think neither Nvidia nor AMD nor the White House wants to see Intel fail. So I think there's a floor under it.
I think you basically have, it's not a free call option, but there's a lot of upside, and I think you're somewhat protected on the downside. So I like that. I think AMD is really interesting, and that's a company that we're starting to look at. I'm also playing this on the energy side. I own a huge amount of energy pipelines, energy infrastructure, a lot of oil-related assets. And I own a massive, we've talked about this before, Phil, I own a massive amount of uranium. I'm very bullish on new nuclear. For the environmentalists in the crowd, if you are among the people who believe that carbon dioxide is a pollutant, and I do not believe carbon dioxide is a pollutant, but that's a whole other political conversation, if you believe that, then nuclear is the way you get this AI future without burning fossil fuels. And somebody might say, oh no, it's dangerous. Not the new nuclear, not the Gen 4 plants. The meltdown risk is gone. So I own a huge amount of uranium. I think there's another company that's really interesting; I mentioned it earlier. The ticker is IMSR. The company is called Terrestrial Energy. They are among the 10 companies that have been slotted for expedited Department of Energy approval, and they are the only new nuclear company on that list. Just a month or two ago they got approval from the Department of Energy to start construction of a new nuclear plant. This is Gen 4, and that's the

Phil Muscatello: hardest, that's, that's the hardest part isn't it? Getting approval. I mean that's where the bottleneck really is in uh, developing nuclear at this point.

Gary Brode: Phil, you want to know how bad it is? The United States, since the formation of the Nuclear Regulatory Commission 50 years ago, has approved a total of two new reactors. We have had more plants decommissioned than built in that time. And ready? Here's my really unfair quiz for you. Want to guess what the largest expense is in building a nuclear reactor in the United States? What is the most expensive part of it?

Phil Muscatello: Oh, uh, come on. Obviously, it's regulatory approval.

Gary Brode: Super close. It is interest expense, because you have to start, and it'll take you 20, 30 years to get approval, and you have interest expense ticking against you. It's actually not regulatory approval itself; it's the financial expense of waiting for regulatory approval. The Trump administration is cutting through that red tape. I believe the first company that will benefit from that is Terrestrial Energy, IMSR. Now, in fairness, I don't have a big position there, because the company is years away from real revenue. There's a huge amount of risk there. This isn't the kind of thing where, oh good, within two years we'll know it's working. It's going to take years to get to the point where they have real revenue. These things are risky. I think the technology works,

00:45:00

Gary Brode: but none of this is proven at commercial scale. But would I put a couple of percent into something that could go up 10x, 20x, that could be one of the survivors, one of the big companies in this next wave of energy infrastructure? Yeah, because I can stomach the loss of 2% of capital trying to make 20% of capital. That's the kind of risk-reward that we're looking at here, especially with expedited regulatory approval. So those are multiple ways to play it. Congratulations to the people who have owned Nvidia this whole time. They've done incredibly well. I have nothing but admiration for the company, their shareholders, and for Jensen Huang. I just see some risk on the traditional hyperscalers that's caused me to sit back and watch. I'm cheering for them to succeed; I don't want to see anything bad happen. I just feel like I see other ways to play this where I like the risk-reward better, and more importantly, where I don't see the same level of rapid downside risk.

Phil Muscatello: I'm loving this new pivot I'm taking with the podcast, where we're looking at real companies and real stories and trying to understand how the world works. Because it's not just about investing, really; we are looking at the way the world works. And if listeners are interested in finding out more, because I know you like to take the big picture and the overview of what's actually happening, how can they find out more about Deep Knowledge Investing?

Gary Brode: Sure, thank you. Feel free to go to deepknowledgeinvesting.com; in the upper right there's a Subscribe Now button. There's a free option, but I paywall the ways that I'm investing in the portfolio and the way we're using Deep Knowledge Investing research to make money. We charge a hundred dollars a month, so just click Subscribe Now and feel free to join. If anyone has questions, feel free to reach out to me directly via deepknowledgeinvesting.com; that goes straight to me. I read everything that comes in to that address. I respond to everything. And for people who just want to engage in a less formal way, I'm on X, formerly Twitter, as Gary Brode: G A R Y underscore B R O D E. That's the Gary Brode with the avatar picture that looks like me.

Phil Muscatello: Gary Brode, thank you very much for joining me today.

Gary Brode: Thanks Phil. Always great speaking with you.

Phil Muscatello: Thanks for listening to Shares for Beginners. You can find more at sharesforbeginners.com. If you enjoy listening, please take a moment to rate or review in your podcast player, or tell a friend who might want to learn more about investing for their future.

00:47:44

Any advice in this blog post is general financial advice only and does not take into account your objectives, financial situation or needs. Because of that, you should consider if the advice is appropriate to you and your needs before acting on the information. If you do choose to buy a financial product read the PDS and TMD and obtain appropriate financial advice tailored to your needs. Finpods Pty Ltd & Philip Muscatello are authorised representatives of MoneySherpa Pty Ltd which holds financial services licence 451289. Here's a link to our Financial Services Guide.