
The Silicon, Software, and Systems

Motley Fool - Wed Aug 28, 8:03AM CDT

In this podcast, Motley Fool analyst Asit Sharma and host Dylan Lewis discuss:

  • Why AMD is spending $4.9 billion on ZT Systems, and what the company's rack-scale ambitions look like.
  • General Motors' plans to lay off over 1,000 employees, and why it might be AI-driven.
  • The questions that company leadership and boards should be asking as they think about AI, and two companies that have established good AI practices so far.

To catch full episodes of all The Motley Fool's free podcasts, check out our podcast center. To get started investing, check out our beginner's guide to investing in stocks. A full transcript follows the video.

Should you invest $1,000 in Advanced Micro Devices right now?

Before you buy stock in Advanced Micro Devices, consider this:

The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Advanced Micro Devices wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.

Consider when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you’d have $786,169!*

Stock Advisor provides investors with an easy-to-follow blueprint for success, including guidance on building a portfolio, regular updates from analysts, and two new stock picks each month. The Stock Advisor service has more than quadrupled the return of the S&P 500 since 2002*.

See the 10 stocks »

*Stock Advisor returns as of August 26, 2024

This video was recorded on August 19, 2024.

Dylan Lewis: AMD ups the ante in the AI race, Motley Fool Money starts now.

I'm Dylan Lewis, and I'm joined over the airwaves by Motley Fool analyst Asit Sharma. Asit, thanks for joining me.

Asit Sharma: Thanks for having me back. This marks three Mondays in a row. This has been my shot in the arm every Monday, to make it through the rest of the day.

Dylan Lewis: I appreciate you coming with me on the Monday show. You know what? We're catching up on everything that happens over the weekend. Sometimes we wind up with the news fairy delivering something interesting for us to talk about. Definitely the case today: news out that AMD is acquiring ZT Systems, and this is just the continued evolution of chipmakers upping the ante in the AI race. Reportedly, the deal was for about $4.9 billion. I'm going to read you, Asit, the pull quote from AMD's press release: "This is a strategic acquisition to provide AMD with industry-leading systems expertise to accelerate the deployment of optimized rack-scale solutions addressing a $400 billion data center AI accelerator opportunity in 2027." Help me unpack that. What are we talking about here with this deal?

Asit Sharma: Let's rewind the clock to several quarters ago, when Jensen Huang, CEO of NVIDIA, said that his company foresaw a need to replace these modern AI data centers, which was a very new term at the time, every five years. What he was really saying is that the technology is going to change very fast, and NVIDIA didn't want to just provide chips anymore. They wanted to really show companies how to build this modern data center. They wanted to sell some of the networking equipment, they wanted to sell the solutions, and they've largely done that. They're looked on as a very consultative partner by the big cloud hyperscalers and enterprise businesses, and they bring a lot of tech to the table. As investors, we tend to get focused on the GPU fight between NVIDIA and everyone else. But in terms of being able to help a company decide what to put in its data center and how to do it, NVIDIA is still top notch. In fact, they have their own R&D data center.

They can configure anything on the fly; it doesn't matter what kind of existing systems you have. What you just read is the answer from AMD to all that. AMD is saying, look, we want to play at every level. We just saw them spend 600-odd million dollars a few weeks ago to acquire a company called Silo, which is the largest AI lab in Europe. Here's where they're saying, we're going to bring a ton of expertise to the data center as well. It's not just going to be about our accelerators versus yours. What ZT Systems gives them, and you named it, is these "rack-scale solutions." That means they have a rack that has servers on it. It's got a liquid cooling system. It's got networking equipment. It has power distribution, it's got software, and it's customized. You roll this rack in and you've got a plug-and-play solution for a data center customer. Guess who really loves ZT Systems? It's NVIDIA. ZT has been great at helping NVIDIA customize products and sell them faster. This is a way to level the playing field between these two giants. We're now in round three of this heavyweight fight. Early days still, but the blows are starting to land.

Dylan Lewis: I feel like this is probably a 15-, maybe 25-round fight. I think they're going to be duking this one out for quite some time. It's interesting that NVIDIA is reportedly a customer of ZT Systems, because I wonder how that dynamic will play out with it now being a part of AMD's portfolio. The deal is not supposed to close until some point in 2025, and so there's going to be some lead time for figuring this out. The other thing that I see emphasized quite a bit in the coverage of this so far is this notion of an ecosystem with AMD and what they're able to offer. Is that getting at that exact thing you're talking about: avoiding the reliance on just the GPUs and starting to build out these more holistic solutions to maybe make it a little bit harder for some of their customers to leave?

Asit Sharma: I think so. From the customer's perspective, they always want an integrated solution. That solution takes place over three big areas. You think about the terminology that NVIDIA and AMD love to sling around. Here's the way that AMD looks at it: it's the silicon, the software, and the systems. That's what they put in the same press release that you're reading from. Silicon is like, OK, I know I've got to run my large language models on some chips and I want them to be fast. Then the software: you also want software that helps you achieve your business objectives faster, and with maybe an edge over competitors, so you want very flexible software and fast software. And systems: you don't want to have to reinvest after three years. This goes back less to what Jensen Huang is talking about today, where they want a new generation of systems and hardware every year, and more to the original vision that a data center ought to be able to keep its competitiveness for its customers for five years and be replaced on that cadence.

I think AMD is speaking more to this whole integration, where if you are this big, sophisticated business, you don't have to have various parts of your IT department coming back and saying, oh yeah, the silicon is great, but the software, AMD, come on, the software. They've been working on their open-source software and the AI part of that ecosystem as well with the acquisition I mentioned from a few weeks ago. This is the systems piece. I do think it makes sense both to compete with NVIDIA and to be able to come to a hyperscaler like an Amazon or Microsoft, or a Fortune 100 business, and say, this is going to cost you a ton of money, but it's going to save you money over the long term. We've heard Jensen Huang making this very argument. Before this year, AMD wasn't able to make it. With these last couple of acquisitions, it's starting to speak that same language about cost and opportunity and return on investment.

Dylan Lewis: I mentioned that sticker price of $4.9 billion. That is going to be a combination of cash, but also, I think, some stock in the deal. If you're simply going as the crow flies on market cap, it does not look like a particularly large acquisition for AMD. It's currently roughly a $250 billion company. But if you take a look at the balance sheet and you're looking at it from that perspective, AMD is sitting on about five billion dollars in cash and equivalents. As you mentioned before, Asit, they had made another acquisition fairly recently, a much smaller one. But I look and I say, this is maybe a little bit of a bigger bet than the market cap would imply.

Asit Sharma: I think so, Dylan. The point for AMD here is to allocate their funds very wisely. You don't want to make an acquisition decision that becomes merely a financial decision. In other words, you get some return on it that's a financial return, but it doesn't add value. You want it to be, as they said, strategic. If you're going to use up your balance sheet, you want the ability, with whatever asset you acquire, whether it's a hard asset, a series of contracts, or a company, to plug it into your system and get a lot more out of it. And this is a critical time for AMD, where it already has a lead over other multifaceted chipmaking companies but is playing second fiddle to NVIDIA.

If you're going to use up that balance sheet and start to get to a position where future investment might be levered, this is probably one that makes sense for you, because the systems work very well together: AMD's chips plus what they're acquiring with ZT, plus the AI piece they've already acquired. All of this flows through from one end to the other, and I don't think it's wrong to call it an ecosystem play. Whenever I hear words like that, like platform or ecosystem, as in "with this acquisition, we are the ecosystem player in our industry," I always get skeptical, but here it's logical. I love that you're pointing out that they're starting to use that cash up, and they may have to re-up in the future with some debt or more capital from the equity markets, or use some more of their free cash flow as they go along.

Dylan Lewis: To be clear, the market is rewarding them for that investment today. Shares are up about 2% on the news, so investors weren't too disappointed. I think the market in general is rewarding that AI investment right now, which, if I'm being honest, Asit, made it a little surprising when we look over at some other news from today: General Motors laying off more than 1,000 employees in its software and services division. This seems like a move driven by management's desire to slim down operations a little bit. I'm not surprised by that. I am surprised that they are targeting the software side of the business when slimming things down.

Asit Sharma: True. You and I were chatting, Dylan, and neither one of us is reading too much into this. But it is interesting. We had a wave of cost optimization last year where we saw lots of tech companies laying off a ton of employees, and we saw it in the broader business world as well; there was so much staff reduction as interest rates stayed high, inflation was high, and companies were trying to make sure they could still improve profits. That's tapered off a little bit. I just wonder here. We know GM has been up against a lot of flux in the industry. EVs were hot; they're cooling off as a business prospect for these companies, though they're still investing. Is it really related to that, the shifting winds of EVs versus their traditional engines? Or is it something I think we might see from other companies in the future, which is to say, these LLMs have become so good at coding and so good at giving architectural advice in software if you've got an objective and you describe it to a good large language model. Do we really need hundreds and hundreds of people coding and writing software and trying to architect this stuff, or could we take the best of the bunch, plus some lower-level employees and some software experts to check behind this interplay between humans and the AI models? I wonder if that's not what's going on here. But it's a data point. You and I will follow this, I'm sure, as the months wear on.

Dylan Lewis: Absolutely. We've been waiting for a while to see what the shakeout would be as artificial intelligence works its way in more, and honestly, a lot of my thinking was that these highly technical tech jobs would probably be assisted, and maybe we would see a little bit less new hiring, but companies would maintain certain employment levels. I was a little surprised to see them ratcheting this down. But it does remind me a little bit of a story I saw last week that I'd been kind of sitting on. I just wanted to get someone's take on it, and this conversation gives us a nice opportunity to do that. The Wall Street Journal had an opinion piece out last week, "Why AI Risks Are Keeping Board Members Up at Night," detailing the cost-reduction elements that are popping up for companies as we're starting to get more use of LLMs, but also things like data privacy and the information employees are sharing with the LLMs. What I liked about this piece is that we've seen various forms of AI mongering, some of it very good, some of it very bad, but this is one of the first times I've seen a board-level perspective on it, where we're starting to think a little bit more about corporate liability and about the way that we're structuring policies for our workers. That's what a lot of the piece got into. On that note, what do you want to be seeing from boards or from management teams when it comes to this stuff?

Asit Sharma: Dylan, I want to see boards really home in on a few things. One is to understand what the ethical implications of AI are. Two is to understand where things are proprietary and could be exposed. As you mentioned, that's a growing concern. Three, I want them to get more involved on a very minute level. What I mean by that is, it's been easy in the past to have board members who were experts in their field, so if you had to do your audit compliance, your audit committee was filled with people who had that experience. The same goes for companies that want to expand into new markets. If you're a tech company, you bring in a board member who has great experience in the go-to-market function in the industry you want to tap into. That all makes sense. But we need people who are very hands-on with AI. I think the Wall Street Journal article you referenced speaks to this a little bit. You can't really understand the effects of this technology until you've played around with it.

If I were a CEO, I would be reluctant to take advice from any board member who can't really show me that they are playing around with AI and have more than a beginner's level of knowledge. In this day and age, anyone can pop a question to ChatGPT or Anthropic's Claude. Show me that you understand the systems. The reason this is important is because then you can get some guidance on how so many other decisions can be made. Do we spend the money to keep an LLM in-house and draw a circle around it so stuff doesn't get out? Are we comfortable with a third-party provider who's saying they've got a secure port and no one can ever get to our data? Do we understand whether employees who are playing with this stuff themselves and developing great software for our company are following our rules? Maybe you have non-compete agreements in place; maybe you need them if you don't, and that goes even for some smaller companies that have just gone public and have a board of directors to advise them. There are so many issues related both to how this can hurt companies in the long run and to the potential that needs some expertise. This is a challenge, and it could be that we'll see boards creating positions that bring in an AI expert just to advise on these matters.

Dylan Lewis: One of the things I thought was interesting that came up in the piece, and we'll link to it in the show notes because I think it's a good read, was that you have some companies, they name-check Salesforce, publicly posting their guidelines for using and developing things like generative AI, in part, I think, to create rules of the road for their employees. But there are also a lot of customers that use Salesforce's CRM and some of their software suites, and there are going to be questions about what the processes are for the software that you are consuming and then feeding your own data into on your side, because any privacy or data elements transfer. There's a little bit of a transitive property issue here. I thought it was a unique approach and one that I'm looking for more companies to take, because I think we need that transparency.

Asit Sharma: I think it's such a nice point. I think about companies like Salesforce and ServiceNow, which have been forthcoming about how they use data and how companies can perceive them as a partner. There's a lot of publication of guidelines and thought about how they're trying to protect assets. But it's still the early Wild West here. If you are a publicly traded company, the government is going to require you to have audited financial statements so investors can understand that you're not fudging the numbers. You just referred to AMD's balance sheet, Dylan. That's because there are rules and regulations that force them to put that stuff on paper so you and I can look at it. But the government really hasn't stepped in much to regulate the way companies interact with generative AI. Now, that's for the good and the bad. Here in the US, we traditionally are an entrepreneurial society where, when we have good regulation, it comes in stages as things evolve. In Europe, it's a little bit of the opposite situation. They're very quick to put in guardrails, but sometimes that kills the investment and the creativity. Not to say that Europe is any less creative than the US; it just can be harder sometimes to bring new technologies to market. I'm somewhere in the middle. I want to make sure that we as a society keep our entrepreneurial edge and move forward with this stuff. But I do want to see that, along the way, we're thoughtfully approaching the technology. Sometimes, as you point out, companies take the lead in that, and then the government follows. It looks at what those thought leaders have put in place, and I will guarantee you that the people who are employed to work on regulation study those as best practices and interact with them. So there is something circular here, and a few companies are taking the lead.

Dylan Lewis: Asit Sharma, thanks for joining me today. Maybe, just maybe, we'll do it again and make it four Mondays in a row next week.

Asit Sharma: I'm up for it. Take these in and great to chat with you.

Dylan Lewis: Just a quick programming note: no second segment today, or for the next few days. The Motley Fool Money team is on-site at Podcast Movement here in Washington, DC, and we'll be bringing you some special conversations on podcasting and the ad industry from that. Hey, if you're at Podcast Movement, give us a shout and let us know. You can reach us at podcasts@fool.com. As always, people on the program may own stocks mentioned, and The Motley Fool may have formal recommendations for or against, so don't buy anything based solely on what you hear. I'm Dylan Lewis. Thanks for listening. We'll be back tomorrow.

Asit Sharma has positions in Advanced Micro Devices. Dylan Lewis has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices. The Motley Fool recommends General Motors and recommends the following options: long January 2025 $25 calls on General Motors. The Motley Fool has a disclosure policy.
