In this month’s episode of Tech Talk, Marcello Montanari, Managing Director & Senior Portfolio Manager, North American Equities, and Rob Cavallo, Senior Portfolio Manager, North American Equities, dive into the latest developments in the rapidly evolving Technology and Health Care sectors. They explore the intensifying AI race, including OpenAI’s strategic collaboration with Google, Meta’s aggressive recruitment initiatives, and AMD's impressive rebound. The discussion also highlights new advancements in healthcare AI, such as Microsoft’s promising diagnostic model, while sharing insights into key trends and opportunities shaping the remainder of the year.
Watch time: 21 minutes, 43 seconds
Jordan Wong Okay. Hi, everyone, and welcome back to Tech Talk. Tech Talk is a monthly web series where I get the opportunity to sit down with Marcello Montanari and Rob Cavallo. Marcello is a managing director and senior portfolio manager on the RBC GAM North American Equity team, and Rob is a senior portfolio manager on the North American Equity team.
Marcello and Rob manage a number of growth-oriented strategies, perhaps the best-known being the RBC Life Science and Technology Fund. This is a strategy that invests heavily across the technology and healthcare sectors within the U.S. equity market. And the reason we get together is that these two sectors combined are, I think, two of the most innovative within the S&P 500.
And so, it's great to get together once a month to talk about all things technology and healthcare related and keep you all up to date and current on what's going on within the space. So, Marcello, Rob, thanks for joining me.
Marcello Montanari Thanks for having us.
Robert Cavallo Thanks for having us.
Jordan Wong Well, I was thinking for this month's episode, you know, we are at the halfway point. So, I do kind of want to end off this month's episode with a few thoughts from each of you on healthcare and tech as we work towards the end of the year. But I also want to talk about some related news happening across the AI space.
And then a few company specific pieces, perhaps around, you know, AMD, Meta, Microsoft. And so, if we can jump right in, maybe I'll just start off with the AI race. You know, I think we’re still in the early stages within AI, but I think it's safe to say we're kind of past that point of questioning, you know, whether or not the technology is legitimate and whether or not there's real world applications.
I think now, you know, we're hearing chatter and we're seeing this arms race heat up across some of the major hyperscalers. And recently we saw that OpenAI and Google have begun working together, specifically OpenAI using Google Cloud services and perhaps adopting Google's Tensor Processing Unit (TPU) chips as well.
And so maybe I could ask you guys to kind of explain about what exactly is going on there and what are the implications for, you know, the various infrastructure players within AI. And what does this mean for Google's own AI strategy?
Marcello Montanari Okay. Well, maybe I'll start. It's kind of hard to tell with any degree of precision what exactly is going on between Google and OpenAI.
As we know, OpenAI has been partnered with Microsoft for quite some time, since almost the beginning, really. But there could be a number of reasons why OpenAI has reached out to Google to use some Google Cloud services.
Number one, I would say, is that it's kind of no secret the relationship between OpenAI and Microsoft has been a little bit difficult as of late. And there seems to be some back and forth about what the ultimate structure of OpenAI will be and what the contractual obligations will be between the two, and so on, so forth, and obviously they're jockeying for position.
So, you know, one element might be that OpenAI is kind of negotiating in public with Microsoft by reaching out to Google to do something; that could be one of the reasons. Another one is that Microsoft actually has a ROFR, a right of first refusal, meaning it can turn away business from OpenAI if it doesn't want to do it.
We've already seen that: they've actually turned away some business that, I guess, Microsoft would deem to be low value as far as they're concerned. So that means, by definition, since OpenAI doesn't have much of its own infrastructure, they have to go somewhere else to get it. And obviously, Google Cloud services would be one place they could go.
There's also a school of thought that OpenAI will gradually take share from Google, and that may or may not be true. Obviously, they are taking share of generalized searches and things like that. So there's a view that maybe Google sees this as saying, hey, if we're going to lose some share, at least we can process the search requests out there.
That's a bit of a wonky idea in my view. But at the same time, it's a non-zero probability that something like that is in play. And then alternatively, actually, probably the most likely is like, we'll call it “Occam's Razor.” The easiest, simplest explanation is probably the right one.
And, you know, ChatGPT is, at the moment, the de facto consumer AI platform out there. And last I saw, there were 800 million users. Their token use is through the roof.
And we've already seen so many companies hit capacity constraints, whether it was Microsoft or AWS, so to think that OpenAI hasn't hit capacity constraints and had to scramble a bit would, you know, be naive.
So, I think that's probably the easiest, and probably the most likely, reason that they've gotten a little bit closer. They just need the capacity, Google's there, and obviously, you know, it's not free.
In terms of the point you raised earlier about using TPUs and stuff like that, I mean, this gets a little bit "inside baseball" for people who don't follow all this stuff. But right now, you know, OpenAI's stack is pretty much built on Nvidia GPUs, and AMD's to a lesser extent. And from my understanding, the arrangement between OpenAI and Google is basically to use Google's fleet of GPUs from Nvidia and, to a lesser extent, AMD. They have an agreement to test out Google's proprietary TPUs, but I don't think there's any firm commitment on OpenAI's part to start moving serious workloads over there.
So that's kind of where it all stands at the moment, I'd say.
Jordan Wong And maybe I could ask you to comment.
So, we also recently read about Meta's push, recruiting a lot of folks from OpenAI. Maybe you could comment a little bit on that.
And is it safe to say that at this point it's possible that we're starting to see maybe like an early emerging winner across the space? Or is that still sort of too early to call?
Marcello Montanari I'd say still a little bit too early. I mean, there are definitely some frontrunners here, for sure. I think what happened with Meta was that Llama 4 was, you know, heralded as the next big model to come out. And then, when the training was complete, it was clear that it didn't achieve its benchmarks.
And what we quickly saw after that was that some key people left the Llama team, whether they were forced out or quit on their own. And it looks like Zuckerberg just basically hit the reset button here and said, this is too important to lose. As a result, he's gone out and, by my count, after considering the people that have left, hired over 20 people now. And the reports are that some of these people have been hired at kind of Shohei Ohtani-esque levels of pay. But anyway, he's clearly committed to this, and Meta is committed to this.
And on top of that, the first move he made, if we can believe some of the reports we saw, was to approach Perplexity about acquiring Perplexity. And that didn't go anywhere. And, you know, if that in fact was the case, one day we'll find out why it didn't happen.
So, his next step was to basically do an acqui-hire of Alexandr Wang from Scale AI, and he brought in Scale AI at a 49% position, and he's building a team around that. And then he's gone after a couple of other big names, Friedman and Gross (I don't remember their first names), but some pretty big names out there.
So, he's gone after, like I said, about over 20 people now, just indicating that he's in this for real.
Jordan Wong Great, no that's really helpful. Thank you.
Maybe pivoting ever so slightly over to you, Rob.
Recently, Microsoft released this new health care model, which is AI driven. Maybe you could just talk to us a little bit about that.
I'm not sure if it's something everyone's sort of up to speed on and maybe more importantly, what this kind of means for the future of the health care sector.
Robert Cavallo Yeah, and sorry, I just wouldn't mind adding one point to the OpenAI/Google conversation as well.
So, I think to Marcello's point, yes, it does speak to capacity constraints and specifically probably Nvidia GPU capacity constraints.
What's notable is that this year Google dramatically stepped up their Nvidia GPU supply. Something like two thirds of their processors are going to be Nvidia GPU-based, versus only a third from the internal chip they make. So, I think this is also just about OpenAI getting more access, as Marcello said, to Nvidia GPUs.
And one more point. What's interesting is that they now would have arrangements with, obviously, Microsoft, Google, and Oracle; AWS is the one that's missing. And what does that mean? Is that on the come? Is it a shot against AWS's Trainium, which they're trying to push? Is it capacity constraints at AWS? I just thought that was an interesting point to keep in mind. Maybe that's the next announcement we'll see out of OpenAI at some point: they need more capacity to meet the seemingly endless, robust demand for ChatGPT.
Back to your initial question, though, on Microsoft. They created basically this health care model designed around all the major LLMs, and it's a diagnosis tool. And they've been running some interesting tests very recently on how accurately it can diagnose complex cases versus physicians with upwards of 20 years of experience.
So, in the first step of the process, it competed against U.S. medical licensing exams, and they basically got to a point where it could score 100% on those exams. They then went to the next step, where they used 300 published cases that came out of Mass General in the United States and tested the model against a control group of physicians. The result was that the model had a rate four times greater in diagnosing the patient with the actual disease relative to those experienced physicians; it was something like an 85% versus 20% rate. And it did it at a lower cost, meaning it could get to the final answer in far fewer iterations.
So, I don't know if it means anything today; this is not replacing doctors today. But it goes to show something we've talked about for a long time: health care is arguably going to be one of the biggest areas for the use of AI as the tools continue to be developed over the years. And this is just another signpost that, you know, we're on our way. And these use cases will continue to present themselves over the coming months and years.
Jordan Wong Oh, that's excellent. Really fascinating actually.
Rob, maybe I'll stick with you, pivot a little bit. I want to talk a bit about AMD. We've seen a big reversal to the upside in performance recently.
And so maybe just walk us through what's driven this and where do you stand on this opportunity in sort of that AI GPU battle between them and Nvidia right now.
Robert Cavallo So, excluding just the broader AI sentiment recovery post the Liberation Day lows, what we've seen is a resurgence in what the opportunity set, at least from Wall Street's perspective, looks like for AMD going into next year for their AI GPU servers.
A couple big things.
You know, as the announcements came out during the Trump tour through the Middle East, AMD, given that they're at a mid-single-digit-billion AI revenue base, stands to be an outsized beneficiary relative to Nvidia, and that kind of got the market excited again.
And then just recently, AMD also hosted an AI event where they talked a little bit about where their servers are headed for the back half of the year and, more importantly, what their system-level, rack-scale servers are going to look like into next year.
And they're closing the gap to Nvidia. Look, there's still a half-year to one-year gap. But basically, the market got excited that AMD is potentially becoming a true second-choice vendor again. And given their starting point on their AI revenue base, the upside is just very attractive relative to the valuation at the lows. And that's what's really driven the recovery.
Just more confidence around what that opportunity set over the next, kind of, 12 to 24 months might look like for them.
Jordan Wong Oh, that's great. Thank you.
Guys, maybe, you know, before we wrap up the episode, I want to, like I said, leave a little bit of time to give you guys a chance to maybe just offer up some thoughts, we’re halfway through the year, obviously, there's been a lot of volatility. A lot's going on.
So maybe just, you know, a few comments from both of you on what you're looking out for, what you might be expecting over the next six months until we get to the end of the year.
Marcello Montanari You know, maybe I'll start and hand it off to Rob.
I mean, we hate to make short-term predictions. We like to keep most of our focus on the longer term, and as I like to say, you know, get it right and sit tight. But, you know, we still lean positive. We've got a significant secular growth tailwind behind us, which is really driven by AI.
And we seem to be going from strength to strength. And that's basically helping to drive the cloud migration and the cloud infrastructure build. So that's, you know, all systems go from that perspective, in both the traditional and AI-focused cloud transitions.
And in the meantime, among other sectors, we know that cybersecurity remains a priority for CIOs. We know that commerce continues to shift from traditional retail over to e-commerce, and that's basically helping to fuel the entire digital advertising ecosystem.
And then, you know, eventually we'll start talking more and more about embodied AI, which is a fancy way of saying robotics. But it is happening as we speak. And just to open up the aperture a little bit, because people tend not to think about this: a chatbot is essentially a robot that lives on a server and does its work from a server. A Waymo robotaxi is essentially a robotic car. So, it's not just about robots on manufacturing floors and warehouse floors anymore; it's expanded beyond that.
So, we're going to be hearing a lot more about that going forward. And, you know, everyone likes to fixate on humanoid robots, and that's coming too, but I think that's a little bit further out.
But, you know, all of this stuff is happening. And just circling back to the whole AI theme, there's a lot of talk about agentic AI, you know, is it coming and what is it, and so on, so forth. And I would argue that agentic AI already kind of exists.
And if you think about both Google and Meta, they basically have these almost fully automated digital ad platforms, where an advertiser just goes to them and says, you know, "here's my budget, this is the return I'd like to get, you guys handle it." And they basically put it in their black box, and the black box creates almost everything for them.
Those products are called Performance Max and Advantage+, for Google and Meta respectively. And now that you've got generative AI that can sit in front of all this, you're in a position where you can fully automate the entire digital advertising workflow, and that'll allow a lot more small and mid-size businesses into the whole ecosystem.
So, it's a lot of exciting stuff going on here and a lot of stuff to keep our eyes on and hopefully take advantage of.
Jordan Wong Absolutely. Robert.
Robert Cavallo Yeah. Sorry. Just one maybe high-level thought that’s more specific to health care and tech.
But you know, towards the beginning of the year we were talking about kind of a year of two parts. In the first half of the year we were sort of taking our medicine with trade negotiations and so forth. In the second half of the year we get some of the sugar, and we've started to see this come through with, you know, the tax cut, trade negotiations seemingly on a path to somewhere okay, and potentially going into a Fed easing cycle. So, we think the setup is still good. We can have a long argument about how much of that is priced in, but we feel like the broader setup is favorable once we get past this Q2 earnings season, which could be a test given the sharp rally in the market since the Liberation Day lows. We're not saying that it's, you know, a straight line higher, but we're generally positive.
More specific to some events I'm thinking about: for Nvidia, we get the meat of the launch curve of their new system-level server product. I think it's possible that this puts the stock back on a more meaningful beat-and-raise cadence into the second half of the year, which could build excitement and maybe take the stock to the next level, beyond $150.
Within healthcare, there are two important trials I'm thinking about, both from Lilly. One is the oral obesity trial that's going to read out sometime during the fall, probably. This would be, you know, a major unlock: an oral obesity approval, or obesity data that could lead to approval next year.
And we could potentially get updated Alzheimer's data in very early-stage patients too. This could be another exciting opportunity that the market isn't really talking about at all right now. It could have read-across to Biogen and some others, and it could just be a general sentiment flipper to the positive side, which the sector needs.
And there's a few more, but those are the ones I sort of highlight as being most meaningful that I have my eye on today.
Marcello Montanari If I could just throw one last thing on top of this question. You know, the budget bill has provisions for accelerated depreciation of capital spending and the expensing of R&D. And we shouldn't lose sight of the fact that this could actually pull forward a lot of capital spending, because companies will now be able to write it all off in the first year, which basically depresses earnings.
So, you'll have a negative earnings impact in the first year, or call it a year and a half to two years, and then after that you get a bump.
But the cash flow impact is there from day one. So it could pull business forward and accelerate some activity.
Again, this benefits the whole infrastructure layer of AI.
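To make the mechanic Marcello describes concrete, here is a small, purely hypothetical back-of-the-envelope sketch in Python. The capex amount, tax rate, and depreciation life are made-up illustrative assumptions, not figures from the bill or any company; the point is only to show why immediate expensing depresses year-one reported earnings while boosting cash flow from day one.

```python
# Hypothetical illustration (made-up numbers): compare the year-1 earnings and
# cash-flow impact of immediately expensing $100M of capex versus depreciating
# it straight-line over 5 years, at an assumed 21% tax rate.

CAPEX = 100.0        # $M of equipment bought in year 1 (assumed)
TAX_RATE = 0.21      # assumed corporate tax rate
LIFE_YEARS = 5       # assumed straight-line depreciation life

# Straight-line: only 1/5 of the cost hits earnings in year 1.
sl_depreciation_y1 = CAPEX / LIFE_YEARS          # 20.0

# Immediate expensing: the full cost hits earnings in year 1.
full_expense_y1 = CAPEX                          # 100.0

# Extra pre-tax expense recognized in year 1 under full expensing:
extra_expense = full_expense_y1 - sl_depreciation_y1   # 80.0

# Reported (after-tax) earnings are lower by the extra expense net of tax:
earnings_hit_y1 = extra_expense * (1 - TAX_RATE)       # 63.2

# But cash taxes paid are lower, so cash flow is higher from day one:
cash_flow_boost_y1 = extra_expense * TAX_RATE          # 16.8

print(f"Year-1 after-tax earnings impact: -${earnings_hit_y1:.1f}M")
print(f"Year-1 cash tax savings:          +${cash_flow_boost_y1:.1f}M")
```

Under straight-line depreciation, the remaining $80M of expense would instead hit earnings in years two through five; once the up-front write-off is behind a company, later years carry less depreciation expense, which is the subsequent "bump" in earnings referred to above.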
Jordan Wong No, that's really insightful. Thank you, guys.
We'll wrap the episode up here. So once again, I want to thank you for spending time with us this month. And thank you to everyone who tuned in.
You know, if you are interested in learning more about some of the strategies that Marcello and Rob manage, strategies like the RBC Life Science and Technology Fund, or the RBC Global Technology Fund, we do encourage you to visit our website or reach out to your RBC GAM contact.
Thanks again, and we will be back next month.