![Divram Profile](https://pbs.twimg.com/profile_images/1796385744292106240/QS2yAuUl_x96.jpg)
Divram (@divram_ai) · 2K Followers · 5K Following · 5K Statuses
@beefcubee @ShanghaoJin @bboczeng noob question - why does the US need USAID if it has the CIA for intelligence?
Sentiment is low. IMO AMD should change the name of the MI300/MI325 product line to something else. Going from 300 to 325 just sounds like a marginal improvement - super uncool compared to names like "Hopper," "Blackwell," A100, H100.
Here’s the Harsh Truth About $AMD 🧐 If AI were truly diversifying hyperscaler demand beyond $NVDA, AMD’s earnings would have painted a very different picture. Instead, what we saw was another confirmation that Nvidia remains the default choice for AI infrastructure -- while AMD struggles to gain real traction in a market where speed and ecosystem lock-in determine everything.

Despite beating revenue expectations with $7.7B (+24% YoY), AMD’s data center business -- the segment that’s supposed to drive its AI narrative -- fell short. Analysts were looking for $4.2B, and AMD delivered only $3.9B. That’s not just a slight miss -- it’s an undeniable signal that hyperscalers aren’t shifting their AI spending away from Nvidia in any meaningful way.

$MSFT’s confirmation that it’s using MI300X for GPT-4-based Copilot applications sounded impressive, but it doesn’t change the reality that AMD’s AI revenue still isn’t accelerating the way it needs to. Yes, there are new AI deals with $IBM, Vultr, and Fujitsu, but deals alone don’t drive the business forward -- deployments at scale do. And that’s where AMD is falling short.

Hyperscalers aren’t just buying GPUs -- they’re investing in the entire stack: the software ecosystem, the developer frameworks, the compatibility with their existing AI workloads. And that’s where Nvidia still holds an unmatched advantage with CUDA. If MI300X adoption were happening at scale, we wouldn’t be seeing data center revenue miss expectations at a time when AI infrastructure spending is at an all-time high. We’d be seeing hyperscalers ramping orders aggressively, expanding AMD’s pipeline, and positioning MI300X as a true alternative to Hopper and Blackwell. Instead, the narrative is still about "future adoption," while Nvidia’s latest quarterly results keep showing demand outpacing supply.

The rest of AMD’s business isn’t providing much of a cushion, either. Gaming and embedded revenue both declined YoY, which means legacy product lines are losing steam at the exact moment AI is supposed to be the growth engine. Client revenue was strong at +58% YoY, but that’s driven by Ryzen chips, not AI. And let’s be clear: investors aren’t paying a premium for AMD because they expect it to sell more consumer CPUs. They’re paying up for an AI-driven future, and right now, that future isn’t materializing in the numbers.

Gross margins tell the real story. At 54%, with no meaningful expansion, AMD isn’t seeing the kind of pricing power that defines a true AI leader. Nvidia, by contrast, has been increasing both ASPs and margins with each new generation of its AI chips. Why? Because its GPUs aren’t just hardware -- they’re the backbone of an entire AI software ecosystem that hyperscalers, enterprises, and developers are already locked into. AMD, even with competitive hardware, still lacks that kind of entrenched demand.

Nvidia’s dominance isn’t just about today -- it’s about what’s coming next. Blackwell is already on the horizon, and Nvidia is making a massive push into inference workloads, further strengthening its grip on the AI market. Meanwhile, AMD is still trying to establish a foothold in AI training, the segment Nvidia already controls. The timing is not in AMD’s favor. If MI300X had launched earlier, before Hopper had such a deep install base, maybe it would have been a different story. But by the time MI300X is widely adopted, Nvidia will already be scaling Blackwell, pushing new software optimizations, and reinforcing its dominance.
The AI market isn’t like CPUs or GPUs for gaming -- there’s no room for a second-place finisher in AI compute. This is an industry driven by ecosystem advantages, first-mover positioning, and hyperscaler commitments that lock in long-term spending. Right now, AMD is still the alternative, not the priority. Its future in AI depends on whether it can truly break Nvidia’s stronghold, and this quarter’s numbers make one thing clear: it hasn’t happened yet.
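A quick back-of-the-envelope check on the size of that data center shortfall, using only the round figures quoted in the post above (consensus ~$4.2B, reported ~$3.9B, total revenue ~$7.7B; all approximate):

```python
# Rough arithmetic on the figures cited in the post (approximate values).
dc_consensus = 4.2e9   # analyst estimate for data center revenue, USD
dc_actual    = 3.9e9   # reported data center revenue, USD
total_rev    = 7.7e9   # total quarterly revenue, USD

dc_miss  = (dc_consensus - dc_actual) / dc_consensus   # shortfall vs. consensus
dc_share = dc_actual / total_rev                        # data center share of total

print(f"Data center miss vs. consensus: {dc_miss:.1%}")       # ~7.1%
print(f"Data center share of total revenue: {dc_share:.1%}")  # ~50.6%
```

In other words, the segment that carries the AI narrative is roughly half of total revenue and came in about 7% below consensus, which is why the miss reads as more than a rounding error.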