Latest news with #InstinctMI350


Arabian Post
7 days ago
- Business
- Arabian Post
AMD Stakes Future on Open AI Infrastructure
Advanced Micro Devices projected bold expectations for its artificial intelligence trajectory during its Advancing AI event in San Jose on 12 June 2025, emphasising system-level openness and ecosystem collaboration. CEO Dr Lisa Su unveiled the Instinct MI350 accelerator series, introduced plans for the Helios rack-scale AI server launching in 2026, and fortified AMD's software stack to challenge incumbent leaders in the sector. Top-tier AI customers including OpenAI, Meta, Microsoft, Oracle, xAI and Crusoe pledged significant investments. OpenAI's CEO Sam Altman joined Su onstage, confirming the firm's shift to MI400-class chips and collaboration on MI450 design. Crusoe disclosed a $400 million commitment to the platform.

The MI350 Series, which includes the MI350X and MI355X, is shipping to hyperscalers now with a sharp generational performance leap, delivering about four times the compute capacity of prior-generation chips, paired with 288 GB of HBM3e memory and up to 40% better token-per-dollar performance than Nvidia's B200 models. Initial deployments are expected in Q3 2025 in both air- and liquid-cooled configurations, with racks supporting up to 128 GPUs and producing some 2.6 exaflops of FP4 compute.

Looking further ahead, AMD previewed 'Helios', a fully integrated rack comprising MI400 GPUs, Zen 6-based EPYC 'Venice' CPUs and Pensando Vulcano NICs, boasting 72 GPUs per rack, up to 50% more HBM memory bandwidth and system-scale networking improvements compared to current architectures. Helios is poised for market launch in 2026, with an even more advanced MI500-based variant expected around 2027.

Dr Su underscored openness as AMD's competitive lever. Unlike Nvidia's proprietary NVLink interface, AMD's designs will adhere to open industry standards, extending availability of networking architectures to rivals such as Intel. Su argued this approach would accelerate innovation, citing historical parallels from the open Linux and Android ecosystems.

On the software front, the ROCm 7 stack is being upgraded with enterprise AI and MLOps features, including integrated tools from VMware, Red Hat, Canonical and others. ROCm Enterprise AI, launching in Q3 or early Q4, aims to match or exceed Nvidia's CUDA-based offerings in usability and integration.

Strategic acquisitions underpin AMD's infrastructure ambitions. The purchase of ZT Systems in March 2025 brought over 1,000 engineers to accelerate rack-scale system builds. Meanwhile, AMD has onboarded engineering talent from Untether AI and Lamini to enrich its AI software capabilities.

Market reaction was muted; AMD shares fell roughly 1–2% on the event day, with analysts noting that while the announcements are ambitious, immediate market share gains are uncertain. Financially, AMD projects AI data centre revenues growing from over $5 billion in 2024 to tens of billions annually, anticipating the AI chip market reaching around $500 billion by 2028.

These developments position AMD as a serious contender in the AI infrastructure arena. Its push for rack-scale systems and open-standard platforms aligns with the growing trend toward modular, interoperable computing. Competition with Nvidia will intensify through 2026 and 2027, centred on performance per dollar in large-scale deployments.
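As a rough sanity check on the rack-level figure above, the 2.6-exaflop number follows from the cited 128-GPU rack density if one assumes a per-GPU FP4 peak of roughly 20 petaflops, a commonly quoted ballpark for MI355X-class parts that is not stated in the article itself:

```python
# Back-of-the-envelope check of the ~2.6 exaflop FP4 figure for one rack.
# Assumption (not in the article): ~20 PFLOPS of FP4 per MI355X-class GPU.
gpus_per_rack = 128            # liquid-cooled rack density cited above
fp4_pflops_per_gpu = 20.0      # assumed per-GPU FP4 peak, in petaflops

rack_exaflops = gpus_per_rack * fp4_pflops_per_gpu / 1000  # PFLOPS -> EFLOPS
print(f"~{rack_exaflops:.1f} EF of FP4 per 128-GPU rack")  # ~2.6 EF
```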
Yahoo
13-06-2025
- Business
- Yahoo
This Week In AI Chips - AMD Advances AI Innovation With Open Ecosystem Approach
AMD recently showcased its vision for an open AI ecosystem, emphasizing a collaborative approach through new silicon, software, and systems at its Advancing AI 2025 event. The company introduced the Instinct MI350 series of GPUs, highlighting significant advancements in AI compute performance, efficiency, and scalability. AMD has continued to expand its ROCm open software stack and unveiled its next-generation AI rack designs, aiming for leadership in rack-scale AI performance beyond 2027. Strategic partnerships with industry leaders like Meta, OpenAI, and Microsoft underscore AMD's commitment to driving AI innovation through open standards and shared technological advancements. These developments position AMD as a central player in accelerating AI ecosystems across various industries.

Last closed at $118.50, down 2.2%. In other trading, a standout was up 3.8% and ended trading at $74.34. In the meantime, another softened, down 5.1%, to close at ¥23,730.

AMD's upcoming 5th Gen EPYC processors and AI accelerators promise rapid revenue growth. Also, check out our Market Insights article titled "A.I. Enters the 'Show Me The Money' Phase," where we reviewed key AI chip investments poised for growth.

Settled at $145.00, up 1.5%. NVIDIA recently announced strategic partnerships and technological integrations, including a collaboration with Samsung to invest in AI robotics and numerous European initiatives focused on AI infrastructure, all within the past two days of NVIDIA's GTC Paris conference. Closed at $158.70, down 0.5%. Finished trading at €676.60, down 2%.

Unlock our comprehensive list of 53 AI Chip Stocks like Applied Materials, Intel and Taiwan Semiconductor Manufacturing. Interested in other possibilities? AI is about to change healthcare. These 22 stocks are working on everything from early diagnostics to drug discovery. The best part: they are all under $10b in market cap, so there's still time to get in early.

This article by Simply Wall St is general in nature. We provide commentary based on historical data and analyst forecasts only using an unbiased methodology, and our articles are not intended to be financial advice. It does not constitute a recommendation to buy or sell any stock, and does not take account of your objectives or your financial situation. We aim to bring you long-term focused analysis driven by fundamental data. Note that our analysis may not factor in the latest price-sensitive company announcements or qualitative material. Simply Wall St has no position in any stocks mentioned.

Sources: Simply Wall St; "AMD Unveils Vision for an Open AI Ecosystem, Detailing New Silicon, Software and Systems at Advancing AI 2025" from Advanced Micro Devices, Inc. on GlobeNewswire (published 12 June 2025).

Companies discussed in this article include NasdaqGS:CRDO, NasdaqGS:NVDA, NasdaqGS:QCOM, ENXTAM:ASML, NasdaqGS:AMD and TSE:8035. This article was originally published by Simply Wall St.


Entrepreneur
13-06-2025
- Business
- Entrepreneur
AMD Unveils Integrated AI Platform with MI350 Series GPUs
The company announced broader access to the AMD Developer Cloud, offering developers and the open-source community access to next-gen compute platforms.

At its Advancing AI event in San Jose, AMD revealed a wide-ranging AI strategy with a new suite of hardware, software, and end-to-end infrastructure solutions, reinforcing its position as a key enabler of next-generation AI workloads. The company unveiled its latest AI accelerators, the AMD Instinct MI350 Series, which promise a 4x generational performance leap and are already being rolled out in large-scale cloud deployments, including Oracle Cloud Infrastructure.

AMD Chair and CEO Dr. Lisa Su delivered the keynote alongside executives from Meta, Microsoft, Oracle, and OpenAI, showcasing how AMD's integrated platform spanning GPUs, CPUs, networking, and open software is helping power some of the world's most advanced AI applications. Currently, seven of the ten largest AI customers are deploying AMD Instinct accelerators, underscoring the company's growing leadership in the space.

A central highlight was the introduction of the AMD Instinct MI350 Series, which includes the MI350X and MI355X GPUs. These are built to drive transformative AI performance and are supported by 5th Gen AMD EPYC processors and AMD Pensando Pollara NICs, all working together as part of a new open-standards rack-scale infrastructure. Broad availability of this infrastructure is expected in the second half of 2025.

AMD offered a glimpse into the future with its "Helios" AI Rack, a fully integrated AI platform set for release in 2026. Powered by next-gen AMD Instinct MI400 Series GPUs, "Zen 6"-based AMD EPYC "Venice" CPUs, and AMD Pensando "Vulcano" NICs, Helios aims to redefine AI performance standards and support ever-larger and more complex models with efficiency at scale.

On the software front, AMD launched ROCm 7.0, its latest AI software stack. The update brings enhanced support for industry-standard frameworks, expanded hardware compatibility, and new APIs, drivers, and libraries to accelerate AI development. Additionally, the company announced broader access to the AMD Developer Cloud, offering developers and the open-source community access to next-gen compute platforms.

AMD also emphasised sustainability with a significant energy efficiency milestone. The MI350 Series has surpassed AMD's earlier five-year goal of a 30x energy efficiency gain, achieving a 38x improvement in AI training and high-performance computing nodes. AMD is now setting a new 2030 goal: a 20x increase in rack-scale energy efficiency from 2024 levels, aiming to reduce power consumption by 95 per cent.
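For readers connecting the two efficiency figures above, the 95 per cent power-reduction claim is simply the flip side of the 20x efficiency target, assuming efficiency is measured as work per unit of energy and the comparison holds the workload constant; a minimal sketch of the arithmetic:

```python
# Relating the 20x rack-scale efficiency target to the ~95% power claim,
# assuming efficiency = work per unit of energy and the workload is fixed.
efficiency_gain = 20                        # targeted 2030 gain vs. 2024 baseline
relative_energy = 1 / efficiency_gain       # energy needed for the same work
reduction_pct = (1 - relative_energy) * 100
print(f"{reduction_pct:.0f}% less energy for the same work")  # 95%
```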


Business Insider
13-06-2025
- Business
- Business Insider
Advanced Micro Devices Stock (AMD) Poised for Breakout as AI Demand Accelerates
Advanced Micro Devices (AMD) has faced its share of volatility, with the stock down 23% over the past year. However, recent momentum suggests a potential turnaround is underway. Multiple growth catalysts are driving revenue higher, and it appears the market has yet to fully recognize this acceleration. Given these tailwinds, I'm bullish on AMD and believe the stock is well-positioned for a breakout toward its all-time highs around $200 per share.

Data Center Boom Powers the AI Revolution

AMD's Data Center segment is emerging as the company's growth engine, and for good reason. In Q1, the segment generated $3.7 billion in revenue, marking a 57% year-over-year increase, fueled by surging demand for its EPYC CPUs and Instinct GPUs. CEO Lisa Su pointed to deepening partnerships with major players like Microsoft (MSFT), Meta (META), and Oracle (ORCL), as hyperscalers increasingly rely on AMD's chips to power AI workloads.

This momentum isn't just a one-off. Data center revenue nearly doubled in 2024 to $12.6 billion, and Q1's results continue that trajectory. AMD is further reinforcing its position through strategic acquisitions, such as Untether AI and Brium, thereby expanding its capabilities in both AI hardware and software. The upcoming Instinct MI350 series is already generating buzz for its potential to rival Nvidia in AI infrastructure. Despite the cyclical nature of the semiconductor industry, AMD's consistent data center growth, driven by rising AI demand and a strengthened ecosystem, suggests the company is well-positioned for a robust and sustained run.

Client Segment: Ryzen Roars Back

AMD's Client segment shouldn't be overlooked; it's staging an impressive comeback. In Q1, client revenue surged to $2.3 billion, up 68% year-over-year, driven by strong adoption of the new 'Zen 5' Ryzen processors across both laptops and desktops. AMD's Ryzen AI Max chips are at the forefront of this growth, powering over 50 AI-enabled laptop models expected to hit the market this year. But beyond the spike in sales, this signals AMD's growing presence in the PC space, particularly in AI-driven devices like Microsoft's Copilot+ PCs, a trend that could create lasting tailwinds.

What's particularly compelling is the shift in market perception. Once seen as the underdog, AMD is now steadily gaining ground on Intel by delivering processors that excel in both performance and power efficiency. The Client segment's 68% jump in Q1 reflects this evolution: AMD is no longer catching up; it's setting the pace. As AI becomes increasingly integrated into everyday computing, Ryzen chips are well-positioned to keep driving growth in this segment.

Strategic Moves: Betting Big on AI

AMD's strategic initiatives have played a key role in fueling its recent momentum. Notably, the acquisition of ZT Systems' data center infrastructure business underscores the company's long-term ambition to lead in AI hardware. CEO Lisa Su has projected that the AI accelerator market could reach $500 billion by 2028, and AMD is positioning itself to capture a significant share. The MI355X chip, purpose-built for AI inference, targets a segment Su believes will eventually outpace AI training in market size.

Collaborations are also strengthening AMD's ecosystem. From IBM (IBM) deploying Instinct MI300X accelerators to Fujitsu (FJTSF) working with AMD on sustainable AI infrastructure, these alliances are embedding AMD's technology into the fabric of global AI development.
With double-digit revenue growth forecast for the years ahead, AMD's strategic investments increasingly look like long-term value drivers.

A Valuation That Screams Opportunity

At first glance, AMD's current valuation, trading at 31x this year's consensus EPS of roughly $4, may seem steep. However, a closer look reveals a compelling growth story. Analysts project a 44% jump in EPS by 2026, reaching an estimated $5.71. That would bring the forward P/E down to a much more attractive 22x, a reasonable price for a company set to benefit from a robust, multi-year AI tailwind. Adding to the bullish case, AMD's gross margin climbed to 54% in Q1 and continues to improve as its high-margin data center segment scales. This margin expansion sets the stage for stronger profitability ahead, reinforcing the stock's long-term value.

Is AMD a Good Stock to Buy?

Wall Street maintains a fairly bullish view of AMD stock. AMD carries a Moderate Buy consensus rating based on 22 Buy and 10 Hold ratings issued over the past three months. Notably, not a single analyst is bearish on AMD stock. AMD's average price target of $127.93 implies a modest 5% upside over the next 12 months, which, in my view, suggests that Wall Street is still underestimating the stock.

AMD Poised for a Breakout as AI Demand Accelerates

AMD appears to be at a pivotal inflection point. Its Data Center and Client segments are gaining momentum, recent acquisitions are reinforcing its position in the AI space, and the current valuation suggests the market has yet to fully price in its growth potential. While challenges like export restrictions to China and weakness in the gaming segment remain, they're far outweighed by AMD's aggressive expansion into AI. Given the accelerating demand for advanced computing power, I believe AMD is well-positioned not just for a recovery but for a breakout to new highs. The conditions for a sustained rally are falling into place, and that breakout may be closer than many expect.
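To make the valuation arithmetic above concrete, here is a brief sketch of how the quoted multiples fit together; the share price used is implied from the stated 31x multiple on roughly $4 of EPS rather than taken from a live quote:

```python
# Sketch of the valuation math quoted above. The price is implied from the
# stated 31x multiple on ~$4 EPS, not taken from a market quote.
current_eps = 4.00                        # rough consensus EPS for this year
current_pe = 31                           # multiple cited in the article
implied_price = current_eps * current_pe  # ~$124 per share

eps_2026 = 5.71                           # analyst estimate cited for 2026
forward_pe = implied_price / eps_2026     # ~21.7x, roughly the 22x cited
print(f"Implied price: ${implied_price:.0f}, forward P/E: {forward_pe:.1f}x")
```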


Forbes
13-06-2025
- Business
- Forbes
AMD Accelerates AI Data Centers With Instinct And Helios
Today, AMD held its Advancing AI event in San Jose, California. This year's event centered around the launch of the new Instinct MI350 series GPU accelerators for servers, advances to the company's ROCm (Radeon Open Compute) software development platform for the Instinct accelerators, and AMD's data center system roadmap. Disclosure: My company, Tirias Research, has consulted for AMD and other companies mentioned in this article.

The Instinct MI350X and MI355X GPU accelerators

First up is the latest in the Instinct product line, the MI350X and MI355X. Like its main competitor in the AI segment, AMD has committed to an annual cadence for new server AI accelerators. The MI350X and MI355X are the latest and are based on the new CDNA 4 architecture. The MI350X is a passively cooled solution that utilizes heat sinks and fans, whereas the MI355X is a liquid-cooled solution that employs direct-to-chip cooling. The liquid cooling system provides two significant benefits: the first is an increase in Total Board Power (TBP) from 1000W to 1400W, and the second is an increase in rack density from 64 GPUs per rack to up to 128 GPUs per rack.

According to AMD, the MI350 series of GPU accelerators provides approximately a 3x improvement in both AI training and inference over the previous MI300 generation, with competitive performance equal to or better than the competition on select AI models and workloads. (Tirias Research does not provide competitive benchmark information unless it can verify it.)

Structurally, the MI350 series is similar to the previous MI300 generation, utilizing 3D hybrid bonding to stack an Infinity Fabric die, two I/O dies, and eight compute dies on top of a silicon interposer. The most significant changes are the shift to the CDNA 4 compute architecture, the use of the latest HBM3E high-speed memory, and architectural enhancements to the I/O, which resulted in two dies rather than four. The various dies are manufactured on TSMC's N3 and N6 process nodes. The result is an increase in performance efficiency throughout the chip while maintaining a small footprint.

New ROCm 7 features

The second significant announcement, or group of announcements, is around ROCm, AMD's open-source software development platform for GPUs. The release of ROCm 7 demonstrates just how far the software platform has come. One of the most significant changes is the ability to run PyTorch natively on Windows on an AMD-enabled PC, a huge plus for developers, making ROCm truly portable across all AMD platforms (a minimal device-check sketch appears at the end of this article). ROCm now supports all major AI frameworks and models, including 1.8 million models on Hugging Face. ROCm 7 also provides an average of 3 times better training performance than ROCm 6 on leading industry models and 3.5 times higher inference performance. In addition to the enhancements to ROCm, AMD is doing more outreach to developers, including a developer track at the Advancing AI event and the availability of the new AMD Developer Cloud, accessed through GitHub.

Helios AI Rack

The third major announcement was the forthcoming rack architecture, scheduled for 2026, called Helios. Like the rest of the industry, AMD is shifting its system focus to the rack as the platform, rather than just the server tray. Helios will be a new rack architecture based on the latest AMD technology for processing, AI, and networking. Helios will feature the Zen 6-based EPYC processor, the Instinct MI400 GPU accelerator based on the CDNA Next architecture, and the Pensando Vulcano AI NIC for scale-out networking.
For scale-up networking between GPU accelerators within a rack, Helios will leverage UALink. The UALink 1.0 specification was released in April. Marvell and Synopsys have both announced the availability of UALink IP, and switch chips are anticipated from several vendors, including UALink partners like Astera Labs and Cisco.

Additionally, an A-list of partners and customers joined AMD at Advancing AI, including Astera Labs, Cohere, Humain, Meta, Marvell, Microsoft, OpenAI, Oracle, Red Hat, and xAI. Humain was the most interesting because of its joint venture with AMD and other silicon vendors to build an AI infrastructure in Saudi Arabia. Humain has already begun the construction of eleven data centers, with plans to add 50 MW modules every quarter. Key to Humain's strategy is leveraging the abundant power and young labor force in Saudi Arabia.

There is much more detail behind these and the extensive list of partnership announcements, but these three underscore AMD's dedication to remaining competitive in data center AI solutions, demonstrate its consistent execution, and reinforce its position as a viable alternative provider of data center GPU accelerators and AI platforms. As the tech industry struggles to meet the demand for AI, AMD continues to enhance its server platforms to meet the needs of AI developers and workloads. While this does not leapfrog the competition, it does narrow the gap in many respects, making AMD the most competitive alternative to Nvidia.
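On the ROCm 7 point above about running PyTorch natively on AMD hardware: ROCm builds of PyTorch reuse the familiar torch.cuda interface, so a developer can verify the setup with the usual device check. This is a generic sketch of that workflow, not a ROCm 7-specific API:

```python
# Minimal sketch: confirming a ROCm-enabled PyTorch build can see an AMD GPU.
# ROCm builds expose AMD GPUs through the standard torch.cuda namespace.
import torch

print(torch.__version__)          # ROCm wheels typically report "x.y.z+rocmN.N"
print(torch.version.hip)          # HIP/ROCm version string; None on CUDA builds
print(torch.cuda.is_available())  # True when an Instinct/Radeon GPU is visible

if torch.cuda.is_available():
    device = torch.device("cuda")              # maps to the AMD GPU under ROCm
    x = torch.randn(1024, 1024, device=device)
    print((x @ x).sum().item())                # runs the matmul on the GPU
```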