The Future of AI Chip Shortages: A Light at the End of the Tunnel

With the rise in adoption of generative artificial intelligence (genAI), the infrastructure supporting this growth is facing a supply-and-demand bottleneck.

Research from IDC shows that 66% of enterprises globally plan to invest in genAI over the next 18 months, with a significant portion of 2024 IT spending allocated to infrastructure. However, a crucial hardware component required for AI infrastructure is currently in short supply.

The rapid pace of AI adoption has strained the supply of the high-performance chips needed for genAI workloads. While much attention has focused on demand for Nvidia GPUs and their alternatives, the surge in demand for high-bandwidth memory chips from SK Hynix has been largely overlooked.

SK Hynix recently announced that its high-bandwidth memory (HBM) products, which work alongside high-performance GPUs and are essential for AI processing, are fully booked through 2025 due to high demand. That demand has pushed HBM prices up 5% to 10%, driven by premiums and the capacity requirements of AI chips.

SK Hynix’s HBM3 product offers the industry’s largest memory capacity at 24GB, achieved by stacking 12 DRAM chips, combining high capacity with high performance.

According to TrendForce, HBM chips are projected to contribute more than 20% of total DRAM market value by 2024, potentially exceeding 30% by 2025. The surge in demand has led buyers to accept higher prices to secure stable, high-quality supply, as not all major suppliers have met customer qualifications for high-performance HBM.

