Micron increases CapEx to meet AI demand for high-bandwidth memory

Micron Technology on Tuesday slightly raised its capital expenditure forecast for 2024, as the U.S. chipmaker invests heavily in high-bandwidth memory (HBM) semiconductors to meet surging demand from the AI industry.

Why it’s important
Boise, Idaho-based Micron is one of the three large providers of HBM chips, an essential part of the hardware used in artificial intelligence servers. Its advanced HBM3E will be used in AI chip leader Nvidia’s H200 chips.

Context
The company said in March that its HBM chips, the memory semiconductors used in AI hardware, were sold out for 2024. A majority of its 2025 supply has also been allocated, it has said.

Micron currently offers eight-layer HBM and has started sampling 12-layer HBM.

By the numbers
Micron raised its 2024 CapEx forecast to about $8 billion, up from its earlier forecast of $7.5 billion, CFO Mark Murphy said.

Key quote
“In fiscal 2025, we expect HBM to be a multibillion-dollar business for us,” Chief Operating Officer Manish Bhatia said at the J.P. Morgan Technology, Media and Communications Conference.

Market reaction
Shares of the company were down about 3% in a broadly weak market. The stock hit a record high in March and is up about 51% this year through Monday's close. — Reuters


Copyright © 2024 Communications Today
