A man walks past the company's logo in the lobby of SK hynix's Bundang office in Seongnam, South Korea, on January 29, 2021.
Jung Yeon-je | AFP | Getty Images
SK Hynix, one of the world’s largest memory chip makers, said on Thursday that second-quarter profit hit its highest level in six years as it maintained its leadership in advanced memory chips critical to artificial intelligence computing.
Here’s how SK hynix’s second-quarter results compare to the LSEG SmartEstimate, which is weighted based on more consistent and accurate analyst forecasts:
- Revenue: 16.42 trillion won (about $11.86 billion) vs. 16.4 trillion won expected
- Operating profit: 5.47 trillion won vs. 5.4 trillion won expected
Operating profit for the June quarter hit its highest level since the second quarter of 2018, swinging from a loss of 2.88 trillion won in the same period a year earlier.
Revenue for April to June surged 124.7% from 7.3 trillion won a year earlier, the highest quarterly revenue in the company's history, according to LSEG data going back to 2009.
SK hynix said on Thursday that strong demand for AI memory, including high-bandwidth memory, continued to push up overall prices for its memory products, lifting revenue 32% from the previous quarter.
The South Korean giant supplies high-bandwidth memory chips to AI chipmakers such as Nvidia.
SK Hynix shares fell 7.81% on Thursday morning.
The decline came as South Korea's Kospi index fell 1.91%, following an overnight sell-off in U.S. technology stocks on disappointing Alphabet and Tesla earnings. The reports marked investors' first look at how large companies performed in the second quarter.
“Strong demand for AI servers is expected to continue in the second half of this year with the launch of AI-enabled PCs and mobile devices, and traditional markets will gradually recover,” the company said in an earnings call on Thursday.
Taking advantage of the strong demand for artificial intelligence, SK hynix plans to “continue to maintain its leading position in the HBM market through mass production of 12-layer HBM3E products.”
The company will begin mass production of 12-layer HBM3E, with shipments to customers expected in the fourth quarter after samples were provided to key customers.
Supply is tight
Memory leaders such as SK Hynix have been actively expanding HBM production capacity to meet the growing demand for artificial intelligence processors.
HBM requires more wafer capacity than conventional dynamic random access memory (DRAM), a type of computer memory used to store data, which SK Hynix said is also facing supply constraints.
"Investment demand is also rising to meet demand for conventional DRAM as well as HBM. HBM requires more wafer capacity than ordinary DRAM, so this year's capital expenditure is expected to be higher than we anticipated at the beginning of the year," SK Hynix said.
"While capacity is expected to increase next year due to expanded industry investment, a large portion of it will be used for HBM production. Therefore, the tight supply of conventional DRAM is likely to continue."
Daiwa Capital Markets' SK Kim said in a June 12 report that he expects "HBM and memory supply tightness to continue into 2025 as HBM production encounters bottlenecks."
"As such, we expect the favorable pricing environment to continue, and SK hynix to benefit from its competitiveness in HBM for AI graphics processing units and high-density enterprise SSDs (eSSDs) for AI servers, posting strong earnings in 2024-2025 and driving a re-rating of the stock," Kim said.
As large language models such as ChatGPT drive the explosive adoption of artificial intelligence, the supply of high-bandwidth memory chips is already stretched thin.
Analysts have warned that supplies of high-end memory chips will remain tight this year as the AI boom continues. SK Hynix and Micron said in May that their high-bandwidth memory chips were sold out for 2024 and that 2025 stock was nearly gone.
Large language models require large numbers of high-performance memory chips, which allow the models to remember details of past conversations and user preferences in order to generate human-like responses.
SK Hynix has dominated the high-bandwidth memory chip market, having been Nvidia's sole supplier of HBM3 chips before rival Samsung reportedly passed qualification for its HBM3 chips to be used in Nvidia processors for the Chinese market.
The company said it expects to begin shipping the next-generation 12-layer HBM4 in the second half of 2025.