Articles


2024-07-26

[News] SK hynix Approves USD 6.8 Billion Investment for its First Fab in Yongin, Targeting Next-gen DRAMs

South Korean memory giant SK hynix announced on July 26th that, following a board resolution the same day, it has decided to invest about 9.4 trillion won (approximately USD 6.8 billion) in building the first fab and business facilities of the Yongin Semiconductor Cluster, according to its press release.

The company plans to start construction of the fab in March next year and complete it in May 2027, while the investment period will run from August 2024 to the end of 2028, SK hynix states.

The company will produce next-generation DRAMs, including HBM, at the first fab, and will prepare for production of other products in line with market demand at the time of completion.

“The Yongin Cluster will be the foundation for SK hynix’s mid- to long-term growth and a place for innovation and co-prosperity that we are creating with our partners,” said Vice President Kim Young-sik, Head of Manufacturing Technology at SK hynix. “We want to contribute to revitalizing the national economy by successfully completing the large-scale industrial complex and dramatically enhancing Korea’s semiconductor technology and ecosystem competitiveness,” according to the press release.

The Yongin Cluster, which will be built on a 4.15 million square meter site in Wonsam-myeon, Yongin, Gyeonggi Province, is currently under site preparation and infrastructure construction. SK hynix has decided to build four state-of-the-art fabs that will produce next-generation semiconductors, and a semiconductor cooperation complex with more than 50 small local companies.

After the construction of the 1st fab, the company aims to complete the remaining three fabs sequentially to grow the Yongin Cluster into a “Global AI semiconductor production base,” the press release notes.

The 9.4 trillion won investment approved this time includes the various construction costs necessary for the initial operation of the cluster, including auxiliary facilities, business support buildings, and welfare facilities, along with the first fab.

In addition, SK hynix plans to build a “Mini-fab” within the first phase to help small businesses develop, demonstrate, and evaluate technologies. Through the Mini-fab, the company will provide small business partners with an environment similar to the actual production site so that they can refine their technologies as much as possible.


(Photo credit: SK hynix)

Please note that this article cites information from SK hynix.

2024-07-26

[News] Tencent Cloud Releases Self-developed Server OS, Supporting China’s Top Three CPU Brands

Due to challenges in exporting high-performance processors based on x86 and Arm architectures to China, the country is gradually adopting domestically designed operating systems.

According to industry sources cited by Tom’s Hardware, Tencent Cloud recently launched the TencentOS Server V3 operating system, which supports China’s three major processors: Huawei’s Kunpeng CPUs based on Arm, Sugon’s Hygon CPUs based on x86, and Phytium’s FeiTeng CPUs based on Arm.

The operating system optimizes CPU utilization, power consumption, and memory usage. To tune the OS and domestic processors for data center workloads, Tencent has collaborated with Huawei and Sugon to develop a high-performance domestic database platform.

Reportedly, TencentOS Server V3 can run GPU clusters, aiding Tencent’s AI operations. The latest version of the operating system fully supports NVIDIA GPU virtualization, enhancing processor utilization for resource-intensive services such as Optical Character Recognition (OCR). This approach reportedly reduces the cost of purchasing NVIDIA products by nearly 60%.

TencentOS Server is already running on nearly 10 million machines, making it one of the most widely deployed Linux operating systems in China. Other companies, such as Huawei with its openEuler, have also developed their own operating systems.


(Photo credit: Tencent Cloud)

Please note that this article cites information from Tom’s Hardware, Liberty Times Net, and EE Times China.

2024-07-26

[News] GB200 AI Servers Expected to Generate USD 210 Billion in Annual Revenue, Boosting Supply Chain Growth

According to a report from Wccftech, shipments of NVIDIA’s Blackwell-architecture GB200 AI servers have increased significantly amid soaring market demand.

As NVIDIA claims, the Blackwell series is expected to be its most successful product. Industry sources cited by Wccftech indicate that NVIDIA’s latest GB200 AI servers are drawing significant orders, with strong demand projected to continue beyond 2025. This ongoing demand is enabling NVIDIA to secure additional orders as its newest AI products remain dominant.

The increasing demand for NVIDIA GB200 AI servers has led to better-than-expected revenue performances from Taiwanese suppliers such as Quanta, Foxconn, and Wistron. Reportedly, NVIDIA is expected to ship 60,000 to 70,000 GB200-based AI servers. Each server is estimated to cost between USD 2 million and USD 3 million, which at the upper end of both ranges amounts to approximately USD 210 billion in annual revenue from the Blackwell servers alone.
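A minimal sketch of the arithmetic behind that estimate, using only the shipment and unit-price ranges reported above (illustrative back-of-the-envelope figures, not official numbers):

```python
# Illustrative estimate of annual GB200 server revenue, based solely on
# the shipment and unit-price ranges reported in the paragraph above.
units_low, units_high = 60_000, 70_000        # estimated servers shipped per year
price_low, price_high = 2_000_000, 3_000_000  # estimated cost per server, USD

revenue_low = units_low * price_low           # ~USD 120 billion
revenue_high = units_high * price_high        # ~USD 210 billion

print(f"Estimated annual revenue: USD {revenue_low / 1e9:.0f}B to {revenue_high / 1e9:.0f}B")
# The ~USD 210 billion headline figure corresponds to the upper end of both ranges.
```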

NVIDIA’s GB200 AI servers are available in NVL72 and NVL36 configurations, with the less powerful NVL36 seeing greater preference as a growing number of AI startups opt for the more financially feasible option.

With Blackwell debuting in the market by Q4 2024, NVIDIA is projected to achieve significant revenue figures, potentially surpassing the performance of the previous Hopper architecture. Furthermore, NVIDIA has reportedly placed orders for around 340,000 CoWoS advanced packaging units with TSMC for 2025.

Notably, according to the industry sources previously cited in a report from Economic Daily News, TSMC is gearing up to start production of NVIDIA’s latest Blackwell platform architecture graphics processors (GPUs) on the 4nm process.

The same report further cited sources revealing that international giants such as Amazon, Dell, Google, Meta, and Microsoft will adopt NVIDIA Blackwell architecture GPUs for AI servers. As demand exceeds expectations, NVIDIA has been prompted to increase its orders with TSMC by approximately 25%.


(Photo credit: NVIDIA)

Please note that this article cites information from Wccftech and Economic Daily News.

2024-07-26

[News] Battle between Memory Giants Heats Up in 2H24 as Samsung and SK hynix Advance in HBM3/HBM3e

As SK hynix and Samsung release their financial results on July 25th and July 31st, respectively, their progress on HBM3 and HBM3e has also been brought into the spotlight. Earlier this week, Samsung was said to have finally passed NVIDIA’s qualification tests for its HBM3 chips. With the Big Three in the memory sector now almost on the same page, the HBM3/HBM3e battle is expected to intensify in the second half of 2024.

Samsung Takes a Big Leap

According to reports from Reuters and the Korea Economic Daily, Samsung’s HBM3 chips have been cleared by NVIDIA and will initially be used exclusively in the AI giant’s H20, a less advanced GPU tailored for the Chinese market. Citing sources familiar with the matter, the reports note that Samsung may begin supplying HBM3 to NVIDIA as early as August.

However, as the U.S. is reportedly considering implementing new trade sanctions on China in October, looking to further limit China’s access to advanced AI chip technology, NVIDIA’s HGX-H20 AI GPUs might face a sales ban. Whether and to what extent Samsung’s momentum would be affected remains to be seen.

SK hynix Expects HBM3e to Account for Over 50% of Total HBM Shipments

SK hynix, the current HBM market leader, has expressed optimism about defending its throne. According to a report by Business Korea, citing Kim Woo-hyun, vice president and chief financial officer of SK hynix, the company significantly expanded its HBM3e shipments in the second quarter as demand surged.

Moreover, SK hynix reportedly expects its HBM3e shipments to surpass those of HBM3 in the third quarter, with HBM3e accounting for more than half of the total HBM shipments in 2024.

SK hynix started mass production of the 8-layer HBM3e for NVIDIA in March, and it is now also confident about its progress on the 12-layer HBM3e. According to Business Korea, the company expects to begin supplying 12-layer HBM3e products to its customers in the fourth quarter. In addition, it projects the supply of 12-layer products to surpass that of 8-layer products in the first half of 2025.

Micron Expands at Full Throttle

Micron, on the other hand, reportedly started mass production of 8-layer HBM3e in February, according to a previous report from Korea Joongang Daily. The company is also reportedly planning to complete preparations for mass production of 12-layer HBM3e in the second half of the year and supply it to major customers like NVIDIA in 2025.

Targeting a 20% to 25% share of the HBM market by 2025, Micron is said to be building a pilot production line for HBM in the U.S. and is considering producing HBM in Malaysia for the first time to capture more demand from the AI boom, a report by Nikkei notes. Micron’s largest HBM production facility is located in Taichung, Taiwan, where expansion efforts are also underway.

Earlier in May, a report from Japanese media outlet The Daily Industrial News also indicated that Micron planned to build a new DRAM plant in Hiroshima, with construction scheduled to begin in early 2026 and completion of the plant buildings and first tool-in targeted for the end of 2027.

TrendForce’s latest report on the memory industry reveals that DRAM revenue is expected to see a significant increase of 75% in 2024, driven by the rise of high-value products like HBM. As the market keeps booming, will Samsung come from behind and take the lead in the HBM3e battleground? Or will SK hynix defend its throne? The progress of 12-layer HBM3e may be a key factor to watch.


(Photo credit: Samsung)

Please note that this article cites information from Reuters and Business Korea.

2024-07-26

[News] SK hynix Financial Report Exceeds Expectations, with Predicted Memory Capacity Allocation Benefiting Taiwanese Manufacturers

Global HBM leader, South Korea’s SK hynix, announced its financial results for the previous quarter on July 25, which exceeded market expectations. According to a report from Economic Daily News, the company also announced a full-scale effort to boost production of high-bandwidth memory (HBM) for AI, with this year’s capital expenditure expected to surpass initial projections. Additionally, more capacity will be allocated for HBM production.

Industry sources cited by the report also indicate that the major global memory companies are expanding their HBM production capacity by converting existing DRAM capacity to HBM. For Taiwanese manufacturers, this shift will tighten the supply of DDR4 and DDR5 DRAM, positively impacting market conditions.

Previously, sources cited by the Economic Daily News indicated that global memory leader Samsung plans to allocate about 30% of its existing DRAM capacity to HBM production. Now, with SK hynix reportedly making similar plans, Taiwanese DRAM-related companies such as Nanya Technology and ADATA may benefit in the future.

Nanya Technology reportedly believes that the DRAM market has improved significantly due to the production cuts by the three major memory manufacturers (Samsung, SK hynix, and Micron) in the second half of last year, combined with the strong demand for HBM driven by generative AI. This chain reaction is spreading to various types of DRAM, and the company expects to see clear operational improvements soon.

SK hynix announced yesterday that its Q2 revenue increased by 125% year-on-year to KRW 16.4 trillion (USD 11.9 billion), setting a new record. Operating profit reached KRW 5.47 trillion, the highest since Q3 2018, significantly better than the KRW 2.9 trillion loss in the same period last year. The operating margin was 33%, exceeding expectations, mainly due to a more than 250% surge in HBM sales and an overall increase in DRAM and NAND chip prices.

SK hynix plans to begin mass production of the next-generation 12-layer HBM3e chips this quarter, enhancing its competitive edge over rivals Samsung and Micron in the design and supply of advanced memory for NVIDIA’s AI accelerators. HBM3e is expected to account for about half of all HBM chip sales this year. Additionally, capital expenditure for this year is likely to exceed initial expectations.

SK hynix predicts that the overall memory market will continue to grow in the second half of the year, with DRAM and NAND chip supply becoming tighter and demand for AI servers remaining strong.


(Photo credit: SK hynix)

Please note that this article cites information from Economic Daily News.