AI


2023-10-30

[Insights] Apple’s Quiet Pursuit of AI and the Advantage in AI Subscription Models

According to Bloomberg, Apple is quietly catching up with its competitors in the AI field. Looking at Apple’s AI strategy, in addition to acquiring AI-related companies to obtain relevant technology quickly, Apple is now developing its own large language model (LLM).

TrendForce’s insights:

  1. Apple’s Low-Profile Approach to AI: Seizing the Next Growth Opportunity

As the smartphone market matures, brands are not only focusing on hardware upgrades, particularly in camera modules, to stimulate device replacement; many are also keen on introducing new AI functionalities in smartphones, aiming to reignite the segment’s growth potential. Some Chinese brands have already made notable progress in the AI field, especially in large language models.

For instance, Xiaomi introduced its large language model MiLM-6B, which ranked tenth on the C-Eval list (a comprehensive evaluation benchmark for Chinese-language models developed jointly by Tsinghua University, Shanghai Jiao Tong University, and the University of Edinburgh) and topped the list within its parameter-size category. Meanwhile, Vivo has launched its large model VivoLM, with the VivoLM-7B model securing second position on the C-Eval ranking.

As for Apple, while it may appear to have played a largely observational role as other Silicon Valley companies made their moves, such as OpenAI releasing ChatGPT and Google and Microsoft introducing AI versions of their search engines, the reality is that since 2018 Apple has quietly acquired more than 20 AI-related companies. Apple’s approach is characterized by extreme discretion, with only a few of these transactions publicly disclosing their final acquisition prices.

On another front, Apple has been discreetly developing its own large language model, called Ajax. It is reportedly spending millions of dollars a day on training, with the aim of making the model’s performance more robust than OpenAI’s GPT-3.5 and Meta’s LLaMA.

  2. Apple’s Advantage in Developing a Paid Subscription Model for Large Language Models Compared to Other Brands

The most common smartphone usage scenarios among general consumers today revolve around activities such as taking photos, communication, and information retrieval. While AI has the potential to enhance the user experience in some of these functions, none of them currently qualifies as an “essential AI feature.”

However, if a killer application involving large language models were to emerge on smartphones in the future, Apple is poised to have an exclusive advantage in establishing such a service as a subscription-based model. This advantage is due to recent shifts in Apple’s revenue composition, notably the increasing contribution of “Service” revenue.

In August 2023, Apple CEO Tim Cook highlighted in Apple’s fiscal third-quarter earnings report that Apple’s subscription services, which include Apple Arcade, Apple Music, iCloud, AppleCare, and others, had achieved record-breaking revenue, with more than 1 billion paid subscriptions.

In other words, compared with other smartphone brands, Apple is better positioned to monetize a large language model service through subscriptions, thanks to its already substantial base of paying subscribers. Other smartphone brands, lacking a similarly extensive subscriber base, may find it challenging to win consumer acceptance for a paid LLM subscription service.


2023-10-25

[News] Lenovo’s AI PC and ‘AI Twins’ Unveiled, Market Entry Expected After September 2024

At the global Tech World event on October 24th, Lenovo Group Chairman and CEO Yuanqing Yang presented AI-powered PCs and enterprise-level “AI Twins” (AI assistants) to a global audience, heralding a new dawn for personal computers. He revealed that AI PCs are slated to hit the market no earlier than September of the following year.

Yang said that AI PCs will go through a maturation process: they may start with around a 10% market share but are destined to become the norm, and he envisions a future in which every computer is an AI PC.

Regarding foundation models, Yang pointed out that some companies are hastily jumping on the bandwagon. However, he emphasized Lenovo’s commitment to not rush into trends and noted the drawbacks and vulnerabilities in China’s existing public foundation models, including concerns about personal privacy and data security. Lenovo’s focus is on establishing hybrid foundation models.

Given the need to compress models for device deployment, Lenovo is currently concentrating on research related to domain-adaptive model fine-tuning, lightweight model compression, and privacy protection techniques.
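
As a rough illustration of what “lightweight model compression” can mean in practice, the sketch below applies post-training dynamic quantization to a small stand-in network. It is a generic, hypothetical example assuming a recent PyTorch environment; it does not represent Lenovo’s actual models or methods.

```python
# Generic sketch of lightweight model compression via post-training dynamic
# quantization in PyTorch. The network below is a stand-in, not a product model.
import os
import tempfile

import torch
import torch.nn as nn

# Stand-in network: a small feed-forward block (purely illustrative).
model = nn.Sequential(
    nn.Linear(512, 1024),
    nn.ReLU(),
    nn.Linear(1024, 512),
)
model.eval()

# Convert Linear-layer weights from fp32 to int8; activations are quantized
# dynamically at inference time, so no calibration dataset is required.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def serialized_size_mb(m: nn.Module) -> float:
    """Save the model's state_dict to a temporary file and report its size in MB."""
    with tempfile.NamedTemporaryFile(delete=False) as f:
        path = f.name
    torch.save(m.state_dict(), path)
    size = os.path.getsize(path) / 1e6
    os.remove(path)
    return size

print(f"fp32 weights: {serialized_size_mb(model):.2f} MB")
print(f"int8 weights: {serialized_size_mb(quantized):.2f} MB")
```

Dynamic quantization of this kind typically shrinks the weight footprint of the quantized layers by roughly 4x (fp32 to int8), one of the simplest levers for fitting a model onto a device; production on-device pipelines generally combine it with the domain-adaptive fine-tuning and privacy-preserving techniques mentioned above.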

Moreover, Yang highlighted Lenovo’s earlier announcement of a US$1 billion investment in AI innovation over the next three years. However, he noted that this amount falls short of the actual financial demands, since virtually all of Lenovo’s business domains and fundamental services now involve AI and require substantial financial backing.

Lenovo’s Q1 earnings report from mid-August had unveiled the company’s plan to allocate an additional $1 billion over the next three years for expediting AI technology and applications. This encompasses the development of AI devices, AI infrastructure, and the integration of generative AI technologies like AIGC into industry vertical solutions.

Meanwhile, chip manufacturers such as Intel are joining forces with partners to expedite the development of the AI PC ecosystem, with the objective of bringing AI applications to over 100 million personal computers by 2025. This endeavor has drawn the interest of well-known international brands such as Acer, Asus, HP, and Dell, all of which have a positive outlook on the potential of AI-powered PCs. AI PCs are anticipated to be a pivotal factor in revitalizing the PC industry’s growth in 2024.

Currently, no brand is selling an AI PC in the true sense, but leading manufacturers have already revealed plans for related products, and the industry anticipates a substantial wave of AI PC releases in 2024.

(Image: Lenovo)

2023-10-23

Dive into the Future of AI at COMPUTEX 2024 – Registration Open!

This year’s COMPUTEX successfully attracted global attention, with the participation of 1,000 exhibitors and nearly 50,000 technology experts, startups, professional buyers, and international media. Building on the success of this year’s exhibition, COMPUTEX 2024 will revolve around Artificial Intelligence (AI) and is scheduled to take place from June 4 to 7 at the Nangang Exhibition Center, Halls 1 and 2. The organizer cordially invites interested exhibitors to register and join this global technology extravaganza.

COMPUTEX 2024: Building an AI Technology Ecosystem

According to market research firm IDC’s predictions, global spending on AI system-related software, hardware, and services is expected to exceed $300 billion by 2026. Additionally, the compound annual growth rate (CAGR) for 2022 to 2026 is projected to reach 26.5%. Given the continued expansion of the global AI computing trend into various industries, COMPUTEX 2024 focuses on six major themes: AI Computing, Advanced Connectivity, Future Mobility, Immersive Reality, Innovations, and Sustainability. New additions to the event include exhibition areas with forward-looking development potential, such as AI Computing and System Integration, Components and Battery Energy Storage, and Smart Mobility. We invite global technology enterprises to join us in co-creating the AI technology ecosystem.
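
For context, the cited growth rate can be back-solved into the 2022 baseline it implies. The short sketch below is illustrative arithmetic derived only from the figures quoted above, not additional IDC data.

```python
# Back-solve the 2022 baseline implied by the cited IDC figures:
# >$300B of AI system spending by 2026 at a 26.5% CAGR over 2022-2026.
spend_2026 = 300e9      # USD, "expected to exceed $300 billion by 2026"
cagr = 0.265            # 26.5% compound annual growth rate
years = 2026 - 2022     # four years of compounding

implied_2022 = spend_2026 / (1 + cagr) ** years
print(f"Implied 2022 baseline: ~${implied_2022 / 1e9:.0f}B")  # roughly $117B

# Year-by-year path implied by the same growth rate.
for offset in range(years + 1):
    value = implied_2022 * (1 + cagr) ** offset
    print(2022 + offset, f"~${value / 1e9:.0f}B")
```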

InnoVEX, the innovation and startup exhibition area, stands as COMPUTEX’s international benchmark platform for startups. It provides startup companies with opportunities to find manufacturing partners and international sales channels while hosting startup competitions and product presentations to increase visibility and garner support from international investors.

Diverse and Rich Exhibition Activities Receive Support from Technology Leaders

This year, COMPUTEX Keynotes & Forums made a significant impact by featuring heavyweight tech CEO speakers. The event created quite a buzz in the industry, drawing an attendance of 7,000 people on-site and an impressive 2.5 million online views. In 2024, the organizers will continue to host these speaking engagements, inviting tech giants and high-level executives to share their profound insights into future technology trends. These discussions delve deep into areas such as AI, system integration, data science, IoT, and more, fostering cross-disciplinary exchanges and knowledge sharing.

Throughout the exhibition, there will be additional events, including international press conferences, opening ceremonies, procurement meetings, themed guided tours, Happy Hour, and more. These activities are designed to enhance communication and interaction among exhibitors, international media, international buyers, and domestic industry professionals.

COMPUTEX 2024: Registration to Exhibit is Now Open for this Global Business Opportunity Event

Registration for COMPUTEX 2024 is now officially open. The exhibition is an outstanding platform for businesses interested in expanding into international markets, staying updated on the latest industry trends, and identifying potential supply chain partners. We warmly welcome participants from across today’s technology landscape, especially in the fields of AI technology, system integration solutions, smart technology, and startups.

(Image: COMPUTEX)

2023-10-19

TSMC Q323 Earnings Call Full Transcript: Question 6 to Question 10

Question 6. Brad Lin from Bank of America

Operator: Next one to ask questions, Brad Lin from Bank of America.

Brad: Thank you for taking my question. First of all, congrats on the strong results and also the impressive gross margin. I have two questions: one is on end-device AI, or edge AI, and the other is on CPO.

Brad: We appreciate management’s constructive comments on the growth outlook for edge AI. Beyond the interesting engagement with clients, what are the implications for the company’s wafer consumption?

Brad: And on the computing power and energy consumption angle, for end devices with additional AI functions, should we expect this to re-accelerate node migration for end devices? That’s my first question. Thank you.

Host: Okay, so Brad’s first question is about edge and end-device AI. He wants to know what the implications are for wafer consumption.

Host: And then, with the increasing need for energy efficiency and power, he is wondering whether this re-accelerates, in his words, node migration or the adoption of leading-edge technologies.

C.C.: Well, edge devices, including smartphones and PCs, are starting to incorporate AI functionality. We observe that neural engines are increasingly being added.

C.C.: So the die size will increase. Even if unit volumes do not increase dramatically, the die size increase is in the mid single digits so far, and I expect this kind of trend will continue.

C.C.: And so more and more AI applications will be incorporated into those kinds of edge devices, and that will require very power-efficient chips, especially for mobile devices.

C.C.: So from my own perspective, I do expect that our customers will move to leading-edge nodes more and more quickly to compete in the market.

Brad: Okay, thank you very much, C.C. A quick follow-up: the AI portion now accounts for a mid single-digit increase in die size for a chip. Do we see that enlarging to something like mid-teens, or even bigger, in the mid to long term?

Host: So Brad’s quick follow-up is: if the AI portion is around mid single digits now, can we expect mid-teens, or what type of percentage, in a few years’ time?

C.C.: Well, I will answer the question. Actually, we do see the increase in die size; we say it is mid single digit, but we cannot nail it down precisely, and I expect it to start to increase. Whether that will raise our forecast and our growth, it’s still too early to say at this moment.

Wendell: Yeah, but we’re still quantifying the impact from these developments. So we’re maintaining our previous statement that we expect it to grow, in five years, to about mid-teens of our revenue.

Host: Yeah, I think, Brad, very simply, as we said, for edge AI we do see some activity. It will drive silicon content, but this will occur over time, okay? And we don’t have any quantitative number to share. All right?

Brad: Got it, thank you.
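
To make the wafer-consumption point concrete, the back-of-the-envelope sketch below uses the standard gross-dies-per-wafer approximation with purely illustrative numbers (a 300 mm wafer and a hypothetical 100 mm² die; neither figure comes from the call). It shows why a mid single-digit die-size increase translates roughly one-for-one into wafer demand at constant unit volume, before yield effects.

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Standard approximation for gross dies per wafer (ignores yield and scribe lines)."""
    radius = wafer_diameter_mm / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

base_area = 100.0               # hypothetical baseline die area in mm^2
grown_area = base_area * 1.05   # a "mid single-digit" (5%) die-size increase

base_dpw = gross_dies_per_wafer(base_area)
grown_dpw = gross_dies_per_wafer(grown_area)

# Wafers needed for the same unit volume scale inversely with dies per wafer.
extra_wafers = base_dpw / grown_dpw - 1
print(f"dies/wafer: {base_dpw:.0f} -> {grown_dpw:.0f}; "
      f"wafer demand up ~{extra_wafers:.1%} at constant units")
```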

Host: Okay, what is your second question?

Brad: Okay, so the second question is on CPO. Basically, we have learned that TSMC is doing quite well and leading the industry in CPO, or so-called silicon photonics, and has introduced a platform with these technologies to clients.

Brad: So may we learn the opportunities and implications of the new technology for the industry and for the company? And should we expect the platform to offer an additional competitive advantage for TSMC in the mid to long run? Thank you.

Host: Brad, your second question is on silicon photonics. Is that correct?

Brad: Yes.

Host: So he wants to know our positioning or progress on silicon photonics. How important is this? And will this be a competitive advantage for TSMC going forward in the future?

C.C.: Okay, let me answer that question. Silicon photonics is actually growing in importance because a larger amount of data needs to be collected, processed, and transferred in an energy-efficient manner, and silicon photonics is the best fit for that role.

C.C.: And TSMC has been working on silicon photonics for years.

C.C.: And most importantly, we’re collaborating with multiple leading customers to support their innovations in this field. It takes a lot of time to develop the technology and to build the capacity.

C.C.: And when we move to volume production, we believe TSMC’s silicon photonics will be the best technology as customers roll out their innovations. But as I said, activity and demand are increasing only gradually as of today.

Brad: Got it, thank you very much.

Host: All right, thank you. Operator, can we move on to the next caller, please?

Question 7. Sunny Lin from UBS

Operator: Next one, we have Sunny Lin from UBS.

Sunny: Good afternoon. Thank you very much for taking my questions. My first question is on advanced packaging. Incrementally, we are hearing more customer interest in adopting it to achieve better heterogeneous integration. I want to get your thoughts on the potential impact of customers relying a bit more on packaging to improve system performance, and perhaps less on process migration, given cost considerations. Meanwhile, SoIC was introduced quite a while ago, yet customer adoption still seems limited at this point. So, when should we expect a more meaningful pickup of SoIC, and what could be the major catalyst? Thank you.

Host: Okay. So, Sunny, sorry, I may have missed a little bit of the first part, but I think her question is on overall advanced packaging, looking at this trend and the move toward more heterogeneous integration. What are the cost implications, and how does advanced packaging go together with process technology? And then also a question about the progress of SoIC. Is that correct, Sunny?

Sunny: Well, maybe if I may clarify a bit: for the first part, I wonder if customers may consider relying a bit more on packaging while slowing down a bit on process migration because of the increasing cost.

Host: Okay, so she’s asking will customers, because of the increasing cost of the process technologies, will customers rely more on advanced packaging as a result?

C.C.: Let me answer that. It’s not because of the increasing cost of the more advanced nodes. It’s actually that our customers are trying to maximize system performance. That’s the major consideration, and it includes speed improvements and reductions in power consumption, all put together. Cost may also be part of the consideration, which we are aware of. So more and more customers are moving into the very advanced technology nodes and starting to adopt chiplet approaches. No matter what, TSMC provides the industry-leading solutions in both leading-edge technology and advanced packaging technology, and we work with our customers so that their products achieve the best system performance.

C.C.: And the other one is, you were asking about SoIC and when it will become high volume and more substantial revenue for TSMC. It’s coming. Actually, customers are already ready to announce their new products adopting it, and I expect that, starting from now and into next year, SoIC will generate revenue and become one of the faster-growing advanced packaging solutions over the next few years.

Sunny: Got it, thank you very much. If I may, a quick follow-up: three months ago, you had a target to double your CoWoS capacity, and just now you mentioned AI demand continues to surprise on the upside. I wonder if there’s any update on your CoWoS capacity expansion?

Host: Okay, so I will take this as your second question, Sunny. She’s asking about CoWoS expansion. We said three months ago that we will double the capacity. Can we provide an update on the overall CoWoS capacity and, since capex and capacity go hand in hand, what is our plan?

C.C.: Well, Sunny, last time we said that we will double our CoWoS capacity. We are working very hard to increase the capacity by more than double, but today we are limited by our suppliers’ capability, or their capacity.

C.C.: So we still maintain that we will double our CoWoS capacity by the end of 2024, but total output will actually more than double from 2023 to 2024 because of very high demand from our customers. As this kind of trend continues, we will keep increasing our CoWoS capacity to support our customers, even into 2025.

Host: Okay, Sunny, does that answer your question?

Sunny: Okay, thank you.

Host: Operator, can we move on to the next caller, please?

Question 8. Mehdi Hosseini from SIG

Operator: Next one, please. Welcome, Mehdi Hosseini from SIG.

Mehdi: Yes, thanks for taking my question. I understand there are a number of new products that you’re ramping into year end and into the first half of 2024 for various markets, and I want to understand how the ramp of these new products is going to impact seasonality. Could we see a scenario where, in the first half, the ramp of these new products, especially at the leading edge, somewhat offsets the seasonal factors? Any thoughts there? And I’ll have a follow-up.

Host: All right, Mehdi. Mehdi’s first question is about new products, which of course, as customer products, we don’t comment on, but he said we’re ramping products into the second half. So how will this ramp of new products carry into the first half of 2024, and can it offset or mitigate some of the seasonality?

Mehdi: Yeah, let me rephrase that: the contribution of customers’ new products, and how that could impact, or offset, seasonal factors?

Wendell: Yeah, Mehdi, I don’t think we can comment on specific customer products, but I can tell you that we’re not seeing any dramatic change in the seasonality as of now.

Mehdi: Okay, because I was looking at your calendar year 2023, and given your Q4 guide, you’re actually doing better than what you guided three months ago. Three months ago, you said revenues could be down 10% in US dollar terms, and now they could actually be down by a single digit. Is that a combination of a stronger new product ramp and better pricing? Is that a fair assessment?

Host: Okay, so Mehdi rightly notes that three months ago we said this year’s revenue would decline around 10% in US dollar terms. Now, with the fourth-quarter guidance, the implied full year is slightly better, so he wants to know what is behind this.

C.C.: Well, let me give you one simple reason: our ramp-up of N3. Because demand for N3 is strong, we ramped up quickly to meet customers’ demand, so the final result is better than we expected three months ago.

Host: Yeah, and we have also said that the strong ramp of N3 will continue in the next year, okay? That’s about all the seasonality we can give.

Mehdi: Gotcha, okay. Then, for my second question, I just want to better understand your view on your customers’ inventory correction. We’re reaching the bottom, but we don’t have visibility on how quickly they’re going to replenish inventory, and the slope of the recovery is still not clear. Did I understand you correctly?

Host: So, for Mehdi’s second question, he would like to clarify: are we saying that customers’ inventory is reaching or approaching a bottom, but the slope of the recovery is not clear? Is that what we are saying?

C.C.: Okay, I’ll answer the question. Actually, in these past couple of months, we have started to see demand stabilize in the PC and smartphone end markets. In fact, we have seen some urgent POs asking for more devices to be shipped to meet demand. That gave us a hint that their inventory control has already become healthier than we thought. The uncertain macro environment will probably continue, but our expectation is that inventories are very close to a healthy condition. That’s why we say we can expect 2024 to be a healthy growth year for TSMC. Okay, Mehdi, did I answer your question?

Mehdi: Great, thank you.

Host: Okay, in the interest of time, thank you, Mehdi. We’ll take questions from the last two participants, please.

Question 9. Krish Sankar from TD Cowen

Operator: Next one to ask a question is Krish Sankar from TD Cowen.

Krish: Yeah, hi, thanks for taking my question. I have two of them. The first one is on gross margins: when do you expect N3 to reach the corporate average gross margin? And as you look into next year, as more mature-node capacity comes online across the industry, how do you think about mature-node gross margins? And I have a follow-up after that.

Host: Okay, so Krish’s first question is on gross margin. When do we expect three nanometer or N3, I should say, to reach the corporate average gross margin? And how do we see the gross margin trend for the more mature nodes?

Wendell: Yeah, Krish, in the past our leading nodes normally reached the corporate average gross margin in about eight quarters. But as we progress to more advanced leading nodes, it will become more and more challenging, for several reasons. First of all, our corporate margin is higher than before. Secondly, the leading node, now N3, is, as I just said, becoming more and more complex.

And also, in the past few years, unexpected inflation pressure has contributed to higher costs for N3. So it’s going to be pretty challenging for future leading nodes to reach the corporate margin in the same timeframe as before. As for the mature nodes, I can tell you that their gross margins are concentrated around the corporate average in a pretty narrow band, because we focus on specialty technology; it’s not commodity capacity.

Krish: Okay, yeah, that’s very helpful. Thank you for that. As a follow-up on Arizona, you mentioned that you’d hired about 1,100 local employees. I’m curious, is that enough critical mass for you to start 4-nanometer production, or do you have another target headcount before production can actually start, given that you’re still maintaining the output target of the first half of 2025? Thank you.

Host: Thank you, Krish. So Krish’s second question is about our first fab in Arizona. He notes that we have said we hired 1,100 local employees. So his question is, is this enough critical mass or enough people basically to support the ramp of the first fab as we plan, as we said today, in first half 25?

C.C.: Of course. We continue to hire local talent to join TSMC in Arizona, so when we start volume production, we are confident that we will have enough resources to support our ramp-up in Arizona.

Host: Okay. All right.

Krish: Thank you.

Host: Thank you, Krish. Operator, can we move on to the last participant, please?

Question 10. Charles Shi from Needham

Operator: Yes, the last one to ask a question, Charles Shi from Needham.

Charles: Hi, thank you for squeezing me in. First off, I really want to congratulate TSMC on delivering good results for Q3 and very good guidance for Q4. But I want to call out the reported 5-nanometer revenue in the third quarter: it looks like you are showing some really good counter-cyclical strength, and probably a record high. I want to understand the rebound in the 5-nanometer business in Q3. Is that going to continue in the following quarters, and what’s behind it?

And relative to, let’s say, your expectations three to six months ago, when you were reducing your 2023 outlook, is 5 nanometer doing better than expected? And how has demand trended for 5 nanometer over the last few months? Thank you.

Host: Okay, so Charles’ first question is about 5 nanometer. He notes a very strong sequential revenue increase in the third quarter, so he’s wondering what is driving this, and what the outlook is for the next three to six months for 5 nanometer specifically.

Wendell: Yeah, Charles, I can share with you that the increase in N5 revenue in the third quarter mainly comes from two platforms, HPC and smartphone. HPC also includes AI-related demand; smartphone is basically some customers’ product seasonality. Outlook-wise, I’m not going to share that with you now, but we will tell you in January what the actual next-quarter N5 revenue will be.

Host: The overall revenue, that is. We don’t provide revenue by process node, okay? What’s your second question, Charles?

Charles: Yeah, thanks, Jeff. The other question is about CapEx. It sounds like you are expecting that CapEx in absolute dollars may still grow going forward. I know that’s a long-term comment, but looking at the near term, TSMC’s CapEx seems to be running at about 7 billion U.S. dollars per quarter in the second half of 2023, which is roughly a 28 billion annualized run rate. But if total CapEx for 2024 is expected to grow in dollar terms over 2023, it seems like you are planning a CapEx ramp in 2024. Is that the right way to think about it, with 7 billion per quarter being pretty much a bottom-level run rate for TSMC’s CapEx at this point? Thank you.

Host: All right, so Charles’ second question is also on CapEx. Basically, given the guidance, he notes that our CapEx is running at about a US$7 billion quarterly run rate. Although we do not comment on 2024, he’s assuming next year’s CapEx dollar amount is going to increase; but if we’re running at a 7 billion run rate, does that imply 28 billion, and how should he reconcile this?

Wendell: Charles, every year CapEx is invested based on future growth opportunities; we invest to capture those opportunities. It’s too early to talk about 2024, really. We’ll share the guidance with you at the January quarterly release.

Host: Okay. All right, thank you, Charles. Thank you, everyone. This concludes our Q&A session.

*The entire content of the Transcript was assisted by AI tools and edited by human correction for the purpose of industry information dissemination and reference only. It should not be used as an investment guideline, and official announcements should be the primary source for detailed information.

(Photo Credit: TSMC)

2023-10-18

[News] Foxconn Aims to Create AI Factory for Smart Cities and EVs with NVIDIA

Foxconn’s annual Technology Day kicked off today at the Nangang Exhibition Center, following the tradition of celebrating founder Terry Gou’s birthday. Although Mr. Gou was not present at the event, NVIDIA CEO Jensen Huang made a surprise appearance, introducing the production version of Model B and highlighting its appeal to young couples.

Jensen Huang announced that he and Foxconn Chairman Young Liu are collaborating to establish an AI Factory. He shared hand-drawn sketches, emphasizing that they are adopting a novel approach to software development that enables computers to create software based on human preferences and experiences. For a computer to learn effectively, it must have access to abundant data from which it can derive valuable insights and experience.

He highlighted the vital role of powerful computers in facilitating AI learning. With NVIDIA’s assistance, Foxconn may collect enough data for AI to process and shape network models, paving the way for innovative intelligence.

Jensen Huang then emphasized that this groundbreaking system is set to empower any factory or company, with Foxconn’s cutting-edge electric vehicle Model B enabling interaction between drivers and passengers. The AI Factory will offer a wide array of tools, including software solutions, to enhance the overall quality of life.

Chairman Liu reiterated their determination to bring the AI Factory to fruition alongside Foxconn’s three core platforms: Smart City, Smart Manufacturing, and Smart EV, all of which will be driven by the AI Factory.

Huang noted that these efforts will all be fueled by AI intelligence, ultimately creating substantial value. In a light-hearted tone, he concluded, “Let’s meet at the night market!”

Adding an interesting twist to the day’s events, Foxconn’s Tech Day featured three Foxconn N7 electric cars in deep blue, light blue, and white. Notably, the white vehicle showcased a handwritten message from Jensen Huang that read, “To Young and my friend at Foxconn, beautiful and amazing EVs! Jensen Huang.”

(Photo credit: Foxconn’s Stream)
