Research Reports

[Selected Topics] Era of Machine Data: Analysis on Training and Inference Applications of AI Servers

Last Modified

2023-05-08

Update Frequency

Not periodical

Format

PDF



Overview

Summary
The expectation that AI will deliver immense benefits to the commercial market has prompted major suppliers to engage in AI training, which in turn has generated demand for hardware. AI is currently in the mass-training phase, so AI server applications remain focused on training, with GPUs and GPGPUs serving as the computing cores, while ASICs and FPGAs facilitate inference through algorithm-specific designs that improve overall computing performance.

Table of Contents
1. Perpetual AI Training in the Era of Machine Data
  (1) Existing AI Servers Primarily Focused on Training
  (2) FPGA and ASIC Could Assist with Inference

2. Analysis on Architectures and Trends of AI Servers
  (1) GPU Serves as the Core for AI Servers
  (2) ASIC and FPGA to Accelerate Inference Applications
  (3) Mass Data Prompts Storage Demand
  (4) DPU Emerges to Strengthen Resource Allocation
  (5) High Energy Consumption Amplifies Demand for Cooling

3. TRI’s View
  (1) Fast Iterations of AI Models Actuate Advancement of Server Modules
  (2) AI Servers Currently Focused on Training with GPU Serving as a Key Factor

<Total Pages:12>





USD 200
