NVIDIA's stellar AI chip performance in 2023: Details on revenue, tech and market

Date: 01/09/2023
NVIDIA has turned goldminer with its H100 and A100 AI-capable GPU chips. In the second quarter of 2023, AI-focused cloud data centres bought and stockpiled these two cutting-edge AI GPU chips from NVIDIA like never before.
Along with well-established players such as Google, Amazon AWS and Tesla, a good number of new entrants into this business are also buying H100 and A100 chips from NVIDIA in huge numbers.

The H100 is estimated to cost around US$30,000 each, whereas the A100 is estimated to cost around US$10,000 each. These chips are made on TSMC's 7 nm (A100) and 5 nm (H100) nodes. Weight for weight, these chips cost far more than gold, and the AI-focused data centre industry is stocking them the way cash-rich people buy gold as an investment.
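
To make that gold comparison concrete, here is a rough back-of-the-envelope sketch. The die area and chip price come from this article; the wafer thickness, silicon density and gold price are outside assumptions at roughly 2023 levels, so treat the output as illustrative only.

```python
# Back-of-the-envelope: price per gram of an A100 die vs. gold.
# Die area and chip price are from this article; thickness, silicon density
# and gold price are assumed approximations at roughly 2023 levels.
DIE_AREA_MM2 = 826             # A100 die area (from the article)
DIE_THICKNESS_MM = 0.775       # assumed standard 300 mm wafer thickness
SILICON_DENSITY_G_PER_CM3 = 2.33
CHIP_PRICE_USD = 10_000        # estimated A100 price (from the article)
GOLD_PRICE_USD_PER_G = 62      # assumed approximate 2023 gold price

die_volume_cm3 = (DIE_AREA_MM2 * DIE_THICKNESS_MM) / 1000.0  # mm^3 -> cm^3
die_mass_g = die_volume_cm3 * SILICON_DENSITY_G_PER_CM3      # roughly 1.5 g
chip_usd_per_gram = CHIP_PRICE_USD / die_mass_g

print(f"A100 die mass: ~{die_mass_g:.1f} g")
print(f"A100 price per gram: ~US${chip_usd_per_gram:,.0f}")
print(f"Gold price per gram: ~US${GOLD_PRICE_USD_PER_G}")
print(f"Chip vs gold, per gram: ~{chip_usd_per_gram / GOLD_PRICE_USD_PER_G:,.0f}x")
```

Under these assumptions the bare silicon alone works out to thousands of dollars per gram, two orders of magnitude above gold.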

So in that sense, we can call NVIDIA a 'semiconductor goldminer' and TSMC a 'semiconductor goldmine'. Let's look into what is driving this demand, who is buying these chips, and a little of NVIDIA's history.

These NVIDIA chips process and render graphics and compute data fast and well. Other chips in the market may perform better than NVIDIA's on paper, but NVIDIA's chips have proven their ability with the support of a complete development ecosystem, including the software required to develop AI algorithms and systems. Intel, AMD and many highly capable start-ups in this area may soon prove their chips can outperform NVIDIA's, but before that happens NVIDIA will reap a lot more revenue from the present market wave.

Let's first look at the two chips, the NVIDIA A100 and NVIDIA H100, which are in such heavy demand in the AI market.

The NVIDIA A100 has 54 billion transistors on a silicon die of 826 mm² and can deliver up to 5 petaflops of AI performance. It has Tensor Cores, up to 128 streaming multiprocessors (SMs) and 8,192 FP32 CUDA cores.

The NVIDIA H100 packs 80 billion transistors and has up to 144 streaming multiprocessors. Each SM is composed of up to 128 FP32 units, which gives a total of 18,432 CUDA cores. Other specs include:
4 fourth-generation Tensor Cores per SM, 528 per GPU
80 GB of HBM3 in 5 HBM3 stacks, with 10 512-bit memory controllers
50 MB of L2 cache
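
As a quick check on those core counts, the minimal sketch below multiplies SMs by FP32 units per SM. The H100 figure of 128 FP32 units per SM is stated above; the A100's 64 per SM is inferred from its quoted totals (128 SMs, 8,192 cores) rather than taken from the article.

```python
# Total FP32 CUDA cores = SM count x FP32 units per SM.
# H100: 144 SMs x 128 (from the article). A100: 128 SMs x 64 (the 64 is
# inferred from the quoted totals of 128 SMs and 8,192 CUDA cores).
chips = {
    "A100": {"sms": 128, "fp32_per_sm": 64},
    "H100": {"sms": 144, "fp32_per_sm": 128},
}

for name, spec in chips.items():
    cores = spec["sms"] * spec["fp32_per_sm"]
    print(f"{name}: {spec['sms']} SMs x {spec['fp32_per_sm']} FP32 units = {cores:,} CUDA cores")
```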

Pic: NVIDIA A100

Pic: NVIDIA H100


These two chips can be compared with others in the market in terms of available logic processing features, bandwidth support, innovative processor cores and transistor count, and there are a good number of alternative GPUs and CPUs available in the market now.

The problem is that the industry is well adapted to using the A100 and H100 for its AI workloads, with software developers and the development ecosystem fully invested in NVIDIA-based systems. In this highly competitive industry, the time spent building new AI systems around better non-NVIDIA GPU/AI processors carries a significant cost.

For NVIDIA, 2023 is a spectacular year in revenue performance. Its revenue for the second quarter of fiscal 2024 (quarter ending 30 July 2023) reached US$13.51 billion, up 101% from the second quarter of fiscal 2023 (a year earlier) and up 88% from the first quarter of fiscal 2024 (quarter ending 30 April 2023). Looking specifically at data centre revenue, it is a massive US$10.32 billion in the second quarter of fiscal 2024, up 141% from the first quarter of fiscal 2024 and up 171% from a year earlier. NVIDIA says a new computing era has begun, and that is very much true.
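As a consistency check, the quoted growth rates can be used to back out the implied prior-period revenues. The figures below are derived from this article's rounded percentages, not taken from NVIDIA's filings, so they are approximate.

```python
# Back out the implied prior-period revenue from the quoted growth rates.
# Inputs come from this article; results are approximate because the
# growth percentages are rounded.
def implied_base(current_usd_bn, growth_pct):
    """Revenue the earlier period must have had to yield the quoted growth."""
    return current_usd_bn / (1 + growth_pct / 100)

q2_fy24_total = 13.51        # US$ billion, quarter ending 30 July 2023
q2_fy24_datacentre = 10.32   # US$ billion, data centre segment

print(f"Implied Q2 FY2023 total revenue:   ~${implied_base(q2_fy24_total, 101):.2f}B")
print(f"Implied Q1 FY2024 total revenue:   ~${implied_base(q2_fy24_total, 88):.2f}B")
print(f"Implied Q1 FY2024 data centre:     ~${implied_base(q2_fy24_datacentre, 141):.2f}B")
print(f"Implied year-ago data centre:      ~${implied_base(q2_fy24_datacentre, 171):.2f}B")
```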

The Financial Times reported that NVIDIA plans to triple production of its H100 chips, from around 500,000 this year to between 1.5 million and 2 million in 2024.

Looking at market forecasts for AI chips, Gartner forecasts the AI semiconductor market to generate revenue of US$53.445 billion in 2023, growth of 20.9% over 2022, and to reach US$67.148 billion in 2024, a growth rate of 25.6% over 2023.
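Those growth rates follow from the revenue figures themselves; the short sketch below re-derives them, with the 2022 base computed from the 20.9% figure rather than quoted from Gartner.

```python
# Re-derive the Gartner growth figures from the quoted revenue numbers.
rev_2023_bn = 53.445
rev_2024_bn = 67.148
growth_2023_pct = 20.9  # quoted growth over 2022

implied_2022_bn = rev_2023_bn / (1 + growth_2023_pct / 100)  # derived, ~US$44.2B
growth_2024_pct = (rev_2024_bn / rev_2023_bn - 1) * 100      # ~25.6%

print(f"Implied 2022 AI semiconductor revenue: ~US${implied_2022_bn:.1f}B")
print(f"2024 growth over 2023: {growth_2024_pct:.1f}%")
```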

Generative AI is a clear driver of this market, and GPUs are preferred over CPUs for handling AI workloads. Generative AI revenue is forecast to increase at a compound annual growth rate (CAGR) of 80% between 2022 and 2027, growing from $1.75 billion in 2022 to $33 billion by 2027, as per GlobalData.
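The CAGR figure checks out against the standard compound-growth formula, as the minimal sketch below shows using only the numbers quoted above.

```python
# Check the GlobalData forecast: $1.75B (2022) -> $33B (2027) over 5 years.
start_bn, end_bn, years = 1.75, 33.0, 5

implied_cagr = (end_bn / start_bn) ** (1 / years) - 1     # ~0.799, i.e. ~80%
value_at_80_pct = start_bn * (1 + 0.80) ** years          # ~$33B

print(f"Implied CAGR 2022-2027: {implied_cagr * 100:.1f}%")
print(f"$1.75B compounded at 80% for {years} years: ~${value_at_80_pct:.1f}B")
```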

A more detailed study and full report on this topic of AI semiconductors enabling NVIDIA's meteoric quarterly growth is available for US$25 (Rupees 2,000/-); please email nsr@emittsolutions.com to procure the detailed study.