2:18 PM · 9 August 2023

Nvidia announced brand-new, cost-effective AI chips!


Rising costs of AI: is it actually a profitable business?

AI is currently a significant trend, with millions of users accessing ChatGPT daily. Given this high demand, it is unsurprising that ChatGPT incurs substantial operational costs: running it is estimated to cost OpenAI about $700,000 each day, or roughly 36 cents per query. Despite skyrocketing interest from businesses and individuals, offering the chatbot under a largely free-to-use business model could make it challenging to generate profits. These expenses are compounded by the need for AI companies, including Microsoft, to purchase GPUs in large quantities from manufacturers such as Nvidia. To support its commercial ambitions, OpenAI is estimated to need an additional 30,000 Nvidia GPUs this year, on top of the 10,000-15,000 GPUs currently in use.
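
As a rough illustration, the estimates quoted above imply a daily query volume on the order of two million. The short sketch below runs the back-of-envelope arithmetic; the numbers are the third-party estimates cited in this article, not official OpenAI figures.

```python
# Back-of-envelope arithmetic based on the cost estimates quoted above.
# These are third-party estimates, not official OpenAI disclosures.

daily_cost_usd = 700_000      # estimated daily cost of running ChatGPT
cost_per_query_usd = 0.36     # estimated cost per query

# Implied number of queries served per day
implied_queries_per_day = daily_cost_usd / cost_per_query_usd
print(f"Implied queries per day: {implied_queries_per_day:,.0f}")  # ~1.9 million

# Annualised running cost at the quoted daily rate
annual_cost_usd = daily_cost_usd * 365
print(f"Implied annual running cost: ${annual_cost_usd:,.0f}")  # ~$255.5 million
```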


Nvidia's upcoming cutting-edge technology!

Yesterday, Nvidia announced its GH200 super chip, designed to meet the growing demand for running large AI models and to rein in the associated costs. The GH200 uses the same GPU as Nvidia's current top-tier AI chip, the H100, but offers triple the memory capacity. This enhancement is crucial for generative AI applications such as ChatGPT, which require substantial computational power. The cost of running such models is significant, especially given that even with Nvidia's H100 chips, some models have to be spread across multiple GPUs. The GH200 aims to address this challenge, with Nvidia's CEO, Jensen Huang, emphasizing that it is designed for scaling out the world's data centers. The chip is set to be available in the second quarter of 2024, and while its price remains undisclosed, the current H100 line sells for around $40,000.
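
Putting the figures in this article together gives a rough sense of the hardware bill involved. The sketch below simply multiplies the GPU count and list price quoted above; actual procurement prices are negotiated and not public.

```python
# Rough illustration of the hardware spend implied by the figures in this article.
# Assumes the ~$40,000 list price quoted above; real prices are negotiated and not public.

additional_gpus = 30_000        # extra Nvidia GPUs OpenAI may need this year (estimate)
h100_list_price_usd = 40_000    # approximate price of an H100-class chip

implied_spend_usd = additional_gpus * h100_list_price_usd
print(f"Implied additional GPU spend: ${implied_spend_usd / 1e9:.1f} billion")  # ~$1.2 billion
```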


Nvidia's market share

Microsoft and Nvidia have collaborated on building new supercomputers, even as Microsoft reportedly explores manufacturing its own AI chips. However, Nvidia's near-monopoly in the AI-capable GPU market, with an estimated 80% market share, may face challenges. Cloud providers, including AWS, Azure, and Google, rely mostly on Nvidia's H100 Tensor Core GPUs, but they are also building out offerings of their own. Nvidia's dominance is further threatened by competitors such as AMD, which plans to ramp up production of its AI GPU later this year. Additionally, tech giants like Google and Amazon are venturing into designing custom AI chips. The competitive landscape suggests that while Nvidia remains the dominant player, the AI hardware space is evolving rapidly, with new companies also entering it.

