
With Nvidia's GeForce RTX 40 Series, Are High GPU Prices the New Normal?

The GPU shortage is over, hurrah! But prepare to pay even more for your next-gen graphics card...yikes!

By Michael Justin Allen Sexton
September 20, 2022
(Credit: Nvidia)

With CEO Jensen Huang's keynote today, Nvidia lifted the veil, at least in part, on its upcoming GeForce RTX 40 Series graphics cards. The presentation gave us the first confirmed, concrete details on the graphics giant's much-anticipated family, with the reveal of the high-end GeForce RTX 4090 and two versions of a step-down GeForce RTX 4080.

Naturally, performance projections were the star of the show, with Nvidia claiming that the new cards will be much faster than their predecessors. But the other big takeaway: These new cards will be extremely expensive, and later cards in the series (not yet announced) could set new all-time highs, in terms of list price, for the cost of gaming. Ouch!


Price Increases by Skips and Jumps

Graphics cards have been selling at a premium in recent years, often well above MSRP, as dual interest in PC gaming and cryptocurrency mining led to massive shortages. With cryptocurrency enthusiasm cooling off (the recent Ethereum merge will be a big factor accelerating that trend) and the actual hardware shortage more or less over, however, prices for current-gen cards have been dropping closer to standard list prices...or even below, in places. And all signs point to that trend continuing, at least for the near future.

But that's current cards. Nvidia threw cold water on that low-price vibe with the pricing announcement of its upcoming GeForce RTX 4080 and RTX 4090 graphics cards. Base models will cost $899 and $1,599, respectively.

If you've bought a card in the last year or two (or at least been window-shopping for one), these prices may not seem so bad. But they still reflect an ongoing trend, worrisome to consumers, of "price floor" increases over several generations. A few card cycles ago, Nvidia started selling its best graphics cards under the Titan brand name, and those cards were viewed as exorbitantly expensive at the time.

Nvidia GeForce RTX 4090
(Credit: Nvidia)

There was some justification for this, however, as most consumer-focused graphics cards have their GPGPU performance handicapped. That's to prevent standard graphics cards from competing with the workstation-grade cards that Nvidia and AMD also sell. These workstation cards are still very much graphics cards, but they are optimized for other types of workloads, and they are frequently used in supercomputers, servers, and systems devoted to high-end content creation and massive data crunching. The best of them cost a great deal more than your typical gaming card. So at the time, the Titan's high price made sense; it served as a sort of jack-of-all-trades bridge between gaming cards and GPGPU-focused ones.

Nvidia no longer sells its best consumer graphics cards under the Titan name, but you can see below how the top two cards in each generation compare in terms of their launch prices. (We'll ignore the Ti cards for now, as Nvidia hasn't decloaked an RTX 4090 Ti.) Though the GeForce RTX 4090 isn't the most expensive card in the chart, keep in mind that Nvidia could still launch a Titan card at some future point, and an RTX 4090 Ti will probably appear. Either would cost even more than the RTX 4090's $1,599.

As you can see, the graphics/workstation GTX Titan X flagship originally sold for what was then an exorbitant $999, but over the better part of a decade that tier of card has drifted up to the $1,599 of the Nvidia GeForce RTX 4090. That's "only" $100 over the launch price of the previous-gen GeForce RTX 3090. But as we said earlier, this is a disturbing trend that has mounted over several generations. And we'd argue the RTX 3090 was expensive to begin with.

Also note: None of these cards, except the Titans, has full GPGPU performance. In spite of the RTX 3090's price, and the fact that it occupies the same position in the line that the Titan cards used to, its GPGPU performance is handicapped just like that of every other consumer graphics card. The RTX 4090 is likely to be the same. (The full-GPGPU cards are now Nvidia's workstation-focused RTX A series.)

The "lower-end" x080 cards fare even worse by comparison. Take the lowest-priced GeForce RTX 4080, which, at $899, costs $200 more than the last-generation RTX 3080 (launch price: $699). Nvidia also announced that a memory-enhanced version of the RTX 4080 will be even more expensive: $1,199, if you want to go from 12GB to 16GB of GDDR6X RAM. From a price standpoint, that's even harder to swallow.


Video Card Inflation: Is It Here to Stay?

As a gamer and tech enthusiast, I hate to see prices on graphics cards going up, and as a tech analyst, I feel these prices have a bit of padding in them. But I also can’t deny that there likely is some justification for them. And this leaves me with little faith that the situation will change much going forward.

The big chip-making companies like AMD, Intel, and Nvidia have long relied on improvements in semiconductor technology to help drive their technological advancements. This is still true today, but the rate at which semiconductors are improving is slowing down. Serious questions are afoot about how much further silicon-based semiconductor technology can be pushed, at least to elicit performance advances in proportion to those of recent generations.

Nvidia GeForce RTX 4080
(Credit: Nvidia)

The move to newer fabrication processes has long enabled companies to cram more hardware resources into ever-smaller areas while also reducing production costs. Chipmakers rely on those added resources to increase performance, but if they can't move to a new process at the same time, the chips often have to grow larger to accommodate the additional components. Naturally, that drives up the price, as well.

At the moment, we don't know which process technology Nvidia will use for the GeForce RTX 40 series GPUs, but this is one key area that could be driving up Nvidia's costs to produce them. What we do know for sure, from the initial specs, is that the RTX 4090 will pack an enormous payload of resources, with roughly 50% more CUDA cores than the RTX 3090. So it's very possible the chip will be larger, even if Nvidia has transitioned to a new process node.

Jensen Huang
(Credit: Nvidia)

In light of that, the thermal and power components will have to be more robust, and thus pricier, too. The new PCI Express 5.0 power connector that the GeForce RTX 4090 will use can supply up to 600 watts of power, of which the RTX 4090 is rated to pull 450 watts. That’s quite the step up from the 350-watt power rating on the RTX 3090, and that means more-robust power circuitry to control all of that juice. That's not to mention the extra cooling needed to keep something pulling that much current from overheating.

All of these factors might be punching up the price, and that's not even taking into account external forces like inflation and, potentially, more expensive shipping. Though things could improve to some degree, it's likely that the days when you could buy a top-of-the-line flagship graphics card for $1,000 or less are gone for good. Who knew we'd see the day when we'd be ready to pour one out for the $699 RTX 3080 as a "bargain"?


About Michael Justin Allen Sexton

Analyst

For as long as I can remember, I've had a love of all things tech, spurred on, in part, by a love of gaming. I began working on computers owned by immediate family members and relatives when I was around 10 years old. I've always sought to learn as much as possible about anything PC, leading to a well-rounded grasp of all things tech today. In my role at PCMag, I greatly enjoy the opportunity to share what I know.

I wrote for the well-known tech site Tom's Hardware for three years before I joined PCMag in 2018. Since then, I've reviewed desktops, PC cases, and motherboards as a freelancer, while also producing deals content for the site and its sibling ExtremeTech. Now, as a full-time PCMag analyst, I'm focusing on reviewing processors and graphics cards while dabbling in all other things PC-related.
