Mid-tier Graphics Card Woes (Big Oof, Large Yikes)

RTX 5050, RTX 5060, RTX 5060 Ti, RX 9060 XT

That is a lot of new cards launched recently! The RTX 5050, the RTX 5060, two versions of the RTX 5060 Ti, and two versions of the RX 9060 XT!

Instead of checking each one out separately, let us take a look at them all in one fell swoop, with an eye toward deciding where/if they belong in our main build chart. I am somewhat unsure of what a swoop is, and why it needs to be of the fell variety, but here are the cards:

 

New GPUs for $250

 

The RTX 5050 is an overpriced card, but we recommend it anyway because the alternatives in this price range barely exist. It should be a $200 card, but manufacturers set prices as high as they can get away with. Price aside, the card itself is quite decent: energy-efficient, quiet, and well-suited for 1080p gaming. At $250, you are only paying $50 extra, so it is not shockingly overpriced.

 

New GPUs for $300

 

Both the RTX 5060 and the RX 9060 XT (8GB) are available at this price. The 5060 is not bad, but the 9060 is significantly stronger, so we will recommend it and wait for nVidia to cut prices. Knowing nVidia, it might be a long wait.

 

New GPUs for (Roughly) $400

 

The RTX 5060 Ti (8GB) and the RX 9060 XT (16GB) are both $380. Since the 9060 XT performs slightly better and has more VRAM, we will recommend it and skip the 8GB 5060 Ti. But if you want a 5060 Ti anyway, the 16GB version is $450 and performs better.

 

Buyer Beware

 

So, you may have noticed that AMD has released two cards named RX 9060 XT, with 8GB and 16GB models set $80 apart. nVidia did the same, releasing the RTX 5060 Ti in 8GB and 16GB variants set $70 apart.

For both launches, the 16GB model launched and was reviewed first, while the lesser 8GB model either launched later or launched without being sampled to reviewers, so it was reviewed later. Now, I am not implying that this is a nasty, shady practice meant to trick unsuspecting customers. No, no, no… we all know that these large companies absolutely love their user base, want the best for them, and would never, ever do anything greedy. Instead, let me tell you about a hypothetical scenario that is completely unrelated to these GPUs.

A Very Informative Aside:

Double Burger

Edited from Photo by Good As Burgers

If a new burger joint opened up, and a reviewer went there, got a nice double-patty burger, and reviewed it well, I would be tempted to buy that burger. But if that burger joint also sold a very similarly named burger, close in price to the one I wanted, I could easily order the other burger by mistake.

If, after paying and going home, I unwrapped my burger only to find that it was a single-patty burger, I would be pissed. Half of a very important ingredient is missing! Yes, I made the mistake, but I would still feel disgusted and cheated. If I had alternatives, I would never buy from that burger joint again.

But there are no alternatives, are there? No, not in this podunk town with just a couple restaurants. It is just you, dear reader, and your ability to heed this warning.

Back on Track:

And now that the completely unrelated hypothetical scenario has been presented to you, let us get back to the topic. Why are there two identically named cards, one with 8GB of VRAM and one with 16GB? And does that VRAM make a difference? We will skip the first question, because you are an adult who is familiar with the concept of value engineering, and you know that companies have entire departments that exist just to squeeze every last dollar out of you. But what about 8GB vs 16GB when it comes to performance?

When you buy a $100-$150 graphics card, you are a person on a budget. You will most likely be gaming on a lower-resolution screen. For 1080p, 8GB of VRAM is adequate, and you will likely face few (if any) issues. Hence we have no problem with the RTX 5050 and its 8GB of VRAM, as (while a bit overpriced) it still targets 1080p gaming. But it is the year 2025, and 1440p screens can be had for $180. 4K screens, which once cost between $1000 and $2500, are now $250 to $300. And these are standard prices, with no discounts, sales, or coupons.

For 1440p and 4K, 8GB of VRAM is increasingly inadequate, especially in modern, recently released titles. When you run out of VRAM, games become completely unplayable. Please take a few minutes to watch the video linked there, as seeing is believing, and you may not yet believe how bad games become when you run out of VRAM.

$200 cards with 8GB are fine for now. But $380 cards? Those cost more than your screen! Before, if you tried to use a $150 card to play games on a $1000 screen, you were making a mistake. Now, when your $380 card cannot properly play games on your $250 screen, your manufacturer has made a mistake. You would think that VRAM must be a seriously expensive component, but it is not: an additional 8GB of VRAM costs around $18. Being stingy with VRAM is a very cheap, very greedy move.

Please be careful, and understand that both AMD and nVidia are each selling two cards that are similarly named and somewhat close in price, but different in specs and performance. Do not buy the wrong card by mistake, because you may feel disgusted and cheated if you do.

 

Going Forward

 

In our main chart, we will be adding some of the newly released cards, and retiring some of the old. The following changes are already live for the USA, and will roll out to other countries over the coming weeks.

Additions:

    • Added RTX 5050 to Very Good
    • Added RX 9060 XT 8GB to Great
    • Added RX 9060 XT 16GB to Great and Superb
    • Added RTX 5060 Ti 16GB to Superb

Removals:

    • Removed RX 7600 from Very Good (price makes it bad vs RTX 5050)
    • Removed Arc B580 from Great (price makes it bad vs RX 9060 XT)
    • Removed RX 6750 XT from Great (price makes it bad vs RX 9060 XT)
    • Removed RX 7700 XT from Superb, since it is EOLed (replaced fully by RX 9060 XT / RTX 5060 Ti)
    • Removed RX 7800 XT from Excellent, since it is EOLed (replaced fully by RX 9070 / RTX 5070)

 

And that’s all for this update! Thank you for reading. Now I think I’ve made myself hungry, and so I must go procure a burger with (hopefully) two patties.

 
