AMD Radeon 7000

Very much hoping you're right.
But, is there any evidence for high stock levels of the 3000 series that they'll need to shift?
I guess we'll see in a few weeks.
At some point Nvidia will have to follow suit, just like AMD has done over the past few months.
Only the most dedicated RT fans will buy an $850 RTX 3080 Ti over a $900 7900XT.
 
So today will be the day of truth!
Anyone found a benchmark yet? I guess they will pop up at around 18:00 EU time?
(they dropped ACC from their test suite :cry:)
and
https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/40.html mentions very high multi-monitor power consumption :poop:

AMD overpromised and underdelivered like they used to. If I were in the market for a $1000 GPU, I'd pay $200 more for the 4080. And the 7900XT is even worse: almost 20% less performance for a 10% price discount.
 
Using the RTX 3080 Ti as a 100% baseline, these are the relative performances per TechPowerUp.
1670862180843.png
 
I will be the first to admit that the performance fell a little shy of where I hoped it would be, and the price hurts for what you're getting. But the price hurts even more for the green team. Unless you really care about ray tracing, the 7900XTX is the better buy over the 4080. The cut-down 7900XT is the tougher sell.

From what I've seen so far, they still have a few power management kinks to iron out in the drivers. The card is drawing more power than it should at idle. At least the reference card is.
 
I will be the first to admit that the performance fell a little shy of where I hoped it would be, and the price hurts for what you're getting. But the price hurts even more for the green team. Unless you really care about ray tracing, the 7900XTX is the better buy over the 4080. The cut-down 7900XT is the tougher sell.

From what I've seen so far, they still have a few power management kinks to iron out in the drivers. The card is drawing more power than it should at idle. At least the reference card is.
It's also drawing 15% more power than the 4080 in-game and during video playback (88 watts, seriously???). Any $$$ savings will get eaten by the extra power cost within a couple of years. Thanks, but no thanks.
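For what it's worth, the "savings eaten by power cost" claim is easy to sanity-check. Here's a quick back-of-the-envelope sketch; the wattage delta, daily usage, and electricity price are all assumed numbers for illustration, not measured review data:

```python
# Rough estimate of extra electricity cost from a higher-draw GPU.
# All inputs are assumptions for illustration, not measured values.
extra_watts = 50        # assumed average extra draw vs. a 4080, in watts
hours_per_day = 4       # assumed daily gaming / video playback time
price_per_kwh = 0.40    # assumed electricity price, in EUR per kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/year extra -> ~{extra_cost_per_year:.0f} EUR/year")
```

With those assumed numbers the gap works out to roughly 30 EUR per year, so whether it actually eats a price difference "in a couple of years" depends heavily on your hours of use and your local electricity price.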
 
I added the new gen to my sheet. "Fair" prices were calculated by 3DCenter.org. AFAIK they took inflation and the average generational leaps over the last 10 years or so into consideration. The prices make a lot of sense to my gut feel.
What's a bit "funny" is that the new gen is worse in price-to-performance than the last gen.

1670888754981.png


1670888845016.png


Yeah... the 4080 and 4090 are so bad that I cut them off...
1670888945968.png
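The metric behind charts like these is essentially just relative performance divided by price. A minimal sketch with made-up figures (not the actual chart data) showing how a faster new-gen card can still come out as worse value:

```python
# Illustrative (made-up) relative-performance and price figures, only to
# demonstrate the price-to-performance calculation behind such charts.
cards = {
    # name: (relative performance, price in USD)
    "last-gen card": (100, 650),
    "new-gen card": (130, 1000),
}

for name, (perf, price) in cards.items():
    value = perf / price            # performance per dollar
    print(f"{name}: {value * 100:.1f} perf per $100")
```

In this made-up example the new card is 30% faster but ~54% more expensive, so its performance-per-dollar is lower than the old card's, which is exactly the "worse value than last gen" pattern described above.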
 
It's also drawing 15% more power than the 4080 in-game and during video playback (88 watts, seriously???). Any $$$ savings will get eaten by the extra power cost within a couple of years. Thanks, but no thanks.
I'm more concerned that it would be a space heater during the summer months.

The power draw is probably just a driver issue that will get worked out shortly. If not, then AMD has a serious problem on their hands.
 
I'm more concerned that it would be a space heater during the summer months.

The power draw is probably just a driver issue that will get worked out shortly. If not, then AMD has a serious problem on their hands.
Meanwhile you can use that space heater for the winter months :roflmao:
This is not the first time I've seen that in the context of an AMD card. You'd have thought they'd learned their lesson, but no.
Also, I'm not a big fan of their new video output layout. I've got 3 monitors that I can connect via DP or HDMI, plus an HDMI VR headset, which means I'd need some kind of adapter for the Type-C output, and I'd rather avoid that.
 
Hopefully the board partners will fix both issues.
Gamers Nexus said the reference card is just that and isn't trying to compete with the board partners' cards. At least the power spikes should be solved, but hopefully the low-load and idle power draw too, alongside a 3rd DP port!
 
Also, I'm not a big fan of their new video output layout. I've got 3 monitors that I can connect via DP or HDMI, plus an HDMI VR headset, which means I'd need some kind of adapter for the Type-C output, and I'd rather avoid that.
Don't worry, that's only on the reference card. All the 3rd-party cards will have the usual 3x DP + 1x HDMI layout. They did the exact same thing last gen.

I'm pretty annoyed by this launch, tbh. There's no way around it: AMD lied this time. I was hoping we were beyond this, as the 6000 generation was very honest with its numbers and did exactly what they said at launch. This time round they were like "yeah, it's going to be 1.5-1.7x better than the 6950 XT", with no actual numbers, which would put it right in between the 4080 and 4090. What we have instead is a broken, unfinished architecture / drivers that are on average 1.3x better than the 6950 XT, with loads of idle power issues it seems. The benchmarks simply do not match the raw specifications of the cards, and clearly something has gone very wrong.

One good thing I did notice: EposVox was kind about the new dual encoders, so it seems they have actually fixed streaming at least, and finally AMD is a good card for streaming and video editing. But it's still god-awful in Blender and 3D render apps, sadly.

Always keep in mind though, AMD has never delivered a card as polished as Nvidia at launch. They always claw back huge chunks of performance in the 6-12 months afterwards. It's just annoying that its value is so marginal day one, and that they completely mis-sold the performance and overhyped it in that presentation. The 7900xtx is still slightly better value than a 4080 but only marginally, with the prospect that you "might" get a big jump with a driver update in a few months. I'm not sure I would put money on this day one. I feel like none of the current gen gpus are worth putting money on this generation. The 4090 is a developer / workstation card sold to IT whales (hats off to those who can afford one), the 4080 is just insultingly bad value, 7900xtx is an unfinished product and the 7900xt is simply awful value (probably shouldn't even exist), and I suspect the 4070 Ti could be better value when it launches in January, but still at like $800. Also, with these current AMD FPS numbers, you can forget about the 4080 dropping in price as well.

Bottom line: I think the 7900xtx will be a very good card, but not if you buy it right now. Clearly, yet again, the driver team needs to catch up with the hardware. Moore's Law Is Dead was saying in his roundup video that the driver team is working through the holidays! :roflmao:
 
The 7900xtx is still slightly better value than a 4080 but only marginally, with the prospect that you "might" get a big jump with a driver update in a few months.
Did you see my charts above?
The 7900xtx isn't that bad, just not great either.
Basically, the 7900xtx would be a good last-gen card, being slightly worse value than the mid-tier 3060 but with massively higher fps.
Like it always was.
The issue is that it's only "okay" in comparison to the last gen, not when you'd expect the normal generational leap in value, lol.

The 4080 is a full-on middle finger from Nvidia to the users. Its value doesn't even fit into the chart.

If the driver updates deliver as much as is rumoured, it might become a great card if you ignore generational leaps.
I feel like none of the current gen gpus are worth putting money on this generation.
Yep, that's what my charts very clearly show. The value is worse than the last gen. It's not how evolution works.
In comparison, I paid 3% more for my 7600X + B650 mobo + DDR5 than I paid for my 10600K + cheap Z490 mobo + DDR4.
But I get 40-60% more performance after 2.5 years!
4080 is just insultingly bad value, 7900xtx is an unfinished product and the 7900xt is simply awful value (probably shouldn't even exist)
Good summary haha!
The 7900xt is a good card in theory, but it would need to be 150-200€ cheaper...

The 4090 is okay in my opinion though. It's the "best of the best" without competition. The power efficiency is stunning and the performance is awesome.
If you have the money for such a card, you don't care about 500€ more or less in your pocket anyway.
 
Been sitting on the fence waiting for these benchmarks as I currently have a 6900xt. While the 7 series looks a good card for the cost, I race exclusively in VR and am likely to continue doing so, so I think I'll wait a little longer and then go for the 4090 (prices have started to drop, sometimes by £200/£300 in the UK - fingers crossed they may drop a little more). I've mostly been very happy with the 6900xt (aside from the compression over the Link cable on Quest 2), but I think particularly for VR the 4090 is a real step up... until the 5090 comes along!
 
Always keep in mind though, AMD has never delivered a card as polished as Nvidia at launch. They always claw back huge chunks of performance in the 6-12 months afterwards.
Having been on AMD GPUs and CPUs for a number of years now, I can testify that it is normal practice for AMD to get a product to market in a working state and then work on ironing out the wrinkles. They sell it as "fine wine", but you can call it what you like.

The XTX seems fairly honest and relatively decent value. A bit less bang than expected but a fair replacement for the 6900XT.

The XT on the other hand is clearly the 6800 XT replacement, given the x900 nomenclature and an obscene price bump to match. Just like the 4080 12GB, this card either shouldn't exist or shouldn't exist at this price point. I doubt they will "unlaunch" it, but yep, £750-800 is more in the ballpark. Cheesy move, AMD.
 
Bottom line: I think the 7900xtx will be a very good card, but not if you buy it right now. Clearly, yet again, the driver team needs to catch up with the hardware. Moore's Law Is Dead was saying in his roundup video that the driver team is working through the holidays! :roflmao:
I was just watching his video 5 minutes before coming here. It was like deja vu reading through your post. For a minute I thought maybe it was your channel. :laugh:

Anyway, might as well post it.
 