Nvidia GeForce RTX 2080 Ti leak reveals a very powerful graphics card

I'm not saying it won't be in the game, but I wouldn't be surprised if it doesn't make it in any time soon, if at all. I don't know how easy or difficult it is to implement RTX. If it's just a matter of flipping a switch in the UE4 engine, then maybe it will be there from the start of Early Access. If it's much more complicated, I don't think we'll see RTX until the full release.

Kunos have been very quiet about RTX; I don't think they've said anything other than that it's not a priority for them at this time.

I'm not trying to knock the idea; I'm just saying don't get too excited yet. It could be a long way off, and I'd still be very shocked if they signed anything. I think they were just in the right place at the right time.
 
Yes, it's speculation. There are no benchmarks. But the specs are released, and the maths is straightforward: the 2080 Ti is 20% to 27% faster if performance scales linearly and the CUDA cores are the same architecture.
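That linear-scaling estimate can be sanity-checked from the spec sheets alone. A minimal sketch in Python; the core counts and boost clocks are Nvidia's published figures for the two cards, and the helper function is just for illustration:

```python
# Naive throughput estimate from published specs, assuming perfect
# linear scaling with CUDA core count and clock speed.
# Published figures: GTX 1080 Ti = 3584 cores @ 1582 MHz boost;
# RTX 2080 Ti = 4352 cores @ 1545 MHz reference / 1635 MHz FE boost.

def linear_speedup(cores_new, clock_new, cores_old, clock_old):
    """Throughput ratio of new card vs old, assuming linear scaling."""
    return (cores_new * clock_new) / (cores_old * clock_old)

ref = linear_speedup(4352, 1545, 3584, 1582)  # reference boost clock
fe = linear_speedup(4352, 1635, 3584, 1582)   # Founders Edition boost

print(f"reference boost: +{(ref - 1) * 100:.0f}%")  # ~ +19%
print(f"founders boost:  +{(fe - 1) * 100:.0f}%")   # ~ +25%
```

The naive estimate lands around +19% on the reference clock and +25% on the Founders Edition clock, broadly matching the 20-27% range above. Real scaling is rarely perfectly linear, so treat these as upper bounds.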

The jump is too small. Usually the new 104 chip is faster than the previous generation's 102 chip (the 1080 beat the 980 Ti). This time it isn't, so they were forced to release the 102 chip as well.

It's not incompetence from Nvidia. They chose to pack half the GPU with AI and RT hardware instead of CUDA cores. It's a gamble.

The 2080 Ti is barely keeping 50 fps with RT on, so the 2070 will be even worse. No user will adopt the tech to play at 30 fps, not on PC. I don't think so, anyway.

Your two posts are spot on and should be considered by anyone thinking about buying these cards.

Nvidia's event was all about convincing us to adopt ray tracing, and I am as impressed as anyone else by what this technology might bring in the future. But at the moment, the performance with ray tracing turned on is not what I want. I heard Shadow of the Tomb Raider was running at about 30-50 fps at 1080p with ray tracing enabled. I don't want to game at 1080p just for some extra graphical effects that I might not even notice if they're not implemented right. Right now I run most games (RRRE, DOOM, Far Cry 5, Wolfenstein, F1 2017, etc.) at 4K (with DSR), with quality settings and AA maxed out, at 100-150 fps on my 1080 Ti and 144 Hz screen. I don't want to trade that for a few graphical enhancements and drop to a third of that performance.

And then we don't even know how well the RTX cards compare in terms of fps to the corresponding cards in the GTX 10-series. To judge the new cards fully, wait at least for the reviews, where we can see what fps they manage compared to the 10-series, and weigh the fps gain against how much extra you'll have to spend for it. My guess is that you'll have to pay a big premium as an early adopter of the RT cores that just isn't worth it in terms of raw fps.

I am really impressed by Nvidia and the ray-tracing technology, but I think it will take at least another generation to get the performance that will let us run ray tracing at acceptable levels at high resolutions. I'm really looking forward to the next generation of cards on 7 nm; until then I'm happy with my 1080 Ti.

A good discussion on this can be found at UFD Tech in this video: https://www.youtube.com/watch?v=__ccxzh4G4s
 
So can someone tell me why a 2070 would exist, or have the RTX feature set, if it then reduces game framerates to between 30 and 60 at best, and only at 1080p, with RTX on? Yet in the same game a user could get that framerate at four times the resolution (1080p to 4K) with RTX off.

I do not believe any serious gamers will take eye candy over framerate if the performance cost is as great as some are saying.

Very simply: are there any videos or reliable sources confirming BF5 performance with RTX on and off?

For titles like Tomb Raider to average only 40 fps, at 1080p no less, and with a top-tier 2080 Ti, just does not make sense at all. If RTX is that costly in performance then it clearly is not yet ready to be hyped or released, let alone for consumers to be expected to pay high prices for it.

Nay, sorry, but something isn't adding up or making sense.
I see videos of the BF5 alpha on a GTX 1060 at Ultra settings pulling 60 fps.

So a £1000+ top-end 2080 Ti with RTX enabled achieving fewer frames in the same game is so crazy I can't believe it's the case. Maybe it is, but it's just crazy...
 
 
Think the addition of AI cores might be great for race sims?
It would be great to free up the CPU from doing the AI, or at least to have the option of offloading that to the GPU.
The problem with the RT and AI in the Nvidia GPUs is: what about people with AMD GPUs? And, as far as I understand the market, console owners are the main buyers of games.
It's a bit of a stretch to think that game developers would make completely different versions of their games for the RTX series cards.
 

I don't think the AI cores are designated for working with AI in games. The AI would probably still be calculated by the CPU as before. As I understood it, the AI cores are for calculating and enhancing graphics with smart algorithms that try to "fill in" where the graphics can be improved, using patterns the chip's AI function learned earlier. I'm not a tech geek, so I hope my layman's explanation is understandable. For more info, watch the Nvidia presentation from the Gamescom event.
 
Why would they show a sim that most of the crowd did not care about, just to show one static shot rendered with RTX, if there was no thought of implementing it?

Why would Kunos leave themselves open to being accused of hype? Why would they even need to?

This negativity towards a new technology that frankly has the ability to make cars and tracks look more photorealistic than ever, and to breathe more life into scenes than there has ever been, is confusing to me.

Look at the new Forza coming: there's a shot of a cockpit and wood steering wheel that almost looks like RTX... but it ain't. It's all canned textures, as in all sims; dynamic baloney, hehe.

If you're happy to miss out on upcoming RTX titles, do it, but don't drag others down into your sad place.

Anyone would think sims are all most people play, lol.
 
If that ~22% improvement is the line between playing 4K at 50 fps or 60 fps, or keeping your VR from dipping below 90 fps, and you are willing to pay 100% more to cross that line, then do it. Otherwise, wait for 2019 and 7 nm.
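To make that "line" concrete, here is a quick back-of-the-envelope check assuming a flat ~22% uplift; the baseline figures are hypothetical round numbers, not benchmarks:

```python
# Does a ~22% uplift carry a 4K setup from 50 fps to 60 fps, and what
# baseline framerate does a 90 fps VR target imply on the old card?
# All figures here are illustrative, not measured benchmarks.
uplift = 1.22

print(f"4K: 50 fps becomes {50 * uplift:.0f} fps")
print(f"VR: 90 fps needs a {90 / uplift:.1f} fps baseline on the old card")
```

So a card hovering around 50 fps at 4K would just clear 60, and the uplift only rescues a 90 fps VR target if the old card already holds the mid-70s; below that, even 22% won't get you there.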

The only reason Nvidia thought it prudent to release the 2080 Ti at launch is explained by this comparison:
980 Ti to 1080 Ti = 100% increase in performance
1080 Ti to 2080 Ti = 18% increase in performance


Ray tracing is just not enough reason to buy this card, since adoption will be near zero this year, and at appalling performance, considering 2019 will bring the real next-gen cards.

my 2 cents.
Yep! (good post)
While we can speculate about the real performance of the RTX cards, the general consensus seems along those lines, yes (rough estimates based on spec calculations and comparisons).

Considering that these RTX cards are on a 12 nm process, and the 7 nm ones are expected sometime late next year, these 12 nm RTXs must be stop-gaps.
That may explain why the 2080 Ti model was made available right at launch (instead of the usual nine months later for the top Ti GPU), and likewise the outrageous launch prices, excused by the ray-tracing technology (which could turn out to be irrelevant in this period of games development, with so many titles built around AMD-powered consoles).

Also note that AMD is presenting Vega II (or whatever the new generation will be called) sometime in late 2018/2019, on a 7 nm process (which may be why Nvidia jumps on the 7 nm bandwagon sooner than estimated). It may, in the end, (finally?) represent the challenge to Nvidia that there hasn't been so far and, if so, disrupt part of the market.

Honestly, I think early adopters of RTX GPUs should think twice and thrice about it...
 

You're forgetting that real-time ray tracing is clearly very expensive performance-wise, i.e., it's a resource hog. If the games themselves are already demanding at ultra settings, then with RTX enabled it'll be extremely hard to get smoothness at the kind of frames per second hardcore simmers tend to prefer (120+ fps?).
Meaning the investment, just for that, may turn out to be a shot in the foot for those wanting to push high framerates on their fancy 1440p/144 Hz and 4K/60 Hz monitors...

Moreover, the technology is proprietary to Nvidia. Most games are also made first for the consoles, which are all AMD powered (Sony PlayStation and Microsoft Xbox). The yet-to-be-presented Sony PS5 will be powered by AMD Ryzen and Vega (CPU and GPU respectively), and the next Xbox is expected to follow the very same hardware choice.
Do you really think developers will go out of their way just to include ray tracing in the console-to-PC conversions? I'm not so sure they will... (to be seen).

In the end, if your interest lies specifically in PC simulation games, you need to understand that this RTX tech may, or may not, turn out to be irrelevant. Plus, it may well be a situation where the tech only makes sense, performance-wise, in the following series/generation of GPUs, and is merely being presented as a novelty for now.
Too many unknowns (IMO), which is why I think one should buy a GPU for today's needs, not for the unknowns of tomorrow (for that, read the post above this one).

I think each individual should understand this before riding on the promise of RTX tech and dropping their hard-earned money (as early adopters) at such ridiculously high prices...
 
Honestly, I think early adopters of RTX GPUs should think twice and thrice about it...


Should pioneering VR users have done the same thing? Do they feel cheated?

For me it's not about performance gains in existing sims, as long as it's not slower :) lol

An RTX card will run RTX titles; other cards won't. Simple as that, for me.

Gawd, I will be running G-Sync in BF5 for smoothness, and turning RTX off will make me think UGH! lol
I got that from just watching videos; imagine what it will be like live, or better still in VR titles.

For a minute, imagine online shooters and seeing an enemy sneaking up behind you!
 
Again, the expected improvement seems to be 15% to 25% better performance, at a cost of 50% to 100% more.
If those numbers turn out to be true, I think it's obviously going to be a truly awful purchase decision, not just for early adopters but also for subsequent owners.
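Those numbers can be turned into a rough performance-per-pound comparison. A sketch using the thread's own estimates; the function name and exact figures are illustrative, not benchmarks:

```python
# Rough value-for-money check: performance per pound relative to the
# outgoing card. Inputs are fractional gains, e.g. 0.25 = +25%.
# Figures below are the thread's own estimates, not benchmarks.

def relative_perf_per_cost(perf_gain, cost_increase):
    """Perf/cost of the new card relative to the old (1.0 = parity)."""
    return (1 + perf_gain) / (1 + cost_increase)

best = relative_perf_per_cost(0.25, 0.50)   # optimistic: +25% perf, +50% price
worst = relative_perf_per_cost(0.15, 1.00)  # pessimistic: +15% perf, +100% price

print(f"best case:  {best:.2f}x perf per pound")   # ~0.83x
print(f"worst case: {worst:.2f}x perf per pound")  # ~0.57x
```

Even in the optimistic case you get less performance per pound than the outgoing card, which is the opposite of what a new generation normally delivers.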

If the purchase is not based on performance gains, but solely on Nvidia's new RTX tech (real-time ray tracing), I've already said my piece in my previous post.


What I think is wrong in your approach is the idea that this Nvidia RTX tech is as revolutionary as some drastically new DirectX would be, or something different enough to be an alternative take, like VR turned out to be.
It clearly isn't. :) And even if RTX is going to be globally adopted (unknown), it may take YEARS, not months; by that point, these GPUs may be obsolete.

This real-time ray tracing RTX tech from Nvidia is in the same vein as their earlier Nvidia HairWorks and Nvidia PhysX.
Have you ever counted how many titles have used those other proprietary Nvidia techs?
And of those, how many did it optimized, in truly "well worth it" fashion?
Maybe you haven't even noticed them!
In truth, these techs have always been something cool that, in the end, arrives unoptimized and ends up irrelevant in the grand scheme of gaming, because the few game developers adopting them never put in as much effort as Nvidia always seemed to tell us they would when selling these techs to the world in super-enthusiastic press launches, like this one was.
 
Well, I have faith in this technology and personally think it has the potential to improve sim environments every bit as much as shooters, if not more.
I want real-time dynamic reflections in the cockpit, not 20-year-old canned static textures imitating glass.


The new v2 machines are three times dearer to lease but give ten times the output.
For mine, that would be a good decision for large developers and would benefit users with a richer experience.

A BF5 engineer said how much easier it was to manipulate shadows, and how much time was saved by not doing canned, static, boring textures lol

With so many top developers and coming titles already listed, this is not going away, and comparing it to hair is a bit rich, even for you, Duc :)
 
Already you see threads at certain sites from people who can't afford RTX moaning about BF5.
Suddenly, after all these DICE titles, they're ready to give it up and swap camps.

Or was it the shots with RTX OFF that horrified them? lol :laugh:

I want to see reflections of Lara's butt :roflmao:
 

...a bit rich? :cautious: Why?
I know you only care for sim racing and, my guess is, you may be unaware of what's around because of that distance from other very relevant genres.

If you cared for RPGs and the other genres where the realism of characters' appearance really matters, you would know that Nvidia HairWorks was supposed to be a huge deal.
In those sorts of games, faces and overall character looks (as well as clothing materials, etc.) have now reached ultra-realistic levels, but hair, beards and fur still look like crap, often depicted as bits of rope (at best) if not melted plastic (the most usual).
Among top AAA titles (Call of Duty: Ghosts, Far Cry 4, etc.), the best showing of Nvidia HairWorks so far has been in The Witcher 3, and it still looks and runs like... :poop::sick:
Not only are the changes in looks not as big as Nvidia's advertising suggested, the performance hit is really bad (truly awful), to the point of making it irrelevant.

From my perspective, Nvidia is presenting another proprietary tech of theirs (RTX) as if, once again, it's going to change the landscape, the big difference being that they're doing it alongside the launch of their new generation of GPUs. Perhaps because it may be the only thing they have to justify these as "better" than the previous ones? :) (and at these prices! WOW).

...I sure hope this is not another Nvidia flop like their PhysX cards were (remember those?). I feel sorry for the poor chaps who went for one and bought something that never really mattered (now extinct).
 
Nvidia says RTX has taken 10 years to develop.
Some members of the Race Department forums destroy its chances, significance or importance in 10 minutes. :roflmao:
 
I've read some articles by journalists playing Tomb Raider with RTX, and it's not great news: it struggled to reach 30 fps at 1080p. So it seems the dedicated ray-tracing hardware isn't completely independent from the rest of the GPU, and performance will take a hit with it turned on. I'm guessing the RTX options in games will have their own low-to-high settings.
 
I hear ya, Duc :)
I never told everyone to rush out and buy a card.

I am not the one citing performance gains for unreleased hardware running early drivers and game builds, am I?

It's hypocritical for people who historically cite perceived driver improvements as a reason to buy any GPU they like lol ;)


Not true; I like shooters with cars, golf, chess, Forza lol

Like any product, it's all about what the market will bear. Don't tell me DD wheels are not overpriced, for example; look at feelVR vs Fanatec prices.
 
I've read some articles by journalists playing Tomb Raider with RTX, and it's not great news: it struggled to reach 30 fps at 1080p. So it seems the dedicated ray-tracing hardware isn't completely independent from the rest of the GPU, and performance will take a hit with it turned on. I'm guessing the RTX options in games will have their own low-to-high settings.
Yep.
These guys do a fairly good video on it (as always), and they mention that too (at 17:58):

 