Is VR dead?

  • Thread starter Deleted member 197115
I just can't see myself buying a GPU with higher than 300 W consumption. 250 W has been the sweet spot for noise for literally years, and every time they went above that, the noise and heat were a real problem. We are already at the stage where GPUs are rapidly going from 1.5-2 slot to 3, 3.5, and even 4-slot designs just to cool them well, which seriously reduces the expansion ability of an ATX motherboard. If we start pushing 600 W, they are going to have to get really big and probably water-cooled. None of that will be cheap, and it won't be quiet or comfortable to sit next to either, especially in the summer. It will also cost a fair bit to game on.
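To put that last running-cost point in rough numbers, here is a quick sketch. The $0.15/kWh rate and the 10 hours/week of gaming are my own illustrative assumptions, not figures from the post:

```python
# Rough yearly electricity cost of GPU gaming.
# Assumptions (not measured): the GPU draws its rated wattage while gaming,
# and electricity costs $0.15/kWh -- adjust both for your situation.
def yearly_cost(gpu_watts, hours_per_week, price_per_kwh=0.15):
    kwh_per_year = gpu_watts / 1000 * hours_per_week * 52
    return kwh_per_year * price_per_kwh

for watts in (250, 300, 600):
    print(f"{watts} W at 10 h/week: ${yearly_cost(watts, 10):.2f}/year")
```

So going from a 250 W to a 600 W card a bit more than doubles the GPU's share of the power bill, before counting the extra heat your air conditioning has to move in summer.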
The problem is (probably) Nvidia's difficulty refining its lithography process from 10 nm down to 7 nm or below.
Intel had the same problem, which it "solved" by renaming its 10 nm process to "Intel 7". :roflmao:
This is probably the main reason for the newer GPUs' astronomical heat output: until now, the card makers could keep the heat problem (somewhat) under control because most new GPU generations came with a step down to a finer process node.
That automatically gave cooler cards, or about the same temperatures at higher specs.
 
They're not talking about it, but that is already happening with Ryzen, and the upcoming RDNA 3 will be an MCM (multi-chip module) design as well. It will be funny to watch AMD take the performance crown only to have people still justify their Nvidia purchase with "the chips will get too hot". Has anyone even used a 30-series card? Nvidia LOVES throwing power and die size at the problem. It's all they have. Although this time it's going to be Goliath versus not just one David, but many.
 
According to Intel, its GPUs won't be competitive at the high end until 2024.

My i9-12900 runs pretty cool and doesn't pull that much wattage in actual use for games.

AMD might have a chance at dethroning Nvidia at the high end.

FWIW I'm not an Nvidia fanboy. Sadly, Nvidia still gets preferential treatment from software companies, so at least historically it has been better supported and had more reliable drivers.

That could all change if AMD takes the lead.
 

Same for Intel.
Surprised anyone is still paying attention to what comes from Scammax.
 
More "besides GPU" news: 3 Augmented Reality Developments in Early 2022.
My supposition: AR is a more attractive investment for many enterprises;
VR will piggy-back on AR tech.

  1. Microsoft shifting from its Hololens to Samsung collaboration
  2. Lenovo, Motorola, and Verizon collaboration, perhaps using
    Lenovo ThinkReality A3 AR glasses driven by smartphones
    or '5G neckband'.
  3. Cisco Adds 3D Modeling to Webex Hologram
https://www.nojitter.com/enterprise-connect/3-augmented-reality-developments-early-2022
 
Then how will you know they are working?

It's like a car revving its engine to show you how powerful it is, or like someone who is too broke to replace their muffler.
You didn't understand my post. Maybe too much alcohol?

Read the original post again:
the guy writes that 250 W is OK for noise but more watts are too noisy.
And my advice is to get a 300 W+ card but replace the stock fans with quiet Noctua fans.

You should try Noctua fans.
They are less noisy. Good quality fans.

I hope it helps you also.
 
Lol!

One thing that I absolutely am is a Noctua fan ;)

I have Noctua fans throughout my Fractal Define R6 case plus a D15 CPU cooler. The case fans have the low speed adapters and it is nearly silent.

My Fractal Torrent has a Noctua D15s.

I have a 200mm Noctua fan on my rig mounted just above my SC2 Pro to help cool me and keep air flow through my Index so it never fogs up.

I have Noctua fans in both of my Behringer amps because the stock fans were too noisy.

If you thought my attempts at humor were alcohol induced, my apologies.
 
Positive Varjo Aero review:
Another member in the "Best VR Headsets" discussion thread asked if I would write a few words about my experiences with the Varjo Aero VR headset that I got a few months ago. (BTW, I have a 3090 / i9-11900 based system.)

I had originally owned a Vive and then moved to a Rift S. My next step was going to be an Index, but it was not initially available in Canada… and then I "just" about pulled the trigger on the G2. My concern with the G2 was the sweet-spot issues that some, but certainly not all, owners experienced. Given that I am a race-sim and flight-sim guy, the Aero seemed logical to me when it was announced, as it met most of my requirements. It took about three months to arrive.

The first thing that struck me when I put it on was the clarity. Up came the Windows desktop and it was vivid. No pixelation, no screen-door effect, no blurriness… it was just a perfectly clear desktop at a very high resolution. The headset has an automatic IPD setup, so you are ready to go as soon as you put it on. The second thing that struck me was the brightness and colour. My other headsets all had slightly muted or duller colours; this certainly did not. A bright, strong picture. The third thing that quickly dawned on me was that there was no sweet spot. None. It was all one contiguous clear view.

Before I discuss any sims, let me point out a few other things. I certainly do not want to portray this as perfect, because nothing is. After some use I noted three things, two of which have a happy ending: 1) some barrel distortion, 2) some chromatic aberration, and 3) a slightly smaller vertical FOV than I expected. I need not dwell on points 1 and 2, as a software update a week ago (version 3.5 of their Base software) took care of them for me. I now have no distortion, and the chromatic aberration is 98% gone. And there are more software updates to come, as this is still a new device. The device does have a slightly narrower vertical FOV than my Rift S, but the horizontal feels the same. A small price to pay for the clarity I get, but hopefully someday we will have it all.

Performance was my next concern. With this high resolution (the default is high - but you can set it up or down) was I going to suffer performance loss? The answer is no. On every app to date I have either held constant or gained on the performance of my previous headsets. Microsoft Flight Simulator was the most extreme example given that it uses OpenXR rather than OpenVR. With the Rift S I was suffering some stuttering issues even with a 3090. I used the same settings with the Aero at a higher resolution and it was smooth as butter. I have yet to run a sim at high settings and have a performance issue with my setup and the Aero. I am sure I could max things out in a given sim and experience issues but on more than acceptable settings it is just fine.
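For context on how big that resolution jump is, here is a quick per-eye pixel comparison. The panel figures below are the commonly published spec-sheet numbers; treat them as assumptions worth double-checking rather than measurements from this post:

```python
# Per-eye panel resolutions (width, height) from published spec sheets --
# these are assumptions for illustration, verify against the vendors' pages.
headsets = {
    "Rift S":     (1280, 1440),  # single 2560x1440 LCD shared across both eyes
    "Reverb G2":  (2160, 2160),
    "Varjo Aero": (2880, 2720),
}

base = headsets["Rift S"][0] * headsets["Rift S"][1]
for name, (w, h) in headsets.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP per eye ({px / base:.1f}x Rift S)")
```

By these numbers the Aero pushes roughly four times the pixels per eye of a Rift S, which makes holding or gaining performance at the default resolution genuinely impressive.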

The first sim I tried was Automobilista 2. I knew its VR implementation to be good from my past experience with the Rift S, and it turned out to be wonderful from all points of view. I have spent more time in it than any other sim to date. I then tried the original Assetto Corsa; it too was wonderful. My third sim was ACC, and I left it to third for a reason. I dearly love the sim, but the VR implementation has never really gotten there. Don't get me wrong, it ran very smoothly and looked OK on the Aero, but I am afraid the graphics of its VR implementation will just never be up there with the others. I then moved on to my flight sims, and if you use MSFS 2020, DCS or IL-2 you are in for a treat (it saved MSFS for me). I have also tried Alyx, and it was a joy to see it like this.

Anyway… sorry for rambling on but as you can tell I am having fun. I have an RSeat with all the appropriate Racing and Flight/HOTAS peripherals and the Aero was the one thing that put the icing on the cake.
 
I don't know if you can run DLSS with VR headsets, but I guess you can.
My monthly IT hardware bible, the German PCGH (PC Games Hardware), has an article about DLSS titled "DLSS: Killer argument for GeForce?", and it more or less answers that question with a resounding yes.
I just have to say that the quality of the DLSS-processed picture output is absolutely astonishing.
And when you also look at the much lower resource impact of this graphics "overdrive", it looks like Nvidia has an absolute killer in Deep Learning Super Sampling.
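The "much lower resource impact" comes from DLSS rendering internally at a reduced resolution and then upscaling with its AI model. A small sketch using the commonly documented per-axis scale factors (the exact ratios are assumptions drawn from published coverage, not values read from Nvidia's SDK):

```python
# Approximate per-axis internal render scales for DLSS quality modes.
# These ratios are commonly reported figures, assumed here for illustration.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    """Internal render resolution DLSS would use for a given output size."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

# Example: a 4K (3840x2160) output target.
for mode in DLSS_SCALES:
    print(f"{mode}: renders at {internal_res(3840, 2160, mode)}")
```

In Quality mode a 4K frame is rendered at roughly 1440p, so the GPU shades well under half the pixels, which is where the headroom in the PCGH benchmarks comes from.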
 
And these types of things are why people buy Nvidia.

It's not AMD's fault, but it is still a reality. NVidia support always comes first. Debugging issues for NVidia always comes first. Optimization for NVidia always comes first.
Yes. One more thing: film-makers in Hollywood simply love Nvidia's Quadro cards. ;)
 
Nvidia is about to start its 2022 GTC, March 21-23. A lot of it will center around AI, but at the end they will likely drop a few hints about the 40 series.

Then everyone can start speculating again about what will drop 6 months from now.
 
Listening carefully to Sebastian's commentary, he seemed more impressed by the FOV than its overall clarity.
I think he was trying to highlight the improvement without possibly offending the folks at Pimax.
As to WMR in general...
It may not be for everyone but it certainly is a game-changer for me.
Given the option, I'll use it every time.
I've had the good fortune of going from the Rift CV1 under a GTX-1070 to a Lenovo Explorer under a GTX-1080Ti.
Both were good to great experiences.
I now own the HP Reverb G2 V1 with the upgraded cable and face plate...essentially the V2 specification.
It is a great headset in every single area and blows the previous two out of the water.
I cannot wait to pair it with a more powerful card which will allow me to play around more with super-sampling.
Interestingly enough...of the previous two headsets, the Lenovo Explorer offered better image quality though in a much less expensive class.
The CV1 offered better overall fit and of course, more utility as it should.
I have not heard anybody mention their expectations for VR or WMR under Intel's upcoming ARC.
I'm really hoping it will be good, since those are the cards I am looking at next.
I only use WMR for simracing, so I can't speak to other types of gameplay.
I hardly ever touch the controllers...actually bought the Lenovo without them.
 
