Is VR dead?

I guess it's great that there is a checkbox so anyone can enable/disable this feature.

If it ever gives me a headache, I'll remember there is an off switch.
 
I guess it's great that there is a checkbox so anyone can enable/disable this feature.

If it ever gives me a headache, I'll remember there is an off switch.
Lots of technical information was provided by someone who is almost certainly much, much more capable, informed, and educated on the topic than you are. I don't know how that warrants a laugh from you.

Not just headaches. It can produce incorrect 3D, loss of depth, an unnatural sense of the world's scale, etc. It's much more than just some possible headaches.
 
As said, I still haven't seen anybody saying iRacing's VR setup is fake, or that it looks fake in VR.
So...

I have only seen a lot of (or at least many) people agree that iRacing's VR method makes a rather small demand on the graphics resources of people's PCs.
Hehe, relatively speaking :)
 
This is a good Reddit post:
I just read some of the outpourings in the links above from a guy who obviously wears a pretty shiny metallic DIY hat.
If I don't have to back my opinions up with anything other than "something I have discovered on my own", then I can guarantee I have a lot of goofy opinions to post.
Hehe, but I usually try to behave myself. :whistling:

Quote:
THIS IS BAD FOR YOUR VISION. Our brain is trying to fit the pieces and perform stereopsis with wrong input data which can result in nausea or even damaged vision with prolonged use. (No links, as this is something I have discovered on my own.)

Don't be scared if you have been using it until now.
Our brain is so marvelous that it will readjust after you stop showing it the wrong images.
 
[image attachment]


I need to build one :)
He also says Apple's VR system is getting close to a release.

 
Why does everyone walking with these devices look like they just crapped their pants? I keep seeing it on rigs like these. With all the thought, effort, and tech going into these devices, is it not possible to design something that allows the user to walk/run like a normal human being? Lol.
 
Looking like you are taking a dump is half the fun!

Although to be fair, that guy almost looks like a gimp on a torture device. So I can see how a majority of us would likely want it to look better than that.
 
I'm looking at my VR system right now, and the complexity is high and the possible points of failure are many.

Dynamic Foveated Rendering is great stuff, but you have to keep up with a number of things.

I had eye tracking stop working after I upgraded to Windows 11. Simple enough: I had to open the security settings and give my computer access to the eye tracking camera.

Then it stopped working again. I was one version behind on my NVIDIA driver, so I upgraded that and it worked.

OpenXR is definitely advancing, but the individual titles need to support new features as well for the best performance.

Right now I have:

DFR in iRacing at 35ppd, but at the OpenXR level. That works and yields a mild performance increase. Don't get me wrong, it works well and looks sharp. This uses all the adjustment settings in the OpenXR Toolkit.

However, in DCS Eagle Dynamics added native Varjo SDK support. As a result we have NATIVE DFR in the Varjo toolkit. That unlocks 39ppd, which looks even sharper. Using this bypasses the OpenXR Toolkit and, amazingly, is both sharper and gives a more noticeable performance increase.
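For anyone curious what DFR is actually doing: the renderer only keeps full resolution in a small region around wherever the eye tracker says you are looking, and drops the shading resolution as you move away from that point. Here is a toy sketch of the idea in Python; the ring sizes and scale factors are made-up illustration numbers, not Varjo's or the OpenXR Toolkit's actual profile:

```python
import math

# Toy foveation profile: (max angle from gaze point in degrees, resolution scale).
# Illustrative numbers only, not the actual Varjo / OpenXR Toolkit values.
FOVEATION_RINGS = [
    (10.0, 1.00),   # inner ring: full resolution where you are actually looking
    (25.0, 0.50),   # middle ring: half resolution
    (90.0, 0.25),   # periphery: quarter resolution
]

def shading_scale(gaze_deg, pixel_deg):
    """Return the resolution scale for a pixel, given the gaze direction and
    the pixel direction, both as (azimuth, elevation) angles in degrees."""
    dx = pixel_deg[0] - gaze_deg[0]
    dy = pixel_deg[1] - gaze_deg[1]
    eccentricity = math.hypot(dx, dy)  # angular distance from the gaze point
    for max_angle, scale in FOVEATION_RINGS:
        if eccentricity <= max_angle:
            return scale
    return FOVEATION_RINGS[-1][1]

# Example: looking 5 degrees right of center, a pixel 30 degrees off to the left
# only gets rendered at quarter resolution.
print(shading_scale((5.0, 0.0), (-25.0, 0.0)))  # -> 0.25
```

The win comes from most of the frame sitting in the cheap outer rings, with the eye tracking moving the expensive inner ring around faster than you can notice.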

So the pixel densities I'm running everything at are now the following (rough pixel math below the list):

39ppd Native DFR
36ppd Native SDK without DFR (basically when my camera isn't working)
35ppd everything else.
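
If anyone wants a feel for what those ppd numbers mean in pixels: pixels-per-degree times field of view gives a rough per-eye render size. The FOV values below are my own assumptions, not Varjo's spec, and this ignores lens distortion and the Aero's non-uniform pixel density, so it's ballpark only:

```python
# Back-of-the-envelope only: render size ~= ppd * FOV per eye.
ASSUMED_HFOV_DEG = 90   # assumed per-eye horizontal FOV, not an official spec
ASSUMED_VFOV_DEG = 80   # assumed vertical FOV

for ppd in (35, 36, 39):
    width = ppd * ASSUMED_HFOV_DEG
    height = ppd * ASSUMED_VFOV_DEG
    print(f"{ppd} ppd -> roughly {width} x {height} per eye ({width * height / 1e6:.1f} MP)")
```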

I found that for all my SteamVR games, if I just backed off the MCAA, I was able to get solid 90fps frame rates. At 35ppd the MCAA isn't as valuable, so frequently I'm running at 2x or even none and everything still looks fantastic.

I'm now playing with OpenXR Motion Correction.
It runs with a Vive Tracker (lighthouse tracking) or a Witmotion module (lower-cost accelerometer). They each have their strengths. There is an inside-out Vive tracker coming this fall.

I had someone with an SFX150 system say that his motion sometimes puts his head through the window or roof in game. So there is a real value to something like this to help keep the cockpit in place with motion systems. The D-Box system I have doesn't have enough motion to do that to me, but I still wanted to play with this to see additional stability.

That's a work in progress.
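
For anyone wondering what motion correction is conceptually doing: the tracker bolted to the rig measures how the platform itself has moved, and that movement is taken back out of the headset pose before the game sees it, so the cockpit stays glued to you instead of your head going through the roof. A very simplified sketch of the idea, not OpenXR-MC's actual code; translations only, rotations omitted:

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous transform for a pure translation."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# Headset pose as tracked in the room, and the rig pose measured by the extra
# tracker on the motion platform (both relative to the room).
headset_in_room = translation(0.10, 1.20, 0.00)   # head carried 10 cm right as the rig tilted
rig_in_room     = translation(0.10, 0.00, 0.00)   # the rig itself moved 10 cm right

# Motion correction: express the headset relative to the rig instead of the room,
# so platform motion cancels out and the cockpit stays locked to your view.
headset_in_rig = np.linalg.inv(rig_in_room) @ headset_in_room
print(headset_in_rig[:3, 3])   # -> [0.  1.2 0. ]  (the 10 cm of rig motion is gone)
```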

Reprojection:
Varjo reprojection 3:1 and 2:1 is a massive improvement in performance if you don't mind how it looks.

In iRacing I can drop my GPU load down to about 21% at 35ppd locked to 45fps and 2:1 reprojection.
In DCS I can drop my GPU load down to 48% at 39ppd locked to 45fps and 2:1 reprojection.

This should mean that many titles are now usable with less powerful video cards than the 4090. I haven't tried 3:1 reprojection yet.
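
The reason 2:1 and 3:1 reprojection frees up so much GPU is just arithmetic: the card only has to render every second (or third) frame and the headset synthesizes the rest. Quick sketch, assuming a 90 Hz refresh (my assumption for the math, not a quoted spec):

```python
# Frame-time budget with N:1 reprojection on an assumed 90 Hz headset.
REFRESH_HZ = 90  # assumed display refresh rate

for ratio in (1, 2, 3):
    render_fps = REFRESH_HZ / ratio     # frames the GPU actually renders
    budget_ms = 1000.0 / render_fps     # time available per rendered frame
    print(f"{ratio}:1 reprojection -> render at {render_fps:.0f} fps, "
          f"{budget_ms:.1f} ms per frame instead of {1000.0 / REFRESH_HZ:.1f} ms")
```

Which is why locking to 45fps at 2:1 gives each rendered frame twice the time budget before you even touch resolution.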

However, I prefer how the displays look without reprojection. I think it may be usable for some people, but it wasn't seamless like the Oculus reprojection, which was the last time I think I actually used reprojection for anything.

About the 39ppd in DCS. Holy Crap is that sharp! 35ppd is pretty darn sharp, however I think (completely guessing) that there is more going on in terms of image enhancement with the native DFR. That's only another 11.4% resolution, so I "think" more is going on. Were I to guess again, because of the Aero's optics that focus more pixels at the very center of the displays, I would bet that 39ppd is the maximum ppd you get when looking straight ahead, and it almost has to drop some in resolution as you look further away from the center of the displays.
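
One small caveat on my own 11.4% figure: ppd is a linear measure, so going from 35 to 39 ppd is actually about 24% more total pixels rendered, which could account for some of the extra sharpness even before any native-DFR magic. Quick check:

```python
# ppd is linear (pixels per degree), so total pixel count scales with its square.
low, high = 35, 39
linear_gain = high / low - 1          # ~0.114 -> the "11.4% more resolution" figure
pixel_gain = (high / low) ** 2 - 1    # ~0.242 -> roughly 24% more pixels rendered
print(f"linear: +{linear_gain:.1%}, total pixels: +{pixel_gain:.1%}")
```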

I had said earlier that I thought anything past 35ppd is probably past the point where you would notice. What I'm seeing now seems to dispute that, so bring on the full retina displays :) The only downside is that the optics have to be extremely good to hold up at this resolution. The Aero's optics appear very good, but at 39ppd, small distortions are easier to make out.

There is still some motion blur that is more noticeable in some titles than others. It can't be in the displays themselves because when your head is still and what you are watching is moving there is no blur. For the most part I've gotten used to this, but it is there. If they can solve this one issue, I don't think I'll have any reservations about this headset. I'd like more FOV, but I'm still so blown away by the visuals I'm now seeing running everything at 35-39ppd that I'm pretty content.

Keep in mind this is still a moving target.
In just the last month we got the following:

DFR in iRacing (OpenXR Toolkit)
Native DFR in DCS (Eagle Dynamics + OpenXR hook)
OpenXR CA/red shift adjustments (for those that notice issues; I haven't) (OpenXR tool)
3:1 and 2:1 reprojection (Varjo Base)

The guy who created the native SDK hook for the Aero in DCS has been trying to do something for Pimax. However, Pimax doesn't appear to be helping themselves in this regard. Also, I'm not sure that Eagle Dynamics will have as strong a reason to add native Pimax support to their title. So for now it looks like it may be a ways off, but to be clear, there is still active work going on as shown below.



Hopefully at some point there will be a standard for headset APIs so that hooking into the best performance is possible.
 
  • Deleted member 197115

And you have zero depth perception with it. It's similar to driving one-eyed - doable, but you're missing certain useful information.
Not entirely accurate. Some informative discussion on reddit on this.

There are many different mechanisms that give you depth perception, which you combine to give you an estimate of the depth of what you observe, e.g.:

  • binocular vision: if you compare the small differences between what the two eyes see, you can know where the objects you see are
  • where your gaze is focused: your eyes can change their focal point; this is why when you look at an object, objects far from it are blurry. You can tell from the muscles that control this where you are focusing = how far away the object is
  • you have an internal model of the world, of how objects are organised into perspective lines. This is why some paintings can have a feeling of depth whereas others just feel like paintings (think of those amazing pavement frescoes you see sometimes)
Now, when you watch a film, some of these mechanisms are absent but some of them remain, so you can still figure out where objects are relative to each other, but this is not the same as seeing objects in true 3D.

And this is the interesting and probably the most surprising one.

The vast majority of your depth perception comes from monocular cues (e.g. this tree occludes that house, therefore the house is further than the tree, or this object is closer to the horizon). So yes, you're using depth perception.

Binocular cues, e.g. the disparity between the two viewpoints of your eyes, are primarily useful for objects about an arm's length away. The reason 3D glasses create such an effective presentation is that the binocular disparity matches all the other depth cues already present in the film. Because they don't match when watching a 2D film, you don't get such an intense percept. Obviously, you're using your binocular depth cues, but they don't match the intended percept, as they tell you the screen is flat.
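
To put a rough number on "about an arm's length away" (my own back-of-the-envelope, not from the reddit post, assuming a typical 64 mm IPD): the angular difference between what the two eyes see falls off roughly as IPD divided by distance, so the stereo signal that is strong at arm's reach is tiny by braking-marker distances.

```python
import math

ASSUMED_IPD_M = 0.064  # assumed average interpupillary distance (~64 mm)

def disparity_deg(distance_m):
    """Approximate vergence/disparity angle for an object at a given distance."""
    return math.degrees(2 * math.atan((ASSUMED_IPD_M / 2) / distance_m))

for d in (0.5, 2.0, 10.0, 50.0):
    print(f"{d:5.1f} m -> {disparity_deg(d):.3f} deg of binocular disparity")
```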
 
I know that we have had debates early on about the strengths and weaknesses of VR vs screens.

Seems this has been hashed and rehashed a few times.

I think we have even had a number of people who flipped preference from VR to screens, so no new ground there either.

I won't debate this subject because it is purely preference and sometimes preferences change over time.
 
  • Deleted member 197115

Was merely pointing at the user experience. I know first hand how great VR can be, when it works.
 
Was merely pointing at the user experience. I know first hand how great VR can be, when it works.
Agreed, when everything is working properly it can be great.

I'm also becoming increasingly aware of how complicated everything is getting and how easy it is for something to be set up wrong.

It's like having someone over and realizing after the fact that the SC2 profile was still set to iRacing while he was in Dirt Rally, or that I didn't have the optimal VR settings.
 
Not entirely accurate. Some informative discussion on reddit on this.



And this is the interesting and probably the most surprising one.
I don't need Reddit to explain to me that what I see and feel in VR is a placebo. It definitely is not. And those non-binocular cues that exist in real life don't exist with flat screens: neither the focus point (everything is in focus), nor the subtle change in objects' relative positions when you move your head (this does exist in VR, though). Judging distances correctly is very important in racing, so "the house is behind the tree, hence it's further away" level of precision isn't gonna cut it.
 
I had someone with an SFX150 system say that his motion sometimes puts his head through the window or roof in game. So there is a real value to something like this to help keep the cockpit in place with motion systems. The D-Box system I have doesn't have enough motion to do that to me, but I still wanted to play with this to see additional stability.
That would probably be me, and I just got rid of that 30cm left/right head movement at full tilt by adding a 60-buck Witmotion tracker to my rig, with SRS enabling the tracker to talk to OpenXRMC.

So far everything is good, easy to set up, basically works out of the box.

Personally, after having experienced it now, I would consider MC a must for flight with a motion rig. For racing I still need to try it, but it never annoyed me apart from the occasional track with banked turns etc.
 
  • Deleted member 197115

I don't need Reddit to explain to me that what I see and feel in VR is a placebo. It definitely is not. And those non-binocular cues that exist in real life don't exist with flat screens: neither the focus point (everything is in focus), nor the subtle change in objects' relative positions when you move your head (this does exist in VR, though). Judging distances correctly is very important in racing, so "the house is behind the tree, hence it's further away" level of precision isn't gonna cut it.
You stated that there is zero depth perception on a flat screen, the same as with single-eye view. Both of these are fundamentally incorrect.
In VR you only gain binocular depth perception, which is good for close-up "arm's length" 1-2m objects and contributes to a relatively small portion of depth perception when sim racing, with the rest coming from monocular (single-eye) cues.
The game is a 3D model projected onto a 2D screen, so you get your depth cues from perspective, relative object size, occlusion, etc.
Close one eye and look at a mid- to long-distance object: how much does it really affect your depth perception?
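
To illustrate the relative-size cue in particular: even on a flat screen, a simple pinhole projection makes apparent size scale as one over distance, which is exactly the kind of monocular information your brain reads depth from. A toy sketch (the car width and focal length are arbitrary illustration numbers):

```python
# Toy pinhole projection: on-screen size scales as 1 / distance.
CAR_WIDTH_M = 1.9         # rough width of a race car (illustrative)
FOCAL_LENGTH_PX = 1000    # arbitrary focal length in pixels for the illustration

for distance_m in (10, 20, 40, 80):
    apparent_px = FOCAL_LENGTH_PX * CAR_WIDTH_M / distance_m
    print(f"car {distance_m:3d} m ahead -> about {apparent_px:5.1f} px wide on screen")
```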
 
