ACC VR - The definitive VR Performance Guide

Hi,

I have been polishing ACC for VR and made a guide to help others out.
This guide is mostly for the Valve Index, Vive and other SteamVR headsets; it has not been tested with Oculus. However, if you get good results with an Oculus headset, let me know so we can update the guide.


In this guide I will explain in detail how to get the most out of Assetto Corsa Competizione in terms of VR performance.


This guide is up to date with version 1.8, although I am not 100% happy with the VR performance introduced in 1.8.

There will probably be updates to this guide after some long-term testing, since I am not perfectly satisfied with the performance yet. However, this guide should be a good baseline for anyone looking to improve their VR performance significantly.

Last Update: 2022-11-17

Thank you so much for the guide.... and for it being in text form and not a video I have to wade through! I can't wait to try it when I get home (currently away with work) as so far everything I've tried hasn't worked and the VR ACC experience has so far been terrible.

Quick (and probably naïve) question as I'm totally new to VR - do I need to make any changes to WMR settings?

12700K/3080Ti/32GB 3600MHz/Reverb G2 v2
 

*UPDATE* Finally got around to testing this out, and the results are bloody fantastic... thank you so much. I was ready to give up on ACC for VR, but this really has made the game/VR experience enjoyable. I was able to lock in 90 FPS; the only thing I tweaked a bit was pixel density (only up to 140... when I tried 150 it dropped FPS by about 10).

I haven't tried tweaking the engine.ini yet, but would be interested in people's experience doing this with 30-series graphics cards...
 
You have pretty much all the same specs as me. I left pixel density at 100 and am getting a consistent 90 FPS. I haven't tried 140 yet; I'll give that a shot. Did you have to turn down any other settings from this guide to achieve this?

Edit: Never mind, I can't raise pixel density past 110 without dropping under 90 FPS with cars on track.
 
Yep, I found 120% PD was the highest I could run and maintain a solid 90 FPS.

I'm still playing around with the various other settings... the main things I'm trying to optimise are detail on opponent cars and reducing shimmering on distant objects. Have been reading up on the use of LOD bias vs. anisotropic filtering, but have yet to properly test.
 
LOD bias has been broken in Nvidia drivers since around the 700 series; Clamp vs. Allow doesn't change anything.
You could force it for DX9 for a while, but that possibility is gone too.

You might be able to shift the LODs via the ini files, but for most games shifting into positive values just kills detail without really getting rid of shimmering.
Negative values will increase tiny details and cause shimmering, though.
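If you want to experiment with that, the usual route for a UE4 game like ACC is adding console variables to Engine.ini (for ACC that file is typically under %LOCALAPPDATA%\AC2\Saved\Config\WindowsNoEditor\ - double-check the path for your install). A hedged sketch; which cvars the game actually honours, and the values here, are assumptions to test yourself:

```ini
[SystemSettings]
; Values above 1.0 should switch static meshes to lower-detail LODs sooner
; (calmer distant shimmer, but less detail); below 1.0 does the opposite.
r.StaticMeshLODDistanceScale=1.25
; A positive mipmap bias slightly blurs distant textures; a negative one
; sharpens them and tends to bring the shimmering back.
r.MipMapLODBias=0.5
```

Back up the file before editing, and change one value at a time so you can see what each does.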

For anisotropic filtering I've found that the shimmering and aliasing stay the same no matter the setting.
The only thing that changes is how sharp the shimmering/aliasing looks.
At 4x you will have blurry textures shimmering; at 16x you'll see exactly what's shimmering.

Either way, it's a mess imo. I don't like how games evolved towards more and more detail with more and more aliasing and shimmering.

It's getting better with TAA, DLDSR, DLSS etc., but a few years ago it was really bad, with most games using DX11 and only FXAA.
Great for screenshots, horrible in motion.

Anyway, I'm personally really liking the combination of DLDSR + DLSS in ACC now.
It smooths things out without much GPU load, since DLSS cancels out the load from DLDSR.

No idea how you can combine them, but you can in ACC and it looks great lol
 
Thanks very much for the explanation/details mate, really helpful.

Could you clarify DLDSR? Can't find much info on it....
 
DLDSR is very new; it was added in the latest Nvidia driver for the 20 and 30 series.
DSR (Dynamic Super Resolution) is quite old.
Example: you have a 1080p monitor and select 4x DSR, so your GPU renders at 4K and then a filtering algorithm, similar to anti-aliasing, scales it down to 1080p.
Depending on the smoothing in this filter algorithm, you'll get more detail or more anti-aliasing.
Preferably both.

Now DLDSR uses DLSS to scale the image up before it gets filtered and scaled down, instead of the GPU rendering at the higher resolution.

Only two factors are available right now, though.
So with DLDSR, DLSS will be used to scale your 1080p image up to 4K, then the filtering algorithm from DSR will scale it back down to 1080p.

Similar look to DSR but much lower GPU load!

Normally DLSS makes your GPU render the image below your monitor's resolution (for example 720p) and scales it back up, which gives you almost the same quality with reduced GPU load.

DLDSR instead increases GPU load slightly while giving better image quality.
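For context on how far below native DLSS renders: the per-axis scale factors commonly cited for each quality mode (the numbers below are an assumption, not something from this thread) can be turned into internal render resolutions in a couple of lines of Python:

```python
# Approximate DLSS internal render resolution per quality mode,
# using the commonly cited per-axis scale factors (assumed values).
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def render_res(width: int, height: int, mode: str) -> tuple[int, int]:
    scale = MODES[mode]
    return round(width * scale), round(height * scale)

print(render_res(2560, 1440, "Quality"))      # (1707, 960)
print(render_res(2560, 1440, "Performance"))  # (1280, 720)
```

So a 1440p target in Performance mode would render internally at 720p, which is the kind of drop described above.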

And you can use both in ACC. I'm not sure how they are chained and which resolutions are used for what.
It can't be two full render cycles, because your GPU only renders once, at one resolution.

But somehow they combine, and it makes a difference whether you select DLSS or no AA.

The results are great, so I don't care haha.

Maybe it's like this.

Numbers for the factor:

2560 x 1440 = 3,686,400 pixels
3840 x 2160 = 8,294,400 pixels

3,686,400 x 2.25 = 8,294,400 pixels
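The arithmetic behind that 2.25x factor can be sanity-checked in a few lines of plain Python (2.25 is just 1.5x per axis, squared):

```python
# Pixel counts for 1440p and 4K, and the DLDSR factor between them.
qhd_pixels = 2560 * 1440   # 3,686,400
uhd_pixels = 3840 * 2160   # 8,294,400

factor = uhd_pixels / qhd_pixels
print(factor)  # 2.25
```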

- 1440p monitor
- 2.25x DLDSR
- DLSS set to Quality

1. DLSS lets the GPU render at 1080p and scales it back up to 1440p.

2. DLDSR scales this 1440p image up to 4K, then filters it down to 1440p.

So you get the GPU load of rendering at 1080p, plus two DLSS scaling passes, plus one filtering pass down.

Overall this causes more GPU load than rendering at 1440p without DLDSR or DLSS, but on my monitor it looks a lot better than 1440p + TAA!

Sadly the resolution scale is greyed out, so you can't fine-tune the GPU load.

Btw, using 1440p + TAA + a higher resolution scale until the GPU load is identical to DLDSR + DLSS looks nowhere near as good!

I need to take some screenshots... The cables at Spa on the outside between Les Combes and Bruxelles are a great spot to compare quality!

AMD FSR results in gaps in the cables. DLDSR + DLSS is the only combination that makes them look good!
 
Thanks very much for the detailed response... my brain is still recovering from trying to understand it all (it still hasn't!).

When I went to test DLDSR this morning, my Nvidia control panel showed it wasn't an option in the ACC program-specific settings. Do I just need to set it in the global settings?

Re: DLSS - are you using the standard Nvidia in-game DLSS or the modified OpenVR version from LynxSec's guide?

Overall this causes more GPU load than rendering at 1440p without DLDSR or DLSS, but on my monitor it looks a lot better than 1440p + TAA!

When you say 'my monitor', do you mean your HMD, or are you using a flat screen?
 
When I went to test DLDSR this morning, my Nvidia control panel showed it wasn't an option in the ACC program-specific settings. Do I just need to set it in the global settings?
Yes, it needs to be set in the global settings! I got tricked a few times too :p
The DLDSR/DSR resolutions will become available for desktop use too.
When you say 'my monitor', do you mean your HMD, or are you using a flat screen?

Re: DLSS - are you using the standard Nvidia in-game DLSS or the modified OpenVR version from LynxSec's guide?
Currently I only have my 3440x1440 G-Sync monitor. I did own an Oculus Rift for a few months, though, so sadly I can't give much help with all this stuff in VR.

So I'm using Nvidia's standard in-game DLSS.

I'm also not sure if DLDSR is available for VR... it probably needs a similar hack/toolkit to make it available.
 
Understood - thanks for the swift response. Will have a play with DLDSR to see if it helps matters in VR.
 
Are you using a FreeSync monitor in G-Sync Compatible mode, or triple screens with/without Nvidia Surround?
Also, if you're using different monitors for triples, the option might vanish.
Just one, a Samsung Odyssey G9 in 240Hz mode with G-Sync and HDR enabled. On the latest drivers, 2070 Super.
In my list it goes from CUDA - GPUs to Low Latency Mode and just skips the DSR entries.
 
Ahh, a friend of mine has the same issue with that monitor and a 3080.
It's either the G-Sync or the HDR, probably due to the combination with that massive resolution.

That monitor doesn't have a G-Sync module inside, so it runs in "G-Sync Compatible" mode.
You could try disabling G-Sync, rebooting, and checking for DSR.
Then disable HDR, reboot, and check for DSR again.
Or change to a more standard resolution like 3440x1440, or 2560x1080 if you have the 1080p model.
 
