Perfect SLI Scaling w/ rFactor 2 - Great Job ISI!

Settings and PC Info
- ISI NSX and Portugal, 12:00 AM, Rain, no opponents, pit garage
- GPU: 2x GTX 970 (SLI)
- Nvidia Driver: v353.30
- Nvidia Profile: All default except using "prefer maximum performance" and max pre-rendered frames @ 1.

NOTE: The overall framerates in these tests aren't as high as they could be, because the tests were performed with my GPU, CPU, and RAM at the following slow default speeds:
- GPU @ 1315/7000 MHz core/memory (instead of 1500/7500 MHz)
- CPU @ 3.6 GHz (instead of 4.5 GHz)
- RAM @ 16-16-16-39 2T timings (instead of 11-11-11-26 1T)

TEST #1



Default Compatibility Bits - 0x02402005 (Sanctum, UAZ Racing 4x4, rFactor 2[rFactor2 Mod Mode.exe], etc.)

With the default SLI compatibility bits I had some really weird results:
- Cockpit camera: both GPUs around 55% usage, with poor SLI scaling and framerates.
- TV cam: same thing, both GPUs around 55% usage and around 75 fps.
- Chase cam (or any other cam, like the cam near the wheels, side panel, etc.): framerates rise to 145 fps with both GPUs at 98% usage. Wow!
- Then I go back to the TV cam and, unlike the first time, GPU usage stays at 98% with framerates in the 140s.

Basically, the cockpit cam always scaled poorly in SLI, any other cam scaled great, and the TV cam scaled poorly if the previous cam was the cockpit but scaled great if the previous cam was one that scales great (e.g. chase cam). Weird.


Using SLI Compatibility Bits: 0x02D04005
I just tried the SLI compatibility bits that I've seen others mention as of RF2 b982: "0x02D04005 (LEGO Batman 2, LEGO Pirates of the Caribbean: The Video Game, LEGO Lord of the Rings, LEGO Star Wars III: The Clone Wars, LEGO Harry Potter: Years 1-4, LEGO Star Wars Saga, LEGO Harry Potter: Years 5-7)". I'm happy to report that all issues are now gone: GPU usage is at 98% on both GPUs no matter which view is used, including the cockpit view, where I'm getting over 130 fps :)




TEST #2 Both Reflections @ "High" - everything else same as above

Default SLI Compatibility Bits
- GPU Usage = 55% (both GPUs)
- Framerate = 60 fps

Note: Always poor SLI scaling no matter the view

SLI Compatibility Bits 0x02D04005 (LEGO Batman 2, etc.)

- GPU usage = 98% (both GPUs)
- Framerate = 120 fps

Note: Wow! Each GPU's usage stays in the mid-to-high 90s % no matter what view is used and framerates are doubled. FANTASTIC!



TEST #3 - Max Graphics - except FXAA (disabled) and AA (Lvl 2)

Default Compatibility SLI Bits
- GPU Usage = 55% (both GPUs)
- Framerate = 42 fps

Note: Always poor SLI scaling no matter the view. Not only that, but there was major graphics corruption (solid pink colouring) on the outer monitors.

SLI Compatibility Bits 0x02D04005 (LEGO Batman 2, etc.)

- GPU Usage = 98% (both GPUs)
- Framerate = 88 fps

Note: Even with all graphics settings maxed out, SLI scaling remains pretty much perfect, no matter the view. The graphics corruption was also gone. FANTASTIC!



TEST #4 - Older Content
Same gfx settings as Test #3 from my OP (all graphics @ max except FXAA disabled and AA @ Lvl 2)
- Historic Eve F3
- Historic Spa
- In garage
- Night time
- Rain

Results

1x GPU = 60 fps
2x GPU (SLI) w/ default RF2 comp. bits = 52 fps
2x GPU (SLI) w/ "Lego" comp. bits = 103 fps
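A quick way to read these numbers is as a scaling factor (SLI fps divided by single-GPU fps; 2.0x would be perfect for two cards). A small illustrative snippet, using only the figures above:

```python
# SLI scaling factor: multi-GPU fps divided by single-GPU fps.
# 2.0x would be perfect scaling for two cards; below 1.0x means
# enabling SLI actually costs performance.
def sli_scaling(single_fps: float, sli_fps: float) -> float:
    return sli_fps / single_fps

default_bits = sli_scaling(60, 52)   # default rF2 profile: 52 fps in SLI
lego_bits = sli_scaling(60, 103)     # 0x02D04005 "LEGO" bits: 103 fps

print(f"default bits: {default_bits:.2f}x")  # below 1.0x, a net loss
print(f"LEGO bits:    {lego_bits:.2f}x")     # ~1.72x of an ideal 2.00x
```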



Conclusion: Until Nvidia sets it as default for rFactor 2, try SLI Compatibility Bits 0x02D04005 (LEGO Batman 2, etc.). The SLI experience, for me, goes from "broken" to great!

- Check out my positive Stock Car Extreme results here
 
Last edited:
Hey Spinelli, I know you're the graphics guru and I'm not trying to argue with you, just trying to help the guy out. I've been using rF2 with SLI for the past year and a half, and that's how it is on my system.
View attachment 99679
If I don't enable SLI on this screen, it simply does not work. Every time I update my driver it changes to disable SLI and I have to change it back; it does not work "automatically".
Also, in Inspector, when I click "restore NVIDIA defaults" on the rF2 profile, this is the result:
View attachment 99680

View attachment 99681
Bottom line: if I do not change it to the settings I told Paul, it doesn't work properly. Again, like I said, on my system.
Hi billyblaze13. I'm just trying to help as much as I can too :)

I apologize about the SLI thing. I didn't know you were talking about that other screen (first screenshot you linked to). I thought you were talking about the actual settings in the 3D settings and/or in Inspector. So, yes, you're right, you have to make sure you've enabled SLI in that other screen or your PC is not running in SLI regardless of game or your other graphics setting.

In the Inspector settings, though, you don't need to touch anything else to enable SLI. All you need to do is, under the rFactor 2 profile, change the "SLI compatibility bits" (not the "SLI compatibility bits (DX1x)") to the 0x02D04005 LEGO profile. The SLI settings that say "auto" or even "four" don't matter.

It can be different for some rare cases of different systems, I guess, but I'd find it extremely odd if you had to mess with those SLI settings in Inspector to get SLI to work (other than the comp bits which you definitely should change).

What motherboard are you running Spinelli? Looking to do an upgrade later in the year to the new i7 series CPU so will be needing a new mobo and RAM too. Not seen much that offers dual x16 lanes for SLI last time I checked.
I'm not running the more mainstream line, I run the "E" series line. These offer DDR4 memory (instead of DDR3), quad channel memory (instead of dual channel), 40 PCI-E lanes (instead of 16) and in most cases 6/12 cores/threads (instead of 4/8 cores/threads).

So if you want full PCI-E speeds on both GPUs, you need to go with either Sandy Bridge-E (X79 chipset, LGA 2011 socket), Ivy Bridge-E (also X79, LGA 2011), or Haswell-E (X99, LGA 2011-3).

CPU - # of lanes - # of cores/threads
SB-E:
3820K - 40 - 4/8
3930K - 40 - 6/12
3960X - 40 - 6/12

IB-E:

4820K - 40 - 4/8
4930K - 40 - 6/12
4960X - 40 - 6/12

H-E:

5820K - 28 - 6/12
5930K - 40 - 6/12
5960X - 40 - 8/16
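The table above can be expressed as a lookup so the "enough lanes for dual x16" check is explicit (figures copied from the table, not independently verified):

```python
# (PCI-E lanes, cores, threads) per CPU model, from the table above.
CPUS = {
    "3820K": (40, 4, 8),  "3930K": (40, 6, 12), "3960X": (40, 6, 12),  # SB-E
    "4820K": (40, 4, 8),  "4930K": (40, 6, 12), "4960X": (40, 6, 12),  # IB-E
    "5820K": (28, 6, 12), "5930K": (40, 6, 12), "5960X": (40, 8, 16),  # H-E
}

def full_dual_x16(model: str) -> bool:
    """True if the CPU has enough lanes for two GPUs at PCI-E x16 each."""
    lanes, _cores, _threads = CPUS[model]
    return lanes >= 32

print([m for m in CPUS if not full_dual_x16(m)])  # only the 5820K falls short
```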


NOTES:
- The Sandy Bridge-E line needs a patch to make PCI-E 3.0 work, or else you're stuck at 2.0. The patch takes two seconds to apply and seems to work fine. However, it apparently only works with C2-revision CPUs, not C1, so make sure to find out before buying.
- I'd avoid the 5820K, as you'll still be limited (only 28 lanes).
- Unless you find an amazing deal, I'd ignore the 3960X and the 4960X; they are 99% the same CPU as the 3930K and 4930K yet, when brand new, were almost double the price ($1000 instead of $550 or so).
- Don't worry about the 6 or even 8 cores; most games don't take advantage of them yet (some do, but not all). They make a big difference in other things, like rendering videos, but for 90% of gaming they don't.
- Apart from the benefits I mentioned in the first paragraph and a few other minor things, the 3000 line (SB-E) is pretty much a 2600K, the 4000 line (IB-E) is pretty much a 3770K, and the 5000 line (H-E) is pretty much a 4770K/4790K.
- The x820K models are priced about the same as their equivalent non-E-line models (e.g. the 3820K was priced about the same as the 2700K).


I suggest a 4820K (or a 3820K if you're sure it can fully support PCI-E 3.0). If you don't mind spending a bit more, I'd go with a 5930K which is what I have (used to have a 4930K). If you have money burning a hole in your pocket then get a 5960X, lol.


The following picture shows results running 4x GTX 680 @ 16x/8x/8x/8x, but in PCI-E 2.0 only, which is equivalent to a dismal 8x/4x/4x/4x in PCI-E 3.0 terms. Look at the improvement when the CPU was patched to enable PCI-E 3.0:

[4-way GTX 680 PCI-E 2.0 vs. 3.0 benchmark chart]







Tip:
Buy second hand on Craigslist or eBay :) I've bought all my CPUs and motherboards (and much more) that way and saved a ton. Sometimes I even come out on top when upgrading. Because I sold my 4930K CPU and ASUS Sabertooth X79 motherboard, I hardly put any additional money into purchasing my 5930K and ASRock Fatal1ty X99X Killer motherboard.
 
The thing is, most people never run more than two video cards in SLI, and there's practically zero improvement from PCI-E 2.0 to 3.0 in a 2-way SLI configuration. It really takes 3- and 4-way SLI configurations before PCI-E 3.0 has any benefit, and even then it's hit or miss depending on the application.
 
Yup, but it's massive with RF2 (and possibly other ISI engine based games) even with just 1 GPU.

From a previous post of mine....
It gets worse, though. Most games only lose around 3-5% when going from PCI-E 3.0 16x down to 8x, but rFactor 2, for some reason, is an exception. There have been many tests about this in the official ISI rFactor 2 forum, and people, including me, have seen framerate increases of 10% to a whopping 40% just by going from PCI-E 3.0 8x to 16x - and you're essentially running 4x in SLI or 8x in single-GPU mode.


Actually, I just did some more research. Here are some further large differences:
Battlefield 3 with SLI GTX 680s, comparing PCI-E 3.0 @ 8x (equivalent to PCI-E 2.0 @ 16x) to PCI-E 3.0 @ 16x, in single-screen as well as triple-screen tests.

[Battlefield 3 PCI-E scaling benchmark chart]


Some good improvements in 1080p in this test, and in triple screens, the differences are massive.


One more thing to note: if you're running Sandy Bridge (e.g. i7-2600K) or older, you're on PCI-E 2.0; in that case, PCI-E 2.0 @ 8x is equivalent to only PCI-E 3.0 @ 4x.
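These equivalences follow directly from per-lane bandwidth. A back-of-the-envelope sketch; the per-lane rates are the commonly quoted effective figures after encoding overhead, not numbers from this thread:

```python
# Effective per-lane bandwidth in MB/s (one direction):
# PCI-E 1.x and 2.0 use 8b/10b encoding, 3.0 uses 128b/130b.
PER_LANE_MBPS = {1: 250, 2: 500, 3: 985}

def link_bandwidth(gen: int, lanes: int) -> int:
    """Total one-direction bandwidth of a PCI-E link in MB/s."""
    return PER_LANE_MBPS[gen] * lanes

print(link_bandwidth(2, 16))  # 8000 -> PCI-E 2.0 @ 16x
print(link_bandwidth(3, 8))   # 7880 -> roughly equal to 3.0 @ 8x
print(link_bandwidth(2, 8))   # 4000 -> close to 3.0 @ 4x (3940)
```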
 
Yeah, it's weird; it seems very system-dependent, as GTX 680s should be nowhere near maxing out the bandwidth, yet the difference in the triple-screen test I linked to was pretty huge. And the differences are big even with one card and one 1080p monitor in rFactor 2, for some reason (and possibly in other ISI-engine-based games like SCE, RF1, etc.).
 

I'm still on a socket 1155 2500K @ 4.4 GHz, so I'm on PCI-E 2.0 at 8x in SLI. However, I recently tried a single GTX 980 Ti overclocked to 1500 MHz against my GTX 980s in SLI at 1425 MHz; in most games the 980 Ti feels just as quick, which is probably down to the SLI scaling in most games. Project CARS is the only game I can't run at the same settings on the single 980 Ti vs. the 980s in SLI, and even then I didn't have to back the settings down much.

I'm waiting to see how Skylake performs before I decide whether to upgrade to it or to Devil's Canyon. I'll be on PCI-E 3.0 then.
 
Yes, the differences are massive when quad-core-supported games and benchmarks are used, but how many quad-core-supported games are there? Right, not many. SLI at PCI-E 3.0 x16/x16 is wasted money; in 99% of games you gain only 2-5 fps, and rFactor 2 is one of them. Please don't tell me about that thread from the ISI forums about benchmarks and PCI-E lanes. Why not? Because those results are based on trust, and ISI fanboys cannot be trusted, that's why :(.

The default compatibility bits 0x02402005 don't work. Why not? Because they are NVIDIA's default compatibility bits for 4-way SLI AFR, that's why. NVIDIA's default 2-way SLI AFR bits, 0x02402001, give the same scaling and fps as SLI compatibility bits 0x02D04005 if, in NVIDIA Inspector, you set SLI > "Number of GPUs to use on SLI rendering mode" > SLI_GPU_COUNT_TWO and "NVIDIA predefined number of GPUs to use on SLI rendering mode" > SLI_PREDEFINED_GPU_COUNT_TWO.

"Perfect SLI Scaling w/ rFactor 2 - Great job ISI" are your words, but is it really a great job? I don't think so.
I wonder why, after all these years of SLI problems, ISI does not submit the latest build of the rF2 executables to NVIDIA so that an updated working SLI profile can be created.
 
OK, it seems multiview is the killer setting for me. The best I got on the start line with 10 AIs, with multiview and half-decent graphics settings, was low-50s fps; with multiview turned off on the same settings I'm getting 115 fps.

Hi Paul,

If you are using Windows 10, SLI profiles don't work with WHQL 353.62: it applies the _GLOBAL_DRIVER_PROFILE (Base Profile) to all games. WHQL 353.30 is fine with Windows 10.
WHQL 353.62 works fine with Windows 7 and 8.
NVIDIA Inspector has two profiles, "rFactor 2 (rFactor2 Mod Mode.exe)" and "rFactor2.exe". I think on our PCs both profiles are attached to the 32-bit rFactor2.exe and not the 64-bit one. What you can do is delete one of the profiles (it doesn't matter which). Then, in the profile you want to use: Apply Changes > Add application to current profile > rFactor2 root folder > bin64 > rFactor2.exe > Open > Apply Changes, and you are ready to try Spinelli's miracle SLI compatibility bits 0x02D04005 :)
 
PCI-E version/lanes make a huge difference in rFactor 2. Please stop spreading your agenda bullshit around. There have been tons of tests from many, many users confirming this.

There have also been tons of people confirming that the LEGO SLI bits work great (as of the current build, b982).

I don't even see what any of this has to do with being a fan boy or not. It's about tests and results.

You have a major problem and it blinds you from the truth even when the results/facts are presented to you by many other users.
 

There is nothing special about rFactor 2. gMotor2 doesn't give you better fps than any other game with dual-core support. The test in the ISI forums about PCI-E was based on trust; no files were posted, so a test like that is wasted time. The only one spreading bullshit about PCI-E 3.0 x16/x16 SLI is actually you. Your source is the ISI forums; my sources are documents from NVIDIA, Intel, etc. rFactor 2 doesn't have quad-core support; it is simply not technically possible for rFactor 2 to take advantage of PCI-E 3.0 x16/x16, because rFactor 2 only has dual-core support, like most other games. I really hope nobody believes your claims and spends loads of money to get 5 more fps in most games.

"Perfect SLI Scaling w/ rFactor 2 - Great job ISI" are your words, but what is the truth?
otta56 found by pure chance that SLI compatibility bits 0x02D04005 work with build 982 and the default NVIDIA SLI profile as-is, and of course a guy like you, who likes to have a reputation as a knowledgeable member, makes the bold statement "Perfect SLI Scaling w/ rFactor 2 - Great job ISI". There is no guarantee that 0x02D04005 will work with the next build. What has ISI done? Nothing! "Thanks, otta56 - great job", not ISI, is the naked truth.

When ISI submits the latest build of the rF2 executables to NVIDIA so that an updated working SLI profile can be created, then you are free to make bold statements about ISI's great job.

PS. Read my #29 again about SLI compatibility bits. I never stated that SLI compatibility bits 0x02D04005 do not work.
 

– Optimized single-pass HDR for multiview.
– More HDR process optimizations
– Added multiview adjustments. Values in Config.ini, as well as parameter to disable new functionality (to retain use of older FOV options).
– Additional SLI optimizations when using reflections.

How is that nothing? Before SLI didn't work at all, now it does, along with multiview improvements and proper support for multiview adjustments. A lot has been done.
 


rFactor 2 has always worked with SLI if you were able to make a profile for your specific hardware.
If you read the official NVIDIA documentation for developers, it points out clearly that SLI performance depends not only on the GPU drivers but also on the game's programming. It is easy to understand: the NVIDIA profile (rFactor2 Mod Mode.exe, rFactor2.exe) was created 2011-09-22, and when ISI updates gMotor2 with "additional SLI optimizations" etc. in 2015 but never submits the latest build of the rF2 executables to NVIDIA so that an updated working SLI profile can be created, then it is ISI, and their communication and relationship with NVIDIA, that is the reason for rFactor 2's SLI problems.
The truth is that otta56 found by pure chance that SLI compatibility bits 0x02D04005 work with build 982 and the NVIDIA profile (rFactor2 Mod Mode.exe, rFactor2.exe) created 2011-09-22. The NVIDIA profile is more than four years old and has never been updated. ISI has not done a good job improving rFactor 2's SLI performance, that's for sure. It is thanks to otta56 that guys like you can play rFactor 2 on an NVIDIA SLI gaming PC.
 
There is an entire thread with tons of people sharing their results about how rFactor 2 mysteriously loses a lot of frames-per-second with anything other than PCI-E 3.0 @ 16x. Many users, with many different hardware and software setups all confirmed this. Why would anybody, let alone a whole ton of people, make that up? You are acting crazy and paranoid (for what reason? I have no idea).

Why do you say it's impossible? How the hell do you know what is possible and not possible? Computers are so complex yet you're claiming that something isn't possible.

Many people are finally having great success with RF2 and SLI, including with triple screens, multiview, and HDR, thanks to b982 and the LEGO bits. SLI hardly worked before then regardless of which GPU bits/settings you used.

You have some major trust problems. Please stop spreading around misinformation and your delusions.
 

I am discussing SLI PCI-E 3.0 x16/x16 vs. x8/x8 with you, not single-card PCI-E 3.0 x16 vs. x8. Your claim that PCI-E 3.0 x16/x16 makes a huge difference in rFactor 2 is a lie and technically impossible, simply because rFactor 2 has dual-core support, not quad-core support. The problem is that somebody may believe you and invest loads of money in nonsense. Please stop spreading misinformation like this.
I don't know how many times I must repeat this: YES, the LEGO bits work fine when the stock profile is used, but they are not miracle bits. Read my #29 and you may understand what I mean.
 
Ughhh... What does having a dual-core CPU, or quad-core, or single-core, tri-core, 6 cores, 8 cores, 4/8/12/16 threads, etc. have to do with PCI-E bandwidth? I don't think you understand what you're talking about.

Once again, there is a thread with TONS of people doing tests, and we all came to the same conclusion: rFactor 2 mysteriously loses a lot of performance any time you are not using PCI-E 3.0 @ 16x for each GPU (whether you use 1 or 2).


275 posts with almost everyone's results proving it --> http://isiforums.net/f/showthread.p...rf2-using-PCI-e-3-0-x16-with-higher-end-cards!

I think it was first talked about somewhere in this thread --> http://isiforums.net/f/showthread.php/21983-Live-Performance-Benchmarking-Comparison-for-rFactor-2

More discussions here; of course, most ignorant people there don't believe all the proof and results, because it's such a weird thing to happen. I'm not surprised by their skepticism, since it's a very rare thing to happen in videogames --> http://linustechtips.com/main/topic/226787-huge-fps-gains-on-pcie-30-vs-20-1080p-in-rfactor-2/


Stop calling me and a ton of other people liars. There is no reason for so many people to lie about this. Do you think we all work for Intel and motherboard companies and are spreading misinformation in order to get people to buy more expensive CPUs and motherboards? Why would a 275 post thread be full of liars all agreeing to post fake results? You're crazy. And if we were trying to do that, then we would spread misinformation about a huge mainstream game not a small, niche product.

Stop your crap. It's ridiculous. We have tons of proof from many users with consistent results from PC system to PC system.

And the LEGO bits + b982 combo seems to work awesomely for most people. SLI hardly ever scaled positively before that; it would almost always be worse than 1 GPU, and sometimes just a tiny bit better, but now it's great with b982 + the LEGO bits.
 

You are free to call me crazy or say that I have major trust problems; I really don't care. Your off-topic personal attacks only confirm to me who you are and what level you are at. I don't claim that tons of people are liars. What I am saying is that post #275 from the ISI forums is nonsense. That rFactor 2 "mysteriously loses a lot of performance" and it is "such a weird thing to happen" is the best joke I have heard in a long time. Do you mean this claim seriously, or are you joking?
If you don't know how much better fps and performance quad-core-supported games get than dual-core-supported games when it comes to use of PCI-E lanes, and how the system works, I really cannot help you; maybe google a bit more?
However, as an owner of both an i7-4770K with two-way SLI GTX 780 Ti and an i7-5930K with two-way SLI GTX 980 Ti, I think I know better than most how much fps you can gain with a better system and PCI-E 3.0 x16/x16 vs. x8/x8. With your settings from post #1, my i7-5930K with two-way SLI GTX 980 Ti at PCI-E 3.0 x16/x16 gains only 9-13 fps over my i7-4770K with two-way SLI GTX 780 Ti, and that is not because of PCI-E lanes; it is because of the better graphics cards. In some quad-core-supported games like BF4 and GTA V the gain is about 15-20 fps. I think you should watch this video: "PCIe Lanes - PCIe 8x vs 16x in SLI" (i7-5820K PCI-E 3.0 @ x8 vs. i7-5930K PCI-E 3.0 @ x16).
Your agenda is: "Perfect SLI Scaling w/ rFactor 2 - Great job ISI!!!" What makes you make a claim like this when it is only bullshit? Everybody knows that ISI's work with NVIDIA SLI is poor. It is easy to understand: the NVIDIA profile (rFactor2 Mod Mode.exe, rFactor2.exe) was created 2011-09-22, and ISI never submits the latest build of the rF2 executables to NVIDIA so that an updated working SLI profile can be created. It is ISI, and their communication and relationship with NVIDIA, that is the reason for rFactor 2's SLI problems.

Build 910: SLI compatibility bits 0x42C06405
Build 930: SLI compatibility bits 0x02C00405 (F1 2010, Icarus)
Build 982: SLI compatibility bits 0x02D04005 (LEGO, etc.)
Next build: nobody knows. There is no guarantee that 0x02D04005 (LEGO Batman 2, etc.) will work with the next build. Good work, ISI? I don't think so.

Once again, YES, SLI compatibility bits 0x02D04005 (LEGO Batman 2, etc.) work with build 982, but if you are using Windows 10 they work only with WHQL 353.30. They do not work with WHQL 353.52 or WHQL 355.60 on Windows 10.

Sorry to say this, but people like you harm the development of rFactor 2 more than you can imagine.
 
You keep going on about nothing. The point is that rFactor 2, for whatever reason, loses a ton of performance when not running at full PCI-E speed (PCI-E 3.0 @ 16x). The video you posted is irrelevant to the discussion.

There is a 275-post thread with many people doing the test and confirming the results. Tons of proof from many people, yet you argue against all of it just because it's too difficult for you to believe. Well, it was for me too, and for many others; that is why we did tests, and many kinds too (single GPU, dual, single screen, triple, etc.).

P.S. About the personal attacks: you come on here claiming that I, along with a 275-post thread full of people, am a liar. It is also crazy that you don't believe a ton of users' tests. Why would a ton of rFactor 2 players lie about PCI-E performance tests? Why would we all get together to create such a massive lie with such a large amount of fake test results? What benefit could that serve a bunch of sim-racing gamers? It literally sounds crazy...

Anyway, the tests/numbers/results speak for themselves. Keep arguing your theories and opinions, even though there is a 275-post thread full of hard test data that reveals the truth.
 

If you like to call yourself a liar, that's fine by me; those are your words, not mine. Post #275 is nonsense, as is your hilarious claim that rFactor 2 "mysteriously loses a lot of frames per second" with anything other than PCI-E 3.0 @ x16/x16. We are discussing SLI at PCI-E 3.0 x16/x16 vs. x8/x8, not a single card at x16, and the video clearly shows that there is no gain from two-way SLI at PCI-E 3.0 x16/x16 vs. x8/x8. rFactor 2 is no different from any other game; there is nothing mysterious about rFactor 2's performance. The only mystery is that the NVIDIA profile (rFactor2 Mod Mode.exe, rFactor2.exe) was created 2011-09-22 and there is still no updated working SLI profile after all these years.
The truth is that your NVIDIA gaming PC with two-way SLI GTX 970 at PCI-E 3.0 x16/x16 gets the same fps as an i7-4790K with two-way SLI GTX 970 at PCI-E 3.0 x8/x8 in dual-core-supported games, including the "mysterious" rFactor 2. In quad-core games your system gains a few more fps, but it is not worth all that extra money.
I really hope nobody believes your claims about SLI PCI-E lanes and wastes a lot of money on your nonsense.
 