PC graphics question/ramble
Posted: Sun Apr 10, 2022 10:12 pm
For the last 5 years or so, I've been using a Gigabyte GA-Z97-D3H mobo with an i7 4790K S CPU and a GTX 1070 and tonight I've realised that I may have been an utter tool. Or not, as the case may be.
So, here's the thing. For 5 years, I've had my HP 32" HD monitor plugged into the mobo, and the Rift plugged into the GPU. And in MSFS I've been getting pretty good performance, given the specs above. But tonight, having moved the computer and stuff around in the room upstairs, I connected everything back together, went to plug the monitor into the DVI-D port on the mobo as usual and thought "...hang on, I've got an HDMI cable I can use for this..."
So, I've connected the monitor to the computer via the HDMI cable into the mobo's HDMI port and powered on. And everything works ok, POST is ok and Win10 boots up pretty much immediately as normal (1TB SSD). I noticed that image scaling on the GTX 1070 was now turned off and couldn't get the BIOS screens to appear, but realised that's probably because I was plugged into the mobo(?). And then the sodding obvious struck me. For the past 5 years I've had the monitor plugged into the mobo, not the GPU. I've had the Rift plugged into the GPU and performance there has been ok.
So, here's the other thing though. Performance in general has been, in my view, ok. With the previous setup (DVI from the monitor into the mobo) I was getting up to 50fps at my local airports in the 172 G1000 and an easy 25 to 30 fps in the Rift. It's only a 1070, remember. Performance was never bad enough to prompt me to do some research and potentially realise that the reason was the cable going into the mobo. All seemed fine.
My thought is that performance should have been abysmal while plugged into the integrated graphics of the i7 and mobo, but clearly it's been ok. Conventional wisdom says good performance comes from plugging directly into the GPU. So I guess my question after this rambling tale is this:
Is the GA-Z97-D3H basically clever enough to realise that because a GTX 1070 is installed, it can use that to do the heavy lifting but still display via the mobo's DVI port with no perf impact? I know that some mobos can do this, but I can't really find confirmation on Gigabyte's website. I've never had cause to look at the Resource Monitor whilst gaming; I have looked tonight and now that the monitor is plugged into the GPU, the i7 is running at 50%. Which makes sense. But the temp is 100 degrees, which doesn't make sense. Temps were always 100 degrees on the i7 whilst gaming, so maybe there's no change here at all and, as I said, the mobo has been compensating for my questionable wiring. I may also be suffering CPU throttling, but that's another story...
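One way to settle which chip is actually doing the rendering is to watch the 1070's utilisation and temperature while the sim is running. Here's a minimal sketch in Python, assuming the nvidia-smi tool bundled with the NVIDIA driver is on the PATH (it normally is on a standard install); if the 1070 sits near 0% while flying, the iGPU is rendering that display, and if it's pegged, the card is doing the work regardless of which port the monitor hangs off.

```python
# Rough check of what the GTX 1070 is actually doing while MSFS runs.
# Assumes the NVIDIA driver's bundled nvidia-smi tool is on the PATH.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=name,utilization.gpu,temperature.gpu",
    "--format=csv,noheader",
]

# Poll every couple of seconds; start this, then alt-tab into the sim.
# Near-0% utilisation on the 1070 while flying = the iGPU is rendering
# that display; a pegged 1070 = the card is doing the heavy lifting.
for _ in range(30):
    result = subprocess.run(QUERY, capture_output=True, text=True)
    print(result.stdout.strip())
    time.sleep(2)
```

(Task Manager's GPU tab on Win10 shows much the same per-adapter breakdown if you'd rather not script it.)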
The irony of all this is that this rig is about to be replaced with something a lot more powerful, but I'm still curious as to what's been going on.
Any thoughts among the PC geeks?