Need some help with a new monitor dropping display during gaming?
I just purchased an LG 27GL83A-B and have had it for a few days. My old monitor was a Samsung 1080p 60Hz that I had hooked up through HDMI. The new monitor is hooked up through DisplayPort so I can maximize those sweet frames. My problem is that when I try to enter games, the display loses connection. It worked fine the first few days. My graphics card is the Gigabyte RX 5700 XT. All the DisplayPort outputs yield the same result so far, and I've tried two cables. One thing I noticed is that if I put a game in windowed mode, it works. If I put a game in full-screen, that's when the issues start. I can't even get into Guild Wars 2 to adjust the graphics settings because it immediately goes dark. Anyone have any ideas?
Edit: There is definitely a difference between full-screen and windowed full-screen. It works fine in windowed full-screen.
- David · Lv 6 · 1 month ago
OK, it is interesting that you went from a 1080p monitor to one with a higher resolution. My first thought was that the graphics card would be drawing a lot more power to run the new monitor, so I checked the specifications of the graphics card. It needs a minimum 700W power supply with two power connectors. Ouch. Then I checked the TDP and found that the graphics card ALONE can draw up to 250W when it is driving your new monitor at native resolution.
That 250W number is interesting, because it does not jibe with the recommended 700W minimum power supply. It's not a good idea to run a power supply much beyond 1/3rd of its maximum output on a regular basis. But if you've got a video card pulling 250W to run your new monitor, then a puny 700W power supply would likely be outputting WELL beyond 50% of its maximum on the 12V rail.
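To put some rough numbers on that load claim, here's a back-of-the-envelope sketch in Python. The 250W GPU figure is the TDP mentioned above; the 150W allowance for the CPU and the rest of the system is purely an assumption for illustration:

```python
# Rough PSU load estimate (the non-GPU figures are assumptions for illustration).
GPU_WATTS = 250      # worst-case draw cited for the RX 5700 XT
SYSTEM_WATTS = 150   # assumed CPU + board + drives + fans
PSU_WATTS = 700      # the recommended minimum supply

load_fraction = (GPU_WATTS + SYSTEM_WATTS) / PSU_WATTS
print(f"Estimated load: {load_fraction:.0%} of a {PSU_WATTS}W supply")
# → Estimated load: 57% of a 700W supply
```

Even with a modest guess for the rest of the system, the 700W unit ends up near 57% of rated output, well past the roughly one-third level suggested above for long-term headroom.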
To put it simply, I wouldn't trust a puny 700W power supply to reliably handle that system. And you must keep in mind that power supplies degrade over time. So if it was 700W to start with, it could be 600W or less if it is just a few years old.
Let's look at your symptom. Windowed mode (lower resolution, lower power requirement) works fine. Full screen (high resolution, video card drawing about 20A off the +12V rail) makes the screen go black.
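That "about 20A" figure follows from simple power arithmetic, I = P / V; a quick check, taking the 250W draw cited above as an assumed worst case:

```python
# Current drawn from the +12V rail for a given power draw: I = P / V.
GPU_WATTS = 250    # assumed worst-case draw in exclusive full-screen
RAIL_VOLTS = 12.0

amps = GPU_WATTS / RAIL_VOLTS
print(f"{GPU_WATTS}W / {RAIL_VOLTS:.0f}V = {amps:.1f}A")
# → 250W / 12V = 20.8A
```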
That right there screams to me that your power supply is weak, dude. It's either under-sized, or wearing out... or both.
I'd prescribe an EVGA SuperNOVA 850 PQ, 80 Plus Platinum, 850W. If you can afford it, bump up to 1000W, but don't buy a cheap unit just to get a higher wattage rating. A good-quality 850W power supply will kick the crap out of a cheap 1000W power supply.
- Adrian · Lv 7 · 1 month ago
Running in windowed full screen (or any windowed mode) means that Windows itself controls the video card and how data is sent to it. Running in exclusive full-screen mode (not windowed) means the game controls the video card directly.
Since the video messes up under direct control, this suggests a possible video driver problem (try updating), a bad video card (not too likely, but possible), or a power supply that just isn't good enough for that video card. Under direct game control, running at higher FPS, the video card uses more power. Under Windows control, the frame rate and the data processed by the card may be lower, using less power. That said, in extreme cases, power consumption can be almost the same in both modes.
The 5700 XT video card requires one 6-pin and one 8-pin PCIe power cable, run direct from the PSU. Do not use any power adapters, as you get larger voltage drops with those. The minimum recommended power supply is 550W, so I would suggest you have a bit more than that, like 600 to 650W or more.
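The adapter warning comes down to Ohm's law: extra contacts add resistance, and the voltage drop V = I × R grows with current. A minimal sketch, where the 25 milliohm adapter resistance is an assumed figure purely for illustration:

```python
# Voltage drop across an adapter's added resistance: V = I * R.
AMPS = 20.0           # roughly what a 250W card pulls from the +12V rail
ADAPTER_OHMS = 0.025  # assumed extra contact + wire resistance of an adapter

drop = AMPS * ADAPTER_OHMS
print(f"Voltage drop through adapter: {drop:.2f}V")
# → Voltage drop through adapter: 0.50V
```

Half a volt off a 12V rail at full load is the kind of sag that can trip a card's protection, which is why cables straight from the PSU are preferred.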
Since you don't say what PSU you have, look into that first, then your video power cables, then your video drivers.