I believe dutch is spot-on in his assessments. The DisplayPort stuff mentioned earlier doesn't matter at all if you're running non-G-Sync 1080p at 60Hz. And these cables are all digital (as long as you're not talking about the older, typically blue, analog "VGA" cables). The colors being different/brighter is likely subjective, since these are all digital connections. (Of course, if you have any kind of analog VGA adapter in the line, then it's not all digital.) Mind you, once you go well beyond 1080p 60Hz (4K, for example), then yes, HDMI prior to rev 2.0 will only support 30Hz. I didn't see where the OP specified an exact resolution, but there is mention of "an option for 3140 display", which seems to indicate a 21:9 aspect ratio, and that isn't technically supported by HDMI before 2.0, like dutch said.
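If you want to see where that 30Hz limit comes from, a quick pixel-clock calculation shows it. Here's a rough sketch in Python - the blanking totals and the ~340/~600 MHz ceilings for HDMI 1.4/2.0 are my numbers from memory of the specs, so treat it as illustrative, not gospel:

```python
# Back-of-the-envelope check: pixel clock = total pixels per frame
# (including blanking) times the refresh rate.
# Approximate TMDS clock ceilings (from memory, illustrative only):
HDMI_1_4_MAX_MHZ = 340
HDMI_2_0_MAX_MHZ = 600

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# 3840x2160 with the standard CEA blanking totals of 4400 x 2250
for hz in (30, 60):
    clk = pixel_clock_mhz(4400, 2250, hz)
    verdict = "fits HDMI 1.4" if clk <= HDMI_1_4_MAX_MHZ else "needs HDMI 2.0"
    print(f"4K @ {hz}Hz: ~{clk:.0f} MHz pixel clock -> {verdict}")
```

Run that and 4K@30 comes in around 297 MHz (under the 1.4 ceiling), while 4K@60 needs roughly 594 MHz - which is exactly why the pre-2.0 versions drop you to 30Hz.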
EDIT: I should point out here that the actual electronic connections, at the physical display panel level, are not HDMI, DVI, analog, or any of the rest of it. They are a different spec entirely, called LVDS (Low-Voltage Differential Signaling). This applies to every kind of flat panel I'm aware of; it is the 'native' form of the signal at the input to the panel hardware. The reason I mention this is that, in order to communicate with the LVDS panel, any of the various peripheral interface types (DVI, HDMI, DP...) requires some kind of conversion - even if it is a digital-to-digital conversion. What that means is that some variation in what you see on screen is entirely possible just from using a different interface, not because any one is "better" than the other, but simply because the conversions can differ, and it has nothing to do with DP vs HDMI specs at all. For example, the HDMI-to-LVDS converter in one person's Samsung monitor might produce what they see as different colors compared to the converter that handles the DisplayPort-to-LVDS conversion in the same monitor. The fact that there are two different signal paths involved means the results can obviously differ.
It is possible, though, that the cables in use were low-quality; even with digital signals, poorly designed or constructed cables can cause problems with transmission. Just because the computer store sells it doesn't make it good - they sell all kinds of connectors and adapters that are not just often incompatible but, in some cases, outright forbidden by the relevant specs. One example of this is (was) USB extension cables - explicitly forbidden in the first USB specification, but you could buy them all day long. As these specs become more complicated, the cables/connectors become more important. I have a USB 3.1 drive cradle that is very picky about its connection to the PC - but with a proper cable, the connection is reliable and speed is measurably improved. A different cable I had, which otherwise looked fine, was constantly dropping the connection.
Moral of this story is don't buy cables based strictly on what's cheap.
Well, dutch, thanks, but I tried driver updates and all that. Some said that some monitor information stored in the monitor got corrupted, but I found a little utility to check it, and it was fine. My buddy at work (who is, to this point, the only fellow I know with a server rack running his house) said to change to the pinned cable, and that worked. I dunno. I used to be better at this stuff, but they're supposed to be making things easier... Silly Windows 10 (I have more appropriate terms, but forum rules, you know...) decided my printer wasn't worth writing a driver for, so off I went to replace a perfectly good printer... And I tried about everything up to installing an XP virtual machine; it wasn't worth that.
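For what it's worth, that stored monitor information is called the EDID, and as far as I can tell the little utility mostly just sanity-checks it: the block starts with a fixed 8-byte header, and each 128-byte block carries a checksum byte that makes it sum to zero. A rough sketch of the idea (not the actual utility; it assumes you've already dumped the EDID to a file, e.g. with something like get-edid on Linux, and the filename is just an example):

```python
# Rough EDID sanity check: fixed 8-byte header plus a per-block checksum.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_looks_ok(path="edid.bin"):  # example filename, not a real convention
    try:
        data = open(path, "rb").read()
    except OSError:
        return False
    if len(data) < 128 or data[:8] != EDID_HEADER:
        return False
    # Every 128-byte block (base EDID plus any extensions) must sum to 0 mod 256.
    blocks = len(data) // 128
    return all(sum(data[i * 128:(i + 1) * 128]) % 256 == 0 for i in range(blocks))

print("EDID looks OK" if edid_looks_ok() else "EDID corrupt or unreadable")
```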
HumanDrone, that kind of issue comes along sometimes, but it's not limited to computer monitors. I have a TV which used to decide every now and again that it couldn't recognize the cable box (before I switched to DirecTV). Hasn't happened since I switched to DirecTV (and thus have a different receiver connected to the TV). Same TV, same cable, but obviously changing the receiver made the problem disappear.
What's at issue there is not HDMI vs DVI or whatever, it's what is called (deep breath, yet another acronym inbound) HDCP (High-bandwidth Digital Content Protection). Yup, your old pal Digital Rights Management, finding another way to intrude on every form of electronic entertainment we have. It exists to prevent unauthorized playback/recording of copyrighted media. In this case, the remarkable thing is that it's built into the hardware; that is, every device in the chain - computer (GPU), monitor, and anything active in the path between them (receivers, splitters, adapters) - MUST support a compatible version of HDCP. When these devices are powered on, there is a required digital exchange between them to check/verify HDCP. This is called the "handshake".
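To give a feel for one piece of that handshake: every HDCP 1.x device carries a 40-bit key selection vector (KSV), and one of the first sanity checks each side does is verify that the other side's KSV has exactly 20 ones and 20 zeros. A tiny sketch of just that check (the example value is made up for illustration, not a real device key):

```python
def ksv_is_valid(ksv: int) -> bool:
    # An HDCP 1.x KSV is a 40-bit value with exactly 20 bits set.
    return 0 <= ksv < (1 << 40) and bin(ksv).count("1") == 20

# Made-up example value, not a real device KSV.
example_ksv = int("1010" * 10, 2)
print(ksv_is_valid(example_ksv))  # True - 40 bits, 20 of them set

# In the real handshake the source also sends a random value (An) and its own
# KSV; both ends derive a shared key and compare short verification values.
# If any step fails, the link refuses to show protected content.
```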
I would bet money that your monitor not being recognized was related to the HDCP handshake. Interesting article here, though technically deep:
https://www.eetimes.com/document.asp?doc_id=1273716 You can also just Google "HDCP connection issues" and the like, and you'll find lots of stories about it.
Also, since you have an older GPU (the 580 - great card, but long in the tooth), it appears the drivers changed at some point (much as dutch described) and caused HDCP handshaking issues.
https://www.overclock.net/forum/69-...vers-now-gtx580-not-hdcp-compatible.html I believe both HDMI and DVI officially support HDCP, but if any part of the chain isn't fully compatible (the driver, for example), you start having goofy problems.
And oh, BTW: You now know (or know of, more accurately) two guys who have server racks running in their house. Well, did have anyway, before my last move. Too much work to put it all back, and I haven't used the features that made it a good idea since the move, so I bought a little NAS box; way cheaper to run, doesn't generate as much heat etc. Not perfect, but close enough that the benefits outweighed the few limitations.
Anyhow just some thoughts FWIW.