display - Dual monitor with two DVI to VGA adapters

2014-07
  • pappy

    I've got a client who has a graphics card with two DVI outputs.

    He bought two brand new VGA screens yesterday, so I tried connecting them using two DVI to VGA adapters, one for each screen, but it doesn't work.

    Someone told me you can't use two DVI to VGA adapters at the same time, but Googling doesn't bring up any definitive answers. All the search results I get are about using one DVI screen and one VGA screen.

    Is using two DVI to VGA adapters possible? If not, what might be the solution? How about a DVI adapter that splits into two VGAs?

  • Answers
  • Lee Harrison

    It largely depends on the make/model of the graphics card, but the short answer is that if it's a modern card, it most likely won't be possible.

    As you know, VGA is an analog standard. The DVI standard allows for outputs that support analog connections (DVI-A), digital connections (DVI-D), or both analog and digital integrated into one connector (DVI-I). A DVI to VGA adapter doesn't do anything clever: it just takes the analog pins from a DVI output and presents them on a D-sub connector. If the output is DVI-D, those pins are missing (or unconnected), so the adapter won't work.

    If the graphics card is equipped with two DVI-I connectors (analog-only DVI-A is rare), then you can connect two VGA adapters. However, most recently manufactured graphics cards have only one analog-enabled DVI port; the second is usually digital-only.
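
    If you want to confirm which of the card's DVI ports actually carry the analog pins before buying anything, and the machine happens to run Linux, the kernel's DRM subsystem exposes each connector's type and status under /sys/class/drm (names like card0-DVI-I-1 versus card0-DVI-D-1). Here's a rough sketch of reading that, assuming the standard sysfs layout and that Python is available:

        import glob
        import os

        # List the GPU's connectors as reported by the Linux DRM subsystem.
        # DVI-I (and DVI-A/VGA) connectors carry analog pins; DVI-D is digital-only,
        # so a passive DVI-to-VGA adapter only works on the analog-capable ports.
        for path in sorted(glob.glob("/sys/class/drm/card*-*")):
            connector = os.path.basename(path).split("-", 1)[1]  # e.g. "DVI-I-1"
            try:
                with open(os.path.join(path, "status")) as f:
                    status = f.read().strip()                    # "connected"/"disconnected"
            except OSError:
                status = "unknown"
            kind = "analog-capable" if connector.startswith(("DVI-I", "DVI-A", "VGA")) else "digital-only"
            print(f"{connector:12s} {status:14s} {kind}")

    On Windows, the easiest check is the card's spec sheet or a look at the ports themselves: the DVI-I port is the one with the four extra pin holes around the flat blade.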


  • Related Question

    display - Can you use a DVI-VGA Adapter on a monitor instead of a video card?
  • Joel Coehoorn

    I suspect the answer to this is "no", but here goes:

    I have a monitor with inputs for DVI and VGA. I want to be able to share this display with two computers (one at a time, of course) that both have VGA only. I also have a DVI->VGA dongle that came with a video card that's in a different computer.

    Can I connect this dongle directly to the DVI port on the monitor so that I can connect both VGA computers? I'd rather not resort to a KVM.


  • Related Answers
  • David Spillett

    No, you'll need to use the KVM option or buy a couple of VGA extension cables and manually switch between the two: bring the two monitor ends of the extensions and the PC end of the monitor's cable up to your desk, hold them there with something like binder clips (http://lifehacker.com/5499838/binder-clips-as-cable-catchers-redux), and just plug the monitor's cable into the right extension when you need to switch. You might find a cheap 2-machine KVM with the required cables doesn't cost much more than a couple of plain VGA extension cables, though.

    A DVI->VGA "converter" doesn't actually convert any signals at all. Most DVI ports on graphics cards also carry the analog RGB signals needed by a VGA monitor along side the digital signal lines (see http://en.wikipedia.org/wiki/Digital_Visual_Interface for relevant pinout diagrams) and all the adaptor does is connect these pins to the right pins of the VGA port. So unless your monitor can accept the reverse (analog signals through its DVI input), which I doubt many (if any) do, this does not work the other way around.

  • Tom Wijsman

    Nope. The DVI->VGA dongle is actually DVI-I to VGA, the "I" indicating that the DVI port carries the analog signals needed for an analog VGA connection.

    Since the monitor doesn't produce a signal (analog or otherwise), its DVI input is generally a DVI-D port ("D" meaning digital-only), so you won't even be able to plug the dongle into it: the flat blade is the wrong size, and the dongle has four extra pins (the analog signal pins) that a DVI-D socket has no holes for.