windows xp - DisplayPort to DVI not working on Quadro FX 580

  • kaosvid

    I have a PNY NVIDIA Quadro FX 580 graphics card with 1x DVI and 2x DisplayPort outputs. The DVI port works fine with both of my ViewSonic monitors, but I cannot get either DP to work using the supplied DP-to-DVI adapter; all I get is "no signal" on either monitor, whichever DP port it is connected to.

    The NVIDIA Control Panel shows the second monitor as not connected, when in fact it is (a small detection sketch follows the system specs below).

    How do I get the second monitor to work?

    System:

    • Windows XP Professional 32-bit
    • Asus P5Q motherboard
    • Core 2 Duo E8500 CPU
    • 4GB PC8500 RAM
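
    For reference, here is a small diagnostic sketch that asks Windows itself which display devices it detects, using the standard Win32 EnumDisplayDevices API through Python's ctypes. The script is only an illustration; the constant and structure names follow the Win32 headers.

        import ctypes
        from ctypes import wintypes

        class DISPLAY_DEVICEW(ctypes.Structure):
            # Mirrors the Win32 DISPLAY_DEVICEW structure.
            _fields_ = [
                ("cb", wintypes.DWORD),
                ("DeviceName", wintypes.WCHAR * 32),
                ("DeviceString", wintypes.WCHAR * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", wintypes.WCHAR * 128),
                ("DeviceKey", wintypes.WCHAR * 128),
            ]

        DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

        def list_display_devices():
            # Enumerate the display outputs Windows knows about and report
            # whether each one is attached to the desktop.
            i = 0
            while True:
                dev = DISPLAY_DEVICEW()
                dev.cb = ctypes.sizeof(dev)
                if not ctypes.windll.user32.EnumDisplayDevicesW(
                        None, i, ctypes.byref(dev), 0):
                    break
                attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
                print("%s | %s | attached=%s"
                      % (dev.DeviceName, dev.DeviceString, attached))
                i += 1

        if __name__ == "__main__":
            list_display_devices()

    A monitor on a dead DP output would typically never show up as attached here, matching what the NVIDIA Control Panel reports.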
  • Answers
  • happy_soil

    I've heard, loosely, in a graphics card video review that some cards can only utilise one type of interface at a time, i.e. you can't use the DVI and DisplayPort connectors simultaneously.

    So check the manual to see if there is any information along those lines.

  • Shinrai

    Can you provide details on the monitors? I have firsthand experience with this card, and it SHOULD work with the DVI output and at least one DisplayPort using the bundled adapter. You may simply have a bad adapter; it's rare, but I've seen plenty in my time. I've also seen rare issues with the Quadro driver on XP locking out displays, so you could try rolling back to a 190-series revision.


  • Related Question

    dvi - Dual VGA monitors with Quadro FX 580?
  • Dentrasi

    I have a machine with a Quadro FX 580 card (one DVI and two DisplayPorts). Attached to it are two 19" Acer screens, which are both (annoyingly) VGA. The first one works perfectly with a DVI->VGA adaptor.

    The second one doesn't work. It has a VGA cable, which goes into a VGA->DVI converter, which then goes into a DVI->DisplayPort converter. Initially I was getting 'Cable Unplugged' on the screen, and it couldn't be seen by Windows or the nVidia Control Panel. After swapping the VGA->DVI adaptor (which works perfectly on another machine), Windows can now see the monitor. The nVidia panel sees the model and native resolution, but I get a constant 'No Signal' error. Switching to the other DisplayPort makes no difference.

    I suspect that the card sees a DVI connection plugged into it (the nVidia Control Panel shows the monitor as having a DVI connection) and is only sending out a digital signal because of this. Does anyone know of a solution (other than trying to get a DisplayPort->VGA adaptor), or of a way to force the card to treat the connection as VGA?

    Thanks,

    ~Dentrasi


  • Related Answers
  • Henk

    A DVI plug has a combination of digital and analog pins. There are different versions, and sometimes pins are missing. A DVI->VGA converter only routes the analog signal to the right pins; it doesn't (and can't) convert digital to analog. For that to work, your card has to provide both digital and analog on the port: DVI-I.

    A DisplayPort output is purely digital; it does not carry an analog signal. So the DP->DVI converter can only supply digital-only DVI-D.

    So, sadly, chaining DP -> DVI -> VGA cannot work.

    The card probably also has only one D/A converter (RAMDAC), which is fairly common for such cards; a D/A converter is relatively expensive. That would mean you can't get it to run two analog monitors.
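
    To make the reasoning concrete, here is a tiny illustrative sketch (the names and structure are mine, purely for exposition) modelling the signal types at each stage of the adapter chain. Passive adapters only pass signals through, they never convert, so the analog signal a VGA monitor needs is missing from the very first link:

        # Illustrative model of the adapter chain, not a real driver API.
        DIGITAL, ANALOG = "digital", "analog"

        # Signals each source connector actually carries.
        SOURCE_SIGNALS = {
            "DVI-I port": {DIGITAL, ANALOG},   # the card's DVI-I carries both
            "DisplayPort": {DIGITAL},          # DP is digital-only
        }

        def through_passive_adapter(signals, adapter_passes):
            # A passive adapter can only forward signals that are already
            # present; it cannot synthesise a missing one.
            return signals & adapter_passes

        dp_out = SOURCE_SIGNALS["DisplayPort"]
        # DP->DVI adapter: forwards the digital signal as DVI-D.
        after_dp_to_dvi = through_passive_adapter(dp_out, {DIGITAL})
        # DVI->VGA plug: only wires up the analog pins, which carry nothing here.
        after_dvi_to_vga = through_passive_adapter(after_dp_to_dvi, {ANALOG})

        print("Signal reaching the VGA monitor:", after_dvi_to_vga or "none")
        # -> none: no analog signal ever existed in the chain, so DP->DVI->VGA fails.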