
Dual Monitors and ADD2 Cards

6 posts in this topic


Greetings all,


I have recently put together a Core2Duo system with the ASRock 945G-DVI at the center of the rig (an E6600, for those who wish to know), and it's running rather smoothly. Most of us know that this board comes with an attached D-Sub (VGA) port, while the DVI port is on ASRock's home-grown ADD2 card. That means the board itself does not have a DVI port hard-wired onboard, which puts us at the same level as an Intel 945 board with an ADD2 card. Don't get me wrong, setting this board up with 10.4.7 JaS was a breeze; however, there is one slight dilemma. I have 2 Apple Cinema Displays, both 20" and both DVI-D. Let me elaborate, in hopes that others have had the same problem and perhaps solved it:


ASRock's board, or any other board with an ADD2 card, will only give you an extended desktop through the onboard D-Sub (VGA) port plus the single DVI port on the ADD2 card. With 2 DVI-D screens, this presents a problem.


I have recently purchased a Pegasus Dual DVI ADD2 card from Newegg. PCIe and all that good stuff. I was hoping the onboard VGA port would deactivate and the Dual DVI ADD2 card would kick in. No go on that one, even after going into the BIOS and supposedly disabling the onboard port. Only one DVI port works; the other screen goes black. System Profiler detects both the onboard VGA and the ADD2 card, but shows no display connected on the VGA and detects only one screen on the ADD2.


In theory, with the onboard graphics port and a Dual DVI ADD2 card, you should be able to do one of the following:


1. Connect 3 screens consisting of 1 VGA Analog screen and 2 DVI screens through the Dual DVI ADD2 Card.


2. The onboard VGA disables itself once the ADD2 card is in place, allowing you to connect 2 DVI screens through the Dual DVI ADD2 card.


The other alternative I have seen is to purchase a digital DVI to analog VGA converter box by Gefen. $200-$300 just so it can half-ass my digital signal? No thank you. For that much, we could get another large digital LCD screen. We're not talking about the little dongle that comes with dual-DVI graphics cards from ATI and Nvidia; this is the other way around.


If anyone has gotten an ADD2 Dual DVI card to work with two DVI-only screens, please enlighten me. Otherwise, one of the many things that makes OS X superior to Windows (dual display) will fall short, as many other things have, since the 10.4.4 kernel. But then again, we're still on that kernel, aren't we?


With Apple releasing the 10.4.7 kernel source, they must be laughing their arses off at us. I don't want to be the one yelling "we'll show you" and then not show 'em anything. In the meantime, let's tackle the problems one by one:


Dual DVI ADD2 for Dual Monitor / Displays






Latest EDIT:


Check here for dual display support: http://forum.insanelymac.com/index.php?showtopic=32536


I share your pain, and can't offer anything in the way of a solution either. I have dual displays (two SGI 1600SWs) hooked up through onboard VGA and an ADD2 card. The VGA output is terrible and swims, but the ADD2 card's output is rock steady.


One thing I did notice is that some ADD2 cards function but aren't recognised by the BIOS: you alter the BIOS to make the ADD2 card the primary display, and it will ignore you. Some are more equal than others, it seems.


Anyway, yeah, dual-screen support is really flimsy and nasty. It's persuading me towards a 24" iMac at the mo....




I do not have a solution.


However, I would recommend that you check to make sure you are running the GMA950 driver, not the 915.


Type "kextstat | grep GMA" in a Terminal window. In my case, it shows up as:

com.apple.driver.AppleIntelGMA950 (4.4.0) <65 51 16 11>


That is the GMA950 driver from 10.4.8.
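Since the kext name is the quickest way to tell which driver is loaded, the check above can be sketched as a small shell helper. This is a minimal sketch: the `classify_gma` function is my own naming, not part of OS X, and the kext names are simply the ones discussed in this thread.

```shell
#!/bin/sh
# Classify which Intel graphics kext appears in kextstat-style output.
# classify_gma is a hypothetical helper; the kext names come from this thread.
classify_gma() {
    listing=$(cat)   # read the kextstat listing from stdin
    case "$listing" in
        *AppleIntelGMA950*) echo "GMA950 driver loaded" ;;
        *AppleIntel915*)    echo "915 driver loaded" ;;
        *)                  echo "no Intel GMA kext found" ;;
    esac
}

# On a real machine you would pipe the live listing in:
#   kextstat | classify_gma
```

If the 915 driver shows up instead of the GMA950 one on a 945G board, that mismatch alone could explain flaky secondary-display behaviour.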



I do apologise, but as a complete novice, I have just bought a system set up exactly like Synthology describes above: a 945G with one DVI and one VGA. However, I cannot get any output from the DVI.

The VGA works just fine. I have also connected the DVI on its own, and I still get no output.

Is this a BIOS issue?

I have two VGA screens, but I do have a suitable VGA/DVI adapter/converter for the second screen.

As I said, I am a novice, so I'd be grateful if you could point me in the right direction.


Thanks for your help.

