
2x 9800gx2 on a real Mac Pro


amaia



Today I was trying to get two nVidia 9800 GX2 cards to work under Vista 64 on my Mac Pro (early 2008, 2x 2.8GHz quad-core).

 

Everybody knows that for the Mac Pro to even turn on (and actually play the chime), you need an EFI-capable video card plugged into the machine.

 

First I tried the original ATI 2600 card plus the two 9800GX2s. With that setup I could boot into Windows, but since the primary monitor is always assigned to the ATI (the EFI-enabled card), I could never get the nVidia drivers to load in Windows. I guess you can't have ATI and nVidia drivers loaded at the same time.

 

OK. Based on the little I could conclude, I decided to get the Apple upgrade kit (with an EFI-enabled 8800 GT). With that I could try to boot Windows and get the nVidia drivers loaded. However, for some weird reason, after I hold down the Option key and select the Boot Camp partition, the Mac instantly freezes.

 

I was tired of trying to get Vista 64 to work; I will keep trying tomorrow. However, before I shut everything down, I decided to boot into my Mac OS X 10.5.5 partition with the three nVidia cards plugged in (two 9800GX2s and the 8800GT from the Apple upgrade kit).

 

If you try to boot Mac OS X with the 9800GX2 and any other card, you will see that Mac OS will not even detect the 9800GX2, because there is no EFI in the 9800's firmware. But look what happens when you boot with the 9800GX2 and the 8800GT (EFI-enabled).

 

8800gt.png

9800gx2s1.png

9800gx2s2.png

 

That was without the SLI cable connecting the two 9800GX2s.

 

If I connect the SLI cable, putting the two 9800GX2s in SLI, I get this in Mac OS:

 

withsli.png

 

If you take a look at the screenshots, the 9800GX2 shows a ROM Revision of 3233. That is, in fact, the EFI ROM of the 8800GT, which for some reason Mac OS also applied to the two non-EFI 9800GX2 cards.

 

I am no driver expert, but my guess is: now that we know how to make Mac OS "see" the two 9800GX2s as valid cards, if we could get it to associate them with a working driver, we could make those 9800s work under Mac OS X (running on a real Mac, not a hackintosh).
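
If you want to see what the OS is actually doing with the cards, a quick check from Terminal (or over SSH) helps. This is just a rough sketch; the kext names it looks for are the stock 10.5 ones, so they may differ on your install:

$ kextstat | grep -i nv
# shows which nVidia kexts (NVDAResman, NVDANV50Hal, GeForce...) are actually loaded
$ ioreg -l | grep -i -e NVDA -e rom-revision
# shows what the IORegistry reports for each card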

 

For those who are curious how I managed to power the two 9800GX2s and the 8800GT inside a Mac Pro: I just used an additional external power supply. Here are the pics:

 

externalpsu.jpg

cards.jpg

sli.jpg

 

If anyone can help me with this, please feel free to ask for any additional information.

 

Also, does anybody have any clue why my Mac Pro just freezes when I choose the Boot Camp partition with the three nVidia cards inside? (It boots and loads Windows with no problem if I plug in the ATI 2600 plus the two 9800GX2s instead.) It freezes right away, at the disk boot selection screen; it doesn't even turn the screen black to boot Windows.

 

I am using Mac OS X 10.5.5 and Vista 64 SP1.


After some reading, I saw that little to no applications/games benefit from quad-SLI (with two 9800GX2s). So I decided to forget the dual-9800GX2 setup and stick with just one 9800GX2 and the 8800GT (so I could boot OS X).

 

As soon as I removed the 9800GX2 that was in Slot 1, moved the 8800GT from Slot 4 to Slot 1, and left the other 9800GX2 in Slot 2, I could finally boot into my Boot Camp partition. I guess the double-GX2 setup was making the Vista 64 boot go crazy.

 

However, Vista 64 + Mac Pro + 9800GX2 is nothing new.

 

Let's talk about the Mac OS side.

 

I learned from Windows that when you power on the Mac Pro, the 9800GX2 fans go crazy at full power. But as soon as the Windows driver loads, the fan goes into a quiet mode; you can barely hear it unless the card gets too hot.

 

Well, when I tried to boot Mac OS X with the ATI 2600 + 9800GX2, the GX2's fans stayed at full speed the whole time. Mac OS would never recognize the 9800GX2, only the ATI 2600.

 

But when I tried to boot with the 8800GT in Slot 1 and the 9800GX2 in Slot 2, as soon as the Apple logo disappeared and the blue screen appeared, the 9800GX2 went into quiet mode (fans at lower speed, so the driver probably recognized the card). And OS X got stuck on that blue screen right before the login screen. The machine was not frozen; I could turn Caps Lock on and off, but I never got past that screen. My guess is that when I boot OS X with the 8800GT + 9800GX2, the 8800's EFI makes OS X recognize the 9800GX2 as a legit card as well, and it tries to load the drivers.

 

Will keep you all posted.


As I suspected, the system was not frozen, just stuck on that blue-ish screen before the login screen, and I could log in via SSH. As I suspected, OS X had indeed recognized the video card. My CUDA-enabled nVidia driver even saw the 9800GX2 as a CUDA device.

 

$ ./deviceQuery
There are 3 devices supporting CUDA

Device 0: "GeForce 8800 GT"
 Major revision number:                         1
 Minor revision number:                         1
 Total amount of global memory:                 536674304 bytes
 Number of multiprocessors:                     14
 Number of cores:                               112
 Total amount of constant memory:               65536 bytes
 Total amount of shared memory per block:       16384 bytes
 Total number of registers available per block: 8192
 Warp size:                                     32
 Maximum number of threads per block:           512
 Maximum sizes of each dimension of a block:    512 x 512 x 64
 Maximum sizes of each dimension of a grid:     65535 x 65535 x 1
 Maximum memory pitch:                          262144 bytes
 Texture alignment:                             256 bytes
 Clock rate:                                    0.60 GHz
 Concurrent copy and execution:                 Yes

Device 1: "G92-450"
 Major revision number:                         1
 Minor revision number:                         1
 Total amount of global memory:                 536674304 bytes
 Number of multiprocessors:                     16
 Number of cores:                               128
 Total amount of constant memory:               65536 bytes
 Total amount of shared memory per block:       16384 bytes
 Total number of registers available per block: 8192
 Warp size:                                     32
 Maximum number of threads per block:           512
 Maximum sizes of each dimension of a block:    512 x 512 x 64
 Maximum sizes of each dimension of a grid:     65535 x 65535 x 1
 Maximum memory pitch:                          262144 bytes
 Texture alignment:                             256 bytes
 Clock rate:                                    1.51 GHz
 Concurrent copy and execution:                 Yes

Device 2: "Device Emulation (CPU)"
 Major revision number:                         1
 Minor revision number:                         1
 Total amount of global memory:                 536674304 bytes
 Number of multiprocessors:                     16
 Number of cores:                               128
 Total amount of constant memory:               65536 bytes
 Total amount of shared memory per block:       16384 bytes
 Total number of registers available per block: 8192
 Warp size:                                     32
 Maximum number of threads per block:           512
 Maximum sizes of each dimension of a block:    512 x 512 x 64
 Maximum sizes of each dimension of a grid:     65535 x 65535 x 1
 Maximum memory pitch:                          262144 bytes
 Texture alignment:                             256 bytes
 Clock rate:                                    0.00 GHz
 Concurrent copy and execution:                 Yes

Test PASSED

Press ENTER to exit...

 

Look at that: the G92 device was recognized with a 1.51GHz clock rate, which is indeed the 9800GX2 clock.

 

My guess is that after loading the 9800GX2 device, OS X tried to switch the default display to that card. I will do some more testing: I will connect one display to each of the 9800GX2 outputs, even the HDMI one, and see if I get any signal.

 

./crossfingers


Checking system.log via SSH, I got this:

 

Sep 19 22:24:47 localhost kernel[0]: NVDA::probe(display)
Sep 19 22:24:47 localhost kernel[0]: NVDA::start(display) <1>
Sep 19 22:24:47 localhost kernel[0]: NVDA::start(display) <1> failed

 

So yeah, after the boot (Apple logo on the gray screen) it tries to send the signal to display <1> and fails. That's why I don't get the login screen. Everything else starts OK, and I can access the system via SSH.
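
If you want to watch this live, roughly the following works; it assumes Remote Login is enabled in Sharing, and the hostname below is just a placeholder:

$ ssh youruser@your-macpro.local
$ tail -f /var/log/system.log | grep NVDA
# watch the NVDA probe/start messages as the driver comes up
$ system_profiler SPDisplaysDataType
# list the cards and displays OS X currently sees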

 

My guesses are:

 

1. The system recognizes everything, including my 9800GX2, but when it tries to send the display signal to display <1> it fails. Since I tried to boot with a DVI monitor connected to both DVI ports on the 9800GX2, I think display <1> is the HDMI port. My ****ing TV is too far away from my Mac Pro right now; I guess I will have to bring it closer (and unplug the billion cables connected to it; my wife will kill me).

 

or

 

2. The system recognizes everything, including my 9800GX2 (because of the 8800GT's EFI BIOS), but the EFI is not 9800GX2-compatible and that's why OS X can't send the display signal through the 9800GX2. I doubt this one is true, because I read somewhere that once the system sees that EFI is up, it won't use anything else from the EFI firmware; it loads the actual drivers (kexts) and goes from there. Also, I can run CUDA applications via the command line (SSH), so I am actually able to access the 9800GX2 with no problem.

 

I will do two more tests now. First I will swap the 8800GT and the 9800GX2, so the 8800 goes to Slot 2 and the 9800 to Slot 1.

 

If I have no luck with that, I will get some fresh air and think about bringing my HDMI TV closer to my Mac Pro. :(


Woot! :)

 

After almost 24 hours without sleep, I finally got my 9800GX2 fully functional (as my default display and everything).

 

System:

Apple Mac Pro (2x 2.8GHz quad-core)

6GB RAM

nVidia 8800GT (Apple upgrade kit)

nVidia 9800GX2 (retail eVGA, bought from Newegg)

 

Tomorrow I will post more details and pics of the 9800GX2 working under Mac OS X 10.5.5 on my Mac Pro.

 

One last note: I f***ing hate Apple for this. If it took me only 24 hours without sleep to make this work, why the hell don't they make it available for everybody? I paid a LOT of money for this computer, plus another big chunk for the two 9800GX2s, PLUS some more for the 8800GT upgrade kit, just so I could use the 8800GT's EFI-enabled firmware to "validate" the 9800s for Mac OS X.

 

Next time I will just take all this money, build a customized hackintosh, and save myself the headache...


That's not true. As you can see in the deviceQuery output above, Mac OS X recognized both cores. They just won't be used in SLI, which I don't care about, since like 90% of the stuff I use gets no benefit from SLI.

 

In fact, CUDA applications work better if you disable SLI, even under Windows, so your application can manage both GPUs by itself.

 

Also, a single 9800GX2 is already faster than an 8800GT for gaming.

 

So, in the end, it's a win/win trade.


For those interested in trying this before I have time to post a full how-to:

 

The final setup was:

 

Slot-1 ----9800GX2----

Slot-2 ----8800GT-----

 

You also need to add the strings for the 9800GX2 to the GeForce, NVResman and NVDANV50Hal extensions.
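
Roughly, "adding the strings" means putting the 9800GX2's PCI ID into each kext's Info.plist so the driver personalities match the card. The lines below are only a sketch of that idea; they assume the stock plist keys (IOPCIPrimaryMatch / IOPCIMatch) are the ones to touch and that the GX2 reports device ID 0x0604 with vendor 0x10de, so verify the ID on your own card before editing anything.

$ cd /System/Library/Extensions
$ ioreg -l | grep -i device-id
# confirm the card's PCI device ID before touching anything
$ sudo /usr/libexec/PlistBuddy -c "Print :IOKitPersonalities" NVDANV50Hal.kext/Contents/Info.plist
# then edit each of the three kexts above by hand and append the GX2's token
# (something like 0x060410de) to the match string you find there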

 

You also need two monitors: one connected to DVI port 2 of the 8800GT and one to DVI port 1 of the 9800GX2.

 

As the final step to make it really work, I had to use a modified NVinject 0.2.2 so the OS would not only see the 9800GX2 as a valid card (the 8800GT's EFI firmware already makes the system see the other nVidia cards as valid) but also actually load the drivers for it.

 

The kexts for OS X 10.5.5 are here:

 

http://www.unsekure.net/mac9800gx2/9800gx2kexts.tgz
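
The usual 10.5 routine for installing third-party kexts applies; this is only a rough sketch (the kext name in the chown/chmod lines is a placeholder for whatever is inside the archive), and you should back up the originals first:

$ tar xzf 9800gx2kexts.tgz
$ sudo cp -R *.kext /System/Library/Extensions/
$ sudo chown -R root:wheel /System/Library/Extensions/GeForce.kext
$ sudo chmod -R 755 /System/Library/Extensions/GeForce.kext
$ sudo rm /System/Library/Extensions.mkext
# removing the cache forces OS X to rebuild it with the new personalities on reboot
$ sudo reboot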

 

While the system is booting, the gray Apple logo shows on the 8800GT's display. Right before the system loads the login screen, the output is auto-switched to the 9800GX2's display (NVinject kicks in and makes the OS recognize the 9800GX2).

 

In the end, because of NVinject, the system reports all the nVidia cards as 9800GX2s, as you can see in the pic below. I still need to fix that.

 

9800gx2working.png

 

However, I think it's just a cosmetic thing, because in the end CUDA can see the right cards:

 

There are 3 devices supporting CUDA

Device 0: "G92-450"
 Major revision number:                         1
 Minor revision number:                         1
 Total amount of global memory:                 536674304 bytes
 Number of multiprocessors:                     16
 Number of cores:                               128
 Total amount of constant memory:               65536 bytes
 Total amount of shared memory per block:       16384 bytes
 Total number of registers available per block: 8192
 Warp size:                                     32
 Maximum number of threads per block:           512
 Maximum sizes of each dimension of a block:    512 x 512 x 64
 Maximum sizes of each dimension of a grid:     65535 x 65535 x 1
 Maximum memory pitch:                          262144 bytes
 Texture alignment:                             256 bytes
 Clock rate:                                    1.51 GHz
 Concurrent copy and execution:                 Yes

Device 1: "G92-450"
 Major revision number:                         1
 Minor revision number:                         1
 Total amount of global memory:                 536674304 bytes
 Number of multiprocessors:                     16
 Number of cores:                               128
 Total amount of constant memory:               65536 bytes
 Total amount of shared memory per block:       16384 bytes
 Total number of registers available per block: 8192
 Warp size:                                     32
 Maximum number of threads per block:           512
 Maximum sizes of each dimension of a block:    512 x 512 x 64
 Maximum sizes of each dimension of a grid:     65535 x 65535 x 1
 Maximum memory pitch:                          262144 bytes
 Texture alignment:                             256 bytes
 Clock rate:                                    1.51 GHz
 Concurrent copy and execution:                 Yes

Device 2: "GeForce 8800 GT"
 Major revision number:                         1
 Minor revision number:                         1
 Total amount of global memory:                 536674304 bytes
 Number of multiprocessors:                     14
 Number of cores:                               112
 Total amount of constant memory:               65536 bytes
 Total amount of shared memory per block:       16384 bytes
 Total number of registers available per block: 8192
 Warp size:                                     32
 Maximum number of threads per block:           512
 Maximum sizes of each dimension of a block:    512 x 512 x 64
 Maximum sizes of each dimension of a grid:     65535 x 65535 x 1
 Maximum memory pitch:                          262144 bytes
 Texture alignment:                             256 bytes
 Clock rate:                                    0.81 GHz
 Concurrent copy and execution:                 Yes

Test PASSED

 

All cards are working perfectly, except that after the 9800GX2 takes over the display, the 8800GT can't show anything; it won't even detect its display again. But who cares :2cents: the 9800GX2 is working.

 

I tested a couple of games (Spore and WoW). Both ran at 60fps with vsync and 160+ fps without vsync, at 1920x1200 with maximum FSAA and full details.

 

Of course your mileage may vary depending on your system. :(

 

Post any new info you get if you try this, and feel free to contact me with any questions.


Hey amaia,

 

Wow, that sounds pretty interesting, and I have some questions about your config.

 

I'm currently running an older GeForce 7600GS side by side with my "Apple" GeForce 8800GT, and it works fantastically on Vista x64 as well as on the XP 64-bit I installed last night. But as you can read in this thread, I'm trying to get this 7600GS running in 10.5.5 too, to be able to use my four-TFT setup here.

 

As far as I understood (correct me if I'm wrong), it's a BIOS card you added? No EFI?? And you just got it to work using NVinject?

 

The question is how :-)

 

I also tried NVinject.kext, and every time I added one of them I got hit with a kernel panic on boot (OK, I now know how to remove the kext in single-user mode, so I can always get the system working again, but I still can't get it to work).

 

Ideas are really appreciated, because it looks like there's NO ROM for my Colorful GeForce 7600GS/256MB/128-bit.

 

Thanks in advance!!

 

Nemo


Do you think it's possible to get this working on a hackintosh?

 

I've got a 9800GX2 in my GA-P35-DS3L, and using your kexts I'm getting the same blue-ish screen as well. I boot with the verbose and safe mode flags (-v -x), and I also see your "NVDA::start(display) <1> failed" issue.


