
GeForce 200 series support


EGOvoruhk

638 posts in this topic

Great work everyone.

 

I've got a system with 2x GT200 boards working quite nicely now on OS X (except for the second DVI port on each board).

 

However, I am having a tough time getting CUDA to work. I reinstalled both the toolkit/kext and the SDK, but CUDA still refuses to acknowledge there is any NVIDIA card for it to use, even though the CUDA kext is loaded.

 

Anyone having the same issues regarding CUDA?
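For anyone chasing the same symptom, a first sanity check is confirming the kext is actually registered before blaming the toolkit. This is only a sketch: the "com.nvidia.CUDA" bundle ID is an assumption, so run plain `kextstat` and look for the exact NVIDIA/CUDA entry on your own install.

```shell
# Sanity check for the "CUDA sees no NVIDIA card" symptom above.
# NOTE: the bundle ID "com.nvidia.CUDA" is an assumption; check the
# real name in your own `kextstat` output.
cuda_kext_loaded() {
  if kextstat 2>/dev/null | grep -q "com.nvidia.CUDA"; then
    echo "loaded"
  else
    echo "not loaded"
  fi
}

cuda_kext_loaded
```

If it reports loaded but the SDK's deviceQuery sample still finds no device, the mismatch is more likely between the toolkit version and the installed driver than in the kext itself.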

[screenshot attached]


Wow, this is great! So what would be the procedure for a fresh Retail 10.5.7 + Chameleon v2.0 RC1 install like the one detailed here? Would I need to run the install with another video card first, then install the GTX 200 drivers, shut down, swap the cards, and reboot?


Great work everyone.

 

I've got a system with 2x GT200 boards working quite nicely now on OS X (except for the second DVI port on each board).

 

However, I am having a tough time getting CUDA to work. I reinstalled both the toolkit/kext and the SDK, but CUDA still refuses to acknowledge there is any NVIDIA card for it to use, even though the CUDA kext is loaded.

 

Anyone having the same issues regarding CUDA?

 

Was the VRAM amount of both cards recognized properly?


Thanks, but I couldn't get it to work on my machine.

It just got stuck at a black screen at boot (using the drivers and netkas' enabler).

This was using LawlessPPC 10.5.4.

 

I tried installing iDeneb v1.3 (with the nForce patch too) and v1.4 with no luck (it got stuck at "detecting root device" during startup). I tried changing the graphics card to an 8600 GTS and it still wouldn't install.

 

Does anyone know any other good distros for AMD Phenom processors (AMD Hammer chipset)?

 

Thanks


I just have to reply to say thank you to netkas and everyone else for their hard work; I'm now very happily running my GTX 260 on my Gigabyte EX58-UD5. CI/QE are now hardware accelerated with 896 MB of VRAM, and it runs sweeeeeet!

 

As for the install: I removed the NVinject kext and the old GeForce kexts, installed the drivers and injectors, then rebooted and rebuilt the caches, all as others have described earlier in this thread.


Was the VRAM amount of both cards recognized properly?

 

Everything works fine now with the latest CUDA driver from the IRC. Thanks, netkas!

 

BTW, the GTXs are kinda picky when they initialize on PCI-E (e.g. with more than one card, which one gets to be the display for the BIOS). So make sure your motherboard settings take care of that, especially PCI memory remap and PEG port initialization (those are the main things to check on my ASUS board; I don't know about other brands).

 

Cheers, and again... great work!


javi, does deviceQuery (in the CUDA samples) detect the shader clock properly on your cards?

For some reason it doesn't on my GTX 260s; it should be 1.2 GHz, but it is detected as 600 MHz.

 

I am also somewhat disappointed by the speed of CUDA on OS X: non-graphical samples run as fast as CUDA on XP, but CUDA-OpenGL interop performance is terrible.
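Since the CUDA runtime's `cudaDeviceProp.clockRate` field is defined in kilohertz, a small conversion helper makes it easy to compare what deviceQuery reports against the card's rated shader clock. A sketch only: the 1242 MHz stock shader clock for the GTX 260 is from NVIDIA's published specs, so verify it for your exact board.

```shell
# Convert a clockRate value (kilohertz, as cudaDeviceProp.clockRate
# reports it) to GHz for comparison with the card's rated shader clock.
khz_to_ghz() {
  awk -v khz="$1" 'BEGIN { printf "%.2f\n", khz / 1000000 }'
}

khz_to_ghz 1242000   # stock GTX 260 shader clock -> prints 1.24
khz_to_ghz 602000    # a low power-state reading  -> prints 0.60
```

A reading stuck near half the rated value, as described above, points at a throttle state rather than a detection bug.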


Just got my 280 set up on my iPC 10.5.6 build.

 

1. Deleted the following kexts.

 

GeForce.kext

NVDANV40Hal.kext

NVDANV50Hal.kext

NVDAResman.kext

NVinject.kext

 

2. Without rebooting, install the package, then install the enabler.

 

3. Swap cards, reboot, and voilà!

 

Note: As previously stated, multiple monitors do not work; the card will not display anything if both cables are plugged in. Pull out one cable and a single display works fine.
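The steps above can be sketched as a script. Treat it as a dry-run template, not a definitive installer: the kext names are the ones listed in this thread, and on Leopard deleting Extensions.mkext is what forces the kext cache to be rebuilt at the next boot. It only prints what it would do unless you pass --run.

```shell
#!/bin/sh
# Dry-run sketch of step 1 above. Kext names are taken from this
# thread; pass --run (as root) to actually delete, at your own risk.
ext_dir="/System/Library/Extensions"

remove_geforce_kexts() {
  mode="$1"   # "dry" prints actions, "run" performs them
  for k in GeForce.kext NVDANV40Hal.kext NVDANV50Hal.kext \
           NVDAResman.kext NVinject.kext; do
    if [ "$mode" = "run" ]; then
      rm -rf "$ext_dir/$k"
    else
      echo "would remove $ext_dir/$k"
    fi
  done
  # Deleting the mkext makes Leopard rebuild the kext cache at boot.
  if [ "$mode" = "run" ]; then
    rm -f /System/Library/Extensions.mkext
  else
    echo "would remove /System/Library/Extensions.mkext"
  fi
}

if [ "$1" = "--run" ]; then
  remove_geforce_kexts run
else
  remove_geforce_kexts dry
fi
```

After that, install the package and the enabler (steps 2 and 3 above), swap cards, and reboot.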

 

Thanks to netkas and everyone on this thread, I've been waiting for this for a long, long time!

 

[screenshot attached]


Is there anything that can be done to enable both DVI outputs on each card? I have a three-display setup and would like to use the second output on one of my GTX 280s. Regardless, thanks a ton for this.


javi, does deviceQuery (in the CUDA samples) detect the shader clock properly on your cards?

For some reason it doesn't on my GTX 260s; it should be 1.2 GHz, but it is detected as 600 MHz.

 

I am also somewhat disappointed by the speed of CUDA on OS X: non-graphical samples run as fast as CUDA on XP, but CUDA-OpenGL interop performance is terrible.

 

Hmmm... In my case deviceQuery picks up the correct frequency for my shaders (1.5 GHz). It seems that your cards are stuck in the low throttle state (I think the GT200 parts have 3 speed levels for their shaders/cores) and aren't kicking out of it.

 

I haven't noticed any significant slowdown between my Linux CUDA code and the version I just compiled for OS X, although my GL calls are minimal, so I may not be stressing that part of the driver. For normal computation, as I said, I don't see an appreciable slowdown when I time them between Linux and OS X.


Hmmm... In my case deviceQuery picks up the correct frequency for my shaders (1.5 GHz). It seems that your cards are stuck in the low throttle state (I think the GT200 parts have 3 speed levels for their shaders/cores) and aren't kicking out of it.

 

I haven't noticed any significant slowdown between my Linux CUDA code and the version I just compiled for OS X, although my GL calls are minimal, so I may not be stressing that part of the driver. For normal computation, as I said, I don't see an appreciable slowdown when I time them between Linux and OS X.

 

Nod, thanks, I'll look into it.

I'm not sure about Linux; I've never tried CUDA on it. If you have access to an XP machine, try running the smoke and fluids demos on XP and OS X and compare; it's about a 5x performance difference in my case.


Anyone know if the GTX 260 works with dual DVI monitors?

No, none of the GTX cards work with dual monitors with this new injector.

I run a dual-monitor setup with one screen connected to each GTX 260.


No, none of the GTX cards work with dual monitors with this new injector.

I run a dual-monitor setup with one screen connected to each GTX 260.

 

Doh... the wait continues for me. Happy to see all this success though! Have fun, ppl :P


Hey guys, can someone write a step-by-step guide of exactly what I have to do? I have tried installing the drivers many times, still with no result. I'm an absolute noob with Macs, so please have patience.

First I removed GeForce.kext (if there is one), NVDANV40Hal.kext, NVDANV50Hal.kext, NVDAResman.kext, and NVinject.kext, installed the package, then downloaded latest_nvinject_0.2.1.zip, opened it, and installed it with Kext Helper. I also tried other versions of NVinject and some universal injector (found with Google). But I'm still getting this:

 

Graphics by NVIDIA:

 Chipset Model:	Graphics by NVIDIA
 Type:	Display
 Bus:	PCIe
 PCIe Lane Width:	x16
 VRAM (Total):	256 MB
 Vendor:	NVIDIA (0x10de)
 Device ID:	0x05e2
 Revision ID:	0x00a1
 ROM Revision:	NVinject 0.2.1
 Displays:
ASUS VK222H:
 Resolution:	1680 x 1050 @ 60 Hz
 Depth:	32-bit Color
 Core Image:	Software
 Main Display:	Yes
 Mirror:	Off
 Online:	Yes
 Quartz Extreme:	Not Supported
Display Connector:
 Status:	No display connected

 

PC: C2D E8400, ASUS P5Q-E with modified BIOS, Gigabyte GTX 260 896MB, 4GB DDR2, Blackmagic HD Extreme... everything else works well, only that stupid graphics card doesn't.

 

Thanks for any advice.

Papek.


What is your card?

Does dual display work for you?

 

 

I have only one display, and it's connected via a VGA-to-DVI adapter on the connector nearest the motherboard. It's a Zotac GTX 295.

Also, the second core on the card only shows 512 MB instead of 896 MB, unlike the one in the screenshot, and I don't know if the second core has hardware acceleration.

Using 10.5.5.


First of all thanks netkas!

 

I have my Club3D GTX 260 running like a dream in 10.5.7 on an ASUS P5Q with modified BIOS 2002 and a C2Q Q9550 overclocked to 3.72 GHz. Same as most others in this thread: no dual output. The performance increase over the 8600 GTS is disappointing. I flashed the GTX 260 with firmware modified to overclock values that were stable in Windows (RivaTuner) and saw hardly any speed increase in OS X. (The overclock values were big, around +18%; in Windows the card only became unstable at around 23%+ GPU and RAM speed.)

 

Does anybody know if the OS X drivers override the BIOS speed settings, and if so, can the driver-side speed settings be modified with a hex editor?

 

I just bought a GTX 285; I'll need to fly about 13 hours before I can test it. If needed I'll flash it later with a Mac BIOS.


Cool! Could you post the EFI string or tell us how to generate one?

 

 

 

I followed aqua-mac's instructions here http://aquamac.proboards.com/index.cgi?boa...&thread=569

 

 

It's a little shifty with the sound, and every now and again it drops out. I had it working as a 9500 GT with the EFI strings I got from that, and to my surprise it worked with no problems. I think I might go back to that.

 

 

 

*EDIT: It seems to work great now that I reinstalled the kernel; my bad.

 

Thanks aqua for the info.
