List of Lion GF100 cards which make the kext load with no editing


VooD

For the full explanation, go to post 13

0x06c010de&0xffe0ffff 

06C0 = "NVIDIA GeForce GTX 480"
06C4 = "NVIDIA GeForce GTX 465"
06CD = "NVIDIA GeForce GTX 470"
06D1 = "NVIDIA Tesla C2050 / C2070"
06D2 = "NVIDIA Tesla M2070"
06D2 = "NVIDIA Tesla X2070"
06D8 = "NVIDIA Quadro 6000"
06D9 = "NVIDIA Quadro 5000"
06DC = "NVIDIA Quadro 6000 "
06DD = "NVIDIA Quadro 4000"
06DE = "NVIDIA Tesla T20 Processor"
06DE = "NVIDIA Tesla S2050"
06DE = "NVIDIA Tesla M2050"
06DE = "NVIDIA Tesla X2070 "
06DF = "NVIDIA Tesla M2070-Q"

0x0dc010de&0xffc0ffff

0DC0 = "NVIDIA GeForce GT 440"
0DC4 = "NVIDIA GeForce GTS 450"
0DC5 = "NVIDIA GeForce GTS 450 "
0DC6 = "NVIDIA GeForce GTS 450 "
0DD8 = "NVIDIA Quadro 2000"
0DE0 = "NVIDIA GeForce GT 440 "
0DE1 = "NVIDIA GeForce GT 430"
0DDA = "NVIDIA Quadro 2000M"
0DF0 = "NVIDIA GeForce GT 425M"
0DF4 = "NVIDIA GeForce GT 540M "
0DF5 = "NVIDIA GeForce GT 525M"
0DF7 = "NVIDIA GeForce GT 520M"
0DFA = "NVIDIA Quadro 1000M"
0DE2 = "NVIDIA GeForce GT 420"
0DE5 = "NVIDIA GeForce GT 530"
0DF8 = "NVIDIA Quadro 600"

0x0e2010de&0xffe0ffff
0E22 = "NVIDIA GeForce GTX 460"
0E23 = "NVIDIA GeForce GTX 460 SE"
0E24 = "NVIDIA GeForce GTX 460 "

0x104010de&0xffc0ffff
1040 = "NVIDIA GeForce GT 520"
1050 = "NVIDIA GeForce GT 520M "
1056 = "NVIDIA NVS 4200M"
1057 = "NVIDIA NVS 4200M "

0x124010de&0xffc0ffff
1241 = "NVIDIA GeForce GT 545"
1243 = "NVIDIA GeForce GT 545 "
1244 = "NVIDIA GeForce GTX 550 Ti"
1245 = "NVIDIA GeForce GTS 450 "

 

Does anybody know the exact meaning of the wildcards in the Info.plist for the nvidia kexts?

 

I mean, given this:

 

0x06c010de&0xffe0ffff
0x0dc010de&0xffc0ffff
0x0e2010de&0xffe0ffff
0x0ee010de&0xffe0ffff
0x0f0010de&0xffc0ffff
0x104010de&0xffc0ffff
0x124010de&0xffc0ffff

 

What exactly does the 0xffe0ffff value in 0x06c010de&0xffe0ffff mean?

As far as I know it tells the driver to accept any device with the 10de vendor ID (last part) and a device ID starting with 06... so ff = the same... but what is the meaning of the "e0" and "c0" values?

Link to comment
Share on other sites

It's the "tolerance" of the dev ID.

0x06c010de&0xffe0ffff means it can be +/- e0 for the 2nd byte (c0)

It's only there to cover a range of dev IDs at once, so the same kext can load for several variations of the board with different dev IDs. It is useless for us; directly inputting your own dev ID is enough.
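Strictly speaking, what Apple's I/O Kit matching does with a pair like 0x06c010de&0xffe0ffff is bitwise rather than a tolerance: the kext matches when the card's 32-bit device/vendor register, ANDed with the mask, equals the match value ANDed with the mask. A minimal Python sketch of that test (device IDs taken from the list in the first post):

```python
def pci_matches(device_reg, match_entry):
    """Evaluate an IOPCIPrimaryMatch entry of the form '0xVALUE&0xMASK'.

    IOKit compares (device_reg & mask) == (value & mask), where device_reg
    is the 32-bit PCI register: device ID in the high 16 bits, vendor ID
    (0x10de for NVIDIA) in the low 16 bits.
    """
    value_s, mask_s = match_entry.split("&")
    value, mask = int(value_s, 16), int(mask_s, 16)
    return (device_reg & mask) == (value & mask)

# GTX 470 (device ID 0x06cd, vendor 0x10de) against the GF100 entry:
print(pci_matches(0x06cd10de, "0x06c010de&0xffe0ffff"))  # True
# GeForce 9300 GE (0x06e0) does not match: 0x06e0 & 0xffe0 is 0x06e0, not 0x06c0.
print(pci_matches(0x06e010de, "0x06c010de&0xffe0ffff"))  # False
```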

Link to comment
Share on other sites

 

Thanks for the info. The point of learning this was to make a "compatibility table" based on the default PCI IDs and wildcards.

 

I don't want to get a VGA card that forces me to mod the kext with every new update, so I was looking for a way to know exactly what is supported by default by every kext.

 

But I don't completely understand the "+/- e0" stuff. If I subtract e0 from c0 I get a value less than 0, and if I add e0 to c0 I get a value bigger than a single byte.

 

Thanks for your help. :)

Link to comment
Share on other sites

The list in your first post is from GF100Hal.kext right?

 

06c0 is the device ID of the GTX 480. If I'm reading Krazubu right, it's the first device ID in the range of video cards supported by GF100Hal.kext, and you're not supposed to subtract anything.

 

The device ID of the GTX 470 is 06cd, which means (if I'm following the logic correctly!) that the GTX 470 is also supported by GF100Hal.kext without having to modify anything. It falls inside the range of supported IDs.

 

The 0x0e20 range is for GF104-based cards like the GTX 460. Mine has dev ID 0x0e22.

 

More device IDs here:

http://forge.voodooprojects.org/p/chameleo...ibsaio/nvidia.c

Link to comment
Share on other sites

I know that. But the point is that those wildcards allow many more cards, not specifically listed, to work without manually editing any kext. I want to know how the wildcards work so I can make a list of directly compatible cards by looking at their names and PCI IDs inside the Windows driver .inf.

Link to comment
Share on other sites

You have that information already - any device ID that fits in the allowed range will cause the corresponding NVxxxHal.kext to load.

 

If I'm understanding this correctly, 0x06c010de&0xffe0ffff means that all cards with dev ID in the range of 0x06c0 to 0xffe0 will cause the driver to load.

 

But you need to consider that even if the driver does load there is no guarantee that the "supported" cards will really work. For example there is a Zotac GTX 460 that has a strange output configuration, the driver will load for it because the device ID matches, but you need to inject a patched BIOS ROM with NVEnabler or Chameleon GraphicsEnabler to be able to use the DVI ports.

 

Also there were issues (can't remember what they were anymore) with the 9800GTX+ with device ID 0x0613 while the 0x0612 version works perfectly.

 

I can't get a grip on what the situation is with the 8400GS, but some versions of it seem to cause issues; just do a forum search to see what I mean. Then again, maybe it's no different from the others; maybe there are more "perceived" issues with it simply because it is a cheap and popular video card and more people own it.

 

I'm sure there are many other quirky cards like those.

Link to comment
Share on other sites

You're welcome, looking forward to seeing the list. It should be a useful sticky post here in the nvidia subforum.

I think there is a problem with your theory. You said the first line:

 

0x06c010de&0xffe0ffff

 

Means the driver loads when it finds a 06c0 device ID, or any device ID from 06c0 to ffe0.

If that were correct, the GF100 driver would be loaded for the "NVIDIA GeForce 9300 GE", which is 06e0.

 

So by now, the only ones I can say for sure are:

 

06c0 "NVIDIA GeForce GTX 480"

0dc0 "NVIDIA GeForce GT 440"

0e20 - not present in current Windows desktop and mobile drivers - Future model? Custom id for Apple versions?

0ee0 - not present in current Windows desktop and mobile drivers - Future model? Custom id for Apple versions?

0f00 - not present in current Windows desktop and mobile drivers - Future model? Custom id for Apple versions?

1040 "NVIDIA GeForce GT 520"

1240 - not present in current Windows desktop and mobile drivers - Future model? Custom id for Apple versions?

 

 

And for GF50

 

00f0 - not present in current Windows desktop and mobile drivers - Future model? Custom id for Apple versions?

0190 - not present in current Windows desktop and mobile drivers - Future model? Custom id for Apple versions?

0400 "NVIDIA GeForce 8600 GTS"

0420 "NVIDIA GeForce 8400 SE"

05e0 "NVIDIA GeForce GTX 295"

05f0 - not present in current Windows desktop and mobile drivers - Future model? Custom id for Apple versions?

0600 "NVIDIA GeForce 8800 GTS 512"

0620 - not present in current Windows desktop and mobile drivers - Future model? Custom id for Apple versions?

0640 "NVIDIA GeForce 9500 GT"

06e0 "NVIDIA GeForce 9300 GE"

0860 "NVIDIA GeForce 9400"

08a0 "NVIDIA GeForce 320M"

0a20 "NVIDIA GeForce GT 220"

0ca0 "NVIDIA GeForce GT 330"

 

Anyway I still need to know how those wildcards work. For example, I know the "NVIDIA GeForce 9600 GT" works without any editing; its ID is 0622, and its line in the Info.plist is "0x062010de&0xffe0ffff"

 

This is all the info I could find about those wildcards (Apple calls them MASKS).

 

http://developer.apple.com/library/mac/#do.../TP30000347-TP9

 

As far as I understand in those docs:

 

F means = keep the same, and

0 means = any value.
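That F/0 reading generalizes to the other digits because the comparison is bitwise: each mask digit is 4 bits, and only the bits set in the mask have to match. A quick sketch using the four 9600 GT device IDs quoted later in the thread (0622, 062D, 062E, 0637):

```python
NV50_MASK = 0xFFE0   # device-ID half of the mask in 0x062010de&0xffe0ffff

# F (binary 1111) keeps all four bits of a digit, 0 (0000) ignores the
# digit entirely, and E (1110) keeps only the top three bits.  All four
# known 9600 GT device IDs therefore collapse to the same masked value,
# which is why they all load the kext without edits:
for devid in (0x0622, 0x062D, 0x062E, 0x0637):
    assert devid & NV50_MASK == 0x0620
print("all 9600 GT variants match the 0x0620 entry")
```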

Link to comment
Share on other sites

You can add this to the list of cards that will work with no modification: 0612 - 9800 GTX+

I think there is a problem with your theory. You said the first line:

 

0x06c010de&0xffe0ffff

 

Means the driver loads when it finds a 06c0 device ID, or any device ID from 06c0 to ffe0.

If that were correct, the GF100 driver would be loaded for the "NVIDIA GeForce 9300 GE", which is 06e0.

 

Yes..but NVDANV50Hal.kext would load for that card, not NVDAGF100Hal.kext.

 

I'm not sure how that actually works..maybe there's more to the identification than just the device ID.

Look at the device ID range in NVDANV50Hal.kext, you should find the 9300 GE in the allowed range there. Maybe something can be learned by comparing the two, the IDs seem to be overlapping but there must be more to it than that.

Link to comment
Share on other sites


I think I got it:

 

As far as I understand in those docs:

 

F means = keep the same, and

0 means = any value.

 

If I trust the first post and the masks are tolerance-based, it would be:

 

F = no tolerance, the number has to be the same.

0 = full tolerance, the number can be anything.

E = the value can be +/- 1

D = the value can be +/- 2

C = the value can be +/- 3, etc...
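The +/- reading is easy to sanity-check by brute force. Under the bitwise interpretation from Apple's docs, enumerating every 16-bit device ID that passes the test for 0x062010de&0xffe0ffff produces one contiguous block, 0x0620 through 0x063F: the E digit drops only the lowest bit of its nibble, so there is no 0x0610-0x061F group.

```python
value, mask = 0x0620, 0xFFE0   # device-ID half of 0x062010de&0xffe0ffff
# Enumerate all 16-bit device IDs that satisfy the bitwise match test:
matching = [i for i in range(0x10000) if (i & mask) == value]
print(hex(matching[0]), hex(matching[-1]), len(matching))   # 0x620 0x63f 32
```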

 

 

So in this case:

 

"0x062010de&0xffe0ffff"

 

That would mean:

 

0620,

0620 -> 062F , (0 deviation in the first half of the second byte, and any number at the second half of the second byte)

0610 -> 061F , (-1 deviation in the first half of the second byte, and any number at the second half of the second byte)

0630 -> 063F (+1 deviation in the first half of the second byte, and any number at the second half of the second byte) and

 

That would make the 9600GT load with gf50, since they are

 

0622.01 = "NVIDIA GeForce 9600 GT"

062D.01 = "NVIDIA GeForce 9600 GT "

062E.01 = "NVIDIA GeForce 9600 GT "

0637.01 = "NVIDIA GeForce 9600 GT "

 

:)

 

I'm going to try to apply my "theory" to the first line in the gf100 kext

 

0x06c010de&0xffe0ffff

 

Valid ranges:

 

06b0 -> 06bf

06c0 -> 06cf

06d0 -> 06df

 

 

Now the second line 0x0dc010de&0xffc0ffff

 

Means it would load for all these ranges:

 

0DF0 -> 0DFF

0DE0 -> 0DEF

0DD0 -> 0DDF

0DC0 -> 0DCF ---

0DB0 -> 0DBF

0DA0 -> 0DAF

0D90 -> 0D9F
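For comparison, brute-forcing 0x0dc010de&0xffc0ffff under the bitwise interpretation gives a single block as well, 0x0DC0-0x0DFF: the C digit (binary 1100) ignores the low six bits of the ID, and 0x0D90 & 0xFFC0 is 0x0D80, so the sub-0x0DC0 ranges fall outside the match.

```python
value, mask = 0x0DC0, 0xFFC0   # device-ID half of 0x0dc010de&0xffc0ffff
# Enumerate all 16-bit device IDs that satisfy the bitwise match test:
matching = [i for i in range(0x10000) if (i & mask) == value]
print(hex(matching[0]), hex(matching[-1]), len(matching))   # 0xdc0 0xdff 64
```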

 

For the third line: 0x0e2010de&0xffe0ffff

 

0E30 -> 0E3F

0E20 -> 0E2F ---

0E10 -> 0E1F

 

The fourth line has no use, since those are the only devices starting with 0E. Maybe future cards will use additional IDs in that range.

Also the fifth line has no matches among current Nvidia IDs (it is probably reserved for future cards).

 

The sixth line 0x104010de&0xffc0ffff ranges are:

 

1070 -> 107F

1060 -> 106F

1050 -> 105F

1040 -> 104F

1030 -> 103F

1020 -> 102F

1010 -> 101F

 

Finally 0x124010de&0xffc0ffff ranges are:

 

1270 -> 127F

1260 -> 126F

1250 -> 125F

1240 -> 124F

1230 -> 123F

1220 -> 122F

1210 -> 121F

 

So let's do a quick recap of the cards which don't need a plist modification in order to use the GF100 kext:

 

0x06c010de&0xffe0ffff 

06C0 = "NVIDIA GeForce GTX 480"
06C4 = "NVIDIA GeForce GTX 465"
06CD = "NVIDIA GeForce GTX 470"
06D1 = "NVIDIA Tesla C2050 / C2070"
06D2 = "NVIDIA Tesla M2070"
06D2 = "NVIDIA Tesla X2070"
06D8 = "NVIDIA Quadro 6000"
06D9 = "NVIDIA Quadro 5000"
06DC = "NVIDIA Quadro 6000 "
06DD = "NVIDIA Quadro 4000"
06DE = "NVIDIA Tesla T20 Processor"
06DE = "NVIDIA Tesla S2050"
06DE = "NVIDIA Tesla M2050"
06DE = "NVIDIA Tesla X2070 "
06DF = "NVIDIA Tesla M2070-Q"

0x0dc010de&0xffc0ffff

0DC0 = "NVIDIA GeForce GT 440"
0DC4 = "NVIDIA GeForce GTS 450"
0DC5 = "NVIDIA GeForce GTS 450 "
0DC6 = "NVIDIA GeForce GTS 450 "
0DD8 = "NVIDIA Quadro 2000"
0DE0 = "NVIDIA GeForce GT 440 "
0DE1 = "NVIDIA GeForce GT 430"
0DDA = "NVIDIA Quadro 2000M"
0DF0 = "NVIDIA GeForce GT 425M"
0DF4 = "NVIDIA GeForce GT 540M "
0DF5 = "NVIDIA GeForce GT 525M"
0DF7 = "NVIDIA GeForce GT 520M"
0DFA = "NVIDIA Quadro 1000M"
0DE2 = "NVIDIA GeForce GT 420"
0DE5 = "NVIDIA GeForce GT 530"
0DF8 = "NVIDIA Quadro 600"

0x0e2010de&0xffe0ffff
0E22 = "NVIDIA GeForce GTX 460"
0E23 = "NVIDIA GeForce GTX 460 SE"
0E24 = "NVIDIA GeForce GTX 460 "

0x104010de&0xffc0ffff
1040 = "NVIDIA GeForce GT 520"
1050 = "NVIDIA GeForce GT 520M "
1056 = "NVIDIA NVS 4200M"
1057 = "NVIDIA NVS 4200M "

0x124010de&0xffc0ffff
1241 = "NVIDIA GeForce GT 545"
1243 = "NVIDIA GeForce GT 545 "
1244 = "NVIDIA GeForce GTX 550 Ti"
1245 = "NVIDIA GeForce GTS 450 "
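The ranges above can also be generated mechanically. A short sketch that expands each GF100 match entry (the strings are copied from the lists above) into the device-ID range it covers, assuming the bitwise value&mask semantics from Apple's matching docs:

```python
GF100_ENTRIES = [
    "0x06c010de&0xffe0ffff",
    "0x0dc010de&0xffc0ffff",
    "0x0e2010de&0xffe0ffff",
    "0x0ee010de&0xffe0ffff",
    "0x0f0010de&0xffc0ffff",
    "0x104010de&0xffc0ffff",
    "0x124010de&0xffc0ffff",
]

def device_id_range(entry):
    """Return (low, high) device IDs covered by a '0xVALUE&0xMASK' entry."""
    value, mask = (int(p, 16) for p in entry.split("&"))
    dev_value, dev_mask = value >> 16, mask >> 16   # high 16 bits = device ID
    low = dev_value & dev_mask                      # ignored bits at minimum
    high = low | (~dev_mask & 0xFFFF)               # ignored bits at maximum
    return low, high

for e in GF100_ENTRIES:
    lo, hi = device_id_range(e)
    print(f"{e}: 0x{lo:04X}-0x{hi:04X}")
# 0x06c010de&0xffe0ffff: 0x06C0-0x06DF
# 0x0dc010de&0xffc0ffff: 0x0DC0-0x0DFF
# 0x0e2010de&0xffe0ffff: 0x0E20-0x0E3F
# 0x0ee010de&0xffe0ffff: 0x0EE0-0x0EFF
# 0x0f0010de&0xffc0ffff: 0x0F00-0x0F3F
# 0x104010de&0xffc0ffff: 0x1040-0x107F
# 0x124010de&0xffc0ffff: 0x1240-0x127F
```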

 

Now if someone could confirm I'm right I would be quite happy :P. So according to this list a GTX 470 would be the best price/power card with direct support in Lion... but I heard power management is not working, so the card is always in 2D mode ;).

I remember something similar happened on my 8600M GT, but it could be solved by forcing the screen to sleep.

Link to comment
Share on other sites

Your findings make a lot of sense, hopefully some of our video card gurus can confirm.

So according to this list a GTX470 would be the best price/power card with direct support in Lion..

According to a handful of reviews that I read before I bought it, the GTX 460 is better value for money than the GTX 470. It's been more than six months and I don't remember the details though.

Link to comment
Share on other sites

I was considering going Nvidia basically in order to have CUDA support, but it seems Apple is moving to AMD/ATI, and apparently the latest ones (such as the 6870, which is my other option) work better in OS X and have much lower power requirements.

Link to comment
Share on other sites

LOL so.. all this research for nothing? :)

 

The Fermi cards that are supported by GF100Hal.kext should work fine in Lion; all you need is GraphicsEnabler=y and, if you have a Gigabyte motherboard, PciRoot=1.

 

I've seen several posts claiming that the notorious Fermi Freeze that plagues GF104 based cards on Snow Leopard is gone in Lion.

Link to comment
Share on other sites

Nah, I've changed my mind; I think I'll probably get a 560 Ti even if it's not supported by default. I think CUDA and PhysX support are important enough to compensate for the extra power consumption and editing the kext on every update (though maybe a legacy kext could do the job).

Link to comment
Share on other sites

I have a particular interest in this topic regarding my Sapphire GTX 570 SC HD. Other threads suggest the 570 has a device_id of 1081. My 570 has a device_id of 1086 (probably driven by the presence of DisplayPort, which doesn't work in Lion at this time). Since I patched the hal100 kext right away, I don't know if it would have been enabled without the patch (I will test when I return from travel). With the edit, my card is enabled but not recognized. I notice that in Chameleon's nvidia.c no entry exists for device_id 1086. What is the process to get the model name of the graphics card associated with the device_id in Chameleon? In the meantime, I inject the name with the DSDT.

Link to comment
Share on other sites

I guess you should compile your own modified Chameleon version, or maybe ask for help in the Voodoo Labs forums. Anyway, your ID is not present in the default ranges for the kext, so you still have to add it manually.

Link to comment
Share on other sites

Thank you so much for putting this list together.

 

I have always meant to gather the info and do this but never did.

 

It needs to be remembered that the other issue with Fermi and other Nvidia cards not used regularly in Macs is that they frequently stay in 2D mode due to throttling not getting triggered properly.

 

If your Hack identifies as "Mac Pro 4,1" or "Mac Pro 5,1", you can edit the AGPM kext to fix this.

 

It is easy to figure out if you have the issue: run OpenGL View and check the test results.

 

If you get numbers in the hundreds, you need the fix. If the numbers start in the hundreds and then JUMP into the thousands, there is no throttling issue.

 

The AGPM is still a bit of a mystery. Oddly, the two GPUs enumerated in it are the GT120 and the GTX260. Since the GTX260 was never a "real" Mac card, this is very odd, especially since a GTX285 can throttle itself.

Link to comment
Share on other sites

I remember the same happened with the mobile 8600GT in my Acer notebook, but curiously, if you put the screen to sleep, once you wake it the clock stays in 3D mode.

Link to comment
Share on other sites

Well, I don't get it.

 

I have an NVIDIA GT 430 but no native support. I need an EFI string to get the card working properly. Am I doing something wrong, or do I have to install something?

 

You need an injector: either Chameleon, an EFI string, or a kext able to inject. The problem is that Chameleon doesn't include the PCI ID for your 430, so if you want to use its "GraphicsEnabler" option you have to modify the source code and compile your own version; it's probably easier to just use an EFI string.

Link to comment
Share on other sites
