GPU and IGP Running a Monitor Each?

shrieken213

Hi,

 

I found out recently that when both of my monitors are wired to the same graphics card, performance drops by ~10% even when the card is idling. So I moved one of my monitors over to the IGP, where it acts as the secondary display while the other stays on the GPU as my main monitor. This setup works perfectly in Windows.

 

On the Hackintosh, it's causing me a lot of trouble. At first the IGP (the HD 4600 on a 4770K, PCI ID 8086:0412) refused to be recognized by Yosemite until I injected vendor-id 0x8086 and device-id 0x0412. I'm still working on finding the best ig-platform-id, but that's not the issue. When I did get the IGP enabled, the GPU stopped working! No output from the DVI and HDMI ports on the GPU, and no HDMI from the IGP either, while the IGP is enabled.

 

So simply put, when one starts working, the other stops.

 

That isn't to say the card stops being recognized. Both the GTX 770 and HD 4600 are recognized by OS X, but only the IGP is able to output a video signal when IGP is enabled.

 

I've tried rerouting the IGP's _ADR to 0x00210000 to stop it from conflicting with the GPU, but my DSDT doesn't have a GFX0 entry. GFX0 is only referenced in the DSDT, so I assume the device itself is defined in one of the SSDTs.
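
(Background on the ACPI side: an External (...) line in a DSDT is only a forward reference, so the actual Device body will be sitting in one of the SSDTs, and _ADR encodes PCI (device << 16) | function, which puts the integrated graphics at 0:2.0 at 0x00020000 and the PEG root port at 0:1.0 at 0x00010000. A minimal, purely illustrative sketch of such a declaration; the table header and placement are made up, not taken from the attached tables:)

    DefinitionBlock ("", "SSDT", 2, "DEMO", "IGPDEMO", 0x00000000)
    {
        External (\_SB.PCI0, DeviceObj)      // PCI0 itself is declared elsewhere (in the DSDT)

        Scope (\_SB.PCI0)
        {
            Device (GFX0)                    // the name is arbitrary; the _ADR ties it to the PCI device
            {
                Name (_ADR, 0x00020000)      // bus 0, device 2, function 0: the integrated graphics
            }
        }
    }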

 

Can someone help please? > < It's kind of painful using a computer on a vertical monitor only...

DSDT.aml.zip



For both graphics enabled, IGPU must be boot display.

BIOS setting:

1. Primary Display: IGPU

HD4600 settings:

1. Graphics/Inject/Intel/YES

2. IGPlatformID=0d220003

 

Edit: 8/8


For both graphics enabled, IGPU must be boot display.

BIOS setting:

1. Primary Display: IGPU

HD4600 settings:

1. IGPlatformID=0x0300220D

 

I just booted with the BIOS set to Initial Display Output: IGFX instead of PCIe 1. No effect.

Also, 0x0300220D causes a reboot right after the initial startup phase, and I don't know why... bcc9 has the same motherboard and wrote an entire article on this using that IGPlatformID, but my system refuses to boot with it.



Did you undo all your previous experiments, i.e., FakeID, Inject Intel, Inject Nvidia, etc.?

The settings noted above are all that are required.


I think you need to have both GFX0 and IGPU injected in your ACPI tables; that's what worked for me, at least. It looks like you are correct: your dedicated graphics card, and possibly your integrated graphics, are defined in one of your SSDTs. Also, on an unmodified DSDT, GFX0 is usually the integrated graphics and PEGP is the dedicated card. Some of your tables still reference GFX0, and I'm pretty sure those references should all be changed to IGPU; otherwise, when you do change PEGP to GFX0 and inject your card, you will have problems. So look through your SSDTs for PEGP and GFX0, do the injections there, and then change the names in the DSDT accordingly.
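
(Illustratively, the renaming being described comes down to something like the sketch below. It is a bare-bones example with made-up headers; in the real tables, every method that still references the old names has to be updated as well, which is the point about the stray GFX0 references.)

    DefinitionBlock ("", "SSDT", 2, "DEMO", "RENAME", 0x00000000)
    {
        External (\_SB.PCI0, DeviceObj)
        External (\_SB.PCI0.PEG0, DeviceObj)

        Scope (\_SB.PCI0)
        {
            Device (IGPU)                    // was GFX0 in the vendor tables: the integrated graphics
            {
                Name (_ADR, 0x00020000)
            }
        }

        Scope (\_SB.PCI0.PEG0)
        {
            Device (GFX0)                    // was PEGP: the dedicated card behind the PEG root port
            {
                Name (_ADR, Zero)
            }
        }
    }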

 

I could be wrong, but I remember reading that Clover and Chameleon have slightly different formatting (for lack of a better term, sorry) for ig-platform-id so that one might work in Chameleon but not Clover.

 

EDIT: found it. http://www.insanelymac.com/forum/topic/284656-clover-general-discussion/?p=2040293

The byte order is the reverse of Chameleon's.

So you should actually use 0x0d220003 in Clover.
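
(To make the byte order concrete: the ID is one 32-bit value, and only its spelling changes between the two places. A tiny, purely illustrative snippet follows; the table header and the IGID name are made up.)

    DefinitionBlock ("", "SSDT", 2, "DEMO", "IGPID", 0x00000000)
    {
        // Clover's config.plist takes the value as written:  ig-platform-id = 0x0D220003
        // A DSDT/SSDT injection stores the same value as a little-endian byte buffer,
        // which is also how it reads back in IOReg: <03 00 22 0d>
        Name (IGID, Buffer (0x04) { 0x03, 0x00, 0x22, 0x0D })
    }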



 

All injections are disabled at the moment. Enabling Intel injection injects the wrong device ID: 0x04120412, so it's reserved for when I mess up my DSDT. :)

 


I noticed the wonkiness in injecting hex values in the DSDT with Clover. Will try that.

EDIT: Tried it and it makes the IGP unrecognizable, even with the vendor ID and device ID set correctly. The GPU loads instead.

 

On another note, I found both PEGP and GFX0 in SSDT5.aml. I had to compile it together with the DSDT to resolve the external device references. PEGP is a tiny device located at \_SB.PCI0.PEG0, while GFX0 sits under \_SB.PCI0 and takes up more than half the SSDT. GFX0 also has the address 0x00020000, if that's of any note. I'm not sure which one is the IGP and which is the GPU, and I did not find any other references to GFX0, IGPU or PEGP in any of the 7 other SSDTs.

SSDT.zip


I injected a GFX0 device under \_SB.PCI0 in the DSDT using RampageDev's example here: http://rampagedev.wordpress.com/guides/inject-your-nvidia-fermi-and-quadro-graphic-card-into-a-dsdt/ and values already found in my working setup. The injection worked, I think, but the card still resides under PCI0.PEG0 instead of directly under PCI0, and the IGP still overrides the GPU when the ig-platform-id is set to 0x0000160a. 0x0d220003 and 0x0300220d both cause IGP failure...
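
(Exactly which values that guide injects isn't shown here, but a discrete-card _DSM injection generally has the same shape as the Intel one and is attached wherever the card actually enumerates, i.e. behind the PEG0 root port rather than directly under PCI0. A rough, illustrative sketch only; the header, names, and property values below are placeholders, not the guide's.)

    DefinitionBlock ("", "SSDT", 2, "DEMO", "NVDADEMO", 0x00000000)
    {
        External (\_SB.PCI0.PEG0, DeviceObj)

        Scope (\_SB.PCI0.PEG0)
        {
            Device (GFX0)                                  // device 0, function 0 on the PEG root port
            {
                Name (_ADR, Zero)
                Method (_DSM, 4, NotSerialized)
                {
                    If (LEqual (Arg2, Zero)) { Return (Buffer (One) { 0x03 }) }
                    Return (Package ()
                    {
                        "model", Buffer () { "NVIDIA GeForce GTX 770" },   // placeholder label
                        "hda-gfx", Buffer () { "onboard-2" }               // placeholder; only relevant for HDMI audio
                    })
                }
            }
        }
    }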


I identified, using my MacBookPro8,1 as reference, the GFX0 entry in SSDT5.aml as the IGPU and patched it accordingly. Now, no kext gets loaded regardless of platform ID!

I tried injecting both GFX0 and IGPU data, with the GPU under \_SB.PCI0.PEG0.GFX. Injecting IGPU seems to be beneficial, as at least SOMETHING shows up in IOReg.


I would put it back to where you had it before, then. Have you tried other platform IDs?

Check out this post with a bunch of other ones. Here they are with the bytes flipped:

0x00000604

0x04001204

0x00001604

0x02001604

0x00002604

0x0000260A

0x0500260A

0x0600260A

0x0800260A

0x08002E0A

0x0000060C

0x0000160C

0x0000260C

0x0000260D

0x0700260D

 

About the byte flipping: I was looking at one of Pike's blog posts with a DSDT patch for Haswell, and the bytes are used the same way Clover does it. As you can see in that link, one of the platform IDs is 0A160000, so it seems like that's the correct way to do it for DSDT injection. Oh, and here is the relevant post from Pike: http://pikeralpha.wordpress.com/2013/06/16/intel-hd4600-with-full-resolution/

 

Have you tried all the different ports for the IGP to see if it makes a difference? Looks like you have several.

 

Also, does the 10% performance decrease happen in Windows as well or only OS X? If it's only in OS X I would think it could be solved with some AppleGraphicsPowerManagement edits.



Indeed I saw both articles. The platform IDs and ports don't help because the kexts are not loading in the first place for an unknown reason. I even modified the kexts so that the PCI ID match is set to "IGPU" to no effect.

 

I also tried artur_pt's suggestion of going with an iMac14,1 SMBIOS, still no luck. Going back to the MacPro6,1 SMBIOS.

 

The performance hit is inherent in a GPU driving two monitors regardless of the operating system. It has to draw twice the amount of pixels; no patch can fix that. :P



Got it. You're using a FakeID in either Clover or the DSDT, right? Because if you're just adding it to the kexts, it needs to be in both AppleIntelHD5000Graphics and AppleIntelFramebufferAzul, AFAIK.


No FakeID, no InjectIntel in Clover. Injecting actual device IDs via DSDT according to the preexisting values found in both HD5000.kext and FBAzul.kext: 0x80860412.

 

But the issue is it won't even load the kexts in the first place, as IlhanRaja pointed out.

 

Running RampageDev's SSDT does a proper GFX0 device injection but I'm still lost as to why the internal graphics won't work.

 

EDIT: RampageDev's SSDT required the same GFX0->IGPU patch as well as the flipped byte order for Clover. So far, the PlatformID 0x0D220003 inside the SSDT is showing the same restart behavior as when I injected it via DSDT, so that's something I guess. Will post back with previously working 0x0A160000.


Corrected Post #2.

Working HD4600/Clover Injection

1. Graphics/Inject/Intel/YES

2. Graphics/Inject/Intel/YES & ig-platform-id: 0d220003

3. Graphics/Inject/Intel/YES & ig-platform-id: 0x0d220003

 

dsdt/ssdt HD4600 Injection

1. Graphics/Inject/Intel/NO & "AAPL,ig-platform-id":  0x03, 0x00, 0x22, 0x0D

 

First time: 0d220003 = 0x0d220003, Clover feature

dsdt: 0x03, 0x00, 0x22, 0x0D, Azul: 0300220d0003030300000002, IOReg: <03 00 22 0d>

 

Back to your issue:

1. no need to try any other framebuffer; others have LVDS or no connectors

2. native dsdt is fine, device names do not matter (only exception: HD4600 GPU PM, later)

 

Attach an IOReg with one of the injection options installed, HD4600 primary and boot display attached.
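
(For reference, a minimal sketch of what the dsdt/ssdt option above can look like as a standalone SSDT, using the exact bytes listed. The table header is made up, and it assumes the IGP node in the tables is, or has been renamed to, IGPU at 0x00020000; as noted, Graphics/Inject/Intel stays at NO with this route.)

    DefinitionBlock ("", "SSDT", 2, "DEMO", "HD4600", 0x00000000)
    {
        External (\_SB.PCI0.IGPU, DeviceObj)

        Scope (\_SB.PCI0.IGPU)
        {
            Method (_DSM, 4, NotSerialized)
            {
                If (LEqual (Arg2, Zero)) { Return (Buffer (One) { 0x03 }) }
                Return (Package ()
                {
                    // 0x0D220003 as a little-endian buffer; shows in IOReg as <03 00 22 0d>
                    "AAPL,ig-platform-id", Buffer (0x04) { 0x03, 0x00, 0x22, 0x0D }
                })
            }
        }
    }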



I repeat myself: 0x0300220d injected through any method causes either a restart on startup or no video output and disables both the GPU and IGP at the same time. I am unable to provide any IOReg file for you with these settings.


It wasn't clear whether or not you tried the other platform IDs I posted; I'm assuming you did? I remember reading a thread on editing the connector data for the different framebuffers, similar to what you would do for an ATI card. Also, I wonder if there is a whitelist for which board IDs or SMBIOSes can load those kexts, and that's why they're not loading (and maybe why you don't have a display?)... Also, do you get a display with that platform ID if you have your dedicated GPU removed? At least then you might be able to get an IOReg for toleda.

I'm also wondering if adding AAPL,boot-display in the DSDT would get you video output.

 

 

 

EDIT: I was going to post something else, but the forum won't let me; it keeps cutting off half of my post. I think I found something in your SSDT related to the graphics behaving differently depending on which OS you are running. What version of Windows are you using where this setup works properly?



Post #1 implies IGP working.  "That isn't to say the card stops being recognized. Both the GTX 770 and HD 4600 are recognized by OS X, but only the IGP is able to output a video signal when IGP is enabled."

 

Choices:

1. Validate setup on Mavericks

2. Set BIOS/Primary Display to PCIe until Yosemite/HD4600 boot problem solved

3. Use Yosemite/DP2; no HD4600 boot issue

4. Set  BIOS/Primary Display to IGFX; boot without display connected to HD4600.  When Desktop appears, plug in HD4600 display (tested with DP)


As a side note, as far as compatibility is concerned, Haswell was a bit of a step backward compared to Ivy Bridge, where it is usually just a matter of setting things up properly, with no need for hard hacks or workarounds. I have the same setup on Ivy Bridge, IGPU and GPU running at the same time, each one powering its own monitor (the GPU's uses VGA), and since Toleda and RampageDev helped me set up my DSDT, I have had zero issues related to it (in fact, all the issues I've ever had with my main hack were related to a BIOS corruption caused by a power surge after a heavy storm).



 

1. BIOS setting for primary monitor does not change a single thing. POST is displayed on the GPU no matter what...and that's with me changing the settings.

2. Loaded the OEM SSDT tables alongside Piker's PM SSDT and RD's SSDT, and undid my GFX0/PEGP patch on both the DSDT and RD's SSDT. The GPU loads under PCI0/AppleACPIPCI/PEG0@1/IOPP/ as device PEGP, and the IGP is "recognized" as GFX0 under PCI0/AppleACPIPCI/. No video output from the IGP, no kexts loading. **edit** On second look, the IGP _DSM injection did not work at all. I will try to move RD's injection directly into the DSDT.

3. Tried plugging in the display after the desktop appears, to no effect. The kexts aren't loading in the first place!

 

 

Edit2: Tried booting with correct injection. System boots into IGP only AGAIN; GPU is disabled the moment IGP is enabled. Also, the system will not boot unless I hot-plug the monitor after boot process.

Also, if I hot-plug anything into the HDMI port, I get a KP I think. It says "Debugger called <> Debugger not configured. Hanging." It crashes the system, simply put...

 

That is not to say that the GPU ceases to be recognized. The Nvidia drivers correctly read the VBIOS settings, VRAM capacity and even the ROM version. Nvidia accelerator loads fine, but there is simply no video output.

 

Edit3: Back to instant reboot before desktop, 0x0300220d


If you are in fact getting a kernel panic, the boot flag debug=0x144 along with -v may give you a little more info... Maybe the problem is the Nvidia rather than the integrated graphics? I know the kexts still aren't loading, but maybe you should figure out how to get both screens working first, idk. When you used the Nvidia DSDT patch, did you specify the right connector(s)? http://rampagedev.wordpress.com/guides/inject-your-nvidia-fermi-and-quadro-graphic-card-into-a-dsdt/

 

Not loading the kexts is typically a wrong device ID or (maybe) a wrong SMBIOS, but I know you already tried all that. I've noticed that in some kexts, different device IDs are made to operate slightly differently. The one you are using seems to be what everyone uses, though.

 

Anyway, sorry, but I'm just about as stumped as you are; this is kind of beyond my knowledge :-( It sounds like you may actually be more knowledgeable about this than I am.

 

The only other suggestions I have would be:

- trying an EFI string for either card

- updating the BIOS with a modded or beta one from TweakTown that has newer OPROMs (in my experience these are often unstable)

- as a last resort... putting a semi-cheap but compatible Nvidia card in a secondary slot and using that for the second monitor instead

 

I wonder if netkas would know a solution to this...


Further testing seems to show that if the IGP is enabled (i.e. kexts load) at all, regardless of the port that the monitor is plugged into, the graphics card video output is automatically disabled.

That said, the SSDT edits by RampageDev do seem to make the kexts load, so I guess that's a step in the right direction. But for now, it's an either-or situation.

 

I've tried to roll back the Intel HD graphics kexts to Mavericks, but the OS complains of improperly compiled headers or something and refuses to load them.

 

I'm out of options, I guess I'll just run with a single monitor and the IGP disabled on Yosemite.



That's unfortunate. If I were you, I'd probably just get a cheap second card that's supported. I'm pretty sure that would work.

You know what's weird? Your situation is exactly what happens to me in Windows, but not OS X.

