I recently acquired a Samsung U28E510 monitor (4K UHD) for my hackintosh and had a hard time finding the information I needed to get it working at full resolution with my integrated Intel HD 4600 graphics (desktop H97 chipset with an i7-4790S). This post shares how I got it working. As a new hackintosher, I found a lot of the information confusing because it assumed I already knew how the pieces fit together.
I was using Sierra 10.12.6 (iMac15,1 SMBIOS) and already had the HD 4600 working with full acceleration on my 1080p monitor. When I first plugged in the new monitor, I could get at most 2560x1440 @ 60Hz from macOS using the DisplayPort cable that came with the monitor. Using a Linux live USB, I could get 3840x2160 @ 60Hz, so I knew the hardware was working and capable.
My old config used the kexts Lilu, Shiki and IntelGraphicsFixup via Clover injection. I was injecting the ig-platform-id 0x0d220003, and "<Apple Icon> -> About This Mac" showed 1536 MB allocated to graphics.
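For reference, injecting an ig-platform-id through Clover happens in the Graphics section of config.plist. A minimal fragment (a sketch using Clover's standard key names; your surrounding config will differ) looks like this:

```xml
<key>Graphics</key>
<dict>
	<key>Inject</key>
	<dict>
		<key>Intel</key>
		<true/>
	</dict>
	<key>ig-platform-id</key>
	<string>0x0d220003</string>
</dict>
```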
I added CoreDisplayFixup.kext to avoid the pixel clock limit in the CoreDisplay framework. Since that required a newer version of Lilu, I downloaded updated source for Lilu, IntelGraphicsFixup and Shiki and rebuilt all of those along with CoreDisplayFixup.
The hardest part for me to understand was why macOS wouldn't offer any option above 1440p. I could get to 2160p with SwitchResX, but the setting wouldn't stick across reboots, and the app left a phantom display on my system. I spent a couple of days trying various things, and upgraded to High Sierra 10.13.1 along the way.
I eventually stumbled across a post at RampageDev that explained how the magical ig-platform-id value is used, and it turned out to be the key to my problem. I don't know whether the individual bits in 0x0d220003 are significant, but for my purposes the value acts as a key into a table in AppleIntelFramebufferAzul.kext that is used to initialize the HD 4600. The table includes attributes such as buffer sizes and memory allocations, which matter for large 4K displays. The post helped me understand the contents and format of that table, which in turn helped me understand the problem: by patching the kext via Clover, I could have macOS configure the graphics hardware the way I wanted.
The patch itself is added to config.plist -> KernelAndKextPatches -> KextsToPatch (it's in an <array>):
<string>Framebuffer for 4K display</string>
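For context, that string is the Comment field of a full KextsToPatch entry. The surrounding dict (a sketch using Clover's standard patch keys; the actual Find/Replace byte sequences for the Azul framebuffer table are specific to your setup and are omitted here) has this shape:

```xml
<dict>
	<key>Comment</key>
	<string>Framebuffer for 4K display</string>
	<key>Name</key>
	<string>AppleIntelFramebufferAzul</string>
	<key>Find</key>
	<data><!-- original table bytes, base64 --></data>
	<key>Replace</key>
	<data><!-- patched table bytes, base64 --></data>
</dict>
```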
I won't go into the grisly details here; if you're technical and want to understand, you can figure it out from the patched values. Basically, I'm increasing three values to accommodate the larger display:
- The RAM allocation
- The framebuffer memory size
- The VRAM allocation
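As a quick sanity check on why the defaults fall short: a 32-bit 4K frame needs four times the memory of a 1080p frame, before double buffering or scaled "retina" backing stores are even considered. A rough calculation (my own arithmetic, not values taken from the kext):

```shell
# Bytes for one 32-bit (4 bytes/pixel) frame at each resolution:
echo $((3840 * 2160 * 4))   # 4K UHD: 33177600 bytes (~32 MB)
echo $((1920 * 1080 * 4))   # 1080p:   8294400 bytes (~8 MB)
```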
Once I enabled this patch in Clover and added CoreDisplayFixup.kext, I was able to boot into glorious 4K UHD. I also see the retina resolutions in the display settings. It's possible that I did not need to change all three fields. I changed them all on the first try, and things seem to be working well. I would be happy to update this post if an expert can tell me a better way to do it.
Note that this does allocate 2GB of RAM for the Intel graphics, but that's fine with me since I have 16GB and plan to upgrade to 32GB anyway. I'll probably end up getting a discrete GPU when I upgrade the RAM, but this patch helps me stay productive while I save for the upgrade.
I suspect that this approach could be packaged as a Lilu plugin for those who prefer to apply their patches via kexts rather than the Clover config. Similarly, some of the Lilu plugins I'm using could probably be implemented as Clover patches instead. At this point, I'm just happy to have something that works.