nindustries

4K mini Kaby hackintosh, please checkup!


Ahoy there!

 

I am looking at building my first vanilla hackintosh to drive a 4K display over DisplayPort.

My requirements are speed and performance, but not gaming.
I realize I picked a -K CPU, but its base clock is considerably higher than the non-K model's (4.2 GHz vs. 3.6 GHz).

 

CPU: Intel Core i7-7700K

MOBO/Case/PSU: ASRock DeskMini 110
Memory: Crucial CT16G4SFD824A
NVMe (PCIe): Samsung 960 EVO 500GB

Bluetooth/Wi-Fi: to be determined

 

Any thoughts? Thank you!

 

 

Kexts I would certainly need:

- FakeSMC ; emulates the SMC so macOS thinks it is running on genuine Apple hardware

- FakeCPUID ; makes macOS see a Skylake CPU instead of Kaby Lake (which Sierra does not recognize natively)

- FakePCIID.kext and FakePCIID_Intel_HD_Graphics.kext ; make macOS treat the HD 630 as Skylake graphics

- 'Skylake Glitch Fix' (AAPL,GfxYTile) ; supposedly fixes graphical glitches when using the Skylake framebuffer, though I have no clue why

- VoodooHDA ; to get sound working (I think), or alternatively AppleALC, which provides native Apple audio on non-Apple hardware

- IONVMeFamily.kext ; a patched version of the one from macOS, so the NVMe drive works at boot
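For reference, the CPU and graphics spoofing above is usually driven from Clover's config.plist rather than by kexts alone. A minimal sketch of the relevant keys; the values 0x0506E3, 0x19120000 and 0x19128086 are the Skylake IDs commonly quoted in Kaby Lake-on-Sierra guides, so treat them as assumptions to verify against your own guide:

```xml
<!-- Clover config.plist fragment (sketch): spoof a Kaby Lake i7-7700K as Skylake -->
<key>KernelAndKextPatches</key>
<dict>
    <!-- Report a Skylake CPUID (0x0506E3) so Sierra accepts the CPU -->
    <key>FakeCPUID</key>
    <string>0x0506E3</string>
</dict>
<key>Graphics</key>
<dict>
    <key>Inject</key>
    <dict>
        <key>Intel</key>
        <true/>
    </dict>
    <!-- Desktop Skylake HD 530 framebuffer, standing in for the HD 630 -->
    <key>ig-platform-id</key>
    <string>0x19120000</string>
</dict>
<key>Devices</key>
<dict>
    <!-- FakePCIID picks this up to present the HD 630 as an HD 530 (device-id + vendor-id) -->
    <key>FakeID</key>
    <dict>
        <key>IntelGFX</key>
        <string>0x19128086</string>
    </dict>
</dict>
```

These keys merge into the existing sections of config.plist; they are not a complete file on their own.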



I think you've got all the basics figured out, but in practice you may come across a couple of obstacles that can be resolved with a little research (usually settings in config.plist). The only change I would make is swapping VoodooHDA for the AppleALC + Lilu kext combination.
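In case it helps, AppleALC also needs an audio layout id injected so it knows which codec configuration to apply. A minimal Clover config.plist sketch; layout 1 here is only a placeholder, so check AppleALC's supported-codec list for the layouts it actually provides for the ALC283:

```xml
<!-- Clover config.plist fragment (sketch): inject an audio layout id for AppleALC -->
<key>Devices</key>
<dict>
    <key>Audio</key>
    <dict>
        <!-- layout 1 is a guess; pick a layout AppleALC lists for the ALC283 -->
        <key>Inject</key>
        <integer>1</integer>
    </dict>
</dict>
```

AppleALC.kext and Lilu.kext would then go in Clover's kexts/Other folder so they are injected at boot.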

 

Best of luck and skills!


Yeah, the basics (kexts plus settings) look fine, but how do you drive your display? Your mainboard seems to have USB-C, but that doesn't mean it has Thunderbolt support. And even if it had Thunderbolt, do you know whether the HD 630 can drive a 4K display? ASRock claims the video output has to be either VGA, HDMI or DisplayPort...


Thanks! My mistake, I meant DisplayPort.  :)

 


Thank you! AppleALC's documentation doesn't mention having to add support for other codecs, so I assume the Realtek ALC283 is supported natively?

