Mieze

Tracing back the AMD GPU wakeup issue to its origin

367 posts in this topic

Recommended Posts

Hi nms, any tips for my problem? It seems RadeonDeInit=Yes always leads to an AMD R9 xxx GPU definition. How can I change that to RX 580?

Unfortunately my knowledge of ATI/AMD stuff is very low )-;


Has somebody else checked the power consumption of an RX 580 or another AMD card under High Sierra? Are Apple's drivers really that badly optimized?

 

Under Windows the 580 is listed at 12 watts idle power consumption; under macOS it seems to be 36 watts. My 5960 processor drops back to 1.2 GHz when idle, so speedsteps seem to work okay. It really seems to be the 580 GPU.

 

 


Thanks a lot. System Information now displays AMD RX 580 8GB. But the power draw hasn't changed.

 

Complete system idle:

88 watts with an NVIDIA 1060 = approx. 8 watts for the GPU

116 watts with the RX 580 = approx. 36 watts (!) for the GPU
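The implied GPU figures rest on subtracting a common system baseline; a quick sketch of the arithmetic (figures taken from the measurements above):

```python
# Idle whole-system power measurements reported above (watts).
total_with_1060 = 88    # NVIDIA 1060 installed
gpu_1060_idle = 8       # approximate idle draw of the 1060
total_with_rx580 = 116  # RX 580 installed

# Assuming the rest of the system draws the same power in both setups,
# the non-GPU baseline and the implied RX 580 idle draw are:
baseline = total_with_1060 - gpu_1060_idle   # 80 W
rx580_idle = total_with_rx580 - baseline

print(rx580_idle)  # -> 36
```

That is roughly three times the 12 W idle figure reported under Windows.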

 

For graphics power management to work properly, make sure that you have selected a system definition which matches your hardware, that the GPUs have proper names in your DSDT, and that platform power management (ASPM) is enabled in BIOS setup.

 

Mieze


Hi Mieze,

 

thanks for your answer. I just don't know exactly what to do.

 

My system runs as MacPro6,1 with perfect speedsteps.

I use the kext patch "Rename AMD R9 xxx to AMD RX 580"; in System Information the GPU appears as RX 580.

I switched ASPM in the BIOS, but there is no "enabled" setting, just Auto or "L1", whatever that means. This brought power consumption down from 116 watts to 113 watts at idle - not bad.

 

So what is the easiest way to make sure the GPU has a proper name in the DSDT? I don't use a DSDT file, just some SSDTs: Pike's speedsteps, audio and NVMe patches.

Do I need a DSDT file, or is there a way to do it in Clover?


So what is the easiest way to make sure the GPU has a proper name in the DSDT? I don't use a DSDT file, just some SSDTs: Pike's speedsteps, audio and NVMe patches.

Do I need a DSDT file, or is there a way to do it in Clover?

The MacPro6,1 has two GPUs which have the names GFX1 and GFX2. Take a look at IORegistry in order to find out which names your GPUs are using. Basically you can patch your DSDT manually or try to use Clover's ability to apply custom DSDT patches (which might not be possible in some cases).
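As an illustration of the Clover route: such a rename patch is just a find/replace on the 4-character ACPI device name in config.plist (ACPI → DSDT → Patches). This is only a sketch - the source name (PEGP) is an assumption and must match whatever IORegistryExplorer actually shows for your card:

```xml
<key>ACPI</key>
<dict>
    <key>DSDT</key>
    <dict>
        <key>Patches</key>
        <array>
            <dict>
                <key>Comment</key>
                <string>Rename PEGP to GFX0 (discrete GPU)</string>
                <key>Find</key>
                <data>UEVHUA==</data><!-- "PEGP" in base64 -->
                <key>Replace</key>
                <data>R0ZYMA==</data><!-- "GFX0" in base64 -->
            </dict>
        </array>
    </dict>
</dict>
```

The `<data>` values are simply the base64-encoded ASCII names, since Clover patches the raw AML bytes.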

 

By the way, are you injecting a framebuffer personality? I'm asking because some of these also influence graphics power management by selecting special configuration parameters for the GPU.

 

Mieze


Hi Mieze,

 

thanks for your efforts - but I already fail at finding out whether it is GFX1 or GFX2. No GFX is to be found in IORegistryExplorer.

 

The other thing: I use RadeonDeInit, which is kind of a black box. How exactly do I patch the GFX number and/or framebuffer, and does this work together with RadeonDeInit, or do I have to use a different approach?

 

Specialists like you surely can't imagine the absolutely basic problems a dummy like me has. I poke around until the system runs sufficiently well, and then I use it for work for two years without changing anything.


Hi Mieze,

 

thanks for your efforts - but I already fail at finding out whether it is GFX1 or GFX2. No GFX is to be found in IORegistryExplorer.

 

The other thing: I use RadeonDeInit, which is kind of a black box. How exactly do I patch the GFX number and/or framebuffer, and does this work together with RadeonDeInit, or do I have to use a different approach?

 

Specialists like you surely can't imagine the absolutely basic problems a dummy like me has. I poke around until the system runs sufficiently well, and then I use it for work for two years without changing anything.

GFX isn't showing up in IOReg because you probably aren't renaming your PEGx to GFX in your SSDT or DSDT.


The MacPro6,1 has two GPUs which have the names GFX1 and GFX2. 

Outstanding work. Suggestion: edit the Post #1 DSDT patch to Device (GFX0), replacing Device (PEGP).

 

PEGP/AGPM - threshold GPM:

post-618506-0-78000200-1511386836_thumb.png

 

GFX0/AGPM - p-state GPM (iMac18,3)

post-618506-0-80026200-1511387324_thumb.png

 

Desktop/Skylake/Polaris: iMac18,3/Mac-BE088AF8C5EB4FA2

 

Otherwise,  a dummy AMD AGPM kext can be injected by adding the GFX0 (or PEGP/PEG0/H000, etc.) property to any board-id:

post-618506-0-86831000-1511388572_thumb.png
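In text form, the injection shown above boils down to a Machines dictionary keyed by board-id inside the AGPM personality. The fragment below is a structural sketch only - the board-id is the iMac18,3 value quoted above, and the GFX0 dictionary contents (control-id, Heuristic, etc.) must be copied from a matching entry in Apple's AppleGraphicsPowerManagement Info.plist:

```xml
<key>Machines</key>
<dict>
    <!-- board-id of the SMBIOS in use (iMac18,3 in this example) -->
    <key>Mac-BE088AF8C5EB4FA2</key>
    <dict>
        <key>GFX0</key>
        <dict>
            <!-- copy the power-management keys for a board-id whose
                 GPU matches yours from Apple's AGPM Info.plist -->
        </dict>
    </dict>
</dict>
```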


Can someone explain to me why SSDT patching doesn't work unless you inject ATI using Clover?

Basically it boils down to the question: why do Apple's framebuffer drivers define dedicated framebuffer personalities although they are able to auto-generate a default personality for almost any graphics card we are using?

 

Well, I think that the answer is quite obvious: the auto-generated connector data isn't meant to be a full replacement for dedicated data but rather a fallback mechanism for situations in which a dedicated framebuffer personality for a certain graphics card is missing, so that basic screen output can be provided even for unknown hardware. This is also the reason why auto-generated connector data is limited in functionality, in that it doesn't support advanced features like multi-screen support, etc.

 

Mieze


Basically it boils down to the question: why do Apple's framebuffer drivers define dedicated framebuffer personalities although they are able to auto-generate a default personality for almost any graphics card we are using?

 

Well, I think that the answer is quite obvious: the auto-generated connector data isn't meant to be a full replacement for dedicated data but rather a fallback mechanism for situations in which a dedicated framebuffer personality for a certain graphics card is missing, so that basic screen output can be provided even for unknown hardware. This is also the reason why auto-generated connector data is limited in functionality, in that it doesn't support advanced features like multi-screen support, etc.

 

Mieze

But I get multi-monitor working perfectly fine with auto-generated connector data when I don't inject anything; that still doesn't explain why SSDT/DSDT injection only works when ATI injection is enabled.


But I get multi-monitor working perfectly fine with auto-generated connector data when I don't inject anything; that still doesn't explain why SSDT/DSDT injection only works when ATI injection is enabled.

Apple's algorithm for auto-generating the connector data is defective. If multi-screen support is working for you, you might be one of the lucky few, but this doesn't mean that the connector data is 100% correct. The hotplug ID might be wrong, causing display detection to fail after wakeup. There are several reports of limited functionality using the default framebuffer, so we must consider that the normal state.

 

I only wonder whether this is a driver bug or Apple designed it that way intentionally.

 

Mieze


Apple's algorithm for auto-generating the connector data is defective. If multi-screen support is working for you, you might be one of the lucky few, but this doesn't mean that the connector data is 100% correct. The hotplug ID might be wrong, causing display detection to fail after wakeup. There are several reports of limited functionality using the default framebuffer, so we must consider that the normal state.

 

I only wonder whether this is a driver bug or Apple designed it that way intentionally.

 

Mieze

OK, I can understand that; maybe I should re-phrase the question then. What information needs to be injected via SSDT/DSDT in order to avoid having to enable Clover's ATI Inject?


Outstanding work. Suggestion: edit the Post #1 DSDT patch to Device (GFX0), replacing Device (PEGP).

 

PEGP/AGPM - threshold GPM:

Screen Shot 2017-11-22 at 4.38.35 PM.png

 

GFX0/AGPM - p-state GPM (iMac18,3)

Screen Shot 2017-11-22 at 4.47.34 PM.png

 

Desktop/Skylake/Polaris: iMac18,3/Mac-BE088AF8C5EB4FA2

 

Otherwise,  a dummy AMD AGPM kext can be injected by adding the GFX0 (or PEGP/PEG0/H000, etc.) property to any board-id:

Screen Shot 2017-11-22 at 5.08.27 PM.png

Sir, I've injected AGPM for my AMD Radeon HD 7650M via FakeSMC.

 

Sir, I've used the MBP9,2 board-id for CPU PM and the MBP8,3 board-id for GPU PM; is that a good way of getting PM?

 

Sir, I've tried everything to get the AMD Radeon HD 7650M working on 10.13.1, but no luck :(

Sir, can you guide me?

 

Waiting for your reply!

post-941217-0-68053400-1511426363_thumb.png

post-941217-0-32590900-1511426378_thumb.png

post-941217-0-30904500-1511426684_thumb.png


OK, I can understand that; maybe I should re-phrase the question then. What information needs to be injected via SSDT/DSDT in order to avoid having to enable Clover's ATI Inject?

Did you inject it as Device Properties, or also as a DSDT fix? The latter could explain why the SSDT injection does or does not take effect.

 

But I see something strange: if I set RadeonDeInit and inject Arbitrary properties, the drivers ignore those properties, i.e. I can't inject a model name.

It is very strange, because I can see the properties present in DeviceTree/Platform.

Maybe boot.efi thinks that the card is absent?
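For context, Arbitrary injection lives under Devices in Clover's config.plist; a sketch of the kind of entry meant here (the PciAddr and model string are illustrative - use your card's actual PCI address):

```xml
<key>Devices</key>
<dict>
    <key>Arbitrary</key>
    <array>
        <dict>
            <key>Comment</key>
            <string>RX 480 model name</string>
            <key>PciAddr</key>
            <string>01:00.00</string>
            <key>CustomProperties</key>
            <array>
                <dict>
                    <key>Key</key>
                    <string>model</string>
                    <key>Value</key>
                    <string>Radeon RX 480 8GB</string>
                </dict>
            </array>
        </dict>
    </array>
</dict>
```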


Apple's algorithm for auto-generating the connector data is defective. If multi-screen support is working for you, you might be one of the lucky few, but this doesn't mean that the connector data is 100% correct. The hotplug ID might be wrong, causing display detection to fail after wakeup. There are several reports of limited functionality using the default framebuffer, so we must consider that the normal state.

 

I only wonder whether this is a driver bug or Apple designed it that way intentionally.

 

Mieze

There are a number of cards where the connector data in the video BIOS does not match the hardware. IMHO, Apple's software simply fails there as a result.


Did you inject it as Device Properties, or also as a DSDT fix? The latter could explain why the SSDT injection does or does not take effect.

 

But I see something strange: if I set RadeonDeInit and inject Arbitrary properties, the drivers ignore those properties, i.e. I can't inject a model name.

It is very strange, because I can see the properties present in DeviceTree/Platform.

Maybe boot.efi thinks that the card is absent?

I think I injected it as Device Properties - take a look.

SSDT-RX-480.aml.zip


I've been trying to get my ASUS RX560 to work without WhateverGreen as per this thread, but I'm not having any luck.

 

Using:

GA-Z270-HD3, i7-7700k

 

On 10.13.1 using iMac18,3 with Clover 4297.

 

I tried RadeonDeInit with no WhateverGreen. Black screen at login.

 

Without RadeonDeInit I tried this SSDT patch... Black screen at login:

DefinitionBlock ("", "SSDT", 2, "Apple", "Radeon", 0x00001000)
{
    External (_SB_.PCI0.GFX0.PEGP, DeviceObj)    // (from opcode)

    Scope (\_SB.PCI0.GFX0.PEGP)
    {
        OperationRegion (PCIB, PCI_Config, Zero, 0x0100)
        Field (PCIB, AnyAcc, NoLock, Preserve)
        {
            Offset (0x10), 
            BAR0,   32, 
            BAR1,   32, 
            BAR2,   64, 
            BAR4,   32, 
            BAR5,   32
        }

        Method (_INI, 0, NotSerialized)  // _INI: Initialize
        {
            If (LEqual (BAR5, Zero))
            {
                Store (BAR2, Local0)
            }
            Else
            {
                Store (BAR5, Local0)
            }

            OperationRegion (GREG, SystemMemory, And (Local0, 0xFFFFFFFFFFFFFFF0), 0x8000)
            Field (GREG, AnyAcc, NoLock, Preserve)
            {
                Offset (0x6800), 
                GENA,   32, 
                GCTL,   32, 
                LTBC,   32, 
                Offset (0x6810), 
                PSBL,   32, 
                SSBL,   32, 
                PTCH,   32, 
                PSBH,   32, 
                SSBH,   32, 
                Offset (0x6848), 
                FCTL,   32, 
                Offset (0x6EF8), 
                MUMD,   32
            }

            Store (Zero, FCTL)
            Store (Zero, PSBH)
            Store (Zero, SSBH)
            Store (Zero, LTBC)
            Store (One, GENA)
            Store (Zero, MUMD)
        }
    }
}

I have these graphics-related DSDT patches in my config file:

				<dict>
					<key>Comment</key>
					<string>Intel GPU PM. Rename GFX0 to IGPU</string>
					<key>Disabled</key>
					<false/>
					<key>Find</key>
					<data>
					R0ZYMA==
					</data>
					<key>Replace</key>
					<data>
					SUdQVQ==
					</data>
				</dict>
				<dict>
					<key>Comment</key>
					<string>Rename PEG0 to GFX0 (Graphics Card)</string>
					<key>Disabled</key>
					<false/>
					<key>Find</key>
					<data>
					UEVHMA==
					</data>
					<key>Replace</key>
					<data>
					R0ZYMA==
					</data>
				</dict>
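The `<data>` blobs in patches like these are simply the base64-encoded ASCII device names; a small Python sketch to generate Find/Replace values:

```python
import base64

# Clover binary DSDT patches match raw bytes; in config.plist the patterns
# are stored base64-encoded. For 4-character ACPI names this is trivial:
for name in ("GFX0", "IGPU", "PEG0", "PEGP"):
    print(name, base64.b64encode(name.encode("ascii")).decode())
```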

Any suggestions? I'm attaching my F4 origin SSDT files and my ioreg (booting with WhateverGreen).

Thanks.

origin.zip

iMac.ioreg.zip


I've been trying to get my ASUS RX560 to work without WhateverGreen as per this thread, but I'm not having any luck.

 

Using:

GA-Z270-HD3, i7-7700k

 

On 10.13.1 using iMac18,3 with Clover 4297.

 

I tried RadeonDeInit with no WhateverGreen. Black screen at login.

 

Without RadeonDeInit I tried this SSDT patch... Black screen at login:

DefinitionBlock ("", "SSDT", 2, "Apple", "Radeon", 0x00001000)
{
    External (_SB_.PCI0.GFX0.PEGP, DeviceObj)    // (from opcode)

    Scope (\_SB.PCI0.GFX0.PEGP)
    {
        OperationRegion (PCIB, PCI_Config, Zero, 0x0100)
        Field (PCIB, AnyAcc, NoLock, Preserve)
        {
            Offset (0x10), 
            BAR0,   32, 
            BAR1,   32, 
            BAR2,   64, 
            BAR4,   32, 
            BAR5,   32
        }

        Method (_INI, 0, NotSerialized)  // _INI: Initialize
        {
            If (LEqual (BAR5, Zero))
            {
                Store (BAR2, Local0)
            }
            Else
            {
                Store (BAR5, Local0)
            }

            OperationRegion (GREG, SystemMemory, And (Local0, 0xFFFFFFFFFFFFFFF0), 0x8000)
            Field (GREG, AnyAcc, NoLock, Preserve)
            {
                Offset (0x6800), 
                GENA,   32, 
                GCTL,   32, 
                LTBC,   32, 
                Offset (0x6810), 
                PSBL,   32, 
                SSBL,   32, 
                PTCH,   32, 
                PSBH,   32, 
                SSBH,   32, 
                Offset (0x6848), 
                FCTL,   32, 
                Offset (0x6EF8), 
                MUMD,   32
            }

            Store (Zero, FCTL)
            Store (Zero, PSBH)
            Store (Zero, SSBH)
            Store (Zero, LTBC)
            Store (One, GENA)
            Store (Zero, MUMD)
        }
    }
}

I have these graphics-related DSDT patches in my config file:

				<dict>
					<key>Comment</key>
					<string>Intel GPU PM. Rename GFX0 to IGPU</string>
					<key>Disabled</key>
					<false/>
					<key>Find</key>
					<data>
					R0ZYMA==
					</data>
					<key>Replace</key>
					<data>
					SUdQVQ==
					</data>
				</dict>
				<dict>
					<key>Comment</key>
					<string>Rename PEG0 to GFX0 (Graphics Card)</string>
					<key>Disabled</key>
					<false/>
					<key>Find</key>
					<data>
					UEVHMA==
					</data>
					<key>Replace</key>
					<data>
					R0ZYMA==
					</data>
				</dict>

Any suggestions?

Thanks.

External (_SB_.PCI0.GFX0.PEGP, DeviceObj)

Should be 

External (_SB_.PCI0.PEG0.PEGP, DeviceObj)

or 

 External (_SB_.PCI0.PEG0.GFX0, DeviceObj)

You either use PEGP or GFX0, not both: if you are replacing PEGP with GFX0 in ACPI patching, you don't want _SB_.PCI0.PEG0.GFX0 in the SSDT as well.


used the MBP9,2 board-id for CPU and the MBP8,3 board-id for GPU PM, sir is that a good way of getting PM?

Not correct: MBP8,3/HD 6xxxM enables threshold GPM.

Try MBP11,5/R9 3xxM, which enables p-state GPM.

 

AMD HD 7xxx and newer use the same AGPM/.../GFX0 properties (e.g. GFX1).

Edit: Except the HD 7650M, which is actually 6xxx; credit: Slice/#249

External (_SB_.PCI0.GFX0.PEGP, DeviceObj)

Should be 

External (_SB_.PCI0.PEG0.PEGP, DeviceObj)

or 

 External (_SB_.PCI0.PEG0.GFX0, DeviceObj)

You either use PEGP or GFX0 not both, if you are replacing PEGP with GFX0 in ACPI patching you don't want _SB_.PCI0.PEG0.GFX0 also in SSDT

 

 

Thanks for looking at this for me... I tried both approaches, and in both cases got a black screen at boot.

 

Could I be missing something fundamental that is causing both RadeonDeInit and the SSDT method to fail? Like I said, WhateverGreen does work.


Not correct: MBP8,3/HD 6xxxM enables threshold GPM.

Try MBP11,5/R9 3xxM, which enables p-state GPM.

 

AMD HD 7xxx and newer use the same AGPM/.../GFX0 properties (e.g. GFX1)

Actually, the 7650M is in the 6xxx family:

{ 0x6841, CHIP_FAMILY_TURKS, "AMD Radeon HD 7650M Series",


Actually, the 7650M is in the 6xxx family:

{ 0x6841, CHIP_FAMILY_TURKS, "AMD Radeon HD 7650M Series",

@slice @toleda sir,

 

I have this card, the AMD Radeon HD 7650M Series; how can I get it working natively on 10.13.1?

