Tracing back the AMD GPU wakeup issue to its origin

Tags: AMD, GPU, Sierra, El Capitan, sleep, wakeup, Radeon


#221
Gigamaxx

    InsanelyMac Legend

  • Donators
  • 942 posts
  • Gender:Male

Quote:

Wide of the mark. Just so you know, the library is for Windows and is not open source. That's all.


OSX has "AutoWattMan" for power regulation of AMD cards. It's in all the AMDxxxxController kexts; maybe if we could alter these settings using the kext-to-patch method, we could adjust performance?

#222
Smallersen

    InsanelyMac Protégé

  • Members
  • 41 posts

Quote:

You need to patch the AMD9500Controller.kext via Clover. Add this patch to your config.plist under "Kernel and Kext Patches".

Thanks a lot. System Information now displays AMD RX 580 8GB. But the power draw hasn't changed.

Complete system idle:
88 watts with an NVIDIA 1060 = approx. 8 watts for the GPU
116 watts with the RX 580 = approx. 36 watts (!) for the GPU

I use Pike's SSDT, generated with the 580 in place. Is there another possibility, perhaps a GPU SSDT? Or Inject ATI, a framebuffer, or something else?
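
For reference (the patch itself is not reproduced in the quote above), a Clover rename patch of this kind usually takes the following shape in config.plist. This is a sketch assuming the common "AMD R9 xxx" to "AMD RX 580" string replacement in AMD9500Controller; the Find/Replace values are those ASCII strings, base64-encoded:

    <key>KernelAndKextPatches</key>
    <dict>
        <key>KextsToPatch</key>
        <array>
            <dict>
                <key>Comment</key>
                <string>Rename AMD R9 xxx to AMD RX 580</string>
                <key>Name</key>
                <string>AMD9500Controller</string>
                <key>Find</key>
                <data>QU1EIFI5IHh4eA==</data>   <!-- "AMD R9 xxx" -->
                <key>Replace</key>
                <data>QU1EIFJYIDU4MA==</data>   <!-- "AMD RX 580" -->
            </dict>
        </array>
    </dict>

As the thread goes on to show, this changes only the display name, not the card's power management.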

#223
Gigamaxx

    InsanelyMac Legend

  • Donators
  • 942 posts
  • Gender:Male
Probably not; High Sierra seems to be overcautious with fan speeds right now. The Vega cards are idling at 75% fan speed at minimum. The same happened in Sierra with the RX 480 until the drivers were optimized over time. If you really like the 1060, go with it, but good luck surviving updates or running beta versions. The tradeoff is pretty clear: AMD = Apple drivers, Nvidia = web drivers (more user control).

Have you tested your RX in Sierra 10.12.6?

#224
Smallersen

    InsanelyMac Protégé

  • Members
  • 41 posts
The 1060 had very bad performance, a lot of glitches, and was very slow in everyday use - a bad choice. I sent it back.

I never had Sierra installed on my computer; I came from El Capitan. The RX 580 is just a few days old, so I have no experience with other systems. It runs very nicely with Photoshop, Lightroom, After Effects, video in general, and so on.

It would just be nice to have better power consumption. If there is no way, we can just wait for Apple...
It's just strange that the patch changed nothing except the name.

#225
Picasso

    InsanelyMac Sage

  • Members
  • 393 posts
  • Gender:Male

Quote:

Probably not; High Sierra seems to be overcautious with fan speeds right now. The Vega cards are idling at 75% fan speed at minimum. The same happened in Sierra with the RX 480 until the drivers were optimized over time. If you really like the 1060, go with it, but good luck surviving updates or running beta versions. The tradeoff is pretty clear: AMD = Apple drivers, Nvidia = web drivers (more user control).

Have you tested your RX in Sierra 10.12.6?

Vega's fan ramp should be modified in the card's BIOS, under Windows. You can set the percentages between temperatures and fan actuation. You can check the best way on the Overclockers site, in the Vega/AMD forums. I did it for my 390X because my render times are extreme.



#226
nms

    InsanelyMac Protégé

  • Coders
  • 27 posts
  • Gender:Not Telling

Quote:

Hi nms, any tips for my problem? It seems RadeonDeInit=Yes always leads to an AMD R9 xxx GPU definition. How can I change that to RX 580?

Unfortunately, my knowledge of ATI/AMD stuff is very low )-;



#227
Smallersen

    InsanelyMac Protégé

  • Members
  • 41 posts
Has somebody else checked the power consumption of their RX 580 or another AMD card under High Sierra? Are the Apple drivers really this badly optimized? Under Windows the 580 is listed at 12 watts idle power consumption; under macOS it seems to be 36 watts. My 5960 processor drops back to 1.2 GHz when idle, so speedsteps seem to work okay. It really seems to be the 580 GPU.

#228
Pavo

    InsanelyMac Legend

  • Developers
  • 629 posts
  • Gender:Male
  • Location:Fort Gordon, GA
I don't know why people keep comparing Windows results to macOS (hackintosh) results. These results will always be different and will vary.

#229
Mieze

    Giant Cat

  • Developers
  • 1,239 posts
  • Gender:Female
  • Location:Germany
  • Interests:Cats

Quote:

Thanks a lot. System Information now displays AMD RX 580 8GB. But the power draw hasn't changed.

Complete system idle:
88 watts with an NVIDIA 1060 = approx. 8 watts for the GPU
116 watts with the RX 580 = approx. 36 watts (!) for the GPU

 

For graphics power management to work properly, make sure you have selected a system definition which matches your hardware, checked that the GPUs have proper names in your DSDT, and checked that platform power management (ASPM) is enabled in BIOS setup.

 

Mieze



#230
Smallersen

    InsanelyMac Protégé

  • Members
  • 41 posts
Hi Mieze,

thanks for your answer. I just don't know what exactly to do.

My system runs as MacPro6,1 with perfect speedsteps.
I use the kext patch "Rename AMD R9 xxx to AMD RX 580"; in System Information the GPU appears as RX 580.
I switched ASPM in the BIOS, but there is no "Enabled" setting, just "Auto" or "L1", whatever that means. This brought power consumption down from 116 watts to 113 watts idle - not bad.

So what is the easiest way to make sure the GPU has a proper name in the DSDT? I don't use a DSDT file, just some SSDTs: Pike's speedstep, audio, and NVMe patches.
Do I need a DSDT file, or is there a way in Clover?

#231
Mieze

    Giant Cat

  • Developers
  • 1,239 posts
  • Gender:Female
  • Location:Germany
  • Interests:Cats

Quote:

So what is the easiest way to make sure the GPU has a proper name in the DSDT? I don't use a DSDT file, just some SSDTs: Pike's speedstep, audio, and NVMe patches.
Do I need a DSDT file, or is there a way in Clover?

The MacPro6,1 has two GPUs, named GFX1 and GFX2. Take a look at the IORegistry to find out which names your GPUs are using. Basically, you can patch your DSDT manually or try to use Clover's ability to apply custom DSDT patches (which might not be possible in some cases).
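
As an illustration of the Clover route, a device rename is usually done with a binary find/replace under ACPI/DSDT/Patches in config.plist. A minimal sketch, assuming the GPU currently appears as PEGP (check the actual name in IORegistryExplorer first); the Find/Replace values are the device names as ASCII bytes, base64-encoded:

    <key>ACPI</key>
    <dict>
        <key>DSDT</key>
        <dict>
            <key>Patches</key>
            <array>
                <dict>
                    <key>Comment</key>
                    <string>Rename PEGP to GFX1</string>
                    <key>Find</key>
                    <data>UEVHUA==</data>   <!-- "PEGP" -->
                    <key>Replace</key>
                    <data>R0ZYMQ==</data>   <!-- "GFX1" -->
                </dict>
            </array>
        </dict>
    </dict>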

 

By the way, are you injecting a framebuffer personality? I'm asking because some of these also influence graphics power management by selecting special configuration parameters for the GPU.
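
For illustration, a framebuffer personality is injected in the Graphics section of Clover's config.plist. A sketch assuming Orinoco, the personality commonly used for Polaris/RX 580 cards; verify the right personality for your own card before using it:

    <key>Graphics</key>
    <dict>
        <key>Inject</key>
        <dict>
            <key>ATI</key>
            <true/>
        </dict>
        <key>FBName</key>
        <string>Orinoco</string>
    </dict>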

 

Mieze



#232
SiddRamesh

    InsanelyMac Geek

  • Members
  • 186 posts
  • Gender:Male
  • Location:Mumbai
  • Interests:Developing

@Slice @pavo Sir,

I still haven't gotten my AMD Radeon HD 7650M working on 10.13.1 :(

Please, someone guide me :(



#233
Smallersen

    InsanelyMac Protégé

  • Members
  • 41 posts
Hi Mieze,

thanks for your efforts - but I already fail at finding out whether it is GFX1 or GFX2. No GFX is to be found in IORegistryExplorer.

The other thing: I use RadeonDeInit, which is kind of a black box. How exactly do I patch the GFX number and/or the framebuffer, and does this work together with RadeonDeInit, or do I have to use a different approach?

Specialists like you surely can't imagine the absolutely basic problems a dummy like me has. I poke around until the system runs sufficiently, and then I use the system for work for two years without changing anything.

#234
Pavo

    InsanelyMac Legend

  • Developers
  • 629 posts
  • Gender:Male
  • Location:Fort Gordon, GA

Quote:

Hi Mieze,

thanks for your efforts - but I already fail at finding out whether it is GFX1 or GFX2. No GFX is to be found in IORegistryExplorer.

The other thing: I use RadeonDeInit, which is kind of a black box. How exactly do I patch the GFX number and/or the framebuffer, and does this work together with RadeonDeInit, or do I have to use a different approach?

Specialists like you surely can't imagine the absolutely basic problems a dummy like me has. I poke around until the system runs sufficiently, and then I use the system for work for two years without changing anything.

GFX isn't showing up in IOReg because you probably aren't renaming your PEGx to GFX in your SSDT or DSDT.
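
In decompiled form, the rename amounts to nothing more than changing the device name at the GPU's PCI address. A fragment as it might look in a patched DSDT; the ACPI path \_SB.PCI0.PEG0 is an assumption and varies by board:

    Scope (\_SB.PCI0.PEG0)
    {
        Device (GFX1)          // was: Device (PEGP)
        {
            Name (_ADR, Zero)  // function 0 of the card on the PEG root port
        }
    }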



#235
toleda

    InsanelyMac Deity

  • Gurus
  • 2,043 posts
  • Gender:Male

Quote:

The MacPro6,1 has two GPUs, named GFX1 and GFX2.

Outstanding work. Suggestion: edit the Post #1 DSDT patch to Device (GFX0), replacing Device (PEGP).

 

PEGP/AGPM - threshold GPM:

[Attached screenshot: Screen Shot 2017-11-22 at 4.38.35 PM.png]

 

GFX0/AGPM - p-state GPM (iMac18,3):

[Attached screenshot: Screen Shot 2017-11-22 at 4.47.34 PM.png]

 

Desktop/Skylake/Polaris: iMac18,3/Mac-BE088AF8C5EB4FA2

 

Otherwise, a dummy AMD AGPM kext can be injected by adding the GFX0 (or PEGP/PEG0/H000, etc.) property to any board-id:

[Attached screenshot: Screen Shot 2017-11-22 at 5.08.27 PM.png]
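
The general shape of such a dummy AGPM injector's Info.plist is sketched below. The Machines dictionary is keyed by board-id (here the iMac18,3 board-id mentioned above); the contents of the GFX0 sub-dictionary are placeholders and should be copied from a genuine entry in Apple's AppleGraphicsPowerManagement.kext:

    <key>IOKitPersonalities</key>
    <dict>
        <key>AGPM</key>
        <dict>
            <key>CFBundleIdentifier</key>
            <string>com.apple.driver.AGPM</string>
            <key>IOClass</key>
            <string>AGPMController</string>
            <key>IONameMatch</key>
            <string>AGPM</string>
            <key>IOProviderClass</key>
            <string>IOPlatformPluginDevice</string>
            <key>Machines</key>
            <dict>
                <key>Mac-BE088AF8C5EB4FA2</key>   <!-- board-id (iMac18,3 as above) -->
                <dict>
                    <key>GFX0</key>               <!-- or PEGP/PEG0/H000, matching IOReg -->
                    <dict>
                        <key>Heuristic</key>
                        <dict>
                            <key>ID</key>
                            <integer>4</integer>  <!-- placeholder: copy from a real entry -->
                        </dict>
                    </dict>
                </dict>
            </dict>
        </dict>
    </dict>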



#236
Pavo

    InsanelyMac Legend

  • Developers
  • 629 posts
  • Gender:Male
  • Location:Fort Gordon, GA

Can someone explain to me why SSDT patching doesn't work unless you inject ATI using Clover?



#237
Mieze

    Giant Cat

  • Developers
  • 1,239 posts
  • Gender:Female
  • Location:Germany
  • Interests:Cats

Quote:

Can someone explain to me why SSDT patching doesn't work unless you inject ATI using Clover?

Basically it boils down to the question: why do Apple's framebuffer drivers define dedicated framebuffer personalities although they are able to auto-generate a default personality for almost any graphics card we are using?

 

Well, I think the answer is quite obvious. It's because the auto-generated connector data isn't meant to be a full replacement for dedicated data but more of a fallback mechanism for situations in which a dedicated framebuffer personality for a certain graphics card is missing, so that basic screen output can be provided even for unknown hardware. This is also the reason why auto-generated connector data is limited in functionality, in that it doesn't support advanced features like multi-screen support, etc.

 

Mieze



#238
Pavo

    InsanelyMac Legend

  • Developers
  • 629 posts
  • Gender:Male
  • Location:Fort Gordon, GA

Quote:

Basically it boils down to the question: why do Apple's framebuffer drivers define dedicated framebuffer personalities although they are able to auto-generate a default personality for almost any graphics card we are using?

Well, I think the answer is quite obvious. It's because the auto-generated connector data isn't meant to be a full replacement for dedicated data but more of a fallback mechanism for situations in which a dedicated framebuffer personality for a certain graphics card is missing, so that basic screen output can be provided even for unknown hardware. This is also the reason why auto-generated connector data is limited in functionality, in that it doesn't support advanced features like multi-screen support, etc.

Mieze

But I get multi-monitor working perfectly fine with auto-generated connector data when I don't inject anything; that still doesn't explain why SSDT/DSDT injection only works when ATI injection is enabled in Clover.



#239
Mieze

    Giant Cat

  • Developers
  • 1,239 posts
  • Gender:Female
  • Location:Germany
  • Interests:Cats

Quote:

But I get multi-monitor working perfectly fine with auto-generated connector data when I don't inject anything; that still doesn't explain why SSDT/DSDT injection only works when ATI injection is enabled in Clover.

Apple's algorithm for auto-generating the connector data is defective. If multi-screen support is working for you, you might be one of the lucky few, but this doesn't mean that the connector data is 100% correct. The hotplug ID might be wrong, causing display detection to fail after wakeup. There are several reports of limited functionality using the default framebuffer, so we must consider that the normal state.

 

I only wonder if this is a driver bug or if Apple designed it that way intentionally?

 

Mieze



#240
Pavo

    InsanelyMac Legend

  • Developers
  • 629 posts
  • Gender:Male
  • Location:Fort Gordon, GA

Quote:

Apple's algorithm for auto-generating the connector data is defective. If multi-screen support is working for you, you might be one of the lucky few, but this doesn't mean that the connector data is 100% correct. The hotplug ID might be wrong, causing display detection to fail after wakeup. There are several reports of limited functionality using the default framebuffer, so we must consider that the normal state.

I only wonder if this is a driver bug or if Apple designed it that way intentionally?

Mieze

OK, I can understand that; maybe I should rephrase the question then. What information needs to be injected via the SSDT/DSDT in order not to have to use Clover's ATI Inject?
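
For reference, the usual alternative to Clover's ATI Inject is an SSDT that adds a _DSM method to the (renamed) GPU device, returning the same properties Clover would otherwise inject. A sketch only, not a confirmed recipe from this thread; the device path, model string, and slot name are assumptions to adapt:

    DefinitionBlock ("", "SSDT", 2, "hack", "GFXPROP", 0)
    {
        External (\_SB.PCI0.PEG0.GFX1, DeviceObj)   // assumed path - check IOReg

        Scope (\_SB.PCI0.PEG0.GFX1)
        {
            Method (_DSM, 4, NotSerialized)   // returns the properties to inject
            {
                If (LEqual (Arg2, Zero))
                {
                    Return (Buffer (One) { 0x03 })
                }

                Return (Package ()
                {
                    "model", Buffer () { "AMD Radeon RX 580" },
                    "AAPL,slot-name", Buffer () { "Slot-1" }
                })
            }
        }
    }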









