Mieze

Tracing back the AMD GPU wakeup issue to its origin

365 posts in this topic

Recommended Posts

Do you have the iGPU partially enabled? The macOS GPUFamily1 status seems to be a rare achievement unless the Intel iGPU is involved in some way.

@Gigamaxx, this is completely wrong. GPUFamily1 is a Metal feature set and has nothing to do with the iGPU: https://developer.apple.com/documentation/metal/mtlfeatureset It is probably due to the framebuffer or the SMBIOS he is using.

Also, with the SSDT method the System Profiler states "Metal: Supported, feature set macOS GPUFamily1 v3", but with the RadeonDeInit method it only says "Metal: Supported".

That is because the RadeonDeInit method doesn't inject a framebuffer, which means the default RadeonFramebuffer is used, and it has no feature set allocated to it. When a framebuffer is injected, i.e. via InjectATI=True or the SSDT method, a specific feature set is allocated to it.



I have noticed that with my Pentium and Ryzen systems I always just get "Metal: Supported", but with my i5 6500 (Metal-supported iGPU) I get the macOS GPUFamily1 status. I've never seen this status without iGPU Metal active.



Because there is a feature set allocated to the ig-platform-id you are using for your i5 6500 iGPU.



Exactly, it seems to tie the family status to the assigned RadeonFramebuffer automatically. I'm not trying to be skeptical, just curious: I would like to be able to get this level on my non-supported CPUs. I've also noticed that when using the DeInit method or WhateverGreen it never gets GPUFamily1.



Again, because RadeonDeInit and WhateverGreen do not inject a framebuffer.


Do you have the iGPU partially enabled? The macOS GPUFamily1 status seems to be a rare achievement unless the Intel iGPU is involved in some way.

 

Nope, the iGPU is completely disabled. "Metal: Supported, feature set macOS GPUFamily1 v3" has nothing to do with the iGPU.

See this tech doc from Apple. 


Is there anyone with an RX 560 who is having no issues? No crashes/artifacts, dual screen working?

 

I have an RX 560 and the only issue I have is no sleep, but that could also be the macOS 10.13.2 beta. I once had working sleep, but no more.

The issue is that the display goes into standby but the fans keep running; the system is unreachable and can only be revived by a restart. This is with the latest Clover and RadeonDeInit = true. It uses the default RadeonFramebuffer.

 

If I use the Acre framebuffer I boot into a black display and can log in remotely. IORegistryExplorer shows the Acre framebuffer being used, but no display connected to any of the three connectors, patched or not.

 

I can live with it not having sleep, because going from system off to desktop takes some 10 seconds, and I use less energy if I shut down :) = less warming the earth, so I do everyone a favor.

 

But it keeps nagging me for a solution, if I am being honest :D


Can someone explain to me why the SSDT doesn't work when I use GFX0 instead of PEGP, but works when using PEGP?

I only learned that recently as well; in my case I would have to first apply the renaming patch from PEGP -> GFX0, and then use the patch with GFX0 naming.



Hi, I now only use RadeonDeInit=true in config.plist.



OK, noted, thanks. But my response was to the specific question that was asked.


Can someone explain to me why the SSDT doesn't work when I use GFX0 instead of PEGP, but works when using PEGP?

Check your Clover patches. There is probably one among them which renames GFX0 to IGPU, and it gets applied to your SSDT too.

 

Mieze



 

 

I have a single goal right now... to not use Clover DeInit and do it with an SSDT, so that once that works I can inject the framebuffer and the specifics of my RX 560 (iMac18,3 with i7-7700K). So... in my config.plist I have:

	<key>Graphics</key>
	<dict>
		<key>Inject</key>
		<dict>
			<key>ATI</key>
			<true/>
			<key>Intel</key>
			<true/>
		</dict>
		<key>ig-platform-id</key>
		<string>0x59120003</string>
	</dict>
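(Side note for anyone comparing this against IORegistryExplorer: the ig-platform-id above gets injected into the device properties as little-endian bytes, so it shows up byte-swapped in ioreg. A quick Python sketch of the byte order, using exactly the value from the config above:)

```python
# The ig-platform-id from config.plist, as written in the <string> key.
ig_platform_id = 0x59120003

# Device properties carry the value as 4 little-endian bytes, which is
# why AAPL,ig-platform-id shows up as <03001259> in IORegistryExplorer.
injected = ig_platform_id.to_bytes(4, "little")

print(injected.hex())  # -> 03001259
```

So seeing 03 00 12 59 in ioreg is expected and does not mean the injected value is wrong.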

And in Patches:

				<dict>
					<key>Comment</key>
					<string>Intel GPU PM. Rename GFX0 to IGPU</string>
					<key>Disabled</key>
					<false/>
					<key>Find</key>
					<data>R0ZYMA==</data>
					<key>Replace</key>
					<data>SUdQVQ==</data>
				</dict>
				<dict>
					<key>Comment</key>
					<string>Rename PEGP to GFX0</string>
					<key>Find</key>
					<data>
					UEVHUA==
					</data>
					<key>Replace</key>
					<data>
					R0ZYMA==
					</data>
				</dict>
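(For anyone wondering what those Find/Replace blobs are: Clover stores ACPI patch data as base64 in config.plist, and decoding the values above shows they are just the plain four-character ACPI device names:)

```python
import base64

# Clover's ACPI Find/Replace patches are base64-encoded byte patterns.
# Decoding the three values from the patches above:
print(base64.b64decode("R0ZYMA=="))  # b'GFX0'
print(base64.b64decode("SUdQVQ=="))  # b'IGPU'
print(base64.b64decode("UEVHUA=="))  # b'PEGP'
```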

I have this SSDT to do the DeInit:

DefinitionBlock ("", "SSDT", 2, "Apple", "Radeon", 0x00001000)
{
    External (_SB_.PCI0.PEG0.PEGP, DeviceObj)    // (from opcode)

    Scope (\_SB.PCI0.PEG0.PEGP)
    {
        OperationRegion (PCIB, PCI_Config, Zero, 0x0100)
        Field (PCIB, AnyAcc, NoLock, Preserve)
        {
            Offset (0x10), 
            BAR0,   32, 
            BAR1,   32, 
            BAR2,   64, 
            BAR4,   32, 
            BAR5,   32
        }

        Method (_INI, 0, NotSerialized)  // _INI: Initialize
        {
            If (LEqual (BAR5, Zero))
            {
                Store (BAR2, Local0)
            }
            Else
            {
                Store (BAR5, Local0)
            }

            OperationRegion (GREG, SystemMemory, And (Local0, 0xFFFFFFFFFFFFFFF0), 0x8000)
            Field (GREG, AnyAcc, NoLock, Preserve)
            {
                Offset (0x6800), 
                GENA,   32, 
                GCTL,   32, 
                LTBC,   32, 
                Offset (0x6810), 
                PSBL,   32, 
                SSBL,   32, 
                PTCH,   32, 
                PSBH,   32, 
                SSBH,   32, 
                Offset (0x6848), 
                FCTL,   32, 
                Offset (0x6EF8), 
                MUMD,   32
            }

            Store (Zero, FCTL)
            Store (Zero, PSBH)
            Store (Zero, SSBH)
            Store (Zero, LTBC)
            Store (One, GENA)
            Store (Zero, MUMD)
        }
    }
}
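(For readers following the ASL: _INI picks BAR5, falling back to BAR2 when BAR5 is zero, and masks the value with 0xFFFFFFFFFFFFFFF0 to strip the low PCI BAR flag bits, leaving the base address of the GPU's register block that GREG then maps. The masking step in Python, with a made-up BAR value purely for illustration:)

```python
# Hypothetical BAR value read from PCI config space; the low bits of a
# memory BAR are type/flag bits, not part of the base address.
bar5 = 0xFE50000C

# Same as the SSDT's And (Local0, 0xFFFFFFFFFFFFFFF0): clear the low
# 4 bits to recover the physical base of the register block.
mmio_base = bar5 & 0xFFFFFFFFFFFFFFF0

print(hex(mmio_base))  # -> 0xfe500000
```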

When I use all this I get gIOScreenLockState 3.

 

If I keep everything as above but also enable DeInit in Clover, it boots fine.

 

Any assistance would be appreciated!

Share this post


Link to post
Share on other sites

Don't use this patch, "Rename PEGP to GFX0", and it should work.

If you rename PEGP to GFX0, your SSDT cannot work, because it no longer finds any PEGP.
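(A way to picture why: Clover's rename patches are blind byte substitutions applied to the ACPI tables, so after the patch the device node is called GFX0 while the SSDT still looks for PEGP. A tiny sketch, where the DSDT fragment is hypothetical and only illustrates the order of events:)

```python
def apply_patch(aml: bytes, find: bytes, replace: bytes) -> bytes:
    # Clover's Find/Replace patches are plain byte substitutions.
    return aml.replace(find, replace)

# Stand-in fragment for the DSDT declaring the GPU device.
dsdt = b"Device(PEGP){...}"

# After the "Rename PEGP to GFX0" patch, the device is named GFX0...
dsdt = apply_patch(dsdt, b"PEGP", b"GFX0")

# ...so an SSDT declaring External (\_SB.PCI0.PEG0.PEGP) now points at
# a name that no longer exists anywhere in the ACPI namespace.
print(b"PEGP" in dsdt)  # -> False
```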

 

 

This worked!!! Thank you.

 

Better to rename PEGP in the SSDT to GFX0.

 


 

How do I do this? Can I do it in this same SSDT? Currently I'm not patching any SSDT other than with hotpatching in Clover.

Actually, I changed the SSDT to be:


   External (_SB_.PCI0.PEG0.GFX0, DeviceObj)    // (from opcode)

    Scope (\_SB.PCI0.PEG0.GFX0)

I kept my renaming in the config and it worked as well... now to attempt to use the Acre framebuffer!


If you really have an RX 580, then ACRE is the wrong framebuffer, because it uses only 3 connectors: DP, HDMI and DVI.

The correct framebuffer should be ORINOCO, with five connectors: DP, DP, HDMI, HDMI and DVI.



 

I have an RX 560 with 3 connectors: DP, HDMI and DVI. I looked at a bunch of different SSDTs and I'm trying to configure one based on my card. This is what I have so far... how can I double-check all of these, or add others that might be needed? I know the DeviceID, 0x67FF, is correct for me.

        Method (_DSM, 4, NotSerialized)  // _DSM: Device-Specific Method
        {
                If (LEqual (Arg2, Zero))
                {
                    Return (Buffer (One)
                    {
                         0x03                                           
                    })
                }
                Return (Package (0x18)
                {
                    "AAPL,slot-name", 
                    Buffer (0x07)
                    {
                        "Slot-1"
                    }, 
                    "@0,name", 
                    Buffer (0x0C)
                    {
                        "ATY,Acre"
                    }, 
                    "@0,AAPL,boot-display", 
                    Buffer (One)
                    {
                         0x00                                           
                    }, 
                    "@1,name", 
                    Buffer (0x0C)
                    {
                        "ATY,Acre"
                    }, 
                    "@2,name", 
                    Buffer (0x0C)
                    {
                        "ATY,Acre"
                    }, 
                    "@3,name", 
                    Buffer (0x0C)
                    {
                        "ATY,Acre"
                    }, 
                    "@4,name", 
                    Buffer (0x0C)
                    {
                        "ATY,Acre"
                    },
                    "@5,name", 
                    Buffer (0x0C)
                    {
                        "ATY,Acre"
                    },           
                    "ATY,VendorID", 
                    Buffer (0x02)
                    {
                         0x02, 0x10                                     
                    }, 
                    "ATY,DeviceID", 
                    Buffer (0x02)
                    {
                         0xFF, 0x67                                     
                    }, 
                    "model", 
                    Buffer (0x12)
                    {
                        "AMD Radeon RX 560"
                    }, 
                    "hda-gfx", 
                    Buffer (0x0A)
                    {
                        "onboard-1"
                    }
               })
            }
        }
        Device (HDAU)
        {
            Name (_ADR, One)  // _ADR: Address
            Method (_DSM, 4, NotSerialized)  // _DSM: Device-Specific Method
            {
                If (LEqual (Arg2, Zero))
                {
                    Return (Buffer (One)
                    {
                         0x03                                           
                    })
                }
                Return (Package (0x04)
                {
                    "layout-id", 
                    Buffer (0x04)
                    {
                         0x01, 0x00, 0x00, 0x00                         
                    }, 
                    "hda-gfx", 
                    Buffer (0x0A)
                    {
                        "onboard-1"
                    }
                })
            }
        }
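(One detail worth double-checking in dumps like this: PCI IDs are injected as little-endian byte buffers, which is why DeviceID 0x67FF appears above as 0xFF, 0x67, and vendor 0x1002 as 0x02, 0x10. In Python terms:)

```python
device_id = 0x67FF  # the RX 560 variant's PCI device ID
vendor_id = 0x1002  # AMD/ATI's PCI vendor ID

# The SSDT buffers store these values as 2 little-endian bytes each,
# matching {0xFF, 0x67} and {0x02, 0x10} above.
print(device_id.to_bytes(2, "little").hex())  # -> ff67
print(vendor_id.to_bytes(2, "little").hex())  # -> 0210
```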


 


Two things:

1. You have six connector entries (@0-@5) in your SSDT, but your card only has 3 connectors, so you only need @0-@2.

2. If you use AppleALC with Lilu + AppleALC and patch your DSDT with HDAS, then you need to change your HDMI audio to onboard-2.

But I am using an RX 580 and only use Lilu + WhateverGreen with 2 monitors (D-DVI and DP), with no problems using the RadeonFramebuffer.


The display is not getting detected properly, and I'm also getting artifacts!

How can I unlock 30-bit colors?

 

It looks like you have no accelerator kext loaded. Have you tried adding your 0x????1002 device ID to the Info.plist of one of the AMD?000 kexts?


I think I got QE/CI!

 

But now my only concern is 30-bit color (it was present in 10.11.6); how can I achieve it?


Sir, I got QE/CI with the screen :)

 

Now my only concern is how I can unlock 30-bit color (I had it in 10.11.6)!

 

AMD6000Controller/LVDS  -Pondweed 

 

Sir, it's strange; I've injected 2 FB patches:

 

1. AMD FrameBuffer Utility 

 

FB           02000000 00010000 19010100 00000000 10000505 00000000 (LVDS) 

Patch      02000000 40000000 09010100 00000000 10010007 00000000 

 

2. Manually Generated 

 

FB            00040000 04030000 00010000 12040105 (DP)

Patch       02000000 40000000 08010000 10000107
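(Comparing these dumps by eye is error-prone, so here is a small helper that diffs two hex strings byte by byte and makes the patched fields obvious. This is a generic sketch and assumes nothing about the framebuffer layout itself:)

```python
def diff_hex(original: str, patched: str):
    """Return (offset, old, new) for every byte that differs."""
    a = bytes.fromhex(original.replace(" ", ""))
    b = bytes.fromhex(patched.replace(" ", ""))
    return [(i, x, y) for i, (x, y) in enumerate(zip(a, b)) if x != y]

# The LVDS personality and its patch from above.
fb    = "02000000 00010000 19010100 00000000 10000505 00000000"
patch = "02000000 40000000 09010100 00000000 10010007 00000000"

for offset, old, new in diff_hex(fb, patch):
    print(f"byte {offset}: {old:02x} -> {new:02x}")
```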



Now I'm facing another issue :(

The screen is looking reddish!

The screen appears to be OK, but from a certain angle it looks reddish, while in 10.11.6 it's clear and white (OK).

