
WhatEverGreen Support Topic



9 minutes ago, al6042 said:

The screenshot comes from the Skylake system without discrete graphics... ;)

 

In the screenshot I can see that the discrete GPU is present.
Without the discrete GPU, the output would be different.

Edited by Andrey1970

4 minutes ago, al6042 said:

There you see my problem... :)

It's either one or the other....

It would be great to have both up and running.

 

Thank you both for your time and explanations

I think that is the way it works: either you get QuickSync+ or you get DRM playback. I don't think you can have both, unless there is some way that hasn't been discovered yet for switchable graphics on a hack.


16 minutes ago, Pavo said:

I think that is the way it works: either you get QuickSync+ or you get DRM playback. I don't think you can have both, unless there is some way that hasn't been discovered yet for switchable graphics on a hack.

DRM without IQSV is not possible on a CPU that has an IGPU.
Switching the IGPU off does not turn the CPU into a different one.


11 hours ago, jsl2000 said:

Thanks for this advice, which worked for WEG 1.2.x - the HDMI audio of my R9 290X GPU is no longer broken!

BTW, have you tried editing the CFG_FB_LIMIT data in aty-config of AMD7000Controller.kext? It should be the total number of display ports (6 in your case) instead of the default 0.

Great!  Glad to hear that helped.

 

I have not tried editing CFG_FB_LIMIT; however, I can see in Ioreg that it is set correctly on GFX0 (to 6). At least it is with WEG installed - without WEG, 'display@0' shows CFG_FB_LIMIT=0. But booting with WEG doesn't change my problem, so I think it's unrelated.
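For anyone wanting to check the same thing on their own machine, this is roughly the check I mean - just grepping the registry from Terminal:

# with WEG loaded this should list CFG_FB_LIMIT (6 in my case); without WEG it shows 0
ioreg -lw0 | grep -i CFG_FB_LIMIT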

 

In any case I am pretty sure I do not have any problem with connector detection.  I used to think that was the problem, and I spent a long time testing every possible FB choice (with InjectATI) and I tried patching every available AMD7000 FB using Clover kext patches. I thought if I could get a FB patch working, it might solve my problem.  I finally managed to find a FB I could patch correctly (Ramen - which required I also set boot argument agdpmod in order to use it), and found the problem still existed, no change at all.

 

After that I realised that connector detection must surely be working fine - all my displays appear in Display Preferences, and show up in About This Mac with the correct details. The problem is something else, something that can be fixed by a sleep and wake. I guess there must be some re-initialisation of the GPU done during wake, something done differently from what happens at boot.

 

The last thing I can think of to try is to compare Ioreg taken straight after boot - when only two monitors have picture - with Ioreg taken after sleep&wake.  See if I can spot anything obviously changed, that maybe can be injected with an SSDT.  But I don't have much hope any more, especially now that I learned that someone else had the problem and fixed it with an EFI-related BIOS change, something I cannot do on a Legacy BIOS.
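If anyone wants to do a similar comparison, the rough plan is just two full registry dumps and a diff (device addresses and object IDs change between dumps, so expect some noise):

# after boot, before sleep (only two monitors working)
ioreg -lw0 > ~/ioreg-after-boot.txt
# ...then sleep the machine, wake it, and dump again
ioreg -lw0 > ~/ioreg-after-wake.txt
# compare the two dumps
diff ~/ioreg-after-boot.txt ~/ioreg-after-wake.txt | less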

Edited by TheBloke

23 hours ago, TheBloke said:

The last thing I can think of to try is to compare Ioreg taken straight after boot - when only two monitors have picture - with Ioreg taken after sleep&wake. [...]

 

I've done this comparison, and found a few properties that differ before and after sleep&wake - and therefore before and after I get a picture on all monitors.

 

These are the differences I found in Ioreg:

GFX0 -> PP_BootupDisplayState

After boot, before sleep:  <01000000> |  After wake =  <02000000>

 

GFX0 -> RadeonFramebuffer@0,1,2, etc -> AMDFrameBufferSI -> IOScreenRestoreState 

After boot, before sleep = not there |  After wake = <02000000>

 

GFX0 -> RadeonFramebuffer@0,1,2,etc -> AMDFrameBufferSI -> display0 -> AppleDisplay -> IOPowerManagement -> DevicePowerState

After boot, before sleep = not there |  After wake = 0x3

 

PP_BootupDisplayState looks the most promising, as it's set on GFX0 itself. But I don't know if this is a symptom or a cause. I.e., would changing PP_BootupDisplayState to <02,00,00,00> fix the problem? Or is it set automatically, by some other means, when the problem is fixed?

 

The question is, is there any way I can set PP_BootupDisplayState myself?  Is it possible to set this via an SSDT?  It's not an aty_config, aty_properties or cail_properties setting, which are the three mentioned on the WEG Github page.  But can any setting of GFX0 be changed by SSDT?

 

I have already tried a couple of SSDTs, but I'm not sure if I'm doing it right for my mobo, and I don't want to keep experimenting if it's not even possible.

 

I'd be grateful if anyone knows for sure if I can set any property on GFX0 via SSDT - or any other method? 

Edited by TheBloke

1 hour ago, TheBloke said:

 

Is there any way I can set PP_BootupDisplayState myself? Is it possible to set this via an SSDT? [...] I'd be grateful if anyone knows for sure if I can set any property on GFX0 via SSDT - or any other method?

You sure can: https://github.com/acidanthera/WhateverGreen/blob/master/Manual/FAQ.Radeon.en.md


On 11/6/2018 at 12:12 AM, Pavo said:

 

Thanks for the confirmation, Pavo.

 

I still haven't managed to get an SSDT working for my GPU. I am not sure how to address the GPU, as in Ioreg it appears (without WEG) as pci-bridge@7 -> IOPP -> display@0:

[Screenshot: IORegistryExplorer showing pci-bridge@7 -> IOPP -> display@0]

 

With WEG, 'display@0' is renamed GFX0. But all the SSDT guides I have read refer to the GPU under an entry like NPE3 or PEG0, e.g. _SB.PCI0.NPE3 or _SB.PCI0.PEG0. I have no such intermediate entry visible - it just goes pci-bridge > display@0 - so I don't know how to select the GPU in an SSDT. I saw one post suggesting that the missing intermediate entry might be caused by 'Drop OEM Tables' in Clover, but I don't have that set. So I got stuck there. I am going to keep researching what's needed to use an SSDT for the GPU on my system.
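One thing I still want to try is getting the card's PCI path with gfxutil (the acidanthera tool), since a PciRoot device path should let me target the card from Clover even without knowing the correct ACPI name. I haven't run this yet, so the flag and the output format below are from memory and may differ by version:

# gfxutil can print the PCI/ACPI paths of display devices; exact flags vary between versions
./gfxutil -f display
# expected output is something along these lines (my exact path will differ):
#   /PCI0@0/pci-bridge@7/display@0 = PciRoot(0x0)/Pci(0x7,0x0)/Pci(0x0,0x0)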

 

In the meantime I tried directly hacking the WEG code.  The good news is that I found it's pretty easy to set properties on the GPU from WEG.  The bad news is that I cannot set PP_BootupDisplayState.  Or at least, any changes I make do not stick - every time I check it in ioreg it is <01,00,00,00>.  So either my setting fails, or it is set back again by something else.


I did it by modifying kern_rad.cpp, changing the end of void RAD::mergeProperty:




if (!strcmp(name, "PP_BootupDisplayState")) {
  DBGLOG("rad", "PP_BootupDisplayState - changing value to 02,00,00,00");
  uint8_t PPData[] = { 0x02, 0x00, 0x00, 0x00 };
  auto newval = OSData::withBytes(PPData, sizeof(PPData));
  props->setObject(name, newval);

  DBGLOG("rad", "Setting PP_BootupDisplayState2 to 02,00,00,00");
  const char *newname = "PP_BootupDisplayState2";
  props->setObject(newname, newval); 
}
else {
  // Default merge as is
  props->setObject(name, value);
  DBGLOG("rad", "prop %s was merged", name);
}


 

 

I also set a new property called PP_BootupDisplayState2.  This was to test that I could set a property and that I had the format correct.  I could set the new one fine, but not the original property:

$ ioreg -fl | grep BootupDisplayState
    | |   |   | |   "PP_BootupDisplayState2" = <02000000>
    | |   |   | |   "PP_BootupDisplayState" = <01000000>

No matter how I set PP_BootupDisplayState, its value always shows as 01,00,00,00.  I also tried changing it in a couple of other places, for example in the same way that CFG_FB_LIMIT is set (in applyPropertyFixes: service->setProperty("PP_BootupDisplayState", ...) ), but I can never get a changed value to apply to this property.  So maybe I am setting it too early, and something else is changing it back later.   Or maybe it's a read-only property, or can only be set at boot/wake.   I don't know.

 

Anyway, even if I could change it, it was always a long shot that changing this property would fix my problem. I thought I'd try and maybe I'd get lucky. I didn't :)

 

Oh well, at least now I understand a little bit of how WEG works, and it was fun to read through the code and experiment with some edits. Even if I mostly felt like this guy:

[meme image]

 

 


14 minutes ago, TheBloke said:

 

I still haven't managed to get an SSDT working for my GPU. I am not sure how to address the GPU, as in Ioreg it appears (without WEG) as pci-bridge@7 -> IOPP -> display@0. [...] In the meantime I tried directly hacking the WEG code. The good news is that I found it's pretty easy to set properties on the GPU from WEG. The bad news is that I cannot set PP_BootupDisplayState.

What exactly are you trying to add or patch using an SSDT? You can always add, patch or remove entries with Device Properties in the Clover config.


1 minute ago, Pavo said:

What exactly are you trying to add or patch using an SSDT? You can always add, patch or remove entries with Device Properties in the Clover config.

 

Ahhh I forgot all about Clover Device Properties!   I never used them before so I forgot that was an option.  That would have been a much simpler way to do it :) Thank you.  I am going to try setting PP_BootupDisplayState with Clover Device Properties, just to confirm it gets the same result as when I did it with WEG.
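In case it helps anyone trying the same: the Data value for a Clover Properties entry is just the raw little-endian bytes, base64-encoded, so <02 00 00 00> should come out like this:

# encode the 4 bytes 02 00 00 00 as base64 for the config.plist <data> value
printf '\x02\x00\x00\x00' | base64
# -> AgAAAA==
# decode it back to double-check the byte order (-D on macOS's base64, -d on most other builds)
echo -n 'AgAAAA==' | base64 -D | xxd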

 

If you are interested, here is the full story of what I am trying to fix:

 

I am trying to resolve a problem with my X58 system with legacy boot and an AMD 7970 GHz (R9 280X) GPU. The GPU has 6 connectors and 6 monitors attached. The problem: on boot, all monitors have signal and show in Display Preferences, but only two have a picture. The other four are black. But if I sleep and then wake, all 6 monitors get a picture and work fine. This is the same both with and without WEG, and with/without all available Clover GFX options (InjectATI/RadeonDeInit, etc).

 

I compared ioreg from straight after boot with ioreg after sleep&wake to see which properties changed on GFX0 (list in this post).   One property that was different was PP_BootupDisplayState:  after boot/before sleep, it is <01,00,00,00>.  After sleep & wake, it is <02,00,00,00>

 

So I thought: maybe if I can change PP_BootupDisplayState to <02,00,00,00>, that might fix something. I first tried with an SSDT, but I couldn't find a way to make an SSDT reference my GPU on my motherboard (as described in the previous post). So then I did it with direct code changes to WEG instead. But I can't get any change to apply to PP_BootupDisplayState - it always reads <01,00,00,00> until a sleep&wake is done. So I think this solution cannot work - probably PP_BootupDisplayState is read-only, or set by something else. It was always a long shot anyway. But now I will try setting PP_BootupDisplayState with Clover Device Properties, just to be sure.
 

The final thing I am going to investigate is VBIOS registers for my GPU, like Mieze did with her SSDT fix for wake-up issues. I am going to compare the registers before sleep and after sleep/wake, to see if I can spot any difference that could perhaps be changed with an SSDT or Clover Device Properties, as she did. The issue Mieze fixed sounded rather similar to mine - she says: "When OS X boots up the framebuffer controller kext will find the AMD GPU in vanilla state, initialize it properly and wakeup will work as expected...   Using UEFI VBIOS the AMD GPU will be initialized too ... but unfortunately macOS's framebuffer controller kext will notice that the GPU has already been initialized and skips the basic setup ..."  That sounds a bit like my problem: it is as if my GPU is not being properly initialised by the framebuffer kext until a sleep & wake is done. In my case I am using Legacy BIOS, not UEFI, so maybe I have a similar problem the other way around. I don't know, but I want to investigate this too.

 


I used to use hex-editor patching of AppleIntelFramebufferAzul, but I can no longer be bothered doing this. I am trying to use WhateverGreen etc., and I have framebuffer-patch-enable working, but I cannot get the stolen and framebuffer memory sizes to work using WhateverGreen.

I have the following binary patch working:

< 009f6e0 0003 0d22 0300 0303 0000 0200 0000 0130
< 009f6f0 0000 0000 0000 6000 1499 0000 1499 0000
---
> 009f6e0 0003 0d22 0300 0303 0000 0800 0000 0200
> 009f6f0 0000 0000 0000 8000 1499 0000 1499 0000

Which means I need to specify the following settings (from: https://pikeralpha.wordpress.com/2013/06/27/appleintelframebufferazul-kext/):
AAPL,ig-platform-id -> AwAiDQ== (i.e. echo -n AwAiDQ== | base64 -d | hexdump gives 0003 0d22)
framebuffer-patch-enable -> AQAAAA== -> 0001 0000
framebuffer-unifiedmem -> AACAAA== -> 0000 0080 (this is working, tested independently and is in the example)
framebuffer-stolenmem -> AAAAAg== (this is not working as it causes crashing)
framebuffer-fbmem -> I don't know what to use, but I know that if it is not large enough I cannot run a 4K display.

I have tried a few endian formats, but I have not looked at the source yet. Does anyone see anything wrong, or have tips for the endianness?
 


@Roger Smith I have no personal experience of Intel FB patching, but there is another thread with detailed instructions for it, with examples for all the things you're trying to patch.
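The one generic thing I can offer, from fiddling with my own AMD property values, is that these device-property Data values are just the raw bytes in little-endian order, base64-encoded. So you can at least sanity-check what you are actually injecting from Terminal before worrying about whether the sizes themselves are right:

# decode an existing value to see exactly which bytes it injects (-D on macOS's base64, -d elsewhere)
echo -n 'AAAAAg==' | base64 -D | xxd   # -> 00 00 00 02, i.e. 0x02000000 little-endian
# encode a candidate value, e.g. 0x08000000 (128 MB) - purely an example, not a recommendation
printf '\x00\x00\x00\x08' | base64     # -> AAAACA==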

 

I'd check this out, and perhaps ask further questions there:

 

 

Edited by TheBloke

2 hours ago, vit9696 said:

@TheBloke, you need PP,PP_BootupDisplayState to override the value. Check the FAQ more carefully. I doubt it will help, unfortunately.

Thanks for replying, @vit9696.  Yes I have tried PP,PP_BootupDisplayState in an SSDT.   Also PP_BootupDisplayState and PP,PP_BootupDisplayState in Clover Device Properties.  Also a direct set of PP_BootupDisplayState from modifying WEG code.  None of them can even change the value - it seems GFX0->PP_BootupDisplayState cannot be overridden, it always retains its default value of <01,00,00,00> no matter how I try to set it.  Only sleep & wake changes it to <02,00,00,00>.  

 

And even if I could change it, I doubt it would fix my problem.  It's probably a symptom, not a cause.

 

Last night I tried Mieze's SSDT method for resetting all Radeon registers to default values, and that made no difference either. I think I am going to give up on my problem. It seems possibly related to EFI vs Legacy BIOS, and as I am on X58 with only Legacy boot as an option, maybe there is no solution. At least none that I can work out after spending literally tens of hours on this problem.

Edited by TheBloke

Hi @Andrey1970

A couple of users have noticed that in Mojave 10.14.1 the H.264 support seems to be lost when using a connectorless IGPU together with an RX or Vega card.

[Screenshots: VideoProc hardware-acceleration report on 10.14.1 with KBL IGPU + dGPU]

 

That also happens if they use the AppleGraphicsPowerManagement (AGPD) KextsToPatch entry instead of WEG (WhateverGreen).

Do you know about this issue, and might you be able to help mitigate the situation?

 

 


3 minutes ago, al6042 said:

A couple of users have noticed that in Mojave 10.14.1 the H.264 support seems to be lost when using a connectorless IGPU together with an RX or Vega card. [...] Do you know about this issue, and might you be able to help mitigate the situation?

 

Which app did you use? I need to check with mine. I use Intel HD 530 + RX 580.


On my AMD 7970 GHz (R9 280X), VideoProc says I don't support any HW encode/decode at all:

 

[Screenshot: VideoProc hardware-acceleration report showing no support]

 

 

Is this expected?  My googling seems to suggest that the R9 280X should have some sort of HW decode, at least for H264 (VCE version 1.0).   But maybe VideoProc doesn't support this on macOS - the WinXDVD info page mostly mentions NVidia and Intel QuickSync.

 

Does anyone know if this is an expected result with AMD R9 280X?
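For anyone wanting to check outside of VideoProc: the only quick test I know of is to look for an IOVARendererID property on the GPU framebuffers in ioreg - as I understand it that is part of what macOS uses to decide whether hardware video decoding is available, though I'm not certain how authoritative it is:

# if hardware video decoding is wired up, the GPU framebuffer entries normally carry IOVARendererID
ioreg -lw0 | grep -i IOVARendererID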

Edited by TheBloke

2 minutes ago, al6042 said:

Since your CPU doesn't contain an IGPU and the AMD R9 280X only supports VCE 1.0, I guess this is to be expected.

 

 

Yeah, but VCE 1.0 allows H264, which is an option in VideoProc - so shouldn't it at least support H264? Or do you think I have the same issue you mentioned - no H264 support in macOS 10.14.1?


3 hours ago, al6042 said:

A couple of users have noticed that in Mojave 10.14.1 the H.264 support seems to be lost when using a connectorless IGPU together with an RX or Vega card. [...] Do you know about this issue, and might you be able to help mitigate the situation?

[Screenshots: VideoProc hardware-acceleration report]

 

 

Still working fine here: IGPU = Intel HD 530 and AMD RX 580, with the latest Lilu.kext and WhateverGreen.kext. But my system is on the Mojave 10.14.2 Public Beta.

 

 

Edited by Andres ZeroCross
