
wrk73

Members
  • Content Count

    53

Reputation Activity

  1. Like
    wrk73 got a reaction from Tetonne in Fix the time difference between osx86 and Windows in multiboot setups.   
    This is an easy way to fix it: https://github.com/oxycoder/TimeHaxk
  2. Like
    wrk73 reacted to jl4c in Vaio users black screen after boot   
    Seems that your brightness keys are handled by ACPI events. Use ACPIDebug.kext.
    Search for "[Guide] Patching DSDT/SSDT for LAPTOP backlight control" to fix them.
  3. Like
    wrk73 reacted to jl4c in Vaio users black screen after boot   
    Sure, attach IOReg and unmodified DSDT.
  4. Like
    wrk73 reacted to Riley Freeman in [HOW TO] Fix second stage boot logo and loading bar for some dedicated desktop video cards   
    Both values were there in the IOReg. I'm not using any injection for the 670, but I do have Clover's FixDisplay DSDT fix set. Not for any particular reason I think, it just didn't seem to do any harm.
     
    I'm assuming the VRAM,totalsize value is ignored once the VRAM,totalMB one is set.
  5. Like
    wrk73 got a reaction from crusher in [HOW TO] Fix second stage boot logo and loading bar for some dedicated desktop video cards   
    With CSMVideoDrv installed, I get a max resolution of 1600x1200 plus one lower resolution, but both scale at 4:3.
    I tried setting the OS and Clover GUI resolution back to 4K (3840x2160). It seems like stage 2 is lost, or it doesn't load anymore. Stage 2 usually takes around 2-3 seconds; now it just flashes and loads the login screen.
     
    I took a video capture of it: https://www.youtube.com/watch?v=BrDUhj-7UyI
  6. Like
    wrk73 got a reaction from arsradu in [HOW TO] Fix second stage boot logo and loading bar for some dedicated desktop video cards   
    After removing VRAM from the xml, I got 256MB of VRAM on my 970.
    I got the following sizes with some tries:
     
    2560MB with 0xA0000000
    3584MB with 0xE0000000
     
    The 1st hex digit gives us 256MB per step, so the max we can get from it is 256x15 (F) = 3840MB.
    The 2nd digit gives us 16MB per step, so the max is 16x15 = 240MB.
    The 3rd digit gives us 1MB per step, so the max is 1x15 = 15MB.
     
    With 0xFFF00000 I get 4095MB VRAM, the same as is detected by default (without injecting the hex to fix the loading bar).
     
    If we divide 1MB = 1024KB by 16, the 4th digit is worth 64KB per step, so the max we can get is 64x15 = 960KB.
    Likewise, the 5th is 4KB each, max 4x15 = 60KB.
    The 6th: 4x1024 = 4096 bytes, divided by 16 gives 256 bytes per step, max 256x15 = 3840 bytes ~ 4KB.
    Same for the 7th and 8th.
    Summing digits 1 through 8, I still get 4095MB at 0xFFFFFFFF. Somehow the values from the 4th digit onward are not counted toward the VRAM total; if they were, I should get 4096MB.
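    The digit-by-digit arithmetic above can be sketched in Python (a minimal illustration of the weights described in this post, not an official formula; `vram_mb` is a hypothetical helper name):

```python
def vram_mb(value: int) -> float:
    """Total size in MB encoded by a 32-bit hex value, using the per-digit
    weights observed in this post: the most significant hex digit is worth
    256 MB per step, the next 16 MB, the next 1 MB, then 64 KB, and so on,
    each digit worth 1/16 of the previous one."""
    total_kb = 0.0
    weight_kb = 256 * 1024  # 1st digit: 256 MB, expressed in KB
    for shift in range(28, -4, -4):  # walk the 8 hex digits, MSB first
        digit = (value >> shift) & 0xF
        total_kb += digit * weight_kb
        weight_kb /= 16
    return total_kb / 1024

print(vram_mb(0xA0000000))       # 2560.0
print(vram_mb(0xE0000000))       # 3584.0
print(int(vram_mb(0xFFF00000)))  # 4095
```

    This reproduces the observed values, including 4095MB (not 4096MB) for 0xFFFFFFFF once the sub-MB remainder is truncated.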
     
    P/S: On OS X I always get lower values than the real values from the BIOS/Windows. On Yosemite, my processor showed only 3.99GHz, RAM at 2285MHz, GTX 970 at 4095MB VRAM. After upgrading to El Capitan, my processor went back to 4GHz; the other values stayed the same as before. On Windows and in the BIOS, my RAM runs at 2400MHz. Not a big problem, because my RAM also has an XMP profile for 2285MHz.
  7. Like
    wrk73 got a reaction from arsradu in [HOW TO] Fix second stage boot logo and loading bar for some dedicated desktop video cards   
    I unchecked Inject EDID and Patch VBios. In the GUI, I set the resolution to 4K and checked Use custom logo.
    Here are the results of changing the resolution in OS X:
    - Default scale (1080p): it shows correctly, like a real Mac. I get the Apple logo and loading bar centered (stages 1 and 2).
    - 1440p, 2160p (4K): it just flashes like in the video above and goes straight to the login screen (no stage 2).
     
    I don't have a 5K iMac to test how it appears when scaled to 1440p or 4K, so I can only guess that Apple has no stage 2 config for 1440p or above. I hope someone who has a 5K iMac can confirm this for me.
     
    p/s: Your patch works for me with the default settings. Thank you a lot for helping me and posting a nice tutorial.
     
    Attached is my current Clover config; it may help someone else.
    config.plist.zip
  8. Like
    wrk73 reacted to arsradu in [HOW TO] Fix second stage boot logo and loading bar for some dedicated desktop video cards   
    So this is with CSM disabled, max resolution in Clover UI and OS X, and you have no second stage at all.
     
    What if you temporarily remove the injected properties (the hex code you initially injected in Clover config.plist) and let it load, in full resolution, with CSM disabled, just like before? Will the second stage appear but loading bar will drop back to the bottom left corner?
     
    In theory, and in my tests so far (though I'm on a 19" 1440x900 monitor, so you can understand why I'm drooling over that 4K of yours), you should not need anything if you disable CSM in Bios. At least I don't...
    As I said, I keep it enabled just for my other drive. Still, you've got so many issues with pretty high end hardware.
     
    By the way, are you using SSD? Cause that boot is just too freaking fast!!
  9. Like
    wrk73 reacted to arsradu in [HOW TO] Fix second stage boot logo and loading bar for some dedicated desktop video cards   
    Ok, why can't you pass step 2? I don't get it. This is the untouched xml. Now you have to edit it for your needs. When you start editing it, and ADDING all the information you need, you will find the place to add your VRAM value as well.
    By the way, the selected part in the screenshot is not for design. That's the part that you need to add and change, according to your card and port.
     
    Also, if you want me to take a look at this issue, I will need your IOreg (config.plist would be nice too), to see which port you are using and apply the patch accordingly.
  10. Like
    wrk73 reacted to wegface in (GUIDE) 10.11 full speed USB (series 8/9) keeping vanilla SLE   
    Your ioreg shows XH01 and the injector is XHC1/XHC. They should match.
  11. Like
    wrk73 reacted to Meowthra in [SHARE]Haswell DSDT Original FIX   
    Original .aml FIX
     
    DSDT
     
    1.
    ERROR
        Store (\_GPE.MMTB (Local3, \_GPE.OSUP (Local3)), Store (Local2, REG6))
    FIX
        Store (\_GPE.MMTB(), Local3)
        \_GPE.OSUP (Local3)
        Store (Local2, REG6)
     
    2.
    ERROR
        {
            PS0X
        }
    FIX
        {
            Store (Zero, PS0X)
        }
     
    3.
    ERROR
        {
            PS3X
        }
    FIX
        {
            Store (Zero, PS3X)
        }
     
     
     
    SSDT-X
    1.
    ERROR
        {
            Return (GPRW)
            0x09
            0x04
        }
    FIX
        {
            Return (Package (GPRW) {0x09, 0x04})
        }
    2.
    ERROR
        Method (_OFF, 0, Serialized)
        {
            P8XH (Zero, 0xD6, One, P8XH (One, 0xF0, One, Store ("_SB.PCI0.RP05.PEGP._OFF",
                Debug), Store (LCTL, ELCT), Store (VREG, VGAB), Store (One, LNKD)), While (
                LNotEqual (LNKS, Zero))
                {
                    Sleep (One)
                }, SGPO (HLRS, One), SGPO (PWEN, Zero))
            Store (One, \_SB.PCI0.LPCB.EC0.DSPM)
            Sleep (0x14)
            Return (Zero)
        }
    FIX
        Method (_OFF, 0, Serialized)
        {
            P8XH (Zero, 0xD6, One)
            P8XH (One, 0xF0, One)
            Store ("_SB.PCI0.RP05.PEGP._OFF", Debug)
            Store (LCTL, ELCT)
            Store (VREG, VGAB)
            Store (One, LNKD)
            While (LNotEqual (LNKS, Zero))
            {
                Sleep (One)
            }
            SGPO (HLRS, One)
            SGPO (PWEN, Zero)
            Store (One, \_SB.PCI0.LPCB.EC0.DSPM)
            Sleep (0x14)
            Return (Zero)
        }
     
    3.
    ERROR
        If (LEqual (Arg0, Zero))
        {
            \_SB.PCI0.RP05.PEGP.SGPO (\_SB.PCI0.RP05.PEGP.ESEL, Zero)
            P8XH (One, 0x77, One, P8XH (Zero, Zero, One, Return (One),
                If (LEqual (Arg0, One))
                {
                    P8XH (One, 0x77, One, P8XH (Zero, One, One, Return (One),
                        If (LEqual (Arg0, 0x02))
                        {
                            P8XH (One, 0x77, One, P8XH (Zero, 0x02, One, Return (LNot (
                                \_SB.PCI0.RP05.PEGP.SGPI (\_SB.PCI0.RP05.PEGP.ESEL))), Return (Zero)))
                        }))
                }))
        }
    FIX
        If (LEqual (Arg0, Zero))
        {
            \_SB.PCI0.RP05.PEGP.SGPO (\_SB.PCI0.RP05.PEGP.ESEL, Zero)
            P8XH (One, 0x77, One)
            P8XH (Zero, Zero, One)
            Return (One)
        }
        If (LEqual (Arg0, One))
        {
            P8XH (One, 0x77, One)
            P8XH (Zero, One, One)
            Return (One)
        }
        If (LEqual (Arg0, 0x02))
        {
            P8XH (One, 0x77, One)
            P8XH (Zero, 0x02, One)
            Return (LNot (\_SB.PCI0.RP05.PEGP.SGPI (\_SB.PCI0.RP05.PEGP.ESEL)))
        }
        Return (Zero)
     
    4.
    ERROR
        If (LEqual (Arg0, Zero))
        {
            \_SB.PCI0.RP05.PEGP.SGPO (ESEL, One)
            P8XH (One, 0x99, One, P8XH (Zero, Zero, One, Return (One),
                If (LEqual (Arg0, One))
                {
                    P8XH (One, 0x99, One, P8XH (Zero, One, One, Return (One),
                        If (LEqual (Arg0, 0x02))
                        {
                            P8XH (One, 0x99, One, P8XH (Zero, 0x02, One, Return (\_SB.PCI0.RP05.PEGP.SGPI (
                                ESEL)), Return (Zero)))
                        }))
                }))
        }
    FIX
        If (LEqual (Arg0, Zero))
        {
            \_SB.PCI0.RP05.PEGP.SGPO (ESEL, One)
            P8XH (One, 0x99, One)
            P8XH (Zero, Zero, One)
            Return (One)
        }
        If (LEqual (Arg0, One))
        {
            P8XH (One, 0x99, One)
            P8XH (Zero, One, One)
            Return (One)
        }
        If (LEqual (Arg0, 0x02))
        {
            P8XH (One, 0x99, One)
            P8XH (Zero, 0x02, One)
            Return (\_SB.PCI0.RP05.PEGP.SGPI (ESEL))
        }
        Return (Zero)
     
    5.
    ERROR
        Existing object has invalid type for Scope operator (\_SB.PCI0 [untyped])
    ADD
        External (\_SB_.PCI0, DeviceObj)
     
    6.
    ERROR
        Existing object has invalid type for Scope operator (\_SB.PCI0.RP05 [untyped])
    ADD
        External (\_SB_.PCI0.RP05, DeviceObj)
  12. Like
    wrk73 reacted to arsradu in [HOW TO] Fix second stage boot logo and loading bar for some dedicated desktop video cards   
    Hey guys,

    This is a tutorial on how to fix (or at least try to fix) the second stage boot, when you're having these issues: missing logo and loading bar being displayed on the bottom left corner of the screen.
     
    Might or might not work for other issues. So proceed with caution.

    This is an issue that I experienced starting with early stages of development in Yosemite. Still continued in El Capitan. So this thread is primarily for these two versions of Mac OS X. Not sure it will work on earlier versions, cause I never tried it. So far it worked in Yosemite, El Capitan and Sierra.

    Please, note that I did not test this on all video cards. So it might or might not work, depending on that. Don't take this as a final solution for everything. Also, I'm pretty sure this won't work with iGPUs. So I would strongly suggest looking around for other solutions if that's your case.

    This tutorial is not for multiple display setups! Please, don't use it for that! Especially not in iGPU + dedicated GPU combos.
     
    Known issue: if you're using the auto-login feature, try to disable it and use the regular login screen. Otherwise you won't get the second stage boot (confirmed as fixed in Sierra).
     
    This tutorial would have not been possible without the help and insight of:
    Pike R. Alpha
    cecekpawon
    Mirone
    Riley Freeman
     
    So all the credit goes to them. I just put this together based on my successes and failures.

    Successfully tested with:
     
    NVidia cards:
      • GeForce 210
      • GT 640
      • GTX 650
      • GTX 660
      • GTX 670 Superclocked+ 4GB - works partially (ok for 1080p, with CSM disabled, but no second stage in higher resolutions)
      • GTX 750 Ti - works partially (ok for 1080p, but no Apple logo for second stage in higher resolutions)
      • GTX 760
      • GTX 780
      • GTX 960
      • GTX 970 - works partially (ok for 1080p, and 1440p with CSM disabled, but no second stage in higher resolutions)
     
    ATI cards:
      • Sapphire Toxic R9 270X
     
    Might work on other graphic cards, as well. But these are the ones that worked so far. Please, share if you got good results with another video card.

    Also, the same thing can be achieved by modding your DSDT. Unfortunately, this thread does not cover that method, nor am I able to help you with it, since I really have no idea how to mod a DSDT, so far. Also, to be honest, this method seems a bit easier.

    Nonetheless, try this at your own risk! I'm not responsible for any damage that you might cause to your computer/components etc.
    Also, please, note that this tutorial is not perfect. I'm doing my best to make it as easy to understand as possible, and I'm open to suggestions and will keep improving it in the future. Still, it's not perfect. So keep that in mind. Suggestions are appreciated though.

    Please, note that, if you've got a video card with GOP UEFI Bios, you most likely don't need this tutorial in the first place. All you need is to disable CSM in BIOS (motherboard dependent). Note that, by doing this, only GPT drives will be loaded. So if you've got Windows installed on another drive, for example, you won't be able to select it anymore, if it's installed in "legacy" mode.


    With that being said, let's get to work.

    Prerequisites:

    Mandatory:
      • a computer running Mac OS X Yosemite or newer
      • Clover bootloader (changes are gonna be saved to the config.plist file)
      • IORegistryExplorer v 2.1 (attached)
      • gfxutil (attached)
      • gfx_hex (attached) - script credit to cecekpawon
     
    Optional:
      • Sublime Text Editor
      • Clover Configurator
      • a Windows PC for reading the graphic card's VBios version. So far, I couldn't find a way to do it from OS X. But if you know how to do it, please, share. I'll update this accordingly. An alternative (though not really the same thing) is this.
      • a USB installer might come in handy, so keep one close by.
     
    Extra:
    If you want to try the modded Bootx64.efi (rev 3279), thanks to cecekpawon, featuring an option to easily enable/disable string injection in Clover (in case you're stuck outside your OS, and don't have a USB drive or something else to boot from), click here to get it. Please, note that this revision might not work with newer versions of OS X/macOS.
    You will need to replace the one in your EFI/EFI/CLOVER folder. If you want to revert to the previous one, you can either reinstall Clover, or make a backup of your current one before replacing it.

    I would recommend creating a single folder with all the necessary files and tools and putting it on your Desktop, just to have all the needed things in one place.

    Note: If you use a custom SSDT/DSDT, I would try first without it, and using Clover's patches instead, so that you minimize the risk for failure due to custom DSDTs. I don't use a custom DSDT, didn't test in collaboration with a custom DSDT, so I don't know if it will work. If you wanna try it this way, I would love to know your results.

    What to do:

    Step 1 (establishing the port):

    Open up your IOreg file (or just open IORegistryExplorer if you don't want to save it as a separate file), and search for "display".

    You should see something like this:



    Now, depending on your card, you might have more or less ports. The important thing is to note the one that has the AppleDisplay attached to it. In this example, the second port (B@1) is the one in use. So that's the one we need to set as default. Please, note that they start at 0, so first port is A@0, the second one is B@1 and so on so forth.

    Step 2 (extracting device-properties.hex):

    Open up a terminal window, navigate to the desktop folder you're using, and do this (make sure gfxutil is present into that folder):
    ioreg -lw0 -p IODeviceTree -n efi -r -x | grep device-properties | sed 's/.*<//;s/>.*//;' > "device-properties.hex" && ./gfxutil -s -n -i hex -o xml "device-properties.hex" "device-properties.xml"
     
    This will extract two new files into that folder: device-properties.hex and device-properties.xml

    Step 3 (modding the device-properties.xml):

    Open up the xml file using any text editor (I'm gonna use Sublime here) and ADD the number of ports you need for your card (the same number of ports, with the same name as in the IOreg) and the rest of the information, as described in the example below (use the attached xml as reference):



    Don't forget that the first port is port 0!

    In this example, we're using an MSi GTX 650 OC with 2GB of RAM, which has 3 ports, and we want to set as default port #2 (B@1).
    Attached you can find my device-properties.xml file, if you wanna use it as reference.

    The first section will set port B@1 as default. Change this according to your case.
    <!-- Primary display -->
    <key>@1,AAPL,boot-display</key>
    <string>0x00000001</string>
     
    Whereas on the bottom you can see regular values. Here we can set the amount of VRAM, the name of the card and the VBios version.
     
    <key>VRAM,totalMB</key>
    <string>0x00000800</string>          // for 2048 MB or 2 GB of RAM
    <key>device_type</key>
    <string>NVDA,Parent</string>
    <key>model</key>
    <string>MSi GeForce GTX 650</string> // Name of your card
    <key>rom-revision</key>
    <string>80.07.35.00.04</string>      // VBIOS version
     
    In my experience, these values are rather cosmetic than anything else. For an accurate version of your VBios, you can use either the Nvidia driver on Windows, or any hardware information software like GPU-Z, aida64 or nvflash.
     
    Here's a little table with hex values for different amounts of RAM. If you need other values than the ones specified in this table, please, ask. Also, if you spot a mistake, let me know so I can correct it.
     
    Amount of memory  Hex value
    128 MB                      0x00000080
    256 MB                      0x00000100
    512 MB                      0x00000200
    1024 MB (1 GB)        0x00000400
    2048 MB (2 GB)        0x00000800
    3072 MB (3 GB)        0x00000C00
    4096 MB (4 GB)        0x00001000
    5120 MB (5 GB)        0x00001400
    6144 MB (6 GB)        0x00001800
    7168 MB (7 GB)        0x00001C00
    8192 MB (8 GB)        0x00002000
    9216 MB (9 GB)        0x00002400
    10240 MB (10 GB)    0x00002800
    11264 MB (11 GB)    0x00002C00
    12288 MB (12 GB)    0x00003000
    13312 MB (13 GB)    0x00003400
    14336 MB (14 GB)    0x00003800
    15360 MB (15 GB)    0x00003C00
    16384 MB (16 GB)    0x00004000
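    The table is simply the decimal MB amount written as an 8-digit hexadecimal value; a minimal Python sketch for other sizes (`vram_hex` is a hypothetical helper name, not part of any tool mentioned here):

```python
def vram_hex(megabytes: int) -> str:
    """String value for the VRAM,totalMB key: the VRAM size in MB, in hex."""
    return "0x%08X" % megabytes

print(vram_hex(2048))   # 0x00000800  (2 GB)
print(vram_hex(16384))  # 0x00004000  (16 GB)
```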
     
    Once we're done modding, we're gonna save the changes and convert the xml file back into hex, using this command:
    ./gfxutil -i xml -o hex device-properties.xml device-properties.hex
     
    Step 4 (reading the modded hex):

    Unpack gfx_hex (attached) file into your working directory and double click to run it.
     
    It should output the content of your modded hex into a terminal window, and it should look something like this:
     

     
    Select and copy that code. After that, open up Clover Configurator and paste that hex code in Devices -> Properties section (see below) and check the box for Inject. You can also add this manually into the config.plist file, if that's more of your thing.



    Step 5 (setting up final arrangements):

    We're almost done. But, if your experience is the same as mine, at this point you've only got the loading bar back centered. Which, of course, is a step forward. But we're not quite there yet.

    So, to add the logo as well, we will check the boxes for Patch VBios and Inject EDID in Clover Configurator (Graphics section). You can inject your own EDID if you have one. Though, in my experience, Clover does a really good job in adding the correct one itself. So you don't really need to add a custom EDID. Just to use Clover's InjectEDID feature.



    Also, in my experience, you don't need CsmVideoDrv.efi for this to work. And I've got pretty similar results with and without it. So, if you have issues with it, you can try without it, and vice-versa.

    When it's all done, save the changes to your config.

    Now, I would highly recommend you to make a backup of your current config.plist from EFI/EFI/Clover/config.plist anywhere you want. You might need it later.

    When that's done too, replace the config.plist in your Clover folder with the one you just modded, and restart.

    Keep your fingers crossed (toes too), and hope for the best. If everything is ok, you should have a pretty close to perfect second stage boot. If not, you might have nothing, or an unbootable system.

    To fix this, boot from your USB installer, or use the boot flag nv_disable=1 (for Nvidia cards) upon booting, and revert the changes. Or, if you already know where the problem is, fix that and restart.

    That's it.

    Please, let me know if it worked for you.
    gfx_hex.zip
    IORegistryExplorer.app.zip
    gfxutil.zip
    device-properties.xml
  13. Like
    wrk73 got a reaction from scott_donald in Clover General discussion   
    Hey ruki, FINALLY it works.
    The last method worked for me. I just edited it a bit according to the Clover wiki: http://clover-wiki.zetam.org/Configuration/GUI
     
    I removed the custom key and now use only the Hide key:
    <key>Hide</key>
    <array>
        <string>HD(1,MBR,0x00000000,0x1F40247C,0x12BFF03D)</string>
        <string>HD(2,MBR,0x00000000,0x320014F8,0x12BFEFFE)</string>
        <string>HD(3,MBR,0x00000000,0x44C00535,0x5C5698C)</string>
    </array>
    Thank you so much for your time. My hero