p.H Posted May 9, 2012

Hi, guys. First of all, some basic info about my hack:

Dell Inspiron 14R N4010
CPU: i3-380M
GFX: ATI HD 5650M
MB: Dell
Chipset: Intel HM57

My problem: the notebook's built-in screen only gets 16-bit output.

Here are the entries for the 5650M in Chameleon:

{ 0x68C1, 0x033E1025, CHIP_FAMILY_REDWOOD, "ATI Mobility Radeon HD 5650", kNull },
{ 0x68C1, 0x9071104D, CHIP_FAMILY_REDWOOD, "ATI Mobility Radeon HD 5650", kEulemur },

As it stands, I can enable full QE/CI and native resolution with a simple "GraphicsEnabler=Yes", but the output on my notebook's screen is 16-bit. Others report that an external display gets 32-bit output.

To solve this, I searched the web as best I could and, luckily, found a solution that works beautifully on my hack: ATY_Init.kext (attached: ATY_Init.kext.zip).

A few notes on ATY_Init: there are many models in its Info.plist, and you have to pick one based on your graphics card. I'm currently using the "ATI Radeon 4600 Series" model. I've also been trying to work out why ATY_Init manages 32-bit output on my notebook's screen; this seems to be the relevant code: https://code.google.com/p/aty-hd/source/browse/trunk/ATY_String.h?spec=svn2&r=2

I'm curious about the other arguments in ATY_Init, but I can no longer figure it out myself, so I'm here seeking help.

Best regards ~
Bringidea2life Posted March 19, 2014

Quoting p.H: "I've been trying to find the reason why ATY was able to make it 32bit output in my notebook's screen"

I think the reason is that OS X doesn't get the correct EDID you put in the -717 file. I had the same problem as you. My solution is to keep the original -717 file (without adding an EDID) and add your EDID in ATY_Init.kext > Info.plist instead. This hack worked perfectly with the ATI 5650M on my laptop.

ATY_Init.kext.zip