Mobility Radeon HD 5650 QE/CI WORK! (26/2 Update!)


eric028

I think the display is running in millions of colours correctly, because the corruption occurs only on some textures. 3D games are really bad, Launchpad too (depending on the wallpaper you use), but an image open in Chrome renders correctly. It's perplexing. My best guess is that the frame buffer isn't a perfect fit and is mangling texture compression.


Vilczech18, I have no lines on Launchpad whatsoever, so I might not have the 16-bit color problem. Which version are you running? 10.7.3?

 

What framebuffer are you using? It makes a difference; I am using Hoolock.

I'm using Orangutan, Lion 10.7.3.

  • 3 weeks later...

Vilczech18, I have no lines on Launchpad whatsoever, so I might not have the 16-bit color problem. Which version are you running? 10.7.3?

 

What framebuffer are you using? It makes a difference; I am using Hoolock.

Holy {censored}e.

 

[Image: this-changes-everything-thumb.jpg]

 

Are you saying that with Hoolock as the frame buffer, your laptop's built-in display shows a SMOOTH GRADIENT, unlike everyone else's with an AMD 5650 or 6550? Well then, set your wallpaper to either "Ladybug" or "Eagle & Waterfall", then open Launchpad and see if you don't get the same 10-bit-type banding effect as in these screenies*:

 

[Screenshots: post-659888-0-55479000-1333386894_thumb.jpg, post-659888-0-21742400-1333386916_thumb.jpg]

 

The above were shot with a camera instead of being screen-captured, because once the output reaches a proper display capable of 32-bit color, the banding artifact is gone.

 

BTW, Hoolock isn't the only frame buffer to use, not at all, or at least not on my 6550 card. Before this, I always held on to the superstition (thanks to this thread and others) that Hoolock was where it's at. Last weekend I sat down with two pints and went through every goddamn ATI frame buffer on the list. My findings are nothing short of amazing.

 

There are plenty of frame buffers that work EXACTLY LIKE Hoolock: the same performance, the same 10-bit banding on the built-in display, the same blank output on the external display. I sat patiently through it all, typing "GraphicsEnabler=Yes AtiConfig=xxxxx" on each reboot and spell-checking carefully each time. I really thought I could strike gold and find the perfect frame buffer that would give me that elusive 32-bit color on the internal display.
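
For anyone who hasn't done this manually, here is a rough sketch of what that looks like typed at the Chameleon/Chimera boot prompt (the framebuffer name below is only an example; substitute whichever one you are testing):

GraphicsEnabler=Yes AtiConfig=Hoolock -v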

 

Sadly, I came up with none. I think the ATI5000Controller kext is to blame: after all, if that kext were working right, we should be able to use the display ports for external output.

 

Fortunately, my effort wasn't completely wasted. I found one frame buffer that manages to do what the others wouldn't. So without further ado, try setting this in Chameleon Wizard:

 

AtiConfig=Langur
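
If you'd rather not retype flags at every boot, the same thing can live in /Extra/org.chameleon.Boot.plist, which is the file Chameleon Wizard edits for you. A minimal sketch of the two relevant entries:

<key>GraphicsEnabler</key>
<string>Yes</string>
<key>AtiConfig</key>
<string>Langur</string>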


  • 1 month later...
  • 2 weeks later...

I am able to get to the desktop with GraphicsEnabler=No 100% of the time, with full 1920x1080 resolution but no QE/CI. The moment I change it to Yes, I sometimes get a blank display with a little backlight, sometimes a slightly blurred display, sometimes an overlapped display, and sometimes a mind-blowing, fully working display. The framebuffer can be Hoolock or Eulemur; it doesn't make any difference. I am using all vanilla kexts with the vanilla boot file from Chimera 1.9. Using the boot file from the famous r875 makes no difference either.

The EDID was extracted under Win7 and placed in the overrides directory properly. I can see all supported resolutions and switch between them when GE is set to Yes. With GE=No only one resolution appears, and that is final. I am thinking of editing ATI5000Controller.kext to try to change the connector info, but I am not able to get the sense ID using radeon_bios_decode, so the possible permutations are too many for trial and error.
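
For reference, assuming this is the standard OS X display-override mechanism: the override file normally sits at /System/Library/Displays/Overrides/DisplayVendorID-xxxx/DisplayProductID-xxxx and is a plist roughly like the sketch below (the name, IDs and EDID here are placeholders, not the poster's real values):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<key>DisplayProductName</key>
	<string>Internal LVDS Panel</string>
	<key>DisplayVendorID</key>
	<integer>0</integer> <!-- placeholder: must match the DisplayVendorID-xxxx folder name -->
	<key>DisplayProductID</key>
	<integer>0</integer> <!-- placeholder: must match the DisplayProductID-xxxx file name -->
	<key>IODisplayEDID</key>
	<!-- base64-encoded EDID extracted from the panel goes here -->
	<data></data>
</dict>
</plist>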

I just need the right direction or a hint on how to proceed from here.

Please help, I am very close to getting a nearly working hackintosh laptop.


  • 8 months later...

Some progress here :)
I have successfully avoided the 16-bit color problem.
I'm using the Clover bootloader.

So the recipe is:
Inject the EDID with Clover. I found that editing the DisplayProductID-717 override causes the LVDS panel to be defined wrongly, so OS X shows it as an external display.
To avoid that, you must delete your edited DisplayProductID-717 (or put the original back),
and instead configure Clover's <key>CustomEDID</key> with your EDID,
then
<key>InjectEDID</key>
<string>Yes</string>

That's it. This solved my problem with the wrong connectors and the 16-bit colors (banded gradients) on my ATI 5470M.
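
In case it helps someone, here is a rough sketch of how those keys sit in Clover's config.plist under the Graphics section (exact key names and placement vary between Clover revisions: the post above writes InjectEDID as the string "Yes", while some revisions expect the boolean <true/>; the EDID data is just a placeholder):

<key>Graphics</key>
<dict>
	<key>InjectEDID</key>
	<string>Yes</string>
	<key>CustomEDID</key>
	<!-- base64-encoded EDID for your LVDS panel goes here -->
	<data></data>
</dict>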


  • 1 year later...

Guys, does anyone have news about the Mobility 5650 on Mavericks 10.9.2?

Mine works quite well with this kext I found somewhere in the forum: http://www.insanelymac.com/forum/index.php?app=core&module=attach&section=attach&attach_id=137073

and with "AtiConfig=Flicker"

 

But with this personality there is a problem: my graphics card always reports an external VGA monitor as plugged in, even when no VGA monitor is actually connected!

So, in the end:

 

- With the LVDS screen (laptop internal monitor) I have full resolution, but two monitors are always reported as connected, which is sometimes annoying because some windows open on the "other screen" where I can't use them; and when the screen shuts off to save battery, it shows glitches when it comes back on.

 

- With a VGA screen I can't get the optimal resolution (1920x1080). I can use 1600x1200 at most, because if I set the optimal resolution with ScreenResX, the whole video signal shifts to the right, leaving a huge black band on the left, and nothing I've tried fixes it.

 

- The HDMI video signal works well, but there is no HDMI audio.

 

Can someone help me?

