[FIXED] Intel GMA HD 5700MHD



107 posts in this topic

Recommended Posts

Sorry for the long silence; I have been occupied with other responsibilities recently. I think your idea is great, but I'm not sure it will work, since the kernel will not be loaded at that point, and the kext needs to map memory for register access. I don't know how to do that without using the kernel's methods.

 

I tried to dump some register values while the vanilla kexts were loaded, and they are slightly different from what I wrote in our custom kext. When I use the dumped values, I get distortions similar to the ones from the vanilla kexts, but with an undivided desktop; if I change them, nothing remarkable happens.

 

I have planned to write a batch buffer and ring buffer so that I can get access to the display planes and see what I can accomplish. To me, the display pipe part seems OK, and the plane level is the important one, since most of the commands operate there, including allocation of the framebuffer and other graphics memory functionality; in the pipe, just the timings are added, and that is what we have accomplished so far.


Sorry for the long silence; I have been occupied with other responsibilities recently. I think your idea is great, but I'm not sure it will work, since the kernel will not be loaded at that point, and the kext needs to map memory for register access. I don't know how to do that without using the kernel's methods.

 

G62, are you saying that my idea is not going to work because of kernel dependencies, or that I want to load the driver with Clover and pray for different results, or that I want to load the kext before the OS? If it's either of the last two, that is not what I'm saying. What I'm saying is: using your driver's codebase, write an EFI graphics driver. This would enable Clover to output our resolution, and Clover would hand that resolution to the OS. One thing to keep in mind is that Clover is based on DUET, so it emulates EFI firmware, whereas Chameleon just implements the EFI system calls the kernel needs to boot. This is why we can load UEFI drivers on PCs with BIOS firmware.

 

If you need help on how to develop a UEFI driver, take a look at this document:

http://www.intel.com...ller-guide.html

 

I hope this helps point you in the right direction! ;)

 

P.S. Even if you don't know whether it will work, why not just try it? I mean, if it works, great; and if it doesn't, we're no worse off than we were before! :wink2:


1366x768 is annoying because Apple doesn't sell anything that uses it, nor is there any 14-inch/15-inch Apple panel with that resolution.

This resolution, however, will work, because Apple designed the system to handle pretty much any resolution, as long as it is picked up.

 

What are the ways that Chameleon or OS X picks up resolutions, other than framebuffer kexts and VBIOSes?

 

Btw, you could try compiling Chameleon with the graphics-detection changes from Clover.

 

My HD Graphics machine here is a Thinkpad T410 that only has Intel Graphics. There is an Nvidia T410 and another T410 with Optimus, but we're referring to the Intel-only one here.

 

The resolution of 1280x800 is detected by Chameleon, and any OSX I boot shows up in 1280x800. This is the same for another Lenovo I have that has a GMA 950.

 

Maybe Chameleon is more friendly towards 1280x800, as many MacBooks/Pros/Airs use it. Similarly, in Chameleon my HD 3000 machine gives 1024x768 instead of your resolution, 1366x768. (It works fine in OS X.)

 

Maybe you could either try cross-compiling Cham and Clover, or take a look at the gma.c portion of Chameleon (it might not exist anymore; try graphics.c/graphics.h). I've seen it before, but I'm not sure whether it affects anything.

 

There's also ChameleonMR2, which forces resolutions. Take a look at it, maybe.

 

Here's one of the latest breakthroughs with our predecessor, the GMA X4500MHD: http://www.osx86.net...id=2866&page=17. Supposedly the guy enabled QE/CI running in software mode. It might be of help to us.


To the above:

 

1366x768 is annoying because Apple doesn't sell anything that uses it, nor is there any 14-inch/15-inch Apple panel with that resolution.

This resolution, however, will work, because Apple designed the system to handle pretty much any resolution, as long as it is picked up.

 

Wrong statement:

the MacBook Air 11" has a native 1366x768 display.

 

 

Proof:

http://www.apple.com/macbookair/specs.html


  • 3 months later...
  • 2 weeks later...

Has anyone made any headway on this? I have a Lenovo G560 with Intel HD Graphics (0046:8086) as well. I tried Manos' option and got the same 1024x768 resolution, with no acceleration either. Hoping a solution is found soon.

 

-Seige

There has been progress with the resolution module... check out my thread in The Genius Bar. (Or click on the link in my signature.)


  • 1 month later...

Hi Guys!

 

This only works with 10.6 SNOW LEOPARD. Leopard is untested; Lion/ML don't work.

 

First, follow this guide here. http://www.insanelym...pic=223754&st=0

 

Once you have the transparent menubar running, you will realise that there are lots of distortions.

 

Fret not!

 

Firstly, enable Screen Sharing. Make sure you allow VNC connections. Set a password for VNC clients.

 

Then, get Chicken of the VNC, a VNC client. Set it up and connect to your own hackintosh, using the 'View Only' option.

Run the client on the machine with the GMA 5700 itself. That's right: run it on the very same machine. KEEP IT RUNNING.

 

You will realise that there are no more distortions! Now you can watch PowerPoint presentations and set screensavers!

 

However...

 

I'm not sure about sleep and how to get Chicken of the VNC running at startup.

 

NOTE:

Minimise the VNC client and use the desktop as normal. I'm not asking you to watch your PC through a VNC client. You will notice that the display inside the client might be distorted.

Now that we've got the Resolution.dylib module up and running for our card (if you still aren't aware of this discovery, check out RemC's first post in the thread in my signature), I'm wondering whether this card can enable Quartz Extreme and Core Image without distortions and without this Chicken of the VNC hack. Try just the method in the guide linked in the quote above, together with the Resolution.dylib module; do NOT mess around with Chicken of the VNC. For me, as long as my resolution is 1366x768, there are no distortions in Finder. I'm not sure about anything else, though. Just thought I'd send this idea for you guys to test!


