
Hello everyone!

 

OK, I know this is a relatively ancient system, but here it is:

I had some free time over the summer and thought of giving my old Toshiba Satellite A300 (PSAGCE) a second try at installing Mac OS X El Capitan with Clover. Generally, I think this laptop is a good hackintosh candidate, apart from its Mobility Radeon HD 3650 GPU (DevID 0x9591).

 

So, I tried to follow the guides from bcc9, mucha, slice and vlada on ATI kext patching, to enable the internal LVDS display and see what would happen. I started with the two scripts that decode the video BIOS. My results were the following:

 

Radeon_bios_decode:

 

ATOM BIOS Rom:

            SubsystemVendorID: 0x1179 SubsystemID: 0xff1c

            IOBaseAddress: 0x5000

            Filename: 28109C.bin 

            BIOS Bootup Message:

Tosh_IEC_Potomac_M86_DDR2 M86 GDDR2_16Mx16 128bit 256MB 600e/500m          

 

PCI ID: 1002:9591

Connector at index 0

            Type [@offset 45056]: VGA (1)

            Encoder [@offset 45060]: INTERNAL_KLDSCP_DAC1 (0x15)

            i2cid [@offset 45146]: 0x90, OSX senseid: 0x1

Connector at index 1

            Type [@offset 45066]: LVDS (7)

            Encoder [@offset 45070]: INTERNAL_KLDSCP_LVTMA (0x1f)

            i2cid [@offset 45169]: 0x14, OSX senseid: 0x5

Connector at index 2

            Type [@offset 45076]: HDMI-A (11)

            Encoder [@offset 45080]: INTERNAL_UNIPHY (0x1e)

            i2cid [@offset 45192]: 0x91, OSX senseid: 0x2

Connector at index 3

            Type [@offset 45086]: 9 pin DIN (9)

            Encoder [@offset 45090]: INTERNAL_KLDSCP_DAC2 (0x16)

 

Redsock_bios_decode:

 

28109C.bin  :

 

Tosh_IEC_Potomac_M86_DDR2 M86 GDDR2_16Mx16 128bit 256MB 600e/500m          

 

Subsystem Vendor ID: 1179

       Subsystem ID: ff1c

Object Header Structure Size: 266

Connector Object Table Offset: 46

Router Object Table Offset: 0

Encoder Object Table Offset: ce

Display Path Table Offset: 10

Connector Object Id [5] which is [VGA]

            encoder obj id [0x15] which is [iNTERNAL_KLDSCP_DAC1 (osx txmit 0x00 enc 0x10?)] linkb: false

Connector Object Id [14] which is [LVDS]

            encoder obj id [0x1f] which is [iNTERNAL_KLDSCP_LVTMA] linkb: false

Connector Object Id [12] which is [HDMI_TYPE_A]

            encoder obj id [0x1e] which is [iNTERNAL_UNIPHY (osx txmit 0x10 [duallink 0x0] enc 0x0)] linkb: false

Connector Object Id [15] which is [DIN]

            encoder obj id [0x16] which is [iNTERNAL_KLDSCP_DAC2] linkb: false

 

As you can see, the BIOS decode scripts point to this structure:

 

CRTC1>DIG2>UNIPHY_KLDSKP_LVTMA>LVDS (SenseID 05 ???)

CRTC2>DAC>DAC A>VGA (SenseID 01)

CRTC2>DIG1>UNIPHY_A>HDMI (SenseID 02)

CRTC2>DAC>DAC B>S-Video

 

So, I tried to create a custom framebuffer string for each output:

 

VGA

10000000100000000001000000100101

 

LVDS

02000000400000000901000010010205

 

HDMI

00080000000200000001000010000302
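For anyone who wants to check these strings themselves, here is a small Python sketch of how I read the 16-byte connector entries. The field layout (connector type and ATY,ControlFlags as little-endian dwords, then a features word, then transmitter/encoder/hotplug/sense bytes) follows the common interpretation from bcc9's guide, so treat it as an assumption, not gospel:

```python
import struct

# Connector-type codes as commonly documented for these ATI framebuffers
CONNECTOR_TYPES = {
    0x00000002: "LVDS",
    0x00000010: "VGA",
    0x00000400: "DisplayPort",
    0x00000800: "HDMI",
}

def decode_personality(hexstr):
    """Decode one 16-byte framebuffer connector entry (layout per bcc9's guide)."""
    raw = bytes.fromhex(hexstr)
    assert len(raw) == 16, "each connector entry is 16 bytes"
    conn, flags = struct.unpack_from("<II", raw, 0)     # connector type, ATY,ControlFlags
    features, _unknown = struct.unpack_from("<HH", raw, 8)
    return {
        "connector": CONNECTOR_TYPES.get(conn, hex(conn)),
        "control_flags": hex(flags),
        "features": hex(features),
        "transmitter": hex(raw[12]),
        "encoder": hex(raw[13]),
        "hotplug_id": hex(raw[14]),
        "sense_id": hex(raw[15]),
    }

# The LVDS string from the decode scripts above:
print(decode_personality("02000000400000000901000010010205"))
```

Decoding the VGA and HDMI strings the same way reproduces the transmitter/encoder pairs that Redsock_bios_decode printed (0x00/0x10 for DAC1, 0x10/0x00 for UNIPHY), which is a nice sanity check that the layout assumption is at least self-consistent.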

 

I injected the LVDS and VGA strings into AMD3800Controller.kext via Clover. This resulted in a black screen on LVDS (nothing showed up in System Information either), but the VGA output was successfully detected and enabled. [FIRST WIN! :D ]

 

I tested it with the single-monitor setting in the BIOS (in previous tests with Chameleon, changing this setting gave me different, weird results, I don’t know why - e.g. LVDS showed up as a second VGA monitor with the same characteristics as the connected VGA monitor when both outputs were enabled in the BIOS! Apparently, Clover only uses the single-monitor setting - even when LVDS+VGA is selected in the BIOS, one monitor is off by the time Clover shows up).

 

What was strange was the i2cid value of LVDS (0x14). Searching for it, I found a thread from 0xdeadbeef pointing out that the Radeon_Bios_Decode script may return wrong SenseID values:

http://www.insanelymac.com/forum/topic/255199-editing-personalities-in-older-ati-framebuffers-iago-friends/

 

Also, an old thread from YannickD about running Leopard on this laptop suggests that its LVDS panel behaves somewhat like a DVI output (I didn’t fully understand that, to be honest). So, I decided to do some trial-and-error tests on the SenseID and ATY,ControlFlags values of the framebuffer (I considered the Transmitter and Encoder values as well, but on the HD3000 series these are hardwired, so changing them wouldn’t make much difference). I also changed some settings in Clover (renamed my GPU from OVGA to IGPU in the DSDT and played with the Load VBIOS and Inject EDID options).

http://www.insanelymac.com/forum/topic/149872-ati-moblity-radeon-hd-3650-on-ideneb-13/

 

After some (not too many, to be honest! :D) tests (mainly a brute-force attack on the framebuffer values plus changing options), I found a setting that works on LVDS!!! My internal monitor is successfully detected and I get the basic 2D functions (resolution & colour change)! [SECOND WIN! :w00t: ]

 

What did work was the default SenseID of the framebuffer (11 instead of 05) AND the ATY,ControlFlags value for DVI (1402 instead of 4000 - other DVI flags might work as well!), PLUS the options in Clover (the OVGA-to-IGPU rename in the DSDT, Load VBIOS and Inject EDID, which are the usual Clover settings for laptops). With both LVDS and VGA attached, both monitors were correctly detected and working! The only strange thing was that the GPU was detected as an ATI Radeon HD 4330M instead of a Mobility HD 3650.
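In other words, going from the script-derived LVDS string to the working one changes only two fields. A quick Python sanity check, assuming the usual 16-byte entry layout from bcc9's guide (ControlFlags as a little-endian dword at bytes 4-7, SenseID as the last byte):

```python
script_lvds  = "02000000400000000901000010010205"  # what the decode scripts suggested
working_lvds = "02000000140200000901000010010211"  # what actually lit up my panel

patched = bytearray(bytes.fromhex(script_lvds))
patched[4:8] = (0x0214).to_bytes(4, "little")  # DVI-style ATY,ControlFlags ("1402" in the hex string)
patched[15] = 0x11                             # default SenseID instead of the script's 05

assert patched.hex() == working_lvds           # the two strings differ only in these fields
```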

 

Now the bad things…

I can’t enable QE/CI acceleration, even though DevID 9591 is included by default in ATIRadeonX2000.kext. Apparently the kext loads, but returns zero values. This is the second computer on which I can’t enable acceleration because of the damn ATIRadeonX2000.kext :wallbash: - YannickD says that after Snow Leopard, Apple removed some detection routines in favour of fixed values. Has anyone here found any clues about the ATI acceleration kexts and their values?

 

I tested different controllers as well, to see if I could get acceleration. The results were the same with AMD2400Controller.kext, AMD2600Controller.kext and AMD3800Controller.kext, but with AMD4600Controller.kext the system crashed and rebooted before showing anything.

 

Finally, I tested the HDMI string. The monitor was successfully detected and its properties showed up correctly in Mac OS X, but it displayed nothing (black screen). Since the HD3650 can drive up to two different outputs, maybe a special switch must be enabled for the output, or it might be a transmitter/encoder issue (I don’t know - I didn’t have time to dig further, as I was focused on the internal monitor, but it seems minor).
This was for video only - I have not tested HDMI audio.

 

To sum up

The working framebuffer strings were:

 

VGA (successfully detected and enabled - no QE/CI)

10000000100000000001000000100101

 

LVDS (successfully detected and enabled - the values that differ from the scripts' output are the ControlFlags bytes, 1402 instead of 4000, and the final SenseID byte, 11 instead of 05 - no QE/CI)

02000000140200000901000010010211

 

HDMI (successfully detected but showed nothing in monitor - no QE/CI)

00080000000200000001000010000302

 

S-Video - not tested

 

Load VBIOS, Inject EDID and the OVGA-to-IGPU rename in the DSDT were essential for the monitors to work.
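For reference, those Clover options map onto config.plist roughly like this (a sketch from memory rather than my exact config - key names can differ between Clover revisions, and the Find/Replace data are simply the ASCII bytes of OVGA and IGPU in base64):

```xml
<key>ACPI</key>
<dict>
    <key>DSDT</key>
    <dict>
        <key>Patches</key>
        <array>
            <dict>
                <key>Comment</key>
                <string>Rename OVGA to IGPU</string>
                <key>Find</key>
                <data>T1ZHQQ==</data>
                <key>Replace</key>
                <data>SUdQVQ==</data>
            </dict>
        </array>
    </dict>
</dict>
<key>Graphics</key>
<dict>
    <key>InjectEDID</key>
    <true/>
    <key>LoadVBios</key>
    <true/>
</dict>
```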

 

I tested these settings across different versions of Mac OS X as well, and got the same results.

I can try them on Sierra as well, but I think results will be the same.

 

Bottom line

I think further research is needed into the framebuffer string values and their meaning.
 - If the scripts' values do not work for you, DO NOT BLINDLY TRUST THEM - ESPECIALLY IF YOU GET A BLACK SCREEN!

 

Also, it would be very helpful if someone could tear down the acceleration kexts (especially ATIRadeonX2000.kext).
Anyone??? Please???

 

That’s all folks, and sorry for the long thread!
