I am using an ATI Radeon 5870 card, with OS X 10.8.3 and a 20" Apple Cinema Display.
When I connect the display directly with a 5-meter DVI cable, everything works fine.
However, when I connect it with a 10-meter Mini-HDMI cable (plus HDMI-to-DVI adapters), the system POSTs, and I can even boot into Windows at my native resolution, but in OS X I get no image at all.
If I plug in the 10-meter cable while OS X is running, I get a black screen, and switching cables doesn't help after that - only a reboot does.
The same thing happens if I boot with the long cable connected.
I've also seen very similar behavior with an HDMI-to-CAT5e adapter.
When this happens, System Profiler shows no display connected at all.
Any ideas what could cause this, or how it could be solved?
Let me know if there's any more information I can provide.
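For what it's worth, System Profiler's view can be cross-checked against the Quartz Display Services API. Below is a minimal sketch (the file name and the 8-display cap are arbitrary choices of mine) that prints every display OS X currently considers online - if it reports 0 displays during the black screen, that would match System Profiler showing nothing connected:

    /* check_displays.c - list the displays Quartz considers online.
       Build: clang check_displays.c -framework ApplicationServices -o check_displays */
    #include <ApplicationServices/ApplicationServices.h>
    #include <stdio.h>

    int main(void) {
        CGDirectDisplayID displays[8];   /* arbitrary cap; raise if needed */
        uint32_t count = 0;
        if (CGGetOnlineDisplayList(8, displays, &count) != kCGErrorSuccess) {
            fprintf(stderr, "CGGetOnlineDisplayList failed\n");
            return 1;
        }
        printf("%u display(s) online\n", count);
        for (uint32_t i = 0; i < count; i++) {
            /* vendor and model numbers come from the display's EDID */
            printf("  id=%u  %zux%zu  vendor=0x%x  model=0x%x\n",
                   displays[i],
                   CGDisplayPixelsWide(displays[i]),
                   CGDisplayPixelsHigh(displays[i]),
                   CGDisplayVendorNumber(displays[i]),
                   CGDisplayModelNumber(displays[i]));
        }
        return 0;
    }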
Posted 19 May 2013 - 05:11 AM
Cheap cable... poor shielding
Posted 19 May 2013 - 11:29 PM
The exact same thing happened to me with the HDMI-to-CAT5e adapter - it also boots into Windows just fine.
What makes OS X more sensitive to this? Is there some framebuffer property I can change to make it more tolerant?
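In the meantime, one way to compare the two cases is to dump the graphics driver's state from the I/O Registry. A rough sketch (again, the file name is mine; it just prints the whole property dictionary of every IOFramebuffer instance, and which keys actually matter for a 5870 is an open question):

    /* dump_framebuffers.c - print the I/O Registry properties of each
       IOFramebuffer instance, for comparing the working and failing cables.
       Build: clang dump_framebuffers.c -framework IOKit -framework CoreFoundation -o dump_framebuffers */
    #include <IOKit/IOKitLib.h>
    #include <CoreFoundation/CoreFoundation.h>
    #include <stdio.h>

    int main(void) {
        io_iterator_t iter;
        if (IOServiceGetMatchingServices(kIOMasterPortDefault,
                                         IOServiceMatching("IOFramebuffer"),
                                         &iter) != KERN_SUCCESS) {
            fprintf(stderr, "no matching services\n");
            return 1;
        }
        io_service_t fb;
        while ((fb = IOIteratorNext(iter)) != 0) {
            io_name_t name;                 /* registry entry name */
            IORegistryEntryGetName(fb, name);
            printf("== %s ==\n", name);
            CFMutableDictionaryRef props = NULL;
            if (IORegistryEntryCreateCFProperties(fb, &props, kCFAllocatorDefault, 0)
                    == KERN_SUCCESS) {
                CFShow(props);              /* dumps the property dictionary to stderr */
                CFRelease(props);
            }
            IOObjectRelease(fb);
        }
        IOObjectRelease(iter);
        return 0;
    }

If the failing cable leaves those dictionaries without any attached-display entries, the EDID handshake probably never completes in OS X, which Windows may simply be more forgiving about.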