loc[a]lhost Posted May 19, 2013
Hi, I am using an ATI Radeon 5870 card with OS X 10.8.3 and a 20" Apple Cinema Display. When I connect the display directly with a 5-meter DVI cable, everything works fine. However, if I connect it with a 10-meter Mini-HDMI cable (plus HDMI-to-DVI adapters), the system POSTs and I can even boot into Windows at my native resolution, but in OS X I get no image at all. If I plug in the 10-meter cable while OS X is running, the screen goes black, and switching cables doesn't help after that; only a reboot does. The same thing happens if I boot with that cable connected. Very similar behavior occurred with an HDMI-to-CAT5e adapter. When this happens, System Profiler shows no display connected at all. Any ideas what could cause this, or how it could be solved? Let me know if there's any more information I can provide. Thank you.
Rampage Dev Posted May 19, 2013
Cheap cable... poor shielding.
loc[a]lhost Posted May 19, 2013 (Author)
That's what I thought at first, but then I saw it boot perfectly into Windows. The exact same thing happened with an HDMI-to-CAT5e adapter: Windows also boots fine over it. What makes OS X more sensitive to this? Perhaps there is some framebuffer property I can change to improve it?
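For anyone debugging a case like this: the same information System Profiler shows under Graphics/Displays can be pulled from a terminal, which is useful over SSH when the screen is black. A minimal diagnostic sketch (the `system_profiler` invocation assumes OS X; on other systems it just prints a note instead of failing):

```shell
#!/bin/sh
# Query the display/framebuffer state OS X currently sees, without needing
# a working screen (run this over SSH while the display is black).
if command -v system_profiler >/dev/null 2>&1; then
    # Same data System Profiler shows in its Graphics/Displays section;
    # if no display is attached, the Displays subsection will be empty.
    system_profiler SPDisplaysDataType
else
    echo "system_profiler not available (not OS X)"
fi
```

If this reports no connected display while Windows drives the same cable fine, the link is at least training at the hardware level, which points at how the OS X driver reads EDID/hotplug state rather than at the cable being completely dead.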