accidental_hackintosher Posted November 25, 2014
Hi, I have two monitors and a TV attached to my graphics card: the TV via HDMI, and the monitors via DVI and a DVI-to-VGA adapter. The monitor connected via DVI and the TV work just fine, but the second monitor is only recognized as 'VGA Display' and I'm not able to choose its native resolution, which is 1680x1050. When I hook it up via DVI, the Hack does not even recognize it. Is there a possible workaround to get it running? I would prefer not to use the internal graphics for the second monitor; I had problems with it in the past. Thanks in advance!
jamiethemorris Posted November 25, 2014
Try this: http://www.insanelymac.com/forum/topic/290097-guide-108-add-your-custom-retina-hidpi-resolution-for-your-desktop-display/
Obviously it's for a slightly different purpose, but by adding 1680x1050 to scale-resolutions you should be able to achieve what you want. You can also use SwitchResX, which is easier, but it only comes with a 15-day trial.
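For anyone following along: the scale-resolutions entries that guide edits are raw data blobs in a display override plist, typically under /System/Library/Displays/Overrides/DisplayVendorID-XXXX/DisplayProductID-XXXX (the XXXX parts are placeholders here; read your display's real vendor and product IDs from ioreg). A minimal Python sketch of generating the 1680x1050 entry, assuming the common 8-byte width/height layout:

```python
import base64
import struct

# Sketch of a scale-resolutions entry, assuming the common 8-byte layout:
# width and height as big-endian 32-bit unsigned integers.
# (Some variants append extra flag bytes; check the linked guide for details.)
width, height = 1680, 1050
blob = struct.pack(">II", width, height)

print(blob.hex())                       # 000006900000041a -- hex form
print(base64.b64encode(blob).decode())  # base64 form, for an XML plist <data> element
```

The printed base64 string is what goes inside a <data> element under the scale-resolutions key of the override plist.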
kvonlinee Posted November 25, 2014
Which version of OS X are you running? Your card should work on 10.8 and up with the flag GraphicsEnabler=No in Chameleon; in Clover, don't inject any graphics.
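For reference, those settings look roughly like the following, assuming the usual file locations (/Extra/org.chameleon.Boot.plist for Chameleon, EFI/CLOVER/config.plist for Clover); treat this as a sketch, not your exact config:

```
<!-- Chameleon: /Extra/org.chameleon.Boot.plist -->
<key>GraphicsEnabler</key>
<string>No</string>

<!-- Clover: config.plist, disable all graphics injection -->
<key>Graphics</key>
<dict>
    <key>Inject</key>
    <dict>
        <key>ATI</key>
        <false/>
        <key>Intel</key>
        <false/>
        <key>NVidia</key>
        <false/>
    </dict>
</dict>
```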
accidental_hackintosher Posted November 27, 2014
Thanks for the answers. I solved it with SwitchResX. €14 for that piece of software is totally worth it.