
Fully recognising 6670?


Hopefully someone answers me this time haha...


Anyway, I've been running Lion for a while and only just realised that DVD Player and FaceTime don't work. In System Information the card is only recognised as "6xxx".


The card is a Sapphire 6670 1GB GDDR5 and I'm running 10.7.2.


Any ideas on how I can get it properly recognised?
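Not a guaranteed fix, but the usual approach in the 10.7.x era was to make sure the card's PCI device ID appears in the 6000-series driver's Info.plist. The ID below (0x6758, commonly reported for the Turks-based HD 6670) is an assumption — check your own card's device ID in System Information first. A minimal sketch of the edit:

```xml
<!-- Sketch only: /System/Library/Extensions/ATI6000Controller.kext/Contents/Info.plist -->
<!-- Append the 6670's ID (vendor 0x1002, device 0x6758 assumed here) to the
     existing IOPCIMatch string, keeping whatever IDs are already listed. -->
<key>IOPCIMatch</key>
<string>0x67581002</string>
```

After editing, rebuild the kext cache and reboot. The same ID often also needs to be present in the accelerator kext's Info.plist for QE/CI to kick in, which is typically what DVD Player and FaceTime depend on.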


  • Similar Content

    • By MetalBreaker
      I noticed some weird behavior with macOS High Sierra. This system was working just fine on Sierra 10.12.5, until I decided to update to High Sierra (10.13.3). The upgrade process went mostly fine I guess, but it got stuck in an infinite spinning wheel which would just keep overlapping. I went into verbose mode, and sure enough, I found "IOConsoleUsers: gIOScreenLockState 3, hs 0, bs 0, now 0, sm 0x0" repeating over and over along with ACM errors... I'm really lost right now. I can boot in recovery mode and safe mode just fine, when the graphics drivers aren't loaded. Web drivers aren't installed. I installed NvidiaGraphicsFixup, but it didn't fix the issue. I tried deleting the native graphics drivers from macOS and installing web drivers, but it didn't help.
      SMBIOS: iMac13,1
      Graphics card: ASUS GT630-2GD3 (It's a Fermi card and it needs injection, so I modified my DSDT. Full graphics acceleration worked in Sierra. No Clover injection. I tried using Clover injection instead, I saw no difference.)
      CPU: Intel Core i3-3210, iGPU disabled in UEFI

      All kexts updated to their latest versions, along with Clover.

      For more info, you can refer to the GitHub issue where I posted it. https://github.com/lvs1974/NvidiaGraphicsFixup/issues/3
      Any help would be appreciated.
      Thank you for your time!
    • By haegar33
I have browsed through all forums and all threads for this card, but apparently my system is still something special. With some cumbersome effort (none of the simple one-click ##### and other USB-installer methods ever worked for me) I managed to install Sierra 10.12.5. However, when booting is nearly finished I get the famous black screen, which can be recovered by quickly unplugging/replugging the DP monitor cable, finally leading to a proper initialisation of my 290X.

1.) RadeonDeInit is on; without this patch the screen stays dark forever.
2.) Booting with the iGFX never works for me; I always get memory allocation errors during boot until I disable the Intel graphics completely.
3.) I am on Clover 4318; newer versions make no difference.
4.) I am using a system definition for an iMac14,2. Most other configs make no difference either.

I have not yet returned to the nightmare of framebuffer patching (which I remember from Yosemite), and I think it will not help, as the card is recognised by Sierra but just not initialised.
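For reference, RadeonDeInit mentioned above is a Clover Graphics flag; a minimal config.plist fragment (assuming Clover, as this setup uses) looks like the following. It tells Clover to reset the card's state before handing off to macOS, which is why the screen stays dark without it here:

```xml
<key>Graphics</key>
<dict>
	<key>RadeonDeInit</key>
	<true/>
</dict>
```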
    • By TheBloke
      Hi all
      I currently use an NVidia 760 with four displays: 1 x 4K; 2 x 1920x1200; 1 x 1920x1080.   This mostly works, but I am considering replacing it with an AMD GPU - likely a Radeon 7970 - for two reasons:
I would like to add a fifth monitor, maybe even a sixth; and I get small stutters and slowdowns with my NVidia, using both Native and NVidiaWeb drivers. It is a lot better than it was with my NVidia 980Ti, but it's still not as smooth as it should be, especially when I have a full-screen video playing while also doing UI movements like Swipe Left a Space or Mission Control.
I have just been told that it should be possible to access all display outputs on an AMD 79XX GPU, possibly requiring a custom SSDT and/or RadeonDeInit? I've only ever had one AMD GPU, and that was 8 years ago, so I am not experienced with them. My NVidia 980Ti has five outputs but can only use four at once; I believe AMD's Eyefinity does allow all of them to be used as separate displays.
      Before I buy the GPU, I'd be most grateful if anyone could confirm that it should be possible to run five or six simultaneous displays on a card such as the ASUS HD 7970 DirectCU II (6 x DP outputs) or XFX AMD Radeon HD 7970 (2 x mini-DP; 1 x HDMI; 2 x DVI).  With one display being 4K @ 60fps and the rest 1920x1200 or 1080P.
I have Googled this for a while but haven't yet found anyone discussing connecting more than four monitors to an AMD Hack. I did see that the late-2013 Mac Pro (which uses 2 x AMD FirePro GPUs) states it can support "up to 6 Thunderbolt displays". So I know the OS can do it; it's just a question of whether a particular GPU can, and in a Hack setup.
      Thanks very much