thechickenmoo

2 Different Generation Video Cards, Run 1 Per OS?

4 posts in this topic


So essentially, after many years I finally got enough new parts to build a new PC with a quad-core CPU. I previously had a Core 2 Duo with an MSI 975X Platinum Power-Up Edition board, on which I had successfully installed 10.4, 10.5, 10.6, and 10.8, while using another hard drive to run Windows.

 

Now I have:

 

ASUS Sabertooth P67 (rev 3.1)

Core i5-3570K

AMD Radeon HD 6970

 

I've come to learn that the Radeon HD 6970 is essentially not compatible with any version of Mac OS, which I can learn to live with. I was at least able to install a distro of Mavericks and boot into safe mode for now.

 

My random thought was this: I have an older Radeon HD 4670 now lying around without a home. Could I put both in the PC and then just disable one card depending on the OS? I.e., use the newer card in Windows 8 or Windows 7 on the one hard drive, and then run Mac OS and have it disable the newer, incompatible card?

 

I tried plugging it in for kicks and giggles and was at least still able to boot in safe mode, but I don't see an easy way of disabling the problematic card short of removing it every time I want to use Mac OS, which doesn't really sound great long term.
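The only software-level approach I can come up with so far is keeping OS X from ever loading a driver for the 6970, e.g. by moving its controller kext out of /System/Library/Extensions on the Mavericks drive, but I have no idea whether that is enough, hence this post. Roughly something like this (the kext name is just my guess at what Mavericks ships for the 6000 series, so check /S/L/E first; the card stays powered, it just goes undriven, and Windows on its own disk wouldn't care):

sudo mkdir -p /DisabledKexts
sudo mv /System/Library/Extensions/AMD6000Controller.kext /DisabledKexts/   # kext name is an assumption, verify it exists before moving
sudo touch /System/Library/Extensions   # force the kext cache to rebuild on next boot
sudo reboot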

 

Any thoughts or suggestions? I plan on getting rid of the distro and installing as vanilla as possible again if I can find some sort of solution for the graphics card.

 

The only posts I found for disabling a graphics chip were from Mac Pro owners with the lemon units.

 


Please forgive me if I'm not reading those posts right, but I don't think they answer my question in any way. I've already done vanilla setups in the past on the other board, and that's not really my issue. There is NO support for the Radeon HD 6900 series in Mac OS; this is my attempt at a workaround using an older card.



  • Similar Content

    • By JohnCenaTheMemeMachine
      I'm not expecting too much help, but if anyone has any suggestions, that would be great.

      I recently got my Hackintosh running on macOS 10.14, and went to install the graphics drivers for my Nvidia GeForce GTX 750 Ti. My setup is a little different, so here it is:

      Screen 1 | Screen 2 | Screen 3

      Screen 1 and 2 are both connected to my 750 Ti, whereas screen 3 is connected to my iGPU, which is an Intel HD 4600. The Intel GPU works like a charm, just as it did under High Sierra.

      Now, I got the Nvidia web drivers to "work" by removing any traces of previous installs via the "Web Driver Toolkit" that others have recommended. I then patched the installer, installed it, rebooted, patched the installed drivers, and rebooted again, which is where the drivers are at their current state.
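      For what it's worth, here is roughly what I ran afterwards to confirm the web driver really is the one attached to the card (standard tools; the kext name in the comment is just the usual 10.13-era web-driver name, so treat it as an assumption):

      kextstat | grep -i nvda              # web-driver kexts such as NVDAStartupWeb should be listed if the driver loaded
      nvram -p | grep -i nvda_drv          # the web-driver NVRAM flag; Clover's NvidiaWeb=true is supposed to set the same thing
      system_profiler SPDisplaysDataType   # shows what each GPU/display reports (resolution, Metal support, etc.)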

      This is where things get interesting, though. All 3 monitors are recognized. My GPU shows up in System Profiler. However, it cannot be used to run a compute benchmark in Geekbench, meaning that GPU acceleration is obviously disabled. The other thing that doesn't work is, well, the 2 monitors connected to the GPU. They are set to the correct resolution and refresh rates, they are recognized in System Preferences, and I can even drag things between the different desktops.

      But I can only see my cursor. The screens are totally black, and I can't see anything on them except my cursor moving across them. Has this happened to anyone else? Thanks in advance!

      Specs:
      Asrock B85M-Pro4 Motherboard
      Intel Core i5-4690 CPU
      Intel HD 4600 GPU 1
      Nvidia GeForce GTX 750 Ti GPU 2
      iMac 15,1 SMBIOS

      Here is a picture of what happens
      https://imgur.com/gallery/lWTLWlw

      My clover and EFI folders are attached
      EFI.zip
    • By rio2
      How to make AppleHDAController load on Ryzen boards?
      1. Rename (or add) your audio controller in your DSDT/SSDT as HDEF.
      2. Add a _DSM method to your HDEF device with layout-id 1.
      3. Patch the AppleHDAController binary, because it has a static table containing the supported PCI vendor/device ID pairs and it also checks the vendor ID against known values. As an example I provide a diff file and a patched 10.13.3 binary, but if you want I can patch the binary for other versions.
      After AppleHDAController loads, to actually get sound working you need to patch AppleHDA.kext for the codec on your board. First I tried to use Lilu+AppleALC for this task, but for some reason it refused to work (I might look into why later). For the time being it was easier to use toleda's cloverALC script, but to make it work I had to change (or remove) the specified location of the HDEF device in the script. For some reason it also required me to mount the EFI partition manually.
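      A quick way to check after a reboot whether the HDEF rename and the _DSM injection actually took (stock tools, nothing Ryzen-specific about these commands):

      ioreg -rn HDEF | grep -i "layout-id"   # should show the layout-id injected by the _DSM method
      kextstat | grep -i applehda            # AppleHDAController (and later AppleHDA) should be listed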
       
      Change this:
      if [[ $(cat /tmp/HDEF.txt | grep -c "HDEF@1") != 0 ]]; then
      Into this:
      if [[ $(cat /tmp/HDEF.txt | grep -c "HDEF") != 0 ]]; then
      AppleHDAController_Patched10.13.3
      AppleHDAController-10.13.3-AMD.bdiff
    • By Teress
      Hi, I tried almost everything during the last 4 days, but now I call for help. I have an RX560 running on 10.13.4. Previously I was running it on 10.12.6 and had the same problem, though back then my third display came to life after waking from sleep; that does not work anymore on 10.13.4. With or without Lilu & WhateverGreen my system correctly recognizes my GPU as an RX560, but only 2 of 3 displays are displaying content; the third display is black but receiving some signal. When I change cabling and connect only two, any combination of 2 displays works. System Profiler and System Preferences say I have 3 displays connected.

      Please, can somebody help me figure it out?
      Thanks a lot in advance. I tried to upload my ioreg but it is greater than the 10 MB allowed for me :(
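      (If a plain-text dump would be acceptable, something like this should compress the ioreg well under the 10 MB limit; the output paths are just examples.)

      ioreg -l -w0 > ~/Desktop/ioreg.txt               # full registry dump as plain text
      zip -9 ~/Desktop/ioreg.zip ~/Desktop/ioreg.txt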



    • By verymilan
      Hi, I recently set up High Sierra on my AMD computer with the help of the amdosx community (I actually used an installer just to confirm that something would work in the first place before spending hours for nothing; I can and could have downloaded High Sierra the "legal" way).
      The graphics card is recognized properly, the 6 GB of VRAM are recognized, and I have no artifacts and no screen tearing with the official Nvidia web driver,
      but unfortunately, dark colors have horizontal dark stripes in them, and specific bright colors, like on the left bar of system windows such as Settings, flicker as if running at low fps.

      I have only used the Clover on the flash drive for now, and I'd like to know whether it's worth digging deeper or whether I should just forget about it, as it is a common Nvidia problem on macOS.
      Would be amazing to know.
       
      Closer machine info:
      * High Sierra (missing a recent minor patch, as my network is veeeery slow and I have this Nvidia issue anyway, so I mainly jumped back to Linux)
      * AMD FX-8350
      * Asus Sabertooth 990FX Rev. 2.0
      * NVIDIA GeForce GTX 1060 6GB
       