About tilllt

  • Rank
    InsanelyMac Protégé
  1. OK, I got my Gigabyte Aero 15 to replace my ancient MacBook(s) Pro. Since I still need to finish some old FCP7 projects, I need OS X running with graphics acceleration. Unfortunately the Aero 15, although it features a nice GTX 1060, still uses Optimus graphics switching. So I am weighing my options, as mentioned in my other thread here: http://www.insanelymac.com/forum/topic/329903-osx-in-qemu-kvm-kholia-tutorial-no-network/

     I followed the Kholia guide and got Sierra running, with the virtio-net virtual network card working; the next step is graphics. I am weighing my options between: 1. GVT-g shared Intel graphics with the internal Intel 630 GPU. 2. Passthrough of the primary Intel GPU, running on a headless Lubuntu hypervisor.

     As you can imagine, both options have pros and cons. I'd love to hear a success story for either one, since neither is really well documented and I am not sure I am knowledgeable enough to be the guinea pig for this setup process. My main doubt about option 1 is whether it is already usable or still a proof of concept. In any case it would mean using VNC to view the guest output, since not even SPICE is implemented yet...

     Some relevant info:
     https://github.com/kholia/OSX-KVM
     https://01.org/igvt-g/blogs/wangbo85/2017/gvt-g-upstream-status-update-were-transition-phase
     https://github.com/intel/gvt-linux/wiki/GVTg_Setup_Guide
     https://www.reddit.com/r/archlinux/comments/5wy8x4/has_anyone_passed_through_intel_hd_to_guest_vms/
     https://www.kraxel.org/blog/2017/01/virtual-gpu-support-landing-upstream/
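For anyone following along, the basic GVT-g flow from the Intel setup guide boils down to: enable the GVT module on the host, create a vGPU instance through the mediated-device (mdev) interface, and hand it to QEMU as a vfio-pci device. A rough sketch only; the PCI address and vGPU type name are examples, so check your own mdev_supported_types:

```sh
# Host kernel parameters (e.g. in /etc/default/grub):
#   i915.enable_gvt=1 intel_iommu=on
sudo modprobe kvmgt

# List the vGPU types the iGPU offers (0000:00:02.0 is typical for the
# integrated GPU; the type names vary by hardware and kernel version):
ls /sys/bus/pci/devices/0000:00:02.0/mdev_supported_types/

# Create a vGPU instance identified by a fresh UUID:
GVT_GUID=$(uuidgen)
echo "$GVT_GUID" | sudo tee \
  /sys/bus/pci/devices/0000:00:02.0/mdev_supported_types/i915-GVTg_V5_4/create

# Attach it to the guest as a mediated vfio-pci device:
qemu-system-x86_64 \
  ... \
  -device vfio-pci,sysfsdev=/sys/bus/pci/devices/0000:00:02.0/$GVT_GUID
```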
  2. Darn. There was a stray character in the comment line before, which made QEMU/KVM ignore the line with the network card. So that's settled, and I could download the virtio-net-osx kext from here: https://github.com/pmj/virtio-net-osx

     In parallel I am trying to get Intel GVT-g running; has anyone tried this with OS X before? https://github.com/intel/gvt-linux/wiki/GVTg_Setup_Guide and https://www.kraxel.org/blog/2017/01/virtual-gpu-support-landing-upstream/ Obviously a major downside is only being able to use VNC as the remote desktop viewer (for now); the advantage would be full (shared) Intel graphics with acceleration in the guest...
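For anyone hitting the same symptom, the relevant QEMU arguments look like the lines below; a stray character in front of a backslash-continued line really can swallow the whole -device option. The MAC address is just an example, and the guest side then needs the virtio-net-osx kext:

```sh
# User-mode (SLIRP) networking with a virtio NIC:
qemu-system-x86_64 \
  ... \
  -netdev user,id=net0 \
  -device virtio-net-pci,netdev=net0,mac=52:54:00:12:34:56
```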
  3. Well, that's the next step then. Actually I am already researching how to do that on my Optimus notebook, so I guess I have to pass through the primary (Intel) iGPU and run the hypervisor headless, which apparently is possible, but I haven't found a comprehensive guide yet. Unless there is a way to use the GTX 1060 dGPU of my Optimus laptop with the VM (which I haven't heard of yet).
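For the record, the usual way to reserve a GPU for passthrough is to bind it to vfio-pci before any host driver claims it. A sketch under those assumptions; the 8086:xxxx vendor:device ID is a placeholder you would read from lspci on your own machine:

```sh
# Find the iGPU's vendor:device ID:
lspci -nn | grep -i vga

# Claim it for vfio-pci at boot, either via kernel parameters, e.g.
#   intel_iommu=on vfio-pci.ids=8086:xxxx
# or via a modprobe config:
echo "options vfio-pci ids=8086:xxxx" | sudo tee /etc/modprobe.d/vfio.conf

# With the iGPU held by vfio-pci the host runs headless, and the card
# can be handed to the guest with: -device vfio-pci,host=00:02.0
```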
  4. Hey people, I have installed OS X Sierra in a KVM/QEMU VM on my Ubuntu machine, following the Kholia OSX-KVM guide: https://github.com/kholia/OSX-KVM

     The install went smoothly and I can boot OS X without problems, even using boot-macOS-HS.sh, which is supposedly for High Sierra but seems to work fine with Sierra. The reason for using the High Sierra start script is that I preferred Clover over the Enoch bootloader, since I am familiar with Clover.

     The main problem I am stuck on is setting up the network. None of the outlined methods work for me: not user networking with SLIRP, not a TAP device, not bridged mode. The OS X guest does not have a network device at all. Does anyone have an idea how I can get a network device into OS X that is compatible with any of these networking methods? The next step would be setting up the accelerated graphics (VMware) adapter... ultimately I want to be able to use FCP7 in the VM. Cheers, t.
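In case it helps someone else debugging the same thing, the host-side TAP/bridge setup looks roughly like this (interface names are examples); e1000-82545em is the emulated NIC the OSX-KVM scripts use, which OS X drives out of the box:

```sh
# Create a bridge and a tap device usable by the invoking user:
sudo ip link add br0 type bridge
sudo ip tuntap add dev tap0 mode tap user "$USER"
sudo ip link set tap0 master br0
sudo ip link set br0 up
sudo ip link set tap0 up

# Then point QEMU at the tap device:
#   -netdev tap,id=net0,ifname=tap0,script=no,downscript=no \
#   -device e1000-82545em,netdev=net0
```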
  5. Did all of that, no success. I have switched my efforts to Mavericks now, and it detected both GPUs immediately. The only annoying thing is that it changes from DVI to HDMI out once the GUI is displayed. Haven't resolved that yet.
  6. I abandoned the Mountain Lion attempt and will continue with Mavericks. The first boot looks much more promising: both video cards were detected out of the box... on reboot still the hangs, but like I said, it looks more promising.
  7. No success. What is strange is that I even tried to install Lion (which I have running on another partition) and the boot process fails at the same point as Mountain Lion... This is odd because I successfully installed Lion some years ago and have had it running ever since.
  8. OK, since nobody seems to have a hint for what else I could try, let me rephrase the question: are there combinations of graphics cards that are impossible to get up and running (like my 9500 GT and the GTX 680)?
  9. Hi people, I still use my old but (usually) reliable original Lifehacker build, featuring a Gigabyte EP45-UD3P (rev 1.0) and 8 GB RAM. It was originally equipped with an NVIDIA 9500 GT, later dual graphics with the 9500 GT/GTX 285, and now the 9500 GT/GTX 680, if I can get it working. Right now I am just trying to get the thing rolling, so far without luck.

     I installed ML using Kakewalk, and also using the method from another big Hackintosh site which doesn't seem to be friends with this one. Funnily enough, with both methods the installation works fine, but then the system hangs on boot. Without changing anything, it stalls at PCI CONFIGURATION BEGIN; this can be fixed with either npci=0x2000 or npci=0x3000, both seem to work. The boot process then advances to the point where the graphics drivers are loaded.

     I tried booting the system with only one graphics card installed (both the 9500 GT and the GTX 680), no success. In both combinations I tried GraphicsEnabler=Yes/No. I tried deleting the NV* kexts, since that was suggested somewhere; no change. Obviously I also tried booting in safe mode (-x); it doesn't start. I can only boot in single-user mode, that's the furthest I get.

     I am somewhat baffled that I am having so many problems, since there are a lot of success stories from people with the same board and ML, so I think the graphics card must be the culprit... There is also an error, "Sound assertion in AppleHDADriver...", but I removed the AppleHDA kexts and it was the same. I would really appreciate any suggestions for what else I can try, since I have run out of ideas... I cannot try installing Mavericks because I have software that only runs on Mountain Lion or below. Thanks, t.
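A note for anyone with the same stall: the npci workaround can be made permanent in the Chameleon/Chimera config at /Extra/org.chameleon.Boot.plist, alongside any other kernel flags, e.g.:

```xml
<key>Kernel Flags</key>
<string>npci=0x2000</string>
```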
  10. Hey people, I have a dual graphics card setup on my EP45-UD3P: a GTX 285 and a 9500 GT. Both work fine, and I hope that with the CUDA driver update the system crashes are now gone. Out of curiosity I ran the LuxMark 2 OpenCL benchmark, with disappointing results: the Complex scene gave me a score of 127... If you search the result database for GPU-only results with the GTX 285, the submitted scores range from 148 to 2636 (!)... so what exactly am I doing wrong here? Ciao, t.
  11. I have two CUDA/OpenCL-capable cards in my machine. How do I run the benchmark specifically on one of them?
  12. Graphic Card switches after boot - 9500GT & GTX285

    OK, so I solved this issue myself, like this guy did too; actually it was the same problem: http://www.insanelymac.com/forum/index.php?showtopic=272091&view=findpost&p=1770327 So I will PM the author of the multiple GFX card guide here to ask him to include the NVDA, Parent and Child parameters in his tutorial; this will save a lot of people work. Cheers, t.
  13. I just need CUDA for DaVinci Resolve. I don't use Adobe Premiere, but Avid and FCP7, which both don't support CUDA/OpenCL; I don't do 3D stuff, and After Effects does not support rendering with CUDA but only speeds up the preview, which is fine but was OK before. DaVinci Resolve, on the other hand, is blazing fast with the GTX 285: color correction runs in realtime at 1920x1080. That's amazing compared to Color, and even more so because I bought the GTX 285 on eBay for 80 €, so I cannot complain.
  14. Hey, after this enlightening thread, one other suggestion, depending on what you want to do. The NVIDIA card is not cheap, so maybe you are already in the price segment of video-wall output controllers; Google it, there are a lot of choices. And the ultra-cheap solution which I have used a couple of times to get synchronized video on a variable number of screens: an excellent piece of software called Multiscreener by Zach Poff, which syncs video over TCP/IP between multiple machines. It's free software! By the way, I think the people with more than two NVIDIA cards installed are usually looking for CUDA power, not multi-monitor setups. Bye, t.
  15. Hey, I tried all day to figure out the problem, but now I am at my wits' end. For DaVinci Resolve I need a CUDA GPU in my system, so I bought a cheap Gainward 1 GB GTX 285. I followed the aquamac guide for installing two graphics cards, created my NVCAPs via the NVCAP Maker, etc. Both cards are working, in a way... The 9600 GT is used at boot time (I set it in the BIOS as the "main" adapter), but once Lion 10.7.2 has booted to the login screen, the display switches to the GTX 285 and the 9600 GT goes black. I am running a GA-EP45-UD3P with the GTX 285 in the 16x PCIe slot and the 9600 GT in the 8x PCIe slot. The system was installed via Kakewalk; I just added VoodooHDA for sound and a recent SleepEnabler. Does anyone have advice on how to solve this problem? In the end I will basically just use the GTX 285 to provide compute power to Resolve, without a monitor connected, but of course it would be nice if it worked as a graphics adapter as well. Thanks, t.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)</key>
    <dict>
        <key>@0,compatible</key>
        <string>NVDA,NVMac</string>
        <key>@0,device_type</key>
        <string>display</string>
        <key>@0,name</key>
        <string>NVDA,Display-A</string>
        <key>@1,compatible</key>
        <string>NVDA,NVMac</string>
        <key>@1,device_type</key>
        <string>display</string>
        <key>@1,name</key>
        <string>NVDA,Display-B</string>
        <key>@2,#adress-cells</key>
        <string>0x01000000</string>
        <key>@2,#size-cells</key>
        <string>0x00000000</string>
        <key>@2,compatible</key>
        <string>NVDA,sensor-parent</string>
        <key>@2,device_type</key>
        <string>NVDA,gpu-diode</string>
        <key>@2,hwctrl-params-version</key>
        <string>0x02000000</string>
        <key>@2,hwsensor-params-version</key>
        <string>0x02000000</string>
        <key>@2,name</key>
        <string>sensor-parent</string>
        <key>@2,reg</key>
        <string>0x02000000</string>
        <key>NVCAP</key>
        <data>BAAAAAAABwAIAAAAAAAABwAAAAA=</data>
        <key>NVPM</key>
        <data>AQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA==</data>
        <key>VRAM,totalsize</key>
        <data>AAAAQA==</data>
        <key>device_type</key>
        <string>NVDA,GeForce</string>
        <key>model</key>
        <string>NVIDIA GeForce GTX 285 DDL</string>
        <key>name</key>
        <string>NVDA,Parent</string>
        <key>rom-revision</key>
        <string>3172a</string>
    </dict>
    <key>PciRoot(0x0)/Pci(0x6,0x0)/Pci(0x0,0x0)</key>
    <dict>
        <key>@0,compatible</key>
        <string>NVDA,NVMac</string>
        <key>@0,device_type</key>
        <string>display</string>
        <key>@0,name</key>
        <string>NVDA,Display-A</string>
        <key>@1,compatible</key>
        <string>NVDA,NVMac</string>
        <key>@1,device_type</key>
        <string>display</string>
        <key>@1,name</key>
        <string>NVDA,Display-B</string>
        <key>@2,#adress-cells</key>
        <string>0x01000000</string>
        <key>@2,#size-cells</key>
        <string>0x00000000</string>
        <key>@2,compatible</key>
        <string>NVDA,sensor-parent</string>
        <key>@2,device_type</key>
        <string>NVDA,gpu-diode</string>
        <key>@2,hwctrl-params-version</key>
        <string>0x02000000</string>
        <key>@2,hwsensor-params-version</key>
        <string>0x02000000</string>
        <key>@2,name</key>
        <string>sensor-parent</string>
        <key>@2,reg</key>
        <string>0x02000000</string>
        <key>NVCAP</key>
        <data>BAAAAAAAAwAEAAAAAAAABwAAAAA=</data>
        <key>NVPM</key>
        <data>AQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA==</data>
        <key>VRAM,totalsize</key>
        <data>AAAAIA==</data>
        <key>device_type</key>
        <string>NVDA,GeForce</string>
        <key>model</key>
        <string>NVIDIA GeForce 9600 GT</string>
        <key>name</key>
        <string>NVDA,Parent</string>
        <key>rom-revision</key>
        <string>3172a</string>
    </dict>
</dict>
</plist>