
Tech project: two NVIDIA GTX 590 (SLI) with 6 monitors


raidsix



Hi,

 

Before you call me crazy: it's a tech project.

 

I want to use two NVIDIA GTX 590 cards with six monitors, for example: http://samsung.de/de/Privatkunden/Buero/Displays/OfficeDisplays/md230x6/LS23MURHBEN/detail.aspx

 

Dual-boot with Win7 and Mac OS X: I do a lot of 3D and graphics work and need Win7 for some tasks, but I prefer OS X, which is why I want to dual-boot without switching monitors.

 

So far I just know that it's not so easy to get the second GPU running; some people get it working, some don't. I have my own reasons for wanting the GTX 590, so I don't want to start a discussion about the pros and cons of this card. Yes, the card is expensive and maybe overkill, but that's not the topic. Does anyone have experience with this?

 

Regards

Chris


  • 2 weeks later...

What I wanted to find out then: would a GTX 590 (with 3 DVI ports) be able to work by itself and still use all 3 DVI ports to run 3 separate monitors off that one graphics card in Mac OS X 10.7.0, on my EVGA SR-2 mobo with the rest of my PC parts? Currently I'm using a GTX 480, and it's water-cooled to keep the temps down...


  • 2 weeks later...

NVIDIA really needs to do something about this architecture limitation. It's been bothering me beyond reason that my GTX 570 and Radeon 6870 have the same number of connectors, yet they are not equal because of the clock limitations within the 570. *sigh*

 

If you could somehow get three 590s running without compromising your PSU (and your electric bill along with it), then you'd have yourself a 6-monitor setup.


  • 2 weeks later...

I DON'T KNOW HOW I CAN MAKE THIS ANY CLEARER, Gringo Vermelho... NOT ONLY WILL 3 MONITORS WORK, BUT YOU CAN DO 6 MONITORS ACROSS!!! DON'T BELIEVE ME? GO HERE:

 

http://www.s15515867...edg5hackia.html

 

OH, I ALMOST FORGOT, HE'S KICKIN' IT WITH 3 EVGA 480 GTX GPUs !!!!!

 

It all has to do with seeking knowledge outside your own train of thought. That, and knowing how to code your strings to make it work. I KNEW there was a way to make it happen, and obviously the guy at the link I provided above does... Later...

 

PS - I also hope this bit of info has helped you too TH3L4UGH1NGM4N... B)


OK now you're just making a fool of yourself. I never said you couldn't use three nvidia cards at the same time.

 

Read your own topic title; maybe you have forgotten what your question was to begin with:

I want to use two NVIDIA GTX 590 cards with six monitors

My initial answer to that, which I have apparently wasted my time posting twice in this topic, still stands.

It is not possible to use six monitors with two nvidia cards, because no matter how many nvidia video cards you can get working at the same time, you can still only use two displays on each card.

 

If you have two nvidia cards, that means you can't connect more than four displays. Maths. Two displays on each of two cards equals four displays.

 

If you still don't understand, let me know. I'm sure I can come up with more ways to explain it.
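
(Not part of the thread, just to spell the arithmetic out: a tiny sketch, assuming the two-displays-per-NVIDIA-card limit under OS X that is described above. The numbers come from this topic; nothing here queries a real driver.)

// Display arithmetic under the assumed two-displays-per-card limit of the
// OS X NVIDIA drivers discussed in this thread. Purely illustrative.
#include <cstdio>

int main() {
    const int displaysPerCard = 2;  // assumed hard limit per card under OS X
    const int cards = 2;            // two GTX 590s, as in the topic title
    const int wanted = 6;           // the six-monitor goal

    const int maxDisplays = cards * displaysPerCard;                           // 2 * 2 = 4
    const int cardsNeeded = (wanted + displaysPerCard - 1) / displaysPerCard;  // ceil(6/2) = 3

    std::printf("%d cards drive at most %d displays; %d displays need %d cards\n",
                cards, maxDisplays, wanted, cardsNeeded);
    return 0;
}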


It still doesn't matter what is being described in whatever you're saying. I'm sure that Steve at Aquamac (once he figures out the GPU string, using Windows 7) should be able to get 3 monitors working with that GTX 590 and its 3 DVI outputs. Now I know that I can use 3 monitors in full widescreen across (whether using the GTX 590 or 3 x GTX 480 GPUs). And NOW you can use even more: up to 6 separate screens. On my system I can fit 7 GPUs (which is WAY OVERKILL) because I have an EVGA SR-2 system. But I appreciate your input and thoughts, Gringo Vermelho... Later... :-)


*sigh* you don't understand, there is no "GPU string" to figure out.

 

According to Krazubu/Fassl, two displays per nvidia GPU is a hardware limitation. On Windows the driver does some kind of software trick to drive more than two displays.

 

This is not possible on OS X because this "trick" is not implemented in the OS X nvidia drivers. And since the drivers don't support it, it can't be done through your injection method or "GPU string", if that's what you meant.

 

Besides, Aquamac has been doing what he does for years, I'm guessing since before there was a Hackintosh scene. If it were possible to drive more than two displays per Nvidia GPU, he would have figured it out by now.

 

If you must use more than two displays per GPU, buy an ATI card.


You are BOTH CORRECT. I confirmed it with Steve (Aqua-Mac), and he did say that ONLY 2 monitors will work per GPU. I apologize for not trusting your info, as I always "trust, but verify"... everything...

 

Also, since he has a SLAMMIN' system with 3 working GPUs, I thought I would confirm that with him (as nobody I can see here online has a system setup similar to his). Again, he did say that ONLY 2 will work per GPU, but he was able to work out how to get ALL 3 GPUs working together so he can have 6 monitors side by side: not an ULTRA widescreen view of 3 screens, but an ULTRA-ULTRA widescreen view with 6 screens working side by side (by side... by side... by side... - LOL!!!).

 

The bottom line is IT WORKS, even if you have to use 3 GPUs. Thanks to guys like Aqua-Mac who make it happen so the rest of us can enjoy the benefits of his labor of knowledge, if that kind of monitor setup is something we want for ourselves... Later... :-)


hey,

 

after this enlightening thread, one other suggestion, depending on what you want to do: the nvidia card is not cheap, so maybe you are already getting into the price segment of video-wall output controllers. google it, there are a lot of choices. and the ultra-cheap solution, which i used a couple of times to get synchronized video on a variable number of screens: an excellent piece of software called Multiscreener by Zach Poff, which syncs video over TCP/IP between multiple machines. it's free software!

 

btw, i think the people who have more than two nVidia cards installed are usually looking for CUDA power, not multi-monitor setups (a small device-enumeration sketch follows this post).

 

byebye,

t.
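
(An aside on the CUDA point above, not from the thread: a minimal sketch that enumerates the CUDA-capable cards in a multi-GPU box using the CUDA runtime API, assuming the CUDA toolkit and an NVIDIA driver are installed. It only lists devices and a few of their properties; it says nothing about how many displays each card can drive.)

// Minimal multi-GPU enumeration with the CUDA runtime API.
// Lists every CUDA-capable device in the machine, which is the kind of
// "CUDA power" a box with three GTX 480s is usually built for.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int deviceCount = 0;
    cudaError_t err = cudaGetDeviceCount(&deviceCount);
    if (err != cudaSuccess) {
        std::printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }

    std::printf("CUDA-capable devices: %d\n", deviceCount);
    for (int i = 0; i < deviceCount; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("  [%d] %s, %d multiprocessors, %.1f GB memory\n",
                    i, prop.name, prop.multiProcessorCount,
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}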


I was just about to mention the CUDA & Mercury driver thing. That is the WHOLE REASON why I have the EVGA GTX 480 Superclocked GPU. One day I was on NVIDIA's site and saw a comment in the Quadro 4000 driver download section saying something to the effect of "...for the GTX 285 no drivers are needed..." That made me think, "why is that comment there?" So something in my brain inspired me to google "Quadro 4000 vs GTX 285", and I found some very interesting info on an Apple forum site (and a couple of other sites). A few guys there were actually testing rendering performance using Photoshop, Final Cut Pro, After Effects, Motion, etc. It turns out the GTX 285 not only performed the same as the Quadro 4000, but even better in some cases.

 

I put 2 and 2 together and got the GTX 480 Superclocked, and so far it is FAR better than the GTX 285 and the Quadro 4000 in rendering performance, AND MAINLY IN ITS PRICE over the Quadro 4000!!! I got the GTX 480 (7 months ago) along with a Koolance water block to cool down that bad boy, because those things run VERY HOT; in fact it's the HOTTEST card out there. I only paid $340.00 total for both the GPU and the water block. Now, the GTX 580 doesn't (for whatever reason) run as hot as the 480, but at the time it was $230.00 more, NOT WORTH the 5% - 10% boost in rendering speed. Plus, the water block dropped my temps from 70C - 95C+ at idle (yes, at idle) down to around 38C - 45C UNDER LOAD. Go here to watch the review of what this Koolance water block can do for the 480:

 

 

I guess the Quadro 4000 might be able to outperform the GTX 480 in 3D rendering apps like LightWave, Maya, Softimage, etc., but for the other apps that I use constantly, I can't justify $700 to $800 for a card that performs worse and costs twice as much as a GTX 480, in my opinion...

 

The cool thing is that you can now customize how much RAM and GPU (CUDA) power Photoshop 5 uses. I still keep the default settings, as I have enough RAM (48 GB) to do the job, but it's nice to know the GPU is taking some of the stress off the CPU and RAM so that all of them (CPU, RAM and GPU) work together...


i just need CUDA for DaVinci Resolve. i don't use Adobe Premiere, but Avid and FCP7, and neither supports CUDA/OpenCL; i don't do 3D stuff, and After Effects does not support rendering with CUDA, it only speeds up the preview, which is nice but was fine before. DaVinci Resolve, on the other hand, is blazing fast with the GTX 285: color correction runs in realtime at 1920x1080. that's amazing compared to Color. and even more so because i bought the GTX 285 on eBay for 80€, so i cannot complain.


Well, I can agree with you on that; with the 580's price tag, I just saw it as more fitting to get a 570 to go with my AMD 6870, because the 570 was at a sweet price point with a good performance-to-price ratio. In all honesty, your 480 will keep you going for a while; NVIDIA does make solid GPUs that last.


