
USB 3.0 HDMI monitor adapter


TheBloke


Has anyone tried adding another monitor via a USB 3.0 adapter?
 
I can't add Thunderbolt to my system as it's X58 and too old. But I do have working USB 3.0, and I have seen adapters like this: StarTech.com USB 3.0 to HDMI External Multi Monitor Video Graphics Adapter for Mac & PC.
 
 
It says it works for Mac, and I have working USB 3.0, so it sounds fine in principle. But sometimes these things can be more complex with a Hack, so I thought I'd ask first to see if anyone has tried one?
 
In my case I am hoping to use it to add a fifth screen to my system, as NVidia GPUs can't use more than four.  The fifth screen will be 1920x1200 or 1920x1080.
 
Unfortunately that one above is pretty expensive. But even more interesting are these:
 
- dodocool USB-C Hub, type C Hub with Power, 1 HDMI port...
- UCOUSO USB Type C HUB, USB-C TO 3 USB 3.0 Ports with HDMI 4K@30Hz
 

Both of which are USB 3.1 hubs with 4K-capable HDMI (not that I'd have the bandwidth for that), and one has a card reader too. And they're much cheaper! However, I guess they require a USB-C port on the host (I don't know if a USB 3.0 A-to-C adapter cable would work), so I'd have to buy a USB-C 3.1 PCIe card to use instead of my current USB 3.0 card. But buying that plus one of these would still be cheaper than the first dongle shown.
 
I'd be grateful if anyone has any experience of using a USB 3 to monitor adapter on a Hack and could let me know your thoughts.

EDIT: Actually I'm now wondering if those hubs can work with a standard USB-C port. One mentions "...works for devices with the USB-C port which supports data transfer and video output functions", possibly implying it requires a special USB-C port? Another listing mentions being compatible with Thunderbolt-capable MacBooks, again maybe suggesting the required USB-C ports are different from a standard USB-C connector? Or maybe it just means that only Thunderbolt Macs also have USB 3.1...

Also I realised I can't currently add a USB 3.1 Type-C PCIe card because they all require an x4 slot, and I only have an x1 free at the moment. At some point I might be able to free up an x4 slot if I replace my two X520-DA1 10GbE cards with a single X520-DA2, but for now I'd prefer to stick to x1 PCIe USB 3 cards.

So given my USB 3 will come from a PCIe 2.0 x1 card (with a maximum bandwidth of 500MB/s / 4Gb/s), maybe I need to stick to something that specifically states it works with USB 3.0, like the dongle I listed at the top. Maybe these cheap USB-C hubs aren't going to work with my setup, even with a 1080p screen? (I see that 1080p @ 60Hz requires 475MB/s, just within the 500MB/s limit of PCIe 2.0 x1.)
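For anyone wanting to sanity-check that arithmetic, here's a rough back-of-envelope sketch (my own numbers, not from any adapter's spec sheet; 148.5 MHz is the standard HDMI 1080p60 pixel clock, and real DisplayLink-style USB traffic may well be lower because the stream is compressed):

```python
# Back-of-envelope uncompressed bandwidth for 1080p @ 60 Hz, 24-bit colour.
# Real USB traffic from these adapters may be lower due to compression.

def video_bandwidth_mb_s(width, height, refresh_hz, bytes_per_pixel=3):
    """Active-pixel data rate in MB/s (ignores blanking intervals)."""
    return width * height * refresh_hz * bytes_per_pixel / 1e6

active = video_bandwidth_mb_s(1920, 1080, 60)   # ~373 MB/s, pixels only
full = 148.5e6 * 3 / 1e6                        # ~446 MB/s incl. blanking
#   (148.5 MHz is the standard HDMI 1080p60 pixel clock)

pcie_2_x1 = 500  # MB/s, theoretical max of a PCIe 2.0 x1 slot
print(f"active {active:.0f} MB/s, full timing {full:.0f} MB/s, "
      f"limit {pcie_2_x1} MB/s")
```

Either way it lands uncomfortably close to the 500MB/s ceiling of the slot.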


OK, to answer my own question on the USB-C adapters: I am now almost certain that these require special host hardware.

 

Certain laptops and phones have an alternate graphics path (USB-C DisplayPort Alternate Mode) that routes the existing GPU's output over the USB-C connector. It's not a USB monitor as such; USB-C is just an alternate way to interface with the existing GPU.

 

However, I am pretty sure the USB 3.0 device I listed is an actual GPU-in-a-box type of thing, which would explain why it's both more expensive and less capable (1080p only), and pretty chunky given it's 'only' an HDMI output. I believe it's more than just an HDMI output: it also has the hardware to generate the picture.


In the end I ordered this one: ClimaxDigital CUH350D USB 3.0 to DVI, VGA, HDMI Adaptor for Multiple Monitors - DisplayLink DL-3500 Chipset
 
USB 3.0 to DVI with an HDMI adapter. Confirmed in the reviews to work in 10.13, and it supports 1920x1200 at 60Hz, which is what I'd prefer to use rather than 1080p.
 
I also bought a second USB 3.0 card (an Inateck using the FL1100 chipset). I figured that 1920x1200 @ 60Hz is going to use pretty much all of the 500MB/s bandwidth I can get from my PCIe 2.0 x1 USB 3 card, so I'll want to dedicate a whole USB 3.0 card to the external monitor, leaving the other card for all other USB 3 connections. Although it's possible there's compression involved on the USB 3 side of the connection, so it doesn't require all the bandwidth that the connection to the monitor uses; I'm not sure. None of the descriptions for the various adapters mention bandwidth concerns, but then they're probably assuming onboard USB 3.0.
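The same back-of-envelope maths for 1920x1200 @ 60 Hz (again a sketch only: 154 MHz is the CVT reduced-blanking pixel clock for that mode, and any compression on the USB side would reduce the real traffic):

```python
# Rough uncompressed figures for 1920x1200 @ 60 Hz, 24-bit colour.
active_mb_s = 1920 * 1200 * 60 * 3 / 1e6   # ~415 MB/s, active pixels only
cvt_rb_mb_s = 154e6 * 3 / 1e6              # ~462 MB/s with reduced blanking
headroom = 500 - cvt_rb_mb_s               # vs PCIe 2.0 x1's ~500 MB/s max
print(f"{active_mb_s:.0f} MB/s active, {cvt_rb_mb_s:.0f} MB/s total, "
      f"{headroom:.0f} MB/s spare")
```

So only a few tens of MB/s of headroom on that slot, which is why a dedicated card seemed sensible.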
 
So the total price, including the second PCIe USB 3.0 card, was £57.  Not cheap exactly, but if it all works as planned I think it'll be a nice upgrade.  
 
It will hopefully take me to a total of six screens: four on my NVidia card, my iPad as a combined screen/Touch Bar using Duet, and then a sixth monitor via the USB 3.0 adapter. In practice this means four 'work' screens; the iPad is mounted above my keyboard purely for my email client and the virtual Touch Bar, and one of the other screens is a 32" TV that I only ever watch videos on. So right now I have three screens for everything else, which hopefully becomes four.
 
No it's not excessive at all :)


Well that didn't go well. 

 

The DisplayLink device works perfectly in Windows 10. In macOS it's... interesting.

 

[screenshot: DisplayLink monitor corruption alongside two working NVidia screens]

 

Basically it can display a picture, but any screen change at all shows up as black or causes corruption. So when first enabled it does show the background and menu bar - though the background image is usually smeared on the right-hand side - but as soon as you do anything on it, even move the mouse pointer, you get black wherever the screen should have updated.

 

I'm certain these must be driver problems, not hardware problems. It worked first time in Windows, and on macOS I tried three different displays with three different cables, as well as connecting via three different USB controllers (both of my USB 3.0 PCIe cards, plus the motherboard's internal USB 2.0). I also tried different resolutions right down to 1024x768, reverting to the native drivers for my NVidia 760 card, and disconnecting a couple of my other screens to reduce the total number in use. All with identical results.

 

The only time I get anything different is when I switch to mirrored display mode, so that all displays show the same picture. In that case I don't get black corruption on the DisplayLink monitor: I can move stuff around and see it. But the picture is still broken. Only part of the desktop appears, at the wrong resolution and off-centre; the rest of the screen is filled with bluey-grey, and there's a second menu bar at the top. It's like it's trying to show the screen twice or something. The screenshot above shows three screens in mirrored mode: two NVidia screens working normally, and the DisplayLink monitor on the left doing its own weird version.

 

The DisplayLink macOS troubleshooting page is a huge list of possible problems, though most of the serious ones (including the sort of graphical corruption I'm seeing) were supposedly fixed as of High Sierra 10.13.

 

More specifically, they're supposed to be fixed by having Metal support. The Apple support page for Metal says it's present in iMacs, MacBooks and Mac Pros from 2013 onwards. I'm emulating a 2010 Mac Pro, but I'm pretty sure I do have working Metal: I ran a MetalTest app that confirmed my NVidia 760 is a 'Metal Device', and GeekBench can use it to produce a Metal benchmark score.

 

I suppose it's possible the drivers check for a specific SMBIOS and don't enable Metal support without the right one, so I might try changing that. But even so, it should still work in some fashion, even if there's a long list of possible problems; I can't get it to be usable for even a second.

 

So I fear it's a Hack issue. In fact one of the troubleshooting pages from DisplayLink says "try resetting the SMC", so maybe it's accessing hardware at a lower level than can work with a Hack.

 

I will raise an issue with DisplayLink, though I have zero hope of success.

 

Oh well, I guess I should have known it would be troublesome :(


Actually I just realised I could have done something far, far easier than all this messing about.

 

I can buy an NVidia GPU on a PCIe x1 card. Assuming macOS's native or NVidia drivers will detect two different GPUs and use outputs from both, this would work far better, and it's actually cheaper than what I spent on the USB stuff.

 

I reckon this DisplayLink thing is going to be returned to Amazon.


