Teardown: Sienoc Active Mini-DisplayPort to DVI Adapter

Those who don’t upgrade their displays very often can slowly find that their displays fall behind in terms of connectivity options. More modern graphic cards offer an array of multi-monitor output options, but mostly using DisplayPort (DP) or Mini-DisplayPort (mDP) connections rather than the VGA or DVI of years past. Unfortunately, adapting between DisplayPort and these older connections isn’t as straightforward as it might seem.

Adapters generally come in two flavours – passive and active. A passive adapter is just “wiring” with no intelligence whatsoever and normally goes from DP/mDP to DVI. It relies on the display controller to put out a DVI-compatible signal – ports which can do this are known as dual-mode ports and are normally marked with two + symbols (the “DP++” logo). If the type of adapter is not marked and the price is relatively cheap, it’s most likely a passive adapter.

The other type is known as an active adapter, which contains some integrated circuitry inside the adapter to drive the screen. It talks to the display controller as a “native” DP device and does not need any dual-mode support from the controller. These adapters are generally more expensive, and used to require additional power in the form of a USB connection. More modern adapters are able to receive enough power over the DP connector that an auxiliary power supply is no longer required. It can be hard to distinguish this type from a passive adapter on appearance alone.

Information about which display controller ports are and aren’t dual-mode is sometimes hard to come by, as ports are sometimes unmarked and manufacturers don’t clearly document them. Furthermore, passive adapters are limited in their maximum output resolution, being only suitable up to about 1600×1200 or 1920×1200 at 60Hz. Users wishing for higher resolutions (albeit with limited refresh rates, due to the data rate limitations of DVI) need to go for active adapters.
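To see where that resolution ceiling comes from: a passive adapter can only carry a single-link DVI signal, which is capped at a 165 MHz pixel clock. A quick sketch (using the commonly quoted CVT reduced-blanking timing totals; the computed clocks are approximate) shows why 1920×1200 at 60Hz is about the limit:

```python
# Passive DP->DVI adapters only carry single-link DVI, capped at a
# 165 MHz pixel clock. Timing totals below are the commonly quoted
# CVT reduced-blanking figures; computed clocks are approximate.
SINGLE_LINK_DVI_LIMIT_HZ = 165e6

modes = {
    "1920x1200@60": (2080, 1235),  # (htotal, vtotal), CVT-RB
    "2560x1440@60": (2720, 1481),
}

for name, (htotal, vtotal) in modes.items():
    pclk = htotal * vtotal * 60  # approximate pixel clock in Hz
    fits = pclk <= SINGLE_LINK_DVI_LIMIT_HZ
    print(f"{name}: ~{pclk / 1e6:.1f} MHz -> "
          f"{'OK on a passive adapter' if fits else 'needs an active adapter'}")
```

1920×1200 comes in at roughly 154 MHz, just under the single-link limit, while 2560×1440 needs well over 165 MHz – hence the need for an active adapter (or dual-link DVI) beyond that point.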

Another good reason for needing active adapters is to make the most of the multi-monitor capabilities of modern graphic cards.

Going Active …

My main workstation has four monitors of disparate makes. Formerly, I had one graphic card driving three of them – one directly, two via a Matrox DualHead2go, with the last one being driven over DisplayLink USB. The quest for more performance eventually led me to ditch the two hacks and instead go with two non-SLi graphic cards – namely a GTX580 and a GTX560ti, each driving two monitors “independently”. Over time, this arrangement came to be quite indispensable, although one of the cards had a cooler failure and was replaced with a spare card. I couldn’t help but think that their time was running out.

Sadly, such an arrangement does consume vast amounts of energy, mainly due to the high idle and active power consumption of these older graphic cards. I was able to pick up a Palit Jetstream GTX970 at a good price a while back with the intention of eventually doing an upgrade. Part of the allure was that the GTX970 can drive four displays on its own. The card itself offers three mDP ports and a DVI-I port. All of my monitors were DVI except one which was VGA only.

The problem was how to get four displays running on the one card. Many users who had bought passive adapters complained of additional screens which couldn’t be enabled or just didn’t show up.

A little bit of digging seemed to suggest that Nvidia cards are capable of driving at most two legacy monitors (i.e. DVI or VGA) at a time. Reaching the magical number of four thus requires that no more than two of the monitors are legacy. Similar restrictions exist for AMD Eyefinity set-ups, although the exact numbers may differ.
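The rule of thumb I ended up working to can be summarised in a few lines. This is a simplification of the behaviour observed on my card, not an official Nvidia specification – the exact limits vary by GPU and by the OEM’s choice of ports:

```python
# Rough model of the head-count restriction: up to four displays in
# total, but no more than two may be "legacy" (DVI/VGA) signals.
# An active adapter makes a DVI monitor look like a native DP sink,
# so it doesn't count against the legacy limit.
# (Simplification of observed behaviour, not an Nvidia spec.)

def config_is_valid(connections):
    """connections: list of 'dp', 'dvi' or 'vga', one per monitor."""
    legacy = sum(1 for c in connections if c in ("dvi", "vga"))
    return len(connections) <= 4 and legacy <= 2

# Three passive DVI adapters plus one VGA: rejected.
print(config_is_valid(["dvi", "dvi", "dvi", "vga"]))  # False
# Three active adapters (seen as DP) plus one VGA: fine.
print(config_is_valid(["dp", "dp", "dp", "vga"]))     # True
```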

That was why I decided to seek out three active mDP to DVI adapters so my legacy monitors would appear to be native DP monitors, leaving just the one VGA monitor on the DVI-I port through a passive adapter. Active adapters can be found costing upwards of AU$30 at many places, but a quick search on eBay led me to some “unbranded” Sienoc adapters for just AU$13.90 each. Of course, buying a complement of three makes for about 10% of the cost of a graphic card, which isn’t really that cheap, but it was worthwhile for me as it would allow me to gain more graphic performance, save energy, and free up some PCIe slots for other expansion cards.

The first set of adapters I purchased never arrived, wasting over 50 days of my time and resulting in a refund. I eventually went with a different seller, waiting another 30 days for their arrival …

The Product


The item comes in a resealable white plastic hanger bag. The printing on the adapter is different to that on the eBay listing and it only has a little “tag” to brand the unit. As a result, it’s likely to be a very generic unit, but it does have a fairly respectable design which may lead the unit to be resold under various different brand names. Internally, the shiny plastic surfaces are protected with film and the mDP plug is protected with a cap.


The printing itself is not particularly elegant, and was slightly shaky and worn at one side. The finish is a shiny piano-style plastic, but it really doesn’t matter because most people would have this somewhere out of sight. The incoming cable does have some strain relief.


The front side has a DVI connector with a shroud, resulting in an adapter which is “Apple-esque” but black in colour.


Of course, the other end has a mDP connector. No other inclusions are provided because the unit is pretty much plug-and-play. All power required is drawn from the mDP connector, and thus, no additional USB connector is required.



To tear it down, you must first remove the front shroud by prying on it. Then, it’s a matter of pushing the cable into the housing, which pushes the internals out.


The PCB is marked MDP-D5-V1.2 and features a Parade PS171 chipset as advertised. It is definitely an active adapter. The incoming wire is “glued” down to avoid any loose connections. There seems to be a serial flash IC for configuration data.


The other side hosts a crystal and a few small unidentified SMD ICs, but this is predominantly a single-chip solution.

Benefits and Drawbacks


On the whole, having four monitors on the one card is a dream come true at long last.


Everything is detected as it should be, the IDs and modes come up just fine and all monitors can be activated. The active adapters work just as I expected – plug and play, and definitely money well spent. While I didn’t have anything that truly required or pushed the limits of the adapters, they seemed to be made of the right stuff.

The upgrade itself had a marked impact on the temperature in my room. The older GTX580 + GTX560ti combination of cards could easily have put out about 70W at idle alone, as running with dual monitors does not allow the cards to downclock. The four fans from the two cards were also fairly noisy. The new card is much cooler, and with the good airflow in my case, the fans are often at 0rpm even when watching videos, and so my room is quite a bit quieter.

In fact, judging from this review, the GTX970 consumes about 23 watts less at idle than the GTX580. As I also had a GTX560ti, whose idle draw would have been about 35 watts, the total idle saving comes to about 58 watts. Full-load power savings would be around 200 watts (as one card is entirely eliminated, and the rest is made up by the difference in load power).

Let’s assume a saving of 58 watts, an upgrade price of $500 (card and adapters included), and an energy price of $0.25/kWh – the energy saved would repay the cost of the upgrade in just under four years of continuous operation. It’s probably not worth upgrading cards to save energy alone based on this, but it will also save on cooling bills and improve the comfort of my room. The break-even point is sooner than I expected, which shows just how much more efficient newer hardware has become. Of course, if I do more gaming, the payback time will shorten as the energy saved will be bigger in comparison.
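For the curious, the break-even figure works out as follows – a sketch using the estimates above and (optimistically) assuming 24/7 operation:

```python
# Payback arithmetic for the upgrade, using the estimates in the text:
# 58 W idle saving, AU$500 for card plus adapters, AU$0.25/kWh tariff,
# and continuous (24/7) operation.

saving_w = 58
upgrade_cost = 500.0   # AU$, card and adapters included
tariff = 0.25          # AU$/kWh

kwh_per_year = saving_w / 1000 * 24 * 365   # energy saved per year
dollars_per_year = kwh_per_year * tariff    # money saved per year
payback_years = upgrade_cost / dollars_per_year

print(f"~{kwh_per_year:.0f} kWh/yr saved, AU${dollars_per_year:.0f}/yr, "
      f"payback in {payback_years:.1f} years")
# -> ~508 kWh/yr saved, AU$127/yr, payback in 3.9 years
```

That 3.9-year figure is for idle savings alone; any time spent gaming widens the gap (roughly 200 W at full load) and shortens the payback accordingly.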

Another advantage is that video playback across all screens is much smoother – moving windows between monitors doesn’t lead to re-initialization of the decoder due to refresh rate differences, or unusual stutters or tears when decoding is done by one card and the result is copied from frame-buffer to frame-buffer over the PCIe link.

I’ve also tested sleeping and waking in the configuration with no problems, although sleeping and waking takes longer as each monitor is initialized in sequence rather than in parallel.

The additional graphics performance and freeing up a few PCIe slots is icing on the cake – now I can have more fun with other devices too.


In short, yes you can run four monitors from the one GTX970 card if you choose active adapters and spend a little more. Active adapters are required because the GPU cannot output more than two “legacy” signals at a time. Active adapters don’t need to break the bank if you’re willing to be patient and buy direct from China over eBay. I had my doubts, but when disassembled, these were the real deal, using the chipset exactly as described and all three worked flawlessly for me with my GTX970.

By displacing two older cards and consolidating all the monitors onto one card, I’ve saved energy, kept my room cooler and quieter, reduced the air-conditioning load, removed the quirks of dragging playing video from one monitor to another and ended up with improved graphics performance. It was an upgrade a long time in the making.

About lui_gough

I'm a bit of a nut for electronics, computing, photography, radio, satellite and other technical hobbies. Click for more about me!
This entry was posted in Computing. Bookmark the permalink.

3 Responses to Teardown: Sienoc Active Mini-DisplayPort to DVI Adapter

  1. sparcie says:

Not all Nvidia cards are equal when it comes to using legacy monitors – I’ve got some Quadro NVS 510 cards at work that are quite happy to drive 4 VGA monitors via passive mDP adapters. Of course those cards are quite different from those you’d normally use in a home desktop.

Luckily we’ve upgraded our displays now and are using DP for all of them.


    • lui_gough says:

      Very interesting to hear that! I wish they made it easier to find out just what sort of configurations are valid for the regular consumer cards because it seems it can also vary because of OEM customizations (i.e. their choice of ports to put on the rear).

      In other cases, some of the limitations are in software too – e.g. http://www.tomshardware.com/news/nvidia-linux-basemosaic-ubuntu-parity,24519.html

      – Gough

      • sparcie says:

        That’s interesting, you wouldn’t think they’d want to artificially limit them that way! Perhaps they wish to encourage people with those needs to get a quadro card, or as the article suggested Microsoft may have had a hand in it.

        The only way I know to find out is to look at detailed specs on the Nvidia website for the particular GPU. That of course only really gives you an indication, but most of the cards manufactured by others are capable of the full spec as long as they have the ports.

