Connecting with DVI


DVI (Digital Visual Interface) is the newer digital connector that is quickly replacing VGA for hooking up your average flat-panel monitor to your PC. Projectors, HDTVs, and other digital equipment are also making use of this new connection.

However, just going out and picking up a new DVI cable isn't as easy as it might sound ... something I learned the hard way recently when trying to connect my Media Center PC to my HDTV. So, I thought I'd share a little knowledge I gleaned from that experience.

Single- vs. Dual-Channel DVI

The first thing you need to know about DVI is that it can be either single- or dual-channel (the spec calls these "single-link" and "dual-link," which is how you'll usually see cables labeled). In a nutshell, a dual-channel DVI cable can send twice as much information as a single-channel cable. A single DVI link consists of four twisted pairs (three for data, one for the clock) transmitting roughly 2.6 megapixels at 60Hz with a color depth of 24 bits/pixel, giving it approximately the capability of a high-quality analog video signal. A dual-channel cable simply adds a second set of data pairs, so twice as much data can be transmitted at the same time.
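If it helps to see that difference as numbers, here's a rough back-of-the-envelope sketch in Python. The 2.6 megapixel and 24 bits/pixel figures are just the approximate ones quoted above, not spec-exact values:

    # Rough model of single- vs. dual-channel DVI capacity, using the
    # approximate figures from this article (not spec-exact numbers).
    SINGLE_LINK_MEGAPIXELS_AT_60HZ = 2.6  # approximate single-link ceiling
    SINGLE_LINK_BITS_PER_PIXEL = 24       # one link carries 24 bits/pixel

    def link_capacity(channels):
        """Megapixels at 60Hz and bits/pixel for a 1- or 2-channel cable."""
        return (SINGLE_LINK_MEGAPIXELS_AT_60HZ * channels,
                SINGLE_LINK_BITS_PER_PIXEL * channels)

    print(link_capacity(1))  # (2.6, 24) -- single-channel
    print(link_capacity(2))  # (5.2, 48) -- dual-channel: twice the raw capacity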

What does this mean practically? If you want a resolution of 1024x768 @ 60Hz with 32-bit color (the typical resolution for my scenario ... hooking my TV up as a glorified monitor), then you require 1024 * 768 = 786,432 pixels = 0.75 megapixels. So, no problem? Well, hold on. A single DVI link transmits 24 bits/pixel, but this scenario calls for 32 bits/pixel (bits per pixel is what defines color depth). The specification demands that the leftover 8 bits be sent over the second channel, so a dual-channel connection and cable are required. Something to keep in mind.
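Here's that rule of thumb spelled out as a quick Python check (my own sketch based on the figures above, not an official calculator):

    # Does a display mode need the second DVI channel? Per the rule above:
    # yes, if it exceeds 24 bits/pixel or the ~2.6 MP single-link ceiling.
    def needs_dual_channel(width, height, bits_per_pixel):
        megapixels = width * height / 2**20  # "binary" megapixels, matching the 0.75 figure above
        return bits_per_pixel > 24 or megapixels > 2.6

    # My scenario: 1024x768 @ 60Hz with 32-bit color.
    print(1024 * 768)                         # 786432 pixels = 0.75 megapixels
    print(needs_dual_channel(1024, 768, 32))  # True -- 32bpp exceeds the 24bpp link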

What about full HD? If you're pumping out a video signal at 1080i, the numbers change quite a bit. 1080i (standard high-def) is basically 1080 scan lines (hence the name) with 1920 pixels/line, for a total of 1080 * 1920 = 2,073,600 pixels ... just shy of 2 megapixels. Again, the second channel is required to get a greater than 24-bit color depth.
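Running the same check against the 1080i numbers (reusing the needs_dual_channel sketch from above):

    # Assumes needs_dual_channel is defined as in the earlier sketch.
    print(1920 * 1080)                         # 2073600 pixels, just shy of 2 MP
    print(needs_dual_channel(1920, 1080, 24))  # False -- 24-bit fits on one link
    print(needs_dual_channel(1920, 1080, 32))  # True  -- deeper color needs link two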

So, the net of the discussion is to spring for the dual-channel cable. Even though it's unlikely you'll be hooking up anything that requires a resolution higher than 1920x1440 (which no LCD monitor you're likely to own supports anyway), you'll probably want the deeper color depth, you'll want to be prepared for the future, and the cost difference is minimal.

How do I tell the cables apart?

Well, the single-channel vs. dual-channel question isn't the only one to consider here, but it's definitely a factor. Also in play is that some DVI cables are built to carry analog signals as well. So, if you're going to use a DVI cable to hook up a CRT, that's important to you.

Here's a quick guide to the no-less-than-five different kinds of cables out there, so you can tell them apart. Observe the image above; it pretty much spells out the configuration differences.

- DVI-I cables support both analog and digital connections.
- DVI-D is for digital connections only. It's very commonly sold as just "DVI," which is probably no issue for you, since you're most likely hooking up to an LCD monitor, HDTV, or the like ... but be aware of what you're buying.
- DVI-A is analog only ... probably not what you're looking for, so be careful with cables priced dramatically less than others that sound the same in the limited information a number of sites will give you prior to purchase.

And of course, we've already covered the single- vs. dual-channel component of the technology: DVI-I and DVI-D each come in both versions, which is how you get to five.

More Info

By way of reference, in case you want to read up on more of this subject: both the image and much of the data in this article come from the DVI entry on Wikipedia.

Not All My Questions Have Been Answered

So, the last thing I'm left to ponder is this. Pretty much everything I've read tells me that DVI and HDMI should be somewhat interchangeable ... that for what I'm doing, either should work. It seems intuitive that it shouldn't make any difference whether I hook my HDTV to my PC via a DVI-to-DVI cable or a DVI-to-HDMI cable. But it does. I've tried both, and with the DVI-only solution, I can set my PC to pretty much whatever resolution I want at 60Hz without issue. But with the DVI-to-HDMI connection, I have lots of problems. Above 800x600, the image is totally messed up. It's not blurry, but it is unreadable ... like a double image, one a half-inch above the other. It looks like an interlacing problem, but a DLP TV like mine doesn't work that way, so that's not the problem.

I've also posted to the forums with this question, in case you want to follow up on this particular aspect of the DVI adventure. Hopefully, the rest of this article will be informative nonetheless.

 
