HDMI vs DVI

For system help, and all hardware/software topics. NOTE: use Coders Corner for all coding topics.

Moderators: Krom, Grendel

woodchip
DBB Benefactor
Posts: 17673
Joined: Tue Jul 06, 1999 2:01 am

HDMI vs DVI

Post by woodchip »

I'm a little confused here. I bought the Samsung T260, and on the back I can connect either DVI or HDMI.
My Sapphire 4870 vid card has only a DVI port, but it does come with an adapter for HDMI. So which connection will give me the best visuals? As I understand it, DVI is an 8-bit connection, but HDMI can be 8-bit or higher. I just don't know if I will see any difference if I adapt and use HDMI.
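
A quick back-of-the-envelope check on the bandwidth side, assuming the T260 runs at its native 1920x1200 at 60 Hz with CVT reduced-blanking timings (the timing totals below are that assumption; the 165 MHz figure is the single-link DVI limit from the DVI spec):

# Pixel clock needed for 1920x1200@60 (CVT-RB: 2080x1235 total pixels)
h_total, v_total, refresh = 2080, 1235, 60
pixel_clock_mhz = h_total * v_total * refresh / 1e6
print(f"{pixel_clock_mhz:.1f} MHz vs the 165 MHz single-link DVI limit")  # ~154 MHz

That fits comfortably in single-link DVI, and at this resolution both DVI and HDMI carry the same 8-bit-per-channel video, so the picture should be identical either way.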
Thanks
Krom
DBB Database Master
Posts: 16042
Joined: Sun Nov 29, 1998 3:01 am
Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?

Post by Krom »

They are essentially the same, but HDMI implies support for all the fancy DRM encryption nonsense that the MPAA insisted on having. Of course, this is no guarantee that the encryption actually works over a DVI adapter, and you are probably screwed out of watching legal HD content anyway if your monitor doesn't fully support HDCP. (Pirated content should still work fine though, so no worries.)

Basically, every new standard added to display interfaces in recent history has nothing to do with actually improving the quality, bandwidth, or ease of use of the connection; it is all for the purpose of content protection and encryption.
fliptw
DBB DemiGod
Posts: 6458
Joined: Sat Oct 24, 1998 2:01 am
Location: Calgary Alberta Canada

Post by fliptw »

HDCP is supported over DVI. Your monitor just needs to support HDCP to watch content that's protected with it.
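
For what it's worth, on a Linux box you can at least see what the kernel thinks is attached to each connector; the HDCP handshake itself happens inside the driver. A minimal sketch, assuming the usual /sys/class/drm layout:

import glob, os

# List DRM connectors and whether a display is attached
for conn in sorted(glob.glob("/sys/class/drm/card*-*")):
    status_path = os.path.join(conn, "status")
    if os.path.isfile(status_path):
        with open(status_path) as f:
            print(os.path.basename(conn), f.read().strip())  # e.g. "card0-DVI-I-1 connected"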
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

HDMI also has an audio channel.
fliptw
DBB DemiGod
Posts: 6458
Joined: Sat Oct 24, 1998 2:01 am
Location: Calgary Alberta Canada

Post by fliptw »

Audio channels... I'm pretty sure HDMI isn't mono.

Though I wouldn't put it past them...
Krom
DBB Database Master
Posts: 16042
Joined: Sun Nov 29, 1998 3:01 am
Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?

Post by Krom »

HDMI has room (bandwidth) for 8-channel lossless PCM audio. However, due to the strict content protection demanded by the industry, and general laziness on both sides, no current PC implementations of HDMI support it; the best you can do over HDMI from a PC is the lossy audio formats.
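
The arithmetic behind that headroom claim, using the HDMI 1.x LPCM maximums (8 channels of 24-bit audio at 192 kHz):

# 8-channel lossless PCM at the HDMI 1.x spec maximums
channels, bits, rate = 8, 24, 192_000
mbps = channels * bits * rate / 1e6
print(f"{mbps:.2f} Mbit/s")  # ~36.86 Mbit/s, a sliver of the multi-Gbit/s video link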
woodchip
DBB Benefactor
Posts: 17673
Joined: Tue Jul 06, 1999 2:01 am

Post by woodchip »

Looking at the monitor documentation a bit closer, it looks as though the HDMI connection is more for something like a standalone DVD player.
Canuck
DBB Admiral
Posts: 1345
Joined: Tue Jun 12, 2001 2:01 am

Post by Canuck »

Basically, the DVI signals are the same as the HDMI signals, at least to some extent; the evidence is that you can use a DVI-to-HDMI converter to provide video, but not audio, to your set. Your TV wants the whole deal on the HDMI input and doesn't allow an analog audio-in on that input like a Sony or Sharp would.
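
That lines up with how the sink side works: the video signaling is the same, and an HDMI display simply advertises the extras (audio and so on) in its EDID through a vendor-specific block carrying the IEEE OUI 00-0C-03. A rough way to check for that block on Linux, assuming the usual /sys/class/drm layout; the substring scan here is a heuristic, not a proper EDID parser:

import glob

HDMI_OUI = bytes([0x03, 0x0C, 0x00])  # OUI 00-0C-03, stored little-endian in the EDID

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) > 128 and HDMI_OUI in edid[128:]:  # scan the CEA extension block
        print(path, "-> HDMI sink (audio possible)")
    elif edid:
        print(path, "-> DVI-style sink (video only)")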

If you want to use your computer's HDMI out, that is a crapshoot, as manufacturers sometimes don't include the audio-out component or cabling between devices needed to achieve this. If you just want video on the TV and audio on the stereo, then you're set; use the HDMI. *Edit:

Umm, maybe use the VGA out; I would, after reading the specs on the vid card.

From ATI; what a burn on the HDMI out! 1080i out max on HDMI...

Q: I have a query: your product page says that the Radeon HD3850 series has "Hardware processed 1080p video playback of Blu-ray and HD DVDs", however the spec spreadsheet shows that the maximum HDMI output mode is 1080i. I am confused which is correct, or is it both, as in the card will decode 1080p video but is only able to output 1080i through the HDMI interface?
A: The "Hardware processed 1080p video playback of HD DVDs and Blu-ray" means that our GPU can support up to 1080p decoder bandwidth, while the TV-out and HDMI maximum of 1080i means that the "DISPLAY" mode can support up to 1080i. These are 2 different things. If you do not have HDMI 1080i support, just use a CRT monitor or traditional TV to see the HD DVD or Blu-ray DVD; our GPU can still decode 1080p video, and you can still see it on the CRT or traditional TV. (071228)


I have a converter box that takes the VGA and analog audio out and converts it to full HDMI 1.3a, if you run into difficulties. My cost was $350.00 or so.
woodchip
DBB Benefactor
Posts: 17673
Joined: Tue Jul 06, 1999 2:01 am

Post by woodchip »

Why would one be concerned about audio in the stream when most of us would have a sound card and separate speakers?
Mr. Perfect
DBB Fleet Admiral
Posts: 2817
Joined: Tue Apr 18, 2000 2:01 am
Location: Cape May Court House, New Jersey.

Re:

Post by Mr. Perfect »

It seems like it's mostly to reduce cable clutter. Running one cable is easier than separate video and audio lines. Who knows, maybe they can use HDMI audio for additional nifty effects too.
Canuck wrote: I have a query, your product page says that the Radeon HD3850 series has "Hardware processed 1080p video playback of Blu-ray and HD DVDs" however the spec spreadsheet shows that the maximum HDMI output mode is 1080i.
Does that FAQ apply to the new 4870, though? Woodchip doesn't have the 3000-series part that the question is about.
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Re:

Post by Grendel »

woodchip wrote: Why would one be concerned about audio in the stream when most of us would have a sound card and separate speakers?
If you don't have an HD/BR player, there's no concern. If you do, and you want full HD content played back, things get a bit more complicated (DRM); otherwise you should be fine using the DVI port.
Mr. Perfect wrote:Does that FAQ apply to the new 4870 though? Woodchip doesn't have a 3000 series part that the question is about.
No, the 4000 series does 1080p plus up to 7.1 audio over HDMI.
Canuck
DBB Admiral
Posts: 1345
Joined: Tue Jun 12, 2001 2:01 am

Post by Canuck »

My mistake; they had that note in the FAQ for the 4870.
HDMI carries audio and video on one cable; great when it works.
Foil
DBB Material Defender
Posts: 4900
Joined: Tue Nov 23, 2004 3:31 pm
Location: Denver, Colorado, USA

Post by Foil »

From my personal experience:

Mine is an HD3450 (not as fast as the 3850 or 3870, but with the same feature set), and it does everything perfectly as advertised via the HDMI adapter, including 1080p and the 5.1 audio.

I use it along with a Blu-Ray / HD-DVD combo drive for hi-def movies, as well as streaming video from Netflix and such. I used to have issues with DRM/HDCP with my old video card, but the ATI 3xxx+ cards work just fine.

It plays both Blu-Ray and HD-DVD at 1080p just fine; it really is 1080p (1920x1080 at 60 Hz, progressive), I've checked. Also, the 5.1 audio works perfectly through the Realtek chip on the video card, over the HDMI. I just had to install the Realtek driver.
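
One quick way to make that kind of check on Linux is to look for the active-mode marker in xrandr's output (assuming xrandr is available; Windows shows the same thing in the display control panel):

import subprocess

out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "*" in line:          # xrandr flags the mode currently in use with '*'
        print(line.strip())  # e.g. "1920x1080  60.00*+"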

It doesn't look like you're planning to use the HDMI audio, so there's no real advantage to using HDMI. However, if you do, and you have any issues with it, let me know. I did a lot of research when building my home-theater box.

[Edit: I've also used a DVI->HDMI converter cable to connect to my HDTV with my NVidia 8800 card. It worked just fine for 1080p, no DRM issues. The only difference I noted was that the ATI control center is a bit more flexible than the NVidia system about scaling to handle HDTV overscan.]
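
If anyone hits that overscan problem without a scaling option in the driver control panel: some Linux drivers (radeon, for one) expose an underscan property on the HDMI output that adds compensating borders. A sketch, where the output name "HDMI-0" and the border sizes are assumptions, and the property only exists if the driver provides it:

import subprocess

# Border values (in pixels) are guesses; tune them until the desktop edges are visible.
subprocess.run(["xrandr", "--output", "HDMI-0", "--set", "underscan", "on",
                "--set", "underscan hborder", "40",
                "--set", "underscan vborder", "25"], check=True)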