The HDCP stuff only applies if the media you are trying to play is
protected. If you hook the PC up to play games or anything else
unprotected, it will send unencrypted data to the TV, which the TV will
happily display.

The biggest problem I have seen is that most TVs lie about their
capabilities. Their EDID data is generally just plain wrong (my screen
has the EDID data from a 17" LCD in it). You need to check whether the
TV will let you drive the panel pixel for pixel, i.e. natively address
every pixel on the screen (see the sketch at the end of this message).
Most TVs these days seem to accept only a few "standard" resolutions and
then scale those to fit the panel, which makes them useless for
displaying text and leaves any HD input looking blurry.

And you don't want to use VGA for 1080p; it's just silly.

John Dammeyer wrote:
> Hi all,
>
> I'm starting to drool over the latest crop of 46" LCD TVs with 1080
> resolution. But I remember reading an article in EDN back in the spring
> where apparently the author claimed that none of the PCs with video
> cards will actually be able to display in that resolution because of the
> copy protection features of HDMI and that the TV itself will degrade the
> image to 720p even if presented with 1080p.
>
> There are so many acronyms and various 'features' it's really hard to
> sort out the wheat from the chaff.
>
> The local Sony store says, it's no big deal, just use the 15 pin VGA
> connector. Seems to me if a PC video card has an HDMI output and I'm
> drawing graphics or presenting an image that is 1080 resolution,
> shouldn't the TV show it?
>
> The article is here:
>
> http://www.edn.com/index.asp?layout=article&articleid=CA6413792
>
> The question is of course: Is this still the case or since there's such
> a lag between articles and publication dates is this information now
> really out of date?
>
> John
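
If you want to see what a set actually claims before worrying about any
of this, the EDID block is only 128 bytes and easy to decode. Below is a
minimal C sketch, assuming you already have a raw EDID dump in a file
(on Linux the read-edid package's get-edid will produce one); the
program and file names are just placeholders. It sanity-checks the
header and checksum, then prints the resolution from the first detailed
timing descriptor, which EDID 1.3 normally treats as the preferred
(native) mode.

/*
 * edid_native.c - decode the first (preferred) detailed timing
 * descriptor from a raw 128-byte EDID block and print the mode the
 * display claims as native.
 *
 * Build:  cc -o edid_native edid_native.c
 * Usage:  ./edid_native edid.bin
 */
#include <stdio.h>

int main(int argc, char **argv)
{
    static const unsigned char magic[8] =
        { 0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00 };
    unsigned char edid[128];
    unsigned sum = 0;
    FILE *f;
    int i;

    if (argc != 2) {
        fprintf(stderr, "usage: %s <raw-edid-file>\n", argv[0]);
        return 1;
    }
    f = fopen(argv[1], "rb");
    if (!f || fread(edid, 1, sizeof edid, f) != sizeof edid) {
        fprintf(stderr, "could not read a full 128-byte EDID block\n");
        return 1;
    }
    fclose(f);

    /* Fixed 8-byte header, and all 128 bytes must sum to 0 mod 256.
     * A TV with a bogus EDID often fails right here. */
    for (i = 0; i < 8; i++) {
        if (edid[i] != magic[i]) {
            fprintf(stderr, "bad EDID header\n");
            return 1;
        }
    }
    for (i = 0; i < 128; i++)
        sum += edid[i];
    if (sum & 0xFF)
        fprintf(stderr, "warning: EDID checksum is wrong\n");

    /* The first detailed timing descriptor starts at byte 54. */
    {
        const unsigned char *d = edid + 54;
        unsigned pclk    = (d[0] | (d[1] << 8)) * 10;      /* kHz    */
        unsigned hactive = d[2] | ((d[4] & 0xF0) << 4);    /* pixels */
        unsigned vactive = d[5] | ((d[7] & 0xF0) << 4);    /* lines  */

        if (pclk == 0) {
            /* A pixel clock of zero means this slot holds a display
             * descriptor (name, serial, etc.), not a timing. */
            printf("first descriptor is not a timing descriptor\n");
            return 1;
        }
        printf("claimed native mode: %ux%u, pixel clock %u kHz\n",
               hactive, vactive, pclk);
    }
    return 0;
}

Feed it a dump from the set you are looking at; if the numbers do not
match the panel's advertised resolution, you know the scaler is going to
get between your PC and the pixels.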