Firstly - there are four common methods to transfer HD video across a cable:

1) VGA (DB-15) - analog RGB with clock and some data
2) Component video (YCbCr) - three analog signals
3) HDMI - digital, with sound
4) DVI - carries both digital and analog (RGB) signals, though not all cards support both modes

There is one content protection standard - HDCP - which encrypts digital video from 'protected' sources so it can't be copied from the cable. It only works on digital signals, so only DVI and HDMI can support HDCP. Not all DVI or HDMI sources or displays support HDCP.

Some signal sources (say, Windows Vista with an HDCP video card, or an HD-DVD or Blu-ray player) carry HDCP-protected content and will NOT output HD-resolution data unless they are connected to an HDCP-compliant monitor. They downgrade the signal if:

1) the user selects an analog output (DB-15, DVI analog, Component), or
2) the user selects a digital output to a monitor or device that does not support HDCP

So, under the following circumstances you _SHOULD_ be able to use the full resolution of whatever monitor you attach to whatever device (as long as the device supports the monitor's resolution):

1) you never view protected content (so far only HD-DVD, Blu-ray, and some downloadable media) - on any video connection, analog or digital, or
2) you view protected content, the monitor and signal source both support HDCP, and you are using a digital connection (DVI-D or HDMI)

I have a 24" monitor, 1920x1200, that does support HDCP over its HDMI connection. I also have Windows Vista. I do NOT have an HDCP video card (at least, I don't think I do...), and furthermore I connect it via the VGA connector (DB-15, analog), so I cannot play protected content. I am able to use the full resolution without problem, and I have no trouble watching full-resolution 1080i HD TV broadcast from local stations. IT IS GLORIOUS (and a lot cheaper than the 30" I truly desire). But I digress...
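The rules above boil down to one decision: you get full resolution if the content is unprotected, or if the entire path (source, cable, display) is HDCP-capable and digital. Here's a little sketch in Python that encodes those rules - the function and names are mine for illustration, not any real API:

```python
# Hypothetical sketch of the HDCP downgrade decision described above.
# These names don't come from any real driver API; they just encode the rules.

def output_resolution(content_protected: bool,
                      connection_digital: bool,
                      source_hdcp: bool,
                      display_hdcp: bool,
                      native_resolution: str = "1920x1200") -> str:
    """Return the resolution the source will actually send."""
    if not content_protected:
        # Unprotected content: full resolution on any connection,
        # analog or digital.
        return native_resolution
    if connection_digital and source_hdcp and display_hdcp:
        # Protected content over an end-to-end HDCP-capable digital link:
        # still full resolution.
        return native_resolution
    # Protected content on an analog link, or with a non-HDCP device
    # somewhere in the chain: the source downgrades the signal.
    return "480i (downgraded)"

# My setup: HDCP monitor, but analog VGA cable and (probably) no HDCP card -
# so protected content would be downgraded, everything else runs full-res.
print(output_resolution(content_protected=True, connection_digital=False,
                        source_hdcp=False, display_hdcp=True))
print(output_resolution(content_protected=False, connection_digital=False,
                        source_hdcp=False, display_hdcp=True))
```

Change any one of the three HDCP/digital requirements to False with protected content and you drop to the degraded output - that's exactly the disappointment the EDN article warns about.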
If I purchased an HD-DVD or Blu-ray drive, I would have to get an HDCP-compliant video card and use an HDMI cable (the monitor doesn't support DVI). Then I could still have full use of the screen under any conditions, because I'd be complying with the HDCP requirements. Alternately, I could buy an HD-DVD or Blu-ray player, hook it directly to the monitor's HDMI input, and it would work.

So: if you're spending a lot of money on a new monitor or TV, make sure it is HDCP capable. You will eventually use it with a device that requires content protection, and you will be disappointed to find it displaying 480i if you don't support HDCP.

And yes, my productivity has improved moving from two 19" CRTs to one 24" and one 19" CRT. I'd love to trade the remaining CRT out as well, but I'll have to wait a bit longer. http://www.newegg.com/Product/Product.aspx?Item=N82E16824255001 is what I got, for the curious: $370 after a $50 rebate. It's _very_ nice, but read the reviews to make sure you understand its little quirks. It's way too bright, for instance - I turn the brightness to 0 - more of a TV with a variety of inputs than a computer monitor.

Lastly, to be concise, the only reasons you might not get full resolution on your monitor when hooked to a computer are:

1) the monitor doesn't support your video signals (unlikely - check the monitor's specs)
2) your video card doesn't support your monitor's requirements (unlikely - check the video card's specs and get the latest drivers)

(#1 and #2 are generally not problems unless you buy bottom-of-the-barrel cheap monitors or video cards.)

3) you are attempting to watch protected content and are missing one of the requirements of HDCP:
   a) a compatible OS
   b) an HDCP video card
   c) an HDCP monitor
   d) a digital video cable
   (or replace a and b with any HDCP source, such as an HD-DVD player, etc.)

I hope you find this useful!

-Adam

On 10/31/07, John Dammeyer wrote:
> Hi all,
>
> I'm starting to drool over the latest crop of 46" LCD TVs with 1080
> resolution.
> But I remember reading an article in EDN back in the spring
> where apparently the author claimed that none of the PCs with video
> cards will actually be able to display in that resolution because of the
> copy protection features of HDMI and that the TV itself will degrade the
> image to 720p even if presented with 1080p.
>
> There are so many acronyms and various 'features' it's really hard to
> sort the wheat from the chaff.
>
> The local Sony store says it's no big deal, just use the 15 pin VGA
> connector. Seems to me if a PC video card has an HDMI output and I'm
> drawing graphics or presenting an image that is 1080 resolution,
> shouldn't the TV show it?
>
> The article is here:
>
> http://www.edn.com/index.asp?layout=article&articleid=CA6413792
>
> The question is of course: Is this still the case, or since there's such
> a lag between articles and publication dates, is this information now
> really out of date?
>
> John
>
> --
> http://www.piclist.com PIC/SX FAQ & list archive
> View/change your membership options at
> http://mailman.mit.edu/mailman/listinfo/piclist

--
Moving in southeast Michigan? Buy my house: http://ubasics.com/house/
Interested in electronics? Check out the projects at http://ubasics.com
Building your own house? Check out http://ubasics.com/home/