Since you need to send an analogue signal describing the color of every single pixel in every single frame the monitor shows per second, you need a throughput of 1 600 000 bytes per second even if the screen is a poor 320x200 pixels with 16 colors at 50 frames per second (320 x 200 pixels x 50 frames x 0.5 bytes per pixel). That's why you NEED the "RAMDAC" (RAM Digital-to-Analog Converter), which is the device the video card is built around. The RAMDAC converts the contents of the video memory (which is dual-ported most of the time) into the big data stream which all computer and video monitors require.

In most PCs you have a much higher throughput than those 1.6 MB/sec - at 1024x768 resolution with 256 colors and 72 Hz you need a RAMDAC capable of about 56 MHz, etc... The Matrox Millennium has an extreme RAMDAC of 220 MHz or something, thus enabling comfortable refresh rates (stable video) without stepping down on the resolution or the size of the color palette.

I would guess you can buy a cheap obsolete VGA video card with 256 KB of video memory (enough for 640x400 in 256 colors) for 20 dollars today. With a video card, you can have your PIC change the image on the screen at any pace you like, and let the video card deal with the speedy monitor. For most applications, though, I think a simple PC would be a better alternative than getting a PIC to talk to a video card.

The obvious question now is: if man can transfer a high-quality image to a monitor about 100 times per second, why does it in 1997 have to take 2 minutes for a top-modern printer to start printing??? The resolution is only about 100 times better (300 dpi), so a couple of seconds would be acceptable, but 2 minutes??? It seems as if people in charge of research and development sometimes can be as incompetent as they want to, provided they have been to college long enough...
Regards,
Glenn
Sweden

(Sorry about the attachment WINMAIL.DAT - I am using MS Exchange, made by the jerks at Microsoft (also incompetent))

----------
From: Mailing List Account [SMTP:mlist@ALPHAX.COM]
Sent: 30 August 1997 3:41 AM
To: PICLIST@MITVMA.MIT.EDU
Subject: Monitors

How difficult is it to use a monitor with a microcontroller or microprocessor? As an example I'd like to use a VGA monitor, in 320x200, black and white (and eventually color). I couldn't find any schematics at all; do I need a CRT controller, or can I do it all from a micro?

Thanks,
Philip Lalone