I can think of a number of ways. Example: if you know you are connected to an inherently calibrated system - a PC's serial port, for example (the baud crystal is orders of magnitude tighter spec'd than the RC osc of the PIC) - you can "autobaud" detect the bit timing of an incoming serial stream. Doing this every so often over time can provide the data to derive a dt correction factor. Once the accumulated +/- dt becomes large enough to be accommodated by a single OSCCAL increment (4ns), you can adjust the bit timing code in your serial routines to reflect the fact.

It's not a perfect scheme, but given the resolution of the system, it tends to work nicely. If you design your code to trigger events from the timer, then the entire system can be adjusted to accommodate any drift. This should take care of all the factors: Vdd, temperature, and process variation. It does, however, rely on some knowledge of an external, indirect reference...

- Chuck Mauro

> -----Original Message-----
> From: Robert Powell [SMTP:RobertP@HLYW.COM]
> Sent: Thursday, February 26, 1998 9:56 AM
> To: PICLIST@MITVMA.MIT.EDU
> Subject: Re: Internal oscilator calibration
>
> Once upon a time I saw a project where the guy said he was able to
> calibrate timings based on the internal oscillator in a 12C5xx chip,
> and that he was able to compensate for the typical RC drift you get
> with temperature change, voltage change, phase of the moon, etc. Is it
> actually possible to compensate for the RC drift in your code, or am I
> crazy? :-)
>
> Robert Powell
> Programmer/Analyst
> Hollywood Entertainment
> Wilsonville, OR
> (503)570-5307