I think we just touched on the core of the change. 20 years ago, when I was just getting into microcontrollers, I bit-banged lots of different things: I2C, SPI, UART, PWM, RC servo timing, even USB on the AVR parts. I've done them all and more with nothing but GPIO and instruction counts. Today I can't remember the last time I bit-banged a common protocol. The silicon has become so much more capable while the price has kept falling.

I recall spending days playing with integer-math tricks to get resolution and scaling without using floating point, "because floating-point takes too long."

A couple of weeks ago I was walking a junior (three months out of college, so green I have to water him) through a recent project. Where I would have gone down the path of integer math, he had declared everything as floats. I was about to go into full lecture mode when we actually benchmarked the code: the processor still spends 99% of its time waiting for new data. There's just nothing to be saved in that project to justify the time it would take to "improve" the code.

-Denny (who is quickly becoming an old curmudgeon)
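P.S. For anyone who never had to do it, "GPIO and instruction counts" really was all there was to it. Below is a rough transmit-only sketch of a bit-banged 9600-baud UART in AVR-flavored C; the clock speed, port, pin, and loop counts are assumptions for illustration, and on real hardware you'd tune the bit delay on a scope and keep interrupts off around each byte:

#include <avr/io.h>
#include <util/delay_basic.h>

#define TX_PORT   PORTB            /* placeholder: any free GPIO port */
#define TX_DDR    DDRB
#define TX_BIT    PB0

/* Assuming an 8 MHz clock, one bit at 9600 baud is ~104 us, i.e. ~833
 * cycles.  _delay_loop_2() burns 4 cycles per pass, so ~208 passes,
 * minus a few for the code around it.  Tune the number on a scope. */
#define BIT_LOOPS 206

/* Clock one byte out as 8N1: start bit, 8 data bits LSB first, stop bit. */
static void uart_tx(uint8_t byte)
{
    TX_PORT &= ~(1 << TX_BIT);               /* start bit (low) */
    _delay_loop_2(BIT_LOOPS);
    for (uint8_t i = 0; i < 8; i++) {
        if (byte & 1)
            TX_PORT |= (1 << TX_BIT);        /* data bit high */
        else
            TX_PORT &= ~(1 << TX_BIT);       /* data bit low */
        byte >>= 1;
        _delay_loop_2(BIT_LOOPS);
    }
    TX_PORT |= (1 << TX_BIT);                /* stop bit (high) */
    _delay_loop_2(BIT_LOOPS);
}

int main(void)
{
    TX_DDR  |= (1 << TX_BIT);                /* TX is an output, idles high */
    TX_PORT |= (1 << TX_BIT);
    for (;;)
        uart_tx('U');                        /* 0x55 alternates bits; easy to scope */
}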
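The integer-math tricks were mostly variations on one idea: pre-scale by a power of two so every conversion becomes a multiply and a shift instead of a divide, let alone a float. A small sketch with made-up numbers (a 10-bit ADC and a 5.00 V reference, not anything from the project above):

#include <stdint.h>

/* Convert a 10-bit ADC reading (0..1023) to millivolts against a 5.00 V
 * reference, no floating point.  The exact value is adc * 5000 / 1023;
 * instead, fold the divide into one rounded constant computed at compile
 * time, so each conversion costs one multiply and one 16-bit shift. */
#define ADC_TO_MV_Q16  (((5000UL << 16) + 511) / 1023)   /* = 320313 */

static uint16_t adc_to_millivolts(uint16_t adc)
{
    return (uint16_t)(((uint32_t)adc * ADC_TO_MV_Q16) >> 16);
}

On an 8-bit part with no hardware divide, that multiply-and-shift runs in a handful of cycles, where emulated floating point would cost hundreds.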
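And a benchmark like the one that stopped my lecture doesn't need profiler support, either. One way that works on anything with a spare pin, sketched here with placeholder names standing in for the real application:

#include <avr/io.h>

#define BUSY_PORT  PORTB               /* placeholder: any spare GPIO */
#define BUSY_DDR   DDRB
#define BUSY_BIT   PB1

/* Hypothetical stand-ins for the real application code. */
static void wait_for_data(void)  { /* block until the next sample arrives */ }
static void process_sample(void) { /* the actual number crunching */ }

int main(void)
{
    BUSY_DDR |= (1 << BUSY_BIT);
    for (;;) {
        wait_for_data();                   /* pin low: idle */
        BUSY_PORT |=  (1 << BUSY_BIT);     /* pin high: busy */
        process_sample();
        BUSY_PORT &= ~(1 << BUSY_BIT);
    }
}

If the scope shows that pin low 99% of the time, there's nothing worth saving by rewriting the floats.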