If you want to create more problems, let's just change the byte from octets to "decanos" (decades). Then a byte would have 10 bits and represent 1024 combinations, and we would be a little closer to the chaotic system created with kbits and kBytes. Then you could say that 1 kByte is almost exactly 10^4 cells (10,240 bits), and that 1 MByte is almost exactly 10^7 cells. So a two-byte addressing system would access the infamous "1 MByte" of address space. Each byte of address would mean 2^10 new steps in addressing:

1 byte  = the infamous 1 kByte
2 bytes = the infamous 1 MByte
3 bytes = the infamous 1 GByte
4 bytes = the infamous 1 TByte

It would make more sense. I vote for that... let's make this complete chaos... The Y2K problems would be small peanuts compared to this new computer revolution... Let's wake up Bill Gates and tell him about it: "The world is waiting for Windows 2000 with bytes in decanos, not octets"... let's make him worried a little bit^10.

By the way, doesn't the PIC use 14 bits? Hmmm, that makes me think about something... hey, it's just a joke.

Wagner.