ivp wrote:
> That's what I do with my clocks. Even using minutes for long-range
> (years) calendar functions such as alarms and intervals. Internally
> only seconds and minutes, and the display is updated after division
> by 1440 for days and a look-up table for day name/date/month. The
> maths and manipulations are so much easier using just one basic unit.
>
> The complications of hours, 12/24, days, weeks, and years need only
> be attended to for display, and are a handicap for working out how
> many minutes are between 10:31am 15th April and 11:07pm February
> 4th, and when to set off regular alarms in that interval.

Yup. On big machines, my portability layer uses 60.30 signed fixed-point
seconds since the start of year 2000 for all absolute time. It's a lot
easier to manipulate time in a single, relatively convenient
representation, then convert only when needed elsewhere.

Every OS seems to represent time just a little differently, even
different flavors of Unix. Each implementation of the portability layer
has a single routine that gets the system time and converts it to my
standard format; all manipulation is done from there. This includes
converting back and forth to expanded date/time descriptions. I've got
a bunch of time-manipulation functions that work with my internal
standard format, which I only had to write once.

Of course, this level of abstraction comes at a price that is probably
too heavy for most PIC projects. Time representation needs to be
carefully considered per project. However, keeping 12-hour time with
AM/PM flags internally sounds like a really bad idea for just about
any case.

********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.
--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist