sergio masci wrote:

>> Jinx wrote:
>>
>>> Is the 2038 problem mentioned in C classes or tutorials ?
>>
>> I don't think this is the most important question. IMO it is whether
>> the really needed time span is mentioned in the specs of the software
>> to be written. Probably most of the time bugs are not due to
>> misunderstanding on the side of the programmer of the implications of
>> using limited date ranges, but to insufficiently specified operational
>> date ranges.
>
> But you usually find that the person writing the specs does not know
> about programming limitations; that's one of the tasks of the
> programmer, to read the specs and highlight any problems he foresees
> (limitations).
>
> The average H/W eng, chemist, microbiologist, vet etc (all pretty
> intelligent people) would assume that asking for a date / time stamp
> (on data being captured) would work for all dates and not just some
> small subset.

I don't agree. Date representations are always limited, and writing a
spec for any project where dates are important requires a range spec for
those dates. I don't have any statistics, but I think that the majority
of Y2K fixes were due to someone saying "Dates up until 1999? That's
plenty, let's use this" somewhere along the spec path. After the media
exposure that Y2K got, no average person specifying a program should
assume that a valid date is just "given".

> To put this another way, the person writing the spec doesn't need to
> tell the programmer that certain variables should be 8, 16 or 32 bit
> wide integers or 32 or 64 bit floats. The programmer needs to sort
> that out.

This is only partly correct. Of course the spec doesn't (or doesn't have
to) specify how many bits a variable should have. But it needs to
specify whether it should work for values up to $1G or up to $1T for a
given entity, for example. For most private finance software, $1G may be
good enough to balance checking accounts; if you want to keep track of
government debt, you need to go higher (and $1T is not enough). That
needs to be specified, and the programmer then chooses the appropriate
variable structure. There's no way for a programmer to determine how to
represent a value if no required range is given.

> So why should the spec writer assume anything less for dates?

Not anything less, but exactly the same with dates... the spec needs to
specify the required ranges, and the programmer chooses the appropriate
representation. It's not the programmer's job to know whether the client
wants to spend the extra money now to be able to compute beyond 2038, or
rather leave that until later. I'm certain that for many MP3 players on
the market that contain some time/date handling, the time_t type fits
the specs and the client doesn't want anything different (better and
probably more expensive). I'm also certain that for some other devices
it doesn't.

But even if you used a date representation that can hold all dates from
year 1 to year 9999 (for example), you can't know whether this is enough
for the job. You may have to be able to go further into the past or the
future, or handle different calendars, or ... This is spec stuff. It is
still the programmer's responsibility (a shared responsibility) to make
sure the specs are complete enough. But the decision about the required
range is outside of the programming domain; it's in the spec domain.
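To make the time_t point concrete, here is a small sketch in plain C. It
assumes a POSIX-style time_t that counts seconds from 1970-01-01 UTC
(the C standard itself doesn't promise that), and the 32-bit vs. 64-bit
counters are my own illustration, not a statement about any particular
compiler's time_t:

#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void)
{
    /* 2^31 - 1 seconds after the 1970 epoch is the last second a
       32-bit signed counter can represent. */
    int64_t last32 = INT32_MAX;

    /* With a POSIX-style library, gmtime() turns that into a calendar
       date: 2038-01-19 03:14:07 UTC. */
    time_t t = (time_t)last32;
    struct tm *utc = gmtime(&t);
    if (utc != NULL)
        printf("last 32-bit second: %04d-%02d-%02d %02d:%02d:%02d UTC\n",
               utc->tm_year + 1900, utc->tm_mon + 1, utc->tm_mday,
               utc->tm_hour, utc->tm_min, utc->tm_sec);

    /* One second later a 32-bit signed counter has nowhere to go
       (signed overflow is undefined, so we don't even try it here;
       typical two's-complement hardware would wrap to 1901), while a
       64-bit counter keeps counting for billions of years. */
    int64_t next64 = last32 + 1;
    printf("a 64-bit counter simply continues: %lld\n", (long long)next64);

    /* The same logic applies to the money example: a 32-bit signed
       count of cents tops out around $21 million, so "balances up to
       $1G" already forces a wider representation. */
    printf("32-bit cents top out near $%ld\n", (long)(INT32_MAX / 100));

    return 0;
}

Whether wrapping in 2038 (or at $21M) is acceptable is exactly the kind
of decision that belongs in the spec; the code only shows what each
representation can hold.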
Sometimes a programmer can infer something from the context -- but
whenever you infer something, you'd better make sure that it gets added
to the spec, even if only informally (for instance in an email where the
client or spec writer confirms that your assumptions are correct). You
may infer wrongly.

Gerhard