On Jul 9, 2009, at 7:28 AM, Olin Lathrop wrote:

> he wants things that look like library calls but that are really
> intrinsic functions that the compiler understands natively.

This of course was very common in languages before (and other than) C. I've opined before that part of C's success may have been due to having the library specification completely separate from the language definition. That wasn't true of Fortran, and it's not true of Pascal (one of the things that particularly annoyed me about Pascal was that the language's built-in functions got to have more features than user-written functions, like variable numbers of arguments). Indeed, I think many languages got "bogged down" trying to define a "complete" set of useful library functions. With C, the standardization of libraries was done separately, much later, and with (perhaps) more experience behind the decisions. Perhaps. ("Posix")

> But didn't AT&T (owner of Bell Labs) sell Unix licenses? Not to the
> general public, but still sell them?

I don't think so. Not until MUCH later, anyway (depending on who you mean by "not the general public"). Until the mid-80s, unix was almost exclusively found in universities and research institutes and such. I'm not positive; prior to those times I worked at universities and research institutes :-) Post mid-80s you had the famous "System V" (AT&T) vs BSD (Berkeley) wars. The early 68000 systems (right about that same time) had real operating system difficulties.

> Maybe all of this WAS quite true a while ago, but nowadays we can
> see that there is a LOT of #IFDEF in C sources to support multiple
> platforms.

Actually, it's MUCH better than it used to be. Get some code from about 1990, and it'll have conditionals for SYSV vs BSD vs (if you're lucky) PC. And it'll have LOTS of them, and they're UGLY (remember "near" and "far" for pre-386 Intel CPUs?). Post-standards, post-Posix, post-gcc there are comparatively minor differences for architecture (sometimes), plus what you might expect for different windowing subsystems and so on.

Which is another point. C *has* been getting better. The compilers, and the error checking within the compilers, have gotten better. Compilers find errors that used to require "lint." New functions have been defined that have better error-checking capabilities (not "built in", but present) (alas, not always well thought out; there was a whole generation of string functions that paid attention to "length" in rather useless ways.)

More than that, there's been so much USE of the language that the "unwritten" standards have improved a great deal. Compilers now post warnings about constructs like "if (a = b)", because programmers demand it (there's a small example of this sort of thing at the end of this message). They check the types of printf() arguments against the format descriptors (which is a really weird thing for a compiler that doesn't define printf() to do, but still rather useful.) People are more careful about using explicitly defined data types. That 1990 code wouldn't have had function prototypes. It wouldn't have uint8_t. It probably wouldn't have nice include files defining the APIs. (Of course, your 1990 programs would have been a lot smaller than today's programs...)

BillW
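P.S. For a concrete picture, here's a minimal sketch (assuming gcc or any C99-ish compiler invoked with -Wall; the saturating_add() function and the numbers are made up purely for illustration) of the contrast above: a prototype, an explicitly sized uint8_t, plus the assignment-in-a-condition and printf() format mismatch that modern compilers will flag and a 1990 compiler would have accepted silently.

#include <stdio.h>
#include <stdint.h>

/* A prototype: the compiler checks every call against it.  1990-vintage
   code often had neither the prototype nor <stdint.h>. */
uint8_t saturating_add(uint8_t a, uint8_t b);

int main(void)
{
    uint8_t a = 200, b = 100;

    if (a = b)                     /* -Wall warns: assignment used as a   */
        printf("a is %d\n", a);    /* truth value (did you mean a == b?)  */

    /* printf("%d\n", "oops");        -Wformat would warn: %d expects an
                                      int, but the argument is a char *   */

    printf("sum = %d\n", saturating_add(a, b));
    return 0;
}

/* Explicitly sized types instead of guessing how wide "char" or "short"
   happens to be on this particular compiler. */
uint8_t saturating_add(uint8_t x, uint8_t y)
{
    unsigned int sum = (unsigned int)x + y;
    return (uint8_t)(sum > 255u ? 255u : sum);
}

Compile it with something like "gcc -Wall -std=c99 example.c" and you get the parentheses warning; uncomment the second printf() and you get the format warning too. Neither construct would have raised an eyebrow in 1990.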