wouter van ooijen wrote:
>> One area which sometimes needs to be included in the equation is the
>> ability to do crucially time-bound things within the crucial time.
>> This may relate to code length but may be more complex than sheer
>> size. While often enough an HLL will do well enough here too, when
>> speed or other crucial and complex inter-relationships matter,
>> assembler often has the edge.
>
> I first misinterpreted this as 'crucial time' referring to development
> time, which would give the HLL the edge, not assembler. Of course, if
> local speed is of the utmost importance assembler might have the edge,
> but still: do the calculation! The alternatives are to use an HLL with
> a faster/larger chip, or assembler with a slower/cheaper chip.
> [...]
> And of course combining an HLL and assembler is often better than
> using one or the other exclusively.

Especially in the case Wouter cited (execution time constraints), I
think it is in most cases only a very small portion of the code that is
critical, mostly interrupt handlers. Almost all compilers make it
possible to optimize these parts in assembler.

But all those things are just variations of Wouter's original equation.
In reality, however, this equation is very subjective. For me, for
example, switching to assembler would mean a huge initial cost: I'd
need to create all the libraries that the people who have been doing
this for many years already have. OTOH, I know my compiler, have my own
code, and so far have been able to fit everything within budget (time,
money, and code space) with it. But then, I don't typically work on
high-production-volume items (so far, at least), so the total cost is
normally very sensitive to the development cost.

Gerhard
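
PS: To put numbers on Wouter's "do the calculation", here is a minimal
C sketch of the break-even arithmetic. All of the cost figures below
are made up; substitute your own estimates:

#include <stdio.h>

int main(void)
{
    /* Hypothetical figures; plug in your own project's numbers.    */
    double dev_hll  =  5000.0;  /* development cost, HLL route      */
    double dev_asm  = 15000.0;  /* development cost, assembler route */
    double chip_hll = 2.50;     /* per unit, faster/larger chip      */
    double chip_asm = 1.50;     /* per unit, slower/cheaper chip     */

    /* Above this volume the cheaper chip has paid back the extra
       development cost of the assembler route.                     */
    double break_even = (dev_asm - dev_hll) / (chip_hll - chip_asm);

    printf("break-even volume: %.0f units\n", break_even);
    return 0;
}

With those (invented) numbers the assembler route only starts to win
above 10000 units, which is exactly why, at my volumes, the equation
comes out in favour of the compiler.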
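
PPS: On keeping only the critical parts in assembler: here is a sketch
of the kind of thing I mean, for a mid-range PIC. Interrupt and
inline-asm syntax differs from compiler to compiler; this assumes a
HI-TECH-style compiler, and the timer-reload trick and its value are
illustrative only.

#include <htc.h>  /* HI-TECH header; use your compiler's equivalent */

volatile unsigned char ticks;     /* shared with the main loop      */

void interrupt isr(void)
{
    if (T0IF) {                   /* timer 0 overflow?              */
        T0IF = 0;                 /* acknowledge the interrupt      */
        ticks++;                  /* the bookkeeping stays in C     */

        /* Only the cycle-exact part drops to assembler: reload
           TMR0 so the next overflow arrives on an exact count
           (156 = 256 - 100; the value is made up).                 */
        asm("movlw 156");
        asm("movwf 1");           /* TMR0 is file register 0x01 on
                                     mid-range PICs                 */
    }
}

Everything around the two asm() lines is ordinary C, so the compiler
still takes care of context saving and the rest of the handler.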