Forrest W. Christian wrote:
> One way to solve this is to strongly type the language, which leads to
> situations where you spend all your time telling the compiler to
> convert between this and this type and is a pain to program in.

You are way overstating the issue. Well-thought-out programs rarely need unusual type conversions. The extra syntax to say "yes, I know what I'm doing in this special case" is so minimal as to be irrelevant. I don't have any hard numbers, but I suspect it happens less than once in a thousand lines. I routinely write whole programs where it never comes up. This issue is just a smoke screen.

> 1) Perform the multiplication on the raw data of the string, that is
> the ASCII values - this is arguably what would happen in some cases in C,
> since a string is just an array of int8's.
> 2) At compile time, realize that you can't multiply two strings and
> throw a compile error. This is the "pascal" way.
> -or-
> 3) Realize that strings need to be converted to numbers first before
> they can be multiplied, so do the conversion, complete the
> multiplication, and save the result as a number.
>
> I agree that #1 is the worst possible outcome. You are arguing for
> #2. I would rather have #3.

#3 has the drawback of not catching unintentional conversions. It is good to have a language where likely mistakes cause compile-time errors instead of unwanted automatic conversions.

Remember that maintenance is the big cost of software, not the initial writing. Adding a few inline functions or whatever is trivial when first writing the code. Chasing down runtime bugs later, when a change is made with a datatype slightly misunderstood, is a lot more expensive.

********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.
--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist