William Chops Westfield wrote:
> A number of the examples I found were in media where mu might be
> unavailable: usenet, assorted online forums, etc.

Especially in those places, mu should be available. On the Internet, all the standards needed to represent mu correctly and platform-independently have been in place for a long time. Pretty much every web scripting language has ready-made functions to convert user input into the appropriate HTML entities. Character sets are well-defined, and so are the ways to declare which one is in use. Browsers and newsreaders handle all of that and display it correctly. Mu is only unavailable when there's a bug, or an inept programmer :)

> Still, I stand by my point: regardless of whether mF is technically
> ambiguous, it has become linguistically ambiguous...

If I understand you correctly, you make one point with two supporting arguments. The point is that 'm' as a multiplier should be avoided, both for 'milli' and for 'micro'. The two arguments are:

1- Up until the 1960s, it was common in the US to use mF to mean microfarad. Therefore mF is now ambiguous.

2- Some documenters and programmers don't know how to handle font and encoding issues properly, so sometimes an 'm' appears where a mu was intended.

I vote for discarding the first argument. That was just a very unlucky non-standard usage, and it's almost half a century behind us now. The practice made no sense back then, makes no sense now, and people should be, if necessary, forced to adapt. (You don't like non-standard web sites that only work in IE on Windows, do you? I'm pretty sure you prefer standards-compliant web sites that display properly on a Mac OS X system with Mozilla :)

The second argument doesn't affect only the question of uF vs. mF; it affects all micro/milli quantities (and it affects the ohm, too: in one of the links you collected, the Greek capital omega was shown as W -- resulting in a 10 kW resistor...
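As a minimal sketch of the "ready-made functions" point above (Python chosen here as an example scripting language; the claim applies to most web languages): the standard library can turn non-ASCII characters such as mu or omega into HTML character references, so they survive any transport that only handles ASCII, and round-trip back intact.

```python
import html

# Original text containing mu (U+00B5) and capital omega (U+03A9).
s = "capacitance: 4.7 µF, resistance: 10 kΩ"

# Replace every non-ASCII character with a numeric HTML character
# reference; the result is pure ASCII and safe for any medium.
encoded = s.encode("ascii", errors="xmlcharrefreplace").decode("ascii")
print(encoded)  # capacitance: 4.7 &#181;F, resistance: 10 k&#937;

# A browser (or html.unescape) restores the original characters.
restored = html.unescape(encoded)
print(restored == s)  # True
```

The same round trip works with named entities (`&micro;`, `&Omega;`); which form a given language's helper emits is an implementation detail, but every mainstream one provides some such helper.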
:)

So (if I discard the first argument) is the suggestion to never use the 'm' multiplier again (as in mA, mH, mm, mV, and yes, mF) because it could be mistaken for micro? In all these cases, both the milli and the micro ranges are in general use. In some, even mega is used occasionally, creating additional confusion thanks to people who seem to think that everything technical is written in capitals (or in lower case :).

So I still ask: if you think it's linguistically impossible to distinguish mF from uF properly, why is it possible to distinguish mA from uA, mV from uV, mH from uH, and mm from um?

BTW, in Germany, mF seems to be in widespread use (as millifarad, not as microfarad). To find examples, of course, you need to look for texts in German, not for English texts on .de domain sites :)

Gerhard

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist