Olin,

I think we are more in agreement than not regarding SI and attitudes
towards it. In the below I snipped a lot, since it's a long post and I
agree with much of it. I'm not sure, though, what made you think I didn't.

Olin Lathrop wrote:

> I've been watching this debate on the sidelines for a while, but your
> constant complaining about how everyone that doesn't follow the
> standards is inept is getting me fed up enough to respond.

It seems you didn't understand at least some of what I wrote. I never
called anyone "inept". You write that again later... not sure why.

> That's what I grew up with, so that's what I think in, and converting
> requires more mental work. So what are the hardware stores going to
> stock? Whatever their customers demand, right? So why should I buy a
> stock of metric hardware when I've already got a nice stock of english
> hardware and it's more available anyway? (Actually metric hardware is
> available too, but not quite as widespread as english.)

A side note: if by "english hardware" you mean threaded hardware, I
don't think you're talking about English threads. AFAIK, the threads
common in the USA are UTS threads, a standard that doesn't seem to have
caught on in the UK; there they use either metric or BA.

I'm not sure about your work environment, but the shops I've seen
recently usually stock both sets of tools and hardware: metric and
imperial/UTS.

In a neutral world, either would be fine. (I think metric makes more
sense, but that's a different question, and we agree on it.) However,
keeping both in your shop when you don't have to is not that smart. You
see a threaded hole, and from its size your experience tells you what
thread it might be. That works if you know whether it's metric or
imperial; if you don't know which one it is, it gets trickier. A #4
thread (0.112 in, about 2.85 mm major diameter) is close enough to an
M3 thread (3.00 mm) to confuse the two: you assume it's #4 -- but no,
it's M3. If you only had M3, that wouldn't happen. (It wouldn't happen
if you only had #4 either, but that's not more than wishful thinking,
with the rest of the world using M3.) That's just one example of why it
is obviously more difficult and more costly to work with both systems
in parallel. Why stock both #4 and M3 screws when they are about the
same size?

The thing is that sooner or later you will need M3 screws, so it /may/
be a smarter move to switch earlier -- before you spend the next $1000
on imperial/UTS tools. You can't control what others do, but you can
control what you use.

> So who's doing anything stupid here?

I don't know -- you tell me. I didn't say that anybody was doing
anything stupid. Please don't project your harsh judgment into my head,
and stick to quotes from me if you want to talk about what I wrote.

> If you were Outer Vulgaria and woke up at the end of world war II
> taking a leap from ox carts to cars in a span of 5 years, it's no big
> deal to decide to standardise "about that long" with a meter.

Now this is a pet peeve of mine :) It seems some people in the USA
think that because the USA became the leading nation in science and
industry during the '50s, people in Europe were eating bananas off the
trees before that (to exaggerate a bit :). That's not quite so. By the
'50s, when most of Europe went metric, Europe had an industrial base
and a scientific community just as developed as the USA's. By the time
European companies went metric, they had just as much pre-metric
baggage as US companies.
The only difference is that European governments and companies saw the
potential of international cooperation, and the need for international
standards to support it, earlier than US companies did, and so they
jumped on the SI bandwagon earlier (the companies) or more decisively
(the governments).

> It's a lot more trouble for a technologically advanced society like
> the US that already has an entrenched system and existing
> infrastructure.

That's true, and it held just as well for Germany, France, Great
Britain, Sweden and the many other countries that were already
industrially well developed by the time they adopted the SI. If you
look through my previous messages, you'll see I did /not/ compare the
situation in the USA to the situation in "Outer Vulgaria"; where I
compared anything, I used countries that were just as developed as the
USA by the time they went metric. People like Bohr, Einstein,
Heisenberg, Planck and companies like Siemens, Bosch, Mercedes-Benz,
Volkswagen, BASF, BMW did not operate in an industrial vacuum. In the
'50s, Europe was arguably as developed as the USA, technologically and
industrially. (Sorry if the list above is quite Germany-centered, but
that's what comes to my mind. I'm sure there are just as significant
examples for many other European countries.)

> I make it a point to use milli and nano farads whenever appropriate,
> partly because I've deliberately taught myself to think that way and
> partly because I want to get the world used to it.

This is exactly my point. I may put it in other words, and I definitely
have a different angle on many things, but the baseline is this. So
what's the problem?

> I think of this as one small victory, because I doubt he's going to
> make that mistake again.

:) And I think it is not wasted. This is exactly the sort of attitude
that I think is missing. He, OTOH, if he's not smart, may think you're
a metric whiner :)

> As for the mu symbol, I've looked upside down, inside out, and
> sideways at my keyboard and I don't see one. Yes, there may be a way
> to make a mu symbol with a bunch of keystrokes,

Two, to be exact. The same number it takes to write an uppercase 'L'.

> but I don't know that off the top of my head and it would be too much
> hassle anyway. Using "u" for mu is a well accepted and very common
> practise.

I don't know what you read from me that made you think otherwise, but
that's exactly what I'm saying. There's no need to avoid the 'm' for
'milli' (as advocated by others, because some people like to use 'm'
for 'micro') -- everybody worth their salt uses either the (correct) mu
or the (IMO acceptable) 'u' for 'micro', and the 'm' for 'milli'.

> But in reality if it's not ASCII it's either not safe or too much of
> a hassle.

See, this is one of those problems that only seem to exist in the USA.
The rest of the world does not have a problem with non-ASCII characters
and uses them on a daily basis. I'm sitting right here in Brazil,
writing and reading Portuguese, German and English texts every day
without problems. Much of that is not covered by ASCII. No safety
problems, no hassles. Just a little bit of knowledge. (And again, as
mentioned several times, I'm pretty sure you are using Unicode, maybe
without knowing it.)

> Unicode is a big pain in the butt because it takes 16 bits per
> character.

I'm not sure what you know about Unicode, but this is wrong. Maybe read
up a bit on Unicode before you write about it and spread
misinformation?
Just for the eventual lurker: Unicode comes in a number of different
encodings. There was a time, a while ago, when the whole Unicode code
space fit into 16 bits, and that may be where Olin's misconception
comes from. But this is not the case anymore; the code space has
outgrown 16 bits (21 bits cover it, stored in practice in 32-bit
units). However, the number of bits required to /enumerate/ every code
point is not necessarily the number of bits required to /encode/ each
individual code point. Variable-length encodings have been used for a
long time, because some code points are much more frequent than others,
and it's therefore more efficient to represent those with fewer bits.
Check out the Unicode material on encodings, especially the part about
UTF-8. The default in many situations is UTF-8, which uses only one
byte for the characters in the printable ASCII range (with the same
byte values as ASCII); there's a small sketch further down that
illustrates the byte counts.

So the whining that can occasionally be heard about Unicode needing 16
bits per ASCII character is just lack of knowledge -- 16 bits are not
enough to /enumerate/ the code space, and in most cases 8 bits are
enough to /encode/ a character :)

> the really really good reason to switch to unicode is just missing.

Why are you so defensive here? Did I say anything about anybody
switching to Unicode? The only thing I said about Unicode was that if
somebody wants to use the mu character in newsgroups, email or on the
web, it is easily possible, because the standards are there and widely
used. The proof is that you can read this mu: 'µ'. (Not exactly Unicode
in the message source, but what happens is that the ISO-8859-1
character that gets sent then gets mapped by your newsreader onto the
appropriate code point in your Unicode font. You may not know it, but
you may be using Unicode already. Check out the fonts on your system:
most of them are probably Unicode.)

> When writing HTML and I think about it and I have time, I do use the
> "&mu;" construct for mu. That's OK since it's guaranteed to work in
> all browsers (part of the HTML definition), I can remember it, and
> the resulting file still uses 8 bit characters.

That's exactly what I said. Use stuff that works. Again, I don't see
what your problem is with what I wrote. A side note: general 8-bit
characters (especially Windows-specific charsets) are /not/ guaranteed
to work platform-independently. If you want to use anything other than
the default UTF-8 (which includes 7-bit ASCII) on the web, you need to
specify the character set.

>> You don't like non-standard web sites that work only in IE on
>> Windows, do you?
>
> Actually I couldn't care less since I use IE.

That was written as a response to a message to Bill. He apparently
doesn't use IE on Windows. I'm sure you don't like sites that only work
in Mozilla on MacOSX, right? :) Standards do make sense; they are the
reason we can exchange messages and browse the web in the first place.

> Again you are forgetting that everyone makes individual decisions
> that are the best for them at the time.

It's a bit (though not much) beside the point, but I'm sure that this
is /not/ the case. Not everybody (not me, not you, not anybody else)
makes only decisions that are the best. We all make mistakes. Using the
Windows Symbol font on a public web site of an educational institution,
and thereby causing the abbreviation of 'micro' to show up as 'm' in a
significant portion of the browsers, is such a mistake, not an
"individual decision that [is] the best for them" at any time. Besides,
I never forgot that...
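Back to the encodings for a moment -- here is the small sketch promised
above (Python, chosen purely for illustration; nothing in it comes from
this thread). It shows the two points: UTF-8 keeps ASCII text at one
byte per character, and the ISO-8859-1 micro sign is the very same
character as Unicode code point U+00B5.

    ascii_text = "100 F"
    text = "100 \u00b5F"        # "100 uF" written with a real micro sign

    print(len(ascii_text.encode("utf-8")))  # 5: pure ASCII stays 1 byte/char
    print(len(text.encode("utf-8")))        # 7: only the micro sign costs 2
    print(len(text.encode("utf-16")))       # 14: the "16 bits per character"
                                            #     case (2 of them for the BOM)

    # The ISO-8859-1 byte 0xB5 decodes to the same character as U+00B5,
    # which is why a newsreader can map one onto the other:
    micro = bytes([0xB5]).decode("latin-1")
    print(micro, micro == "\u00b5")         # µ True

So the 16-bits-per-character cost only shows up if you pick an encoding
like UTF-16; with UTF-8 an ASCII-only file doesn't change by a single
byte.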
> Again, my main point is that you are whining about people doing
> things different from how collectively you think it should be done,
> but ignoring what is in each individual person's best interest.

No. I'm not whining, I'm not talking about what I think should
collectively be done (that would go far OT :), and I don't ignore what
may be in each individual's interest.

I'm not whining. I'm arguing against the whiners who always say that
it's soooo much more difficult for companies and people in the USA to
go metric. It was difficult for every single one of the companies and
people in Europe when they did it fifty years ago. There were people
with tools in their basements in Europe before 1950, just as in the
USA. There were engineers who grew up with different measurement
systems, just like in the USA. There were huge amounts of company
history and blueprints in other units, just as in the USA. The only
difference is that they did it back then, unlike their counterparts in
the USA.

> Yes, it might be good if we could all suddenly use the same standard,
> but who's going to pay for that, and why should I pay if I don't
> perceive the value to me outweighs my cost?

Here's the fallacy. It's more expensive to keep the dual system, and
the amount wasted on it adds up for every year the move is delayed.
It's as simple as that. The conversion cost may show up in a company's
budget as "$1,000,000 spent once on metrication efforts", but the cost
of keeping the dual system in place won't show up as "$100,000 per year
spent on multi-measurement-system support". So yes, the short-sighted
advantage lies with staying with what works and is in place. Judging
management by quarterly results reinforces this. But since the move is
inevitable (and I'm not sure you agree, but you seem to), delaying it
simply continues the waste of money, in many situations.

(Just like the additional effort you spent -- on your side and on the
side of the associate you taught that lesson -- teaching him that mF
means millifarad. That was a cost factor, and you could have avoided it
by not using mF and sticking to uF, as Bill suggests. You didn't; you
spent the effort because you thought it was the right thing to do and
cheaper in the long run. I agree completely. I just take this a bit
further, maybe...)

The other thing is: if the US automotive (and a few other) industries
think that it's cheaper to go metric (and they definitely had to
convert /a lot/ of existing documentation and change stock), it may not
be so far off to think that it'd be cheaper for many others too, and
that some of the main reasons they didn't do it are simply not thinking
about it, and sometimes mental (and other) laziness.

> Are you willing to go thru a few 100000 lines of my source code and
> convert every last character reference to unicode? I didn't think so,

Again, I never said anybody should convert all existing programs to
Unicode. This seems to be something that tripped your fuse. Please
/read/ my stuff. I try to write good, clear English that everybody
should be able to understand. I try to make my points clearly. I try
not to call people names, and rather stick to facts and mark my
opinions clearly as such where I mention them. AFAIK I /never/ in this
thread said that anybody should convert any code to Unicode. What I
said about Unicode is that it's well supported in online media like the
web, newsgroups and email, so that there's no reason not to use Unicode
characters there.
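To show how little hassle that support means in practice, here is a
minimal sketch (Python again, purely illustrative; the addresses are
made up) of sending a µ through email: a modern mail library declares
the character set for you as soon as the body contains a non-ASCII
character, which is all it takes for the micro sign to arrive intact.

    from email.message import EmailMessage

    msg = EmailMessage()
    msg["Subject"] = "Capacitor values"
    msg["From"] = "sender@example.com"     # made-up addresses
    msg["To"] = "piclist@example.com"
    msg.set_content("That cap is 4.7 \u00b5F, not 4.7 mF.")

    # The library adds 'Content-Type: text/plain; charset="utf-8"' and a
    # suitable transfer encoding on its own; the micro sign survives
    # transit because the charset is declared in the headers.
    print(msg)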
If you can find the passage where I said that you (or anybody) should
convert existing applications to Unicode, that would be helpful. You
seem to be using OE6, probably on a Win2k or WinXP system, and if that
is correct, your system supports Unicode -- you /are/ using Unicode, as
far as your interface to me is concerned. So the comments to that
effect weren't really addressed at you...

> so stop whining about it already and stop calling me inept for not
> doing it.

Again, I never said that you should convert any of your code to
Unicode, and I definitely never called you inept. I would definitely
like to see the quote you took that from.

Gerhard

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist