At 10:29 AM 1/27/2009, Virchanza wrote:
>The reason I ask is that I'm in a position where I have to teach some of
>this stuff, so I want to be able to explain why they'll hear that a
>transmitter has a transmittal power of X amount of "dBm".

I am not sure when the use of dBm started, but it is common back to at least the end of WWII in my library. The first piece of equipment I had that directly showed dBm was the HP 430 power meter introduced in 1948. It was more for laboratory use than general use, and more common in the microwave area than elsewhere in the early days.

Some quantities are best expressed with a logarithmic relationship, and power is one of those. In laboratory work you often need combinations of components such as attenuators, directional couplers, samplers or amplifiers, all of which have some characteristic expressed in dB, whether it is a gain, coupling, insertion loss or reflection coefficient. If we express power in dBm then we can calculate new or resulting power levels using simple addition and subtraction. Simply put, if we have a signal source at +13dBm, followed by an attenuator of 20dB and a 7dB gain amplifier, then we have an output of 13 - 20 + 7 = 0dBm. Engineers are inherently lazy folk, so we took to that simplicity like ducks to water!

The decibel became more commonly used than the bel because it gave us usable increments without having to insert a decimal point. Decimal points are analogous to a speck of dust and should generally be avoided! I haven't seen a reference to bels since the Bell System Technical Journals of the 30's and 40's.

Nowadays we see receiver sensitivities expressed as, say, -138dBm for 10dB SINAD in 50 ohms. But in times gone by it was common to express them in dBu, or dB relative to 1 microvolt in whatever the specified system impedance was. In some fields, especially higher power RF, it is not unusual to see dBW, dB relative to 1 Watt, but we should always be careful to specify the system impedance.

Using dBm to specify higher power levels outside the lab seems to have only become common since the late 80's. The trend started in the microwave field in my experience, but it has spread over the years and now seems to be pretty much universally understood in the field. This seems to parallel the use of laboratory-type equipment such as spectrum analysers in such areas as cable TV, cellular infrastructure and the like.

So why don't we use dB mA? Well, how often are we dealing with current over wide dynamic ranges? It is more likely that we would deal with power, hence dBm or dBW. Also, in the RF world it is most unlikely we would be measuring or specifying current anyway. In my younger days I recall we had RF ammeters, especially for use in the antenna field, and even then only at low frequencies; power is much easier to measure.

I hope that helps.

John
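
P.S. If it helps when teaching this, here is a rough numeric sketch in Python of the arithmetic described above. The function names and the 50 ohm figure are just my own illustration of the idea, not anything standard:

    import math

    def dbm_to_mw(dbm):
        # dBm is power relative to 1 milliwatt: P(mW) = 10^(dBm/10)
        return 10 ** (dbm / 10.0)

    def mw_to_dbm(mw):
        # Inverse: dBm = 10 * log10(P / 1 mW)
        return 10 * math.log10(mw)

    # Cascade from the text: +13dBm source, 20dB attenuator, 7dB amplifier.
    # Once everything is in dB, the stages simply add.
    source_dbm = 13.0
    cascade_db = [-20.0, +7.0]          # attenuator, then amplifier
    output_dbm = source_dbm + sum(cascade_db)
    print(output_dbm)                   # 0.0 dBm, i.e. exactly 1 mW

    # Same result the long way, multiplying linear powers:
    output_mw = dbm_to_mw(source_dbm)
    for stage_db in cascade_db:
        output_mw *= 10 ** (stage_db / 10.0)
    print(mw_to_dbm(output_mw))         # 0.0 dBm again

    # Relating dBm to a voltage needs the system impedance (here 50 ohms):
    # P = V^2 / R, so V(rms) = sqrt(P * R).
    p_w = dbm_to_mw(-138.0) / 1000.0    # -138 dBm expressed in watts
    v_uv = math.sqrt(p_w * 50.0) * 1e6
    print(v_uv)                         # roughly 0.028 microvolts rms

    # And dBW is just dBm shifted by 30 dB, since 1 W = 1000 mW:
    print(mw_to_dbm(1000.0))            # +30 dBm = 0 dBW

The point is that the cascade calculation is pure addition; the conversions back to milliwatts or microvolts only matter when you need a linear power, or a voltage at a known impedance.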