Cable loss is usually specified in dB per 100 ft. What matters most is that cable
loss is frequency dependent. If you plan on measuring it at high frequencies, a
network analyzer or a signal generator/spectrum analyzer combination is generally
in order. Below 100 MHz or so, the manufacturer's specs will be enough to
interpolate from, since short cables will show only tenths of a dB of loss.

Over time, cables do degrade, but the most common causes are connector wear from
repeated connection/disconnection, and oxidation of the connectors. If this
worries you, prefer gold-plated connectors over silver.

Regards,
Michael

On Wed, 12 Jun 2002 12:58:13 -0400, Chris Carbaugh wrote:

> Hello all,
>
> I'm looking for a method to measure signal loss (dB) on RG6 and RG11 coax
> cable. I think I can do this with just an ohm meter, but I'm not sure how
> to cap the opposite end.
>
> What I'm trying to accomplish is to (1) calculate the actual loss on a
> coax run, and (2) calculate the length of the run by comparing signal
> loss to the specs from the manufacturer. Most coax manufacturers (I
> believe) specify signal loss per foot.
>
> Any help greatly appreciated,
> Chris

-- 
http://www.piclist.com hint: The list server can filter out subtopics
(like ads or off topics) for you. See http://www.piclist.com/#topics
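
The arithmetic behind Chris's two goals is just a proportion against the
per-100 ft spec. A minimal sketch, assuming you already have a measured loss and
the datasheet figure at your measurement frequency (the 2.0 dB/100 ft value below
is purely illustrative, not a real RG6 spec):

```python
# Sketch: relate coax run length to signal loss via the datasheet spec.
# The spec value used in the example is illustrative; take the real
# attenuation figure from your cable's datasheet at the frequency you
# actually measure at, since loss is strongly frequency dependent.

def expected_loss_db(length_ft, spec_db_per_100ft):
    """Goal (1): expected loss in dB for a run of known length."""
    return length_ft / 100.0 * spec_db_per_100ft

def estimate_length_ft(measured_loss_db, spec_db_per_100ft):
    """Goal (2): run length in feet implied by a measured loss."""
    return measured_loss_db / spec_db_per_100ft * 100.0

# Example with a hypothetical 2.0 dB/100 ft spec:
spec = 2.0
print(expected_loss_db(250, spec))    # 250 ft run -> 5.0 dB
print(estimate_length_ft(1.5, spec))  # 1.5 dB measured -> 75.0 ft
```

Note this only works if the measured loss comes from an RF measurement at a known
frequency; a DC ohmmeter reading won't map onto the datasheet attenuation curve.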