I've got a circuit to which various field-wired "contact closures" are connected. Today I'm sourcing about 2.2mA @ 3.3V on each input and expecting the closure to be able to yank that down to close to 0V. I was looking at a low-signal relay datasheet (and some general-purpose relay datasheets as well) and got to wondering whether this is really enough current to "break through" a typical relay's contact resistance... and if not, how much I'd need to increase it.

From what I can tell, most low-signal relays shouldn't be a problem, as their minimum load is rated at 1mA @ 5V or below (I'm assuming that 3.3V and 5V are "close enough" that 2.2mA would break through at 3.3V). The larger relays, however, look like they could be a problem. The bigger problem is that it's almost impossible to find "minimum permissible load" ratings for anything above about 8A, and for switches and other similar devices (proximity sensors, etc.) the situation is no better, if not worse.

The question, then: how much current is really needed to sense "most" switch closures? I'm probably restricting "most" to the lower end of the amperage scale - 20A contacts at the very most, and more likely 5 or 10A. This is a power-sensitive application, so I'd really like to limit the current, but if I have to increase the load to reliably sense a switch closure, I sure will do so.

Mainly, I'm hoping to get a feel from the list for how big a problem this is likely to be at the 2.2mA @ 3.3V I'm currently using, and whether there's a magic number above which sensing becomes reliable.

-forrest
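P.S. For anyone curious about the power trade-off I'm weighing, here's the back-of-the-envelope arithmetic - nothing more than Ohm's law on a hypothetical resistive pull-up, assuming the closed contact drags the input essentially to 0V (i.e. contact resistance is small compared to the pull-up). The 5/10/20mA rows are only illustrative candidates, not a claim that any of them is what's actually required - that's the question:

    /* Back-of-the-envelope: pull-up value and per-input power for a few
     * candidate wetting currents at 3.3V.  Pure Ohm's law on a hypothetical
     * resistive pull-up; assumes a closed contact pulls the input to ~0V. */
    #include <stdio.h>

    int main(void)
    {
        const double vcc  = 3.3;                        /* source voltage per input       */
        const double ma[] = { 2.2, 5.0, 10.0, 20.0 };   /* candidate wetting currents, mA */
        const int    n    = sizeof ma / sizeof ma[0];

        printf("  I (mA)   R pull-up (ohm)   P per input (mW)\n");
        for (int i = 0; i < n; i++) {
            double amps = ma[i] / 1000.0;
            double r    = vcc / amps;             /* R = V / I                        */
            double p_mw = vcc * amps * 1000.0;    /* P = V * I, worst case (contact closed) */
            printf("  %6.1f   %15.0f   %16.1f\n", ma[i], r, p_mw);
        }
        return 0;
    }

At my current 2.2mA that works out to roughly a 1.5k pull-up and about 7mW per closed input; going to 10mA would be about 33mW per input, which is exactly the kind of increase I'd rather avoid unless reliable wetting really demands it.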