I need to power some electronics at a trade show for 3 days (26 show hours total) and won't have an AC outlet, so I need to work out how many automotive batteries it would take to power them.

As a start, the good news is that the electronics are designed to run off automotive 12V and have been tested down to 8.5V. Altogether I estimate 20W consumption (mostly DC-DC converters, so the power draw should stay fairly constant as the battery voltage drops). At 12V that's about 1.7A, so call it ~2A to be conservative.

Now, I have a battery here that claims 58Ah. My Google research tells me this means 14.5A for 4 hours, at which point the battery voltage will be 10.5V. But that is apparently 10.5V *loaded*. So what happens if I only draw 2A -- does that mean I get ~2A for 29 hours, with the ending voltage still at 10.5V? I doubt it, since the load is different.

Generally a car battery is considered 0% charged when the open-circuit voltage drops to 11.8V, and is considered to need charging (75% charged) when the voltage drops below 12.4V. But that's for a car -- I'm sure there's a lot of usable capacity below that. So what happens below 11.8V open-circuit? Is the discharge (fairly) linear? I could not find a battery discharge graph via Google. And what is the minimum safe voltage from which to recharge a battery?

Bottom line: I'd like to know whether one fully charged battery will be more than adequate for all 3 days. If absolutely necessary I'll drag it back to the motel and recharge it, but I'd really rather avoid that -- something about charging a lead-acid battery in an enclosed room (that I'm sleeping in) does not thrill me.

Can any of you lend some intelligence to this?

Thanks,
-Neil
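
P.S. Here is my own back-of-envelope arithmetic as a short Python sketch, in case it helps frame the question. The Peukert exponent (I assumed k = 1.15, a typical figure for flooded lead-acid) and the choice of 14.5A/4h from the label as the Peukert rating point are my assumptions, not anything from a datasheet:

# Back-of-envelope battery sizing for the trade-show load.
# Assumptions (mine, not from any datasheet):
#   - Peukert's law, t = H * (C / (I*H))**k, models capacity vs. rate
#   - Peukert exponent k ~ 1.15 (typical for flooded lead-acid)
#   - rating point taken from the label: 58 Ah = 14.5 A for 4 h

RATED_AH    = 58.0                     # claimed capacity (Ah)
RATED_HOURS = 4.0                      # hours at the rated current (14.5 A)
LOAD_WATTS  = 20.0                     # estimated load
BUS_VOLTS   = 12.0                     # nominal; DC-DC input, power roughly constant
PEUKERT_K   = 1.15                     # assumed exponent
SHOW_HOURS  = 26.0                     # total show hours over 3 days

load_amps = LOAD_WATTS / BUS_VOLTS     # ~1.67 A
# Peukert runtime at the actual (lighter) load current:
runtime_h = RATED_HOURS * (RATED_AH / (load_amps * RATED_HOURS)) ** PEUKERT_K
effective_ah = load_amps * runtime_h   # capacity actually delivered at this rate
needed_ah = load_amps * SHOW_HOURS     # charge needed for all 26 show hours

print(f"load current      : {load_amps:.2f} A")
print(f"estimated runtime : {runtime_h:.0f} h at that current")
print(f"effective capacity: {effective_ah:.0f} Ah (vs {RATED_AH:.0f} Ah rated)")
print(f"needed for show   : {needed_ah:.0f} Ah")

On those assumed numbers, 26 hours at ~1.7A is only about 43Ah, and the Peukert term suggests the battery delivers more, not less, than its 4-hour rating at such a light load (roughly 80Ah, i.e. ~48 hours). So one battery looks comfortable -- if my exponent guess is anywhere near right, which is exactly what I'd like someone to sanity-check, ideally against a real discharge curve.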