What they say, but ... 6V gate drive is getting low, even if the driver can tolerate 18V.

The "whole regulator" is worth tens of cents, and you are supplying it not just for the gate driver but for the MOSFETs and whatever they are doing. Is the task really so value-less? Even when the required capacitors are added, cost is unlikely to be a good reason for doing this. Size may be.

If you use an 18V zener (less whatever safety margin is required) as a shunt regulator, then it needs to pass just enough current at 16V, so with a 24V battery you'll be dissipating I_max x (24 - 16) = 8 x I_max watts. A 10V zener in series will dissipate up to 10 x I_max watts at any battery voltage. It's more complex than that, but essentially you'll dissipate about the same either way, and the conventional shunt zener gives you ~= 16V minimum gate drive instead of 6V.

I'd tend to try to use a voltage regulator.

Russell

On 24 May 2013 19:53, veegee wrote:
> Hi all,
>
> I'm trying to power some MOSFET drivers (which can be powered from up to
> 18V) with a battery pack(s) whose voltage may be anywhere from 16-24V. I
> can opt for the "correct"(?) design and just use a linear regulator, but
> I don't want to add a whole regulator just for the MOSFET driver. Is it
> considered acceptable engineering practice to use a Zener diode in
> series with the power supply rail, reverse biased, to act as a 10V (for
> example) drop to power the IC? That way, the IC supply voltage will be
> from 6-16V, which is acceptable. It seems I'd also need something like a
> 10k resistor to GND in parallel with the IC to maintain a high enough
> current through the Zener to keep it in the intended reverse voltage range.
>
> SPICE simulation says it's okay, but I have a feeling there's something
> I don't know...
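[Editor's note: the dissipation comparison above can be sketched numerically. This is an illustrative check, not part of the original thread; the 1A worst-case driver supply current `I_max` is an assumed figure.]

```python
# Compare the two options for a 16-24V battery feeding a driver rated to 18V:
#   1) shunt zener regulation at ~16V (series pass element drops the excess)
#   2) a 10V zener in series with the supply rail (drops ~10V regardless)
# I_max is an assumed worst-case driver supply current, for illustration only.
I_max = 1.0  # amps (assumed)

def shunt_zener_dissipation(v_batt, v_reg=16.0):
    """Shunt regulation at ~16V: drop (and dissipation) grows with v_batt."""
    return I_max * max(v_batt - v_reg, 0.0)

def series_zener_dissipation(v_batt, v_zener=10.0):
    """Series 10V zener: fixed ~10V drop at any battery voltage."""
    return I_max * v_zener

for v in (16.0, 20.0, 24.0):
    print(f"Vbatt={v:4.1f}V  shunt ~{shunt_zener_dissipation(v):4.1f}W  "
          f"series ~{series_zener_dissipation(v):4.1f}W")
```

At 24V both schemes burn comparable power (8W vs 10W per amp), but the shunt approach keeps the gate drive near 16V at the low end of the battery range, whereas the series zener leaves only ~6V.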
--
http://www.piclist.com/techref/piclist PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist