From: kmaryan
In a step down converter, what is the mechanism that causes efficiency
to be so poor when stepping down to low voltages? The typical case I'm
dealing with is 12V in to 1V out at around 5A. Most canned buck
converter designs quote high 80s to mid 90s efficiency when going from
12V to 3.3V, but change the output to 1.0V and the efficiency drops by
10% or more. It's generally worse at lower output voltages.

What are the properties of the regulator that cause this? How does one
go about designing a power supply to provide good efficiency for 12V
to 1V @ 5A?

I'm relatively new to understanding the details of switching
regulators, but I'm trying to understand the mechanisms behind some
issues. The target is power for processors and FPGAs, which require
around 1V for their core voltages; more and more stuff on my boards
needs 1V or similar low voltages and I only have a 12V input to work
with.

Thanks,

Chris
From: John Larkin
On Tue, 6 Jul 2010 12:33:43 -0700 (PDT), "kmaryan(a)gmail.com"
<kmaryan(a)gmail.com> wrote:

>In a step down converter, what is the mechanism that causes efficiency
>to be so poor when stepping down to low voltages? The typical case I'm
>dealing with is 12V in to 1V out at around 5A. Most canned buck
>converter designs quote high 80s to mid 90s efficiency when going from
>12V to 3.3V, but change the output to 1.0V and the efficiency drops by
>10% or more. It's generally worse at lower output voltages.
>
>What are the properties of the regulator that cause this? How does one
>go about designing a power supply to provide good efficiency for 12V
>to 1V @ 5A?

Usually it's the catch diode that burns power. Synchronous switchers
are much more efficient at low voltages.

John


From: Joel Koltner
<kmaryan(a)gmail.com> wrote in message
news:246d906d-c984-41c5-b220-e5ea1203afc8(a)a30g2000yqn.googlegroups.com...
> Most canned buck
> converter designs quote high 80s to mid 90s efficiency when going from
> 12V to 3.3V, but change the output to 1.0V and the efficiency drops by
> 10% or more. It's generally worse at lower output voltages.
>
> What are the properties of the regulator that cause this?

1) If a catch diode is used, even if it's only, say, 0.3-0.4V, that's a huge
chunk relative to 1V whereas not necessarily that bad out of 3.3V (...and of
course nothing at, say, 12V).
2) If instead a FET is used (a "synchronous rectifier"), it acts as a resistor
and the loss is (mostly) I^2*R. If you drop from 3.3V to 1V but want the same
output *power*, the *current* will increase by a factor of 3.3 and hence the
loss will increase by a factor of 3.3^2=10.9. Ouch!
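
A rough back-of-the-envelope sketch of those two effects, with assumed
component values (Vf, Rds(on)) and everything except conduction loss
ignored:

# Rough conduction-loss sketch for a buck converter. The diode Vf and
# FET Rds(on) values are assumed for illustration; switching, gate-drive
# and inductor losses are ignored.

def buck_conduction_losses(vin, vout, iout, vf=0.4, rds_on=0.010):
    d = vout / vin                      # high-side duty cycle
    p_out = vout * iout

    # Non-synchronous: the catch diode conducts for (1 - d) of each cycle,
    # so at 12V -> 1V it carries the full load current ~92% of the time.
    p_diode = vf * iout * (1 - d)
    eff_diode = p_out / (p_out + p_diode)

    # Synchronous: a low-side FET replaces the diode; its loss is roughly
    # I^2 * R and does not shrink as vout (and hence p_out) shrinks.
    p_fet = iout**2 * rds_on
    eff_sync = p_out / (p_out + p_fet)

    return eff_diode, eff_sync

for vout in (3.3, 1.0):
    ed, es = buck_conduction_losses(12.0, vout, 5.0)
    print(f"12V -> {vout}V @ 5A: diode version ~{ed:.0%}, synchronous ~{es:.0%}")

With these (made-up) numbers the diode alone caps a 12V-to-1V converter
around 73%, while the synchronous version stays around 95% before any of
the other loss terms are even counted.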

> How does one
> go about designing a power supply to provide good efficiency for 12V
> to 1V @ 5A?

I don't think there's any real panacea. One trade-off you might be able to
make is "making things bigger" -- if you aren't trying to build cell
phones or similarly tiny devices, choose a slower switcher: the switching
losses will be lower, the inductors will be less lossy, and EMI will be less
of a problem.
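
To put (made-up) numbers on that: two of the big loss terms that scale
directly with switching frequency are the hard-switching transition loss
and the gate-drive loss, so halving f_sw roughly halves them -- at the
cost of a bigger inductor for the same ripple current. A sketch, with the
device parameters assumed purely for illustration:

# Frequency-dependent loss terms in a buck converter. Transition times,
# gate charge and drive voltage below are assumed values; conduction
# losses are left out because they don't depend on switching frequency.

def frequency_dependent_loss(vin, iout, f_sw, t_rise=20e-9, t_fall=20e-9,
                             q_gate=20e-9, v_drive=5.0, n_fets=2):
    # Hard-switching overlap loss in the high-side FET (triangular estimate).
    p_switch = 0.5 * vin * iout * (t_rise + t_fall) * f_sw
    # Gate-drive loss: each FET's gate charge is cycled once per period.
    p_gate = q_gate * v_drive * f_sw * n_fets
    return p_switch + p_gate

for f_sw in (1e6, 500e3, 250e3):
    p = frequency_dependent_loss(12.0, 5.0, f_sw)
    print(f"f_sw = {f_sw/1e3:.0f} kHz: ~{p:.2f} W of frequency-dependent loss")

At 1V / 5A out (5W total), a watt or so of switching loss is a big chunk
of the budget, which is why slowing the switcher down can buy real
efficiency here.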

Linear Tech provides models for LTSpice for all of their switcher ICs, and
they make it very easy to go around and poke at parts and see how much each
one is dissipating. Hence you can spend your time attacking the "worst"
offenders" in your quest for higher efficiency...

Note that even 75% efficiency for 1V @ 5A is still a large improvement upon
trying to build a linear power supply with the same specs, which could manage
at best Vout/Vin = 1V/12V, or about 8%!

---Joel

From: kmaryan
On Jul 6, 4:18 pm, "Joel Koltner" <zapwireDASHgro...(a)yahoo.com> wrote:
> <kmar...(a)gmail.com> wrote in message
>
> news:246d906d-c984-41c5-b220-e5ea1203afc8(a)a30g2000yqn.googlegroups.com...
>
> > Most canned buck
> > converter designs quote high 80s to mid 90s efficiency when going from
> > 12V to 3.3V, but change the output to 1.0V and the efficiency drops by
> > 10% or more. It's generally worse at lower output voltages.
>
> > What are the properties of the regulator that cause this?
>
> 1) If a catch diode is used, even if it's only, say, 0.3-0.4V, that's a huge
> chunk relative to 1V whereas not necessarily that bad out of 3.3V (...and of
> course nothing at, say, 12V).
> 2) If instead a FET is used (a "synchronous rectifier"), it acts as a resistor
> and the loss is (mostly) I^2*R.  If you drop from 3.3V to 1V but want the same
> output *power*, the *current* will increase by a factor of 3.3 and hence the
> loss will increase by a factor of 3.3^2=10.9.  Ouch!


In the FET case (synchronous), why does the efficiency still drop so
substantially even at the same current, i.e. all else being equal?

Consider for example the figures in the datasheet for the LTC3850
(http://cds.linear.com/docs/Datasheet/38501fb.pdf - bottom of page 5).
The peak efficiency going from 12V to 3.3V at 2A is around 94% (middle
graph); the same configuration stepping down to 1.8V shows about 90% at
2A (left graph).

I guess my question really should be: if it's possible to design a
12V-to-3.3V, 5A regulator with mid-90s efficiency, why does it seem
impossible to design a 12V-to-1V, 5A regulator with mid-90s
efficiency? I can't find any designs that meet this spec, and I can't
come up with any good reasons why not. Even the CPU power supply
designs that I've been able to find (i.e. the kind that supply 1V at
100A for the processor core) only seem to have efficiency ratings
around the low-mid 80s. My only guess is something about transistor
operating mode issues.

Chris

From: Jeroen Belleman
kmaryan(a)gmail.com wrote:
> On Jul 6, 4:18 pm, "Joel Koltner" <zapwireDASHgro...(a)yahoo.com> wrote:
>> <kmar...(a)gmail.com> wrote in message
>>
>> news:246d906d-c984-41c5-b220-e5ea1203afc8(a)a30g2000yqn.googlegroups.com...
>>
>>> Most canned buck
>>> converter designs quote high 80s to mid 90s efficiency when going from
>>> 12V to 3.3V, but change the output to 1.0V and the efficiency drops by
>>> 10% or more. It's generally worse at lower output voltages.
>>> What are the properties of the regulator that cause this?
>> 1) If a catch diode is used, even if it's only, say, 0.3-0.4V, that's a huge
>> chunk relative to 1V whereas not necessarily that bad out of 3.3V (...and of
>> course nothing at, say, 12V).
>> 2) If instead a FET is used (a "synchronous rectifier"), it acts as a resistor
>> and the loss is (mostly) I^2*R. If you drop from 3.3V to 1V but want the same
>> output *power*, the *current* will increase by a factor of 3.3 and hence the
>> loss will increase by a factor of 3.3^2=10.9. Ouch!
>
>
> In the FET case (synchronous), why does the efficiency still drop so
> substantially even at the same current? i.e. all else being equal.
> [...]

Isn't it obvious yet? For a given output current, losses are
more or less constant, while the output power scales with
output voltage. So Pout/Ploss goes down with output voltage.
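
Putting numbers on that with the LTC3850 figures quoted above (read
roughly off the datasheet curves at 2A):

# Implied loss power behind the two LTC3850 efficiency points quoted
# above (values read approximately from the datasheet graphs at 2 A).

cases = [
    # (vout in volts, efficiency read from the graph)
    (3.3, 0.94),
    (1.8, 0.90),
]

for vout, eff in cases:
    p_out = vout * 2.0                   # output power at 2 A
    p_loss = p_out * (1.0 / eff - 1.0)   # loss implied by the efficiency
    print(f"{vout} V out: Pout = {p_out:.1f} W, implied loss ~ {p_loss:.2f} W")

That works out to roughly 0.42W of loss at 3.3V out and 0.40W at 1.8V
out -- nearly the same dissipation, but divided into a much smaller
output power in the second case.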

Jeroen Belleman