From: unruh on
On 2010-05-24, Paul Rubin <no.email(a)nospam.invalid> wrote:
> "Joseph Ashwood" <ashwood(a)msn.com> writes:
>> While a perfect number is impossible, if you have a large enough set
>> of users you can check the passwords against each other; this gives a
>> distribution for general purposes.
>
> That doesn't make any sense. Each person picks a password from their
> own distribution. You can't usefully treat them as being drawn from one
> monstrous distribution. There's a bunch of cheesy tests you can use to
> filter out obviously bad passwords, but in the end if you're running a
> high-security application, you simply can't rely on passwords for
> authentication. If you're running a casual web forum or the like, you
> don't have to worry too much about password entropy.
>

The question is not "what is the entropy of the passwords as an abstract
exercise" but "what is the password entropy given the attacker's plan of
attack." I.e., it is more about the attacker. Thus if a user uses
AvjU7^%hJrtM
as their password, and the attacker has a strategy which chooses that
as the first password to try, it has extremely low entropy given the
attacker's strategy.

Of course it is pretty unlikely that the attacker's strategy will pick
it as the first try (unless the user, for example, published it on their
web page).
The key is that there is no "entropy of a password". One can only make
reasonable assumptions about the attacker's strategy and hope they are not
too far off. Given those assumptions one can estimate the entropy.
> Also, checking passwords against each other isn't so good since it means
> you're storing them as unsalted hashes or even in the clear.
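
To put a number on "entropy given the attacker's strategy": what matters
is how early that strategy would try your password. A minimal Python
sketch (the wordlist and strategy below are made-up placeholders, not
anyone's real attack):

    import math

    def guess_rank(password, guess_order):
        # 1-based position of `password` in the attacker's guess sequence,
        # or None if this strategy never tries it.
        for rank, guess in enumerate(guess_order, start=1):
            if guess == password:
                return rank
        return None

    def effective_bits(password, guess_order):
        # log2 of the number of guesses this particular strategy needs;
        # the same password scores differently under different strategies.
        rank = guess_rank(password, guess_order)
        return math.log2(rank) if rank is not None else float("inf")

    # Hypothetical attacker strategy: a tiny dictionary, most likely first.
    strategy = ["123456", "password", "qwerty", "AvjU7^%hJrtM"]

    print(effective_bits("password", strategy))      # 1.0  (tried 2nd)
    print(effective_bits("AvjU7^%hJrtM", strategy))  # 2.0  (tried 4th)
    print(effective_bits("zebra42", strategy))       # inf  (never tried)

Under an attacker who somehow ranks AvjU7^%hJrtM first it costs ~0 bits
of work; under one that never tries it, the cost is effectively unbounded.
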
From: unruh on
On 2010-05-24, Maaartin <grajcar1(a)seznam.cz> wrote:
> On May 24, 12:34 pm, Paul Rubin <no.em...(a)nospam.invalid> wrote:
>> The maximum entropy as worst case for brute force search?  Sure, you can
>> calculate that the obvious way, H = log2(k**n), where k is the size of
>> the alphabet and n the length.  But that is pretty useless, especially
>> since the searcher won't normally know the length of the passphrase (it
>> could be very long).
>
> I wonder how closely the expected time of a brute force search is
> related to the entropy. Imagine me picking a 10-character random
> password consisting of letters only, where I'm biased 80:20 against
> capitals. The entropy is only 54 bits instead of 57; does that mean the
> search takes 8 times less time?

If the attacker adapts his search strategy to take that into account,
yes.
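
For what it's worth, the numbers in the question check out. A quick
Python sketch of the arithmetic (the 80:20 per-character split, uniform
within each case, is just the assumption stated above):

    import math

    # 52 letters: 26 lowercase, 26 uppercase.  Per character the user picks
    # lowercase with total probability 0.8 and uppercase with 0.2,
    # uniformly within each case.
    p_lower = 0.8 / 26
    p_upper = 0.2 / 26

    # Shannon entropy per character, in bits.
    h_char = -(26 * p_lower * math.log2(p_lower) +
               26 * p_upper * math.log2(p_upper))

    n = 10                                      # password length
    print(n * h_char)                           # ~54.2 bits (biased)
    print(n * math.log2(52))                    # ~57.0 bits (unbiased)
    print(2 ** (n * (math.log2(52) - h_char)))  # ~6.9, roughly the factor
                                                # of 8 the 3-bit gap suggests

The factor only materializes if the attacker orders guesses to exploit
the bias; a uniform brute force over all 52**10 strings gains nothing
from it.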

From: Mok-Kong Shen on
unruh wrote:

> The question is not "what is the entropy of the passwords as an abstract
> exercise" but "what is the password entropy given the attacker's plan of
> attack." I.e., it is more about the attacker. Thus if a user uses
> AvjU7^%hJrtM
> as their password, and the attacker has a strategy which chooses that
> as the first password to try, it has extremely low entropy given the
> attacker's strategy.
>
> Of course it is pretty unlikely that the attacker's strategy will pick
> it as the first try (unless the user, for example, published it on their
> web page).
> The key is that there is no "entropy of a password". One can only make
> reasonable assumptions about the attacker's strategy and hope they are not
> too far off. Given those assumptions one can estimate the entropy.
>> Also, checking passwords against each other isn't so good since it means
>> you're storing them as unsalted hashes or even in the clear.

If there is no "entropy of a password", could there be "entropy of a
message in general"? I am afraid that the existence or non-existence
of both are somehow tightly related.

M. K. Shen


From: Gordon Burditt on
>> The key is that there is no "entropy of a password". One can only make
>> reasonable assumptions about the attacker's strategy and hope they are not
>> too far off. Given those assumptions one can estimate the entropy.
>>> Also, checking passwords against each other isn't so good since it means
>>> you're storing them as unsalted hashes or even in the clear.
>
>If there is no "entropy of a password", could there be "entropy of a
>message in general"? I am afraid that the existence or non-existence
>of both are somehow tightly related.

Messages are not chosen randomly. They are also usually chosen to have
meaning, which limits the entropy to roughly a bit or two per character
if the message is in English or another human language. That's a lot
less than for random letters or random words strung together.
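
As a rough back-of-the-envelope comparison (the ~1.3 bits per character
for written English is the usual textbook estimate; the word-list size
and average word length are assumptions picked just for illustration):

    import math

    avg_word_len = 5.5                     # rough average English word length

    # Approximate entropy, in bits per word:
    english_text   = 1.3 * avg_word_len    # ~7: meaningful English prose
    random_words   = math.log2(7776)       # ~12.9: words drawn uniformly from
                                           #        a Diceware-sized word list
    random_letters = math.log2(26) * avg_word_len   # ~25.9: uniform a-z strings

    print(english_text, random_words, random_letters)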

From: Paul Rubin on
unruh <unruh(a)wormhole.physics.ubc.ca> writes:
> The key is that there is no "entropy of a password". One can only make
> reasonable assumptions about the attacker's strategy

I see "entropy of a password" as shorthand for "entropy of the
distribution that the password is drawn from". The attacker's obvious
strategy is to model the distribution as closely as possible, then
search starting from the most probable passwords.
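
A minimal sketch of that strategy, assuming the attacker has some
probability model of the user population (the toy frequency table below
is made up purely for illustration):

    def guesses_by_probability(model):
        # model: dict mapping candidate password -> estimated probability.
        # Yield candidates starting from the most probable.
        for pw, _p in sorted(model.items(), key=lambda kv: kv[1], reverse=True):
            yield pw

    def crack(check, model):
        # check(pw) -> True if pw is correct (e.g. it matches a stolen hash).
        for tries, pw in enumerate(guesses_by_probability(model), start=1):
            if check(pw):
                return pw, tries
        return None, None

    # Hypothetical model, e.g. built from leaked-password frequency counts.
    model = {"123456": 0.01, "password": 0.008, "letmein": 0.002,
             "AvjU7^%hJrtM": 1e-18}

    print(crack(lambda pw: pw == "letmein", model))   # ('letmein', 3)

The closer the model matches the distribution users actually draw from,
the fewer guesses it needs on average, which is exactly why "entropy of
the distribution" is the useful quantity.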