From: Tom St Denis on
On Mar 15, 1:05 pm, Carsten Krueger <cakru...(a)gmail.com> wrote:
> Complete source can be found: http://www.withopf.com/tools/securstick/encrsrc.zip
>
> Is this a secure key derivation function?

Why not just use PKCS #5 Algorithm 2? Then you can say you're doing
PKCS #5 as opposed to having to show code so people can figure out
what you're doing....
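
For reference, one output block of PKCS #5 Algorithm 2 looks roughly
like this in C (an untested sketch; prf() stands in for whatever PRF
you instantiate it with, e.g. HMAC-SHA-1, and is assumed, not shown):

#include <stdint.h>
#include <string.h>

#define PRF_LEN 20                 /* output size of the assumed PRF */

/* assumed, not provided: a PRF such as HMAC-SHA-1 */
void prf(const uint8_t *key, size_t keylen,
         const uint8_t *msg, size_t msglen, uint8_t out[PRF_LEN]);

/* one output block T_i of PKCS #5 Algorithm 2 */
void pbkdf2_block(const uint8_t *pwd, size_t pwdlen,
                  const uint8_t *salt, size_t saltlen, /* <= 1020 */
                  uint32_t c, uint32_t i, uint8_t out[PRF_LEN])
{
    uint8_t msg[1024], u[PRF_LEN], t[PRF_LEN];
    uint32_t j;
    size_t k;

    /* U_1 = PRF(P, S || INT(i)), block index big-endian */
    memcpy(msg, salt, saltlen);
    msg[saltlen]     = (uint8_t)(i >> 24);
    msg[saltlen + 1] = (uint8_t)(i >> 16);
    msg[saltlen + 2] = (uint8_t)(i >> 8);
    msg[saltlen + 3] = (uint8_t)i;
    prf(pwd, pwdlen, msg, saltlen + 4, u);
    memcpy(out, u, PRF_LEN);

    /* U_j = PRF(P, U_{j-1});  T_i = U_1 xor U_2 xor ... xor U_c */
    for (j = 1; j < c; j++) {
        prf(pwd, pwdlen, u, PRF_LEN, t);
        memcpy(u, t, PRF_LEN);
        for (k = 0; k < PRF_LEN; k++)
            out[k] ^= u[k];
    }
}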

Tom
From: Kristian Gjøsteen on
Carsten Krueger <cakruege(a)gmail.com> wrote:
>On Tue, 16 Mar 2010 04:48:11 -0700 (PDT), Tom St Denis wrote:
>
>> Why not just use PKCS #5 Algorithm 2?
>
>That is exactly the point I made to the author of this software.
>
>I'm unsure if the algorithm is secure and I would like to have a second
>opinion.

Try the algorithm with two different passwords that share the same
1024-character prefix.

Now try the same with PKCS #5 Algorithm 2.
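
A sketch of the experiment (kdf() is a stand-in for the function from
encrsrc.zip; the 64-byte output and 8-byte salt are assumptions):

#include <stdio.h>
#include <string.h>

/* assumed stand-in for the KDF from encrsrc.zip */
void kdf(const char *pwd, const unsigned char salt[8],
         unsigned char key[64]);

int main(void)
{
    static char a[1100], b[1100];
    unsigned char salt[8] = {0}, ka[64], kb[64];

    memset(a, 'x', 1024); strcpy(a + 1024, "password-one");
    memset(b, 'x', 1024); strcpy(b + 1024, "password-two");

    kdf(a, salt, ka);
    kdf(b, salt, kb);

    /* a KDF that hashes only a bounded prefix collides here */
    puts(memcmp(ka, kb, 64) ? "keys differ" : "KEYS COLLIDE");
    return 0;
}

A KDF that only sees a bounded prefix of the password prints the
collision; PKCS #5 Algorithm 2 does not.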

--
Kristian Gjøsteen
From: Joseph Ashwood on
"Carsten Krueger" <cakruege(a)gmail.com> wrote in message
news:8o3p72rzl04e$.dlg(a)cakruege.my-fqdn.de...
> Complete source can be found:
> http://www.withopf.com/tools/securstick/encrsrc.zip
>
> Is this a secure key derivation function?

No, it isn't. It has a race condition on Result. It does not check
that InData is properly initialized (it is actually non-deterministic,
making it unusable). It has major endian issues (not a problem in a
homogeneous environment, but it isn't portable). The salt value is not
always used, leading to insecurities. With that said, it appears that
what this mess is trying to do is

Key = Whirlpool(Pwd | Pwd | ... | Pwd | SaltLo | SaltHi)

which could be secure depending on outside variables. Regardless, it
is poorly written.
Joe

From: Joseph Ashwood on
"Carsten Krueger" <cakruege(a)gmail.com> wrote in message
news:ip0m09syez8c$.dlg(a)cakruege.my-fqdn.de...
> On Tue, 16 Mar 2010 15:14:57 -0700, Joseph Ashwood wrote:
>
>> No it isn't. It has a race condition on Result. It does not check that
>> InData is properly initialized (it is actually non-deterministic, making
>> it
>> unusable).
>
> I'm unsure, but maybe the C standard guarantees that it's
> initialized to zero.

The C standard is very specific that it does NOT do that: automatic
variables have indeterminate values until written.

>
>> It has major endian issues
>
> I didn't check. Is the salting the problem?

Yes, in particular the use of shifts to parse the number. Big and
little endian machines will generate different results for the
computation.
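
For illustration, the usual shape of this class of bug (not the actual
code from the archive): reinterpreting the salt's raw bytes as an
integer gives different numbers on different machines, while
assembling the value byte by byte does not:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const uint8_t salt[4] = {0x01, 0x02, 0x03, 0x04};
    uint32_t punned, portable;

    /* reinterpreting raw bytes: 0x04030201 on LE, 0x01020304 on BE */
    memcpy(&punned, salt, 4);

    /* assembling byte by byte: the same value on every machine */
    portable = ((uint32_t)salt[0] << 24) | ((uint32_t)salt[1] << 16)
             | ((uint32_t)salt[2] << 8)  |  (uint32_t)salt[3];

    printf("punned=%08x portable=%08x\n",
           (unsigned)punned, (unsigned)portable);
    return 0;
}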

>
>> The salt value is not always used
>> leading to insecurities.
>
> It's used every time.

If the password is the maximum length, the salt is omitted completely.
If the password is close to the maximum length, only part of the salt
is used. This is the same flaw Kristian pointed out.
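
Schematically (an illustrative reconstruction, not the actual source;
the 1024-byte buffer and 8-byte salt are assumptions):

#include <string.h>

#define BUFLEN 1024

/* repeat the password into a fixed hash-input buffer, then append
 * the salt into whatever room is left; pwdlen assumed > 0 */
size_t fill_input(unsigned char buf[BUFLEN],
                  const char *pwd, const unsigned char salt[8])
{
    size_t pwdlen = strlen(pwd), pos = 0, room;

    while (pos + pwdlen <= BUFLEN) {   /* fill with repeated password */
        memcpy(buf + pos, pwd, pwdlen);
        pos += pwdlen;
    }

    room = BUFLEN - pos;               /* 0 for a max-length password */
    if (room > 8)
        room = 8;
    memcpy(buf + pos, salt, room);     /* possibly no salt at all */
    return pos + room;
}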

>
>>With that said, it appears what this mess is trying
>> to do is
>>
>> Key = Whirlpool(Pwd | Pwd | ... | Pwd | SaltLo | SaltHi)
>
> I don't think so. It's only hard to read.

Maybe; I might have misread, but it won't make any difference to the
security.

> Seems to be:
>   key = hash(password + salt)
>   for 1 to 65000 do
>     key = hash(key + password + salt)
>
> I don't like the idea that the password is added every time.
>
> The normal version looks more secure to me:
>   key = hash(password + salt)
>   for 1 to 65000 do
>     key = hash(key + salt)

Actually the version that introduces the password every time is more
secure. The reason comes down to maintenance of entropy: because of
the non-invertibility of a hash, it will slowly lose entropy with each
iteration, and in theory this will eventually approach, and perhaps
even reach, 0. By adding the password in every time, all lost entropy
is reintroduced, creating a system with a maximum entropy very close
to the limit for Whirlpool. For only a few thousand iterations the
difference should be largely unnoticeable, but those same largely
unnoticeable mistakes have a nasty habit of being catastrophic. It
still has the other issues listed (race condition, uninitialized
memory, endian issues), and any one of these is sufficient to
introduce significant weaknesses.
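
Concretely, the iteration being endorsed here is the following (a
sketch; whirlpool() is a placeholder for a real implementation of the
512-bit hash, and the 8-byte salt is an assumption):

#include <assert.h>
#include <stdint.h>
#include <string.h>

/* placeholder for a real Whirlpool implementation */
void whirlpool(const uint8_t *in, size_t inlen, uint8_t out[64]);

void derive(const char *pwd, const uint8_t salt[8], uint8_t key[64])
{
    uint8_t buf[64 + 256 + 8];
    size_t pwdlen = strlen(pwd), n;

    assert(pwdlen <= 256);            /* fixed buffer, sketch only */

    /* key = H(pwd | salt) */
    memcpy(buf, pwd, pwdlen);
    memcpy(buf + pwdlen, salt, 8);
    whirlpool(buf, pwdlen + 8, key);

    /* key = H(key | pwd | salt), refolding the password each round */
    for (n = 0; n < 65000; n++) {
        memcpy(buf, key, 64);
        memcpy(buf + 64, pwd, pwdlen);
        memcpy(buf + 64 + pwdlen, salt, 8);
        whirlpool(buf, 64 + pwdlen + 8, key);
    }
}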

The race condition is the most immediately worrisome. I've personally
used race conditions on several occasions to break security; usually
they are used for privilege escalation, but a race is just a variation
of a timing attack, which is never a good thing in cryptography.
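
The pattern in question looks like this (reconstructed from the
symptom, not copied from the source; the loop is a trivial stand-in
for the real KDF):

#include <stddef.h>
#include <stdint.h>
#include <string.h>

static uint8_t Result[64];          /* one buffer shared by all callers */

/* racy shape: two threads deriving at once interleave their writes */
const uint8_t *derive_key_racy(const char *pwd)
{
    memset(Result, 0, sizeof Result);
    for (size_t i = 0; pwd[i]; i++)
        Result[i % 64] ^= (uint8_t)pwd[i];
    return Result;                  /* may already be clobbered */
}

/* safe shape: the caller owns the output, no shared state */
void derive_key_safe(const char *pwd, uint8_t out[64])
{
    memset(out, 0, 64);
    for (size_t i = 0; pwd[i]; i++)
        out[i % 64] ^= (uint8_t)pwd[i];
}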

The salt issue is a longer-term problem: it means that the largest
keys are actually less secure than slightly smaller ones.

The lack of memory initialization has always been complex to use in an
attack, but it is closely related to buffer overflow attacks.

I also just noticed that it has a 32/64-bit bug. It's mostly a
portability issue, but still problematic: another buffer-overflow-like
attack vector, useful in combination with the uninitialized memory.
The line "unsigned int ReqSize = sizeof SaltLo + sizeof SaltHi;" will
produce different values on 16-, 32- and 64-bit machines, but the next
few lines assume a 32-bit integer. On a 64-bit machine this leaves 64
bits of the buffer under the control of something else. Never a
good thing.
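
The mismatch is easy to demonstrate (the variable names come from the
source; the types here are assumptions for illustration):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    long     SaltLo   = 0, SaltHi   = 0; /* 4 bytes on ILP32, 8 on LP64 */
    uint32_t SaltLo32 = 0, SaltHi32 = 0; /* 4 bytes everywhere */

    printf("long-based ReqSize:   %u\n",
           (unsigned)(sizeof SaltLo + sizeof SaltHi));
    printf("uint32-based ReqSize: %u\n",
           (unsigned)(sizeof SaltLo32 + sizeof SaltHi32));
    return 0;
}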

The goal may have been secure, and the original design may even be, but the
implementation is not secure. There are critical programming flaws involved.
Joe

From: Joseph Ashwood on
"Carsten Krueger" <cakruege(a)gmail.com> wrote in message
news:zhclrpwfs33t.dlg(a)cakruege.my-fqdn.de...
> On Fri, 19 Mar 2010 01:07:06 -0700, Joseph Ashwood wrote:
>
>> Yes, in particular the use of shifts to parse the number. Big and
>> little endian machines will generate different results for the
>> computation.
>
> No. I tested it on both architectures.

Good trick: the only big-endian systems available in the last nearly
20 years are 64-bit, but the bit processing here is fundamentally
32-bit.

>> If the password is the maximum length the salt is avoided completely. If
>> the
>> password is close to the maximum length, then only part of the salt is
>> used.
>> This is the same flaw as pointed out by Kristian.
>
> Ok
>
>> Actually the version that introduces password every time is more secure.
>
> A flaw in the hash function could mean that an attacker does not
> need to do 10,000 rounds to compare a password against the hash, but
> only one (the last) round. The key stretching would be weaker.

You're incorrect again. The complexity of hash^10000 with the password
folded in each round is the same as the complexity of plain
hash^10000; the same attack would likely yield a shortcut for both.
The only real difference is the maintenance of entropy.

>
>> The reason comes down to maintenance of entropy: because of the
>> non-invertibility of a hash, it will slowly lose entropy with each
>> iteration, and in theory this will eventually approach, and perhaps
>> even reach, 0.
>
> Do you have a pointer to papers, etc?

It's simple probability; do the math. I don't feel like doing the
complete math, but by the birthday paradox, after 2^256 inputs there
is a 50% likelihood of a collision in the hash output; since there are
2^512 possible inputs, the total entropy is reduced. With each
repetition the entropy necessarily shrinks.
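
A toy model makes the shrinkage visible: treat the hash as a random
function on N points, where the expected image size after each round
is about N*(1 - e^(-n/N)). N here is tiny next to 2^512, just to show
the trend:

#include <math.h>
#include <stdio.h>

int main(void)
{
    const double N = 1e6;   /* toy domain size */
    double n = N;
    int k;

    for (k = 1; k <= 65000; k++) {
        /* expected image size after one more round */
        n = N * (1.0 - exp(-n / N));
        if (k == 1 || k == 10 || k == 100 || k == 1000 || k == 65000)
            printf("after %5d rounds: ~%.0f reachable outputs "
                   "(%.2f bits lost)\n", k, n, log2(N / n));
    }
    return 0;
}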

>> uninitialized memory,
>
> The memory is not set to zero, but it's never read before it gets
> overwritten with data.

Probably, but making that assumption is the same mistake in logic that has
led to countless buffer overflow attacks.
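
The cheap way to remove the doubt entirely is to zero the buffer up
front (a sketch; the buffer size is an assumption):

#include <string.h>

void prepare(void)
{
    unsigned char InData[1024];        /* indeterminate contents in C */
    memset(InData, 0, sizeof InData);  /* deterministic by construction */
    /* ... fill with password and salt, hash as before ... */
}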

You keep trying to justify the security, but no amount of
justification can change the truth: it is insecure.
Joe