From: jbriggs444 on
On May 26, 7:16 am, Phoenix <ribeiroa...(a)gmail.com> wrote:
> On 24 May, 22:58, Mok-Kong Shen <mok-kong.s...(a)t-online.de> wrote:
>
> > If there is not "entropy of a password", could there be "entropy of a
> > message in general"?
>
> Yes
>
> > I am afraid that the existence/non-existence
>
> > of both are somehow tightly related.
>
> No
>
> See an example:
>
> key/Password/Passphrase = "aaaaaaaaaaaaaaaaaaaa"
> Plaintext/message = "aaaaaaaaaaaaaaaaaaaaaaa..."
> Ciphertext = high-quality entropy and other statistical values

> The entropy value for the ciphertext depends on the algorithm.

Eh? It depends on the set of possible algorithms
and the probability distribution for choosing an algorithm from that
set.

If the algorithm is ROT13 and I know that the algorithm is ROT13
and I know the plaintext then the entropy in the ciphertext is
zero.

If the algorithm is AES-256 and I know the algorithm is AES-256
and I know the plaintext and key then again, the entropy in the
ciphertext is zero.

In some sense, if I look at the ciphertext and see what it is then
the entropy of that ciphertext is zero. It is what it is. With
probability 100%.

On the other hand, I can look at the ciphertext as one possible
ciphertext out of all the possible ciphertexts that could have
been generated if the algorithm was unknown but chosen from
some knowable distribution. I could do this while holding
plaintext and key constant.

In this sense, the "entropy of this particular ciphertext" can be
taken as the negative log of the probability that this particular
ciphertext would result from encoding the fixed plaintext with
the fixed key using a randomly selected algorithm.

The average entropy is given by the classical formula:

sum { p(c) * -log(p(c)) } over all possible ciphertexts c

If you have a hundred possible algorithms, known plaintext
and known key and you have a 64 bit ciphertext, the average
entropy in the ciphertext is bounded by 7 bits, not 64.
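That bound is easy to check numerically. A minimal sketch, assuming the
worst case for the attacker: 100 equally likely algorithms, each mapping
the fixed plaintext and key to a distinct 64-bit ciphertext, so every
possible ciphertext has probability 1/100 (the function name and setup
are illustrative, not from any particular library):

```python
import math

def avg_entropy(probs):
    """Shannon entropy: sum over outcomes of p * -log2(p)."""
    return sum(p * -math.log2(p) for p in probs if p > 0)

# Hypothetical setup: 100 equally likely algorithms, plaintext and key
# known, and each algorithm producing a distinct 64-bit ciphertext,
# so each ciphertext occurs with probability 1/100.
probs = [1 / 100] * 100

h = avg_entropy(probs)
print(h)  # ~6.64 bits -- bounded by log2(100) < 7, nowhere near 64
```

Any non-uniform choice of algorithm, or collisions between ciphertexts,
only pushes the entropy lower than log2(100).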

That's _low_ quality entropy.
From: Mok-Kong Shen on
jbriggs444 wrote:
> Phoenix wrote:
>> On 24 May, 22:58, Mok-Kong Shen<mok-kong.s...(a)t-online.de> wrote:
>>
>>> If there is not "entropy of a password", could there be "entropy of a
>>> message in general"?
>>
>> Yes
>>
>>> I am afraid that the existence/non-existence
>>
>>> of both are somehow tightly related.
>>
>> No
>>
>> See an example:
>>
>> key/Password/Passphrase = "aaaaaaaaaaaaaaaaaaaa"
>> Plaintext/message = "aaaaaaaaaaaaaaaaaaaaaaa..."
>> Ciphertext = high-quality entropy and other statistical values
>>
>> The entropy value for the ciphertext depends on the algorithm.
>
> Eh? It depends on the set of possible algorithms
> and the probability distribution for choosing an algorithm from that
> set.
>
> If the algorithm is ROT13 and I know that the algorithm is ROT13
> and I know the plaintext then the entropy in the ciphertext is
> zero.
>
> If the algorithm is AES-256 and I know the algorithm is AES-256
> and I know the plaintext and key then again, the entropy in the
> ciphertext is zero.
>
> In some sense, if I look at the ciphertext and see what it is then
> the entropy of that ciphertext is zero. It is what it is. With
> probability 100%.
>
> On the other hand, I can look at the ciphertext as one possible
> ciphertext out of all the possible ciphertexts that could have
> been generated if the algorithm was unknown but chosen from
> some knowable distribution. I could do this while holding
> plaintext and key constant.
>
> In this sense, the "entropy of this particular ciphertext" can be
> taken as the negative log of the probability that this particular
> ciphertext would result from encoding the fixed plaintext with
> the fixed key using a randomly selected algorithm.
>
> The average entropy is given by the classical formula:
>
> sum { p(c) * -log(p(c)) } over all possible ciphertexts c
>
> If you have a hundred possible algorithms, known plaintext
> and known key and you have a 64 bit ciphertext, the average
> entropy in the ciphertext is bounded by 7 bits, not 64.
>
> That's _low_ quality entropy.

The concept of entropy (in information theory, CS) goes back to Shannon.
Could you point out where Shannon mentions the dependence on
"algorithms" in his works? Or is that your novel "insight"?

M. K. Shen

From: Bryan on
Mok-Kong Shen wrote:
> I am, as you already know, always ready to acknowledge that I am a
> layman with poor knowledge/memory (clueless, etc.),

Yet you've been sci.crypt's most prolific participant over the course
of decades. Were you *trying* to avoid learning anything?

> To make a similar argument in the present case, one would go like this:
> Suppose the password is to be lower-case 4 characters long and a user
> gives 'mary'. Compare the two cases: (1) He uses a perfectly random
> mechanism giving each letter a probability of 1/26, in which case this
> password has a probability of occurring of (1/26)^4. (2) He is a rather
> careless and lazy person, and his new girlfriend is Mary, in
> which case this password understandably has a fairly large probability.
> So the password 'mary' as such "alone" cannot be ascribed any
> probability value.

A message space assigns a probability to each message, and in those
two cases the message spaces are different.
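To make that concrete, here is a minimal sketch of the two message
spaces for the quoted 'mary' example. The 10% figure for the lazy
user is a made-up illustrative assumption, not anything from the
thread:

```python
import math

# Hypothetical message spaces for 4-letter lowercase passwords.

# Case 1: uniformly random letters -- every password, 'mary' included,
# has probability (1/26)^4.
p_uniform = (1 / 26) ** 4

# Case 2: a lazy user whose girlfriend is Mary -- suppose (made-up
# figure) a 10% chance the password is exactly 'mary'.
p_lazy = 0.10

# The surprisal -log2(p) of the same string differs between the spaces:
print(-math.log2(p_uniform))  # ~18.8 bits
print(-math.log2(p_lazy))     # ~3.32 bits
```

Same string, two different probabilities, two different surprisal
values: the number attaches to the message space, not to 'mary' alone.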

> All this seems to argue for what I referred to previously, namely that
> the entropy concept can only be applied to the "source" of randomness
> and not to a finite string that one has at hand. As said, I don't claim
> that it is o.k. (it's not my argument!). I only want to know how to
> clearly "refute" that argument, if it is indeed wrong.

To do that you'd have to put forth a serious effort toward studying the
material. I don't see the point in trying to engage in arguments you
do not understand.

--
--Bryan
From: Maaartin on
On May 26, 6:38 pm, Mok-Kong Shen <mok-kong.s...(a)t-online.de> wrote:
> The concept of entropy (in information theory, CS) goes back to Shannon.
> Could you point out where Shannon mentions the dependence on
> "algorithms" in his works. Or is that your novel "insight"?

I don't know his work, but it follows logically. Put very informally, the
entropy *here* is the amount of information you gain when you obtain
the ciphertext, which here is a function of a known plaintext, a known
key, and an unknown cipher. The only missing piece of information is the
cipher; with one of 128 ciphers selected uniformly at random, that
makes 7 bits.
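As a one-line check of that arithmetic (the variable names are just
illustrative):

```python
import math

# One of 128 ciphers chosen uniformly at random; plaintext and key known.
# The ciphertext can reveal at most which cipher was used:
n_ciphers = 128
entropy = -math.log2(1 / n_ciphers)
print(entropy)  # 7.0 bits
```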
From: Mok-Kong Shen on
Bryan worte:
> Mok-Kong Shen wrote:
>> I am, as you already know, always ready to acknowledge that I am a
>> layman with poor knowledge/memory (clueless, etc.),
>
> Yet you've been sci.crypt's most prolific participant over the course
> of decades. Were you *trying* to avoid learning anything?
>
>> To make a similar argument in the present case, one would go like this:
>> Suppose the password is to be lower-case 4 characters long and a user
>> gives 'mary'. Compare the two cases: (1) He uses a perfectly random
>> mechanism giving each letter a probability of 1/26, in which case this
>> password has a probability of occurring of (1/26)^4. (2) He is a rather
>> careless and lazy person, and his new girlfriend is Mary, in
>> which case this password understandably has a fairly large probability.
>> So the password 'mary' as such "alone" cannot be ascribed any
>> probability value.
>
> A message space assigns a probability to each message, and in those
> two cases the message spaces are different.
>
>> All this seems to argue for what I referred to previously, namely that
>> the entropy concept can only be applied to the "source" of randomness
>> and not to a finite string that one has at hand. As said, I don't claim
>> that it is o.k. (it's not my argument!). I only want to know how to
>> clearly "refute" that argument, if it is indeed wrong.
>
> To do that you'd have to put forth a serious effort toward studying the
> material. I don't see the point in trying to engage in arguments you
> do not understand.

But that clearly indicates that the question I asked isn't as
trivial as you claimed, right?

BTW, Mr. Olson, I finally have to ask you a question clearly: Why do
you "bother" to question my (poor) knowledge etc. instead of looking
into the substance of what I post? If you consider it all rubbish,
then the best way is to ignore it (and I have recommended more than
once that people put me in their kill-files if they don't like my
posts). For, if you are intelligent enough, you must have realized by
now that your continued endeavours to stop my posting won't work, and
therefore you are only wasting the bandwidth of the group and, more
importantly, your own precious time with your personal attacks.

M. K. Shen