From: WTShaw on
On Dec 10, 9:28 am, Mok-Kong Shen <mok-kong.s...(a)t-online.de> wrote:
> WTShaw wrote:
> > One important fact is that given different ciphers, the amount of
> > ciphertext needed to solve a message varies.  If only a few characters
> > are used, perhaps nothing conclusive can be solved.  A good question
> > regards the level of possible ambiguity, with one threshold for
> > getting anything meaningful and another for being conclusive.  This
> > refers to systems built on one concise algorithm.
>
> > Since the levels vary with algorithms, there would be a consideration
> > in finding multiple algorithms that did not lessen the strength of any
> > component level.
>
> If I don't err, a cascade of two encryption systems of different
> nature wouldn't weaken any of them, e.g. cascading transposition and
> substitution in classical crypto.
>
> M. K. Shen

Right, a complementary arrangement of two primitives does exactly
that. This means that finding primitive elements that can add
strength to each other is the way to go.

If strength for an algorithm is ideally to be in the keys alone, then
combinations of primitives that utilize stronger keys are a good
idea. That's pretty straightforward.
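As a toy sketch of such a cascade, hedged accordingly: the
substitution key and column width below are invented for
illustration, and the two primitives are deliberately minimal.

```python
import string

def substitute(text, key):
    # Monoalphabetic substitution: `key` is a 26-letter permutation
    # of the lowercase alphabet.
    return text.translate(str.maketrans(string.ascii_lowercase, key))

def transpose(text, width):
    # Simple columnar transposition: write rows of `width`, read columns.
    rows = [text[i:i + width] for i in range(0, len(text), width)]
    return "".join(row[c] for c in range(width) for row in rows
                   if c < len(row))

# Cascade the two primitives; each keeps its own key material.
sub_key = "qwertyuiopasdfghjklzxcvbnm"  # illustrative key only
ciphertext = transpose(substitute("attackatdawn", sub_key), 4)
```

Neither step weakens the other here: the transposition scrambles
positions, the substitution scrambles identities, and the combined
keyspace is the product of the two.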
From: Mok-Kong Shen on
WTShaw wrote:
> Mok-Kong Shen wrote:
>> WTShaw wrote:
>>> One important fact is that given different ciphers, the amount of
>>> ciphertext needed to solve a message varies. If only a few characters
>>> are used, perhaps nothing conclusive can be solved. A good question
>>> regards the level of possible ambiguity, with one threshold for
>>> getting anything meaningful and another for being conclusive. This
>>> refers to systems built on one concise algorithm.
>>> Since the levels vary with algorithms, there would be a consideration
>>> in finding multiple algorithms that did not lessen the strength of any
>>> component level.

>> If I don't err, a cascade of two encryption systems of different
>> nature wouldn't weaken any of them, e.g. cascading transposition and
>> substitution in classical crypto.
>
> Right, a complementary arrangement of two primitives does exactly
> that. This means that finding primitive elements that can add
> strength to each other is the way to go.
>
> If strength for an algorithm is ideally to be in the keys alone, then
> combinations of primitives that utilize stronger keys are a good
> idea. That's pretty straightforward.

I suppose you brought up in the 2nd paragraph above a theme that seems
to be fairly neglected. Let me therefore ask some questions. Doesn't
the design of any cipher generally presuppose that the user employs
a strong key? If the answer is no, wouldn't a user then need to know
how tolerant a given cipher is to weak keys? Further, in order to
know in that case whether his use of the cipher is safe, how could he
'measure' the strength/weakness of his key in quantitative terms in
practice, to determine whether the requirement is fulfilled?

Thanks,

M. K. Shen
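One common quantitative handle on the question above is the size of
the effective keyspace in bits. A minimal sketch, assuming each key
symbol is chosen uniformly and independently, which gives only an
upper bound; real passphrases carry far less entropy than this:

```python
import math

def keyspace_bits(alphabet_size, length):
    # Upper bound on key entropy in bits, assuming each symbol is
    # drawn uniformly and independently from the alphabet.
    return length * math.log2(alphabet_size)

# A full permutation of a 26-letter alphabet carries log2(26!) bits,
# roughly 88 bits.
perm_key_bits = math.log2(math.factorial(26))
```

A key that falls short of such a bound (a dictionary word, say) is
exactly the kind of weak key the tolerance question asks about.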
From: WTShaw on
On Dec 12, 3:25 am, Mok-Kong Shen <mok-kong.s...(a)t-online.de> wrote:
> WTShaw wrote:
> > Mok-Kong Shen wrote:
> >> WTShaw wrote:
> >>> One important fact is that given different ciphers, the amount of
> >>> ciphertext needed to solve a message varies.  If only a few characters
> >>> are used, perhaps nothing conclusive can be solved.  A good question
> >>> regards the level of possible ambiguity, with one threshold for
> >>> getting anything meaningful and another for being conclusive.  This
> >>> refers to systems built on one concise algorithm.
> >>> Since the levels vary with algorithms, there would be a consideration
> >>> in finding multiple algorithms that did not lessen the strength of any
> >>> component level.
> >> If I don't err, a cascade of two encryption systems of different
> >> nature wouldn't weaken any of them, e.g. cascading transposition and
> >> substitution in classical crypto.
>
> > Right, a complementary arrangement of two primitives does exactly
> > that.  This means that finding primitive elements that can add
> > strength to each other is the way to go.
>
> > If strength for an algorithm is ideally to be in the keys alone, then
> > combinations of primitives that utilize stronger keys are a good
> > idea.  That's pretty straightforward.
>
> I suppose you brought up in the 2nd paragraph above a theme that seems
> to be fairly neglected. Let me therefore ask some questions. Doesn't
> the design of any cipher generally presuppose that the user employs
> a strong key? If the answer is no, wouldn't a user then need to know
> how tolerant a given cipher is to weak keys? Further, in order to
> know in that case whether his use of the cipher is safe, how could he
> 'measure' the strength/weakness of his key in quantitative terms in
> practice, to determine whether the requirement is fulfilled?
>
> Thanks,
>
> M. K. Shen

Most ciphers, the so-called historical ones, waste keys that might be
used in a stronger setup. That is, the overall design does not live up
to high expectations. Then the best thing to be done is to fly under
the radar by keeping passages short enough not to be conclusively
solved anyway. So you need frequent key changes, protocols that are
awkward and confusing, and a readiness to drop the whole scheme for
efficiency's sake.

Many classic ciphers already have rough quantitative appraisals from
an experienced point of view...there are tables. The necessity is for
the operator-clerk to have experience and a feeling for what works
efficiently and well. Now, beyond this old established cipher
strategy, there are neoclassical options that can and do work rather
well, and myriads of newer algorithms that can test the mettle of
solvers. But too much difficulty does not promote activity among
solvers who may expect to best any ciphertext, even including a few
who specialize in rather sparse clues...Hi Jim. Since these missing
ciphers are not promoted, for one reason or another, those who
otherwise work in the field see a vacuum at that level and think that
nothing means everything, that what might be there must be
worthless...untrue.

Empirical testing requires effort, even from others than the designer.
I for one would encourage new ciphers and that they be somehow
evaluated. Someone might even learn something. There are useful
examples.

Starting with the OTP and digressing from it is actually a good idea.
There are many ways that it can be changed. The principles of honest
scientific evaluation are not to be dismissed where cipher ideas can
otherwise be rated. New ciphers are to give new dimensions to one or
more ideas, not to solve everything at once. Then you would have
newer primitives to add back to the mix rather than merely rude
catcalls from the gallery. It's a tell that so many cipher ideas are
dismissed because of basic incompetence and unfairness by would-be
evaluators who otherwise want to be classed as learned cryptographers.
Even making critical mistakes is essential if learning is to take
place. Don't bet everything on a just-born horse or say nay to just
any unevaluated neigh.
From: WTShaw on
On Dec 12, 3:25 am, Mok-Kong Shen <mok-kong.s...(a)t-online.de> wrote:
> WTShaw wrote:
> > Mok-Kong Shen wrote:
> >> WTShaw wrote:
> >>> One important fact is that given different ciphers, the amount of
> >>> ciphertext needed to solve a message varies.  If only a few characters
> >>> are used, perhaps nothing conclusive can be solved.  A good question
> >>> regards the level of possible ambiguity, with one threshold for
> >>> getting anything meaningful and another for being conclusive.  This
> >>> refers to systems built on one concise algorithm.
> >>> Since the levels vary with algorithms, there would be a consideration
> >>> in finding multiple algorithms that did not lessen the strength of any
> >>> component level.
> >> If I don't err, a cascade of two encryption systems of different
> >> nature wouldn't weaken any of them, e.g. cascading transposition and
> >> substitution in classical crypto.
>
> > Right, a complementary arrangement of two primitives does exactly
> > that.  This means that finding primitive elements that can add
> > strength to each other is the way to go.
>
> > If strength for an algorithm is ideally to be in the keys alone, then
> > combinations of primitives that utilize stronger keys are a good
> > idea.  That's pretty straightforward.
>
> I suppose you brought up in the 2nd paragraph above a theme that seems
> to be fairly neglected. Let me therefore ask some questions. Doesn't
> the design of any cipher generally presuppose that the user employs
> a strong key? If the answer is no, wouldn't a user then need to know
> how tolerant a given cipher is to weak keys? Further, in order to
> know in that case whether his use of the cipher is safe, how could he
> 'measure' the strength/weakness of his key in quantitative terms in
> practice, to determine whether the requirement is fulfilled?
>
> Thanks,
>
> M. K. Shen

As the OTP represents both a black hole of failed method and a
pinnacle of promise, it is overall insufficient for rational use. An
improvement would be strengthening the key stream. Hashing it can
render it most difficult to recover and provide additional
opportunities for different protocols. One good method is, as I have
used for other key purposes, a counted hash technique.

For discussion, consider a 26-character alphabet used in the key
stream. Convert each 26 characters into a 26-character permutation and
chain the permutations together. A plain-text key stream would be much
easier to solve, with patterns derived from common letters and words
in the data and key string. With a hashed key, the ciphertext would
tend to be quite random. Simple but effective. There are lots of
possible design variations without returning to the failed OTP.
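A minimal sketch of the block-to-permutation step, with one big
assumption spelled out: the exact counted-hash rule is not given
above, so the counting rule here (stable-sorting the alphabet by each
letter's frequency in the key block) is a stand-in, not the author's
procedure. The sample key blocks are likewise made up.

```python
from collections import Counter
import string

def block_to_perm(block):
    # Derive a 26-letter permutation from a key block by stable-sorting
    # the alphabet on each letter's count in the block (descending).
    # NOTE: an assumed counting rule, not the original procedure.
    counts = Counter(block)
    return "".join(sorted(string.ascii_lowercase,
                          key=lambda c: -counts[c]))

def chain(p, q):
    # Compose two permutations: apply p, then q.
    return "".join(q[ord(c) - ord("a")] for c in p)

# Illustrative 26-character key blocks, chained into one permutation.
perms = [block_to_perm("thequickbrownfoxjumpsovers"),
         block_to_perm("azlyricsofthewildblueyonde")]
chained = chain(perms[0], perms[1])
```

Whatever the counting rule, each block yields a full permutation, and
composing permutations yields another permutation, so the chained key
stream never degenerates.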

No, good cipher designs are not that hard, but not knowing what needs
to be done, or how to get there, gets you nowhere. Consider that the
hashing technique referenced above combines the primitives of
substitution and transposition in a sort of parallel method. The
result is really a primitive in itself. But of course, for as long as
it has been in the tool kit, even once classified, few have seen its
real promise in modern designs.
From: WTShaw on
On Dec 11, 2:37 am, WTShaw <lure...(a)gmail.com> wrote:
> On Dec 10, 9:28 am, Mok-Kong Shen <mok-kong.s...(a)t-online.de> wrote:
>
>
>
> > WTShaw wrote:
> > > One important fact is that given different ciphers, the amount of
> > > ciphertext needed to solve a message varies.  If only a few characters
> > > are used, perhaps nothing conclusive can be solved.  A good question
> > > regards the level of possible ambiguity, with one threshold for
> > > getting anything meaningful and another for being conclusive.  This
> > > refers to systems built on one concise algorithm.
>
> > > Since the levels vary with algorithms, there would be a consideration
> > > in finding multiple algorithms that did not lessen the strength of any
> > > component level.
>
> > If I don't err, a cascade of two encryption systems of different
> > nature wouldn't weaken any of them, e.g. cascading transposition and
> > substitution in classical crypto.
>
> > M. K. Shen
>
> Right, a complementary arrangement of two primitives does exactly
> that.  This means that finding primitive elements that can add
> strength to each other is the way to go.
>
> If strength for an algorithm is ideally to be in the keys alone, then
> combinations of primitives that utilize stronger keys are a good
> idea.  That's pretty straightforward.

As a continuance, consider that a hashed key that produces a list of
permutations, and is easily made and remembered in its parts, does
meet historic criteria for algorithms, at least as applied to keys.
How such a key is applied is another question.

While some treat the unknown-cipher-type problem as beneath their
dignity, it remains a valid type of puzzle and has been for decades.
With clues to the method, solving the following hashed lines for their
well-known source might demonstrate the relative strength of one
variation of counted hash use. The exact procedure is not quite up to
par according to my standards, just cruising slightly above the trees.
Don't rely too much on anything you know, as this is a new bird for
many. It's designed not to be a quick solve even if you know how, or
do you? Surprise me and I will explain all and then up the strength
from this:

dgqavbxzmjwfykiolehtpncusr
qcvuhndkposbayjlmgtfwzixre
mbcklfxngjpihtdseozyaqwruv
zovptnjgwhiakcqudmrsbeyflx
sybrncfptaozhekmdxjiglvuqw
klgvdfyntiuoaqprjhmcwszebx
klzypgqhtefadxmvjrnwicubos
ogvtzwdibmplkjefcayrqsunxh
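For anyone poking at these, a quick sanity check confirms that each of
the eight lines is a full permutation of the 26-letter alphabet, no
letter repeated or missing, consistent with the chained-permutation
construction described earlier:

```python
import string

lines = [
    "dgqavbxzmjwfykiolehtpncusr",
    "qcvuhndkposbayjlmgtfwzixre",
    "mbcklfxngjpihtdseozyaqwruv",
    "zovptnjgwhiakcqudmrsbeyflx",
    "sybrncfptaozhekmdxjiglvuqw",
    "klgvdfyntiuoaqprjhmcwszebx",
    "klzypgqhtefadxmvjrnwicubos",
    "ogvtzwdibmplkjefcayrqsunxh",
]

# Each line should use every letter a-z exactly once.
all_perms = all(sorted(line) == list(string.ascii_lowercase)
                for line in lines)
```

That rules out simple single-alphabet substitution of the source text
and is about all the structure one gets for free here.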
