From: MrD on
rossum wrote:
> On Mon, 12 Jul 2010 12:58:05 -0500, gordonb.76k3z(a)burditt.org (Gordon
> Burditt) wrote:
>
>> "The Huge Significance of the Bow and Arrow in Modern Cooking"?
> As Mrs Beeton said, "First catch your rabbit."

Alice B. Toklas said "First stone your dates" (in a rather elaborate
recipe for hash brownies).

--
MrD.
From: adacrypt on
On Jul 14, 11:12 pm, rossum <rossu...(a)coldmail.com> wrote:
> On Wed, 14 Jul 2010 06:47:53 -0700 (PDT), adacrypt
>
> <austin.oby...(a)hotmail.com> wrote:
>
>You're bluffing - Adacrypt
>
> Why would I do that?  You asked for the explanation and I gave it.  Go
> and read the RFC - it is available on the web.
>
> rossum

Hi,
> Why would I do that?

You are well known for moving the goalposts from a game that you
are unable to play to one that you can - like a medic who can't do
brain surgery but instead expounds on broken-leg surgery - you've been
told this several times in the past by other readers when you
evasively change the subject matter from one you don't know to one
that you think you do, and then go on to vindicate an irrelevant
argument as a substitute for the right one under discussion - I
recollect WT Shaw telling you this on at least one occasion, and I
quote, "There you go again ... etc." - it's time for you to get honest
- it wouldn't be so bad if you weren't so dogmatically wrong - In my
case I have delivered the goods - I have provided downloads on my
website so that any reader can go and test my programs - anybody
can do this and check out the claims I am making about running the
message-text file as a Word document - they will then see that all of
the words that are broken by line endings come together perfectly,
and the columns of text are justified in proper alignment to right,
left or centre as the case may be - adacrypt
From: adacrypt on
On Jul 15, 7:01 am, adacrypt <austin.oby...(a)hotmail.com> wrote:
> On Jul 14, 11:12 pm, rossum <rossu...(a)coldmail.com> wrote:
>
> > On Wed, 14 Jul 2010 06:47:53 -0700 (PDT), adacrypt
>
> > <austin.oby...(a)hotmail.com> wrote:
>
> >You're bluffing - Adacrypt
>
> > Why would I do that?  You asked for the explanation and I gave it.  Go
> > and read the RFC - it is available on the web.
>
> > rossum
>
> Hi,
>
> > Why would I do that?
>
> You are well known for moving the goalposts from a game that you
> are unable to play to one that you can - like a medic who can't do
> brain surgery but instead expounds on broken-leg surgery - you've been
> told this several times in the past by other readers when you
> evasively change the subject matter from one you don't know to one
> that you think you do, and then go on to vindicate an irrelevant
> argument as a substitute for the right one under discussion - I
> recollect WT Shaw telling you this on at least one occasion, and I
> quote, "There you go again ... etc." - it's time for you to get honest
> - it wouldn't be so bad if you weren't so dogmatically wrong - In my
> case I have delivered the goods - I have provided downloads on my
> website so that any reader can go and test my programs - anybody
> can do this and check out the claims I am making about running the
> message-text file as a Word document - they will then see that all of
> the words that are broken by line endings come together perfectly,
> and the columns of text are justified in proper alignment to right,
> left or centre as the case may be - adacrypt

Hi again,
>they will then see that all of
>the words that are broken by line endings come together perfectly and
>the columns of text are justified in proper alignment to right or
>left or centre as the case may be - adacrypt

I am doing myself an injustice - I have just realised that my output
of message text from decrypted ciphertext justifies itself perfectly
at run-time in the Ada on-screen editor display - it does this
automatically - I don't need to go into a word processor unless I want
to re-edit it into some other form - great - adacrypt
From: adacrypt on
On Jul 15, 10:36 am, Bruce Stephens <bruce
+use...(a)cenderis.demon.co.uk> wrote:
> adacrypt <austin.oby...(a)hotmail.com> writes:
>
> [...]
>
> > The modus operandi that I use in my programs is to read in from a
> > readymade file of plaintext that is prepared according to ASCII
> > character by character - this is read in, character by character,
> > immediately enciphered  and written to a growing external file of
> > ciphertext between each successive character.
>
> What if it's *not* ASCII?  What if it's ISO-8859-9, UTF-8, MP3, etc.?
>
> Why not at least *try* to understand that the problem just goes away if
> instead of reading the input one character at a time as an ASCII file,
> you read it one byte at a time as a binary file?  You can still read
> ASCII, but then you can automatically read all the other kinds of file.
>
> [...]

Hi,
>Why not at least *try* to understand that the problem just goes away if
>instead of reading the input one character at a time as an ASCII file,
>you read it one byte at a time as a binary file? You can still read
>ASCII, but then you can automatically read all the other kinds of file.

This discussion is pushing against an open door in many ways - when I
read in a character it is immediately evaluated into its binary byte
value internally in the computer, but it is the denary value, as
enumeration-type data, that is displayed on screen - that is, if I use
ASCII as the enumeration-type data (I could use many others - private
customised alphabets) - you're saying that since the character is
inevitably going to be evaluated as a byte in the ensuing mathematics,
why not read in bytes directly in the first place - am I right in that
assumption? - open to correction here?
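
A minimal sketch of the byte-at-a-time reading being suggested here
(Python for illustration, not the Ada programs under discussion; the
function name, file paths and the transform argument are all invented
placeholders, not anybody's actual cipher):

```python
# Any file (ASCII text, ISO-8859-9, UTF-8, MP3, ...) is just a sequence
# of byte values 0..255 when opened in binary mode, so no character-set
# assumption is needed.

def encrypt_file(in_path, out_path, transform):
    """Read one byte at a time, apply a cipher step, write the result."""
    with open(in_path, "rb") as src, open(out_path, "wb") as dst:
        while (chunk := src.read(1)):
            b = chunk[0]                       # integer 0..255
            dst.write(bytes([transform(b)]))
```

The same loop handles an ASCII text file and any other kind of file
identically, which is the point being made above.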

Please note that I never use ASCII directly because that would be too
transparent a step and would be a basic 'syntax' error in my crypto
parlance. I use my digital signature operation to change the byte
value first of all (I multiply it by ten to create an empty units
column in the denary representation of the byte, and into that empty
column goes an integer between 0 and 9 => this provides a digital
signature) - I then use this enhanced value to index a private
(secondary) alphabet - this latter value becomes the effective
numerical representation of the character that was read in
originally. A bit convoluted, but the computer feels no pain, and boy
is it intractable on top of all else that comes later!

Example,

The character read in instantaneously is, say, capital 'P', which has
ASCII value 80. I multiply this by 10 to make it 800. I next insert a
digit from the 100-long array of single digits (0 through to 9), say
7, to make the denary representation of capital P 807. This value
now becomes the index of another array of 3-digit (usually) integers
that is 1300 elements long. The denary representation of my original
character now finally becomes element no. 807 of the latter array,
whatever it may be - this is the value that is operated on in the
ensuing mathematics.
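
The worked example above can be sketched in code (Python for
illustration, not the actual Ada programs; the secondary alphabet and
signature-digit array below are invented stand-ins for the private
tables described, and the decode assumes the alphabet's entries are
distinct):

```python
# Invented stand-ins, NOT adacrypt's actual private tables.
SECONDARY_ALPHABET = list(range(2000, 3300))  # hypothetical 1300-long private alphabet
SIGNATURE_DIGITS = [7] * 100                  # hypothetical 100-long digit array

def encode_char(ch, position):
    ascii_val = ord(ch)                       # e.g. 'P' -> 80
    shifted = ascii_val * 10                  # 80 -> 800 (empty units column)
    digit = SIGNATURE_DIGITS[position % 100]  # signature digit, here always 7
    index = shifted + digit                   # 800 + 7 = 807
    return SECONDARY_ALPHABET[index]          # element no. 807 of the private array

def decode_char(value, position):
    index = SECONDARY_ALPHABET.index(value)   # recover the index, e.g. 807
    return chr(index // 10)                   # drop the signature digit -> 'P'
```

With these stand-in tables, `encode_char('P', 0)` follows exactly the
80 -> 800 -> 807 -> array-element path described above.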

Decryption is asymmetric but uses the same alphabet to decrypt back to
the original character - it works very well, I can assure you.

It might be said that mutual database technology has automatic digital
signature and the foregoing is unnecessary - in that case just call it
belt-and-braces backup - you may decide not to use it at all, on the
other hand - nice to have intrinsic options though.

Working in bytes is a pain in the neck to me - I contend that binary
representation should never have been allowed to escape from the
classroom first day - it is only useful for modelling machine code to
students, it is a grievous red herring outside of that.

Are you saying that the external file of plaintext for encryption is
first prepared as a file of bytes (a lot of work - 8 times as long as
character files?) or are you considering keying in plaintext at the
keyboard (only) as your modus operandi - if so, that latter cramps
your style hugely in a busy infrastructure.

My entire modus operandi eschews byte representation like the plague -
I detest it - Uncle Sam sent everybody the wrong way with it back in
1963 and it's going to take years to undo the damage - adacrypt
From: adacrypt on
On Jul 15, 4:27 pm, adacrypt <austin.oby...(a)hotmail.com> wrote:
> On Jul 15, 10:36 am, Bruce Stephens <bruce
>
> +use...(a)cenderis.demon.co.uk> wrote:
> > adacrypt <austin.oby...(a)hotmail.com> writes:
>
> > [...]
>
> > > The modus operandi that I use in my programs is to read in from a
> > > readymade file of plaintext that is prepared according to ASCII
> > > character by character - this is read in, character by character,
> > > immediately enciphered  and written to a growing external file of
> > > ciphertext between each successive character.
>
> > What if it's *not* ASCII?  What if it's ISO-8859-9, UTF-8, MP3, etc.?
>
> > Why not at least *try* to understand that the problem just goes away if
> > instead of reading the input one character at a time as an ASCII file,
> > you read it one byte at a time as a binary file?  You can still read
> > ASCII, but then you can automatically read all the other kinds of file.
>
> > [...]
>
> Hi,
>
> >Why not at least *try* to understand that the problem just goes away if
> >instead of reading the input one character at a time as an ASCII file,
> >you read it one byte at a time as a binary file?  You can still read
> >ASCII, but then you can automatically read all the other kinds of file.
>
> This discussion is pushing against an open door in many ways - when I
> read in a character it is immediately evaluated into its binary byte
> value internally in the computer, but it is the denary value, as
> enumeration-type data, that is displayed on screen - that is, if I use
> ASCII as the enumeration-type data (I could use many others - private
> customised alphabets) - you're saying that since the character is
> inevitably going to be evaluated as a byte in the ensuing mathematics,
> why not read in bytes directly in the first place - am I right in that
> assumption? - open to correction here?
>
> Please note that I never use ASCII directly because that would be too
> transparent a step and would be a basic 'syntax' error in my crypto
> parlance.  I use my digital signature operation to change the byte
> value first of all (I multiply it by ten to create an empty units
> column in the denary representation of the byte, and into that empty
> column goes an integer between 0 and 9 => this provides a digital
> signature) - I then use this enhanced value to index a private
> (secondary) alphabet - this latter value becomes the effective
> numerical representation of the character that was read in
> originally.  A bit convoluted, but the computer feels no pain, and boy
> is it intractable on top of all else that comes later!
>
> Example,
>
> The character read in instantaneously is, say, capital 'P', which has
> ASCII value 80.  I multiply this by 10 to make it 800.  I next insert a
> digit from the 100-long array of single digits (0 through to 9), say
> 7, to make the denary representation of capital P 807.  This value
> now becomes the index of another array of 3-digit (usually) integers
> that is 1300 elements long.  The denary representation of my original
> character now finally becomes element no. 807 of the latter array,
> whatever it may be - this is the value that is operated on in the
> ensuing mathematics.
>
> Decryption is asymmetric but uses the same alphabet to decrypt back to
> the original character - it works very well, I can assure you.
>
> It might be said that mutual database technology has automatic digital
> signature and the foregoing is unnecessary - in that case just call it
> belt-and-braces backup - you may decide not to use it at all, on the
> other hand - nice to have intrinsic options though.
>
> Working in bytes is a pain in the neck to me - I contend that binary
> representation should never have been allowed to escape from the
> classroom first day - it is only useful for modelling machine code to
> students, it is a grievous red herring outside of that.
>
> Are you saying that the external file of plaintext for encryption is
> first prepared as a file of bytes (a lot of work - 8 times as long as
> character files?) or are you considering keying in plaintext at the
> keyboard (only) as your modus operandi - if so, that latter cramps
> your style hugely in a busy infrastructure.
>
> My entire modus operandi eschews byte representation like the plague -
> I detest it - Uncle Sam sent everybody the wrong way with it back in
> 1963 and it's going to take years to undo the damage - adacrypt

Hi again,

Regarding Unicode, I read in the sets of hexadecimal characters
between the delimiting # characters and evaluate each hex set firstly
as a denary integer number - this is the operand of the ensuing maths
algorithm.

Decryption returns the denary output of the computation as hex - it
requires an intermediary language-proficient person to a) convert the
plaintext into Unicode hexadecimal for encryption and b) turn the
hexadecimal ciphertext back into the language of the country in
question.
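
A minimal sketch of the hex-codepoint reading just described (Python
for illustration, not the Ada programs; the function name and the
'#0050#00DF#' sample string are invented):

```python
# Evaluate '#'-delimited hexadecimal codepoint groups as denary
# (decimal) integers, which then become the operands of the maths.

def hex_codepoints_to_denary(text):
    """Split e.g. '#0050#00DF#' into hex groups and return decimal values."""
    groups = [g for g in text.split('#') if g]
    return [int(g, 16) for g in groups]
```

For example, the group `0050` evaluates to denary 80 (capital 'P').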

Note: the string of hexadecimal codepoints may also be treated as a
string of ASCII characters over an alphabet of the sixteen hex digits
(values 0 to 15) - it is then encrypted by any of the current ciphers
already well known to users - this is a very efficient ploy - see
ASCII_Pad on http://www.adacrypt.com

It is ditto or similar for all other standards - adacrypt