From: Mark Murray on
On 07/14/10 10:49, adacrypt wrote:
> I am open to correction here if I am wrong but please explain - cheers
> - adacrypt

You ARE wrong. The reasons why have been explained to you /ad nauseam/.
What reason is there to think that explaining it yet again will make
one iota of difference?

M
--

From: rossum on
On Wed, 14 Jul 2010 02:49:09 -0700 (PDT), adacrypt
<austin.obyrne(a)hotmail.com> wrote:

>I am open to correction here if I am wrong but please explain
You are trying to do two separate things in one process:
encryption/decryption and armouring.

Encryption/decryption takes some bytes and transforms them into some
different bytes. This process deals exclusively with bytes. It knows
nothing about ASCII, EBCDIC, Unicode or whatever. It has no concept
of "text", "line", "file" or whatever. The process takes an array (or
possibly a stream) of bytes and outputs an array/stream of bytes.

byte[] cyphertext = encrypt(plaintext);

Decrypting is the same:

byte[] plaintext = decrypt(cyphertext);
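
To make that concrete, here is a minimal Java sketch (Java guessed from
the byte[] notation; AES via javax.crypto is only a stand-in for
whatever cipher you actually use):

// Sketch only: AES stands in for any cipher whose interface is
// bytes in, bytes out.  ECB mode is used purely to keep the example
// short; it is not a recommendation.
import java.nio.charset.StandardCharsets;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class ByteCipherSketch {
    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();

        Cipher enc = Cipher.getInstance("AES/ECB/PKCS5Padding");
        enc.init(Cipher.ENCRYPT_MODE, key);
        byte[] plaintext = "any bytes at all".getBytes(StandardCharsets.UTF_8);
        byte[] cyphertext = enc.doFinal(plaintext);   // bytes in, bytes out

        Cipher dec = Cipher.getInstance("AES/ECB/PKCS5Padding");
        dec.init(Cipher.DECRYPT_MODE, key);
        byte[] recovered = dec.doFinal(cyphertext);   // bytes in, bytes out
        System.out.println(new String(recovered, StandardCharsets.UTF_8));
    }
}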

The second thing you are trying to do is "armouring", which is a
process that allows raw bytes to be transmitted through a text-based
system. This is where your point about line feeds and EOF markers
appearing in the cyphertext is valid. The standard solution is to
transform the raw bytes into a small subset of the printable ASCII
characters. The usual method is Base64, though Base32 and Base16
(aka Hex) are sometimes used. RFC 4648 (which obsoletes RFC 3548)
covers all three in great detail.
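
For example, in Java the armouring is a single call each way with
java.util.Base64 (Java 8 and later; older platforms have equivalent
libraries). This is only a sketch of the armouring step, nothing more:

// Sketch only: standard Base64 armouring.  The encoded form uses only
// printable ASCII characters, so it survives any text-based channel.
import java.util.Base64;

public class ArmourSketch {
    public static void main(String[] args) {
        // raw bytes, including a line feed (0x0A) and an old EOF marker (0x1A)
        byte[] cyphertext = {0x00, 0x0A, 0x1A, (byte) 0xFF};

        String armoured = Base64.getEncoder().encodeToString(cyphertext);
        System.out.println(armoured);                 // prints AAoa/w==

        byte[] recovered = Base64.getDecoder().decode(armoured);
    }
}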

I very strongly suggest that you do not try to mix these two separate
processes. The armouring is not needed if you are not outputting to a
text file or to a text stream. A binary file, or stream, does not
need armouring. Only apply the armouring where it is needed. Your
current proposal enforces a non-standard form of armouring on all
users, whether they need it or not. Use a standard armouring,
preferably Base64, and make it optional.
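
Concretely, the choice belongs with the caller, not with the cipher.
A sketch (the file names are invented for the example):

// Sketch only: armouring is a separate, optional step applied after
// encryption, and only when the output must pass through a text channel.
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class OptionalArmourSketch {
    static void store(byte[] cyphertext, boolean textChannel) throws Exception {
        if (textChannel) {
            String armoured = Base64.getEncoder().encodeToString(cyphertext);
            Files.write(Paths.get("cyphertext.txt"),
                        armoured.getBytes(StandardCharsets.US_ASCII));
        } else {
            Files.write(Paths.get("cyphertext.bin"), cyphertext);  // raw bytes, no armour
        }
    }
}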

Here is an ASCII diagram:

original-data
      |
      V
    bytes
      |
      V
   encrypt
      |
      V
encrypted-bytes
    |           |
    |           V
    |       armouring
    |           |
    V           V
binary-file text-file


rossum

From: rossum on
On Wed, 14 Jul 2010 06:47:53 -0700 (PDT), adacrypt
<austin.obyrne(a)hotmail.com> wrote:

>
>You're bluffing - Adacrypt
Why would I do that? You asked for the explanation and I gave it. Go
and read the RFC - it is available on the web.

rossum

From: Bruce Stephens on
adacrypt <austin.obyrne(a)hotmail.com> writes:

[...]

> The modus operandi that I use in my programs is to read in from a
> readymade file of plaintext that is prepared according to ASCII
> character by character - this is read in, character by character,
> immediately enciphered and written to a growing external file of
> ciphertext between each successive character.

What if it's *not* ASCII? What if it's ISO-8859-9, UTF-8, MP3, etc.?

Why not at least *try* to understand that the problem just goes away if
instead of reading the input one character at a time as an ASCII file,
you read it one byte at a time as a binary file? You can still read
ASCII, but then you can automatically read all the other kinds of file.
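
In Java, say (the file name is made up for the example), the binary
version is one line and neither knows nor cares what the bytes
represent:

// Sketch only: read any file - ASCII text, UTF-8, an MP3, anything -
// as plain bytes.  The cipher never needs to know which it was.
import java.nio.file.Files;
import java.nio.file.Paths;

public class ReadBytesSketch {
    public static void main(String[] args) throws Exception {
        byte[] input = Files.readAllBytes(Paths.get("message.anything"));
        System.out.println(input.length + " bytes read");
        // hand 'input' straight to the cipher; no character decoding anywhere
    }
}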

[...]

From: Bruce Stephens on
adacrypt <austin.obyrne(a)hotmail.com> writes:

[...]

> Please note that I never use ASCII directly because that would be too
> transparent a step and would be a basic 'syntax' error in my crypto
> parlance. I use my digital signature operation to change the byte
> value first of all (I multiply it by ten to create an empty units
> column in the denary representation of the byte and into that empty
> column goes an integer between 0 and 9 => this provides a digital
> signature) - I then use this enhanced value to index a private
> (secondary) alphabet - this latter value becomes the effective
> numerical representation of the character that was read in originally.
> A bit convoluted but the computer feels no pain and boy is it
> intractable on top of all else that comes later!
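
As far as I can tell, the quoted scheme boils down to something like
this (the names and the contents of the table are my guesses; only
your program knows the real ones):

// My reading of the quoted scheme; signatureDigit and secondaryAlphabet
// are placeholders for whatever the real program uses.
static int effectiveValue(char plaintextChar, int signatureDigit,
                          int[] secondaryAlphabet) {
    int shifted = plaintextChar * 10 + signatureDigit;  // digit in 0..9
    return secondaryAlphabet[shifted];                  // private lookup table
}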

I knew I shouldn't have said anything. <http://xkcd.net/763/>

[...]