From: adacrypt on
On Jun 15, 8:53 am, Bruce Stephens <bruce+use...(a)cenderis.demon.co.uk>
wrote:
> adacrypt <austin.oby...(a)hotmail.com> writes:
>
> [...]
>
> > After a moment's thought I realise there is no problem with images or
> > indeed entire websites - these have been created within the bounds of
> > writable ASCII - they are therefore eminently encryptable by my
> > vector cryptography - I remember making a note of these before during
> > a web design session but forgot about them since then - even groups of
> > individual pixels (hypothetical extreme - let the boo boys note) may
> > also be enciphered by encrypting their colour codes. - adacrypt
>
> I suggest you take a few more moments to think.  It's possible to
> convert binary to ASCII.  That's done routinely (common examples are
> uuencode, base64, yenc).  There's a cost to that, however.  And in this
> case there's no point: just handle binary.

I just don't think in binary at the interface level any more - the Ada
language converts ASCII into denary for my programs and vice versa -
any use of binary is always internal, as machine code out of sight
within the computer - I contend that we can all do without binary
outside of the classroom and it was a mistake from the first day to use
bytes to represent ASCII - it sent everybody the wrong way and it may
take years for it to fade out now. - adacrypt
From: Mr. B on
adacrypt wrote:
> I just don't think in binary at the interface level any more - the Ada
> language converts ASCII into denary for my programs and vice versa -
> any use of binary is always internal, as machine code out of sight
> within the computer

Except for things like compression. My operating system, for example,
routinely compresses log files. It is also common for web pages to be
compressed before being sent from the server to the client. Schemes for
encoding binary data with ASCII characters wind up increasing the size of
the data -- which is suboptimal when the goal is compression. You cannot
just ignore the reality that raw binary data is widely used and that it has
to be encrypted sometimes.
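
To put a number on that overhead, here is a minimal sketch (Python rather
than Ada, and the gzip-compressed sample is just an illustration): base64
turns every 3 bytes of binary into 4 bytes of ASCII, so the encoded form is
roughly a third larger than the binary it wraps.

    import base64
    import gzip

    # Stand-in for a compressed log file or web page body: pure binary data.
    raw = gzip.compress(b"GET /index.html HTTP/1.1\n" * 1000)

    # Re-encoding that binary as ASCII so an ASCII-only cipher can touch it.
    encoded = base64.b64encode(raw)

    print(len(raw))      # size of the binary data
    print(len(encoded))  # about 4/3 of that -- the cost of staying in ASCII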

> - I contend that we can all do without binary
> outside of the classroom and it was a mistake from the first day to use
> bytes to represent ASCII - it sent everybody the wrong way and it may
> take years for it to fade out now. - adacrypt

I am not even going to try to decipher that -- if you have a problem with
the use of binary numbers, then you might as well stop bothering with
computers.

-- B
From: Bruce Stephens on
adacrypt <austin.obyrne(a)hotmail.com> writes:

[...]

> I just don't think in binary at the interface level any more - the Ada
> language converts ASCII into denary for my programs and vice versa -
> any use of binary is always internal, as machine code out of sight
> within the computer - I contend that we can all do without binary
> outside of the classroom

You still fail to get it.

Almost all of the information that people want to encrypt or decrypt is
already on their computer in files. It's in spreadsheets,
presentations, word processing documents, images, video clips, etc.
Almost all of it is in non-ASCII form. Even for the unusual case of plain
text files, large parts of the world don't use ASCII. (You (or the system you
used) sent the message I'm replying to in ISO-8859-1, for example, which
extends ASCII.)
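
(A short sketch of that in Python; the word "café" is just my sample. In
ISO-8859-1 the final letter is the byte 0xE9, which lies outside the 0-127
range that ASCII defines, so an ASCII-only scheme never even sees it.)

    text = "café"
    data = text.encode("iso-8859-1")   # b'caf\xe9' -- 0xE9 is not an ASCII byte
    print(data)
    try:
        data.decode("ascii")
    except UnicodeDecodeError:
        print("not representable as ASCII")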

> and it was a mistake from the first day to use bytes to represent ASCII

Why pick on ASCII? *Everything* in your computer is represented as
bytes. Is that a mistake?

What's your alternative? If you say it ought to be denary (almost
everybody calls this decimal) then you should know that integers are
(almost certainly) stored and processed as 4-byte two's-complement binary;
the decimal conversions are purely a convenience for humans (as I
explained before). (For the most part; almost all languages can process
integers larger and smaller than 4 bytes, and most can handle unsigned
integers as well as signed ones. 4 bytes is a common natural size for
modern computers.)
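
As a concrete illustration (Python here, and the value 1000 is arbitrary):
the machine-level object is the 4-byte two's-complement pattern; "denary"
only appears once the value is formatted for a human.

    n = 1000

    # The 4-byte two's-complement form the hardware actually stores and adds.
    print(n.to_bytes(4, "little", signed=True))   # b'\xe8\x03\x00\x00'

    # Decimal and binary are just display formats for the same value.
    print(str(n))          # '1000'
    print(format(n, "b"))  # '1111101000'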
From: WTShaw on
On Jun 14, 1:00 pm, Bruce Stephens <bruce+use...(a)cenderis.demon.co.uk>
wrote:
> gordonb.zx...(a)burditt.org (Gordon Burditt) writes:
>
> [...]
>
> > (a) the newline silently disappears in the encryption/decryption process.
> > (b) the newline disappears in the encryption/decryption process, but a
> >     warning is given.
> > (c) the newline is silently changed to a space.
> > (d) the newline is changed to a space, but a warning is given.
> > (e) the program aborts without encrypting or decrypting.
> > (f) something else (what?)
>
> Most likely his program reads and encrypts/decrypts one line at a time,
> so newlines are preserved in the ciphertext.
>
> (Probably part of the reason he doesn't just handle binary files is that
> he doesn't know how, though it's also quite possible that he doesn't
> understand why we suggest it would be better.)
>
> [...]

What base you use is your arbitrary choice, considering that each has
its own merits. Getting beyond the linefeed problem that PCs have is
necessary, and the current program at
http://groups.google.com/group/cryptojavascript/web/06162010rosebud-pome-via-b-64
does that. Remember that JavaScript is based on digits by preference
of its developers.

In short, in this semiautomatic program single line feeds are ignored
while double line feeds and more are reduced to two line feeds. In the
results field, the product is first seen as HTML text should be. If
needed, cut and paste it into a text application like Notepad or WordPad,
already adjusted to the desired maximum width.
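
That normalization is easy to sketch. The following Python is not WTShaw's
JavaScript, and whether his program drops single line feeds or joins them
with a space is my guess, but it gives the flavour:

    import re

    def normalize_linefeeds(text):
        # Runs of three or more line feeds collapse to exactly two...
        text = re.sub(r"\n{3,}", "\n\n", text)
        # ...and a lone line feed (a soft wrap) is joined into the paragraph.
        text = re.sub(r"(?<!\n)\n(?!\n)", " ", text)
        return text

    print(normalize_linefeeds("roses are red\nviolets are blue\n\n\n\nnext stanza"))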

Another version is simple, with a minimal change to allow ragged
product. I admit that this would be better for fixed-line prose if
you are heavily into that sort of thing. Offhand, I don't remember
whether I did it, but tab would be best converted into a space since tab
is far from universally defined as a format character, in width or use.

Consider the source and see how this relates to any problems which you
have. The same code runs on PC or Mac OS X and later, and messages
have been successfully tested between those platforms.

From: Gordon Burditt on
>> I suggest you take a few more moments to think.  It's possible to
>> convert binary to ASCII.  That's done routinely (common examples are
>> uuencode, base64, yenc).  There's a cost to that, however.  And in this
>> case there's no point: just handle binary.
>
>I just don't think in binary at the interface level any more - the Ada
>language converts ASCII into denary for my programs and vice versa -

I challenge you to prove that a conversion from the binary ASCII
value to a decimal representation actually takes place at a hardware
level. A reasonable compiler would optimize this conversion out
if all you do is do some math on the character values and output
them. What base it's represented in is usually unimportant as long
as the representation can hold the full range of values required.
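
A sketch of the point (Python, with a toy Caesar-style shift as the example,
not adacrypt's cipher): the arithmetic is done on the character's value, and
no base conversion of any kind happens unless someone asks to print it.

    c = "A"
    value = ord(c)                 # 65 -- a value, not a string of digits

    shifted = (value + 3) % 256    # math on the value itself
    print(chr(shifted))            # 'D'

    # Decimal, binary and hex are only produced on request, for display.
    print(value, bin(value), hex(value))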

The Ada interface does let you know when you reach the end of a
line, and how to end a line. You just need to learn to use it to
detect and output newlines properly.

>any use of binary is always internal, as machine code out of sight
>within the computer - I contend that we can all do without binary
>outside of the classroom and it was a mistake from the first day to use
>bytes to represent ASCII - it sent everybody the wrong way and it may
>take years for it to fade out now. - adacrypt

I'm assuming you DON'T mean that we should replace ASCII with something
more complicated, like UTF-8.

What would you use to represent characters? Dog turds? How would you
use a computer program written in Ada to encrypt those?

If you can get a version of Ada that doesn't run on a computer, you
might have a point. You can get the denary representation of, er,
what exactly? A handwritten character? An Egyptian hieroglyph?