From: Wojtek on
Lew wrote :
> The point of my example wasn't that Y2K should have been handled earlier, but
> that the presence of the bug was not due to developer fault but management
> decision, a point you ignored.

At the time (70's etc) hard drive space was VERY expensive. All sorts
of tricks were being used to save that one bit of storage. Remember
COBOL's packed decimal?

So the decision to drop the century from the date was not only based on
management but on hard economics.

Which, I will grant, is not a technical decision, though the solution
was...

And at the time Y2K was created it was not a bug. It was a money saving
feature. Probably worth many millions.

--
Wojtek :-)


From: Lew Pitcher on
On February 11, 2010 19:15, in comp.lang.c, nowhere(a)a.com wrote:

> Lew wrote :
>> The point of my example wasn't that Y2K should have been handled earlier,
>> but that the presence of the bug was not due to developer fault but
>> management decision, a point you ignored.
>
> At the time (70's etc) hard drive space was VERY expensive. All sorts
> of tricks were being used to save that one bit of storage. Remember
> COBOL's packed decimal?

Packed decimal (the COBOL COMP-3 datatype) wasn't a "COBOL" thing; it was an
IBM S370 "mainframe" thing. IBM's 370 instruction set included a large
number of operations on "packed decimal" values, including data conversions
to and from fixed-point binary, and math operations. IBM's COBOL took
advantage of these facilities with the (non-ANSI) COMP-3 datatype.
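
For anyone who hasn't bumped into COMP-3, here's a rough C sketch of the
byte layout: two decimal digits per byte, with the sign in the low nibble of
the last byte (0xC for positive, 0xD for negative). Purely illustrative --
a real S/370 program would lean on the hardware's PACK and decimal-arithmetic
instructions rather than doing this by hand:

#include <stdio.h>

/* Pack 'value' into 'buf' (buflen bytes), right-justified, zero-filled,
 * with the sign nibble (0xC = positive) in the low nibble of the last byte. */
static void pack_decimal(unsigned long value, unsigned char *buf, size_t buflen)
{
    size_t i;

    buf[buflen - 1] = (unsigned char)(((value % 10) << 4) | 0x0C);
    value /= 10;

    for (i = buflen - 1; i-- > 0; ) {      /* two digits per remaining byte */
        unsigned char lo, hi;
        lo = (unsigned char)(value % 10);  value /= 10;
        hi = (unsigned char)(value % 10);  value /= 10;
        buf[i] = (unsigned char)((hi << 4) | lo);
    }
}

int main(void)
{
    unsigned char field[3];    /* PIC S9(5) COMP-3: 5 digits + sign */
    size_t i;

    pack_decimal(1999UL, field, sizeof field);
    for (i = 0; i < sizeof field; i++)
        printf("%02X ", field[i]);         /* prints: 01 99 9C */
    putchar('\n');
    return 0;
}

So +1999 in a PIC S9(5) COMP-3 field occupies three bytes, X'01999C'.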

As for Y2K, there was no "space advantage" in using COMP-3, nor was there an
overriding datatype reason to store dates in COMP-3. While "space
requirements" are often given as the reason for the Y2K truncated dates,
the truncation usually boiled down to three different reasons:
1) "That's what the last guy did" (maintaining existing code and design
patterns),
2) "We'll stop using this before it becomes an issue" (code longevity), and
3) "We will probably rewrite this before it becomes an issue"
(designer/programmer "laziness").

Space requirements /may/ have been the initial motivation for truncated
dates, but that motivation ceased being an issue in the 1970's, with
cheap(er) high(er) density data storage.

FWIW: I spent 30+ years designing, writing, and maintaining S370 Assembler
and COBOL programs for a financial institution. I have some experience in
both causing and fixing the "Y2K bug".

> So the decision to drop the century from the date was not only based on
> management but on hard economics.
>
> Which, I will grant, is not a technical decision, though the solution
> was...
>
> And at the time Y2K was created it was not a bug.

I agree.

> It was a money saving feature. Probably worth many millions.

I disagree. It was a money-neutral feature (as far as it was a feature) that
would have (and ultimately did) cost millions to change.

Alone, it didn't save much: there's enough wasted space at the end of each
of those billions of mainframe records (alignment issues, don't you know)
to easily have accommodated two more digits (one 8-bit byte) in each
critical date recorded.
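
To put the alignment point in (very loose) C terms -- real records were
COBOL or Assembler layouts, not C structs, and the field names here are
made up -- the trailing padding a compiler adds is exactly the kind of
slack I mean:

#include <stdio.h>

struct account_record {            /* hypothetical record layout */
    long long acct_number;         /* 8-byte alignment on most modern ABIs */
    unsigned char open_date[3];    /* YYMMDD in packed decimal: 6 digits   */
    /* The compiler pads the struct out to the alignment of acct_number,
     * so a fourth date byte -- room for the century -- would not have
     * made the record any bigger. */
};

int main(void)
{
    printf("record size: %lu bytes\n",
           (unsigned long)sizeof(struct account_record));
    /* typically prints 16, not 11: five bytes of trailing padding */
    return 0;
}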

The cost would have been in time and manpower (identifying, coding, testing,
& conversion) to expand those date fields after the fact. And, that's
exactly where the Y2K costs wound up. /That's/ the expense that management
didn't want in the 70's and 80's (and got with interest in the 90's).

--
Lew Pitcher
Master Codewright & JOAT-in-training | Registered Linux User #112576
Me: http://pitcher.digitalfreehold.ca/ | Just Linux: http://justlinux.ca/
---------- Slackware - Because I know what I'm doing. ------


From: Seebs on
On 2010-02-12, Lew Pitcher <lpitcher(a)teksavvy.com> wrote:
> Space requirements /may/ have been the initial motivation for truncated
> dates, but that motivation ceased being an issue in the 1970's, with
> cheap(er) high(er) density data storage.

Furthermore, what with the popularity of 30-year mortgages, people were
dealing with Y2K in or before 1970...

-s
--
Copyright 2010, all wrongs reversed. Peter Seebach / usenet-nospam(a)seebs.net
http://www.seebs.net/log/ <-- lawsuits, religion, and funny pictures
http://en.wikipedia.org/wiki/Fair_Game_(Scientology) <-- get educated!


From: Leif Roar Moldskred on
In comp.lang.java.programmer Wojtek <nowhere(a)a.com> wrote:
>
> And at the time Y2K was created it was not a bug. It was a money saving
> feature. Probably worth many millions.

Not really. Remember, you can pack 256 years into a single 8-bit byte if
you want to, but in most cases of the Y2K problem people had stored a
resolution of only 100 years in two bytes -- quite wasteful of space.
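
A quick sketch of that comparison, in C for concreteness (the 1900 base
year is an arbitrary choice for the example, not something the original
systems actually used):

#include <stdio.h>

/* Two-digit character year: 2 bytes, only 100 distinct years. */
static void year_as_chars(int year, char out[2])
{
    out[0] = (char)('0' + (year / 10) % 10);
    out[1] = (char)('0' + year % 10);
}

/* Offset-from-base year: 1 byte, 256 distinct years (1900..2155 here). */
static unsigned char year_as_offset(int year)
{
    return (unsigned char)(year - 1900);
}

int main(void)
{
    char cc[2];

    year_as_chars(99, cc);                 /* "99" -- which century?  */
    printf("chars:  %c%c  (2 bytes)\n", cc[0], cc[1]);

    printf("offset: %u   (1 byte, i.e. year %d)\n",
           (unsigned)year_as_offset(1999),
           1900 + (int)year_as_offset(1999));
    return 0;
}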

In some cases it came from too tight an adherence to the manual business
process that was modeled -- remember the paper forms with "19" pre-printed
and then two digits' worth of space to fill out? Those got computerised
and the two-digit year tagged along.

In other cases it boiled down to "this is how we've always done it."

--
Leif Roar Moldskred


From: Leif Roar Moldskred on
In comp.lang.java.programmer Brian <coal(a)mailvault.com> wrote:
>
> Imagine driving by a house and seeing a car in front with
> this sign -- "Free car."

Imagine the cook at a soup kitchen storing raw and fried
chicken in the same container.

Or imagine a company giving away a free game as a marketing
stunt, and the game turns out to have been infected with a virus
that formats the users' hard drives.

Or imagine the author of an open-source product not paying
sufficient attention and accepting a patch from a third party
which turns out to have included a backdoor, providing full
access to any system where the program is running.

--
Leif Roar Moldskred