From: Andy Champ on
Oh boy, that reply of mine stirred things up a bit. Nevertheless I'm
going to stick my head over the parapet again! Excuse me for not
replying to anyone in particular.

The context was my suggestion that the guys who wrote code with the
Y2K bug may have been justified.

I've been surprised by how many of you seem to be involved in ancient
software (banks; NeXTSTEP, a.k.a. OS X; even the steam locos on the
Darjeeling Railway, though I'm not convinced they _have_ much
software). And come to think of it, some of the avionics stuff I've
brushed past seems to have quite a long lifetime.

The point I wished to make is that software does not have to be perfect.
It has to be fit for purpose, and one of the ways in which it has to
be fit is to be in a state where it can be shipped. Go and read about
the "Osborne effect"!

If you were writing an application for stock control of lettuces in
1970, it was probably a pretty good bet that the system wasn't going to
need to worry about the Y2K problem. So the standard COBOL date, with
its two-digit year, was fine.
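
To make that concrete, here's a sketch of the sort of thing I mean -
hypothetical field layout, and in Python rather than anything we'd have
had in 1970; the two-digit year is the point, not the language:

    # The classic YYMMDD date, hard-wired to the 20th century.
    from datetime import date

    def to_date(yymmdd):
        yy, mm, dd = int(yymmdd[:2]), int(yymmdd[2:4]), int(yymmdd[4:6])
        return date(1900 + yy, mm, dd)   # the assumption: 19xx forever

    def days_until_expiry(today, expiry):
        return (to_date(expiry) - to_date(today)).days

    # Perfectly adequate for lettuces in 1970:
    print(days_until_expiry("700301", "700308"))   # 7 days of shelf life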

On the other hand, if it was a pensions application, it most certainly
would - many current pensioners would have been born in the 19th
century, and many current contributors will not be taking their pensions
until the 21st. So for pensions, the standard date wasn't good enough.
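
The same representation falls over the moment the dates straddle a
century boundary. Again hypothetical, but the arithmetic speaks for
itself:

    # Two-digit year arithmetic goes silently wrong across 1900/2000.
    def age_at(birth_yy, current_yy):
        return current_yy - birth_yy

    print(age_at(99, 70))   # pensioner born 1899, in 1970: -29, not 71
    print(age_at(45, 21))   # contributor retiring in 2021: -24, not 76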

When I knock up an application to read the log files from my overnight
tests, it doesn't have to handle all possible errors. It doesn't have
to worry about getting the wrong data format. I _know_ what the data
format will be; being paranoid, I'll check it quickly, but I won't
attempt to handle every possible format, nor to recover from a bad one.
That app isn't nearly as "good" as the stuff I ship - and it doesn't
have to be.
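
For what it's worth, the quick paranoid check I mean is no more than
something like this - made-up log format, and Python standing in for
whatever I'd actually use:

    # Throwaway log reader: one quick format check, then trust the data.
    # Assumed line format: "HH:MM:SS PASS|FAIL message".
    import re
    import sys

    LINE_RE = re.compile(r"^\d\d:\d\d:\d\d (PASS|FAIL) ")

    def failures(path):
        out = []
        with open(path) as f:
            for n, line in enumerate(f, 1):
                if not LINE_RE.match(line):
                    # No recovery, no alternate formats: give up loudly.
                    sys.exit(f"{path}:{n}: not the format I expected")
                if " FAIL " in line:
                    out.append(line.rstrip())
        return out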

When Y2K came along there were almost no incidents. The problem was
understood; it was fixed. The software was (at least by then) "good
enough".

Andy