From: Jerry Coffin on
In article <hku5go$af0$1(a)news.eternal-september.org>,
John.koy(a)example.com says...

[ ... ]

> Exactly. Engineering is about measurable outcomes, quantification.
> What's the equivalent of "this building can withstand a quake of
> magnitude 7.5 for 30 seconds" in software? Can any of us state "this
> software will stand all virus attacks for 12 months" or "this software
> will not crash for 2 years, and if it does your loss won't exceed 20% of
> all digital assets managed by it" ?

Your analogy is fatally flawed, in quite a number of ways.

First of all, a particular piece of software is only one component in
a much larger system of both hardware and software -- where the final
system is generally designed and assembled by somebody who's not an
engineer at all. What you're asking for isn't like a warranty on a
building. It's more like asking a vendor of steel beams to warrant
that any possible building of any design will withstand earthquake X
as long as it includes this particular component.

Second, an earthquake of magnitude X is a known and measurable
quantity. "all virus attacks for 12 months" is a completely unknown
and unmeasurable quantity. Worse, it's an attack with malice
aforethought -- so in terms of buildings, what you're asking for is
more like a bunker guaranteed to withstand any weapon with which
anybody might attack it. Just for one example, that was the intent of
NORAD headquarters in Cheyenne Mountain -- but by the time it was
finished, there were already weapons capable of destroying it.

I can warrant software to withstand every currently known threat.
Physical buildings can't even withstand threats from decades ago --
and hardening them against even century-old threats is generally
cost-prohibitive. Software is well ahead of buildings in this respect.

As to not crashing for 2 years and/or limiting losses, it's a bit
like asking an auto manufacturer to warrant against a crash for 2
years, and, if there is one, that nobody will be more than mildly
injured.

--
Later,
Jerry.
From: Michael Foukarakis on
On Feb 12, 4:02 pm, Nick Keighley <nick_keighley_nos...(a)hotmail.com>
wrote:
> On 12 Feb, 11:30, Michael Foukarakis <electricde...(a)gmail.com> wrote:
>
>
>
> > On Feb 12, 10:08 am, Nick Keighley <nick_keighley_nos...(a)hotmail.com>
> > > On 11 Feb, 09:58, Michael Foukarakis <electricde...(a)gmail.com> wrote:
> > > > I am not an expert at law, so I cannot reason about justification or
> > > > necessity. However, I do recall quite a few "mishaps" and software
> > > > bugs that cost both money and lives.
> > > > Let's see: a) Mariner I, b) 1982, an F-117 crashed, can't recall if
> > > > the pilot made it, c) the NIST has estimated that software bugs cost
> > > > the US economy $59 billion annually, d) 1997, radar software
> > > > malfunction led to a Korean jet crash and 225 deaths, e) 1995, a
> > > > flight-management system presents conflicting information to the
> > > > pilots of an American Airlines jet, who got lost, crashed into a
> > > > mountain, leading to the deaths of 159 people, f) the crash of Mars
> > > > Polar Lander, etc. Common sense tells me that certain people bear
> > > > responsibility over those accidents.
>
> > >http://catless.ncl.ac.uk/risks
>
> > I'm terribly sorry, but I didn't get your point, if there was one.
> > Seriously, no irony at all. Care to elaborate?
>
> oh, sorry. You were listing "software bugs that cost both money and
> lives", I thought your list was a bit light (Ariane and Therac spring
> to mind immediately). I thought you might not have come across the
> RISKs forum that discusses many computer related (and often software
> related) bugs.

Oh, well, that seems rather obvious now. Lack of sleep ftl. I'm sure
Seebs got the point, anyways.
From: James Kanze on
On 12 Feb, 22:21, Brian <c...(a)mailvault.com> wrote:
> On Feb 12, 3:36 pm, James Kanze <james.ka...(a)gmail.com> wrote:
> > On Feb 12, 11:42 am, Leif Roar Moldskred

> > <le...(a)huldreheim.homelinux.org> wrote:
> > > In comp.lang.java.programmer Arved Sandstrom <dces...(a)hotmail.com> wrote:
> > > > This is what I am getting at, although we need to have
> > > > Brian's example as a baseline. In this day and age,
> > > > however, I'm not convinced that a person could even give
> > > > away a free car (it wouldn't be free in any case, it
> > > > would still get taxed, and you'd have to transfer title)
> > > > and be completely off the hook, although 99 times out of
> > > > 100 I'd agree with Brian that it's not a likely scenario
> > > > for lawsuits.
> > > Where Brian's example falls down is that the previous
> > > owner of the car is, in effect, just a reseller: he isn't
> > > likely to have manufactured the car or modified it to any
> > > degree. However, let us assume that he _has_ done
> > > modifications to the car such as, say, replacing the fuel
> > > tank. If he messed up the repair and, without realising
> > > it, turned the car into a potential firebomb, he
> > > would be liable for this defect even if he gave the car
> > > away free of charge.

> > He doesn't even have to have done that much. If he knows
> > that the brakes don't work, and he lets you drive it, he's
> > legally responsible.

> I have no problem with that. Some, though, believe that if you
> give away a car and aren't aware of a problem with the car,
> you are still liable. I don't think I'm obligated to
> have a car looked at by a mechanic before giving it away. If
> it is safe to the best of my knowledge, then I should just
> tell whoever wants the car about its history and encourage
> them to have the car checked out.

Yes. There are two things that could make you liable: deceit or
negligence. If you make claims you know aren't true, it's
deceit. As a private individual giving something away,
negligence (to the point of making you liable) is practically
impossible. As Jerry pointed out, even in the case of a large
company, with expertise in the field, it's very, very difficult.

IMHO, if you don't take well-known and proven preventive
measures, you're negligent. But Jerry is a lot
better versed in the legal issues than I am, so I'll take his
word for it that if everyone's doing it, you're off the hook.
And there are certainly enough software firms delivering junk to
count as almost everyone. (Although it probably depends on the
domain. Almost every firm delivering software to NASA is taking
adequate steps; for that matter, most of the telecoms software
I've seen was "correctly" developed as well.)

--
James Kanze
From: Arved Sandstrom on
Jerry Coffin wrote:
> In article <eab51075-377a-4714-ab9d-853df4fcae95
> @b2g2000yqi.googlegroups.com>, electricdelta(a)gmail.com says...
>
> [ ... ]
>
>> Nobody knows how to build earthquake-immune buildings, yet
>> engineers give certain guarantees. When those fail to be met,
>> (s)he is held liable. Maybe it's about time some "software
>> engineers" were held liable for their unreliable code in the same
>> way.
>
> Unfortunately, I'm afraid you're mostly wrong. If a building falls
> down, grounds for a lawsuit would be that the engineer(s) involved in
> the design were "negligent". In this case, "negligent" is generally
> defined to mean that the care with which they did this particular job
> was substantially less than would be expected of most others in the
> same profession.
>
> To put it somewhat differently, to win such a case, you need to show
> that (in essence) if virtually any of their direct competitors had
> done the job instead, you'd have a reasonable assurance that you
> would have received a result of substantially better quality.
>
> In the case of software, showing such a thing would be next to
> impossible. Software disasters of truly epic proportions are
> commonplace, well known and easy to cite. Offhand, I'd be hard put to
> think of even one "good practice" that's sufficiently widespread that
> I could testify that it was at all surprising when it wasn't
> followed!

All of which is true, and all of which defines the problem. Acquisition
of quality software is hit and miss, and employment or contracting of
quality software developers is hit and miss. Mostly miss in both cases.

Peter Seebach has made the point: why worry about certifications and
warranties and so forth? He states that there exist individuals who
follow best practices - which is true - and that they don't need a
label. Furthermore, he's made the related statement that the widespread
availability of free software is very beneficial (it may or may not be,
IMHO), and that warranties and liability would be detrimental to this
process.

My take on it is, and I've said it before, certifications and warranties
and so forth are about lifting the bar. Without them purchasers and
employers need to spend a great deal more time and money to establish
what and who is quality, and what and who is not (because most of the
products and prospective employees are _not_ quality). Having
certifications/education/professional development/discipline etc which
are associated with a true profession helps provide an assurance that
*any* certified software developer you intend to hire is going to be
somewhat competent. And having guarantees on a program helps provide an
assurance that the program is suitable for a stated purpose.

The issue is that we are all craftsmen, and our "profession" is in a
pre-Industrial Era stage of craftsmanship. Except that we're missing
even a formal classification of craftsmen (apprentices, journeymen,
masters), so employers have to sort through the inflated self-labelling
so common in our field. Arguments are constantly made that software
development is somehow not amenable to being qualified and quantified,
that it's an art (and a craft), that it's impossible to create reliable
software, and so forth. These are precisely the kinds of arguments that
craftsmen made back in the day, and they don't hold any water.

We will eventually lift ourselves, or be lifted, into the status of a
profession. I'd rather do the lifting than be lifted.

AHS
From: Arved Sandstrom on
LR wrote:
> Arved Sandstrom wrote:
>
>> To my way of thinking there are some
>> implied obligations that come into effect as soon as a software program
>> is published, regardless of price. Despite all the "legal" disclaimers
>> to the effect that all the risk is assumed by the user of the free
>> software, the fact is that the author would not make the program
>> available unless he believed that it worked, and unless he believed that
>> it would not cause harm.
>
> Aren't some programs released with known defects?

Absolutely. To me, if a program comes with a clear list of known
defects, with descriptions, that's a sign of maturity. Essentially the set of
known defects simply reduces the working functionality of the program -
if you can form an educated opinion about what that working
functionality is then you're OK to make a decision as to whether or not
it's suitable for the desired purpose.

>> This is common sense.
>
> Applied to what is most likely a branch of mathematics or applied to the
> law?

Applied to the law. Don't get me wrong, I'm not saying the law is like
this now. It's not.

>> I don't know if there is a legal principle attached to this concept, but
>> if not I figure one will get identified. Simply put, the act of
>> publishing _is_ a statement of fitness for use by the author, and to
>> attach completely contradictory legal disclaimers to the product is
>> somewhat absurd.
>
> I think this may be part of an ongoing controversy. Here's a taste of
> what's coming.
> http://www.tampaflduilawyer.com/Defenses/DUIBreathTest.aspx (Look for
> "Throughout the State of Florida, DUI defense attorneys are demanding
> that the State of Florida provide the source code") and there's this:
>
> "Reasons Why Production of the Source Code is Necessary"
> "7. The extent that known and unknown flaws in the program affect the
> accuracy of the test results."
>
> LR

And to my way of thinking, what's wrong with that? Although in this case
it seems like the fault, if any, would lie more with the purchaser.
Since we currently don't have any widespread mechanisms for certifying
and guaranteeing software, it's up to purchasers to exercise due
diligence *when they can*. I think it's unreasonable to expect private
individuals to be able to test software before purchase, but a
government certainly ought to be able to arrange for thorough testing
and inspection before they decide to buy an application.

AHS