From: Tony Harding on
On 01/21/10 17:45, Pete Dashwood wrote:

<snip>

> I DO feel sorry when I look around and see cases like the friend I
> mentioned, where the solution is "just around the corner" but it seems to be
> taking for ever to come into view. I may be oversensitive to this because my
> father (who my Mother and I both adored) died from heart problems in 1957
> (at the age of 46...) and a few months after he died a technique was
> perfected that could have helped him. By 1960, bypass was available and
> could have saved his life.

*My* father's 2nd heart attack (preceded by a stroke and the first heart
attack) killed him in 1957 when he was 46 years old. Just interesting
.... did you think your life was half over when you turned 23?

Bill
From: Tony Harding on
On 01/24/10 13:59, Alistair wrote:
> On Jan 23, 4:42 am, "Charles Hottel"<chot...(a)earthlink.net> wrote:
>> "Pete Dashwood"<dashw...(a)removethis.enternet.co.nz> wrote in message
>>
>> news:7rv6pmF6k5U1(a)mid.individual.net...
>>
>>
>>
>>
>>
>>> Charles Hottel wrote:
>>>> "Pete Dashwood"<dashw...(a)removethis.enternet.co.nz> wrote in message
>>>> news:7rtoeoFd3qU1(a)mid.individual.net...
>>
>>>> <snip>
>>
>>>>> I think we part company here, Charlie. I don't believe people are
>>>>> evil (I do believe there are aberrated people who behave very badly,
>>>>> but I don't believe they were born that way...). I don't believe in
>>>>> "original sin", and I can't believe a newborn baby is anything but a
>>>>> clean slate that life experience and the intellect of the child will
>>>>> write on.
>>>> <snip>
>>
>>>> I worded this poorly. I believe people are capable of both 'good'
>>>> actions and 'bad' actions depending upon their circumstances and the
>>>> pressures that they are under and on their ability to control their
>>>> emotional and impulsive responses as well as a host of other factors.
>>
>>>> Suppose that the nanotechnology existed that could be fed a good but
>>>> imperfect terrorist profile as input, and could seek out and identify
>>>> and count all those people that fit the profile. Further suppose
>>>> that some innocent people would be counted as terrorists and some
>>>> potential terrorists would escape detection, but that a limit to the
>>>> margin of error could be determined. Suppose it could be determined
>>>> what the number of auxiliary casualties would be, such as children of
>>>> terrorists who might not survive if their parents were eliminated.
>>>> Lastly suppose that the capability to kill all those identified as a
>>>> potential threat exists.
>>>> You say the 'good' guys would not deploy this technology. If it were
>>>> the day after 9-11 would you still say that? What if there was
>>>> intelligence that a dirty bomb attack on New York City was in
>>>> the process of being set up and that one million people would die? What
>>>> if it were 2 million or 3 million people, etc.? What if it
>>>> was a coordinated attack on 3 cities or 5 cities or 10 cities or
>>>> more?
>>>> At some point the 'good' guys would weigh the number of their
>>>> potential casualties against the number of potential 'enemy'
>>>> casualties. What if killing all the "enemy" would result in the death
>>>> of fewer people than if nothing were done? What would be the most
>>>> moral choice?
>>>> I think the answer to what would be done is clear just from the
>>>> history of previous wars. WWII did not involve nanotechnology, but
>>>> decisions were made that resulted in the deaths of 50 to 70 million
>>>> people and they are still just as dead even though it was more
>>>> conventional technology that caused their deaths. The only
>>>> difference is that nanotechnology is potentially more powerful and
>>>> possibly more discriminating in its selection of who to kill. Today
>>>> we use smart bombs and say they cause fewer collateral casualties.
>>
>>>> As our power to create ever more powerful technologies increases, our
>>>> wisdom to decide whether or not to use it is not increasing at the
>>>> same pace. I prefer beings that have God-like power to also have
>>>> God-like wisdom.
>>>> I have played God, not out of choice, and I hated it. I had to decide
>>>> whether and when to have my dog put to sleep. I made the best
>>>> decision that I could. Was it right and timely? I do not know. Did
>>>> I wait too long and did my dog suffer more than she should? Did I
>>>> act too soon and take away good days of remaining life? I don't know
>>>> and I don't know how I could have known the optimum and best time to
>>>> act. I just have to live with my choice.
>>>> As you know I went through an experience with losing a baby. I will
>>>> skip the details of all the decisions that had to be made, and the
>>>> details of the variable quality of the information that we had to
>>>> base those decisions on, and of how that information changed over
>>>> time, and of the conflicting opinions of the various doctors. These
>>>> decisions not only affected our baby but also my wife and the
>>>> potential for her to be able to have a baby in the future. In
>>>> retrospect we made some bad decisions, and we were lucky that
>>>> circumstances played out so that it at least would not be dangerous
>>>> for her to have a baby in the future. In fact, the actual
>>>> consequences we experienced were the same as if all our decisions
>>>> had been 100% correct. We were very, very lucky in that part at
>>>> least.
>>>> Will politicians and military strategists agonize over the potential
>>>> deaths of millions as much as I did over my wife, baby and a dog? I
>>>> doubt it. Perhaps I am too tender hearted or possibly even
>>>> emotionally immature, but perhaps the people deciding on the proper
>>>> uses of nanotechnology are too cynical or too concerned with
>>>> maintaining their positions. Only time will tell and the future of
>>>> the human race may rest upon the decisions that are made. I only
>>>> wish I thought and felt that we are ready to make these momentous
>>>> decisions. The time to get ready is growing very short and the
>>>> majority of people have no clue that this is approaching.
>>
>>> A really well written post, Charlie. I thought about your points long and
>>> hard, and have nothing to add. I also think your conclusion that we may be
>>> about to obtain weapons we are not emotionally ready for is an excellent
>>> one.
>>
>>> The best analogy I could think of is that it's like giving an Uzi
>>> submachine gun to a 9-year-old who is "normally well behaved".
>>
>>> So you'd just say: "Don't give him the gun..." But this is not a gun. It
>>> is a technology that could help millions of people. We cannot suppress it.
>>
>>> The best we can do is educate the child to the enormity of the power in
>>> his hands.
>>
>>> I agree that it's a frightening thought.
>>
>>> Pete.
>>
>> Thanks.
>>
>> Just as we now hope the anti-virus software keeps ahead of, or at least
>> limits the effects of, virus software, we can hope that the countermeasures
>> of the good guys can prevent catastrophic effects from the bad guys. I
>> would not be afraid of giving an Uzi to a 9-year-old if the Uzi were
>> intelligent enough to recognize its rightful owner and work only for that
>> person.
>
> That would really slow down the gun fight at the OK Corral as the DNA
> analyzers in the gun grips did a spot-check on the handlers' DNA
> sequence. Of course (they are developing guns that can recognize their
> owner) there are always ways around that - using severed hands....

Makes it difficult to aim and to fire. Maybe a preplanned hit, but not
the OK Corral scenario.
From: Tony Harding on
On 01/25/10 10:27, Howard Brazee wrote:
> On Fri, 22 Jan 2010 21:05:04 -0500, "Charles Hottel"
> <chottel(a)earthlink.net> wrote:
>
>> At some point the 'good' guys would weigh the number of their potential
>> casualties against the number of potential 'enemy' casualties. What if
>> killing all the "enemy" would result in the death of fewer people than if
>> nothing were done? What would be the most moral choice?
>
> Right now many predominant politicians are saying that we should
> disregard the Constitution for people who are suspected of terrorism.
> To me, that's like saying we should destroy America to save it.

As The Fugs said > 40 years ago, "killing for peace is like f*cking for
chastity" (or words to that effect).

> There are SF stories where they have computers predicting future
> crimes, and law enforcement is designed to stop crime before it
> happens. Therefore, everybody arrested and punished is innocent of
> committing any crimes, because the crimes haven't yet been committed.

Sic transit gloria US Constitution. :)
From: Howard Brazee on
On Wed, 27 Jan 2010 12:26:39 -0500, Tony Harding
<tharding(a)newsguy.com> wrote:

>As The Fugs said > 40 years ago, "killing for peace is like f*cking for
>chastity" (or words to that effect).

That's how virgins are created...

--
"In no part of the constitution is more wisdom to be found,
than in the clause which confides the question of war or peace
to the legislature, and not to the executive department."

- James Madison
From: Howard Brazee on
On Wed, 27 Jan 2010 12:02:28 -0500, Tony Harding
<tharding(a)newsguy.com> wrote:

>*My* father's 2nd heart attack (preceded by a stroke and the first heart
>attack) killed him in 1957 when he was 46 years old. Just interesting
>... did you think your life was half over when you turned 23?

I had my first heart attack at the age my father had his first. We've
both had stents, and I'm quickly approaching the age when he had his
final heart attack.

Genetics counts for a lot. Arthur Ashe was a world-class athlete, but
he had a massive heart attack at the same age as the one that killed
his father. Medicine (or maybe his active lifestyle) had advanced
enough that he survived - but not enough to screen his transfusions
for AIDS, and the transfusions ended up killing him.

--
"In no part of the constitution is more wisdom to be found,
than in the clause which confides the question of war or peace
to the legislature, and not to the executive department."

- James Madison