From: Charles Hottel on

"Pete Dashwood" <dashwood(a)removethis.enternet.co.nz> wrote in message
news:7rv6pmF6k5U1(a)mid.individual.net...
> Charles Hottel wrote:
>> "Pete Dashwood" <dashwood(a)removethis.enternet.co.nz> wrote in message
>> news:7rtoeoFd3qU1(a)mid.individual.net...
>>>
>>
>> <snip>
>>
>>> I think we part company here, Charlie. I don't believe people are
>>> evil (I do believe there are aberrated people who behave very badly,
>>> but I don't believe they were born that way...). I don't believe in
>>> "original sin", and I can't believe a newborn baby is anything but a
>>> clean slate that life experience and the intellect of the child will
>>> write on.
>> <snip>
>>
>> I worded this poorly. I believe people are capable of both 'good'
>> actions and 'bad' actions depending upon their circumstances, the
>> pressures that they are under, and their ability to control their
>> emotional and impulsive responses, as well as a host of other factors.
>>
>> Suppose that nanotechnology existed that could be fed a good but
>> imperfect terrorist profile as input, and could seek out, identify,
>> and count all those people who fit the profile. Further suppose
>> that some innocent people would be counted as terrorists and some
>> potential terrorists would escape detection, but that a limit to the
>> margin of error could be determined. Suppose it could be determined
>> what the number of auxiliary casualties would be, such as children of
>> terrorists who might not survive if their parents were eliminated.
>> Lastly, suppose that the capability to kill all those identified as a
>> potential threat exists.
>> You say the 'good' guys would not deploy this technology. If it were
>> the day after 9/11, would you still say that? What if there were
>> intelligence that a dirty-bomb attack on New York City was in
>> the process of being set up and that one million people would die? What
>> if it were two million or three million people? What if it
>> were a coordinated attack on 3 cities, or 5 cities, or 10 cities, or
>> more?
>> At some point the 'good' guys would weigh the number of their
>> potential casualties against the number of potential 'enemy'
>> casualties. What if killing all the "enemy" would result in the death
>> of fewer people than if nothing were done? What would be the most
>> moral choice?
>> I think the answer to what would be done is clear just from the
>> history of previous wars. WWII did not involve nanotechnology, but
>> decisions were made that resulted in the deaths of 50 to 70 million
>> people and they are still just as dead even though it was more
>> conventional technology that caused their deaths. The only
>> difference is that nanotechnology is potentially more powerful and
>> possibly more discriminating in its selection of who to kill. Today
>> we use smart bombs and say they cause fewer collateral casualties.
>>
>> As our power to create ever more powerful technologies increases, our
>> wisdom to decide whether or not to use them is not increasing at the
>> same pace. I would prefer beings that have God-like power to also
>> have God-like wisdom.
>> I have played God, not out of choice, and I hated it. I had to decide
>> whether and when to have my dog put to sleep. I made the best
>> decision that I could. Was it right and timely? I do not know. Did
>> I wait too long, and did my dog suffer more than she should have? Did
>> I act too soon and take away good days of remaining life? I don't
>> know, and I don't know how I could have known the best time to
>> act. I just have to live with my choice.
>> As you know, I went through the experience of losing a baby. I will
>> skip the details of all the decisions that had to be made, the
>> variable quality of the information that we had to base those
>> decisions on, how that information changed over time, and the
>> conflicting opinions of the various doctors. These
>> decisions affected not only our baby but also my wife and her
>> potential to have a baby in the future. In
>> retrospect we made some bad decisions, and we were lucky that
>> circumstances played out so that it would at least not be dangerous
>> for her to have a baby in the future. In fact, the actual
>> consequences we experienced were the same as if all our decisions
>> had been 100% correct. We were very, very lucky in that part at
>> least.
>> Will politicians and military strategists agonize over the potential
>> deaths of millions as much as I did over my wife, baby, and dog? I
>> doubt it. Perhaps I am too tender-hearted or possibly even
>> emotionally immature, but perhaps the people deciding on the proper
>> uses of nanotechnology are too cynical or too concerned with
>> maintaining their positions. Only time will tell, and the future of
>> the human race may rest upon the decisions that are made. I only
>> wish I thought and felt that we were ready to make these momentous
>> decisions. The time to get ready is growing very short, and the
>> majority of people have no clue that this is approaching.
>
> A really well written post, Charlie. I thought about your points long and
> hard, and have nothing to add. I also think your conclusion that we may be
> about to obtain weapons we are not emotionally ready for is an excellent
> one.
>
> The best analogy I could think of is that it's like giving an Uzi
> submachine gun to a 9-year-old who is "normally well behaved".
>
> So you'd just say: "Don't give him the gun..." But this is not a gun. It
> is a technology that could help millions of people. We cannot suppress it.
>
> The best we can do is educate the child to the enormity of the power in
> his hands.
>
> I agree that it's a frightening thought.
>
> Pete.

Thanks.

Just as we now hope that anti-virus software keeps ahead of, or at least
limits the effects of, virus software, we can hope that the countermeasures
of the good guys can prevent catastrophic effects from the bad guys. I
would not be afraid of giving an Uzi to a 9-year-old if the Uzi were
intelligent enough to recognize its rightful owner and work only for that
person. Other variations on this theme are of course possible, and possibly
desirable. Here's hoping the good nanobots can out-replicate the bad ones
;-)

Perhaps as a retirement project I should start work on my resurrection ship.


From: Alistair on
On Jan 22, 5:33 pm, "HeyBub" <hey...(a)NOSPAMgmail.com> wrote:
>
> I plan to re-visit that parking lot looking for more spoor.

Isn't that called "doggin'"?
From: Alistair on
On Jan 23, 2:42 am, "Pete Dashwood"
<dashw...(a)removethis.enternet.co.nz> wrote:
> Alistair wrote:
> > On Jan 22, 1:44 pm, "Pete Dashwood"
> > <dashw...(a)removethis.enternet.co.nz> wrote:
>
> >> Fortunately, as time goes by, people on both sides of this conflict
> >> are getting wiser. Eventually (and it will be a very long time) they
> >> will realise that continued warfare is in nobody's interest and
> >> they'll start trading and living together. As people become better
> >> educated, they are less likely to be strictly religious, and the
> >> religious grounds for war recede.
>
> > Witness the CIA bomber (a doctor) in Afghanistan.
>
> A fair point.
>
> Maybe exposure to the Military affects the reason. :-) (I too was a
> soldier... :-))
>
> Maybe some people have religion so deeply ingrained in them that no amount of
> education will get them thinking for themselves.
>

Whenever we have arguments (about the age of the earth, etc.), my
fundamentalist friend asks for citations and says "that depends upon what
you mean by...". E.g., the earth was calculated to be 6000 years old
(give or take a day or two, and allowing for the calendar changes) by
some learned scholar. My friend takes the view that if God made the
earth in 6 days then He did, and that each day could last as long as
God wanted it to. Oddly, he is a geek and heavily into science, but he
just doesn't accept fundamental scientific principles such as uranium
half-lives.

The evil side of me hopes that the fundamental principles under
which his internal combustion engine works will suffer a localized
breakdown, and that he won't be covered for acts of God.

From: Alistair on
On Jan 23, 4:42 am, "Charles Hottel" <chot...(a)earthlink.net> wrote:
> "Pete Dashwood" <dashw...(a)removethis.enternet.co.nz> wrote in message
>
> news:7rv6pmF6k5U1(a)mid.individual.net...
>
>
> [snip]
>
> Just as we now hope that anti-virus software keeps ahead of, or at least
> limits the effects of, virus software, we can hope that the countermeasures
> of the good guys can prevent catastrophic effects from the bad guys. I
> would not be afraid of giving an Uzi to a 9-year-old if the Uzi were
> intelligent enough to recognize its rightful owner and work only for that
> person.

That would really slow down the gunfight at the OK Corral, as the DNA
analyzers in the gun grips did a spot-check on the handlers' DNA
sequences. Of course (they are developing guns that can recognize their
owners) there are always ways around that - using severed hands....

From: Anonymous on
In article <f8a8a0a3-d02a-4ff8-b4be-a80dbb68ed3f(a)a22g2000yqc.googlegroups.com>,
Alistair <alistair(a)ld50macca.demon.co.uk> wrote:
>On Jan 23, 2:42 am, "Pete Dashwood"
><dashw...(a)removethis.enternet.co.nz> wrote:
>> Alistair wrote:
>> > On Jan 22, 1:44 pm, "Pete Dashwood"
>> > <dashw...(a)removethis.enternet.co.nz> wrote:

[snip]

>> Maybe some people have religion so deeply ingrained in them that no amount of
>> education will get them thinking for themselves.
>>
>
>My fundamentalist friend whenever we have arguments (about the age of
>the earth, etc.) asks for citations and says "that depends upon what
>you mean by...".

This is one of the Very Good Reasons for beginning with agreed-upon
Definitions, Postulates and Common Notions before one starts to construct
Propositions for proof or disproof.

DD