From: topmind on
>
> >I have a question for you. If for the sake of argument the changes
> >*are* random, do you still think polymorphism would simplify changes?
>
> Yes, because polymorphism helps me to decouple. Decoupled modules are
> easier to change.
>

First of all, I thought you agreed that not all coupling is bad
(remember the suitcase analogy?). Thus, less coupling is not inherently
good or bad. Your line of reasoning appears inconsistent on this point.

It seems coupling is "good" if it helps simplify changes, and "bad" if
it does not. But this gets back to probabilities of possible change
patterns.

For example, copy-and-paste actually *reduces* coupling because it lets
things stay independent. Thus, if reducing coupling were always good,
then copy-and-paste would always be good.

So, let's focus on change impact. Can you answer in terms of change
impact instead of "coupling"? That would be helpful. If you would
rather stick to "coupling", then first we need to clarify what kind of
coupling is good and what kind is bad.

How about we explore a code-change scenario? I have five case-statement
items (each occurring once) and you use polymorphism instead. Now
suppose the change is that the items become non-mutually exclusive
(yep, that word again). Do you claim the total change effort for poly
will be less than what it would be for the case statement?
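
To make that concrete, here is a minimal Java sketch (Kind, the two
handle methods, and the doX stubs are names I made up, purely for
illustration):

    enum Kind { A, B, C, D, E }

    class Dispatcher {
        // First round: the five items are mutually exclusive,
        // so exactly one branch fires per call.
        void handleExclusive(Kind k) {
            switch (k) {
                case A: doA(); break;
                case B: doB(); break;
                case C: doC(); break;
                case D: doD(); break;
                case E: doE(); break;
            }
        }

        // After the change: the items may overlap, so the switch
        // becomes a run of independent if-tests in one place.
        void handleOverlapping(boolean a, boolean b, boolean c,
                               boolean d, boolean e) {
            if (a) doA();
            if (b) doB();
            if (c) doC();
            if (d) doD();
            if (e) doE();
        }

        void doA() {} void doB() {} void doC() {}
        void doD() {} void doE() {}
    }

With five subclasses each overriding one method instead, what does that
same change cost?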

Somewhere around here I think I gave an example of a device driver-like
thingy that now wants to output in multiple formats (HTML, PDF, RTF,
etc.) in one shot so that users can browse a file directory and pick
the format they want. Remember, this was not a requirement on the first
round of coding.
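
Roughly, the first-round shape was something like this (a sketch from
memory; the Formatter interface and the names are mine, not the
original code):

    interface Formatter {
        void write(String document);
    }

    class Report {
        // First round of coding: the user picks exactly one format
        // and we hand the document to that driver.
        void emit(Formatter chosen, String doc) {
            chosen.write(doc);
        }
        // The change: emit HTML, PDF, RTF, etc. all in one shot.
        // Whatever was built around "pick one driver" (menus,
        // config, caller code) is what the change lands on.
    }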

-T-

From: Juancarlo Añez on
Robert, (Bob?)

>Of course they describe the whole things as approaching programming
>from a different perspective -- and that's a fair comment.

It is *much more* than a "fair comment". It is what it's all about!

There's a quote from Dijkstra (that I can't find right now) in which he
said something to the effect of "I build languages over languages over
languages until I arrive at one in which it is easy to write a solution
to the problem at hand" (I read the quote in "Anatomy of LISP" by John
Allen, which I also can't find right now).

If you're arguing that OO is a small step from function tables
implementation-wise, I can't dispute that. But what matters is that
OO (from Simula onwards), whatever the implementation (a dynamic
language may choose maps instead of vectors), was a huge conceptual
step in how we approach problem solving with programming. As big as
Dijkstra showing that structured programming was complete. As big as
Hoare's concept of the "monitor", which provided a new insight into
parallel programming (who cares whether you need one, two, or three of
Dijkstra's "semaphores" to implement a monitor?).
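
To illustrate the monitor point (my example, in Java, not anything from
Hoare or Dijkstra): the same shared counter, once as a monitor and once
hand-built from a semaphore.

    import java.util.concurrent.Semaphore;

    class MonitorCounter {
        private int count = 0;
        // In Java every object is a monitor: "synchronized" gives
        // mutual exclusion without exposing any lower-level parts.
        public synchronized void increment() { count++; }
    }

    class SemaphoreCounter {
        private int count = 0;
        private final Semaphore mutex = new Semaphore(1);
        public void increment() throws InterruptedException {
            mutex.acquire();      // the semaphore plays the lock
            try { count++; } finally { mutex.release(); }
        }
    }

Mechanically, the monitor reduces to the semaphore; conceptually, the
monitor is the new idea.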

It is all machine code, bits, and transistors in the end, but that is
not what it's all about.

For example, portable, compiled-to-byte-code languages are a very old
idea that became mainstream only recently, after people found ways to
implement them efficiently. Byte-code is now much more than a
convenience: it is a resource that enables designs one would not have
risked imagining just a while ago.

The way we approach software analysis and design today is fundamentally
different because of the availability of OO. The difference is not
limited to "programming convenience". It is an epistemological kind of
thing (as with Hoare and Dijkstra), absolutely.

Sorry for the long note, but....

Juanco

From: Isaac Gouy on
Robert C. Martin wrote:
> On 20 Jun 2005 02:16:36 -0700, ggroups(a)bigfoot.com wrote:
>
> >Of course, the words of the creators of Simula will be of no comfort
> >to prevaricators who cannot deal with the fact that OO did not appear
> >from some programmers hacking around with Algol-60 and discovering
> >things by mistake (a s/w Penicillin in effect) ... ***
>
> Steve, there is a difference between accusing someone of being
> incorrect, and accusing someone of being a liar. The first is
> civilized, and the second is libelous.

Robert, I was puzzled by your reaction (in English, "prevaricate" is
not a synonym for "lie"), so after checking the OED, I looked in
Webster's and found the definition I expected, "to turn aside from, or
evade, the truth; equivocate", and then a secondary meaning, "to tell
an untruth".

Maybe something was lost in translation between English and American.

From: coeval on
Years ago I came across this question: how does one apply object
polymorphism at the business level?
At the time, with the buzzword being pushed by all the OO
methodologies, it was hard to find other people asking the same
question.

So I decided to take a look at some formal methods (Z, VDM) to find a
formal, mathematical definition of object polymorphism... and the
methodologists were stuck when it came to demonstrating a logical
proof.

As far as I remember, just one book (by K. Lano of Imperial College
London) stated, right at the beginning: "We are unable to give you any
formal description of polymorphism because it has no mathematical
proof."

Since that time I only use polymorphism "for animals, shapes, and
devices".

When one says the main concept in OO is polymorphism, let me say I have
some doubts.

I read a lot of books about OO to improve my skills, but most of the
time the real world is completely different.
I suggest OO authors and OO evangelists should dive into 100,000 lines
of Java/C++ code, written by numerous OO-ish programmers, to fix bugs,
defects, performance issues, and misconceptions.

And I've learned for myself how hard it is to add to or change a huge
amount of code based on nothing but class-inheritance techniques
"baked" elsewhere.

At the same time I started using OMT/UML to design my projects, but
year after year, as I used and still use relational databases, I
changed my approach, because I want to explain to my customers what I
really understand about their business.
So I am now using the NIAM and ORM methods (Object-Role Modeling, from
Terry Halpin of Northface University), which are well-suited for
relational databases.


Harvey

From: Phlip on
coeval wrote:

> When one says the main concept in OO is polymorphism, let me say I
> have some doubts.

Have you read the book /Design Patterns/? The authors held themselves to the
laudable goal of citing 3 real-life projects per pattern.

Each had more, of course.

> I read a lot of books about OO to improve my skills, but most of the
> time the real world is completely different.

The "real world" is the problem.

Plenty of OO tutorials accidentally start with "OO works best because you
can model the real world", or similar. This is heinous verbiage for one
simple reason: If we discuss which of two OO models is better, saying "mine
is better because it models the real world" adds nothing to the
conversation.

The OO model with the simplest implementation and least redundancy is
better. The Real World _never_ obeys that guideline. But sometimes the best
way to replace redundancies with flexibilities is with virtual methods.
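
For instance (a sketch, with names I made up): two near-duplicate
routines collapse into one skeleton, and the step that varied becomes a
virtual method.

    abstract class ReportJob {
        // The shared skeleton, written once.
        final void run() {
            load();
            render();   // the one step that differed between copies
            save();
        }
        void load() { /* common loading */ }
        void save() { /* common saving */ }
        abstract void render();  // each subclass supplies its variant
    }

    class SummaryJob extends ReportJob {
        void render() { /* summary-specific output */ }
    }

    class DetailJob extends ReportJob {
        void render() { /* detail-specific output */ }
    }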

You use OO every time you open a file handle and write to it. The target
could be a disk file, a console, a printer, etc. That's polymorphic behavior
encapsulated behind an interface: OO.
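
In Java terms, for example (standard library, nothing exotic):

    import java.io.*;

    class Demo {
        // This code neither knows nor cares what the bytes hit.
        static void report(Writer out) throws IOException {
            out.write("hello\n");
            out.flush();
        }

        public static void main(String[] args) throws IOException {
            try (Writer f = new FileWriter("report.txt")) {
                report(f);                        // a disk file
            }
            report(new PrintWriter(System.out));  // the console
            report(new StringWriter());           // an in-memory buffer
        }
    }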

> I suggest OO authors and OO evangelists should dive into 100,000
> lines of Java/C++ code, written by numerous OO-ish programmers, to
> fix bugs, defects, performance issues, and misconceptions.

Show me a technology that can't be abused and I'll show you one that's
useless.

--
Phlip
http://www.c2.com/cgi/wiki?ZeekLand

