From: Christopher Barber on
spinoza1111(a)yahoo.com wrote:
>
> Christopher Barber wrote:
>
>>spinoza1111(a)yahoo.com wrote:
>>
>>>Christopher Barber wrote:
>>>
>>>
>>>>spinoza1111(a)yahoo.com wrote:
>>>>
>>>>
>>>>
>>>>>>>For, I used the phrase "close to" as regards O(n).
>>>>>>>
>>>>>>>O(n*n) == O(n*n/2), of course. But it's also the case for practical
>>>>>>>applications that n*n/2 < n*n.
>>>>>>
>>>>>>Sure, given the choice between an algorithm that costs n*n steps and one that
>>>>>>costs n*n/2 steps of the same size, the latter is a better choice in terms of
>>>>>>cost, but I don't know of anyone other than you who would call such an
>>>>>>algorithm "close to" O(n). When there are linear and O(n log n) algorithms
>>>>>>that are also simpler to implement, what is the practical use case for your
>>>>>>algorithm?
>>>>>
>>>>>Read the literature on O(x), for Chrissake. See for example Skiena, The
>>>>>Algorithm Design Manual
>>>>>(http://www.amazon.com/exec/obidos/tg/detail/-/0387948600/qid=1118630635/sr=8-3/ref=pd_csp_3/103-4992382-6347852?v=glance&s=books&n=507846).
>>>>>
>>>>>You don't UNDERSTAND O(x) UNTIL you know when to use O(n^2) in the
>>>>>pragmatic sense where n is small and known to be small for practical
>>>>>applications.
>>>>
>>>>You are right that it can be OK to use a suboptimal algorithm when you
>>>>know for a fact that the data size will be small, but generally you would only
>>>>do so because the faster algorithm is more complex or more work to implement.
>>>>This is not the case with your algorithm.
>>>
>>>How do you know that for a specific instance? In a specific instance,
>>>the preconditions may indeed be that n is bounded and delete is fast.
>>>In a specific instance, you have to know how to work with (write about:
>>>speak of) algorithms without snap moral judgements based on
>>>ill-understood complexity theory.
>>
>>First, if delete has constant cost then your algorithm is O(n^2). If delete
>>is O(n), then your algorithm is O(n^3), so the cost of delete doesn't really
>>matter.
>>
>>Second, the "trick" solution is simple to implement (provided you are using a
>>non-crappy programming language) and is O(n), but even without XOR you can code
>>an O(n) solution using hash tables or a large bit array.
>>
>>Third, assuming you have a system-supplied sort function, the sort and search
>>solution is O(n log n) and still easier to implement.
>>
>>Finally, your algorithm isn't even the simplest O(n^2) algorithm!
>>
>>/* Assumes int a[max] with i, j declared; returns the value that has
>>   no matching partner anywhere else in the array. */
>>for (i = 0; i < max; ++i) {
>>    for (j = 0; j < max; ++j) {
>>        if (i != j && a[i] == a[j]) break;  /* found a partner */
>>    }
>>    if (j == max) return a[i];  /* no partner: a[i] is the answer */
>>}
>
>
> Might be simple, but it is not very elegant, since it wastes time,

Of course it wastes time. It is O(n^2)!

> in the outer
> loop, on elements that have already been paired. It needs to delete. And
> it only has one operation inside the inner loop.
>
> Are you even AWARE that one way to improve a loop within the overbroad
> categories of algorithm complexity is to move paired operations in the
> loop as I did? Sounds like you need to supplement your education with
> some knowledge of compiler optimization, in which code improvements are
> made in the small but in aggregate make a genuine contribution.

Why bother with optimizing a bad algorithm? The only time you would use
such an algorithm is when n is very small so there is no need to perform
any other optimizing tricks. When n is large you had better be using
an O(n) algorithm.
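
For concreteness, here is a minimal sketch of the XOR trick mentioned
elsewhere in this thread. The function name and signature are mine, and
it assumes exactly one value in the array is unpaired while every other
value appears an even number of times:

int find_unpaired(const int a[], int max)
{
    int i, x = 0;
    for (i = 0; i < max; ++i)
        x ^= a[i];  /* paired values cancel, since v ^ v == 0 */
    return x;       /* whatever remains is the unpaired value */
}

One pass, constant extra space, and nothing to delete.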

> I agree that an O(log n) algorithm is in some nonmoral sense better
> than O(n), which is better than O(n^y)! But in real programming there's
> much more to know.

Like what?

Your algorithm doesn't appear to have any practical use and you have yet
to suggest one.

>>>>You appear to have "misread" me. I just meant that it was not obvious
>>>>from what you posted *previously* that you understood the O() notation.
>>>
>>>Sorry you got that impression. But in a collegial setting, you'd assume
>>>that discussants understood the material and you wouldn't have to waste
>>>time establishing their bona fides.
>>
>>I would not assume that someone understood if they said something that
>>made it seem otherwise. That was the case here. When you suggested your
>>O(n^2) was "close to" O(n), you used language that I and others have never
>>heard used when speaking of the O() notation, so the assumption of prior
>>knowledge was dropped.
>
> The problem is that you seem more concerned with biography than
> anything else. OK, the terminology is unfamiliar.

It is not the terminology that is unfamiliar, it is your non-standard use of
it. I have never heard anyone speak of an O(n^2) algorithm as being "close
to" an O(n) one in performance. I have no idea what you mean by "biography".

> Hey, how did you do
> in calculus and the notion of limit of a function?

Of course I did, but the notion of a limit does not apply to this topic.
As n increases, the performance of O(n) and O(n^2) algorithms will
diverge, not converge.
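At n = 1000, for example, an O(n) scan does on the order of a thousand
steps while n*n/2 is half a million; the gap only widens as n grows.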

> Perhaps there is a whole field of algorithm dynamics in which the
> properties of algorithms are studied as they change between different
> versions of the algorithm.

I am not sure what you mean, but those who study algorithms do examine
their properties under different conditions.

> Probably not. But one reason for the 80% failure rate of enterprise
> systems is the unwillingness to speak of code without snap judgements
> and one-upmanship in a field that has been RUINED by laissez faire
> "competition".

I don't think anyone here made a "snap judgement". I also don't see
how suggesting improvements to algorithms or program designs is going
to lead to failures in enterprise systems.

> And even if it makes no sense to speak of the execution time of a
> program approaching a limit, becoming "close to O(n)", approaching an
> infinitesimal but nonzero value for larger N, note that there's a whole
> field of code improvement within compiler optimization in which the
> improvements are beneath the radar screen of complexity theory.

There is no such thing as an infinitesimal integer value. It is also not
interesting to claim that algorithms tend to converge in performance as the
sample size goes down because all algorithms are going to perform just about
the same for n == 0. By this argument, an O(n^n) algorithm is "close to"
O(n)!

I don't really know what you are talking about with regard to compiler
optimization, but based on this discussion I wouldn't want you optimizing our
compiler.

> I'm a happy camper who likes to solve problems.

You don't give the appearance of being a "happy camper", and if you are
so into solving problems then why are you so resistant to admitting that
there might be better algorithms for the problem at hand?
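
To make "better algorithms" concrete, here is a sketch of the
sort-and-search approach using the standard qsort. The function names
are mine, and it assumes, as before, that exactly one value is unpaired:

#include <stdlib.h>

static int cmp_int(const void *p, const void *q)
{
    int a = *(const int *)p, b = *(const int *)q;
    return (a > b) - (a < b);
}

int find_unpaired_sorted(int a[], int max)
{
    int i;
    qsort(a, max, sizeof a[0], cmp_int);
    /* After sorting, paired values are adjacent; the first even index
       whose neighbor does not match holds the unpaired value. */
    for (i = 0; i + 1 < max; i += 2)
        if (a[i] != a[i + 1])
            return a[i];
    return a[max - 1];  /* the unpaired value sorted to the end */
}

O(n log n) for the sort, O(n) for the scan, and nothing fancier than
the standard library.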

> But I also like to fight with people who deal with their own
> feelings of incompetence by transferring them onto innocent bystanders.

I don't know who you are talking about, but it isn't me.

- Christopher
From: spinoza1111 on


Gerry Quinn wrote:
> In article <1118670149.685715.160970(a)o13g2000cwo.googlegroups.com>,
> spinoza1111(a)yahoo.com says...
> > Gerry Quinn wrote:
>
> > > Puzzles are fun and educational. Good teaching IMO. The fun element
> > > means the student understands the problem domain concretely before he
> > > tries to plug in textbook solutions. (It worked on you, surely - I
> > > should think/hope you now know more about algorithmic complexity than
> > > you did before this thread. And you've learned a new XOR trick as
> > > well!)
> >
> > You talking to me? I didn't learn jack from this exchange, and that's
> > the problem. I already knew about algorithm complexity and had my time
> > wasted by people who don't know it, because I had to again produce an
> > explanation of their bizarre and offensive behavior.
>
> Nobody who can claim:
>
> "As long as DELETE is an atomic and fast operation this algorithm is
> close to O(n) because the array gets smaller by two entries in the
> worst case through its major loop, and in the worst case for the inner
> loop (where the inner loop must search the entire array in both
> directions) the algorithm is finished by definition."
>
> ...can also claim to be well versed in asymptotic algorithm complexity
> order, since this algorithm is trivially O(n^2).

No, the fact that you read into the post any equation between O(n) and
O(n^y) means three things.

It means that your knowledge of complexity theory is probably based on
the anxious rote memorization of the categories, where the rote
memorization replaces the ability to ideate with overbroad categories
and simplified equations.

It means that, with deficient understanding, you put the worst possible
construction on any extended text, in the manner of the censor.

It also means you're probably unfamiliar with code improvement as seen
in compiler optimization to which the broad categories of complexity
theory are simply not applicable.

You may post to show off a pre-existing erudition, the appearance of
which you defend by never being quite explicit in such a way as would
benefit actual seekers-of-knowledge in the original intention of this
ng. I remain interested in content.
>
> - Gerry Quinn

From: Mike on
In article <1118739693.953533.129230(a)z14g2000cwz.googlegroups.com>,
spinoza1111(a)yahoo.com says...

> The problem is that you seem more concerned with biography than
> anything else. OK, the terminology is unfamiliar. Hey, how did you do
> in calculus and the notion of limit of a function?
>
> Perhaps there is a whole field of algorithm dynamics in which the
> properties of algorithms are studied as they change between different
> versions of the algorithm.
>
> Probably not. But one reason for the 80% failure rate of enterprise
> systems is the unwillingness to speak of code without snap judgements
> and one-upmanship in a field that has been RUINED by laissez faire
> "competition".
>
> And even if it makes no sense to speak of the execution time of a
> program approaching a limit, becoming "close to O(n)", approaching an
> infinitesimal but nonzero value for larger N, note that there's a whole
> field of code improvement within compiler optimization in which the
> improvements are beneath the radar screen of complexity theory.
>
Hmmm..."limit of a function"? "close to O(n)"?

Perhaps you could justify your initial analysis on the
grounds that O(n^2) -> O(n) as n -> 0.

Mike
From: Peter Ammon on
spinoza1111(a)yahoo.com wrote:
>
> Peter Ammon wrote:
>
>>spinoza1111(a)yahoo.com wrote:
>>
>>>They also produce bad feeling in a classroom because, of necessity, they make only one student look "smart".
>>>
>>
>>irony...too big...crushing....
>
>
> Why is this ironic?

[...]

> A puzzle generates one answer, along with resentment.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

What you wrote above is exactly what happened in this thread - and
that's the irony.

-Peter
From: CBFalconer on
Mike wrote:
> In article <1118739693.953533.129230(a)z14g2000cwz.googlegroups.com>, spinoza1111(a)yahoo.com says...
>
.... snip ...
>>
>> And even if it makes no sense to speak of the execution time of a
>> program approaching a limit, becoming "close to O(n)", approaching
>> an infinitesimal but nonzero value for larger N, note that there's
>> a whole field of code improvement within compiler optimization in
>> which the improvements are beneath the radar screen of complexity
>> theory.
>>
> Hmmm..."limit of a function"? "close to O(n)"?
>
> Perhaps you could justify your initial analysis on the
> grounds that O(n^2) -> O(n) as n -> 0.

Hmm - you may have something there. All the derivatives also
approach zero at the same time. This proves something about
trivialities. Maybe "as the algorithm becomes sufficiently trivial
the incoherencies disappear"?

--
"If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers." - Keith Thompson

