From: Scott Sauyet on
On Jan 16, 11:38 am, Thomas 'PointedEars' Lahn <PointedE...(a)web.de>
wrote:
> Scott Sauyet wrote:
> > Thomas 'PointedEars' Lahn wrote:
> >> Scott Sauyet wrote:
> >>> You're getting as pedantic as Thomas is, here.
>
> >> By contrast, I am being *precise*, stupid.
>
> > | 2. overly concerned with minute details or formalisms, esp. in
> > teaching
>
> > - http://dictionary.reference.com/browse/pedantic
>
> > Or another way of putting it, being overly precise.
>
> And I do not think I have been *overly* precise. I just would not accept my
> words being twisted by wannabes. If you think that to be pedantic, so be
> it.

Almost always when I think you're being pedantic, I do also think
you're right. I don't think you're being pedantic when you object to
someone twisting your own words. What I find overly precise is when
you correct something that people widely recognize as accurate
*enough*, like when (perhaps from earlier in this thread) you talked
about how it's ES engines, not browsers, at issue. Sure, it's right.
But most knowledgeable people recognize that it is true and still
prefer the common usage.

>> If it's an insult, it's a much more mild one than "stupid". :-)
>
> Fair enough :) It is too often *used as* an insult, so I guess I have grown
> a bit allergic to it. Sorry.

I do mean it in a slightly insulting way. I really think the dialog
here doesn't merit that sort of correction unless it touches on the
current discussion. And my skin is plenty thick enough to handle
being called stupid. You clearly have plenty of value to offer in the
discussions here; I like it when you focus on those rather than on
picky corrections.

Cheers,

-- Scott
From: Thomas 'PointedEars' Lahn on
Scott Sauyet wrote:

> [...] What I find overly precise is when you correct something that
> people widely recognize as accurate *enough*, like when (perhaps from
> earlier in this thread) you talked about how it's ES engines not
> browsers at issue. Sure it's right. But most knowledgeable people
> recognize that is true and still prefer the common usage.

I do hope you are mistaken here. Or, IOW: I cannot accept people who
say so as knowledgeable, because it is obviously false. You had
better double-check your assumptions.


PointedEars
--
Prototype.js was written by people who don't know javascript for people
who don't know javascript. People who don't know javascript are not
the best source of advice on designing systems that use javascript.
-- Richard Cornford, cljs, <f806at$ail$1$8300dec7(a)news.demon.co.uk>
From: Jorge on
On Jan 18, 9:30 pm, Scott Sauyet <scott.sau...(a)gmail.com> wrote:
> (...)
> Jorge pointed out [2] that this changes drastically if we up the ante
> to 20,000 array elements.  In fact, in Safari with 20000 array
> elements, the setLookupIncrement and setNewVarIncrement functions
> are, over a number of tests, between 5 and 25 times as fast as their
> *Decrement counterparts.  I followed this up in the other browsers
> [3], although I haven't put it on the results page yet, and there is
> a clear turn-around for all tested browsers -- except Opera -- though
> the cut-off varies by browser, and in no other browser is it as
> drastic as it is in Safari.
>
> The upshot is that decrement is generally better at smaller array
> sizes, but you're probably better off with increment as the array gets
> larger.  Where the cut-off is will depend on the ES implementation
> most used for your script.  In IE, it's fairly low, between 10 and
> 100; in Chrome, it's between 1000 and 10000; in Firefox, between 100
> and 1000; and in Safari, between 10000 and 20000.  And in Opera, I
> haven't found any such cut-off.
> (...)
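
For context, a minimal sketch of the two loop shapes being compared.
The actual benchmark functions aren't reproduced in this thread, so
the bodies below are illustrative assumptions, not Scott's code:

    // Hypothetical reconstruction of the two fill strategies.
    function fillIncrement(n) {
      var photos = [];
      for (var i = 0; i < n; i++) { // counts up: index 0 written first
        photos[i] = i;
      }
      return photos;
    }

    function fillDecrement(n) {
      var photos = [];
      for (var i = n; i--;) {       // counts down: index n-1 first
        photos[i] = i;
      }
      return photos;
    }

Note that the decrement version's very first write is to the highest
index, which is exactly the case where an implementation's storage
strategy for large indices comes into play.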

This thread might interest you:
http://groups.google.com/group/comp.lang.javascript/browse_thread/thread/ca8297b2de1edad1/f140443fc7dbc8a8#d45575d71a6c117a
--
Jorge.
From: Scott Sauyet on
On Jan 18, 4:54 pm, Jorge <jo...(a)jorgechamorro.com> wrote:
> On Jan 18, 9:30 pm, Scott Sauyet <scott.sau...(a)gmail.com> wrote:
>> Jorge pointed out [2] that this changes drastically if we up the ante
>> to 20,000 array elements.  [ ... ]
>
> This thread might interest you: http://groups.google.com/group/comp.lang.javascript/browse_thread/thread/ca8297b2de1edad1/f140443fc7dbc8a8#d45575d71a6c117a

Thanks. I did see that one the first time through. Although it's
interesting in its own right, I don't think it's relevant to this
discussion. We are not pre-allocating the arrays, unless, in one of
the decrement algorithms, an ES implementation does a pre-allocation
when faced with

var photos = [];
photos[10000] = //...
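
As far as the language is concerned, that assignment only has to
produce a sparse array; the intervening indices need not exist at
all. A quick probe (my own illustration, not part of the benchmark
code):

    var photos = [];
    photos[10000] = "last"; // first write lands on the highest index
    photos.length;          // 10001
    5 in photos;            // false: index 5 was never created

Whether an implementation nonetheless reserves contiguous storage up
front is the open question.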

But it's interesting re-reading!

Cheers,

-- Scott

From: Jorge on
On Jan 18, 11:21 pm, Scott Sauyet <scott.sau...(a)gmail.com> wrote:
> On Jan 18, 4:54 pm, Jorge <jo...(a)jorgechamorro.com> wrote:
>
> > On Jan 18, 9:30 pm, Scott Sauyet <scott.sau...(a)gmail.com> wrote:
> >> Jorge pointed out [2] that this changes drastically if we up the ante
> >> to 20,000 array elements.  [ ... ]
>
> > This thread might interest you: http://groups.google.com/group/comp.lang.javascript/browse_thread/thr...
>
> Thanks.  I did see that one first time through.  Although it's
> interesting in its own right, I don't think it's relevant to this
> discussion.(...)

Safari stores array elements in fast storage "slots" up to a limit,
but no more: "Our policy for when to use a vector and when to use a
sparse map. For all array indices under MIN_SPARSE_ARRAY_INDEX, we
always use a vector. When indices greater than MIN_SPARSE_ARRAY_INDEX
are involved, we use a vector as long as it is 1/8 full. If more
sparse than that, we use a map."

MIN_SPARSE_ARRAY_INDEX happens to be 1e4.
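
That would explain the turn-around: a decrement fill over 20000
elements writes index 19999 first, which is above that limit and
leaves the would-be vector nearly empty, so presumably the sparse-map
path is taken. A rough way to probe it (my own sketch, not code from
the linked thread):

    // Make the *first* store land below vs. well above 1e4, then
    // fill the low indices as usual.
    function timeFill(firstIndex, n) {
      var a = [], t = new Date().getTime();
      a[firstIndex] = 0;              // this write picks the shape
      for (var i = 0; i < n; i++) {
        a[i] = i;
      }
      return new Date().getTime() - t;
    }

    timeFill(9999, 9999);  // first store below MIN_SPARSE_ARRAY_INDEX
    timeFill(30000, 9999); // far above it, and well under 1/8 full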
--
Jorge.