From: Colin Bartlett on
On Sun, Apr 4, 2010 at 1:12 AM, Caleb Clausen <vikkous(a)gmail.com> wrote:
> On 4/3/10, Intransition <transfire(a)gmail.com> wrote:
>> The more OOP approach would be:
>>
>>   class Object
>>     def visits
>>       yield(self)
>>     end
>>   end
>>
>>   module Enumerable
>>     def visits(&block)
>>       each{ |item| item.visits(&block) }
>>     end
>>   end
>
> This won't behave well if passed a true graph (not just tree or dag)
> of ruby objects. For instance:
>  a=[]
>  a<<a
>  a.visits{|x| p x }
>
> At one time, I got interested enough in this problem to write a fairly
> complete solution to this problem (at least I think so). If
> interested, please have a look at:
>  http://github.com/coatl/ron/blob/master/lib/ron/graphedge.rb

So, temporarily ignoring the problem of possible infinite loops
(which I'd just begun to worry about - I was using a = [2] ; a << a,
which in IRB gives [2, [...]], and a = a[1] #=> [2, [...]],
but your example is even simpler)
you opted for recursive_each (and recursive_reverse_each)?
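
A quick sketch of one way to make the visits idiom above cycle-safe (this is just an illustration, not the graphedge.rb code): thread a Set of already-seen object ids through the recursion.

```ruby
require 'set'

class Object
  def visits(seen = Set.new)
    yield(self)
  end
end

module Enumerable
  def visits(seen = Set.new, &block)
    return if seen.include?(object_id)   # already walked this node: stop
    seen << object_id
    each { |item| item.visits(seen, &block) }
  end
end

a = []
a << a
a.visits { |x| p x }   # terminates: the cycle is detected, nothing yielded
```

A full solution would also have to decide what "visiting" means for Hashes, Structs, instance variables and so on, which is the territory graphedge.rb gets into.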

Going back to the original problem, my understanding is that
Roger Pack was looking for a simple way of putting some
data into an array and then getting at it, in which case
the likely maximum depth may well be known, and then
Robert Klemme's solution with a mandatory level/depth argument
seems to work.

So I withdraw my suggestion of level/depth = nil for infinite depth!
(Interestingly, Numeric#step raises an error if the step is 0,
but Date#step seems to allow infinite loops with a step of 0.)
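
To illustrate the contrast (the Date#step call is left commented out, since it really would loop forever):

```ruby
require 'date'

# Numeric#step refuses a zero step outright:
begin
  1.step(10, 0) { }
rescue ArgumentError => e
  puts "Numeric#step: #{e.message}"
end

# Date#step, by contrast, happily yields the same date forever:
# Date.new(2010, 4, 4).step(Date.new(2010, 4, 10), 0) { }  # infinite loop!
```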

From: Caleb Clausen on
On 4/3/10, Colin Bartlett <colinb2r(a)googlemail.com> wrote:
> So, temporarily ignoring the problem of possible infinite loops
> (which I'd just begun to worry about - I was using a = [2] ; a << a,
> which in IRB gives [2, [...]], and a = a[1] #=> [2, [...]],
> but your example is even simpler)
> you opted for recursive_each (and recursive_reverse_each)?

Actually, while interesting, recursive_each and friends are not really
relevant to what that file is doing. I really should have put in a
little more info about what my code is doing, sorry. The main
interface is Ron::GraphEdge.graphwalk(), which walks all the edges
(not just nodes) of an object graph. A node might be visited more than
once if there are multiple edges leading to it.

I'm doing some other things there which are beyond the scope of what
others in this thread have attempted (or needed), such as iterating
over the subfields of an Object. There's also code in there for
copying an object graph, not just walking it, potentially changing it
as it is copied.

> Going back to the original problem, my understanding is that
> Roger Pack was looking for a simple way of putting some
> data into an array and then getting at it, in which case
> the likely maximum depth may well be known, and then
> Robert Klemme's solution with a mandatory level/depth argument
> seems to work.

Yeah, we (especially me) have rather gotten off topic of what Roger
wanted. I'm sure my code is not helpful to Roger, but I couldn't help
sharing it.

From: Robert Klemme on
On 04/03/2010 05:47 PM, Colin Bartlett wrote:

> I also like Robert Klemme's idea of having a level parameter,
> so if Robert is agreeable - and from his Ruby-talk posts he seems to be
> a very agreeable person! - maybe include levels, maybe renamed depth?
> (I've thought of using level in a more general Find module, and
> eventually decided depth was a better word for the concept. I can't
> remember why, but I'm sure there must have been very good reasons!)

Thanks! I think you're right: "depth" is a better name than "level".
It's probably because "depth" hints at "descending into the collection"
while "level" rather sounds like "ascending".

> A question: should the depth/levels test be levels > 1 or > 0?
> Using [ 2..2 ] and "levels > 1":
> levels param: 0 #=> 2..2; 1 #=> 2..2; 2 #=> 2;
> Using [ 2..2 ] and "levels > 0":
> levels param: 0 #=> 2..2; 1 #=> 2; 2 #=> 2;
> It seems (maybe?) more logical to have depth/levels arguments:
> 0 means don't do any recursive expansion of each;
> 1 means do some recursive expansion of each;

It's probably a matter of taste. I used "levels = 1" to indicate that
we step down one level (i.e. into the collection) - the same as what
#each does out of the box. "levels = 2" means: go one level further.
Consequently, "levels = 0" in my model means: do not step down at all.
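
A sketch of that model (not my exact code from earlier in the thread, just the idea):

```ruby
module Enumerable
  # depth == 0: do not step down at all (yield the collection itself)
  # depth == 1: behaves like plain #each
  # depth == 2: expand nested enumerables one level further, and so on
  def each_to_depth(depth, &block)
    if depth <= 0
      yield self
    else
      each do |item|
        if item.is_a?(Enumerable)
          item.each_to_depth(depth - 1, &block)
        else
          yield item
        end
      end
    end
  end
end

[2..2].each_to_depth(1) { |x| p x }   # prints 2..2, like plain #each
[2..2].each_to_depth(2) { |x| p x }   # prints 2
```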

Kind regards

robert

--
remember.guy do |as, often| as.you_can - without end
http://blog.rubybestpractices.com/
From: Robert Klemme on
On 04/03/2010 03:07 AM, Caleb Clausen wrote:
> On 4/2/10, Roger Pack <rogerpack2005(a)gmail.com> wrote:
>> Anybody know if there's an easy way to accomplish the equivalent of
>> this:
>>
>> [1..20, 30..46].each{|n|
>> # n = 1,2,3,4,5..30, 31, 32
>> }
>>
>> ?
>
> Oh, I wish for a real integer set class!

Well, there's Fixnum / Bignum:

irb(main):004:0> set = 0
=> 0
irb(main):005:0> set[1]
=> 0
irb(main):006:0> set |= (1 << 1)
=> 2
irb(main):007:0> set[1]
=> 1
irb(main):008:0> set |= (1 << 100)
=> 1267650600228229401496703205378
irb(main):009:0> set[1]
=> 1
irb(main):010:0> set[100]
=> 1

It's fairly easy to build something around that, e.g.

class IntSet
  include Enumerable

  def initialize
    @s = 0
  end

  def add? x
    @s[x] == 0 and @s |= (1 << x)
  end

  def add x
    add? x
    self
  end

  alias << add

  def delete? x
    @s[x] == 1 and @s ^= (1 << x)
  end

  def include? x
    @s[x] == 1
  end

  def each
    x = 0
    it = 1 << x

    while it <= @s
      yield x if @s[x] == 1
      x += 1
      it <<= 1
    end

    self
  end
end

is = IntSet.new

is << 10 << 5 << 14

is.each {|x| p x}


Kind regards

robert

--
remember.guy do |as, often| as.you_can - without end
http://blog.rubybestpractices.com/
From: Caleb Clausen on
On 4/4/10, Robert Klemme <shortcutter(a)googlemail.com> wrote:
> On 04/03/2010 03:07 AM, Caleb Clausen wrote:
>> Oh, I wish for a real integer set class!
>
> Well, there's Fixnum / Bignum:

Well, yeah, and that's fine for storing sets of small integers. Or I
could put my Integers into a Set, and that's fine for small sets of
integers. I have used both techniques in the past. What I have in mind
is more scalable: a way to store large sets of large integers.
Efficiently. So, what I would like to see is a data structure with
O(1) (or at most O(log(n))) insertion, deletion, and lookup times, as
well as (as close as possible to) O(1) memory cost per stored item.
Contiguous ranges of integers should be storable with not much more
memory cost than that of a single integer.

Bitmaps come close, but also cost O(1) bits per integer (< the largest
integer stored) which _isn't_ in the set. A hash-based set costs
something like O(2*wordsize) bits per stored integer. Perhaps this can
be reduced to O(2+wordsize) if using google's sparse array/sparse
hash. (Which Roger has been kind enough to add a ruby interface
for.... but not an integer set.)

Of course there's a contradiction here; if:
1) the integer set is fairly dense, and
2) members are randomly distributed throughout it,
then Kolmogorov complexity theory says you can't get any better
storage efficiency than a bitmap. So, this miraculous data structure
I'm imagining has to be allowed to discard one or the other of those
assumptions. Discarding 1) leads to something like google's sparse
array (sparse bitmap really; I'm not sure if google's sparse array
code could be used as is). Discarding 2) leads you in the direction of
a tree or trie. That's what I'm more interested in. Maybe something
based on Judy would suit my purposes....

I've wanted this several times, but never really needed it badly
enough to go and implement what I want to see. (As it almost surely
involves writing a fair bit of c code.)
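
Just to sketch where discarding 2) leads (a toy, nothing like Judy's performance): store the set as sorted, disjoint [lo, hi] pairs, so a contiguous run of integers costs about as much as a single entry.

```ruby
# Hypothetical sketch: a range-compressed integer set. Lookup here is
# O(n) and add is O(n log n); a real implementation would use binary
# search and in-place insertion to get the O(log n) bounds I want.
class RangeSet
  def initialize
    @ranges = []   # sorted, disjoint [lo, hi] pairs
  end

  def include?(x)
    @ranges.any? { |lo, hi| lo <= x && x <= hi }
  end

  def add(x)
    return self if include?(x)
    @ranges << [x, x]
    @ranges.sort!
    merge!
    self
  end
  alias << add

  private

  # coalesce adjacent/overlapping pairs so runs stay compact
  def merge!
    merged = []
    @ranges.each do |lo, hi|
      if merged.any? && lo <= merged.last[1] + 1
        merged.last[1] = hi if hi > merged.last[1]
      else
        merged << [lo, hi]
      end
    end
    @ranges = merged
  end
end

s = RangeSet.new
(1..1000).each { |i| s << i }   # collapses to a single [1, 1000] pair
s << 5000                       # a second, isolated pair
```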
