From: Chip Eastham on
On Jul 11, 10:55 am, Tom St Denis <t...(a)iahu.ca> wrote:
> On Jul 9, 5:00 pm, Pubkeybreaker <pubkeybrea...(a)aol.com> wrote:
>
> > On Jul 9, 2:42 pm, Tom St Denis <t...(a)iahu.ca> wrote:
>
> > > On Jul 3, 10:37 am, Pubkeybreaker <pubkeybrea...(a)aol.com> wrote:
>
> > > > The NFS@Home project is moving towards factoring larger numbers.
>
> > > > This is a call for participants. Visit:
>
> > > >http://escatter11.fullerton.edu/nfs/
>
> > > No offense, but stop. This is really pointless busywork at this
> > > point.
>
> > The Cunningham project is the longest on-going computation project in
> > history. It would be nice to finish it.
>
> Are they looking to develop new algorithms or merely burn more
> cycles? My complaint is mostly along the lines of it's a poor use of
> electricity [which in turn creates more demand for generation and
> pollution].
>
> If I run a for loop from 1 to 2^50 I will have counted to 2^50, but
> what will I have learned?
>
> Tom

I believe more than one algorithm is being
applied to the Cunningham project, but that
NFS@Home is using a particular number field
sieve approach. In addition to the actual
factorizations of b^n +/- 1 for bases
b = 2, 3, 5, 6, 7, 10, 11, 12 and exponents
n whose ranges depend on the bases, which
give empirical evidence to compare with a
theoretical estimate of the number of prime
factors, etc., the project can easily serve
as a test bed for new algorithms.
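
As a toy illustration of what those tables
contain (not the project's actual code; the
tiny exponent below is made up purely for
illustration), here is a rough Python/sympy
sketch that factors a few small b^n +/- 1
values:

  from sympy import factorint

  # Cunningham bases; the project tracks b^n - 1 and b^n + 1 for
  # exponent ranges that depend on the base.  n = 16 is chosen only
  # for illustration -- the real targets are hundreds of digits long.
  for b in (2, 3, 5, 6, 7, 10, 11, 12):
      n = 16
      print(f"{b}^{n} - 1 = {factorint(b**n - 1)}")
      print(f"{b}^{n} + 1 = {factorint(b**n + 1)}")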

I'm pretty sure the incremental energy
consumption (of using my computer's spare
cycles for this project) is nominal. If it
should happen that my computer melts down
during the life of the project, I'll be
sure to let you know so you can say you
told me so.

regards, chip
From: Tom St Denis on
On Jul 11, 12:39 pm, Kristian Gjøsteen <kristiag+n...(a)math.ntnu.no>
wrote:
> Tom St Denis  <t...(a)iahu.ca> wrote:
>
> >On Jul 9, 5:00 pm, Pubkeybreaker <pubkeybrea...(a)aol.com> wrote:
> >> The Cunningham project is the longest on-going computation project in
> >> history.  It would be nice to finish it.
>
> >Are they looking to develop new algorithms or merely burn more
> >cycles?  My complaint is mostly along the lines of it's a poor use of
> >electricity [which in turn creates more demand for generation and
> >pollution].
>
> >If I run a for loop from 1 to 2^50 I will have counted to 2^50, but
> >what will I have learned?
>
> For this particular loop, the factorizations of a bunch of interesting
> numbers.  Type "the cunningham project" into a search engine.

I did. And? What can we do with the factorization of ridiculously
large numbers of the form b^n-1 for small b? Same goes for the
Mersenne prime search. Do I care whether
2^8472384723847238472384723847328472384723847238472384732847238427348237-1
is prime? Honestly, why? I'm not going to do anything useful over
a field of that size, etc. The project wasn't a waste; they developed
some cool algorithms, but once the research stopped, the usefulness of
the project ceased.
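
To be concrete about what the Mersenne search actually computes: each
candidate 2^p - 1 is checked with the Lucas-Lehmer test, which is cheap
per candidate. A rough Python sketch (valid for odd prime exponents p
only; the exponents below are just the small known cases):

  def lucas_lehmer(p):
      # Lucas-Lehmer: for odd prime p, M_p = 2^p - 1 is prime iff
      # s_{p-2} == 0, where s_0 = 4 and s_{k+1} = s_k^2 - 2 (mod M_p).
      m = (1 << p) - 1
      s = 4
      for _ in range(p - 2):
          s = (s * s - 2) % m
      return s == 0

  # Expect 3, 5, 7, 13, 17, 19, 31 to pass; 11, 23, 29 to fail.
  print([p for p in (3, 5, 7, 11, 13, 17, 19, 23, 29, 31) if lucas_lehmer(p)])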

It's like the OGR (optimal Golomb ruler) project of a few years ago. Is
there any real purpose in finding huge OGRs, other than to say you did?
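
(For anyone unfamiliar: a Golomb ruler is a set of integer marks whose
pairwise differences are all distinct, and an OGR is the shortest such
ruler for a given number of marks. Verifying a ruler is trivial, as in
this rough Python sketch; it's the exhaustive search for shorter rulers
that burns the cycles.)

  from itertools import combinations

  def is_golomb_ruler(marks):
      # Every pairwise difference between marks must be distinct.
      diffs = [b - a for a, b in combinations(sorted(marks), 2)]
      return len(diffs) == len(set(diffs))

  print(is_golomb_ruler([0, 1, 4, 9, 11]))  # True: a 5-mark Golomb ruler
  print(is_golomb_ruler([0, 1, 2, 4]))      # False: differences 1 and 2 repeat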

Tom
From: Tom St Denis on
On Jul 11, 2:59 pm, Chip Eastham <hardm...(a)gmail.com> wrote:
> On Jul 11, 10:55 am, Tom St Denis <t...(a)iahu.ca> wrote:
>
>
>
>
>
> > On Jul 9, 5:00 pm, Pubkeybreaker <pubkeybrea...(a)aol.com> wrote:
>
> > > On Jul 9, 2:42 pm, Tom St Denis <t...(a)iahu.ca> wrote:
>
> > > > On Jul 3, 10:37 am, Pubkeybreaker <pubkeybrea...(a)aol.com> wrote:
>
> > > > > The NFS@Home project is moving towards factoring larger numbers.
>
> > > > > This is a call for participants.  Visit:
>
> > > > >http://escatter11.fullerton.edu/nfs/
>
> > > > No offense, but stop.  This is really pointless busywork at this
> > > > point.
>
> > > The Cunningham project is the longest on-going computation project in
> > > history.  It would be nice to finish it.
>
> > Are they looking to develop new algorithms or merely burn more
> > cycles?  My complaint is mostly along the lines of it's a poor use of
> > electricity [which in turn creates more demand for generation and
> > pollution].
>
> > If I run a for loop from 1 to 2^50 I will have counted to 2^50, but
> > what will I have learned?
>
> > Tom
>
> I believe more than one algorithm is being
> applied to the Cunningham project, but that
> NFS@Home is using a particular number field
> sieve approach.  In addition to the actual
> factorizations of b^n +/- 1 for bases
> b = 2, 3, 5, 6, 7, 10, 11, 12 and exponents
> n whose ranges depend on the bases, which
> give empirical evidence to compare with a
> theoretical estimate of the number of prime
> factors, etc., the project can easily serve
> as a test bed for new algorithms.

If they're actually using new algorithms, then it's cool. Otherwise
it's just a waste.

I should point out that this sort of empirical observation doesn't mean
much for GNFS [the variant used against RSA moduli].

> I'm pretty sure the incremental energy
> consumption (of using my computer's spare
> cycles for this project) is nominal.  If it
> should happen that my computer melts down
> during the life of the project, I'll be
> sure to let you know so you can say you
> told me so.

I'm more worried about the pollution caused by the increased demand on
power generation.

Tom
From: jgchilders on
On Jul 11, 7:55 am, Tom St Denis <t...(a)iahu.ca> wrote:

> Are they looking to develop new algorithms or merely burn more
> cycles?  My complaint is mostly along the lines of it's a poor use of
> electricity [which in turn creates more demand for generation and
> pollution].

I apologize for reviving an old thread, but... :-)

Develop new algorithms? No. But improve existing algorithms, yes.
The most recent improvement is a fast and efficient parallel
implementation of the block Lanczos algorithm using the MPI library.
Matrices that previously took over a month to solve with the fastest
implementation, msieve, can now be solved in a few days. This
improvement has significantly increased the size of the numbers that
we can realistically handle. Although the kilobit SNFS and 768-bit
GNFS milestones have been crossed, the implementations used are not in
the public domain. NFS@Home is now working toward the kilobit SNFS
milestone using software that is in the public domain and thus is
available for anyone with sufficient computing power to use.
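
For flavor, the inner kernel of block Lanczos is a sparse matrix times
a block of vectors over GF(2); packing 64 vectors into each machine
word lets one XOR per nonzero entry update the whole block at once. A
rough, simplified Python sketch with made-up toy data (not the actual
msieve/MPI code):

  import random

  def block_matvec_gf2(rows, x):
      # rows[i] lists the column indices where row i of the matrix has a 1.
      # x[j] packs 64 vectors: bit k of x[j] is vector k's entry j.
      y = []
      for cols in rows:
          acc = 0
          for j in cols:
              acc ^= x[j]   # one word-wide XOR updates all 64 vectors
          y.append(acc)
      return y

  # Toy data: 8x8 sparse matrix with 3 nonzeros per row, 64 random vectors.
  random.seed(1)
  rows = [sorted(random.sample(range(8), 3)) for _ in range(8)]
  x = [random.getrandbits(64) for _ in range(8)]
  print(block_matvec_gf2(rows, x))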

If you do decide to join NFS@Home and have a computer with at least
1.25 GB of memory per core, I also request that you try disabling the
lasievee application in your NFS@Home preferences. The lasievef
application is being used for the larger, more interesting
factorizations.

Greg
NFS@Home Administrator