From: SqlBeginner on
Thanks for the responses, Jeroen and Uri.

Uri, is it advisable to run your sample query on a table that holds about
250 million rows, when at the end of the purge I only need to keep roughly
100 million of them?
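Just so I am sure I understood your suggestion, this is roughly how I would
run it in a loop until everything old is gone. The database/table names come
straight from your snippet; the OrderDate column, the cutoff, and the batch
size are my guesses (and, as I understand it, using DELETE ... OUTPUT as an
INSERT source needs SQL Server 2008 or later):

declare @cutoff datetime
set @cutoff = dateadd(year, -5, getdate())   -- keep the last 5 years

while 1 = 1
begin
    insert into Archive..dbArchive
    select getdate(), d.*
    from (delete top (10000)
          from Dbname..tblname
          output deleted.*
          where OrderDate < @cutoff) d

    if @@rowcount = 0 break   -- nothing left to purge

    checkpoint   -- under SIMPLE recovery this lets the log space be reused
end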

As an alternative, how does the following sound? Please let me know if you
see any drawbacks in this approach (a rough sketch follows the list):

1. Move the data we want to keep into a temp table
2. Truncate the original table
3. Insert the data back from the temp table into the original table
4. Maybe change the recovery model to SIMPLE to reduce the log activity [I
have no clue what impact changing the recovery model would have on the Prod
box, apart from the backup/recovery hit]
5. Drop the indexes on the original table before inserting the data back
6. Load the data and re-create the indexes one by one
7. To reclaim the space, I am thinking DBCC DBREINDEX or DBCC CLEANTABLE
might help!
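Here is a minimal sketch of what I mean by steps 1-6. All object names
(SalesDb, dbo.Sales, OrderDate, ix_sales_orderdate) are placeholders I made
up, and I am assuming a permanent staging table rather than a #temp table so
the data survives a dropped connection -- please point out anything risky:

alter database SalesDb set recovery simple      -- step 4: cut log growth

select *                                        -- step 1: keep only the rows
into dbo.Sales_keep                             -- from the last n years
from dbo.Sales
where OrderDate >= dateadd(year, -5, getdate())

truncate table dbo.Sales                        -- step 2: minimally logged

drop index ix_sales_orderdate on dbo.Sales      -- step 5: drop before reload

insert into dbo.Sales with (tablock)            -- step 3: tablock helps get
select * from dbo.Sales_keep                    -- minimal logging

create index ix_sales_orderdate on dbo.Sales (OrderDate)   -- step 6

drop table dbo.Sales_keep

alter database SalesDb set recovery full        -- back to normal; take a full
                                                -- backup to restart log chain

(I left step 7 out, since I am still not sure whether DBCC DBREINDEX or
DBCC CLEANTABLE is the right way to reclaim the space.)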

Regards
Pradeep

"Uri Dimant" wrote:

> insert into Archive..dbArchive
> select getdate(),d.*
> from (delete top (10000)
> from Dbname..tblname
> output deleted.*) d
> go
>
> "SqlBeginner" <SqlBeginner(a)discussions.microsoft.com> wrote in message
> news:37A9F833-893B-4783-A436-10F284CE92A1(a)microsoft.com...
> > I have a table with sales data going back to the year 2000. The database
> > doesn't seem to have any archival mechanism as of now, and management now
> > feels they don't need data beyond the last 'n' years.
> >
> > Deleting all the data beyond 'n' years would be a huge process, and I
> > feel it would fill up the transaction log. What would be the best way to
> > approach this problem?
> >
> > Regards
> > Pradeep