From: Dan
Erland,

Thanks for the response. I have the db in simple recovery mode for this
operation, so logging shouldn't be an issue, right?

On a single-statement INSERT INTO, if I have 50 GB of data, does that mean
I need more than 50 GB of RAM in order to avoid heavy usage of disk in
tempdb?

Thanks,

Dan


"Erland Sommarskog" wrote:

> Dan (dantheriver(a)newsgroup.nospam) writes:
> > I need to move 100 Million rows of data from a table in one SQL 08 DB to
> > another database on the same server. What is the fastest way to do this?
> >
> > - INSERT INTO in one big query
> > - Multiple INSERT INTO statements looping through xxxx number of rows at a
> >   time to "batch" it up
> > - bcp
> > - BULK INSERT
> > - SSIS
> >
> > According to most documentation, bcp/BULK INSERT is the fastest way to
> > load lots of data into SQL Server, but is that true even if the data
> > source is another SQL Server db on the same server? Seems like that
> > would be slower, since you have to go through an intermediate text file.
> > It would be nice if there were a bcp pipe from one SQL db to another.
> >
> > What is the best practice for this scenario?
>
> Since you are on SQL 2008, the first option you list may be the best.
> But a few stars need to be aligned. I'd advise you to read the topic
> about INSERT in Books Online, and particularly the section "Using INSERT
> INTO…SELECT to Bulk Load Data with Minimal Logging". If memory serves,
> there is a trace flag you can set that widens the opportunities
> a bit.
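>
> Roughly, the pattern is this (table and column names are made up, and
> the sketch is untested, so verify the exact conditions in that Books
> Online topic):
>
> -- The trace flag I have in mind is, if I recall correctly, 610;
> -- it permits minimal logging also for inserts into indexed tables.
> DBCC TRACEON (610);
>
> -- For a heap target, the TABLOCK hint is required for minimal
> -- logging, and the database must be in the simple or bulk-logged
> -- recovery model.
> INSERT INTO dbo.TargetTable WITH (TABLOCK)
>        (id, col1, col2)
> SELECT id, col1, col2
> FROM   SourceDB.dbo.SourceTable;
>
> DBCC TRACEOFF (610);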
>
> If you must perform the load under full transaction logging, I
> think the second method is the best. Depending on the row size,
> loading about 500,000-1,000,000 rows at a time is likely to be
> a good choice. Just make sure that you define the batches from the
> clustered index of the source table, or else locating each batch
> may require a full scan of the source.
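>
> A skeleton for the batching, with made-up names and assuming that the
> clustered index of the source is on an integer id column, could look
> like so (untested):
>
> DECLARE @lo int = 0,
>         @hi int,
>         @batchsize int = 500000;
>
> SELECT @hi = MAX(id) FROM SourceDB.dbo.SourceTable;
>
> WHILE @lo < @hi
> BEGIN
>    -- Each iteration commits by itself, so the log can be
>    -- truncated between the batches.
>    INSERT INTO dbo.TargetTable (id, col1, col2)
>    SELECT id, col1, col2
>    FROM   SourceDB.dbo.SourceTable
>    WHERE  id > @lo AND id <= @lo + @batchsize;
>
>    SET @lo = @lo + @batchsize;
> END;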
>
>
> --
> Erland Sommarskog, SQL Server MVP, esquel(a)sommarskog.se
>
> Links for SQL Server Books Online:
> SQL 2008: http://msdn.microsoft.com/en-us/sqlserver/cc514207.aspx
> SQL 2005: http://msdn.microsoft.com/en-us/sqlserver/bb895970.aspx
> SQL 2000: http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx
>
>
From: Erland Sommarskog
Dan (dantheriver(a)newsgroup.nospam) writes:
> Thanks for the response. I have the db in simple recovery mode for this
> operation, so logging shouldn't be an issue, right?
>
> On a single-statement INSERT INTO, if I have 50 GB of data, does that
> mean I need more than 50 GB of RAM in order to avoid heavy usage of
> disk in tempdb?

I don't think so, but you will need an awful lot of log space. Even if
you are in simple recovery, keep in mind that the log can never be
truncated past the oldest active transaction.

That's one of the major reasons to insert the data in batches: you
keep the log file in check.
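
In simple recovery, once a batch has committed, the inactive part of
the log can be reused at the next checkpoint, so you can force one
between the batches. A small untested addition inside the loop I
sketched earlier:

   -- Run after each batch has committed; in simple recovery this
   -- permits the inactive part of the log to be reused.
   CHECKPOINT;

   -- And this shows how much of the log is actually in use:
   DBCC SQLPERF (LOGSPACE);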



--
Erland Sommarskog, SQL Server MVP, esquel(a)sommarskog.se

Links for SQL Server Books Online:
SQL 2008: http://msdn.microsoft.com/en-us/sqlserver/cc514207.aspx
SQL 2005: http://msdn.microsoft.com/en-us/sqlserver/bb895970.aspx
SQL 2000: http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx