From: The Frog on
Hi Bob,

I use a disconnected ADO recordset when doing bulk inserts. I create
a recordset based on the target table, then close the connection -
no 'actual' writing takes place at this point.

Next I start a transaction with the recordset, add the records I want,
then re-open the connection, and commit the transaction / update.
Doing this I am able to insert over 8 million records across a network
(10Base-T, would you believe) in just under ten minutes.
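
In rough outline the pattern looks something like the sketch below.
The table and field names (tblTarget, SomeText, SomeNumber) are just
placeholders, and the cursor and lock options shown are one common
choice for a disconnected batch update rather than the only way to do
it:

Dim cnn As ADODB.Connection
Dim rst As ADODB.Recordset
Dim i As Long

Set cnn = CurrentProject.Connection

Set rst = New ADODB.Recordset
rst.CursorLocation = adUseClient    ' client-side cursor so it can be disconnected
rst.Open "SELECT * FROM tblTarget WHERE 1=0", cnn, _
    adOpenStatic, adLockBatchOptimistic

Set rst.ActiveConnection = Nothing  ' disconnect - nothing is written yet

For i = 1 To 1000000                ' add the rows entirely in memory
    rst.AddNew
    rst!SomeText = "Value " & i
    rst!SomeNumber = i
    rst.Update
Next i

Set rst.ActiveConnection = cnn      ' reconnect
cnn.BeginTrans
rst.UpdateBatch                     ' write all pending rows in one batch
cnn.CommitTrans

rst.Close
Set rst = Nothing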

I firmly believe that a recordset is the way to go. Having it
disconnected also allows you to run further checks and manipulations
on the data you are inserting, if you want to, before actually writing
it to disk.
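
For example, while the recordset is still disconnected you could scan
the pending rows and tidy them up before reconnecting - continuing the
sketch above, and again with a made-up field name:

rst.MoveFirst
Do While Not rst.EOF
    If Len(rst!SomeText & "") > 50 Then
        ' trim over-long values before they are ever written
        rst!SomeText = Left(rst!SomeText, 50)
        rst.Update
    End If
    rst.MoveNext
Loop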

If you would like to post a little more of the code and SQL that you
use, I am sure there will be a way to speed up the processing you have
and hence reduce the time required. This is a pretty normal thing for
my work and I am happy to help. Please remember that the above is just
an example and we would need to know some more about your scenario to
help further.

Cheers

The Frog
From: Bob Barker on
Thank you all who responded thus far!

In the test environment I set up, I was able to insert 1,000,000
records in 3 minutes and 39 seconds using the Recordset method. I
have yet to move this into a production, or even a development,
version of the production database we use, but so far this is
promising! 10,000 records took approximately 1 second! I am very
excited.

Some issues I ran into. First, I was passing the CommandType enum
option adCmdTable into the Recordset Open method:
rst.Open "tblTest", CurrentProject.Connection, _
    adOpenForwardOnly, adLockOptimistic, adCmdTable
but this caused a 3001 runtime error when invoking the AddNew method.
After some more googling, a Microsoft support solution said to change
the CommandType enum to adCmdTableDirect, so my Open method became
rst.Open "tblTest", CurrentProject.Connection, _
    adOpenForwardOnly, adLockOptimistic, adCmdTableDirect
After some testing, it took over 8 minutes to insert 14,651 records,
which was an improvement but still not much of a time savings. I began
focusing on the CommandType enum and why I even needed to pass it,
since it is optional. I found out that the Open method queries the
database connection and determines which CommandType is optimal to
use. After omitting the option at the end, my results are as follows:

10,000 ROWS - 1 second average
1,000,000 ROWS - 3 minutes 39 seconds average
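
For reference, with the CommandType argument simply left off, the Open
call is just (same tblTest example as before):
rst.Open "tblTest", CurrentProject.Connection, _
    adOpenForwardOnly, adLockOptimistic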

Each row contained a String of variable length between 2 and 50
characters, one Integer, a Short Date, and a Double (i.e. a
floating-point value).

THANK YOU ALL! ONCE AGAIN!