From: Rene J. Pajaron on
Hi Geoff,

I do not use BULK INSERT because it does not insert all the rows from the
table. I left it unchecked, and it works, but very slowly, and I have 10
tables here that average 1.5M rows each.

What does Bulk Insert do?

Background info:
MS SQL Server 2008
The tables are already created, using SQLMaster. I did not bother to drop
the table, because I can right-click the table and select "Import
Data from DBF/XML->Import Data from DBF".

Using Bulk Inserts:

The options I checked here are the following:
CSV file path: ...valid path....
Server CSV Path: same as above, as I am working on the same PC.

Options:
1. Create Table is disabled (because the table already exists).
2. Create Index is checked
3. Upload Data is checked
4. Use Bulk Insert is checked
5. Stop on Error is not checked
6. Fix Illegal Names is not checked
7. Add RECID is checked by default
8. Add Del column is not checked
9. Add RLOCK col is not checked

Memo Field as: TEXT

Upload Block Size (rows): 100

DBFCDX

Then, GET CSV DELIMITERS
String Delimiter is not available
Row Delimiter: ||CRLF
Field Delimiter: >|<

I experimented with changing the above two values, but the Bulk Insert
still does not happen. Or do I have to use other tools to import the data?
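
For reference, my (possibly wrong) understanding is that a hand-written
BULK INSERT using those separators would look roughly like the statement
below. The table name and file path are just placeholders, and I do not
know whether this is exactly what SQLMaster generates:

    BULK INSERT dbo.MyTable
    FROM 'C:\export\mytable.csv'
    WITH (
        FIELDTERMINATOR = '>|<',     -- the field delimiter shown above
        ROWTERMINATOR   = '||\r\n',  -- the '||' plus CRLF row delimiter
        TABLOCK                      -- table lock for a faster load
    );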

Thanks for your help, Geoff,

Rene


On Aug 4, 6:13 am, "Geoff Schaller"
<geo...(a)softwarexobjectives.com.au> wrote:

> Rene.
>
> 2 million rows using bulk insert will take about 40 seconds on my PC.
> Most of that is creating the CSV from the DBF.
>
> I presume you used bulk insert?
>
> Geoff
>
> "Rene J. Pajaron" <rjpaja...(a)gmail.com> wrote in messagenews:ca00d138-c7c1-4b8a-bf3e-e4887710102a(a)q21g2000prm.googlegroups.com:
>
> > Hello,
>
> > I have DBF with more than 2 million rows in it, it seems SQLMaster
> > took ages to convert to SQL Server 2008.
>
> > Anyone, have faster idea?
>
> > Actually, we are planning to write our own conversion, but I want to
> > see how fast would SQLMaster.
> > Maybe my PC is slow enough: Win7, 8GB, Quad core, seems fast to me.
>
> > Rene
>
>

From: Stephen Quinn on
Rene

> 2. Create Index is checked
I'd turn this off and create the indices after the data has been loaded.
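
Something along these lines once the upload has finished (the table and
column names here are only examples):

    -- build the index only after the data is in
    CREATE INDEX IX_Customer_CustNo ON dbo.Customer (CustNo);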

> What does Bulk Insert do?
Exactly what it says
- it'll insert 'x' number of records each time (where 'x' can be 1000 or
more)
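
I believe the equivalent knob on a hand-written BULK INSERT is BATCHSIZE,
e.g. (table and file names made up):

    BULK INSERT dbo.Customer
    FROM 'C:\export\customer.csv'
    WITH (BATCHSIZE = 1000, TABLOCK);  -- commit every 1000 rows as one batch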

Geoff can give more/better info than I can though.

CYA
Steve


From: Rene J. Pajaron on
Hi Steve,

Bulk insert - my understanding is that it does the insert after the CSV is
created.

Anyway, when this option is checked, nothing happens - except that it
creates a new table when one does not yet exist.

Rene

On Aug 4, 11:27 am, "Stephen Quinn" <stevej...(a)bigpondSPAM.net.au>
wrote:
> Rene
>
> > 2. Create Index is checked
>
> I'd turn this off and create the indices after the data has been loaded.
>
> > What does Bulk Insert do?
>
> Exactly what it says
>     - it'll insert 'x' number of records each time (where 'x' can be 1000 or
> more)
>
> Geoff can give a more/better info than I can though.
>
> CYA
> Steve

From: Geoff Schaller on
Bulk insert works fine for almost everyone I know who uses it (and yes,
there are quite a few). Send me your DBF and let me see if I can
upload it.

Make sure you select to load the data before the keys, or key violations
may prevent the upload.

Make sure you DO NOT PRE-CREATE the table. Let SQL Master create the
table, or I cannot guarantee that the columns are in the right order.

Make sure you read the log file I create. It usually tells you the
error.

Make sure you DO NOT change the default separators. They are designed
with VO'ers in mind who did crazy things with data in memos.

Geoff



"Rene J. Pajaron" <rjpajaron(a)gmail.com> wrote in message
news:33d83868-9f2c-43ba-9c76-5e1c54d95c50(a)h17g2000pri.googlegroups.com:

> Hi Goeff,
>
> I do not use BULK INSERT because it does not insert all rows from the
> table. I left it uncheck, and it works but very slow, and I have 10
> tables here that averages 1.5M rows.
>
> What does Bulk Insert do?
>
> Background info:
> MS SQL Server 2008
> Tables are already created, using SQLMaster. I did not bother to drop
> the table, because I can right clicked at table, and select "Import
> Data from DBF/XML->Import Data from DBF".
>
> Using Bulk Inserts:
>
> Option I checked here are the following:
> CSV file path: ...valid path....
> Server CSV Path: same as above, as I am working on the same PC.
>
> Options:
> 1. Create Table is disabled (because table is already existing).
> 2. Create Index is checked
> 3. Upload Data is checked
> 4. Use Bulk Insert is checked
> 5. Stop on Error is not checked
> 6. Fix Illegal Names is not checked
> 7. Add RECID is check by default
> 8. Add Del column is not checked
> 9. Add RLOCK col is not checked
>
> Memo Field as : TEXT
>
> Upload block Size (rows) 100
>
> DBFCDX
>
> Then, GET CSV DELIMITERS
> String Delimiter is not available
> Row Delimiter: ||CRLF
> Field Delimiter: >|<
>
> I experimented on changing above two values; but Bulk Inserts do not
> happen; or do I have to use other tools to import data?
>
> Thanks for your help Geoff,
>
> Rene
>
>
> On Aug 4, 6:13 am, "Geoff Schaller"
> <geo...(a)softwarexobjectives.com.au> wrote:
>
>
> > Rene.
> >
> > 2 million rows using bulk insert will take about 40 seconds on my PC.
> > Most of that is creating the CSV from the DBF.
> >
> > I presume you used bulk insert?
> >
> > Geoff
> >
> > "Rene J. Pajaron" <rjpaja...(a)gmail.com> wrote in messagenews:ca00d138-c7c1-4b8a-bf3e-e4887710102a(a)q21g2000prm.googlegroups.com:
> >
>
> > > Hello,
> >
> > > I have DBF with more than 2 million rows in it, it seems SQLMaster
> > > took ages to convert to SQL Server 2008.
> >
> > > Anyone, have faster idea?
> >
> > > Actually, we are planning to write our own conversion, but I want to
> > > see how fast would SQLMaster.
> > > Maybe my PC is slow enough: Win7, 8GB, Quad core, seems fast to me.
> >
> > > Rene
> >
>
> >
