From: Willie Moore on
Geoff,

My guess would be that the CSV location is not valid. I believe that was the
issue I had a few years back.

Regards,
Willie

"Geoff Schaller" <geoffx(a)softwarexobjectives.com.au> wrote in message
news:4c595c32$0$34568$c30e37c6(a)exi-reader.telstra.net...
> Bulk insert works fine for almost all people using it that I know (and
> yes, there are quite a few). Send me your DBF and let me see if I can
> upload it.
>
> Make sure you select to load data before keys, or key violations may
> prevent the upload.
>
> Make sure you DO NOT PRE-CREATE the table. Let SQL Master create the table
> or I cannot guarantee the columns will be in the right order.
>
> Make sure you read the log file I create. It usually tells you the error.
>
> Make sure you DO NOT change the default separators. They are designed
> with VO'ers in mind who did crazy things with data in memos.
>
> Geoff
>
>
>
> "Rene J. Pajaron" <rjpajaron(a)gmail.com> wrote in message
> news:33d83868-9f2c-43ba-9c76-5e1c54d95c50(a)h17g2000pri.googlegroups.com:
>
>> Hi Geoff,
>>
>> I do not use BULK INSERT because it does not insert all rows from the
>> table. I left it unchecked, and it works, but very slowly, and I have 10
>> tables here that average 1.5M rows.
>>
>> What does Bulk Insert do?
>>
>> Background info:
>> MS SQL Server 2008
>> Tables are already created, using SQLMaster. I did not bother to drop
>> the table, because I can right-click the table and select "Import
>> Data from DBF/XML -> Import Data from DBF".
>>
>> Using Bulk Inserts:
>>
>> The options I checked here are the following:
>> CSV file path: ...valid path....
>> Server CSV Path: same as above, as I am working on the same PC.
>>
>> Options:
>> 1. Create Table is disabled (because the table already exists).
>> 2. Create Index is checked
>> 3. Upload Data is checked
>> 4. Use Bulk Insert is checked
>> 5. Stop on Error is not checked
>> 6. Fix Illegal Names is not checked
>> 7. Add RECID is checked by default
>> 8. Add Del column is not checked
>> 9. Add RLOCK col is not checked
>>
>> Memo Field as : TEXT
>>
>> Upload block Size (rows) 100
>>
>> DBFCDX
>>
>> Then, GET CSV DELIMITERS
>> String Delimiter is not available
>> Row Delimiter: ||CRLF
>> Field Delimiter: >|<
>>
>> I experimented with changing the above two values, but the Bulk Insert does
>> not happen. Or do I have to use other tools to import the data?
>>
>> Thanks for your help Geoff,
>>
>> Rene
>>
>>
>> On Aug 4, 6:13 am, "Geoff Schaller"
>> <geo...(a)softwarexobjectives.com.au> wrote:
>>
>>
>> > Rene.
>> >
>> > 2 million rows using bulk insert will take about 40 seconds on my PC.
>> > Most of that is creating the CSV from the DBF.
>> >
>> > I presume you used bulk insert?
>> >
>> > Geoff
>> >
>> > "Rene J. Pajaron" <rjpaja...(a)gmail.com> wrote in
>> > message news:ca00d138-c7c1-4b8a-bf3e-e4887710102a(a)q21g2000prm.googlegroups.com:
>> >
>>
>> > > Hello,
>> >
>> > > I have a DBF with more than 2 million rows in it, and it seems SQLMaster
>> > > takes ages to convert it to SQL Server 2008.
>> >
>> > > Anyone have a faster idea?
>> >
>> > > Actually, we are planning to write our own conversion, but I want to
>> > > see how fast SQLMaster would be.
>> > > Maybe my PC is too slow: Win7, 8GB, quad core; seems fast to me.
>> >
>> > > Rene
>> >
>>
>> >
>
>
> __________ Information from ESET NOD32 Antivirus, version of virus
> signature database 5341 (20100804) __________
>
> The message was checked by ESET NOD32 Antivirus.
>
> http://www.eset.com
>
>
>

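Geoff's checklist above corresponds, roughly, to handing SQL Server a T-SQL BULK INSERT statement pointed at the generated CSV. A minimal Python sketch of what that statement would look like with SQLMaster's default separators (`>|<` field, `||` + CRLF row, as in Rene's settings); the table name and file path are hypothetical examples, not values from the thread:

```python
# Sketch: the T-SQL that a bulk upload with the default separators
# would roughly correspond to. BULK INSERT reads the file server-side,
# which is why the "Server CSV Path" must be valid from SQL Server's
# point of view, not just from the client PC's.
def build_bulk_insert(table, csv_path,
                      field_sep=">|<", row_sep=r"||\r\n"):
    # TABLOCK lets SQL Server take a bulk-update lock, which is what
    # makes minimally logged, fast loads possible.
    return (
        f"BULK INSERT {table} "
        f"FROM '{csv_path}' "
        f"WITH (FIELDTERMINATOR = '{field_sep}', "
        f"ROWTERMINATOR = '{row_sep}', TABLOCK)"
    )

sql = build_bulk_insert("dbo.Customers", r"C:\temp\customers.csv")
print(sql)
```

If the statement fails, the error SQL Server raises is typically what ends up in the log file Geoff mentions.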
From: Geoff Schaller on
Possibly, too, although if he is using his local PC (which I believe he
is), then this shouldn't be a problem. However, he hasn't contacted me
directly, which everybody else does <g>, so I don't have any other
information.



"Willie Moore" <williem(a)wmconsulting.com> wrote in message
news:i3c8of$qbl$1(a)speranza.aioe.org:

> Geoff,
>
> My guess would be that the csv location is not valid. I believe that was the
> issue I had a few years back.
>
> Regards,
> Willie

From: Rene J. Pajaron on
Hi Geoff,

I am following everything you said.

Bulk Insert is still dead in its tracks. But when I uncheck it, it
will do the insert, but very slowly; hence this is my problem.

You said Bulk Insert is faster, so I keep trying it, but I still do not
see data being inserted after that.

Memo is not a problem; the tables I use do not have MEMO fields.

No index used, just to isolate the problem.

Anyway, I will continue my research this weekend.

Thanks,

Rene

On Aug 4, 8:25 pm, "Geoff Schaller"
<geo...(a)softwarexobjectives.com.au> wrote:
> Bulk insert works fine for almost all people using it that I know (and
> yes, there are quite a few). Send me your DBF and let me see if I can
> upload it.
>
> Make sure you select to load data before keys or key violations may
> prevent the upload.
>
> Make sure you DO NOT PRE-CREATE the table. Let SQL Master create the
> table or I cannot guarantee you got the columns in the right order.
>
> Make sure you read the log file I create. It usually tells you the
> error.
>
> Make sure you  DO NOT change the default separators. They are designed
> with VO'ers in mind that did crazy things with data in memos.
>
> Geoff
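The slow non-bulk path Rene falls back to issues its inserts in small blocks, and with his setting ("Upload block Size (rows)" = 100) the arithmetic alone explains the crawl. A quick sketch; the helper is a hypothetical illustration, and only the row count and block sizes come from the thread:

```python
import math

# Each block is one round trip to SQL Server. At block size 100,
# a 1.5M-row table needs 15,000 round trips; BULK INSERT avoids
# them entirely by streaming the whole file server-side.
def batches(total_rows, block_size):
    # Number of INSERT round trips needed at a given block size.
    return math.ceil(total_rows / block_size)

rows = 1_500_000
print(batches(rows, 100))     # → 15000 (Rene's block size)
print(batches(rows, 10_000))  # → 150 (a larger block cuts trips 100x)
```

This is only about round-trip count; per-row logging and index maintenance add further overhead on the slow path.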

From: Rene J. Pajaron on
Hi Willie,

Not a problem, because I did check this by creating the CSV only, and I
saw the CSV files were created in the temp path I provided.

Regards,

Rene
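Seeing that the CSV files exist is a good first step; checking that their separators survive a round trip is the next. A short Python sketch using the thread's default separators (`>|<` field, `||` + CRLF row); the sample rows are hypothetical:

```python
import io

FIELD_SEP = ">|<"
ROW_SEP = "||\r\n"

# The unusual separators are chosen so that commas, quotes, and line
# breaks inside memo/text fields cannot collide with the delimiters.
rows = [["1", "Smith", "note with, comma"],
        ["2", "Jones", 'embedded "quotes"']]

# Write the rows the way the exporter would.
buf = io.StringIO()
for r in rows:
    buf.write(FIELD_SEP.join(r) + ROW_SEP)

# Read them back and confirm nothing split on the embedded comma.
parsed = [line.split(FIELD_SEP)
          for line in buf.getvalue().split(ROW_SEP) if line]
print(parsed == rows)  # → True
```

If a round trip like this fails on real data, the CSV itself, rather than the server-side path, is the thing to investigate.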


On Aug 5, 1:41 am, "Willie Moore" <will...(a)wmconsulting.com> wrote:
> Geoff,
>
> My guess would be that the csv location is not valid. I believe that was the
> issue I had a few years back.
>
> Regards,
> Willie

From: Rene J. Pajaron on
Hi Geoff,

Do you mind if I contact you directly?

If you don't mind, I will email you this weekend after I resume my
research.

Lots in the pipeline, lots to learn...

Thanks,

Rene

On Aug 5, 5:36 am, "Geoff Schaller"
<geo...(a)softwarexobjectives.com.au> wrote:
> Possibly too, although if he is using his local PC (which I believe he
> is) then this shouldn't be a problem. However he hasn't contacted me
> directly, which everybody else does <g>, so I don't have any other
> information.
