From: ChicagoPete on 14 Apr 2010 10:11

So my company just bought another company, and they sent me a flat text file with 1.5 million records of customer names/orders/transactions etc., all in one huge txt file. Just for fun, I imported this into Access 2007. You can view the single table, but you can't do anything with it: a simple parameter search on one customer name locks up the server, Access not responding, white screen, blah blah blah...

I tried to split the table during import with the wizard to create the 6 or so tables I would like to have, but it kept bombing out. When I tried a small sample I got prompts for null fields asking "leave as is?" a lot, so I guess that is where the full import is hanging up. I don't feel like cleaning 1.5M records today.

Do I set this up on a SQL back-end? If so, do I create the SQL db first and then import the text file into SQL, OR do I import the text file into Access and use the db splitter to create the SQL side? Is there a way in SQL to split into separate tables?

Here are the fields (tab delimited in the file; I used commas here for ease):

Acct Num, Cust Name, Address1, City, State, Zip, Service Call Date, Item, Qty, Amt, Pmt Date, Service Tech, Ins Name, Ins Paid, Balance

Basically I see about 20 rows per customer name, each with a different Pmt Date (they pay on the accounts). I'm looking for a ledger-type report to run based on customer name that will show all payment dates and the current balance due. Example:

48714  Smith, Mary  123 Main  Boston  MA  02134  02012010  937821a  1  2,500  02012010  Bob-11  2,000
48714  Smith, Mary  123 Main  Boston  MA  02134  02012010  937821a  1  500    03012010  Bob-11  1,500

The intent is to use this data as read only; no new transactions will be posted. We will just need to view previous balances for customers. Open to any and all ideas on this one...
From: Stefan Hoffmann on 14 Apr 2010 10:25

hi Pete,

On 14.04.2010 16:11, ChicagoPete wrote:
> ... I imported this into Access 2007 and you can view the single table,
> can't do anything with it, a simple parameter search on one customer name
> locks up the server, Access not responding, white screen blah, blah blah...
Create indices for the columns in your search condition.

> I tried to split the table during import with the wizard to create the 6 or
> so tables I would like to have - but it kept bombing out.
How big in GB is the database if you do a simple import into one table?
Maybe you're hitting the 2GB size limit.

> Do I set this up on a SQL backend? If so, do I create the SQL db first then
> import the text file into SQL OR do I import the text file into Access and
> use the db splitter to create the SQL side? Is there a way in SQL to split
> into separate tables?
While Access can handle huge amounts of data like this, I would use a SQL
Server back-end and use the BULK INSERT command to load the data at once.

Then you may split the table into a correct relational schema, or simply add
the necessary indices...

mfG
--> stefan <--
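[A T-SQL sketch of the load-then-index approach Stefan describes. The table name, column names, types, and file path are all assumptions based on the field list in the original post; adjust to your data.]

```sql
-- Staging table matching the flat file's 15 columns (names/types are assumptions).
CREATE TABLE ImportedLedger (
    AcctNum         varchar(20),
    CustName        varchar(100),
    Address1        varchar(100),
    City            varchar(50),
    State           char(2),
    Zip             varchar(10),
    ServiceCallDate varchar(8),   -- load dates as text first; convert later
    Item            varchar(20),
    Qty             int,
    Amt             money,
    PmtDate         varchar(8),
    ServiceTech     varchar(50),
    InsName         varchar(100),
    InsPaid         money,
    Balance         money
);

-- Load the whole tab-delimited file in one pass.
BULK INSERT ImportedLedger
FROM 'C:\data\ledger.txt'      -- hypothetical path
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

-- Index the column used in the search condition.
CREATE INDEX IX_ImportedLedger_CustName ON ImportedLedger (CustName);
```

With the index in place, a table linked into Access can filter on CustName without scanning all 1.5M rows.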
From: Stefan Hoffmann on 14 Apr 2010 10:29

hi,

On 14.04.2010 16:25, Stefan Hoffmann wrote:
> While Access can handle huge amounts of data like this, I would use a
> SQL Server back-end and use the BULK INSERT command to load the data at
> once.
http://msdn.microsoft.com/en-us/library/ms188365.aspx

mfG
--> stefan <--
From: ChicagoPete on 14 Apr 2010 11:08

Hi Stefan;

The raw file size is 500,000 KB. I will try to re-import and add indices.

How do I split one table into several (either in Access or SQL) after import?
I would like to stay totally in Access 2007; I'm more comfortable there.

Thanks for the quick reply.
-Pete

"Stefan Hoffmann" wrote:
> While Access can handle huge amounts of data like this, I would use a
> SQL Server back-end and use the BULK INSERT command to load the data at
> once.
>
> Then you may split the table into a correct relational schema or also
> simply add the necessary indices...
From: Stefan Hoffmann on 14 Apr 2010 11:30

hi Pete,

On 14.04.2010 17:08, ChicagoPete wrote:
> The raw file size is 500,000 KB.
This may result in size problems.

> How do I split one table into several (either in Access or SQL) after import?
You have to do it manually. Create the necessary tables for the entities you
can identify in your file, then create appropriate append queries.

> I would like to stay totally in Access 2007, more comfortable...
You can link the SQL Server tables into Access and do the data processing
(splitting the data) in Access.

Sorry for being so vague, but as I don't know the data, the explanation has
to stay abstract.

mfG
--> stefan <--
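[A sketch of the append-query split Stefan describes, assuming a staging table named ImportedLedger and target tables Customers and Payments — all names are hypothetical, based on the field list in the original post. The same statements can be run as Access append queries against linked tables or directly on SQL Server.]

```sql
-- One row per customer, keyed by account number.
INSERT INTO Customers (AcctNum, CustName, Address1, City, State, Zip)
SELECT DISTINCT AcctNum, CustName, Address1, City, State, Zip
FROM ImportedLedger;

-- One row per payment; these carry the dates and running balance.
INSERT INTO Payments (AcctNum, PmtDate, Amt, Balance)
SELECT AcctNum, PmtDate, Amt, Balance
FROM ImportedLedger;
```

The ledger-type report is then a query joining Customers to Payments on AcctNum, sorted by PmtDate, with the latest row's Balance giving the current balance due.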