From: Salad on 15 Mar 2010 11:08
The Frog wrote:
> Hi Roger,
>
> I do have a button that starts the process, and I can place a 'hamster
> in a wheel' so to speak based on the form's timer event to show
> activity is taking place. I know how many records need to be brought
> back (separate count query), but using a SELECT INTO statement for
> the table, which is by far the quickest method, gives me no feedback on
> how far through it is during the process (there is only one table /
> view to capture) - hence using the recordset.
>
> I was just hoping that someone might know a trick with Access to show
> that kind of data without using a recordset. It looks like I might
> just have to stay with the existing method so far. It does work, just
> takes a long time...
>
> Thanks for the help, I appreciate your time and interest.
>
> Cheers
>
> The Frog

I'm not fully understanding the thread, but I'll toss some mud at the wall and maybe something sticks. If you had a count of 50K records, you could divide that by 10 to get percentages complete. Then select the TOP 5000 records, get the key value of the last record, and select the next TOP 5000 greater than that value. When the counter reaches 10, select everything greater than the last value, since the final batch might hold more than 5000 records. Then use SysCmd to show a progress meter.
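A rough VBA sketch of that chunked approach, driving the built-in Access status-bar meter (the table and key names `SourceTable`, `tblTarget` and `ID` are placeholders, not from the thread):

```vba
' Sketch of the chunked-copy idea: pull TOP N slices keyed on the last
' ID seen, updating the Access progress meter between chunks.
Sub CopyInChunks()
    Const CHUNK As Long = 5000
    Dim db As DAO.Database
    Dim total As Long, done As Long, lastID As Long

    Set db = CurrentDb
    total = db.OpenRecordset("SELECT Count(*) FROM SourceTable")(0)
    SysCmd acSysCmdInitMeter, "Copying records...", total

    Do
        db.Execute "INSERT INTO tblTarget SELECT TOP " & CHUNK & _
                   " * FROM SourceTable WHERE ID > " & lastID & _
                   " ORDER BY ID", dbFailOnError
        If db.RecordsAffected = 0 Then Exit Do
        done = done + db.RecordsAffected
        lastID = DMax("ID", "tblTarget")
        SysCmd acSysCmdUpdateMeter, done
    Loop
    SysCmd acSysCmdRemoveMeter
End Sub
```

This assumes a unique, ordered key column to chunk on; without one, TOP N slices are not stable between calls.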
From: Roger on 15 Mar 2010 15:24
On Mar 15, 2:09 am, The Frog <mr.frog.to....(a)googlemail.com> wrote:
> [quoted text snipped - same message as quoted above]

So you have a button that does one SELECT INTO statement to do all the work. What you can do is open another form to be the hamster, with a timer event, to display the current status, and then close the hamster form when the SELECT INTO is done.
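Roger's suggestion might be sketched like this (the form name `frmHamster` and query names are hypothetical; the form's own TimerInterval/Timer event would drive the animation):

```vba
' Sketch: show a small animated "hamster" form before the long-running
' SELECT INTO, then close it when the statement completes.
Private Sub cmdRun_Click()
    DoCmd.OpenForm "frmHamster"   ' its Timer event animates while we work
    DoEvents                      ' let the form paint before work starts
    CurrentDb.Execute "SELECT * INTO tblLocal FROM qryPassthrough", _
                      dbFailOnError
    DoCmd.Close acForm, "frmHamster"
End Sub
```

Note the limitation the thread is circling: because the SELECT INTO runs as one blocking statement, the hamster form can only show activity, not true percentage progress.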
From: The Frog on 16 Mar 2010 04:34
Thanks for the ideas guys, I appreciate it.

What I am currently doing to try and solve the issue is to open an ADO connection to the Oracle server, then follow that up with an ADO recordset using a client-side cursor, forward only, lock read only. I then create a Collection object, use the GetRows function to return an array of 1000 records at a time, and cycle through the entire recordset till EOF. Surprisingly this 'capture' takes only just over a minute to actually run - I have no idea why it is so much faster than running the query direct, or even using it in a make-table type operation, but the time difference is huge. I am guessing it is probably a quirk of our network, but I will use it for now.

So now I have the known number of records, as well as a 'capture' that collects the data in an orderly fashion I can show progress with. The next step is to pump the collection of arrays into either a new database (mdb) or an existing one, as the user selects. I do this with another ADO recordset and batch updates. Again I can now show progress to the user.

I am also using a chunk of code that takes the ADO recordset and feeds it to a routine that builds, via ADOX, a new table and, if needed, a new database. I am hoping that if I need to do a different capture process from this data warehouse in the future I can re-use the code. I am still not 100% happy with the ADOX table-building routine, but it is stable for now. I will need to spend some more time on it to make it robust enough for the general case rather than just my specific case. If anyone wants the code I am happy to post it. Bear in mind that it is not production level yet, but it does give you the functionality.

It is curious the speed with which the different operations actually run.
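The capture step described above might look roughly like this (the connection string and view name are placeholders; the progress-display call is whatever the form uses):

```vba
' Sketch of the capture: a client-side, forward-only, read-only ADO
' recordset pulled down 1000 rows at a time with GetRows into a
' Collection of arrays, with a progress update per chunk.
Sub CaptureInChunks()
    Dim cn As New ADODB.Connection
    Dim rs As New ADODB.Recordset
    Dim chunks As New Collection
    Dim done As Long

    cn.Open "Provider=OraOLEDB.Oracle;Data Source=...;"  ' placeholder
    rs.CursorLocation = adUseClient
    rs.Open "SELECT * FROM SomeView", cn, adOpenForwardOnly, adLockReadOnly

    Do Until rs.EOF
        chunks.Add rs.GetRows(1000)     ' 2-D Variant array: fields x rows
        done = done + UBound(chunks(chunks.Count), 2) + 1
        ' update the progress display here, e.g. a label on the form
    Loop

    rs.Close
    cn.Close
End Sub
```

GetRows advances the cursor itself, so the loop needs no MoveNext; the last chunk is simply smaller than 1000 and EOF becomes True after it.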
I would have expected that an Access pass-through query used as the basis of a SELECT INTO would have been far quicker than the ADO approach, especially since the ADO 'Source' uses exactly the same query as the pass-through. One of the infrastructure guys here suggested that it could be an issue between Access (2003) and the antivirus/firewall/packet filter the machine has installed (corporate standard issue). I am unable to prove this, but watching the network throughput of the two approaches I am getting a lot more bandwidth utilisation with ADO (and hence throughput) than with Access and a pass-through. Odd, but there it is. At least this way I have a reliable progress display, and can also squeeze in some 'cleanup' code if I need to change data types or similar.

Does anyone have any suggestions for optimizing this further? It is running pretty well so far, but I am open to suggestions :-) I would also love to hear suggestions as to why the approaches differ in speed. And if anyone has robust, production-level code for taking an ADO recordset and building a table from it, I would love to see that too.

As always, a big thank you for your help.

Cheers

The Frog
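The ADOX table-building routine mentioned above might be sketched along these lines (a simplified version, not the poster's code; it copies each field's ADO type and size straight across, which a production version would map more carefully):

```vba
' Sketch: build a Jet table whose columns mirror an open ADO
' recordset's Fields collection, using ADOX.
Sub BuildTableFromRecordset(rs As ADODB.Recordset, _
                            targetMdb As String, tableName As String)
    Dim cat As New ADOX.Catalog
    Dim tbl As New ADOX.Table
    Dim fld As ADODB.Field

    cat.ActiveConnection = "Provider=Microsoft.Jet.OLEDB.4.0;" & _
                           "Data Source=" & targetMdb
    tbl.Name = tableName
    For Each fld In rs.Fields
        tbl.Columns.Append fld.Name, fld.Type, fld.DefinedSize
    Next fld
    cat.Tables.Append tbl
End Sub
```

For the "new database" case the thread describes, `cat.Create` with the same connection string would create the mdb first; ADO data types that Jet lacks (e.g. some Oracle numerics) are the part that needs the careful mapping the poster says is still unfinished.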