From: John Spencer on 9 Aug 2010 09:08

The difference is user perception. If they see the hourglass, they expect a
delay, but a rather short one (1-2 minutes at most). With a form popping up,
you can give the user a somewhat better idea of the length of the delay.
Without that, I have had users decide the application was locked up and try
Ctrl+Alt+Delete to kill it.

Usually, I try to find a way to shorten the delay or, failing that, use a
progress bar form (if possible) to let the user know something is happening.
The amazing thing about a progress bar form is that the user's perception is
that the process is now faster. It is actually a tiny bit slower, due to the
overhead of updating the form.

John Spencer
Access MVP 2002-2005, 2007-2010
The Hilltop Institute
University of Maryland Baltimore County

David W. Fenton wrote:
> "Phil" <phil(a)stantonfamily.co.uk> wrote in
> news:QaSdnflS967b9MPRnZ2dnUVZ7sGdnZ2d(a)brightview.co.uk:
> S N I P
> I don't see how popping up a static form is any different than
> relying on the hourglass. It's one of those things that isn't going
> to change during the process, so it really doesn't indicate to the
> user if the app has locked up or not.
>
> I'd be reluctant to start shoving code to update the UI into the
> underlying code that drives the functions, as you're then slowing
> the report down even more, and there's no real way to choose
> anything sensible for those functions to be displaying.
>
> Another option would be to animate the popup form with a timer, but
> I'm not sure that's going to be any more useful, since it's possible
> for parts of Access to lock up while leaving the other parts running
> (though I'm not certain on that -- I'm just thinking in terms of how
> things run asynchronously).
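[Editor's note: a minimal sketch of the kind of progress form John describes. The form name, label control, and procedure name below are invented for illustration; none of them come from the thread, and the loop body stands in for whatever the real long-running work is.]

```vba
' Hedged sketch of updating a pop-up progress form from a long-running
' procedure. Assumes a form named "frmProgress" containing a label
' control "lblStatus" exists in the database -- illustrative names only.
Public Sub RunLongProcess()
    DoCmd.OpenForm "frmProgress"

    Dim i As Long
    For i = 1 To 100
        ' ... one step of the real work would go here ...

        ' Update the label and yield so Access can repaint the form.
        Forms("frmProgress").Controls("lblStatus").Caption = _
            "Processing step " & i & " of 100..."
        DoEvents
    Next i

    DoCmd.Close acForm, "frmProgress"
End Sub
```

The `DoEvents` call is what costs the "tiny bit" of overhead John mentions, but without it the form never repaints and looks exactly as locked up as the hourglass. Access also offers a built-in status-bar meter via `SysCmd(acSysCmdInitMeter, ...)` / `SysCmd(acSysCmdUpdateMeter, ...)`, which avoids maintaining a separate form.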
From: David W. Fenton on 9 Aug 2010 16:25

Salad <salad(a)oilandvinegar.com> wrote in
news:ab2dndKnW7U_2MLRnZ2dnUVZ_hKdnZ2d(a)earthlink.com:

> David W. Fenton wrote:
>> Salad <salad(a)oilandvinegar.com> wrote in
>> news:7d2dna-Jwqk1bsPRnZ2dnUVZ_gidnZ2d(a)earthlink.com:
>>
>>> Granted, I wasn't printing graphics, just text, but wow!...if I
>>> had a report that took a couple of minutes to display a page I'd
>>> have to take a look at that code and refine it, see what's
>>> causing the bottleneck.
>>
>> In the case of the report I mentioned, I know exactly what causes
>> the bottleneck -- it's that I'm denormalizing a list of
>> categories for display, i.e., taking the N:N categories for each
>> item and using a function to concatenate them into a
>> comma-separated list for each item displayed. If I remove that
>> from the printout, it pops up almost immediately.
>>
>> Also, it's important to note that the first page pops up after a
>> couple of minutes, and you can start sending to the printer then
>> (it doesn't reformat from scratch). The long wait was if you
>> clicked the last-page control, so I just trained the user (there's
>> only one for this particular report) not to preview the last
>> page.
>
> David, I know you've been around the block more than once. I
> still might want to see "what can I do" to improve. Let's say you
> have a form. You select a few categories out of many, maybe filter
> on customer or employee or whatever. Maybe run a process prior to
> calling the report to "create filter" or whatever and update the
> table then. Then have a separate button to run the report.
>
> Or add a memo field to store the categories in when updating data
> entry input. In my mind, the more instantaneous the data to the
> user the better, normalization be damned.

Well, I would never denormalize the storing of the actual data in
order to make a report run smoothly, but I have very frequently used
a temp table to drive a report.
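[Editor's note: the per-row concatenation function David describes is typically written along the lines of the sketch below. His actual code and schema aren't shown in the thread, so the table and field names (tblItemCategories, ItemID, CategoryName) are invented. Called once per item while the report formats, it opens a recordset each time, which is exactly why it dominates the rendering time.]

```vba
' Hedged sketch of a "concatenate the N:N categories" function.
' All object names are illustrative, not from David's application.
Public Function ConcatCategories(lngItemID As Long) As String
    Dim rs As DAO.Recordset
    Dim strResult As String

    ' One recordset per report row -- the source of the slowdown.
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT CategoryName FROM tblItemCategories " & _
        "WHERE ItemID = " & lngItemID & " ORDER BY CategoryName", _
        dbOpenSnapshot)

    Do Until rs.EOF
        If Len(strResult) > 0 Then strResult = strResult & ", "
        strResult = strResult & rs!CategoryName
        rs.MoveNext
    Loop
    rs.Close

    ConcatCategories = strResult
End Function
```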
But in this case, I'd be running the exact same function to populate
the temp table, so it wouldn't gain anything.

> If it's like a monthly report, eh, who cares. But if it is run by
> one or more people on a daily basis...big difference, IMO.

In this particular case, it's a report that turned out not even to
need to exist. I'm keeping it, just in case, but it doesn't matter
that it's slow.

> I don't know your situation. In my case, the report that I fixed
> to take seconds instead of 4 hours in FoxPro...that was 20 hours
> of wasted time per week, 1000 hours per year that I corrected. If
> I had to break a rule to save user time I'd do it in a heartbeat.
> In my case I didn't have to; I simply used the tools I had at
> hand.

That sounds like the kind of situation where I'd definitely do
whatever it took to make it faster. In my present situation, it's
more a matter of impatience on the part of the users. Also, there's
some kind of network issue that causes some kinds of data retrieval
to be extremely slow, and I've never quite been able to track it
down. So, fixing the report without addressing the network issues
would, I think, be putting resources in the wrong place
(particularly since I'm ultimately responsible for both).

> The other one in Access that took an hour+: the prior developer
> was slick with the subselects and didn't have enough indexes,
> IMO. I modified it so the subselects became their own queries,
> left joined on them, added some relevant indexes, and the report
> flew.

I don't mean to suggest that one ought not look for performance
improvements, only that sometimes they won't be possible without
actually changing the way the report behaves. In my case, writing
the data to a temp table would not speed up the display of the first
page, but it definitely *would* speed up previewing the last page.
But I don't consider that an improvement worth adding the fussiness
of a temp table (though I could definitely put only the problem
field in the temp table, now that I think about it -- I might
actually do that!).

-- 
David W. Fenton                  http://www.dfenton.com/
contact via website only         http://www.dfenton.com/DFA/
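[Editor's note: David's last idea (staging only the slow, concatenated field in a temp table before opening the report) might look roughly like the sketch below. The table, report, and function names are assumptions for illustration, as is the existence of a ConcatCategories() helper.]

```vba
' Hedged sketch: compute the expensive concatenated field once per item
' into a local temp table, then let the report's query join on it
' instead of calling the function per row during formatting.
' tmpCategories, tblItems, rptItems, and ConcatCategories are invented.
Public Sub BuildTempAndOpenReport()
    Dim db As DAO.Database
    Dim rsItems As DAO.Recordset
    Dim rsTmp As DAO.Recordset

    Set db = CurrentDb
    db.Execute "DELETE FROM tmpCategories", dbFailOnError

    Set rsItems = db.OpenRecordset( _
        "SELECT ItemID FROM tblItems", dbOpenSnapshot)
    Set rsTmp = db.OpenRecordset("tmpCategories", dbOpenDynaset)

    ' Each item's category list is computed exactly once, up front.
    Do Until rsItems.EOF
        rsTmp.AddNew
        rsTmp!ItemID = rsItems!ItemID
        rsTmp!CategoryList = ConcatCategories(rsItems!ItemID)
        rsTmp.Update
        rsItems.MoveNext
    Loop
    rsTmp.Close
    rsItems.Close

    ' The report's record source joins tblItems to tmpCategories on
    ' ItemID, so jumping to the last page no longer re-runs the function.
    DoCmd.OpenReport "rptItems", acViewPreview
End Sub
```

This moves the whole cost to before the preview opens, which is why, as David notes, it helps the last-page jump but not the time to first page.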
From: David W. Fenton on 9 Aug 2010 16:29

John Spencer <JSPENCER(a)Hilltop.umbc> wrote in
news:i3oujq$kq0$1(a)news.eternal-september.org:

> Difference is user perception.

Certainly, but the question is: is there anything reasonable that
can be done to manipulate their perceptions? I'm arguing that 9
times out of 10, it's more trouble than it's worth to try, given the
ultimately small benefit.

> If they see the hourglass they expect a delay,
> but a rather short one (1-2 minutes at most).
>
> With a form popping up, you can give the user a bit better idea of
> the length of the delay. Without that I have had users decide the
> application was locked up and decide to try Control+alt+Delete to
> kill the application.

Users are incredibly impatient, and I despair of doing anything to
fix that!

> Usually, I try to find a way to shorten the delay or failing that
> use a progress bar form (if possible) to let the user know
> something is happening. The amazing thing with a progress bar form
> is that the user's perception is that the process is now faster.
> It is actually a tiny bit slower due to the overhead of updating
> the form.

Progress bars are tricky. I use a bouncing one (i.e., the indicator
bounces back and forth from start to finish), where I don't have to
do any calculations. There's nothing worse than a progress bar that
tells lies, as so many of them do, so one that just shows something
is happening is as far as I ever go.

-- 
David W. Fenton                  http://www.dfenton.com/
contact via website only         http://www.dfenton.com/DFA/
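[Editor's note: a "bouncing" indicator of the kind David describes can be driven by the form's own Timer event, roughly as below. The control name "boxSlider" and the travel constants are invented for illustration; David's actual code isn't shown in the thread.]

```vba
' Hedged sketch of a bouncing (marquee-style) indicator on a pop-up
' form, driven by the form's Timer event. "boxSlider" is an assumed
' narrow box or label control; positions are in twips.
Private mlngDirection As Long

Private Sub Form_Load()
    mlngDirection = 1
    Me.TimerInterval = 100   ' fire Form_Timer roughly every 100 ms
End Sub

Private Sub Form_Timer()
    Const lngStep As Long = 120    ' distance moved per tick
    Const lngMax As Long = 4000    ' right-hand travel limit

    With Me.boxSlider
        .Left = .Left + (lngStep * mlngDirection)
        ' Reverse direction at either edge, so the bar just bounces
        ' back and forth -- no progress calculation, hence no lies.
        If .Left <= 0 Or .Left >= lngMax Then
            mlngDirection = -mlngDirection
        End If
    End With
End Sub
```

One caveat, consistent with the earlier discussion: the Timer event only fires while Access yields, so if the long-running code never reaches a `DoEvents`, the bouncing bar freezes too, which is precisely the lock-up ambiguity David raised about animated popups.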
From: John Spencer on 10 Aug 2010 08:11
Your choice. My choice. Different points of view, and both are
correct as far as I am concerned.

John Spencer
Access MVP 2002-2005, 2007-2010
The Hilltop Institute
University of Maryland Baltimore County