From: David Kaye on 21 Jul 2010 15:15
BeeJ <nospam(a)live.com> wrote:
>I am only looking for the best that can be done on a non-real-time OS.
When you get down to milliseconds you're always going to have timing problems on a general-purpose OS; it's the nature of the beast, and how bad it gets depends on the number and kind of interrupts being serviced.
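A quick way to see this jitter for yourself is to request a fixed sleep repeatedly and measure how long each call actually takes. (Python is used here purely for illustration; the thread is about VB6, but the OS behavior is the same.)

```python
import time

def measure_sleep_jitter(requested_s=0.010, samples=50):
    """Request a fixed sleep and record the overshoot of each call.

    On a non-real-time OS the actual delay is at least the requested
    time, but the overshoot varies with scheduler load and timer
    resolution -- exactly the millisecond-scale problem described above.
    """
    overshoots = []
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(requested_s)
        elapsed = time.perf_counter() - start
        overshoots.append(elapsed - requested_s)
    return overshoots

jitter = measure_sleep_jitter()
# The spread between min and max overshoot is the jitter you can expect.
print(min(jitter), max(jitter))
```

On an idle desktop the overshoot is often under a millisecond; under load it can be many milliseconds, which is why "accurate to a few mSecs" is about the best one can promise.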
From: BeeJ on 22 Jul 2010 11:45
BeeJ wrote:
> I have two timers running.
> In each timer routine I need to have a variable delay (10 mSec to 1000 mSec).
> The delay only needs to be accurate to a few mSecs but should be consistent
> each time it is encountered.
> What is the best way to do these "simultaneous" delays so they do not
> interfere with each other?
> Is, for example, Sleep 10& acceptable, or do I have to write a class around
> CreateWaitableTimer and instantiate one for each timer sub?
> This is a hardware interface.
I have it working now.
I set up a pseudo-interrupt using a timer class, and I do the delays
inside each timer routine through a wait class built on CreateWaitableTimer.
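The key point in that setup is that each timer routine owns its own wait object, so a delay in one routine can't skew the other. A minimal sketch of the idea in Python, with a per-instance `threading.Event` standing in for the Win32 waitable timer (class name, intervals, and delays are all illustrative, not from the post):

```python
import threading
import time

class TimerWithDelay:
    """One timer routine with its own private wait object.

    Each instance owns its own Event, so waiting inside one timer's
    routine never blocks or skews the other timer's delay -- the same
    idea as instantiating one CreateWaitableTimer per timer sub.
    """
    def __init__(self, interval_s, delay_s, work):
        self.interval_s = interval_s    # timer period
        self.delay_s = delay_s          # the variable in-routine delay
        self.work = work
        self._stop = threading.Event()  # private wait object
        self._thread = threading.Thread(target=self._run, daemon=True)

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()

    def _run(self):
        while not self._stop.is_set():
            # Waitable delay inside the timer routine; because we wait
            # on our own event, another timer's delay cannot interfere.
            self._stop.wait(self.delay_s)
            if self._stop.is_set():
                break
            self.work()
            self._stop.wait(self.interval_s)

# Two "simultaneous" timers with different periods and delays.
ticks_a, ticks_b = [], []
a = TimerWithDelay(0.02, 0.010, lambda: ticks_a.append(time.perf_counter()))
b = TimerWithDelay(0.03, 0.015, lambda: ticks_b.append(time.perf_counter()))
a.start(); b.start()
time.sleep(0.3)
a.stop(); b.stop()
```

Both tick lists fill independently; stretching one timer's `delay_s` changes only that timer's cadence, which is the property asked about in the original question.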
Due to the nature of the application, I believe that if I set the
minimum calculated timer interval somewhere between one movie frame
time (33 mSec, roughly 30 frames/sec) and one power-line period
(17 mSec, 60 Hz), the observer will not see flicker and will get a
full-motion experience, and I will not have to stress the PC's CPU
much at all. These timers drive mechanical devices as well as image
effects.
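One way to read that rule of thumb is as clamping each calculated interval into the 17-33 mSec band: the floor keeps the CPU from servicing updates nobody can perceive, and the ceiling keeps updates fast enough to look like continuous motion. A hedged sketch of that arithmetic (the clamp function and its name are my illustration, not code from the post):

```python
FRAME_PERIOD_MS = 1000.0 / 30   # ~33 mSec: movie frame rate (~30 fps)
MAINS_PERIOD_MS = 1000.0 / 60   # ~17 mSec: power line frequency (60 Hz)

def clamp_interval(calculated_ms,
                   floor_ms=MAINS_PERIOD_MS,
                   ceiling_ms=FRAME_PERIOD_MS):
    """Keep the timer interval between the mains period and one frame
    time: slow enough not to burn CPU on imperceptible updates, fast
    enough that the observer sees no flicker."""
    return max(floor_ms, min(calculated_ms, ceiling_ms))

print(clamp_interval(5))    # too fast: raised to the ~17 mSec floor
print(clamp_interval(25))   # already in band: unchanged
print(clamp_interval(100))  # too slow: capped at the ~33 mSec ceiling
```

Whether the ceiling should apply at all depends on the mechanical devices; for display-only effects, anything slower than one frame time risks visible stutter.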
What do you think?