I am considering using the javax.swing.Timer class to time some animation, but in testing the class I consistently get inaccurate results. On my PC the results are just that: consistently inaccurate, but predictably so. However, that's only the behavior on this particular PC; I don't think I can count on it anywhere else.
My goal is 33 ms accuracy, so that I can update my objects about 30 times per second (basically 30 fps graphics). If I can even come close, I will be happy.
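Just to make the arithmetic explicit, the 33 ms comes straight from the target rate; the variable names here are only mine:

// Roughly 33 ms between updates for a 30 fps target (integer division drops the remainder).
int targetFps = 30;
int delayMs = 1000 / targetFps; // 33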
Here's my test class:
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import javax.swing.Timer;

public class SandBox implements ActionListener, Runnable {

    private Timer timer;

    public SandBox() {
        // Fire every 33 ms (roughly 30 times per second).
        timer = new Timer(33, this);
        timer.setRepeats(true);
        timer.start();
    }

    public static void main(String[] args) {
        new Thread(new SandBox()).start();
    }

    public void actionPerformed(ActionEvent arg0) {
        if (arg0.getSource() == timer) {
            // Print a timestamp on every tick so the intervals can be compared.
            System.out.println(System.currentTimeMillis());
        }
    }

    public void run() {
        // Busy loop just to keep the program running while the timer fires.
        while (true) {
        }
    }
}
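For what it's worth, the interval between ticks can also be printed directly inside the callback instead of diffing the printed timestamps afterwards; here is a minimal variation of actionPerformed that could be dropped into the class above (lastTick is just my own field name):

private long lastTick = -1;

public void actionPerformed(ActionEvent arg0) {
    if (arg0.getSource() == timer) {
        long now = System.currentTimeMillis();
        if (lastTick != -1) {
            // Milliseconds since the previous tick.
            System.out.println(now - lastTick);
        }
        lastTick = now;
    }
}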
And here's my problem: setting the timer to 20 ms gives me an average interval of about 31 ms, 50 ms gives an average of about 62 ms, and 100 ms gives an average of about 109 ms. All of the measured intervals are within 1 ms of their respective averages.
But that's just the behavior on my machine. How can I count on this?
Obviously, as just a pre-flame deterrent, I do realize that once I actually add a real workload, these results may be thrown way off. But if I can't count on a stable result without a load, how can I count on a stable result WITH a load? And, BTW, it gives me the same results whether I run it from my IDE or from the command line.
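By "workload" I just mean extra work inside the callback; a placeholder like this would be one way to simulate it (the 5 ms figure is arbitrary):

// Placeholder standing in for real update/render work: spin for about 5 ms inside actionPerformed.
long start = System.currentTimeMillis();
while (System.currentTimeMillis() - start < 5) {
    // busy-wait
}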
Certainly other people have had this problem. How did you solve it?
I should add that I am posting this, but won't be back for replies until morning.