If a database reports the following statistics:
No. of Queries    Avg     Std Deviation
134744            0.14    0.19
...would you consider that good or bad?
What I'm getting at is that the average time across those 130,000+ queries is very good for us: 0.14 seconds is more than enough to keep users happy. But does a standard deviation of 0.19 on that sample tell you whether 0.14 is a pretty reliable mean, or one you're only likely to see in a month of Sundays?
Another database reports:
No. of Queries    Avg     Std Deviation
29247             0.266   0.9
That's a worse average (though everyone's still happy with 0.266 seconds), but does the standard deviation of 0.9 mean it's a much less reliable mean than the other result? Is it possible to quantify by how much?
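For what it's worth, here's the sort of back-of-the-envelope comparison I've been trying: dividing each standard deviation by its mean (I gather this is called the coefficient of variation, though I may be misapplying it). I'm assuming that's a sensible way to compare spread across the two databases:

```python
# Compare relative spread of the two samples: std deviation as a
# fraction of the mean (coefficient of variation).
def coefficient_of_variation(mean, std_dev):
    """Relative spread: how large the std deviation is versus the mean."""
    return std_dev / mean

db1 = coefficient_of_variation(0.14, 0.19)   # first database's figures
db2 = coefficient_of_variation(0.266, 0.9)   # second database's figures

print(f"DB1 relative spread: {db1:.2f}")  # ~1.36: spread exceeds the mean
print(f"DB2 relative spread: {db2:.2f}")  # ~3.38: spread dwarfs the mean
```

So by that measure the second database's times look far more scattered around their mean than the first's, but I don't know whether that's the right lens to use.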
Sorry: it's not really a database question but a statistics one. If someone could explain how to spot a 'good' or 'bad' standard deviation, and where one shades into the other, I'd be fascinated! I have read the Wikipedia articles on variance and its square root (standard deviation), but I don't get how the theory translates into simple practicalities. I should have paid attention in math class, I guess...