Oracle Database Discussions

Is this good news?

Catfive Lander — Oct 8 2008 (edited Oct 9 2008)
If a database reports the following statistics:
No. of Queries    Avg (s)    Std Deviation
134744            0.14       0.19
...would you consider that good or bad?

What I'm getting at is that the average time across those 130,000+ queries is very good for us: 0.14 seconds is more than fast enough to keep users happy. But does a standard deviation of 0.19 on that sample tell you whether the 0.14 is a pretty reliable mean, or one you're only likely to get in a month of Sundays?

Another database reports:
No. of Queries    Avg (s)    Std Deviation
29247             0.266      0.9
That's a worse average (though everyone's still happy with 0.266 seconds), but does the standard deviation of 0.9 mean it's a much less reliable mean than the other result? Is it possible to quantify by how much?
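In case it helps anyone answer, here's how I'd put the two sets of figures side by side numerically. This is only a sketch, assuming the reported numbers are per-query times in seconds; the two quantities it computes are the coefficient of variation (spread relative to the mean) and the standard error of the mean (how precisely the sample mean pins down the true mean, which shrinks with sample size):

```python
import math

# Figures as reported by the two databases (assumed: seconds per query)
samples = {
    "db1": {"n": 134744, "mean": 0.14,  "sd": 0.19},
    "db2": {"n": 29247,  "mean": 0.266, "sd": 0.9},
}

for name, s in samples.items():
    # Coefficient of variation: unitless spread relative to the mean;
    # lets you compare variability across samples with different means
    cv = s["sd"] / s["mean"]
    # Standard error of the mean: sd / sqrt(n); with n this large the
    # sample mean itself is estimated very precisely either way
    sem = s["sd"] / math.sqrt(s["n"])
    print(f"{name}: CV = {cv:.2f}, SEM = {sem:.5f} s")
```

If I've understood the Wikipedia pages right, the point would be that both means are precise (tiny standard errors, because n is large), but the second database's individual query times are far more spread out relative to their mean.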

Sorry: this isn't really a database question but a statistics one. Still, if someone could explain how to spot a 'good' or 'bad' standard deviation, and where one shades into the other, I'd be fascinated! I have read the Wikipedia articles on both variance and its square root (standard deviation), but I don't get how the theory translates into simple practicalities. I should have paid attention in math class, I guess...
This post has been answered by HJR on Oct 8 2008