What I want to do:
I'm evaluating a free tier and want to set up an alarm on daily network egress of 330 GB, so that I'm notified when I'm on track to exceed the 10 TB of free egress for that month.
What have I tried:
So I've attempted to create a metric in the `oci_internet_gateway` namespace to show the mean value of `BytesToIgw` over one day: `BytesToIgw[1d].mean()`.
The resulting stats, however, make no sense: they seem to be 8-10 times higher than expected.
My barrier to understanding is: what is the meaning of `BytesToIgw`? It is shown in bytes, but bytes over what time period? Does the value show egress since the last sample? Do I need to divide `BytesToIgw[1d].mean()` by the sample rate and multiply by 60 (this did not produce a meaningful result either)? And what is that sample rate? Looking at the "Count" aggregation I get an inconsistent 8-12 samples.
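To make my mental model concrete, here is the interpretation I've been testing, expressed as plain arithmetic. The per-sample-delta assumption and the 6-second sampling interval are my guesses, not anything the OCI docs state:

```python
# Assumption being tested: BytesToIgw is a per-sample byte delta
# (bytes sent since the previous sample). With a steady 1 MB/s egress
# and an assumed 6-second sampling interval (~10 samples/minute,
# roughly matching the 8-12 "Count" values I observe):

RATE_BPS = 1_000_000                       # steady egress, bytes per second
SAMPLE_INTERVAL_S = 6                      # assumed sampling interval
SAMPLES_PER_DAY = 86_400 // SAMPLE_INTERVAL_S

per_sample_delta = RATE_BPS * SAMPLE_INTERVAL_S    # bytes in each sample
daily_total = per_sample_delta * SAMPLES_PER_DAY   # what sum() over 1d would give

print(per_sample_delta)  # 6_000_000  -> mean() would return bytes-per-sample
print(daily_total)       # 86_400_000_000 -> sum() would return the daily total
```

If this interpretation were right, `mean()` would give me bytes-per-sample rather than a daily total, which is why I suspect I should be using a different statistic or a conversion factor.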
Some other metric namespaces, such as `oci_compute`, state explicitly that data is sampled six times every minute and shown per minute -- but the `oci_internet_gateway` page does not mention anything about this: https://docs.oracle.com/en-us/iaas/Content/Network/Reference/IGWmetrics.htm#IGWmetrics
TL;DR: please help me understand the meaning of `BytesToIgw[1d].mean()`, and if I have a steady egress of 1 MB/sec, how should I manipulate the metric so it shows me the expected 86.4 GB/day of egress in that scenario?