Double accuracy: Java vs C#
807591 · May 28 2008 (edited May 28 2008)
Hi
One question about the accuracy of the double primitive. When running the following code:
class Test
{
    public static void main(String[] args)
    {
        double n = 0.0;
        for (int i = 0; i < 10; i++) n += 0.1;
        System.out.println(Double.toString(n));
    }
}
It outputs 0.9999999999999999
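For what it is worth, here is a minimal sketch (class name ExactValue is just for illustration) that uses java.math.BigDecimal to print every decimal digit the double actually holds after the loop, showing it is not exactly 1.0:

import java.math.BigDecimal;

class ExactValue
{
    public static void main(String[] args)
    {
        double n = 0.0;
        for (int i = 0; i < 10; i++) n += 0.1;
        // new BigDecimal(double) takes the exact binary value of the double,
        // so toPlainString() prints its full decimal expansion
        System.out.println(new BigDecimal(n).toPlainString());
    }
}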
Doing the same in C# gives the result 1.0000000000000
Is Java less accurate than C#, or how can C# achieve this accuracy while Java cannot? Please no flaming, I am really wondering about this.
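A minimal sketch, assuming the visible difference comes from default display precision rather than from the arithmetic itself: printing the same Java double rounded to 15 decimal places already shows 1.0.

class RoundedDisplay
{
    public static void main(String[] args)
    {
        double n = 0.0;
        for (int i = 0; i < 10; i++) n += 0.1;
        // Shortest round-trip representation: 0.9999999999999999
        System.out.println(Double.toString(n));
        // Rounded to 15 decimal places for display: 1.000000000000000
        System.out.printf("%.15f%n", n);
    }
}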