## 09 Aug JavaScript 0.1 + 0.2 != 0.3 ?!?!

If you ever open your browser's JavaScript console and type in 0.7 + 0.1, you might get the result 0.7999999999999999. And then 0.1 + 0.7 == 0.8 returns false… what the heck?!
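You can reproduce this in any browser console or Node REPL:

```javascript
// The result of adding two "simple" decimals is not what you'd expect.
const sum = 0.7 + 0.1;
console.log(sum);          // 0.7999999999999999
console.log(sum === 0.8);  // false
```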

Don't worry: you can rewrite the expression into an equivalent one by multiplying each operand by 10 and then dividing the whole thing by 10: (0.7 * 10 + 0.1 * 10) / 10. This time the result is 0.8, and the comparison is now correct: (0.7 * 10 + 0.1 * 10) / 10 == 0.8 returns true. So what's the problem?
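A quick demo of the trick — scaling the operands up to whole numbers before adding, then scaling back down:

```javascript
// Direct addition carries a rounding error; scaling to integers first avoids it,
// because 0.7 * 10 and 0.1 * 10 each round to exact integers (7 and 1).
const direct = 0.7 + 0.1;                   // 0.7999999999999999
const scaled = (0.7 * 10 + 0.1 * 10) / 10;  // 0.8
console.log(direct === 0.8);  // false
console.log(scaled === 0.8);  // true
```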

JavaScript doesn't have a separate integer type. All numbers are IEEE 754 double-precision floating-point values stored in 64 bits: the fraction (mantissa) occupies bits 0 to 51, the exponent bits 52 to 62, and the sign bit 63. This layout imposes some constraints:
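You can actually peek at those three bit fields yourself. Below is a small sketch (the helper name `float64Fields` is made up for illustration) that reads a number's raw 64 bits through typed arrays and masks out each field:

```javascript
// Hypothetical helper: extract the sign, exponent, and fraction fields
// of a number's IEEE 754 double representation via a shared ArrayBuffer.
function float64Fields(x) {
  const buf = new ArrayBuffer(8);
  new Float64Array(buf)[0] = x;             // write the double's bytes
  const bits = new BigUint64Array(buf)[0];  // reread them as a 64-bit integer
  return {
    sign: Number(bits >> 63n),                 // bit 63
    exponent: Number((bits >> 52n) & 0x7FFn),  // bits 52..62 (biased by 1023)
    fraction: bits & 0xFFFFFFFFFFFFFn,         // bits 0..51
  };
}

console.log(float64Fields(0.1));
// { sign: 0, exponent: 1019, fraction: 2702159776422298n }
```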

• The largest integer that can be represented exactly is 2^53 − 1 = 9,007,199,254,740,991 (16 digits), exposed in JavaScript as Number.MAX_SAFE_INTEGER.
• That gives roughly 15–17 significant decimal digits of precision. However, floating-point arithmetic is not always 100% accurate. (https://www.w3schools.com)
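The integer limit is easy to see in practice — past 2^53, consecutive integers start colliding onto the same double:

```javascript
// Beyond Number.MAX_SAFE_INTEGER, not every integer has its own representation.
console.log(Number.MAX_SAFE_INTEGER);        // 9007199254740991
console.log(2 ** 53 === 2 ** 53 + 1);        // true: both round to the same double
console.log(Number.isSafeInteger(2 ** 53));  // false
```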

In the example above, 0.7 is actually stored as a double slightly below 0.7 (and 0.1 as a double slightly above 0.1), so their sum comes out as 0.7999999999999999. Let's try another example:
1/3 = 0.3333333333333333 and 2/3 = 0.6666666666666666, yet (1/3 + 2/3) = 1. Then let's add the printed values themselves: 0.3333333333333333 + 0.6666666666666666. This also gives 1 as the output, not 0.9999…9999 — here the rounding errors happen to cancel out.
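The thirds example, run directly:

```javascript
// Each third is rounded, but the sum rounds back to exactly 1.
console.log(1 / 3);          // 0.3333333333333333
console.log(2 / 3);          // 0.6666666666666666
console.log(1 / 3 + 2 / 3);  // 1
console.log(0.3333333333333333 + 0.6666666666666666 === 1);  // true
```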
This is because every result gets rounded to the nearest representable double, and JavaScript's precision is limited to about 17 significant digits. To work around the comparison problem from the start of the post, just sidestep the fractional representation by multiplying by 10 before adding and dividing by 10 afterwards.
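The scale-then-divide trick generalizes to any number of decimal places. Here is a small sketch — the helper name `roundTo` is made up, not a built-in — that also rounds after scaling, so it works even when the scaled operands aren't exact integers:

```javascript
// Hypothetical helper: round x to a given number of decimal places
// by scaling up, rounding to an exact integer, and scaling back down.
function roundTo(x, decimals) {
  const factor = 10 ** decimals;
  return Math.round(x * factor) / factor;
}

console.log(roundTo(0.7 + 0.1, 1));  // 0.8
console.log(roundTo(0.1 + 0.2, 2));  // 0.3
```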
Hope this helps.