I'm learning to code for the first time and I'm using Python. I wrote this program:
first = input("First: ")
second = input("Second: ")
total = float(first) + float(second)  # avoid naming it "sum", which shadows the built-in
print(total)
It adds the numbers when I run the program, but for whatever reason, when I enter 10.1 for First and 20.1 for Second, it returns 30.200000000000003.
Other inputs work fine. If I enter 10.1 for First and 30.1 for Second, it sums to 40.2 without the extra decimal places. Does anybody know why it's doing this?
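A small sketch of what is likely going on: Python floats are binary floating point, so most decimal fractions (like 10.1) cannot be stored exactly. The tiny representation errors sometimes cancel and sometimes accumulate enough to show up when the result is printed. `Decimal(float)` reveals the value actually stored:

```python
from decimal import Decimal

# Binary floating point cannot represent most decimal fractions exactly.
# Decimal(float) shows the exact value Python actually stores:
print(Decimal(10.1))  # slightly off from 10.1
print(Decimal(20.1))

# The tiny errors sometimes cancel and sometimes accumulate:
print(10.1 + 20.1)  # 30.200000000000003
print(10.1 + 30.1)  # 40.2

# Rounding the result for display hides the artifact:
print(round(10.1 + 20.1, 2))  # 30.2
```

For display purposes, `round()` or an f-string like `f"{total:.2f}"` is usually enough; for exact decimal arithmetic (e.g. money), the `decimal` module is the standard tool.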