Currency calculations
Computer Science Level 1
As part of your new program, you need to 'count' money from £0.00 to £1.00 in steps of 10p, but you notice that your computer generates some odd results. So you compile a table of the actual results versus the expected results, and the output is as follows:
| Expected Result | Actual Result | Pass/Fail |
|-----------------|---------------|-----------|
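You can reproduce the odd results yourself. This is a minimal Python sketch (assuming the running total is held in a standard binary floating-point variable, as most languages' default `float`/`double` would be) that counts up in 10p steps and compares each total against the expected value:

```python
# Repeatedly adding 0.1 (10p as a fraction of a pound) in binary
# floating point accumulates a tiny error, because 0.1 has no exact
# finite binary representation.
total = 0.0
for step in range(1, 11):
    expected = step / 10       # 0.1, 0.2, ..., 1.0
    total += 0.1               # the "add 10p" operation
    status = "Pass" if total == expected else "Fail"
    print(f"£{expected:.2f}  actual={total!r}  {status}")
```

Running this, several rows come out as "Fail": the repr of the accumulated total drifts away from the neat two-decimal value you expected, even though each individual step looks like a simple addition.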
Why is the program's answer different from what you might expect? Surely adding 10p is simple.