These days we take it for granted that if we need to know the square root of a number, or the sine of an angle, we can easily find it using a pocket calculator or a line of code. And furthermore, the result will be accurate to many decimal places.

In the 1800s, there were no computers or electronic calculators. All people had were printed tables for looking up sines, logarithms and square roots.
That wasn't so bad in itself; the problem was that the values were only given to a few decimal places, and worse still, since they were all calculated by
hand, some of the values were *wrong*!

These tables were important for architecture, ship navigation, and various branches of science including astronomy.

Mathematicians knew how to calculate such tables using addition alone - with no need for multiplication or any other operation.

Even with this simplification, a sum like this is time consuming to do by hand:

785117343651642 + 230105386827704

Several of these additions were required for each table entry, and there were *millions* of entries in each table.
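The addition-only technique the engine mechanised is the method of finite differences: for a polynomial, the differences eventually become constant, so each new table entry can be produced with a handful of additions. Here is a short sketch in Python (the function name and initial-value layout are illustrative choices, not anything from Babbage's design):

```python
# Method of differences: tabulate a polynomial using addition only.
# For f(x) = x^2 the second difference is constant (2), so each new
# table entry needs just two additions.

def difference_table(initial, steps):
    """initial = [f(0), first difference, second difference, ...].
    Each step adds every difference into the value above it, producing
    the next table entry with additions alone."""
    values = list(initial)
    table = [values[0]]
    for _ in range(steps):
        # Propagate the additions from the top down, as each column of
        # the engine added its contents into the column above.
        for i in range(len(values) - 1):
            values[i] += values[i + 1]
        table.append(values[0])
    return table

# For f(x) = x^2: f(0) = 0, first difference = 1, second difference = 2.
print(difference_table([0, 1, 2], 5))  # → [0, 1, 4, 9, 16, 25]
```

Note that the inner loop never multiplies: the whole table of squares falls out of repeated addition, which is exactly what made the method practical for a mechanical device.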

Charles Babbage's invention, the difference engine, could carry out a complete set of calculations with the turn of a handle. This meant that a new table entry could be calculated every few seconds, and with no risk of human error.

*Image by Geni*

Each column in the machine has a set of cogs which store the digits of one of the numbers to be added. The mechanism adds the numbers and makes the result available for the next stage of the calculation.

The idea of using a number of *registers* to store numbers, and using hardware to add their contents to existing values, maps directly onto the model
of registers, an ALU (Arithmetic and Logic Unit) and an accumulator that most modern computers use.
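That register/accumulator model can be sketched in a few lines of Python. This is a deliberately minimal, hypothetical machine (the class and method names are invented for illustration): an ALU whose only operation is adding a register's contents into the accumulator, much as each column of the engine added into the one above it.

```python
# Minimal sketch of the register / ALU / accumulator model (hypothetical).

class Machine:
    def __init__(self):
        self.registers = [0] * 8  # storage for operands
        self.acc = 0              # the accumulator

    def load(self, reg, value):
        self.registers[reg] = value

    def add(self, reg):
        # The ALU's one operation here: accumulator += register.
        self.acc += self.registers[reg]

m = Machine()
m.load(0, 785117343651642)
m.load(1, 230105386827704)
m.add(0)
m.add(1)
print(m.acc)  # → 1015222730479346
```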

The difference engine used "10s complement" to represent negative values, just as we use the binary equivalent (2s complement) to this day. It also used fixed-point notation to represent fractions as whole numbers, a precursor to the floating-point method we use now (although fixed-point arithmetic is still used in some applications because it is faster).
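The complement trick is the same in both bases: a negative number is stored as the base raised to the register width, minus the magnitude, so subtraction becomes addition with the overflow discarded. A short Python illustration (the 6-digit and 8-bit widths are arbitrary choices for the example):

```python
# 10s complement (6 decimal digits) vs 2s complement (8 bits).
# Python's % operator with a positive modulus gives the complement directly.

DIGITS = 6
BITS = 8

def tens_complement(n):
    return n % 10**DIGITS   # e.g. -1 is stored as 999999

def twos_complement(n):
    return n % 2**BITS      # e.g. -1 is stored as 255 (0b11111111)

# Subtraction becomes addition, discarding any carry out of the top digit:
a, b = 123, 45
result = (a + tens_complement(-b)) % 10**DIGITS
print(tens_complement(-1))  # → 999999
print(twos_complement(-1))  # → 255
print(result)               # → 78
```

The appeal for a mechanical machine is clear: the cogs only ever need to add, and a negative operand is just another pattern of digits.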

That is not to say Babbage invented these ideas. Some go back to the abacus a thousand or more years earlier, and simple mechanical calculating machines had been around for a hundred years before the difference engine. But it shows that many of the concepts of modern computing have deep roots in history.

Babbage and Ada Lovelace conceived of a far more ambitious machine, the Analytical Engine, which would have been able to execute programs stored on punched cards. They never managed to get funding for the machine (it would have been extremely expensive, if it were even possible), but many of Ada Lovelace's ideas for how the machine could have been used were years ahead of her time (for example, she envisaged the machine being able to compose music). For that reason, she is often known as the first computer programmer.

Copyright (c) Axlesoft Ltd 2021