Napier invented logarithms to make calculations like multiplication, division, and exponentiation easier, using identities like log(ab) = log(a) + log(b). Logarithmic tables and slide rules came soon thereafter. The slide rule remained the standard tool for mathematical computation for the typical scientist or engineer until reasonably priced calculators appeared in the late 1970s. My chemistry teacher in 1980 taught us how to use a slide rule, even though we all had calculators, and I even (foolishly) tried using my father's slide rule on a chemistry test.
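The identity above is the whole trick behind log tables and slide rules: it turns a multiplication into an addition. A quick sketch (the function name is my own, just for illustration):

```python
import math

# log(ab) = log(a) + log(b), so a*b = exp(log(a) + log(b)).
# With a printed log table (or a slide rule, which adds lengths),
# the only arithmetic you do by hand is the addition.
def multiply_via_logs(a, b):
    return math.exp(math.log(a) + math.log(b))

print(multiply_via_logs(6.0, 7.0))  # ≈ 42.0, up to floating-point error
```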
The exhibit struggled to find current uses for logarithms, mentioning only the Richter scale. In theoretical computer science we use logarithms all the time. Here's an incomplete list, off the top of my head:
- As part of a running time, usually the result of divide and conquer. Sorting, for example, takes Θ(n log n) comparisons.
- The size of a number, pointer, or counter. Counting up to n requires about log n bits of storage.
- A small amount of working space, as in the logarithmic-space complexity classes L and NL.
- To capture entropy, the coupon collector problem, and other topics in probability and information theory.
- Capturing the harmonic sum: 1 + 1/2 + … + 1/n is roughly ln n, and the integral of 1/x from 1 to n is exactly ln n.
- The inverse of exponential growth.
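Two of the items above are easy to check numerically: the bit length of a counter that counts to n, and how closely the harmonic sum tracks ln n. A small sketch (function names are mine, for illustration):

```python
import math

# Bits needed to store a counter that counts up to n:
# floor(log2(n)) + 1, which Python exposes as int.bit_length().
def bits_to_count(n):
    return n.bit_length()

# The harmonic sum 1 + 1/2 + ... + 1/n. It equals
# ln(n) + gamma + o(1), where gamma ≈ 0.5772 is Euler's constant.
def harmonic(n):
    return sum(1.0 / i for i in range(1, n + 1))

n = 1_000_000
print(bits_to_count(n))           # 20 bits, since 2**19 <= n < 2**20
print(harmonic(n) - math.log(n))  # ≈ 0.5772, Euler's constant
```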
Thanks, John Napier, for the logarithm and for making our lives just a little less linear.