The one reason we love logarithms in machine learning

Tivadar Danka
(Figure: the base-two logarithm)

Understanding math will make you a better engineer.

So, I am writing the best and most comprehensive book about it.

There is one big reason we love the logarithm function in machine learning.

Logarithms help us reduce complexity by turning multiplication into addition. You might not know it, but they are behind a lot of things in machine learning.

First, let's start with the definition of the logarithm. The base-$a$ logarithm of $b$ is simply the solution $x$ of the equation $a^x = b$, written $x = \log_a b$.

(Figure: the definition of the logarithm)
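To make this concrete, here is a quick numerical check in Python; the values are arbitrary:

```python
import math

# The base-a logarithm of b is the solution x of a**x = b.
a, b = 2.0, 8.0
x = math.log(b, a)  # log base a of b
print(x)            # 3.0 (up to floating point)
print(a ** x)       # 8.0, recovering b
```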

Despite this simple definition, the logarithm has many useful properties that we take advantage of all the time.

You can think of the logarithm as the inverse of exponentiation. Because of this, it turns multiplication into addition:

$$\log(xy) = \log(x) + \log(y).$$

(The base of the logarithm is assumed to be a fixed constant, so it is often omitted from the notation.) Exponentiation does the opposite: it turns addition into multiplication.
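Here is a quick sanity check of both identities in Python; the numbers are arbitrary:

```python
import math

x, y = 3.7, 42.0
# The logarithm turns multiplication into addition.
print(math.isclose(math.log(x * y), math.log(x) + math.log(y)))  # True

# Exponentiation turns addition into multiplication.
print(math.isclose(math.exp(x + y), math.exp(x) * math.exp(y)))  # True
```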

Why is the property $\log(xy) = \log(x) + \log(y)$ useful? Because we can use it to calculate gradients and derivatives!

Training a neural network requires computing the gradient of its loss. However, many commonly used functions are written in terms of products.

As you can see, this complicates things:

$$\begin{align*} (fg)^\prime &= f^\prime g + fg^\prime \\ (fgh)^\prime &= f^\prime gh + fg^\prime h + fgh^\prime \\ &\;\;\vdots \end{align*}$$

By taking the logarithm first, the product becomes a sum, and a sum can be differentiated term by term:

$$\begin{align*} \Big( \log f_1 \dots f_n \Big)^\prime &= \bigg( \sum_{i=1}^{n} \log f_i \bigg)^\prime \\ &= \sum_{i=1}^{n} \Big( \log f_i \Big)^\prime. \end{align*}$$
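Here is a small numerical sketch of this trick in Python; the three factors are made-up examples, and the identity requires each $f_i(x) \neq 0$ at the point of interest:

```python
import numpy as np

# Three illustrative factors and their derivatives.
fs  = [np.sin, np.exp, lambda x: x**2 + 1.0]
dfs = [np.cos, np.exp, lambda x: 2.0 * x]

def product(x):
    return np.prod([f(x) for f in fs])

def product_derivative(x):
    # (log prod f_i)' = sum f_i'/f_i, so (prod f_i)' = (prod f_i) * sum f_i'/f_i.
    return product(x) * sum(df(x) / f(x) for f, df in zip(fs, dfs))

x0, h = 0.7, 1e-6
central_diff = (product(x0 + h) - product(x0 - h)) / (2 * h)
print(product_derivative(x0), central_diff)  # the two values agree
```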

This method is called logarithmic differentiation. One example where it is useful is maximum likelihood estimation.

Given a set of observations $x_1, \dots, x_n$ and a predictive model with parameters $\theta$, we can write the likelihood in the following form:

$$L(\theta) = \prod_{i=1}^{n} p(x_i \mid \theta).$$

Taking the logarithm turns this product into a sum of log-probabilities, which is much easier to differentiate and maximize.
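As a minimal sketch, here is maximum likelihood estimation for the mean of a Gaussian model in Python; the toy data, the fixed $\sigma = 1$, and the grid search are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=1.0, size=1_000)  # toy observations

def log_likelihood(mu, x, sigma=1.0):
    # log L(mu) = sum_i log p(x_i | mu): a sum, thanks to the logarithm.
    return np.sum(-0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi)))

# Scan candidate means and pick the maximizer of the log-likelihood.
candidates = np.linspace(0.0, 6.0, 601)
mu_hat = candidates[np.argmax([log_likelihood(m, data) for m in candidates])]
print(mu_hat)  # close to the true mean, 3.0
```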

Believe it or not, this is behind the mean squared error! If we model our observations as $y_i = f(x_i) + \varepsilon_i$ with Gaussian noise $\varepsilon_i \sim N(0, \sigma^2)$, the log-likelihood becomes

$$\log L = -\frac{1}{2\sigma^2} \sum_{i=1}^{n} \big( y_i - f(x_i) \big)^2 + \text{const},$$

so maximizing the likelihood is the same as minimizing the mean squared error.
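To see the correspondence numerically, here is a small check in Python; the toy predictions and the fixed $\sigma = 1$ are again illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
y_true = rng.normal(size=100)
y_pred = y_true + rng.normal(scale=0.1, size=100)  # toy predictions

sigma, n = 1.0, len(y_true)
# Gaussian negative log-likelihood of the residuals.
nll = np.sum(0.5 * ((y_true - y_pred) / sigma) ** 2 + np.log(sigma * np.sqrt(2 * np.pi)))
mse = np.mean((y_true - y_pred) ** 2)

# The NLL is an affine function of the MSE, so both have the same minimizer.
print(np.isclose(nll, n * mse / (2 * sigma**2) + n * np.log(sigma * np.sqrt(2 * np.pi))))  # True
```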

Every time you minimize the mean squared error, logarithms are working in the background.

Having a deep understanding of math will make you a better engineer.

I want to help you with this, so I am writing a comprehensive book that takes you from high school math to the advanced stuff.
Join me on this journey and let's do this together!