# Why don’t logarithms have horizontal asymptotes?

We’re studying logarithms in my Honors Algebra II class, and when looking at a logarithmic function for the first time, a very astute student asked, “Does it have a horizontal asymptote?” My quick response was a slightly dismissive “no.” But the student was insistent, and pointed out that the slope seems to keep getting smaller and smaller, so shouldn’t it eventually reach zero, and won’t that mean there is an asymptote?

I like this question because, if I let it, it pulls me away from simply repeating the same information I was handed back during my math education, and really gets me thinking about the nature of the logarithm and the meaning of horizontal asymptotes.

## What I “know”…

So here’s what I “know”—the logarithm is just the inverse of the exponential function, and the exponential function doesn’t have any vertical asymptotes—you can always exponentiate a larger number. Thus, it should be that when you invert this function to form the logarithm, there shouldn’t be any horizontal asymptotes. As you take logarithms of larger and larger numbers, the output of the function should continue to increase.

## A comparison:

Let’s compare the logarithm to a function that we know does have an asymptote, namely, the inverse function. Just to make it a bit easier to visualize, I want to work with $\frac{1}{x}$ reflected over the horizontal axis and vertically shifted by 3:

$$g(x) = 3 - \frac{1}{x}$$

In many ways this function looks similar to the logarithm—it is increasing, and as it grows, its slope decreases. But this particular function has a horizontal asymptote at $y = 3$.
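To make the contrast concrete, here’s a quick numeric sketch (Python; the helper name `g` is just my label for the shifted inverse function):

```python
import math

# Compare log10(x) with the shifted inverse function g(x) = 3 - 1/x
# as x grows: g levels off just below 3, while the log keeps climbing.
def g(x):
    return 3 - 1 / x

for x in [10, 10**3, 10**6, 10**9]:
    print(f"x = {x:>10}: log10(x) = {math.log10(x):5.1f}, g(x) = {g(x):.9f}")
```

At every sample point $g(x)$ stays below 3, while $\log_{10}(x)$ has already passed 3 by $x = 10^4$ and keeps going.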

## What about the rate of change?

Things get more interesting when I look at the rates of change for these functions.

For the logarithm, $f(x) = \ln(x)$:

$$f'(x) = \frac{1}{x}$$

For the inverse function, $g(x) = 3 - \frac{1}{x}$:

$$g'(x) = \frac{1}{x^2}$$

As we can see from the graphs, both slopes decrease as $x$ increases, but the slope of the inverse function decreases faster: $\frac{1}{x^2}$ goes to zero more quickly than $\frac{1}{x}$. Does this difference explain the difference in behavior?

It does seem to tell me that the rate of change of a function going to zero at infinity can’t, by itself, be proof of a horizontal asymptote. But could it be that any function whose rate of change goes to zero faster than a particular rate (say, $\frac{1}{x^2}$) will have a horizontal asymptote?

I’m flummoxed. How would you explain to a student why a function whose rate of change is always decreasing never reaches a maximum value, particularly when every similar function he’s previously encountered does have a horizontal asymptote?

A horizontal asymptote in the positive x direction implies that there exists a maximum (or a minimum, if we approach the asymptote from above) value of f(x) as x -> infinity. So all you have to do to prove that there is no horizontal asymptote is show that f(x) is an unbounded function.

Given that y = log(x) is defined as the power y such that 10^y = x, and since larger values of x always require larger powers, f(x) cannot have an upper limit.
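That unboundedness argument can be sketched numerically; the function name `exceeds_ceiling` is just an illustrative label:

```python
import math

# For any proposed ceiling L on log10(x), the input x = 10**(L + 1)
# already pushes the log past L -- so no ceiling can survive.
def exceeds_ceiling(L):
    x = 10 ** (L + 1)
    return math.log10(x) > L

print(all(exceeds_ceiling(L) for L in [1, 5, 50, 300]))
```

Every candidate ceiling fails for the same reason: one more power of ten is always available.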

The easiest way I can see is to think about this by way of contradiction (which also introduces a really cool proof technique!):

Assume $\log(x)$ has some horizontal asymptote, call it $L$. Then it should be that $\log(x) \rightarrow L$ as $x \rightarrow \infty$, but never quite reaches it. However, we can clearly see that $\log(10^L) = L$, thereby contradicting our assumption that $L$ is a horizontal asymptote for $\log(x)$. Since the exponential is a nicely behaved function, it doesn’t matter what number we choose for our hypothetical horizontal asymptote; we can always find a value of $x$ (namely $x = 10^L$) such that $\log(x) = L$.

Contrasting that to the case of $3 - \frac{1}{x}$: one claims that $y = 3$ is a horizontal asymptote because, no matter what we choose for $x$, $\frac{1}{x}$ never equals $0$, and so the function never reaches the asymptotic value (although it can come arbitrarily close).

I like this proof by contradiction a lot. That seems very accessible to an Algebra II student.

Actually, here’s another interesting tie-in, to the harmonic series $\sum_{n=1}^{\infty} \frac{1}{n}$. Its partial sums have a (discretized) derivative of $\frac{1}{n}$, just like the natural log… and yet, the series diverges as $n \rightarrow \infty$ even though the derivative goes to zero.

Compare this to the series $\sum_{n=1}^{\infty} \frac{1}{n^2}$, which has a discretized derivative of $\frac{1}{n^2}$ and *does* converge as $n \rightarrow \infty$. (If I’m remembering correctly, any $p$-series $\sum \frac{1}{n^p}$ converges if $p > 1$…)

In the case of the harmonic series, it’d be interesting to look at the asymptotic behavior, i.e., plot the partial sums for large $n$ and compare them to the natural log function. Look, for example, at http://bit.ly/W2RAW9. I don’t have time before work, but it looks like the partial sum $H_n$ and $\ln(n)$ have the same functional behavior (up to a constant offset?) for large $n$. I would bet the same sort of relationship holds between the partial sums of $\sum \frac{1}{n^2}$ and $-\frac{1}{n}$.
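That hunch checks out numerically; a small sketch (the constant offset turns out to be the Euler–Mascheroni constant, roughly 0.5772):

```python
import math

# Compare the harmonic partial sum H_n with ln(n): their difference
# settles toward a constant (~0.5772) rather than growing or vanishing.
def harmonic(n):
    return sum(1 / k for k in range(1, n + 1))

for n in [10, 1_000, 100_000]:
    print(n, round(harmonic(n) - math.log(n), 6))
```

So $H_n \approx \ln(n) + \text{const}$ for large $n$: the partial sums really do grow logarithmically.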

This is excellent. I had thought about summing the infinite series a bit when I was looking at the derivative, but I didn’t think to take it this far, and it makes great sense. Thanks for sharing this. It also helps tie back to the idea that logarithmic growth is the slowest growth one can have.

I think your answer lies in Calculus 1 & 2. A logarithmic function doesn’t converge to any number, and thus cannot have a horizontal asymptote. Remember that the derivative of ln(x) is 1/x, and in convergence testing, the series with terms 1/n does not converge. In the general form 1/x^p: if p is less than or equal to 1, the series does not converge; if p > 1, it does. I’m not sure how you would explain convergence and limits at infinity to an algebra student.

npisenti beat me to it. The sum of 1/n is the harmonic series, so even though you’re adding smaller and smaller numbers (terms that eventually “go” to zero), the harmonic series is divergent, and so any range value will eventually be passed.

Contrast that with x^-2: its terms all go to zero as well, but its infinite sum does converge.
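A quick sketch of that contrast: the partial sums of $\sum 1/n^2$ level off at $\pi^2/6 \approx 1.645$ (a fact known as the Basel problem), which is exactly the horizontal-asymptote behavior the harmonic series lacks.

```python
import math

# Partial sums of 1/n^2 approach pi^2/6 from below, short by roughly 1/n --
# the same "levels off, never arrives" shape as the inverse function.
def partial_sum_p2(n):
    return sum(1 / k**2 for k in range(1, n + 1))

print(partial_sum_p2(100_000), math.pi**2 / 6)
```

The partial sum sits just under the limit, never reaching it, no matter how many terms we add.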

Never knew this fantastic connection to the converging and diverging series. Fantastic!

Late to the conversation here but I’ll echo Dan’s excitement at seeing the connection with convergence conversations. We are just now testing in Calc BC and the idea that a sequence converges but its series does not is a tough one to get across at that level – never mind an Algebra II discussion. I think that you can discuss how quickly it is slowing down without needing the Calculus behind it all.

The cleaner explanation, of course, is simply to use the nice proof by contradiction suggested above.

Fun stuff. Just a couple of things come to mind. The first is the p-series test, which I think relates to your conjecture about how derivatives determine whether functions have horizontal asymptotes. Even if we’re not considering powers of x as derivatives, though, there are functions that grow even slower than ln(x) but still have no horizontal asymptote: ln(x)/2, ln(ln(x)), ln(ln(ln(x))), and so on, with derivatives 1/(2x), 1/(x ln(x)), and 1/(x ln(x) ln(ln(x))) respectively.
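Those iterated logs are fun to poke at numerically; a sketch (the threshold $x = e^{e^B}$ comes from inverting $\ln(\ln(x)) = B$):

```python
import math

# ln(ln(x)) grows absurdly slowly, yet still passes any bound B:
# it reaches B exactly at x = e**(e**B), so anything larger exceeds B.
def x_past_bound(B):
    return 2 * math.exp(math.exp(B))  # comfortably past the threshold

for B in [1, 2, 3]:
    x = x_past_bound(B)
    print(B, f"x = {x:.3g}", math.log(math.log(x)) > B)
```

To push $\ln(\ln(x))$ past 3 already takes $x$ on the order of $5 \times 10^8$, which is a vivid way to show “very slow” is still not the same as “bounded.”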

This question came up every now and then in my Analysis classes at Bay. My favorite prompt to pose in response was usually something along the lines of “Look at the inverse of the logarithm function. What question about the inverse could you ask that is mathematically equivalent?”

Jumping ahead to the answer, y = log(x) has a horizontal asymptote iff y = 10^x has a vertical asymptote, but students know that y = 10^x has a domain of all reals and thus cannot have a vertical asymptote.

This particular approach is accessible to students without calculus, but I readily acknowledge that it doesn’t fully address the bigger question of “how can you show whether or not a function has an upper bound”, at least not using the approaches that mathematicians typically use.