How do you solve log(x+2)+log(x-1)=1?


Exponentiate both sides (base 10) to get: (x+2)(x-1) = 10

Solve the quadratic and discard the spurious solution to get x = 3

Explanation:

Note that we require x+2 > 0 and x-1 > 0 in order that log(x+2) and log(x-1) be defined.

This boils down to requiring x > 1.

Exponentiate both sides (base 10) to find:

10 = 10^1 = 10^(log(x+2)+log(x-1)) = 10^(log(x+2)) · 10^(log(x-1))

= (x+2)(x-1) = x^2 + x - 2

Subtract 10 from both ends to get:

0 = x^2 + x - 12 = (x+4)(x-3)

Which gives us x = -4 or x = 3
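
As a quick cross-check, here is a minimal Python sketch using sympy (the variable names are just for illustration) that confirms the expansion and the roots of the quadratic:

```python
from sympy import symbols, expand, solve, Eq

x = symbols('x', real=True)

# (x+2)(x-1) = 10 rearranged: expand to confirm the quadratic x^2 + x - 12
print(expand((x + 2)*(x - 1) - 10))    # -> x**2 + x - 12

# Roots of the quadratic: x = -4 and x = 3
print(solve(Eq(x**2 + x - 12, 0), x))  # -> [-4, 3]
```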

Discard the spurious solution x = -4, since log(x+2) and log(x-1) are undefined there (both x+2 = -2 and x-1 = -5 are negative when x = -4).

So x = 3
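
For a numerical verification (a sketch assuming log means the base-10 logarithm, as in the equation above):

```python
import math

x = 3
# log10(5) + log10(2) = log10(10) = 1, so x = 3 satisfies the original equation
print(math.log10(x + 2) + math.log10(x - 1))  # -> 1.0 (up to floating point)

# x = -4 fails the domain requirement x > 1: x + 2 = -2 and x - 1 = -5,
# so neither logarithm is defined there.
```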