How do you use the factor theorem to determine whether x+1 is a factor of x^3 + x^2 + x + 1?

1 Answer
Jan 3, 2016

The factor theorem states that a polynomial f(x) has a factor (x+k) if and only if f(-k)=0.
Here x^3+x^2+x+1 is a polynomial.
Let f(x)=x^3+x^2+x+1

Now we want to know whether x+1 is a factor of f(x).

To decide this, substitute x=-1 into f(x): if the result is 0, then x+1 is a factor of f(x); if the result is not 0, then x+1 is not a factor of f(x).

Put x=-1 in f(x):
f(-1) = (-1)^3 + (-1)^2 + (-1) + 1 = -1 + 1 - 1 + 1 = 0

Since the result is 0, x+1 is a factor of the given polynomial.
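The check above can also be sketched numerically; this is a minimal illustration (not part of the original answer), assuming only the polynomial from the question:

```python
def f(x):
    # f(x) = x^3 + x^2 + x + 1
    return x**3 + x**2 + x + 1

# By the factor theorem, (x + 1) is a factor iff f(-1) == 0
print(f(-1))  # prints 0, so x + 1 is a factor
```

The same idea works for any candidate factor (x+k): evaluate f(-k) and compare the result with 0.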