
Polynomial problem

Let \(p(x) = ax^{2} + bx + c\) be such that \(p(x)\) takes real values for real values of \(x\) and non-real values for non-real values of \(x\). Prove that \(a = 0\).

I tried taking \(a\), \(b\), and \(c\) to be complex values and then applied the conditions given in the problem, but this only led me to a messy expression. Is there a better approach?

Note by Nishant Sharma
4 years, 1 month ago

4 votes

Comments


Hint: first prove that \(a\), \(b\), \(c\) are real. For this it suffices to use the fact that \(p(-1)\), \(p(0)\), \(p(1)\) are real. Specifically, express \(a\), \(b\), and \(c\) in terms of these three values.
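A quick sketch of that first hint, using illustrative coefficient values (the numbers below are assumptions, not from the problem): the formulas \(c = p(0)\), \(a = \frac{p(1)+p(-1)}{2} - p(0)\), \(b = \frac{p(1)-p(-1)}{2}\) recover the coefficients, so if \(p(-1), p(0), p(1)\) are real, the coefficients must be real too.

```python
# Illustrative (assumed) real coefficients for a sanity check.
a, b, c = 1.5, -2.0, 0.5
p = lambda x: a * x**2 + b * x + c

# Recover the coefficients from just three values of p.
c_rec = p(0)
a_rec = (p(1) + p(-1)) / 2 - p(0)
b_rec = (p(1) - p(-1)) / 2

print(a_rec, b_rec, c_rec)  # recovers 1.5 -2.0 0.5
```

Since each recovered coefficient is a real-linear combination of \(p(-1), p(0), p(1)\), realness of those three values forces realness of \(a\), \(b\), \(c\).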

Next, suppose \(a > 0\). Find a sufficiently negative \(M < 0\) such that \(p(x) = M\) has no real roots, or equivalently, the discriminant of \(ax^2 + bx + (c-M) = 0\) is negative. Then the roots of \(p(x) = M\) are non-real, yet \(p\) maps them to the real value \(M\), contradicting the hypothesis. Do something similar for \(a < 0\). C Lim · 4 years, 1 month ago
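A numerical sketch of this argument, with assumed example coefficients: pick \(M\) strictly below the minimum value \(c - b^2/(4a)\) of \(p\), so the discriminant of \(ax^2 + bx + (c - M)\) is negative, then check that a (non-real) root still maps to the real value \(M\).

```python
import cmath

# Illustrative (assumed) real coefficients with a > 0.
a, b, c = 2.0, -3.0, 1.0
p = lambda x: a * x**2 + b * x + c

# Choose M strictly below the minimum value c - b^2/(4a) of p,
# so p(x) = M has no real solutions.
M = c - b**2 / (4 * a) - 1.0

# Solve a*x^2 + b*x + (c - M) = 0; here the discriminant equals -4a < 0.
disc = b**2 - 4 * a * (c - M)
root = (-b + cmath.sqrt(disc)) / (2 * a)

print(disc < 0)          # True: no real roots
print(root.imag != 0)    # True: the root is non-real
print(abs(p(root) - M))  # ~0: p sends a non-real number to the real value M
```

The non-real root with a real image is exactly the contradiction the hint is after.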


@C Lim I could not follow your second argument. Could you please explain it a bit further? Nishant Sharma · 4 years, 1 month ago


What C Lim did is right. First prove that \(a\), \(b\), \(c\) are real. Now assume \(a \neq 0\). Divide the whole polynomial by \(a\) and rename the constants: we get \(f(x) = x^2 + bx + c\). By the given condition, if \(x\) is non-real then \(f(x)\) should be non-real. Since \(c\) is real, this means \(x^2 + bx\) should be non-real for every non-real \(x\). But there exist non-real \(x\) which make \(x^2 + bx\) real: writing \(x = s + it\) with \(t \neq 0\), the imaginary part of \(x^2 + bx\) is \(t(2s + b)\), which vanishes when \(s = -b/2\). So the condition fails, and no nonzero \(a\) can satisfy it. Now consider \(a = 0\). We have \(p(x) = bx + c\) with \(b \neq 0\) (a constant polynomial would take real values at non-real points), which is non-real for every non-real \(x\). Hence proved. Abhishek Sethi · 4 years, 1 month ago
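The key step above can be checked directly. With assumed real values for \(b\) and \(c\), the non-real input \(x = -b/2 + it\) (any real \(t \neq 0\)) gives a real output of \(x^2 + bx + c\):

```python
# Illustrative (assumed) real constants for f(x) = x^2 + b*x + c.
b, c = 5.0, 2.0

t = 3.0
x = complex(-b / 2, t)   # non-real by construction (imag part t != 0)
fx = x**2 + b * x + c

# Imaginary part of x^2 + b*x is t*(2*Re(x) + b), which is 0 at Re(x) = -b/2.
print(x.imag != 0)       # True: x is non-real
print(abs(fx.imag))      # ~0: f(x) is real despite x being non-real
```

This exhibits a concrete non-real \(x\) with real \(f(x)\), which is exactly what contradicts the hypothesis when \(a \neq 0\).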

