Iterative Nightmare

Calculus Level 1

Let \(f,g : \mathbb{R} \longrightarrow \mathbb{R}\) be such that \(f(2x) = 2f(x)\) and \(g(x) = x + f\big(g(x)\big)\).

If \(2g(x)\) is in the range of \(g\) for all real \(x\), is it always true that \(g(2x) = 2g(x)\)?

Note: \(f\) and \(g\) are not necessarily continuous over \(\mathbb{R}\).
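
For intuition, one family that satisfies every hypothesis is the linear choice \(f(x) = cx\) with a constant \(c \neq 1\) (an assumption introduced purely for this sketch, not part of the problem):

\[
f(2x) = 2cx = 2f(x), \qquad g(x) = x + c\,g(x) \implies g(x) = \frac{x}{1-c},
\]

so the range of \(g\) is all of \(\mathbb{R}\) (in particular it contains \(2g(x)\)), and indeed \(g(2x) = \frac{2x}{1-c} = 2g(x)\). The question is whether this conclusion survives without linearity or continuity.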
