# Program a mathematical function

Suppose you have a programming language $\mathcal L$ designed to efficiently perform calculations with real numbers. In particular, it allows the definition of (computable, deterministic) functions that take one number as an argument and return a number:

```
define F(x):
    y = ...
    return y
```

Let $f:\ \mathbb R \to \mathbb R$ be an arbitrary mathematical function, mapping real numbers to real numbers. What is the probability that $f$ can be modeled using a function programmed in $\mathcal L$?

Assume that $\mathcal L$ can handle real numbers in an arbitrarily large range, with infinite precision. The programmed function $F$ is of finite length.
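Since $F$ has finite length, the set of all possible programs can be listed exhaustively, shortest first. Here is a minimal Python sketch of that idea (the three-letter alphabet and the helper name `programs` are illustrative; a real language would use a larger, but still finite, character set):

```python
from itertools import count, product

# Toy alphabet; real source code draws from a larger but still finite set.
ALPHABET = "abc"

def programs():
    """Enumerate every finite string over ALPHABET, shortest first.

    Each finite-length "program" appears at exactly one position in the
    sequence, so the set of all such programs is countable.
    """
    for n in count(1):  # lengths 1, 2, 3, ...
        for chars in product(ALPHABET, repeat=n):
            yield "".join(chars)

gen = programs()
first = [next(gen) for _ in range(5)]
# first == ["a", "b", "c", "aa", "ab"]
```

Of course, not every string in this enumeration is a valid program, but every valid program of finite length does appear somewhere in the list.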
