Program a mathematical function

Suppose you have a programming language $\mathcal{L}$ designed to perform calculations with real numbers efficiently. In particular, it allows for the definition of (computable, deterministic) functions that take one number as an argument and return a number:

define F(x):
   y = ...;
   return y;
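
For concreteness, here is a minimal sketch of what such a function might look like in Python, used purely as a stand-in for $\mathcal{L}$ (an assumption of this illustration: Python's floats are finite-precision, unlike the exact reals $\mathcal{L}$ is assumed to handle):

def F(x):
    # Deterministic: one real argument in, one real value out.
    # Here F models the mathematical function f(x) = x^2 + 1.
    y = x * x + 1
    return y

print(F(2.0))  # prints 5.0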

Let $f: \mathbb{R} \to \mathbb{R}$ be an arbitrary mathematical function, mapping real numbers to real numbers. What is the probability that $f$ can be modeled using a function programmed in $\mathcal{L}$?

Assume that $\mathcal{L}$ can handle real numbers in an arbitrarily large range, with infinite precision. The programmed function $F$ is of finite length.
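
One standard way to compare the two collections, assuming programs in $\mathcal{L}$ are written over some finite alphabet $\Sigma$: every program is a finite string over $\Sigma$, so

$$\bigl|\{\text{programs in } \mathcal{L}\}\bigr| \le |\Sigma^{*}| = \aleph_0, \qquad \bigl|\mathbb{R}^{\mathbb{R}}\bigr| = \bigl(2^{\aleph_0}\bigr)^{2^{\aleph_0}} = 2^{2^{\aleph_0}}.$$

Countably many programs can model at most countably many distinct functions.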
