In addition to Solution 1.

In the general case, such prediction is **theoretically impossible**. It is impossible even for systems where the timing of every logically deterministic step is fully deterministic, such as a single-thread, single-process system with known timing of each CPU instruction (in CPU cycles) and a known CPU clock frequency.

So, in such systems, you can perfectly predict timing in special (simple) cases. But why not in the general case? Because of *conditional operators* and, more generally, *instruction-flow control* statements, such as "if". Even with such statements, timing prediction is still possible in special (simple) cases. But, again, why not in the general case?

Because even the logical behavior cannot be predicted in the general case. This can be proven mathematically, based on another mathematically proven statement: the *halting problem* is *unsolvable*. The generalized statement is known as *Rice's theorem*. Please see:

- https://en.wikipedia.org/wiki/Halting_problem#Sketch_of_proof
- https://en.wikipedia.org/wiki/Rice%27s_theorem

I could base my point on *Rice's theorem*, but for simplicity, it's enough to use its special case: the fact that the halting problem is not solvable. In the general case, you cannot even predict whether some not-so-trivial algorithm will ever halt. That means the program may execute for some, possibly long, time and eventually halt, or it may never stop.

Apparently, if you cannot even tell from the algorithm whether the time is finite or infinite, you cannot predict the time. :-)
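As a minimal illustration (not a proof), consider the well-known Collatz iteration: every input ever tested terminates, but nobody has proven that it halts for all n, so there is no known general method to bound its running time short of actually running it:

```python
def collatz_steps(n: int) -> int:
    """Count iterations of the Collatz map until n reaches 1.

    The loop has always terminated for every tested input, but its
    termination for all n is an open problem -- so the step count
    (and hence the timing) cannot be predicted in general.
    """
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))  # 111 -- hard to guess without running the loop
```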

[EDIT]

Can we face or build an algorithm which always halts, whose timing is finite but still theoretically unpredictable? Yes, per Rice's theorem. If the value of some property is not deterministic, you can write something like `if (myProperty) LongMethod(); else QuickMethod();`.
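A runnable sketch of this branch (the names `LongMethod`/`QuickMethod` are the hypothetical ones from the text, with `time.sleep` standing in for a lengthy computation):

```python
import time

def long_method() -> None:
    time.sleep(0.05)  # stands in for a lengthy computation

def quick_method() -> None:
    pass              # returns almost immediately

def run(my_property: bool) -> float:
    """Time one execution; which branch runs depends on a runtime value."""
    start = time.perf_counter()
    if my_property:
        long_method()
    else:
        quick_method()
    return time.perf_counter() - start

print(run(True) > run(False))  # True: the timing depends on the property
```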

One simple practical case is the use of a *pseudo-random number generator*. Yes, its algorithm itself is deterministic, but its values won't be deterministic if you *seed* the algorithm with, for example, a value taken from the real-time clock.
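A sketch of that idea: every step below is deterministic, yet seeding from the clock makes the branch taken, and hence the running time, unknowable in advance:

```python
import random
import time

# Seed a fully deterministic PRNG with a clock value that differs
# on every run, so the generated value cannot be known in advance.
random.seed(time.time_ns())
my_property = random.random() < 0.5  # unpredictable branch condition

print(isinstance(my_property, bool))  # True
```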

Another trivial case is an interactive program. :-)

Now, even with *multi-threaded* systems, in special cases you can still predict timing, but it will be "hockey time", not "football time". Perhaps you know that in hockey the clock is stopped during all breaks, to count only the "pure" time of the game. You can precisely evaluate the timing for a single thread, not counting the time spent on interrupts and thread switching. For real time ("football time") this gives you only the __infimum__ of the possible real-time durations.

But this estimate will be valid only for a fixed CPU model and a fixed CPU clock frequency. The tick counts are available from the CPU specs, so you can sum up the tick counts for all instructions. But remember that your code may also be compatible with a similar CPU, which may have different timing. Generally, it all boils down to what you understand by "code" and "prediction". If this is code in a high-level programming language, different compilers may generate different CPU instructions.

"Prediction" is also not a well-defined category. Let's say you are calculating a generalized *Fibonacci sequence* using the *recurrence* algorithm. The algorithm is perfectly deterministic, but what would the prediction be, especially if the initial data set and the required length of the sequence are taken from input data? Each time, you can predict the behavior of the algorithm perfectly, but there is only one method of such prediction: to perform all the calculations from beginning to end with the exact same data set. Is that prediction or not? :-)
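The Fibonacci example above can be sketched like this, with the seeds and the length playing the role of "input data"; the only way to "predict" the output is to run the very same recurrence:

```python
def generalized_fib(a: int, b: int, length: int) -> list[int]:
    """Generalized Fibonacci: seeds a, b and length come from input data.

    Perfectly deterministic, yet 'predicting' element k amounts to
    performing the same calculation from the beginning.
    """
    seq = [a, b]
    while len(seq) < length:
        seq.append(seq[-1] + seq[-2])
    return seq[:length]

print(generalized_fib(1, 1, 8))  # [1, 1, 2, 3, 5, 8, 13, 21]
print(generalized_fib(2, 5, 6))  # [2, 5, 7, 12, 19, 31]
```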

Just some food for thought…

—SA