## Best-case analysis is bogus

One deals with worst-case analysis of algorithms on a day-to-day basis, especially when working with systems that have real-time constraints, such as CEP. As I’ve written in the past, real-time systems do not necessarily need to run fast; they can take days to finish. But if a real-time system states that it will finish a task in 1 hour, then it must. To make such timing assertions with certainty, the real-time system developer must know the worst-case time of every algorithm that makes up the task.

As an example, what is the worst-case cost of inserting 1 element into a sorted linked list of size n? A naive approach is to start at the first position and compare elements until you find one that is larger than the element being inserted, then splice the new node in at that position. The worst-case scenario is when the element to be inserted is larger than every element in the list: the cost is then n node traversals (i.e. n*c1) plus the cost of the actual insertion at the end (i.e. c2). Eliminating the constants, we arrive at an asymptotic worst-case cost of n.
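The naive insertion described above can be sketched in Python (the `Node` class and function names are illustrative, not from any particular library):

```python
class Node:
    """A node in a singly linked list."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def insert_sorted(head, value):
    """Insert value into a sorted linked list and return the new head.

    Worst case: value is larger than every element, so we traverse
    all n nodes before splicing at the end -- asymptotically O(n).
    """
    new_node = Node(value)
    # New smallest element (or empty list): insert at the front.
    if head is None or value < head.value:
        new_node.next = head
        return new_node
    # Walk until the next node is larger than the new value.
    current = head
    while current.next is not None and current.next.value <= value:
        current = current.next
    # Splice the new node in after `current`.
    new_node.next = current.next
    current.next = new_node
    return head

# Usage: build a sorted list one insertion at a time.
head = None
for v in [3, 1, 5, 2, 4]:
    head = insert_sorted(head, v)
```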

What about the best-case analysis of this algorithm?

Well, I believe that best-case analysis is rather bogus. It has very limited analytical and statistical value: it adds no insight to the analysis of the algorithm, and it rarely represents a real-world case.

However, the biggest problem with best-case algorithm analysis is that the result can be manufactured. Testing for the best-case scenario means you get to choose your best input, which allows the algorithm developer to hard-code the algorithm for this well-known best input. For example, the algorithm could do the following:

```
if (bestInput) then return x;

// otherwise do proper work
```

In other words, you could have a very complex algorithm whose worst-case cost is n², but whose best-case cost, for this one selected input, is 1!
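To make the trick concrete, here is a hedged sketch of such a rigged algorithm in Python (`BEST_INPUT` and `rigged_sort` are invented names for illustration): a quadratic selection sort whose "best case" is manufactured by hard-coding a check for one chosen input.

```python
# The hand-picked "best" input the developer chose to optimize for.
BEST_INPUT = [1, 2, 3, 4, 5]

def rigged_sort(items):
    """Sort a list of numbers.

    Worst case: O(n^2) selection sort.
    "Best" case: O(1) -- but only because we hard-coded a check
    for one well-known input, which is exactly the manufactured
    best-case analysis the text warns about.
    """
    if items == BEST_INPUT:
        return list(items)  # cost 1 for this one selected input

    # Otherwise do proper work: an O(n^2) selection sort.
    result = list(items)
    for i in range(len(result)):
        m = min(range(i, len(result)), key=result.__getitem__)
        result[i], result[m] = result[m], result[i]
    return result
```

The algorithm is perfectly valid; the point is that claiming a best-case cost of 1 for it says nothing about how it behaves on any input other than the one it was rigged for.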

Is this valid? Well, it is a valid algorithm, but, in my opinion, a bogus analysis.