Sometimes, an elegant algorithm in computer science works the same way a person would naturally perform a task. For example, if you were looking up a name in a phone book, you wouldn't start at the very beginning of the book. You might open it somewhere in the middle, then halve the remaining pages each time until you zeroed in on the target.

Of course, you know the phone book is alphabetized, so if the name started with a "B," you might open it toward the beginning. But imagine you didn't know how many names there were for each letter: it's possible that names starting with "A" take up a majority of the book. In that case, you might still open the book to about the center and go from there. For the rest of this post, let us assume we don't know the distribution of the numbers in an array, only that it is sorted.

When you looked up the name in the phone book this way, you were roughly doing a "binary search."

Here is a table that shows how a computer might process the steps:
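Those steps can be traced directly in code. Here is a minimal sketch in Python (the sorted array and the target value are illustrative, chosen just to show the trace):

```python
def binary_search_trace(arr, target):
    """Binary search that prints each step: the current search range,
    the middle element checked, and the result of the comparison."""
    lo, hi = 0, len(arr) - 1
    step = 1
    while lo <= hi:
        mid = (lo + hi) // 2
        print(f"step {step}: indices {lo}-{hi}, checking arr[{mid}] = {arr[mid]}")
        if arr[mid] == target:
            print(f"found {target} at index {mid}")
            return mid
        elif arr[mid] < target:
            lo = mid + 1  # target can only be in the right half
        else:
            hi = mid - 1  # target can only be in the left half
        step += 1
    return -1  # target is not in the array

# Illustrative run: search a sorted array of 8 numbers for 37
binary_search_trace([2, 7, 19, 25, 37, 52, 86, 91], 37)
```

Each printed line corresponds to one row of the table: the search range shrinks by half at every step.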

A "brute-force" method, which checks each number starting from the beginning, takes on the order of N/2 comparisons on average, where N is the number of elements in the array. Guessing positions at random (without repeating a guess) until we find the target also takes on the order of N/2 comparisons on average.
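For comparison, both brute-force approaches can be sketched like this (a minimal sketch in Python; the function names are my own, not standard library calls):

```python
import random

def linear_search(arr, target):
    """Brute force: check each element starting from the beginning."""
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1

def random_search(arr, target):
    """Guess positions at random, never repeating a guess,
    until we hit the target."""
    for i in random.sample(range(len(arr)), len(arr)):
        if arr[i] == target:
            return i
    return -1
```

Neither version can exploit the fact that the array is sorted, which is exactly what binary search does.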

However, binary search halves the set that we must search at each step, so it takes only on the order of lg N comparisons. For illustration, if there were 10,000,000 numbers, the brute-force method would take 5,000,000 steps on average, whereas binary search would take at most about 23 steps!
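Those figures fall straight out of the arithmetic, which a couple of lines of Python can confirm:

```python
import math

n = 10_000_000
print(n / 2)         # brute force: 5,000,000.0 comparisons on average
print(math.log2(n))  # binary search: about 23.25 halvings to narrow to one element
```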

It is worth thinking about efficient algorithms, because even the fastest hardware running a poor algorithm will lose to modest hardware running a great one.

As always, let’s let the code do the talking!
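Here is one way binary search might look, a sketch in Python (for production code, Python's standard library offers the `bisect` module):

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1  # discard the left half
        else:
            hi = mid - 1  # discard the right half
    return -1

# Example usage with an illustrative sorted array
numbers = [3, 14, 15, 35, 65, 89, 97]
print(binary_search(numbers, 35))  # → 3
```

Note that `lo <= hi` (not `<`) is what lets the loop handle a one-element range correctly, a classic source of off-by-one bugs in this algorithm.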