Sometimes we think a certain thing is simple, but in the end it turns out it’s not as simple as we thought. I mean, BinarySearch is pretty simple, right? Every competent programmer can implement a binary search algorithm. Everyone knows that InsertionSort sucks and QuickSort shines, yep?
Sometimes we tend to implement ourselves what has already been implemented before us. In most cases, this is a very bad idea. But I know many people who like to reinvent the wheel, hoping that their wheel will be the best one. Yes, programmers are often very self-indulgent, and they feel like the smartest guys in the room.
But let’s look reality right in the eye. Are you sure you can implement binary search flawlessly, without bugs?
Before answering, consider the following fact: “the first binary search paper was published in 1946; the first bug-free one in 1962. A bug in Java’s Arrays.binarySearch() was discovered in 2006 (an integer overflow when calculating the midpoint of the range you’re dividing the search over).” All those people were not stupid; they were just… people. And people are very error-prone. You could say this actually proves it’s meaningful to re-implement such algorithms, but I’ll say NO. You should understand what’s going on under the hood of a specific implementation and try not to re-implement it if you can. Nowadays you can find bug-free implementations of the classical algorithms for the most popular platforms, such as Java or .NET. Use them. Check the system (platform, native) implementation if you can, and search for information about the built-in implementations — you may find something valuable.
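To make the overflow concrete, here is a minimal sketch of a binary search with the safe midpoint computation (the class and method names are mine, not the JDK’s; only the `-(insertionPoint + 1)` convention mirrors what Arrays.binarySearch documents):

```java
// Sketch of an overflow-safe binary search on a sorted int array.
public class SafeBinarySearch {
    public static int search(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            // int mid = (lo + hi) / 2;   // BUG: lo + hi can overflow int
            int mid = lo + (hi - lo) / 2; // safe: intermediate never exceeds hi
            if      (a[mid] < key) lo = mid + 1;
            else if (a[mid] > key) hi = mid - 1;
            else                   return mid;
        }
        return -(lo + 1); // encode insertion point, as Arrays.binarySearch does
    }
}
```

With arrays over a billion elements long, `lo + hi` can exceed `Integer.MAX_VALUE` and go negative; `lo + (hi - lo) / 2` cannot.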
You are encouraged to know whether the specific QuickSort implementation you are going to use is based on 3-way partitioning, whether it is stable (by default QuickSort is not), and whether the input is shuffled before sorting. If it doesn’t use 3-way partitioning or shuffling, you could hit inputs on which your QuickSort takes N^2 time to execute. Is it sped up by switching to InsertionSort on small subarrays (around 7 items)? That cutoff alone can speed up the algorithm by about 20%. Great minds have struggled with all kinds of input data for specific sorting algorithms; this topic is not as simple as it seems. The C implementation of qsort was still taking N^2 time on certain inputs in 1990 — and by then, about 31 years had passed since the invention of QuickSort. Just think of it.
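The three defenses mentioned above can be sketched together; treat this as an illustration, not a library-grade sort — the cutoff constant and pivot choice are illustrative assumptions:

```java
import java.util.Random;

// Sketch of a "defensive" quicksort: shuffle first to defeat adversarial
// (e.g. already-sorted) input, 3-way partitioning so runs of equal keys
// don't degrade to N^2, and an insertion-sort cutoff for tiny subarrays.
public class RobustQuickSort {
    private static final int CUTOFF = 7;          // illustrative threshold
    private static final Random RND = new Random();

    public static void sort(int[] a) {
        shuffle(a);                               // randomize the input order
        sort(a, 0, a.length - 1);
    }

    private static void sort(int[] a, int lo, int hi) {
        if (hi - lo + 1 <= CUTOFF) { insertionSort(a, lo, hi); return; }
        // 3-way partition around pivot = a[lo]:
        //   a[lo..lt-1] < pivot, a[lt..gt] == pivot, a[gt+1..hi] > pivot
        int lt = lo, gt = hi, i = lo + 1, pivot = a[lo];
        while (i <= gt) {
            if      (a[i] < pivot) swap(a, lt++, i++);
            else if (a[i] > pivot) swap(a, i, gt--);
            else                   i++;
        }
        sort(a, lo, lt - 1);                      // equal block is already placed
        sort(a, gt + 1, hi);
    }

    private static void insertionSort(int[] a, int lo, int hi) {
        for (int i = lo + 1; i <= hi; i++)
            for (int j = i; j > lo && a[j] < a[j - 1]; j--)
                swap(a, j, j - 1);
    }

    private static void shuffle(int[] a) {        // Fisher-Yates shuffle
        for (int i = a.length - 1; i > 0; i--)
            swap(a, i, RND.nextInt(i + 1));
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }
}
```

Note that the shuffle trades determinism for a probabilistic guarantee: no fixed input can reliably trigger the quadratic case.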
Reimplementing MergeSort, you can make it unstable with a single character: break ties toward the right run instead of the left when merging — i.e. use “<” (“>”) where “<=” (“>=”) belongs, or vice versa, depending on how the comparison is written. And don’t forget to allocate the auxiliary array outside the recursive procedure, unless you want your program’s memory traffic to skyrocket.
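A minimal sketch of both points (my own naming; the comparison direction shown is one common way to write the merge): the “<=” takes from the left run on equal keys, which is exactly what keeps the sort stable, and the auxiliary array is allocated once, outside the recursion.

```java
// Sketch of a stable top-down merge sort with a single auxiliary allocation.
public class StableMergeSort {
    public static void sort(int[] a) {
        int[] aux = new int[a.length];   // allocated once, outside the recursion
        sort(a, aux, 0, a.length - 1);
    }

    private static void sort(int[] a, int[] aux, int lo, int hi) {
        if (lo >= hi) return;
        int mid = lo + (hi - lo) / 2;
        sort(a, aux, lo, mid);
        sort(a, aux, mid + 1, hi);
        merge(a, aux, lo, mid, hi);
    }

    private static void merge(int[] a, int[] aux, int lo, int mid, int hi) {
        System.arraycopy(a, lo, aux, lo, hi - lo + 1);
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if      (i > mid)          a[k] = aux[j++]; // left run exhausted
            else if (j > hi)           a[k] = aux[i++]; // right run exhausted
            else if (aux[i] <= aux[j]) a[k] = aux[i++]; // "<=": ties go left => stable
            else                       a[k] = aux[j++];
        }
    }
}
```

Flipping that one comparison to “<” would make ties go to the right run, silently destroying stability while still producing sorted output — which is why the bug is so easy to miss in tests.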
The bottom line of the post is: don’t boast, be humble, think about what you’re actually using, test it, try to use the best implementations (bearing in mind what the characteristics of a given implementation mean), and don’t reinvent the wheel.
Take my course “Algorithms and Data Structures in C#: Complete Tutorial”
I have a course on Udemy, “Algorithms and Data Structures in C#: Complete Tutorial”. You can take it for just 9.99!