Performance impact of sorted arrays.
Computer performance is full of unexpected results. This morning on Stack Overflow I found one such result and decided to replicate it in C#. Take an array of random integers and sum only the big integers. Next, sort the array and re-sum the big integers. Does sorting the array affect the time taken to compute the aggregation?

Let's code it up.

The results: wow, it's about 5 times faster to perform this sum over a sorted array than over an unsorted array. What's going on? Branch prediction! If you've never heard of branch prediction, check out the Stack Overflow answer that inspired this post.

You can quickly replicate this experiment using LINQPad and MeasureIt.Net by loading this file.
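The experiment described above can be sketched roughly as follows. This is a minimal reconstruction, not the original code: the array size, the random seed, and the 128 threshold for "big" integers are my assumptions (the threshold follows the original Stack Overflow question, which used byte-range values).

```csharp
using System;
using System.Diagnostics;

class SortedSumBenchmark
{
    // Sum only the "big" values. The if-statement inside the loop is the
    // branch the CPU's branch predictor has to guess on every iteration.
    static long SumBig(int[] data, int threshold)
    {
        long sum = 0;
        foreach (int value in data)
        {
            if (value >= threshold)
                sum += value;
        }
        return sum;
    }

    static void Main()
    {
        const int N = 32 * 1024 * 1024;          // assumed array size
        var rng = new Random(42);                 // fixed seed for repeatability
        var data = new int[N];
        for (int i = 0; i < N; i++)
            data[i] = rng.Next(256);              // random values in [0, 256)

        // Time the sum over the unsorted array: the branch outcome is
        // effectively random, so the predictor mispredicts ~50% of the time.
        var sw = Stopwatch.StartNew();
        long unsortedSum = SumBig(data, 128);
        sw.Stop();
        Console.WriteLine($"Unsorted: {sw.ElapsedMilliseconds} ms (sum = {unsortedSum})");

        // Sorting makes the branch outcome change only once across the whole
        // array, so the predictor is almost always right.
        Array.Sort(data);
        sw.Restart();
        long sortedSum = SumBig(data, 128);
        sw.Stop();
        Console.WriteLine($"Sorted:   {sw.ElapsedMilliseconds} ms (sum = {sortedSum})");
    }
}
```

Both runs compute the same sum; only the layout of the data changes, which is what isolates the branch-prediction effect from the arithmetic itself.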