Posts

Showing posts with the label performance

Performance impact of sorted arrays.

Computer performance is filled with unexpected results. This morning on Stack Overflow I found such a result and decided to replicate it in C#: take an array of random integers and sum only the big integers. Next, sort the array and re-sum the big integers. Does sorting the array affect the time taken to compute the aggregation? Let's code it up. The results: wow, it's 5 times faster to perform this sum over a sorted array than over an unsorted array. What's going on? Branch prediction! If you've never heard of branch prediction, check out the Stack Overflow answer that inspired this post. You can quickly replicate this experiment using LinqPad and MeasureIt.Net by loading this file.
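The post's own code isn't shown in this excerpt, so here is a minimal sketch of the experiment; the 128 threshold matches the original Stack Overflow question, while the array size and iteration count are my own assumptions:

```csharp
using System;
using System.Diagnostics;
using System.Linq;

class SortedSumDemo
{
    // Sum only the "big" elements (>= 128, as in the Stack Overflow question).
    static long SumBig(int[] data)
    {
        long sum = 0;
        for (int i = 0; i < data.Length; i++)
        {
            if (data[i] >= 128)   // this branch is what the CPU tries to predict
                sum += data[i];
        }
        return sum;
    }

    static void Main()
    {
        var rand = new Random(42);
        var unsorted = Enumerable.Range(0, 1_000_000).Select(_ => rand.Next(256)).ToArray();
        var sorted = (int[])unsorted.Clone();
        Array.Sort(sorted);

        // Warm up the JIT once, then time each variant.
        SumBig(unsorted); SumBig(sorted);

        var watch = Stopwatch.StartNew();
        for (int i = 0; i < 100; i++) SumBig(unsorted);
        Console.WriteLine($"unsorted: {watch.ElapsedMilliseconds} ms");

        watch.Restart();
        for (int i = 0; i < 100; i++) SumBig(sorted);
        Console.WriteLine($"sorted:   {watch.ElapsedMilliseconds} ms");
    }
}
```

On a sorted array the `>= 128` branch goes the same way for long runs, so the predictor almost always guesses right; on random data it misses roughly half the time.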

Run time costs of small operations in C#.

Thank you to the smart people who helped me understand this: Vance Morrison; Kevin Frei; Dave Detlefs; Jan Kotas; Sean Selitrennikoff; Nathan Iarovich; Dave Driver. One of my colleagues read this post and pointed out that for really small operations, e.g. {i++}, the cost of the measurement system exceeds the cost of the operation. I agree completely, and recommend using MeasureIt and reading its help if you need to measure small things. My colleague also noticed something interesting in this code (full code here; proof of concept in LinqPad using MeasureIt): class Program { static int interlocked = 0; static int staticInt = 0; static void DoNop() { } static void IncrementStaticInt() { staticInt++; } // This is an array to make it easier to debug in cdb. static NamedAction[] namedActions; static void Main(string[] args) { int loc...
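The colleague's point about measurement overhead can be seen directly: time an empty delegate, then time a delegate that only increments a static field. This is a sketch, not the post's code; the iteration count is an assumption:

```csharp
using System;
using System.Diagnostics;

class TinyOpOverheadDemo
{
    static int staticInt = 0;

    static long TimeMs(Action action, int iterations)
    {
        var watch = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++) action();
        return watch.ElapsedMilliseconds;
    }

    static void Main()
    {
        const int n = 100_000_000;
        // An empty delegate measures pure harness cost: the loop plus the delegate call.
        long nop = TimeMs(() => { }, n);
        // Adding the increment changes the number very little, so most of what
        // you "measured" for i++ was really the harness.
        long inc = TimeMs(() => staticInt++, n);
        Console.WriteLine($"empty delegate: {nop} ms, staticInt++: {inc} ms");
    }
}
```

When the two numbers are close, the harness, not the operation, is what you timed; that is why MeasureIt's approach is needed for tiny operations.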

Run time costs of common operations in C#

Disclaimers: 1) The benchmarking methodology below should not be used for very small things like i++. For measuring small things, see this. 2) Since writing the below I've modified MeasureIt for integration with LinqPad - use that instead. 3) If you still want to read this, be my guest :) Today we debated whether Enum.ToString() is too slow to be used. To my surprise, it takes 1-5 microseconds on my Core i7 laptop. If 5 microseconds is too slow for you, the internet has you covered. Naturally, I was curious about the cost of other operations, so I measured those as well. You can reproduce and extend my experiment by putting the below code into LinqPad. The output from the current code on my laptop:

Milliseconds to perform 100000 iterations
Enum.ToString() took 411 ms
NewGuid() took 31 ms
InterlockedIncrement() took 3 ms
Lock() took 3 ms
i++ took 2 ms

static TimeSpan TimeIt(int iterations, string title, Action action) { var watch = Stopwa...
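The excerpt cuts the TimeIt helper off mid-line, so here is a self-contained sketch of what such a harness and its call sites might look like; the measured operations are taken from the output above, but the exact bodies are my reconstruction:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class CommonOpCosts
{
    static TimeSpan TimeIt(int iterations, string title, Action action)
    {
        var watch = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++) action();
        watch.Stop();
        Console.WriteLine($"{title} took {watch.ElapsedMilliseconds} ms");
        return watch.Elapsed;
    }

    static int counter = 0;
    static readonly object gate = new object();

    static void Main()
    {
        const int iterations = 100_000;
        Console.WriteLine($"Milliseconds to perform {iterations} iterations");
        TimeIt(iterations, "Enum.ToString()", () => DayOfWeek.Friday.ToString());
        TimeIt(iterations, "NewGuid()", () => Guid.NewGuid());
        TimeIt(iterations, "InterlockedIncrement()", () => Interlocked.Increment(ref counter));
        TimeIt(iterations, "Lock()", () => { lock (gate) { counter++; } });
        TimeIt(iterations, "i++", () => counter++);
    }
}
```

Per disclaimer 1, treat the bottom rows with suspicion: at 2-3 ms per 100,000 iterations the harness itself dominates.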

The cloud lets you evaluate the cost of performance optimizations

One of the things I love about cloud computing is that you can put an honest price on computing time. You can then balance the human engineering time required to optimize code (and often end up with more complex code) against just paying the cloud to do it. The Zillow Rent Zestimate post speaks to this brilliantly: "We implemented the Rent Zestimation process as a software application taking input from Zillow databases and producing an output table with about 100 million rows. We deploy this software application into a production environment using the Amazon Web Services (AWS) cloud. The total time to complete a run is four hours using four Amazon EC2 instances of the Extra-Large-High-CPU type. This type of machine costs $1.16/hr. Thus, it costs us about $19 to produce 100 million Rent Zestimates which is the same as a 3-D movie ticket or about 5 gallons of gasoline in New York City today." A few things to note about this quote: If your data processing can be done...
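The quote's arithmetic is worth making explicit, since this is exactly the kind of back-of-the-envelope pricing the post advocates. A tiny sketch using the quoted figures (4 instances, 4 hours, $1.16 per instance-hour):

```csharp
using System;

class CloudCostEstimate
{
    // Cost of one run: instances * hours * price per instance-hour.
    static decimal RunCost(int instances, decimal hours, decimal dollarsPerHour)
        => instances * hours * dollarsPerHour;

    static void Main()
    {
        // Figures from the Zillow quote.
        decimal cost = RunCost(instances: 4, hours: 4m, dollarsPerHour: 1.16m);
        Console.WriteLine($"Cost per run: ${cost}");               // $18.56, i.e. "about $19"
        Console.WriteLine($"Cost per row: ${cost / 100_000_000m}"); // ~0.00000019 dollars per Zestimate
    }
}
```

At roughly 19 millionths of a cent per row, the bar an optimization must clear to beat "just rent more machines" is remarkably low.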

The Performance of Everyday Things

I've spent a lot of time fixing code optimizations that added no business value (and often no performance value either). Please do not try to make your code faster unless you need to. The way I handle performance issues on my projects: Define acceptable performance. Write my code as simply as possible. Measure performance against the definition; if performance > acceptable, goto DONE. /* Performance not acceptable */ Profile; fix as simply as possible; goto Measure. DONE To be explicit: I'm comfortable using slower patterns if they are clear and simple. As soon as I've hit my acceptable performance bar, I'm done. With that out of the way, let me discuss a performance riddle I hit this week. I was wandering through some PowerShell code that processed slews of objects (over 200K of 'em): $interestingObjects = @() foreach ($object in $inputObjects) { if ($object.IsInteresting) { $interestingObjects += $object ...
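The excerpt cuts off before the riddle's resolution, but the pattern shown is a classic one: PowerShell arrays are fixed-size, so `+=` allocates a new array and copies every existing element, making the loop O(n^2) overall. A C# sketch of that contrast (the copy-append helper mimics what `+=` does; sizes are my own choice):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class AppendCostDemo
{
    // Mimics PowerShell's `$array += $item`: allocate a bigger array, copy everything over.
    static int[] AppendByCopy(int[] array, int item)
    {
        var bigger = new int[array.Length + 1];
        Array.Copy(array, bigger, array.Length);
        bigger[array.Length] = item;
        return bigger;
    }

    static void Main()
    {
        const int n = 50_000;

        var watch = Stopwatch.StartNew();
        var copied = new int[0];
        for (int i = 0; i < n; i++) copied = AppendByCopy(copied, i); // O(n^2) total element copies
        Console.WriteLine($"copy-append: {watch.ElapsedMilliseconds} ms");

        watch.Restart();
        var list = new List<int>();
        for (int i = 0; i < n; i++) list.Add(i); // amortized O(1) per append
        Console.WriteLine($"List<T>.Add: {watch.ElapsedMilliseconds} ms");
    }
}
```

The growable-list version stays simple and fast, which fits the post's rule: write it simply first, and only reach for cleverness when the measurement says you must.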