The Two Types of Compiler Optimization
There are two fundamental types of optimization: elimination of work or restructuring of work. Although there is a huge list of optimizations that can be performed by compilers, all of them resolve to either not doing something or doing something in a more efficient way. Consider the snippet of code shown in Listing 2.32.
Listing 2.32 Empty Loop
for (int i=0; i<1000; i++) { }
It quite
clearly does not perform any useful work; a programmer might have included it
as a naïve delay. An optimizing compiler will eliminate the entire loop.
Consider the variant of the code shown in Listing 2.33.
Listing 2.33 Loop Containing Function Call
for (int i=0; i<1000; i++) {
    do_nothing();
}
Unless the compiler can inspect the body of the function do_nothing() or the programmer has used some other mechanism to indicate to the compiler that the code does nothing, the compiler will have to leave this loop in the code.
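One such mechanism is simply making the function body visible in the same translation unit. The sketch below is illustrative: the static inline definition and the wrapper name delay are assumptions, not part of the listing.

```c
/* Assumption: do_nothing() is defined (empty) in the same file.
   With the body visible, an optimizing compiler can see that the
   call has no effect and delete the entire loop. */
static inline void do_nothing(void) { }

void delay(void)
{
    for (int i = 0; i < 1000; i++) {
        do_nothing();   /* body visible: loop is removable at -O2 */
    }
}
```

If do_nothing() were instead defined in a separate file compiled without link-time optimization, the compiler would have to assume the call might have side effects and keep the loop.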
Listing
2.34 shows another code snippet.
Listing 2.34 Loop Containing Floating-Point Arithmetic
double total=0.0;
for (int i=0; i<1000; i++) {
    total = total + 1.0;
}
Although a human would determine that the code is equivalent to adding 1,000 to the floating-point variable total, a compiler may perform all the individual additions. This is in case there is some side effect from the floating-point computation that a different part of the code is watching for. For example, there might be a floating-point exception handler that gets triggered by the computation of this loop. By default, some compilers will not eliminate this code, but when given suitable compiler flags, the compiler will remove the loop. This demonstrates that the compiler needs to be given appropriate instructions by the user in order for it to perform all the optimizations it is capable of.
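As a minimal sketch of the transformation (the function names are illustrative; -ffast-math is the relaxed floating-point flag on GCC and Clang that permits this kind of folding), the loop and its folded form produce the same result here because each addition of 1.0 is exact in a double:

```c
/* The loop as written: 1000 individual additions, each of which
   could in principle raise a floating-point exception. */
double total_loop(void)
{
    double total = 0.0;
    for (int i = 0; i < 1000; i++) {
        total = total + 1.0;
    }
    return total;
}

/* What a human (or a compiler under relaxed floating-point
   semantics) reduces the loop to. */
double total_folded(void)
{
    return 1000.0;
}
```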
The other
fundamental type of optimization is an improvement in the efficiency of the
operations. In the section “Array Access Patterns,” we saw a potential example
where the compiler could interchange two loops in order to improve the pattern
of memory accesses. This improved the performance of the code by reducing the
memory access costs. Another example of this improvement in the efficiency of
the code is where one operation can be replaced by a less expensive one. This
is called strength reduction. A good
demonstration of strength reduction is replacing integer division by a power of
two with a shift operation. The code in Listing 2.35 contains an integer
division by two.
Listing 2.35 Code with Opportunity for Optimization
unsigned int b;
...
unsigned int a = b/2;
An
integer division by two can be replaced by a shift right operation, as shown in
Listing 2.36.
Listing 2.36 Code After Optimization
unsigned int b;
...
unsigned int a = b>>1;
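This replacement is valid here because b is unsigned. For signed integers the two forms differ on negative odd values: division truncates toward zero, while an arithmetic right shift rounds toward negative infinity, so the compiler must emit a correction for negative operands. A quick check of the unsigned case (the function names are illustrative):

```c
/* Division by a power of two. */
unsigned int half_div(unsigned int b)
{
    return b / 2;
}

/* Strength-reduced form: shift right by one. For unsigned
   operands the two functions agree for every input. */
unsigned int half_shift(unsigned int b)
{
    return b >> 1;
}
```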
Another common optimization is to replace conditional code with logical operations. The performance gains come from the ease with which the processor is able to execute the resulting code. The improved sequence may eliminate branch instructions, and therefore branch misprediction stalls, or the new sequence may need fewer instructions. Listing 2.37 shows code with an opportunity for the replacement of conditional code with logical operations.
Listing 2.37 Conditional Code
if ( (value & 1) == 1 )
{
    odd = 1;
}
else
{
    odd = 0;
}
The code shown in Listing 2.37 can be replaced by the equivalent code
shown in Listing 2.38.
Listing 2.38 Logical Operations
odd = value & 1;
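Wrapping the two listings as functions (the names are illustrative) makes it easy to confirm that the branchless form agrees with the conditional form for any input:

```c
/* Branching version, from Listing 2.37. */
int odd_branch(unsigned int value)
{
    int odd;
    if ((value & 1) == 1)
    {
        odd = 1;
    }
    else
    {
        odd = 0;
    }
    return odd;
}

/* Branchless version, from Listing 2.38: no comparison,
   no branch, just a single bitwise AND. */
int odd_logical(unsigned int value)
{
    return value & 1;
}
```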