Code Optimization

Aug 17 2017

Definition and Properties

Code optimization is any method of modifying code to improve its quality and efficiency. A program may be optimized so that it becomes smaller, consumes less memory, executes more rapidly, or performs fewer input/output operations.

The basic requirement that optimization methods must satisfy is that an optimized program has the same output and side effects as its non-optimized version. This requirement, however, may be relaxed if the benefit from optimization is estimated to be more important than the probable consequences of a change in the program's behavior.
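
As a small illustration (a hypothetical snippet, not part of the original text), consider floating-point summation: IEEE addition is not associative, so an optimizer that reorders the additions (for instance, when a compiler is allowed to relax floating-point semantics with a flag such as -ffast-math) can change the program's output, which is exactly the kind of behavioral change this requirement is meant to rule out.

#include <cstdio>

int main() {
  // With IEEE doubles, the order of additions matters:
  double a = 1e16, b = -1e16, c = 1.0;
  std::printf("%g\n", (a + b) + c);  // prints 1
  std::printf("%g\n", a + (b + c));  // prints 0: b + c rounds back to -1e16
}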

Types and Levels of Optimization

Optimization can be performed by automatic optimizers or by programmers. An optimizer is either a specialized software tool or a built-in unit of a compiler (the so-called optimizing compiler). Modern processors can also optimize the execution order of code instructions.

Optimizations are classified into high-level and low-level optimizations. High-level optimizations are usually performed by the programmer, who handles abstract entities (functions, procedures, classes, and so on) and keeps in mind the general framework of the task in order to optimize the design of the system. Optimizations performed at the level of elementary structural blocks of source code (loops, branches, and so on) are usually referred to as high-level optimizations as well, while some authors place them in a separate ("middle") level (N. Wirth?). Low-level optimizations are performed at the stage when source code is compiled into a set of machine instructions, and it is at this stage that automated optimization is usually employed. Assembler programmers believe, however, that no machine, however perfect, can do this better than a skilled programmer (yet everybody agrees that a poor programmer will do much worse than a computer).
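
To make the distinction more concrete, here is a small sketch (an assumed example, not from the original text): hoisting a loop-invariant computation out of a loop is a typical source-level, high-level optimization done by the programmer, while turning the index arithmetic and the loop itself into efficient machine code is the kind of low-level work an optimizing compiler does on its own.

#include <cmath>
#include <cstddef>
#include <vector>

void scale_by_cos(std::vector<double>& v, double angle) {
  // High-level: the programmer hoists the loop-invariant std::cos(angle)
  // instead of recomputing it on every iteration, as in:
  //   for (std::size_t i = 0; i < v.size(); ++i) v[i] *= std::cos(angle);
  const double k = std::cos(angle);
  for (std::size_t i = 0; i < v.size(); ++i)
    v[i] *= k;  // low-level: the compiler optimizes the indexing and the loop
}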

What to Optimize

With manual code optimization, one faces another problem: one needs to know not only how exactly optimization should be done, but also which particular part of the program should be optimized. Due to various reasons (slow input operations, the difference in the working speed of a human operator and a computer, and so on), 90% of the execution time of a program is spent executing only 10% of the code (this statement is rather speculative, with the Pareto principle as a quite doubtful ground, but A. Tanenbaum makes it sound convincing). Since optimization takes additional time on top of the time already spent developing the program, you'd better focus on optimizing this time-critical 10% of the code rather than try to optimize the whole program. Such code fragments are known as bottlenecks and can be detected with special utilities, profilers, which measure the time taken by various parts of a program to execute.
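
In the simplest case you can get a first impression of where the time goes with a hand-written timer, although a real profiler (gprof, perf, the Visual Studio profiler, and so on) remains the proper tool. A minimal sketch, with process_data standing in as a hypothetical suspected bottleneck:

#include <chrono>
#include <cstdio>
#include <vector>

// A deliberately heavy function standing in for a suspected bottleneck.
static double process_data() {
  std::vector<double> v(1000000, 1.5);
  double sum = 0.0;
  for (double x : v)
    sum += x * x;
  return sum;
}

int main() {
  const auto start = std::chrono::steady_clock::now();
  const double result = process_data();
  const auto stop = std::chrono::steady_clock::now();
  const std::chrono::duration<double> elapsed = stop - start;
  std::printf("process_data: result = %.1f, time = %.4f s\n", result, elapsed.count());
}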

In practice, however, optimization is usually done after the stage of "chaotic" programming (including such methods as "copy-paste", "we'll see later", and "it's OK this way"), and is therefore a mixture of optimization proper, refactoring, and bug fixing: simplification of "queer" constructs like strlen(path.c_str()), of logical conditions like (a.x != 0 && a.x != 0), and so on. Profilers are of little help with this kind of optimization. Nevertheless, you can detect these issues with static analysis tools, i.e. tools designed to search for semantic errors by relying on deep analysis of source code. As you can see from the above-mentioned example with the strange condition, inefficient code may appear as a result of errors (such as a typo in our example, where it should read a.x != 0 && a.y != 0 instead). A powerful static analyzer will detect such code fragments and draw your attention to them by producing warning messages.
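
Here is a sketch of both constructs mentioned above; the "fixed" variants are assumptions about the intended meaning, and the helper names are made up purely for illustration:

#include <cstring>
#include <string>

struct Point { int x; int y; };

// The suspicious condition: the same subexpression is tested twice,
// most likely a typo for 'a.x != 0 && a.y != 0'.
bool is_off_axes(const Point& a) {
  // return a.x != 0 && a.x != 0;   // what was written
  return a.x != 0 && a.y != 0;      // what was probably meant
}

// The inefficient construct: strlen(path.c_str()) rescans the string for '\0'
// even though std::string already stores its length.
bool path_is_set(const std::string& path) {
  // return std::strlen(path.c_str()) != 0;   // works, but needlessly O(n)
  return !path.empty();                       // equivalent and O(1)
}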

Good and Bad Outcomes of Optimization

In programming, almost everything should be treated from the viewpoint of rationality - optimization is no exception. There is a belief that code written by an inexperienced Assembler programmer is 3-5 times slower than code generated by the compiler (Zubkov). Widely known is a phrase by Knuth regarding early low-level optimizations (such as attempts to save on operators or variables): "Premature optimization is the root of all evil".

Most programmers don't complain about optimizations performed by the optimizer, some of which are conventional and obligatory, such as tail-call optimization in functional languages (tail recursion is a special case of recursion that can be represented as a loop).
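
As a rough sketch of what this optimization amounts to (an assumed C++ example, not from the original text; in C++ tail-call elimination is a common but not guaranteed compiler optimization, whereas many functional languages require it):

#include <cstdint>

// Tail-recursive: the recursive call is the last action of the function.
std::uint64_t sum_to(std::uint64_t n, std::uint64_t acc = 0) {
  if (n == 0)
    return acc;
  return sum_to(n - 1, acc + n);  // tail call
}

// The loop an optimizer can reduce the function above to,
// so that no stack frame is needed per recursion step.
std::uint64_t sum_to_loop(std::uint64_t n) {
  std::uint64_t acc = 0;
  while (n != 0) {
    acc += n;
    --n;
  }
  return acc;
}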

However, one should understand that numerous complex optimizations at the level of machine code may slow compilation down considerably. The benefit they bring may be far too insignificant compared to general system design optimizations (Wirth). One should also keep in mind that modern languages, with all their syntactic and semantic "frills", have many nuances and subtleties, so a programmer who isn't familiar with them may be surprised by the outcome of an optimization.

For example, take C++ and the so-called Return-Value Optimization, where the compiler avoids copying a temporary object returned by a function. Because the compiler omits the copy, this technique is also called "copy elision". So, the following code:

#include <iostream>

struct C {
  C() {}
  C(const C&) { std::cout << "A copy was made.\n"; }  // copy constructor with a side effect
};

C f() {
  return C();  // returns a temporary object
}

int main() {
  std::cout << "Hello World!\n";
  C obj = f();  // the copies here may be elided by the compiler
}

may have any of the following outputs:

Hello World!
A copy was made.
A copy was made.

or:

Hello World!
A copy was made.

or:

Hello World!

Strange as it may seem, all three versions are valid, because the language standard allows omitting calls to the copy constructor in such cases, even if the constructor has side effects (§12.8 Copying Class Objects, Paragraph 15).
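
If you want to observe the difference yourself, GCC and Clang provide the -fno-elide-constructors flag, which suppresses the optional elision and makes the extra "A copy was made." lines appear. Note also that the paragraph cited above refers to pre-C++17 wording: starting with C++17, elision of the temporary in this particular example is mandatory, so a C++17 compiler prints only "Hello World!" unless you compile in an earlier language mode.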

Conclusion

Thus, we should always consider optimizing program code, using specialized utilities wherever possible, yet do so with care and be prepared for the occasional unexpected trick from the compiler.

PVS-Studio

The PVS-Studio static analyzer implements a set of diagnostics that help you find situations where code can be optimized. However, PVS-Studio, like any other static analyzer, cannot serve as a replacement for profiling tools. Only dynamic program analyzers are able to identify bottlenecks: static analyzers do not know what input data a program receives or how often a particular piece of code is executed. That is why we say that the analyzer suggests "micro-optimizations" of code, which do not guarantee performance gains.
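
To give an idea of what such a "micro-optimization" looks like (an assumed example, not taken from the PVS-Studio documentation), consider a large object passed by value: an analyzer can point out that a constant reference avoids the copy without changing the function's observable behavior, while only a profiler can tell whether that copy actually matters for the program's running time.

#include <cstddef>
#include <string>
#include <vector>

// Before: the whole container is copied into the parameter on every call.
// std::size_t total_length(std::vector<std::string> items);

// After: a constant reference, so no copy is made.
std::size_t total_length(const std::vector<std::string>& items) {
  std::size_t total = 0;
  for (const auto& s : items)
    total += s.size();
  return total;
}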

Despite this limitation, the PVS-Studio analyzer is a good complement to profiling tools. Moreover, when you address PVS-Studio warnings related to optimization, the code often becomes simpler and shorter. This effect is considered in more detail in the article "Exploring Microoptimizations Using Tizen Code as an Example".
