
Is it a good idea to include .cpp files instead of .h files so that vanilla gcc can optimize my code more?

Is it a good idea to use #include "randombytes.cpp" instead of randombytes.h in my project (where randombytes.cpp is a file in my project's source code directory) for runtime speed reasons? randombytes.cpp would look like this:

#ifndef RANDOMBYTES_INCLUDED
#define RANDOMBYTES_INCLUDED

/* include native headers here */

unsigned char *fetch_random_bytes(int amount);

/* include other parts of my project here if necessary */

unsigned char *fetch_random_bytes(int amount) {
  // do stuff
}

#endif

This should also work for files that depend on each other, and so on, right? Can you think of any cases where this won't work, or where I won't get the optimization benefit?

This practice is called a "unity build" (google it) and is generally not a good idea for anything but trivial projects, since you will need to recompile the entire project every time you make a single change. That can mean minutes of waiting every time you fix a tiny error.
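For illustration, a unity build usually means a single master translation unit that #includes every other source file; the file names below (unity.cpp, randombytes.cpp, main.cpp) are just placeholders for your project's sources, a minimal sketch rather than a prescribed layout:

// unity.cpp -- the only file handed to the compiler
// (the include guards in each .cpp prevent double inclusion)
#include "randombytes.cpp"
#include "main.cpp"

The whole project is then compiled as one translation unit, e.g. g++ -O2 unity.cpp -o app, which is exactly why touching any included file forces a full recompile.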

As for runtime performance, the result is not very different from compiling normally with Link-Time Optimization (LTO) enabled.
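For comparison, LTO gives the compiler a similar whole-program view without merging source files. A typical gcc invocation would look like this (standard flags; file names are assumed from the example above):

g++ -O2 -flto -c randombytes.cpp -o randombytes.o
g++ -O2 -flto -c main.cpp -o main.o
g++ -O2 -flto randombytes.o main.o -o app

With -flto the optimizer can inline across translation units at link time, while each .cpp still recompiles independently after a change.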

Yes, this is in general a technique called "unity builds", and it helps the inlining process (if the compiler is smart enough). However, there is a disadvantage if you have duplicate functions with internal linkage (i.e. functions that exist only in a .cpp file and are not declared in any .h): once everything is merged into one translation unit, those can produce hard-to-debug compile errors, although that can be avoided with careful and consistent naming conventions.
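To make that failure mode concrete, here is a hypothetical sketch (the file names and the helper name clamp are invented for illustration). Two .cpp files each define a file-local helper with the same name, which is perfectly legal as separate translation units but becomes a redefinition error once a unity build merges them:

// a.cpp
static int clamp(int v) { return v < 0 ? 0 : v; }    // file-local helper
int positive_part(int v) { return clamp(v); }

// b.cpp
static int clamp(int v) { return v > 255 ? 255 : v; } // same name, different meaning
int byte_clamp(int v) { return clamp(v); }

// unity.cpp -- compiling this fails: redefinition of 'clamp'
#include "a.cpp"
#include "b.cpp"

Helpers in anonymous namespaces hit the same problem for the same reason: in a single merged translation unit, both definitions land in the same scope.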

Some reading regarding this:

The benefits / disadvantages of unity builds?

http://www.gmixer.com/archives/46
