
"#define" vs "#define 1"

The 1 seems unnecessary (and possibly misleading) in the following example, but I have seen it multiple times in code that is later checked with #ifdef:

#ifndef __NEWLIB_H__

#define __NEWLIB_H__ 1

Is there a difference or reason for using the above versus a plain #define __NEWLIB_H__ ?

1 is true, so you can use the macro in an #if test. That's not usually very useful for header guards, but it certainly doesn't hurt. For other macros which might be tested in boolean expressions, the true value is definitely useful.

Some people just like the consistency. And that's the definition that gcc chooses by default if you put -D TESTME on the command line.
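
To make that concrete, here is a minimal sketch (FEATURE_LOGGING is a made-up macro name) of why a true value is handy when the macro is tested with #if rather than #ifdef; building with gcc -D FEATURE_LOGGING would give you the same definition of 1:

#include <iostream>

#define FEATURE_LOGGING 1    // same definition "gcc -D FEATURE_LOGGING" produces by default

int main()
{
#if FEATURE_LOGGING          // true because the macro expands to 1
    std::cout << "logging enabled\n";
#endif
#ifdef FEATURE_LOGGING       // would also pass even if the macro had no value
    std::cout << "macro is defined\n";
#endif
    return 0;
}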

However,

#define __NEWLIB_H__ 1

is not cool unless it's in an implementation of the standard library, because names starting with two underscores (or an underscore and a capital letter) are reserved for use by the implementation, and should never be used in portable applications.
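
A portable header guard therefore avoids the reserved prefix. A sketch (the file name newlib.h and the guard name NEWLIB_H_INCLUDED are just illustrative):

// newlib.h -- the guard name has no leading underscores, so it is safe
// to use outside of a standard library implementation
#ifndef NEWLIB_H_INCLUDED
#define NEWLIB_H_INCLUDED 1

void do_something();   // hypothetical declaration protected against double inclusion

#endif // NEWLIB_H_INCLUDED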

When used purely as an #include guard, there is no difference between

#ifndef __NEWLIB_H__
#define __NEWLIB_H__

and

#ifndef __NEWLIB_H__
#define __NEWLIB_H__ 1

However, in general, there is a distinction.

Defining ABCD with no value, as in the snippet below, gives a compiler error, because ABCD expands to nothing in the #if expression and leaves == 1 with no left operand:

#include <iostream>

#ifndef ABCD
#define ABCD
#endif

int main()
{
#if defined(ABCD) && (ABCD == 1)
   std::cout << "ABCD is 1\n";
#else
   std::cout << "ABCD is not 1\n";
#endif
}

Defining ABCD as 1, as in the next snippet, outputs the string "ABCD is 1":

#include <iostream>

#ifndef ABCD
#define ABCD 1
#endif

int main()
{
#if defined(ABCD) && (ABCD == 1)
   std::cout << "ABCD is 1\n";
#else
   std::cout << "ABCD is not 1\n";
#endif
}

Defining ABCD as 10, as in the last snippet, outputs the string "ABCD is not 1":

#include <iostream>

#ifndef ABCD
#define ABCD 10
#endif

int main()
{
#if defined(ABCD) && (ABCD == 1)
   std::cout << "ABCD is 1\n";
#else
   std::cout << "ABCD is not 1\n";
#endif
}

#define by itself will replace the symbol with nothing.

On the other hand, a #define with a value, like the 1 you mention, will replace the symbol with that value everywhere it is found in the file. So, for example, the following code:

#include <iostream>

#define EXAMPLE "hello"

int main()
{
    std::cout << EXAMPLE;

    return 0;
}

prints

hello

This is because EXAMPLE here is replaced with "hello", making the print statement equivalent to:

std::cout << "hello";

If we change the #define statement to this instead:

#define EXAMPLE

This will give a compile error:

main.cpp: In function ‘int main()’:
main.cpp:15:25: error: expected primary-expression before ‘;’ token
     std::cout << EXAMPLE;

As to why you would ever use this second form of #define, it's because there is another preprocessor directive you can use, called #ifdef:

#include <iostream>

#define EXAMPLE

int main()
{
#ifdef EXAMPLE
    std::cout << "EXAMPLE defined.";
#endif

    return 0;
}

This will print:

EXAMPLE defined.

Because #ifdef (and its relative #ifndef) only requires that the symbol be defined, we don't really need to give it a value. It just needs to be there to work.

A common place you see this kind of thing is with header guards (which is probably what you are seeing). You can also see it with platform identification, or even to determine whether the compiler is compiling C++ or C.
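
As a rough sketch of those other uses, the same #ifdef / #if defined pattern is applied to macros the compiler predefines; _WIN32, __linux__ and __cplusplus are common examples, though exactly which macros are predefined varies between compilers:

#include <iostream>

int main()
{
#ifdef __cplusplus
    // __cplusplus is predefined whenever the file is compiled as C++
    std::cout << "Compiled as C++ (standard " << __cplusplus << ")\n";
#endif

#if defined(_WIN32)
    std::cout << "Windows build\n";
#elif defined(__linux__)
    std::cout << "Linux build\n";
#else
    std::cout << "Some other platform\n";
#endif
    return 0;
}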
