
C++11: the types of decltype((x)) and decltype((x+1)) are different?

I was thinking that decltype((x)) gives a reference type (int&), but an experiment showed something else:

#include <stdio.h>

int main() {
    int x = 0;

    decltype((x)) r = x;
    r = 1;
    printf("%d\n", x);

    decltype((x + 1)) i = x;
    i = 2;
    printf("%d\n", x);

    decltype((1)) k = x;
    k = 3;
    printf("%d\n", x);
    return 0;
}

I was expecting that (x), (x+1), and (1) would all give an "int&". However, the output was:

1
1
1

I expected the output to be 1, 2, 3, since each decltype would retrieve a reference, but it seems only the first one works: both (x+1) and (1) give plain int, not int&. Why? Are (x) and (x+1) different kinds of expressions?

x + 1 is a prvalue, so its decltype is just int. By contrast, x is an id-expression, so (x) is an lvalue and its decltype is int&. (There is a special rule by which the decltype of an unparenthesized id-expression, e.g. x, is the declared type of the variable; you have to parenthesize the expression to get at the value category of the expression.)
