The sole purpose of references is aliasing. Initializing a reference to const int with an integer literal seems absurd, since it is not an alias of anything (and it doesn't give an error!). I suppose it is similar to defining a const int itself. Is there any difference?
Within a function body or at file scope, the only difference is decltype(x): in one case it is int const, in the other int const&.
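A minimal sketch of that difference (the names x and y here are just placeholders for illustration):

#include <type_traits>

const int& x = 7;   // decltype(x) is int const&
const int  y = 7;   // decltype(y) is int const

static_assert(std::is_same<decltype(x), const int&>::value, "x is a reference");
static_assert(std::is_same<decltype(y), const int>::value,  "y is an object");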
The const int& x = 7; creates a temporary anonymous int with value 7. It then binds the reference x to it. The lifetime of the temporary is then extended to that of the reference. This is basically indistinguishable from x being the name of a const int with value 7.
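A quick block-scope sketch of that extension (demo and p are hypothetical names):

void demo() {
    const int& x = 7;    // temporary int with value 7, bound to x and kept
                         // alive until the end of demo()
    const int* p = &x;   // fine: p points at the extended temporary
    // x and *p stay valid (and equal to 7) for the rest of the block
}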
An exception to it being nigh identical is when the binding occurs within an object's constructor as part of member initialization. In that case, the lifetime is not extended.
I suspect you can induce this with:
struct Foo {
    int const& x = 7;
    Foo() {}
};
Either the above syntax is illegal or it dangles (I do not recall if there is a corner case in the standard for references), while:
struct Foo {
    int const x = 7;
    Foo() {}
};
is both legal and does not dangle. So there is a difference.
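For completeness, here is a sketch of the related case mentioned above, where the binding happens in the constructor's mem-initializer list instead of a default member initializer (Bar is a hypothetical name). As far as I recall, the temporary is not lifetime-extended there, so newer standards reject it outright and older ones left it dangling:

struct Bar {
    int const& x;
    Bar() : x(7) {}   // binds x to a temporary in the mem-initializer list;
                      // no lifetime extension applies here, so this is
                      // ill-formed in C++14 and later (and dangled in C++11)
};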
There would also be a difference as a parameter to a function, where =7 simply provides a default argument.
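A sketch of that parameter case (f and g are hypothetical names):

void f(const int& x = 7) {}   // =7 is only a default argument
void g(const int  x = 7) {}   // same idea, but x is the function's own copy

int main() {
    f();      // x binds to a temporary 7 that lives for the duration of the call
    f(42);    // x binds to the argument instead of the default
    g();      // x is initialized to 7
}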