Code:

#include <stdio.h>

int main(void) {
    int a;
    int b;
    char test[011];   /* note: 011 is an octal literal, i.e. 9 chars */
    a = 0x41414141;
    b = 0x42424242;
    return 0;
}
GDB output:
(gdb) x/s &a
0x7fffffffde1c: "AAAA@\336\377\377\377\177"
(gdb) x/s &b
0x7fffffffde18: "BBBBAAAA@\336\377\377\377\177"
In the code, a is initialised with AAAA and b with BBBB. I need to know the following: why does the location of b show BBBBAAAA instead of the BBBB it is supposed to hold?
It doesn't. The location of b holds 0x42424242 (when interpreted as an int). But by running x/s &b (as opposed to print b) you are telling gdb to print a string starting at the location of b, rather than to print the int stored there.
Here b happens to sit at 0x7fffffffde18, four bytes below a at 0x7fffffffde1c, so a string read starting at &b runs straight from b's bytes into a's. It so happens that the bytes stored at the location of b look like "BBBB" when interpreted as ASCII, the four bytes of a after that look like "AAAA", then there is an "@", and then some bytes that aren't printable characters, which gdb prints as escape codes instead. Finally there's a 0 byte, which marks the end of the string.
What does the @\336\377\377\377\177 signify?
@ is the character @ (byte 0x40). \336, \377, and \177 are escape codes: the bytes following the @ aren't displayable characters, so gdb prints them as octal escape codes instead (using C syntax).