
A weird problem with command printf

I am using the SimpleScalar toolset to run some simulations on cache associativity, and I am seeing some very strange behavior from the printf function. Here is the code snippet:

printf(" Name: %s %d %d %d \n", name, nsets, cp->set_shift, cp->set_mask);
printf(" Name: %s %d %d %d \n", name, cp->set_mask, nsets, cp->set_shift);

The printf lines are one after another, no other code in between. And here is the output:

 Name: dl1 128 5 127
 Name: dl1 127 0 128

The output of the second printf is wrong; it should be:

Name: dl1 127 128 5

Changing the relative order of the printf statements does not change the output. What part of printf am I failing to understand?

With regards, Newbie

Go look at the variable declarations. My guess is that one of them is a short or a long, not an int. Since printf can't check what you pass for validity, it decides how many bytes to grab from the stack based on the % conversions. If your arguments don't agree with the format string, you don't get a compile error, but garbage can come out.

I'd guess you have a mismatch between the type you're passing and the type you're telling printf you're passing. Specifically, it looks like cp->set_mask is some type larger than int. Since you've told printf that it's an int, it takes the first sizeof(int) bytes as the first int, and then the next sizeof(int) bytes as if they were nsets (and since you're apparently on a little-endian machine, those bytes all contain 0). Note, in particular, that after the 0 comes the 128 that the first printf says is the value of nsets.

