
When we take input into a string, why does the size not matter?

#include <stdio.h>

int main() {
  char b[1];
  scanf("%s",b);
  puts(b);
  // gets(b);
  // puts(b);
  // gets(b);
  // puts(b);
  return 0;
}

output:

hello_world
hello_world

I expected it to give a max-limit error, but it didn't. Why?

"give max limit error" --> C does not require any "max limit error".

C is like riding a bicycle without training wheels. If code is not well directed, it fails.

Use a larger buffer and a width limit like:

char b[100];
scanf("%99s",b);

char b[1]; says to reserve one byte of memory for b.

scanf("%s",b); says to read any number of characters (until a white-space character is seen) and put them in memory starting at b.

So this code asks to write to memory that is not reserved for b . Nothing in the C language stops you from doing that. You are allowed by the C language to ask to do “improper” things; it is deliberately a language for having low-level control of various parts of the computer. The C standard does not specify what happens when you do that; it is up to you, and other tools, to take care with what you are doing when you go beyond what the standard defines.

If writing to that memory happens not to break anything else your program needs to continue “working,” then the program might “work.” If it does break something else, the program can fail in various ways. Also note that the C standard permits the compiler to optimize your program in ways that will not affect programs that are fully defined by the C standard but that may cause surprising effects in programs that are not fully defined by the standard.
