
Memory allocation doubt with type pointer to char

This program is supposed to prompt for the number of letters in a word (to be entered later) so it knows how much space to allocate. It seems to work OK; however, it doesn't seem to matter if I allocate less memory than the word actually needs. Is this a bug that I must correct, or is it simply how a pointer to char (char *) works?

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    unsigned int a = 0;
    printf("Enter the size of the word (0 = exit): ");
    scanf("%u", &a);                          /* %u matches unsigned int */
    if (a == 0) { return 0; }
    else
    {
        char *word = (char *)malloc(a * sizeof(char) + 1);
        if (word == NULL)
        {
            fprintf(stderr, "no memory allocated");
            return 1;
        }
        printf("Reserved %zu bytes of space (accounting for the end-character).\nEnter your word: ",
               a * sizeof(char) + 1);          /* %zu matches size_t */
        scanf("%s", word);                     /* unbounded read: can overrun the buffer */
        printf("The word is: %s\n", word);
    }

    return 0;
}

All right, I think I might have fixed it. This way, running with valgrind shows none of the errors that it showed earlier.

char aux[]="";
  scanf("%s", aux);

  if(strlen(aux)>(a*sizeof(char) + 1))
     {
  fprintf(stderr,"Word bigger than memory allocated\nExiting program\n");
  return 1;
     }
  else
     {
      strcpy(word,aux);
      printf("The word is: %s\nAnd is %d characters long\n", word, strlen(word));
     }

Now my doubt is: why can I declare an empty char array (char aux[] = "") and then use "extra" memory with no errors in the valgrind output, yet char *aux = ""; gives me a segmentation fault? I'm very new to C programming, so I'm sorry if this is an obvious/dumb question. Thanks.

It doesn't seem to matter, but it does: if you use more space than you allocated, you will eventually end up with a buffer overrun. It's possible that your current implementation allocates a bit more than what you actually request; it's also possible that it doesn't. You cannot rely on that behavior; never access or use memory that wasn't allocated.
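To avoid the overrun at the source, the read itself can be bounded. Here is a minimal sketch (my own example, not from the question; the buffer size mirrors the question's malloc call) using fgets, which never writes more than the size you pass it:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    unsigned int a = 8;                     /* example size; assume it came from the user */
    char *word = malloc(a + 1);             /* a characters + '\0' */
    if (word == NULL)
        return 1;

    if (fgets(word, (int)(a + 1), stdin) != NULL) {
        word[strcspn(word, "\n")] = '\0';   /* strip the trailing newline, if any */
        printf("The word is: %s\n", word);
    }
    free(word);
    return 0;
}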

Also, sizeof(char) == 1 by definition.
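For what it's worth, that guarantee can even be spelled out as a compile-time check; a one-line sketch assuming a C11 compiler:

#include <assert.h>   /* C11: provides the static_assert macro */

static_assert(sizeof(char) == 1, "the C standard defines sizeof(char) as 1");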

Yes, you must correct that bug in your program.

When you allocate less memory than you need, and later access that "extra" memory, the program goes into undefined behavior mode. It may seem to work, or it may crash, or it may do anything unexpected. Basically, nothing is guaranteed after you write to the extra memory that you didn't allocate.
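As an illustration (a minimal sketch of my own, not the question's code), this is exactly such an overflow; the program may appear to work, but valgrind or AddressSanitizer will report the invalid write:

#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *word = malloc(4);       /* room for 3 characters + '\0' */
    if (word == NULL)
        return 1;

    strcpy(word, "hello");        /* 6 bytes into a 4-byte block: undefined behavior */
    free(word);                   /* may crash here, later, or never */
    return 0;
}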

[Update:]

My proposal to read a string of arbitrary length from a file is the following code. I cannot help that it is somewhat long, but since standard C doesn't provide a nice string data type, I had to do the whole memory management thing on my own. So here it is:

#include <assert.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/** Reads one line from a file, allocating memory for it dynamically. */
int fagetln(FILE *f, /*@out*/ char **s, /*@out*/ size_t *ssize)
{
  char *buf;
  size_t bufsize, index;
  int c;

  /* Start with a small buffer and grow it geometrically as needed. */
  bufsize = 128;
  if ((buf = malloc(bufsize)) == NULL) {
    return -1;
  }

  index = 0;
  while ((c = fgetc(f)) != EOF && c != '\n') {
    /* Ensure room for this character plus the terminating '\0'. */
    if (!(index + 1 < bufsize)) {
      bufsize *= 2;
      char *newbuf = realloc(buf, bufsize);
      if (newbuf == NULL) {
        free(buf);
        return -1;
      }
      buf = newbuf;
    }
    assert(index < bufsize);
    buf[index++] = c;
  }

  *s = buf;
  *ssize = index;          /* length without the terminator */
  assert(index < bufsize);
  buf[index] = '\0';
  return ferror(f) ? -1 : 0;
}

int main(void)
{
  char *s;
  size_t slen;

  if (fagetln(stdin, &s, &slen) != -1) {
    printf("%zu bytes: %s\n", slen, s);
    free(s);               /* the caller owns the allocated buffer */
  }
  return 0;
}
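As a side note (this assumes a POSIX system and is not part of standard C): the library function getline(3) implements the same grow-as-you-read pattern, so a hand-rolled version like the one above is only needed for strictly portable code. A sketch:

#define _POSIX_C_SOURCE 200809L   /* expose getline(3); POSIX, not standard C */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *line = NULL;            /* getline allocates and grows the buffer itself */
    size_t cap = 0;
    ssize_t len = getline(&line, &cap, stdin);

    if (len != -1) {
        printf("%zd bytes read (including the newline): %s", len, line);
    }
    free(line);                   /* free even on failure; getline may have allocated */
    return 0;
}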

Usually (but not always), overflows of allocated buffers cause a crash when you free the buffer. If you added free(word) at the end, you would probably see the crash.
