
Why allocate an array of size 1 more than the requested size?

This is really interesting: our instructor was teaching this to us yesterday and couldn't figure it out himself, so we're left hanging without knowing the actual reason.

Here is the array implementation of a Queue from a well-known book (which I don't have; this is what my instructor told us, and the author is highly regarded):

class QUEUE {
private:
    int* q;      // circular buffer of N slots
    int N;       // number of slots (one more than the capacity maxN)
    int head;    // index of the next item to remove (reduced modulo N lazily)
    int tail;    // index of the next free slot
public:
    QUEUE(int maxN) {
        q = new int[maxN + 1];             // one extra slot -- the subject of the question
        N = maxN + 1; head = N; tail = 0;  // head % N == tail, so the queue starts empty
    }
    int empty() const {
        return head % N == tail;
    }
    void put(int item) {
        q[tail++] = item; tail = tail % N; // append at tail and wrap around
    }
    int get() {
        head = head % N; return q[head++]; // wrap around, then remove the front item
    }
};

Inside the constructor, you see q = new int[maxN + 1];. But why the '+ 1'? Why is he allocating one extra int's worth of memory?

The problem that adding one to maxN solves is that if you allocate exactly maxN slots, you would not be able to distinguish these two situations:

  • The queue is empty, and
  • The queue has exactly maxN items.

In both of these situations head and tail would be equal to each other modulo N: the queue starts with head % N == tail == 0, and after maxN insertions into a buffer of exactly maxN slots, tail would wrap all the way around back to 0 as well, so a full queue would be indistinguishable from an empty one. Allocating maxN + 1 slots means the queue never holds more than maxN items, so tail can never catch up with head while items remain.
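To make this concrete, here is a minimal sketch (not from the book; maxN = 4 is an arbitrary choice) that repeats the bookkeeping of put() for maxN insertions, once with exactly maxN slots and once with maxN + 1 slots, and then applies the empty() test:

#include <iostream>

// Simulate maxN calls to put() on a circular buffer with 'slots' slots,
// using the same bookkeeping as the QUEUE class above:
// head starts at 'slots' (so head % slots == 0), tail starts at 0.
static void simulate(int maxN, int slots) {
    int head = slots, tail = 0;
    for (int i = 0; i < maxN; ++i)
        tail = (tail + 1) % slots;        // what put() does to tail
    bool looksEmpty = (head % slots == tail);
    std::cout << "slots = " << slots << ": after " << maxN
              << " insertions, empty() reports "
              << (looksEmpty ? "true (a full queue looks empty)"
                             : "false (correct)") << '\n';
}

int main() {
    const int maxN = 4;                   // arbitrary example size
    simulate(maxN, maxN);                 // exactly maxN slots: full collides with empty
    simulate(maxN, maxN + 1);             // the book's maxN + 1 slots: no collision
}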

Note: the implementation is not ideal, because inserting the (maxN+1)-th element wraps the queue around, so the queue looks empty again. This shortcoming can be addressed in three ways:

  • Throw an exception when queue overflows,
  • Change the return type to bool, ignore insertions that would overflow the queue, and return false when an insertion is ignored (see the sketch after this list), or
  • Silently ignore insertions that overflow the queue (not recommended).
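For instance, the second option could look roughly like this (a sketch assuming the other members stay exactly as in the class above; it is not the book's code):

    bool put(int item) {
        // Full means the queue already holds N - 1 (== maxN) items,
        // i.e. advancing tail once more would make it equal to head modulo N.
        if ((tail + 1) % N == head % N)
            return false;                  // refuse the insertion
        q[tail++] = item; tail = tail % N;
        return true;
    }

With this check, the (maxN+1)-th insertion is rejected and reported instead of silently making the queue look empty again.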
