Odd behavior of std::vector back()

The following code asserts in the indicated place with "iterator + offset is out of range."

void Network::PushInput(int c, int h, int w) {
    Input* input = new Input(batch, c, h, w, data);
    layers.push_back(input);    // this happens to be the first push_back()
//  layers.push_back(input);    // doing another doesn't change the assert!
    Layer *foo = layers.back();  // asserts here
    Layer *baz = layers[layers.size()-1];  // does not assert
}

Input is a public subclass of Layer. layers is declared as

std::vector<Layer *>layers;

If I attempt to duplicate the above with a more vanilla template type, e.g. int*, back() works as expected with no asserts. Somehow, the template type matters here. (Note: _ITERATOR_DEBUG_LEVEL is 2, which triggers that assert check in the vector class.)

I'd rather not bluntly change all of the back() calls in the code to size()-1; I'd rather understand what is going on here.

Any ideas? (I'll continue to perturb the code until I can find the apparent cause of this, but hopefully it will be obvious to someone else.)

(I'm using Visual Studio 2013 Community Edition, if that matters.)

.....

Here's a stand-alone file that compiles and shows the problem:

#include <cstring>  // std::memset
#include <vector>

using namespace std;

namespace layer {
    class Layer {
    public:
        Layer(float alpha = 0, float momentum = 0.9f, float weight_decay = 0);
        virtual ~Layer();

        // three virtual methods that all layers should have
        virtual void forward(bool train = true) = 0;
        virtual void backward() = 0;
        virtual void update() = 0;

        void adjust_learning(float scale); // change the learning rate

        Layer* prev;                    // previous layer
        Layer* next;                    // next layer
        float* data;                    // X': output (cuDNN y)
        int batch;                      // n: batch size
        float alpha;                    // learning rate
        float momentum;                 // beta: momentum of gradient
        float weight_decay;             // gamma: weight decay rate
    };
} /* namespace layer */

namespace layer {
    Layer::Layer(float alpha_, float momentum_, float weight_decay_)
    {
        std::memset(this, 0, sizeof(*this));
        alpha = alpha_;
        momentum = momentum_;
        weight_decay = weight_decay_;
    }

    Layer::~Layer() {}

    void Layer::adjust_learning(float scale) {
        alpha *= scale;
    }
}

namespace layer {

    class Input : public Layer {
    public:
        Input(int n, int c, int h, int w, float* _data);
        virtual ~Input();
        void forward(bool train = true);
        void backward();
        void update();
    };

}

namespace layer {

    Input::Input(int n, int c, int h, int w, float* _data) : Layer() {
        prev = NULL;

        batch = n;
        data = _data;
    }

    Input::~Input() {
        data = NULL;
    }

    void Input::forward(bool train) {
        // nothing
    }

    void Input::backward() {
        // nothing
    }

    void Input::update() {
        // nothing
    }

}

using namespace layer;

namespace model {

    class Network {
    private:
        std::vector<Layer*> layers; // list of layers
        bool has_input, has_output; // sanity check
        float* data; // input on device
        int batch; // whole size of data, batch size
    public:
        Network(int batch_size);
        virtual ~Network();
        void PushInput(int c, int h, int w);
    };
}

namespace model {
    void Network::PushInput(int c, int h, int w) {

        Input* input = new Input(batch, c, h, w, data);
        layers.push_back(input);
        Layer *foo = layers.back();  // **WHY DOES THIS ASSERT??**
    }
    Network::Network(int _batch) {
        std::memset(this, 0, sizeof(*this));
        batch = _batch;
    }

    Network::~Network() {
        for (Layer* l : layers)
            delete l;
    }
}

int main()
{
    model::Network foo(10);

    foo.PushInput(2, 3, 4);
}

You have undefined behavior in your code.

In the Layer constructor you do

std::memset(this, 0, sizeof(*this));

The problem is that this call also clears the virtual function table pointer, which is part of the object. Any virtual function called after that will not work as expected, if at all. That includes destruction of the objects, since the destructors are virtual. The Network constructor has the same pattern and an even more direct problem: its memset zeroes the std::vector<Layer*> member after it has already been constructed, corrupting the vector's internal state, including the debug-iterator bookkeeping that _ITERATOR_DEBUG_LEVEL 2 checks. That is why back(), which goes through a checked iterator, asserts while layers[layers.size()-1] happens not to.
