
C++ Memory Error

When I run my code, I repeatedly get the error

free(): invalid next size (fast)

Yet the code in question does nothing more than take references to existing objects. Commenting out one specific line (marked below) seems to fix the error; however, it's a very important line.

void neuron::updateWeights(layer &prevLayer) {
    for(unsigned i = 0; i < prevLayer.size(); i++) {
        double oldDeltaWeight = prevLayer[i].m_connections[m_index].m_deltaWeight;
        double newDeltaWeight = eta * prevLayer[i].m_output * m_gradient + alpha * oldDeltaWeight;
        prevLayer[i].m_connections[m_index].m_deltaWeight = newDeltaWeight; // THIS LINE
        prevLayer[i].m_connections[m_index].m_weight += newDeltaWeight; 
    }
}

Any help would be very appreciated!
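For context: free(): invalid next size (fast) is a runtime abort from glibc, raised when free() finds that the bookkeeping data next to a heap block has been overwritten. The abort happens at deallocation time, so it can surface far from the line that actually did the damage. A minimal sketch, unrelated to the network code, of the kind of bug that can produce the same message:

// Heap-corruption sketch: writing past the end of a vector's allocation
#include <vector>

int main() {
    std::vector<double> v(4); // heap block sized for 4 doubles
    v[8] = 1.0;               // out-of-bounds write: undefined behavior that can
                              // clobber the allocator's metadata for the next chunk
    return 0;                 // v's destructor calls free(), which may then abort
}                             // with "free(): invalid next size (fast)"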

EDIT: Additional code:

// Headers
#include "../../Include/neuralNet.h"

// Libraries
#include <vector>
#include <iostream>
#include <cmath>

// Namespace
using namespace std;

// Class constructor
neuron::neuron(unsigned index, unsigned outputs) {
    m_index = index;
    for(unsigned i = 0; i < outputs; i++) {
        m_connections.push_back(connection());
    }
    // Set default neuron output
    setOutput(1.0);
}

double neuron::eta = 0.15;    // overall net learning rate, [0.0..1.0]
double neuron::alpha = 0.5;   // momentum, multiplier of last deltaWeight, [0.0..1.0]

// Definition of transfer function method
double neuron::transferFunction(double x) const {
    return tanh(x); // -1 -> 1
}

// Transfer function derivative method
double neuron::transferFunctionDerivative(double x) const {
    return 1 - x*x; // Derivative of tanh; x here is the neuron output, i.e. already tanh(sum)
}

// Set output value
void neuron::setOutput(double value) {
    m_output = value;
}

// Forward propagate
void neuron::recalculate(layer &previousLayer) {

    double sum = 0.0;
    for(unsigned i = 0; i < previousLayer.size(); i++) {
        sum += previousLayer[i].m_output * previousLayer[i].m_connections[m_index].m_weight;
    }
    setOutput(transferFunction(sum));
}

// Change weights based on target
void neuron::updateWeights(layer &prevLayer) {
    for(unsigned i = 0; i < prevLayer.size(); i++) {
        double oldDeltaWeight = prevLayer[i].m_connections[m_index].m_deltaWeight;
        double newDeltaWeight = eta * prevLayer[i].m_output * m_gradient + alpha * oldDeltaWeight;
        prevLayer[i].m_connections[m_index].m_deltaWeight = newDeltaWeight;
        prevLayer[i].m_connections[m_index].m_weight += newDeltaWeight; 
    }
}

// Gradient for an output-layer neuron
void neuron::calculateOutputGradients(double target) {
    double delta = target - m_output;
    m_gradient = delta * transferFunctionDerivative(m_output);
}

// Sum of this neuron's weighted contributions to the next layer's gradients
double neuron::sumDOW(const layer &nextLayer) {
    double sum = 0.0;

    for(unsigned i = 1; i < nextLayer.size(); i++) {
        sum += m_connections[i].m_weight * nextLayer[i].m_gradient;
    }

    return sum;
}

void neuron::calculateHiddenGradients(const layer &nextLayer) {
    double dow = sumDOW(nextLayer);
    m_gradient = dow * neuron::transferFunctionDerivative(m_output);
}

Also, updateWeights is called here:

// Update weights
for(unsigned layerIndex = m_layers.size() - 1; layerIndex > 0; layerIndex--) {
    layer &currentLayer = m_layers[layerIndex];
    layer &previousLayer = m_layers[layerIndex - 1];

    for(unsigned i = 1; i < currentLayer.size(); i++) {
        currentLayer[i].updateWeights(previousLayer);
    }
}

Your constructor initializes 'outputs' connection objects in m_connections.

But many places in the code index into it with:

m_connections[m_index]

What happens if m_index >= outputs? Is this possible in your program? Try adding an assert (http://www.cplusplus.com/reference/cassert/assert/) as the first line of the constructor:

assert(index < outputs);
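In context, the suggested check would look like this (a sketch of the posted constructor with the assert added; note that it encodes the assumption that a neuron's own index stays below its own number of outgoing connections, which is what the m_index lookups require of the previous layer's neurons):

#include <cassert>

neuron::neuron(unsigned index, unsigned outputs) {
    assert(index < outputs); // fail fast here instead of corrupting the heap later
    m_index = index;
    for(unsigned i = 0; i < outputs; i++) {
        m_connections.push_back(connection());
    }
    // Set default neuron output
    setOutput(1.0);
}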

You probably have an out-of-bounds memory access somewhere.
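Another way to localize it (a debugging sketch, assuming layer is a std::vector<neuron> as the indexing suggests): enforce the invariant at the point of the write, and use std::vector::at, which throws std::out_of_range instead of silently trampling the heap:

#include <cassert>

void neuron::updateWeights(layer &prevLayer) {
    for(unsigned i = 0; i < prevLayer.size(); i++) {
        // Every previous-layer neuron must own a connection slot for this
        // neuron's index, i.e. m_index < m_connections.size().
        assert(m_index < prevLayer[i].m_connections.size());

        connection &conn = prevLayer[i].m_connections.at(m_index); // throws on overflow
        double oldDeltaWeight = conn.m_deltaWeight;
        double newDeltaWeight = eta * prevLayer[i].m_output * m_gradient + alpha * oldDeltaWeight;
        conn.m_deltaWeight = newDeltaWeight;
        conn.m_weight += newDeltaWeight;
    }
}

With the assert compiled in (i.e. without NDEBUG), the program stops at the first bad index instead of aborting later inside free(), which makes the guilty call site visible in a debugger or backtrace.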
