
Neural Network classification

I am trying to train a Multi-Layer Feed Forward Neural Network on the (Statlog) Shuttle data set -

This is a multi-class classification task. The target attribute is 'Class'.

The code that I have is as follows-

# Imports used below-
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.preprocessing import RobustScaler
from sklearn.model_selection import train_test_split

# Column names to be used for training and testing sets-
col_names = ['A1', 'A2', 'A3', 'A4', 'A5', 'A6', 'A7', 'A8', 'A9', 'Class']

# Read in training and testing datasets-
training_data = pd.read_csv("shuttle_training.csv", delimiter = ' ', names = col_names)
testing_data = pd.read_csv("shuttle_test.csv", delimiter = ' ', names = col_names)

print("\nTraining data dimension = {0} and testing data dimension = {1}\n".format(training_data.shape, testing_data.shape))
# Training data dimension = (43500, 10) and testing data dimension = (14500, 10)

# Data Preprocessing-

# Check for missing value(s) in training data-
training_data.isnull().values.any()
# False

# Get target attribute class distribution-
training_data["Class"].value_counts()
'''
1    34108
4     6748
5     2458
3      132
2       37
7       11
6        6
Name: Class, dtype: int64
'''
# NOTE: Majority of instances belong to class 1

# Visualizing the distribution of each attribute in dataset using boxplots-
fig=plt.figure(figsize=(18, 16), dpi= 80, facecolor='w', edgecolor='k')

sns.boxplot(data = training_data)
plt.xticks(rotation = 20)
plt.show()

# To divide the data into attributes and labels, execute the following code:

# 'X' contains attributes
X = training_data.drop('Class', axis = 1)

# Convert 'X' to float-
X = X.values.astype("float")

# 'y' contains labels
y = training_data['Class']

# Normalize features (X)-
rb_scaler = RobustScaler()

X_std = rb_scaler.fit_transform(X)

# Divide attributes & labels into training & testing sets-
X_train, X_test, y_train, y_test = train_test_split(X_std, y, test_size = 0.30, stratify = y)

print("\nDimensions of training and testing sets are:")
print("X_train = {0}, y_train = {1}, X_test = {2} and y_test = {3}\n\n".format(X_train.shape, y_train.shape, X_test.shape, y_test.shape))
# Dimensions of training and testing sets are:
# X_train = (30450, 9), y_train = (30450,), X_test = (13050, 9) and y_test = (13050,)

from keras.models import Sequential
from keras.layers import Dense
from sklearn.model_selection import StratifiedKFold                     
from sklearn.pipeline import Pipeline
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import cross_val_score

# Create Neural Network model-
model = Sequential()

# Input layer-
model.add(Dense(9, input_dim = 9, kernel_initializer = 'normal', activation = 'relu'))

# Hidden layer(s)-
model.add(Dense(9, kernel_initializer = 'normal', activation='relu'))

# Output layer-
model.add(Dense(7, activation = 'softmax'))  # 7 output neurons for 7 classes in target attribute

# Compile NN model-
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

'''
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 9)                 90        
_________________________________________________________________
dense_2 (Dense)              (None, 9)                 90        
_________________________________________________________________
dense_3 (Dense)              (None, 7)                 70        
=================================================================
Total params: 250
Trainable params: 250
Non-trainable params: 0
_________________________________________________________________

'''

# Train model on training data-
history = model.fit(X_train, y_train, epochs = 200, batch_size = 50, validation_data = (X_test, y_test), verbose = 1, shuffle = False)

It gives me the error-

ValueError: Error when checking target: expected dense_3 to have shape (7,) but got array with shape (1,)

Well, according to the 'Class' attribute (which is our target), there seem to be 7 classes in total (though with a severe class imbalance). So why am I getting this error? Any clues?

Thanks!

Error Trace-

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
 in
----> 1 history = model.fit(X_train, y_train, epochs = 200, batch_size = 50, validation_data = (X_test, y_test), verbose = 1, shuffle = False)

~/.local/lib/python3.6/site-packages/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, **kwargs)
    950             sample_weight=sample_weight,
    951             class_weight=class_weight,
--> 952             batch_size=batch_size)
    953         # Prepare validation data.
    954         do_validation = False

~/.local/lib/python3.6/site-packages/keras/engine/training.py in _standardize_user_data(self, x, y, sample_weight, class_weight, check_array_lengths, batch_size)
    787             feed_output_shapes,
    788             check_batch_axis=False,  # Don't enforce the batch size.
--> 789             exception_prefix='target')
    790
    791         # Generate sample-wise weight values given the sample_weight and

~/.local/lib/python3.6/site-packages/keras/engine/training_utils.py in standardize_input_data(data, names, shapes, check_batch_axis, exception_prefix)
    136                             ': expected ' + names[i] + ' to have shape ' +
    137                             str(shape) + ' but got array with shape ' +
--> 138                             str(data_shape))
    139     return data
    140

ValueError: Error when checking target: expected dense_3 to have shape (7,) but got array with shape (1,)
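
The mismatch becomes visible if you compare what the output layer expects with what fit() is actually being given (a quick hypothetical check, not part of the original post):

print(model.output_shape)        # (None, 7) -> categorical_crossentropy expects one-hot targets with 7 columns
print(y_train.shape)             # (30450,)  -> a single integer label per sample
print(sorted(y_train.unique()))  # [1, 2, 3, 4, 5, 6, 7]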

You need to convert your y_train/y_test labels to one-hot vectors; since the class labels run from 1 to 7, shift them down to 0-6 first so that the one-hot encoding has exactly 7 columns. Add this code just after the train/test split.

from keras.utils import to_categorical

y_test = to_categorical(y_test - 1, num_classes = 7)    # labels 1-7 shifted to 0-6
y_train = to_categorical(y_train - 1, num_classes = 7)
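
Alternatively, you can skip the one-hot conversion altogether: keep the integer labels (still shifted to 0-6) and compile with sparse_categorical_crossentropy instead. A minimal sketch reusing the model and arrays from the question, not a tested drop-in:

# Sketch only: same architecture, but the sparse loss accepts integer class labels directly.
model.compile(loss = 'sparse_categorical_crossentropy', optimizer = 'adam', metrics = ['accuracy'])

history = model.fit(X_train, y_train - 1,
                    epochs = 200, batch_size = 50,
                    validation_data = (X_test, y_test - 1),
                    verbose = 1, shuffle = False)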
