AttributeError: 'TensorSliceDataset' object has no attribute 'dtype'

Here's what I did:

def prepare_data(self, features, labels):
  assert features.shape[0] == labels.shape[0]
  print("DEBUG: features: shape = " + str(features.shape) \
    + " , dtype(0,0) = " + str(type(features[0,0])))
  print("DEBUG: labels: shape = " + str(labels.shape) \
    + ", dtype(0) = " + str(type(labels[0])))
  dataset = tf.data.Dataset.from_tensor_slices( (features, labels) )
  iterator = dataset.make_one_shot_iterator()
  return dataset, iterator

...

self.train_features = np.asarray(train_features_list)
self.train_labels = np.asarray(train_labels_list)
self.train_data, self.train_it = \
    self.prepare_data(self.train_features, self.train_labels)

hidden1 = tf.layers.dense(self.train_data,
    self.input_layer_size * 40,
    activation=tf.nn.relu,
    name='hidden1')

And this is what I've got:

DEBUG: features: shape = (4000, 3072) , dtype(0,0) = <class 'numpy.uint8'>
DEBUG: labels: shape = (4000,), dtype(0) = <class 'numpy.int64'>
...
AttributeError: 'TensorSliceDataset' object has no attribute 'dtype'

With the error location pointing to this code in tensorflow/python/layers/core.py:

layer = Dense(units,
            activation=activation,
            use_bias=use_bias,
            kernel_initializer=kernel_initializer,
            bias_initializer=bias_initializer,
            kernel_regularizer=kernel_regularizer,
            bias_regularizer=bias_regularizer,
            activity_regularizer=activity_regularizer,
            kernel_constraint=kernel_constraint,
            bias_constraint=bias_constraint,
            trainable=trainable,
            name=name,
            dtype=inputs.dtype.base_dtype,
            _scope=name,
            _reuse=reuse)

Could you tell me what I am doing wrong here?

tf.layers.dense accepts a tensor as input, but you are feeding it a tf.data Dataset object. That is most likely why it throws this error.

I have modified your code into an example that does not throw the error. Also, the dense layer expects a 2-dimensional input, so I added batching in your function to make the data 2-D.

def prepare_data(features, labels):
  assert features.shape[0] == labels.shape[0]
  print("DEBUG: features: shape = " + str(features.shape) \
    + " , dtype(0,0) = " + str(type(features[0,0])))
  print("DEBUG: labels: shape = " + str(labels.shape) \
    + ", dtype(0) = " + str(type(labels[0])))
  dataset = tf.data.Dataset.from_tensor_slices( (features, labels) )
  iterator = dataset.batch(1).make_one_shot_iterator() # Modified here
  return iterator # Returned only the iterator

train_features = np.random.randn(4000, 3072) 
train_labels = np.random.randn(4000)
train_it = prepare_data(train_features, train_labels)

input_data, input_label = train_it.get_next() # Getting the input feature from the iterator
hidden1 = tf.layers.dense(input_data, 40, activation=tf.nn.relu, name='hidden1') # Used 40 as an example

Result:

DEBUG: features: shape = (4000, 3072) , dtype(0,0) = <class 'numpy.float64'>
DEBUG: labels: shape = (4000,), dtype(0) = <class 'numpy.float64'> 
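The batching step matters because `from_tensor_slices` yields one row at a time, so each element is 1-D. A minimal NumPy-only sketch of the shapes involved (array sizes mirror the example above; no TensorFlow required):

```python
import numpy as np

features = np.random.randn(4000, 3072)

# from_tensor_slices yields individual rows: shape (3072,), which is 1-D
# and not what a dense layer expects.
one_slice = features[0]
print(one_slice.shape)   # (3072,)

# dataset.batch(1) stacks slices along a new leading axis, producing the
# 2-D (batch, features) shape that tf.layers.dense can consume.
batched = one_slice[np.newaxis, :]
print(batched.shape)     # (1, 3072)
```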
