
Input 0 of layer dense_18 is incompatible with the layer: expected axis -1 of input shape to have value 3500 but received input with shape [None, 7]

I'm making an ML project using TensorFlow that predicts the chance of getting admitted to a school from student data. I've run into an issue that I can't understand (I believe it has something to do with the input shape). Here is my code and the dataset I'm using.

Dataset: https://www.kaggle.com/mohansacharya/graduate-admissions

Code:

import numpy as np
import tensorflow as tf
import keras
import pandas as pd


data = pd.read_csv("Admission_Predict.csv")
df = pd.DataFrame(data)

X = df.drop(df.columns[[0, -1]], axis=1).values
y = df[df.columns[-1]].values

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(500, 7)),
    keras.layers.Dense(32),
    keras.layers.Dense(1)
])

model.compile(optimizer="adam",
             loss="mse",
             metrics=["accuracy"])

model.fit(X, y, epochs=10)

The error:

Epoch 1/10
WARNING:tensorflow:Model was constructed with shape (None, 500, 7) for input Tensor("flatten_14_input:0", shape=(None, 500, 7), dtype=float32), but it was called on an input with incompatible shape (None, 7).
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-160-e60f87ea5845> in <module>
----> 1 model.fit(X, y, epochs=10)

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\training.py in _method_wrapper(self, *args, **kwargs)
     64   def _method_wrapper(self, *args, **kwargs):
     65     if not self._in_multi_worker_mode():  # pylint: disable=protected-access
---> 66       return method(self, *args, **kwargs)
     67 
     68     # Running inside `run_distribute_coordinator` already.

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_batch_size, validation_freq, max_queue_size, workers, use_multiprocessing)
    846                 batch_size=batch_size):
    847               callbacks.on_train_batch_begin(step)
--> 848               tmp_logs = train_function(iterator)
    849               # Catch OutOfRangeError for Datasets of unknown size.
    850               # This blocks until the batch has finished executing.

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\eager\def_function.py in __call__(self, *args, **kwds)
    578         xla_context.Exit()
    579     else:
--> 580       result = self._call(*args, **kwds)
    581 
    582     if tracing_count == self._get_tracing_count():

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\eager\def_function.py in _call(self, *args, **kwds)
    609       # In this case we have created variables on the first call, so we run the
    610       # defunned version which is guaranteed to never create variables.
--> 611       return self._stateless_fn(*args, **kwds)  # pylint: disable=not-callable
    612     elif self._stateful_fn is not None:
    613       # Release the lock early so that multiple threads can perform the call

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\eager\function.py in __call__(self, *args, **kwargs)
   2417     """Calls a graph function specialized to the inputs."""
   2418     with self._lock:
-> 2419       graph_function, args, kwargs = self._maybe_define_function(args, kwargs)
   2420     return graph_function._filtered_call(args, kwargs)  # pylint: disable=protected-access
   2421 

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\eager\function.py in _maybe_define_function(self, args, kwargs)
   2772           and self.input_signature is None
   2773           and call_context_key in self._function_cache.missed):
-> 2774         return self._define_function_with_shape_relaxation(args, kwargs)
   2775 
   2776       self._function_cache.missed.add(call_context_key)

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\eager\function.py in _define_function_with_shape_relaxation(self, args, kwargs)
   2703     self._function_cache.arg_relaxed_shapes[rank_only_cache_key] = (
   2704         relaxed_arg_shapes)
-> 2705     graph_function = self._create_graph_function(
   2706         args, kwargs, override_flat_arg_shapes=relaxed_arg_shapes)
   2707     self._function_cache.arg_relaxed[rank_only_cache_key] = graph_function

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\eager\function.py in _create_graph_function(self, args, kwargs, override_flat_arg_shapes)
   2655     arg_names = base_arg_names + missing_arg_names
   2656     graph_function = ConcreteFunction(
-> 2657         func_graph_module.func_graph_from_py_func(
   2658             self._name,
   2659             self._python_function,

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\framework\func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
    979         _, original_func = tf_decorator.unwrap(python_func)
    980 
--> 981       func_outputs = python_func(*func_args, **func_kwargs)
    982 
    983       # invariant: `func_outputs` contains only Tensors, CompositeTensors,

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\eager\def_function.py in wrapped_fn(*args, **kwds)
    439         # __wrapped__ allows AutoGraph to swap in a converted function. We give
    440         # the function a weak reference to itself to avoid a reference cycle.
--> 441         return weak_wrapped_fn().__wrapped__(*args, **kwds)
    442     weak_wrapped_fn = weakref.ref(wrapped_fn)
    443 

~\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\framework\func_graph.py in wrapper(*args, **kwargs)
    966           except Exception as e:  # pylint:disable=broad-except
    967             if hasattr(e, "ag_error_metadata"):
--> 968               raise e.ag_error_metadata.to_exception(e)
    969             else:
    970               raise

ValueError: in user code:

    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\training.py:571 train_function  *
        outputs = self.distribute_strategy.run(
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:951 run  **
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2290 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2649 _call_for_each_replica
        return fn(*args, **kwargs)
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\training.py:531 train_step  **
        y_pred = self(x, training=True)
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\base_layer.py:927 __call__
        outputs = call_fn(cast_inputs, *args, **kwargs)
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\sequential.py:277 call
        return super(Sequential, self).call(inputs, training=training, mask=mask)
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\network.py:717 call
        return self._run_internal_graph(
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\network.py:888 _run_internal_graph
        output_tensors = layer(computed_tensors, **kwargs)
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\base_layer.py:885 __call__
        input_spec.assert_input_compatibility(self.input_spec, inputs,
    C:\Users\User\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\engine\input_spec.py:212 assert_input_compatibility
        raise ValueError(

    ValueError: Input 0 of layer dense_26 is incompatible with the layer: expected axis -1 of input shape to have value 3500 but received input with shape [None, 7]

Your data is already 2D (samples × features), so you don't need a Flatten layer. Your model should be:

model = keras.Sequential([
    keras.layers.Dense(32, input_shape=(X.shape[-1],)),
    keras.layers.Dense(1)
])

Also pay attention to the input shape you specify: pass only the feature dimension, not the sample dimension. With `input_shape=(500, 7)` Keras treats each sample as a 500×7 array, so Flatten produces 3500 values per sample, while each row of your data actually has only 7 features.
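
For reference, here is a minimal end-to-end sketch of the corrected pipeline, assuming the CSV layout of the linked Kaggle dataset (a serial-number column first and "Chance of Admit" last); the ReLU activation on the hidden layer is an addition for illustration, not part of the model above:

import pandas as pd
from tensorflow import keras

# Load the admissions data; drop the serial-number column and keep
# the last column ("Chance of Admit") as the regression target.
df = pd.read_csv("Admission_Predict.csv")
X = df.drop(df.columns[[0, -1]], axis=1).values   # shape (500, 7): 7 features per student
y = df[df.columns[-1]].values                     # shape (500,)

# input_shape describes a single sample: 7 features, no sample axis.
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(X.shape[-1],)),  # ReLU added for illustration
    keras.layers.Dense(1)
])

model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X, y, epochs=10)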

A final note: if your problem is a regression task, accuracy is not a suitable metric; use an error metric such as mean absolute error instead.
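
For instance, a minimal sketch of the compile step with regression metrics (MAE and RMSE are chosen here purely for illustration, on a small stand-in model):

from tensorflow import keras

# Stand-in model just to show the compile call; for a regression
# target, track error metrics rather than accuracy.
model = keras.Sequential([
    keras.layers.Dense(32, input_shape=(7,)),
    keras.layers.Dense(1)
])
model.compile(optimizer="adam",
              loss="mse",
              metrics=["mae", keras.metrics.RootMeanSquaredError()])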
