
TensorFlow custom estimator for logical 'AND'

I'm trying to create a simple one-layer/one-unit neural network with a TensorFlow custom estimator that will be able to compute the logical AND operation, but I've run into trouble with the sigmoid activation -- I want to apply a threshold to its output.

Here is my code:

import numpy as np
import tensorflow as tf

x = np.array([
    [0, 0],
    [1, 0],
    [0, 1],
    [1, 1]
], dtype=np.float32)

y = np.array([
    [0],
    [0],
    [0],
    [1]
])

def sigmoid(val):
    # Sigmoid followed by a hard 0.5 threshold -- the output is exactly 0.0 or 1.0
    res = tf.nn.sigmoid(val)
    isGreater = tf.greater(res, tf.constant(0.5))
    return tf.cast(isGreater, dtype=tf.float32)

def model_fn(features, labels, mode, params):
    predictions = tf.layers.dense(inputs=features, units=1, activation=sigmoid)

    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)

    loss = tf.losses.sigmoid_cross_entropy(labels, predictions)
    optimizer = tf.train.GradientDescentOptimizer(0.5)
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())

    return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op)

nn = tf.estimator.Estimator(model_fn=model_fn)
input_fn = tf.estimator.inputs.numpy_input_fn(x=x, y=y, shuffle=False, num_epochs=None)

nn.train(input_fn=input_fn, steps=500)

But this throws an error:

ValueError: No gradients provided for any variable, check your graph for ops that do not support gradients, between variables ["<tf.Variable 'dense/kernel:0' shape=(2, 1) dtype=float32_ref>", "<tf.Variable 'dense/bias:0' shape=(1,) dtype=float32_ref>"] and loss Tensor("sigmoid_cross_entropy_loss/value:0", shape=(), dtype=float32).

How can I fix this? Please help.

Another question I've got: why doesn't TensorFlow have a built-in threshold for the sigmoid activation? Isn't it one of the most needed things for binary classification (with sigmoid/tanh)?

There is a built-in sigmoid activation, which is tf.nn.sigmoid.

However, when you create a network for training, you should not use an activation on the last layer, because tf.losses.sigmoid_cross_entropy expects unscaled logits. You need to provide them like this:

predictions = tf.layers.dense(inputs=features, units=1, activation=None)

loss = tf.losses.sigmoid_cross_entropy(labels, predictions)

Otherwise, with your custom sigmoid, the predictions are hard-thresholded to exactly 0 or 1, and neither tf.greater nor the bool-to-float tf.cast defines a gradient, so no gradients can reach the dense layer's kernel and bias -- which is exactly what the ValueError complains about.
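You can see this directly in TF 1.x graph mode: asking for the gradient of a thresholded value with respect to a variable returns None. (This is a small illustrative check, not part of the original code.)

v = tf.Variable([0.3])
hard = tf.cast(tf.greater(tf.nn.sigmoid(v), 0.5), tf.float32)
print(tf.gradients(hard, v))  # [None] -- no gradient path through greater/cast

Putting it together, here is a minimal sketch of a corrected model_fn (the dictionary keys 'probabilities' and 'classes' are illustrative names, not required by the Estimator API): the loss consumes the raw logits, and the sigmoid plus 0.5 threshold are applied only on the prediction path:

def model_fn(features, labels, mode, params):
    # No activation here -- the last layer outputs raw logits
    logits = tf.layers.dense(inputs=features, units=1, activation=None)

    # Sigmoid and thresholding are used only to produce predictions
    probabilities = tf.nn.sigmoid(logits)
    classes = tf.cast(tf.greater(probabilities, 0.5), dtype=tf.float32)

    if mode == tf.estimator.ModeKeys.PREDICT:
        predictions = {'probabilities': probabilities, 'classes': classes}
        return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)

    # The loss sees the unscaled logits, so gradients flow back to
    # dense/kernel and dense/bias
    loss = tf.losses.sigmoid_cross_entropy(labels, logits)
    optimizer = tf.train.GradientDescentOptimizer(0.5)
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())

    return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op)

With this change, nn.train(input_fn=input_fn, steps=500) should run without the ValueError, and the network can learn AND easily since the problem is linearly separable.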
