
Keras access individual values in custom loss function

I want to achieve the following loss function:

  • Receive a batch of dimension (batch_size, number_out_nodes)
  • Apply a scipy function to each row/sample which maps it to a real number
  • Combine the results of all samples from the batch

I already wrote the function using numpy arrays as input.

It looks something like this:

 import numpy as np

 def loss_func(y_true, y_pred):
     y_pred = np.array(y_pred)
     # one weighting factor per sample in the batch
     list_of_factors = [4] * len(y_pred)
     val = 0
     for idx, factor in enumerate(list_of_factors):
         val += factor * scipy_func(y_pred[idx])  # scipy_func maps one row to a real number
     return val

Is there any way to implement this as a Keras loss function? I do not know how to access the individual samples of the batch.

Thanks

You can't apply a SciPy function directly, because the variables passed to a loss function are tensors, not n-dimensional NumPy arrays. So you can't call NumPy or SciPy functions that expect arrays inside the loss function.

Depending on what your SciPy function does, you might be able to implement it using operations available in the Keras backend. Most of those functions are similar to NumPy operations but act on tensors.
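
For instance, if the per-sample quantity can be expressed with tensor operations, the whole loss stays inside the graph. The sketch below is only an illustration: it assumes `scipy_func` can be replaced by a squared L2 norm of each row and uses a constant factor of 4 to mirror the question; swap in whatever backend expression matches your actual function.

 from tensorflow.keras import backend as K

 def loss_func(y_true, y_pred):
     # Stand-in for scipy_func: squared L2 norm of each row,
     # written with backend ops so it works on tensors.
     per_sample = K.sum(K.square(y_pred), axis=-1)  # shape: (batch_size,)
     factor = 4.0                                   # constant weight, as in the question
     return K.mean(factor * per_sample)             # reduce the batch to a single scalar

Keras also accepts a loss that returns one value per sample (shape (batch_size,)) and averages it for you, so returning `factor * per_sample` directly works as well; pass it to compile as usual, e.g. `model.compile(optimizer='adam', loss=loss_func)`.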

Have a look at existing loss functions and how they operate on tensors using those backend functions.
