There are several ways to solve this problem. One is to downgrade Keras, for example:
pip install keras==2.1
However, there is a more convenient fix. The error tells us that tf.nn.softmax in this TensorFlow version does not accept an axis keyword; older TensorFlow releases call the same parameter dim. So we can edit the softmax function in Keras's TensorFlow backend, replacing axis with dim in the call. The original source code is as follows:
def softmax(x, axis=-1):
    """Softmax of a tensor.

    # Arguments
        x: A tensor or variable.
        axis: The dimension softmax would be performed on.
            The default is -1 which indicates the last dimension.

    # Returns
        A tensor.
    """
    return tf.nn.softmax(x, axis=axis)
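For context, the axis argument simply selects which dimension the probabilities are normalized over. A minimal NumPy sketch of the same computation (illustrative only, not the Keras implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the per-slice max for numerical stability
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

logits = np.array([[1.0, 2.0, 3.0],
                   [1.0, 1.0, 1.0]])
probs = softmax(logits, axis=-1)  # each row now sums to 1
```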
Change it to this:
def softmax(x, axis=-1):
    """Softmax of a tensor.

    # Arguments
        x: A tensor or variable.
        axis: The dimension softmax would be performed on.
            The default is -1 which indicates the last dimension.

    # Returns
        A tensor.
    """
    return tf.nn.softmax(x, dim=axis)
That is, only the last line changes: tf.nn.softmax(x, axis=axis) becomes tf.nn.softmax(x, dim=axis).
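Editing installed library source is fragile, since the change is lost whenever the package is reinstalled. A more robust alternative, sketched below, is to detect at runtime which keyword the installed softmax function accepts and call it accordingly; the helper name call_softmax_compat is hypothetical, and the two stub functions stand in for the new-style and old-style tf.nn.softmax signatures:

```python
import inspect

def call_softmax_compat(softmax_fn, x, axis=-1):
    """Call softmax_fn with `axis` if it accepts it, else fall back to `dim`."""
    params = inspect.signature(softmax_fn).parameters
    keyword = 'axis' if 'axis' in params else 'dim'
    return softmax_fn(x, **{keyword: axis})

# stubs mimicking the two tf.nn.softmax signatures
def new_style(x, axis=-1):   # TensorFlow >= 1.5
    return ('axis', axis)

def old_style(x, dim=-1):    # older TensorFlow
    return ('dim', dim)

print(call_softmax_compat(new_style, None, axis=2))  # ('axis', 2)
print(call_softmax_compat(old_style, None, axis=2))  # ('dim', 2)
```

This keeps the same code working across both TensorFlow versions without patching the installed Keras files.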