Description
Describe the bug
Hello, thank you for developing this tool!
When converting a Keras model containing a Bidirectional LSTM layer, the activation function attribute ("activations") is not set on the corresponding LSTM ONNX operator.
The bug does not seem to appear when the LSTM layer is not wrapped in a Bidirectional layer.
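For comparison, a minimal sketch of the non-bidirectional case (not one of the reproduction scripts below; the file name is arbitrary), where the converted LSTM node does carry the expected "relu" activations:
#! /usr/bin/env python3
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, LSTM
# Same layer configuration, but without the Bidirectional wrapper
inputs = Input(shape=(1, 1))
outputs = LSTM(units=1, return_sequences=True, activation="relu", recurrent_activation="relu")(inputs)
model = Model(inputs, outputs)
model.save("lstm_no_bidirectional.h5")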
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 20.04):
- TensorFlow Version: 2.11
- tf2onnx: 1.14.0
- Python version: 3.10
To Reproduce
1: Create the Keras network with the activation functions set to "relu":
#! /usr/bin/env python3
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, LSTM, Bidirectional
# Minimal model: a single Bidirectional LSTM with both activations set to "relu"
inputs = Input(shape=(1, 1))
outputs = Bidirectional(LSTM(units=1, return_sequences=True, activation="relu", recurrent_activation="relu"))(inputs)
model = Model(inputs, outputs)
model.save("lstm_issue_activation_function.h5")
2: Convert the model:
#! /usr/bin/env python3
import sys
import onnx
import tensorflow as tf
import tf2onnx
# Load the Keras model given as the first command-line argument
target_model = sys.argv[1]
model = tf.keras.models.load_model(target_model)
# Convert to ONNX with an explicit input signature matching the Keras input shape
spec = (tf.TensorSpec((None, 1, 1), tf.float32, name="input"),)
output_path = model.name + ".onnx"
model_proto, _ = tf2onnx.convert.from_keras(model, input_signature=spec, opset=15, output_path=output_path)
onnx.save(model_proto, output_path)
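For example, assuming the script above is saved as convert.py:
python3 convert.py lstm_issue_activation_function.h5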
3: Check the converted model:
Inspecting the resulting ONNX model shows that the "activations" attribute of the LSTM node is not set, so it will default to "sigmoid/tanh" rather than the expected "relu" functions.
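One way to check this (a sketch; the file name model.onnx assumes the default Keras model name) is to load the converted file and print the attributes of each LSTM node. For a bidirectional LSTM, the ONNX specification expects the "activations" attribute to list six functions, three for the forward direction and three for the reverse.
#! /usr/bin/env python3
import onnx
# Load the converted model and list the attributes of every LSTM node
m = onnx.load("model.onnx")
for node in m.graph.node:
    if node.op_type == "LSTM":
        print(node.name, [a.name for a in node.attribute])
        # "activations" does not appear in this list, so backends fall back
        # to the default sigmoid/tanh activations instead of relu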