Cannot export the transform graph from tf.Transform together with the model definition using Keras in TensorFlow 2.0 #150
Comments
Hi @zoyahav @rmothukuru, could you please share more information about this bug?
As a hack, I implemented a custom Keras layer that applies the tf-transform function (…).
Any progress on this issue?
Yes, tf.transform currently supports Keras, though we're actively working on improving this support. Example usage: https://github.com/tensorflow/transform/blob/master/examples/census_example_v2.py#L344-L356
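The linked example works by exposing the transform graph as a Keras layer through tft.TFTransformOutput. A minimal sketch of that pattern (the paths and the label name below are placeholders):

```python
import tensorflow as tf
import tensorflow_transform as tft

# Placeholder path: the output directory written by the Transform stage.
tf_transform_output = tft.TFTransformOutput('/path/to/transform_output')

# A Keras layer that maps raw features to transformed features,
# i.e. the transform graph from saved_model_trans.
tft_layer = tf_transform_output.transform_features_layer()

# The transformed data and its feature spec are also available for training.
dataset = tf.data.experimental.make_batched_features_dataset(
    file_pattern='/path/to/transformed_examples*',
    batch_size=128,
    features=tf_transform_output.transformed_feature_spec(),
    reader=tf.data.TFRecordDataset,
    label_key='label',  # hypothetical label name
    shuffle=True)
```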
Thanks, super helpful!
@brightcoder01,
Automatically closing due to lack of recent activity. Please update the issue when new information becomes available, and we will reopen the issue. Thanks!
We cannot export the transform graph from tf.Transform and the graph from a Keras model together into a single SavedModel.
Background
Let's focus on these three stages in the end-to-end machine learning lifecycle of TFX:

Transform stage: the tf.Transform component transforms the raw data into the data used to train the machine learning model. It also exports the transform logic in SavedModel format; we name it saved_model_trans. (A minimal sketch of this stage follows the list.)

Training stage: TensorFlow uses the preprocessed data to train a model. After the training job completes, it exports the model graph together with the transform graph (from saved_model_trans) as one SavedModel; we name it saved_model_final.

Serving stage: TensorFlow Serving loads saved_model_final to provide the inference service. The schema of the inference request is exactly the same as the schema of the raw data in the transform stage.
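A minimal sketch of the Transform stage mentioned above, assuming a single numeric raw feature x and placeholder paths:

```python
import tensorflow as tf
import tensorflow_transform as tft
import tensorflow_transform.beam as tft_beam
import apache_beam as beam
from tensorflow_transform.tf_metadata import dataset_metadata, schema_utils

# Hypothetical raw data schema with a single numeric feature.
RAW_DATA_METADATA = dataset_metadata.DatasetMetadata(
    schema_utils.schema_from_feature_spec(
        {'x': tf.io.FixedLenFeature([], tf.float32)}))

def preprocessing_fn(inputs):
  # The transform logic that gets exported as saved_model_trans.
  return {'x_scaled': tft.scale_to_z_score(inputs['x'])}

with beam.Pipeline() as pipeline:
  with tft_beam.Context(temp_dir='/tmp/tft_tmp'):
    raw_data = pipeline | beam.Create([{'x': 1.0}, {'x': 2.0}, {'x': 3.0}])
    (transformed_data, _), transform_fn = (
        (raw_data, RAW_DATA_METADATA)
        | tft_beam.AnalyzeAndTransformDataset(preprocessing_fn))
    # Writes the transform graph in SavedModel format ("saved_model_trans").
    _ = transform_fn | tft_beam.WriteTransformFn('/path/to/transform_output')
```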
Current state: how to export the transform graph and the model graph together into a single SavedModel?

√ Estimator (works well):
In the official tf.Transform tutorial, the model is exported by calling estimator.export_saved_model(exported_model_dir, serving_input_fn). Internally, this first calls serving_input_fn to load the transform graph from its SavedModel, then calls the estimator's model_fn to build the model graph, combines the two graphs into one, and finally exports the result as a single SavedModel. Please check the code snippet.
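A minimal sketch of that serving_input_fn pattern, assuming tf_transform_output is a tft.TFTransformOutput and that the feature spec and label key below are placeholders:

```python
import tensorflow as tf

def make_serving_input_fn(tf_transform_output, raw_feature_spec, label_key):
  # Drop the label: it is not part of the serving request.
  raw_feature_spec = dict(raw_feature_spec)
  raw_feature_spec.pop(label_key, None)

  def serving_input_fn():
    # Parse serialized tf.Examples into raw features.
    raw_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
        raw_feature_spec, default_batch_size=None)
    serving_input_receiver = raw_input_fn()

    # Apply the transform graph (saved_model_trans) to the raw features.
    transformed_features = tf_transform_output.transform_raw_features(
        serving_input_receiver.features)

    # The estimator's model_fn consumes the transformed features.
    return tf.estimator.export.ServingInputReceiver(
        transformed_features, serving_input_receiver.receiver_tensors)

  return serving_input_fn

# Usage (paths and label name are placeholders):
# tf_transform_output = tft.TFTransformOutput('/path/to/transform_output')
# estimator.export_saved_model(
#     '/path/to/exported_model',
#     make_serving_input_fn(tf_transform_output,
#                           tf_transform_output.raw_feature_spec(),
#                           'label'))
```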
× Keras (cannot):

In TF 2.0, we define a model using Keras and export it by calling tf.saved_model.save. This SavedModel only contains the model definition, including the feature columns and the NN structure. The tf.saved_model.save API has no serving_input_fn parameter like Estimator's export does, so it lacks the flexibility to combine the transform graph and the model graph for inference. This breaks the integration between tf.Transform and TensorFlow 2.0.

We want to improve the SavedModel export API for Keras to support this, so that TensorFlow 2.0 can work well with tf.Transform.
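For reference, the pattern from the census example linked in the comments above combines the two graphs by attaching the transform graph to the Keras model and exporting an explicit serving signature. A minimal sketch, assuming model is an already-trained tf.keras.Model that accepts a dict of transformed features, with placeholder paths and label key:

```python
import tensorflow as tf
import tensorflow_transform as tft

# Placeholders for illustration.
TRANSFORM_OUTPUT_DIR = '/path/to/transform_output'
EXPORT_DIR = '/path/to/serving_model'
LABEL_KEY = 'label'

tf_transform_output = tft.TFTransformOutput(TRANSFORM_OUTPUT_DIR)

# Attach the transform graph to the model so its assets and variables
# are tracked and exported together with the model graph.
model.tft_layer = tf_transform_output.transform_features_layer()

@tf.function(input_signature=[
    tf.TensorSpec(shape=[None], dtype=tf.string, name='examples')])
def serve_tf_examples_fn(serialized_examples):
  # Parse raw tf.Examples (same schema as the transform stage's raw data),
  # apply the transform graph, then run the Keras model.
  raw_feature_spec = tf_transform_output.raw_feature_spec()
  raw_feature_spec.pop(LABEL_KEY, None)
  parsed = tf.io.parse_example(serialized_examples, raw_feature_spec)
  transformed = model.tft_layer(parsed)
  return {'outputs': model(transformed)}

# Export one SavedModel containing both the transform and the model graphs.
tf.saved_model.save(
    model, EXPORT_DIR,
    signatures={'serving_default': serve_tf_examples_fn})
```

TensorFlow Serving can then load this single SavedModel and accept requests that use the raw-data schema from the transform stage.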