[Question] How to properly visualize the model structure as well as save model?

Open xiaoyaoyang opened this issue 4 years ago • 7 comments

I tried tf.keras.utils.plot_model, but it doesn't seem to work: it only prints the model name. Is that because we did not use the Functional API here?

Also, for saving the model, model.save() works, but afterwards the weights and summary cannot be accessed, even if I call loaded_model(inputs) first.

I am also a little confused about the Model vs. Layer class. I see all the tutorials here use the Model class for both the query and candidate towers, but https://www.tensorflow.org/guide/keras/custom_layers_and_models#the_model_class says that if you do not need to call fit (which is the case for the query and candidate towers), you should consider using the Layer class.
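To illustrate the distinction that guide draws, here is a minimal sketch (the QueryTower name and its fields are hypothetical, not from the TFRS tutorials): a tower that is never fit() or saved on its own can be a plain Layer, while the top-level retrieval model stays a Model so that fit() and save() are available.

```python
# Minimal sketch: a tower written as a Layer subclass. Assumes TF 2.x tf.keras;
# QueryTower and its vocabulary are illustrative, not from the TFRS tutorials.
import tensorflow as tf

class QueryTower(tf.keras.layers.Layer):
    """A tower that is never trained directly can be a plain Layer."""
    def __init__(self, vocab, embedding_dim=32, **kwargs):
        super().__init__(**kwargs)
        self.lookup = tf.keras.layers.StringLookup(vocabulary=vocab)
        self.embedding = tf.keras.layers.Embedding(
            input_dim=self.lookup.vocabulary_size(),
            output_dim=embedding_dim)

    def call(self, inputs):
        # Map raw strings to integer ids, then to embedding vectors.
        return self.embedding(self.lookup(inputs))

tower = QueryTower(vocab=["a", "b", "c"])
out = tower(tf.constant(["a", "b"]))
print(out.shape)  # (2, 32)
```

The enclosing tfrs.Model would then compose such towers and remain the only Model subclass in the setup.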

xiaoyaoyang avatar Feb 03 '22 03:02 xiaoyaoyang

I have the same question as well. I couldn't find TFRS examples that showcase how to save and load a serialized model. I keep running into problems where the input pipeline graph cannot be serialized. I am not sure how practical this is if we cannot port a TFRS model from one environment to another.

chengyineng38 avatar Mar 29 '22 21:03 chengyineng38

+1 here. This seems to be quite a frequent issue.

@maciejkula could you please help out here? I think a guide would save lots of work :)

ydennisy avatar Apr 21 '22 16:04 ydennisy

I found that if I implement the get_config() method, then I am able to serialize the model: https://www.tensorflow.org/guide/keras/custom_layers_and_models#you_can_optionally_enable_serialization_on_your_layers . +1 to having a more explicit/dedicated guide, though.
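The get_config() pattern from that guide can be sketched as follows (ScaledDense and its fields are illustrative, not from this thread): the layer returns its constructor arguments as a dict, which lets Keras rebuild it during deserialization.

```python
# Sketch of the get_config() serialization pattern; ScaledDense is hypothetical.
import tensorflow as tf

class ScaledDense(tf.keras.layers.Layer):
    def __init__(self, units, scale=1.0, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.scale = scale
        self.dense = tf.keras.layers.Dense(units)

    def call(self, x):
        return self.dense(x) * self.scale

    def get_config(self):
        # Report constructor args so Keras can recreate the layer on load.
        config = super().get_config()
        config.update({"units": self.units, "scale": self.scale})
        return config

layer = ScaledDense(units=8, scale=0.5)
clone = ScaledDense.from_config(layer.get_config())
print(clone.units, clone.scale)  # 8 0.5
```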

chengyineng38 avatar Apr 21 '22 23:04 chengyineng38

@chengyineng38 thanks! I will look it up. I ended up saving checkpoints (save_weights) and calling load_weights later. It seems the TFRS model is a "custom"/subclassed model, and I was looking into how to save a subclassed model, which seems aligned with what you posted.

+1 for explicit guide
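For reference, the save_weights/load_weights workaround mentioned above can be sketched on a toy subclassed model (TinyModel and the file name are hypothetical); note the restored model must be called once so its variables exist before loading:

```python
# Sketch of the save_weights / load_weights workaround for a subclassed model.
# TinyModel and the checkpoint file name are illustrative.
import numpy as np
import tensorflow as tf

class TinyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(4)

    def call(self, x):
        return self.dense(x)

model = TinyModel()
x = np.ones((2, 3), dtype="float32")
_ = model(x)  # build variables before saving
model.save_weights("tiny.weights.h5")

restored = TinyModel()
_ = restored(x)  # build variables before loading
restored.load_weights("tiny.weights.h5")
assert np.allclose(model(x).numpy(), restored(x).numpy())
```

This restores variable values only; the Python class definition must be available to rebuild the model.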

xiaoyaoyang avatar Apr 22 '22 01:04 xiaoyaoyang

There are two ways of saving model state in TensorFlow:

  1. Using checkpoints for saving model variables during training.
  2. Using tf.saved_model.save/load to store and load models for serving.

Here are the Keras docs for checkpointing (1). For (2), you can call tf.saved_model.save on your trained Model instance.
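The two approaches above can be sketched on a toy subclassed model, assuming TF 2.x tf.keras (TinyModel and the paths are illustrative): (1) a ModelCheckpoint callback saves variables during training, (2) tf.saved_model.save exports the trained model for serving.

```python
# Sketch of both approaches; TinyModel and file paths are hypothetical.
import numpy as np
import tensorflow as tf

class TinyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(1)

    def call(self, x):
        return self.dense(x)

model = TinyModel()
model.compile(optimizer="adam", loss="mse")
x = np.ones((8, 3), dtype="float32")
y = np.zeros((8, 1), dtype="float32")

# (1) checkpoint variables during training
ckpt_cb = tf.keras.callbacks.ModelCheckpoint(
    "tiny.weights.h5", save_weights_only=True)
model.fit(x, y, epochs=1, verbose=0, callbacks=[ckpt_cb])

# (2) export the trained model for serving and load it back
tf.saved_model.save(model, "./exported")
loaded = tf.saved_model.load("./exported")
print(loaded(tf.constant(x)).shape)
```

The loaded object is a SavedModel, not a Keras Model, so it is callable but no longer exposes fit() or summary().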

maciejkula avatar Jun 03 '22 21:06 maciejkula

Thanks! I have not explored this for a while and kept using save_weights, but I remember struggling with saved_model last time (a missing signature or something). It feels like there are some extra steps needed to save a custom model.

xiaoyaoyang avatar Jun 28 '22 21:06 xiaoyaoyang

Below is how I save & load the model/index

import numpy as np
import tensorflow as tf
import tensorflow_recommenders as tfrs

scann = tfrs.layers.factorized_top_k.ScaNN(
    model.query_model,
    num_reordering_candidates=100
)

scann.index_from_dataset(
    products.batch(128).map(
        lambda x: (x['product_token'] + ': ' + x['product_name'], # idx identifier
                   model.product_model({                          # transform embeddings 
                       "product_name": x['product_name'],
                       "product_description": x["product_description"]
                   }))
    )
)

path = './your_model_name'
tf.saved_model.save(
    scann,
    path,
    options=tf.saved_model.SaveOptions(namespace_whitelist=["Scann"])
)

loaded = tf.saved_model.load(path)

_, titles = loaded({
    "query_text": np.array(["hello world"])
})
for title in titles[0]:
    print(title)

jasonzyx avatar Aug 17 '22 03:08 jasonzyx