JasonXu
Edit: I figured it out myself. For those who, like me, don't know how to call signatures, here is the way: ``` loaded.signatures["k_50"](tf.constant(["red dress"])) ``` **Original question:** I was able...
Below is how I save & load the model/index:
```
scann = tfrs.layers.factorized_top_k.ScaNN(
    model.query_model, num_reordering_candidates=100
)
scann.index_from_dataset(
    products.batch(128).map(
        lambda x: (x['product_token'] + ': ' + x['product_name'],  # idx identifier
                   model.product_model({...
```
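For completeness, here is a minimal sketch of the save/load round trip with an explicit serving signature, assuming `scann` is the indexed layer from the snippet above. The signature name "k_50" matches the edit at the top; the export path, the wrapper function name, and `k=50` are hypothetical choices, not taken from the original post:
```
import tensorflow as tf

# Hypothetical wrapper exposing a fixed-k top-k lookup as a serving signature.
@tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
def serve_k_50(queries):
    # The ScaNN layer returns (scores, identifiers) for the top-k candidates.
    return scann(queries, k=50)

tf.saved_model.save(
    scann,
    "export/scann_index",
    signatures={"k_50": serve_k_50},
    # ScaNN ops live outside the default op namespace and must be whitelisted.
    options=tf.saved_model.SaveOptions(namespace_whitelist=["Scann"]),
)

loaded = tf.saved_model.load("export/scann_index")
# A loaded signature returns a dict of output tensors, not a (scores, ids) tuple.
result = loaded.signatures["k_50"](tf.constant(["red dress"]))
```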
Have you tried the WARP loss for TFRS? WARP loss is implemented in lightfm, and I think that's probably the main reason driving the difference here.
@ydennisy I found this: https://gist.github.com/vihari/c3c59bf2e4f18722a872499b0394986c First define the WARP loss as in the gist, then pass it to the retrieval task like below: ``` self.task: tf.keras.layers.Layer = tfrs.tasks.Retrieval(loss=warp_loss, ...) ```
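For context, a minimal sketch of how a custom loss plugs into the task. The `warp_loss` below is only a placeholder that falls back to the default cross-entropy (it is not the gist's WARP implementation); any callable taking `(y_true, y_pred, sample_weight)` over the in-batch score matrix should fit the same slot:
```
import tensorflow as tf
import tensorflow_recommenders as tfrs

def warp_loss(y_true, y_pred, sample_weight=None):
    # Placeholder body: replace with the WARP loss from the gist linked above.
    # y_pred is the [batch, batch] matrix of query-candidate scores,
    # y_true is the matching label matrix built by the Retrieval task.
    per_example = tf.keras.losses.categorical_crossentropy(
        y_true, y_pred, from_logits=True)
    return tf.reduce_sum(per_example)

# Inside a model's __init__ this would be assigned to self.task.
task = tfrs.tasks.Retrieval(loss=warp_loss)
```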
With the categorical cross-entropy loss, have you tried tuning the parameter `num_hard_negatives`? It shares a somewhat similar idea with the WARP loss. This may help boost the performance to the...
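For example (a minimal sketch; the value 10 is an arbitrary illustration and should be tuned):
```
import tensorflow_recommenders as tfrs

# With num_hard_negatives set, only the highest-scoring (hardest) in-batch
# negatives are kept when computing the cross-entropy loss.
task = tfrs.tasks.Retrieval(num_hard_negatives=10)
```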
@drtinumohan Curious, how did you set up the sample_weight?