Hurizma1003

17 comments by Hurizma1003

Let's assume I have 5000 documents and their 5000 integer labels, and in this corpus we have 14000 unique words. According to the paper, the total number of nodes will be ==>...
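Assuming a TextGCN-style construction (one node per document plus one node per unique word; document labels do not add nodes), a quick sanity check of the count:

```python
# Node count under the assumed TextGCN-style heterogeneous graph:
# document nodes + word nodes, labels contribute no nodes.
num_docs = 5000
vocab_size = 14000

total_nodes = num_docs + vocab_size
print(total_nodes)  # 19000 under this assumption
```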

I tried installing using the following commands, and during installation I didn't receive any error:

```
!pip install torch-scatter -f https://data.pyg.org/whl/torch-1.10.0+cu113.html
!pip install torch-sparse -f https://data.pyg.org/whl/torch-1.10.0+cu113.html
!pip install torch-geometric
```

but I am getting...
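If it helps, a quick import check after those commands can confirm the compiled wheels actually match the installed torch build; the expected versions below are assumptions read off the wheel URLs:

```python
# Verify the extensions import and the torch build matches the wheel
# index used above (1.10.0 + cu113 per the URLs; adjust if yours differ).
import torch
import torch_scatter
import torch_sparse
import torch_geometric

print(torch.__version__)            # expected: 1.10.0+cu113
print(torch.version.cuda)           # expected: 11.3
print(torch_geometric.__version__)
```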

I am facing the exact same issue. How do I solve it? Any update?

Replace the model section in the notebook with the following:

```python
model = implicit.als.AlternatingLeastSquares(factors=20, regularization=0.1, iterations=50)
alpha = 15
data = (sparse_customer_item * alpha).astype('double')
model.fit(data)
```

The implicit model expects customer_item data.
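For context, here is one way the `sparse_customer_item` matrix might be built before that snippet runs; the raw arrays below are hypothetical, and only the orientation (customers as rows, items as columns) comes from the comment:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Hypothetical interaction log: customer row index, item column index,
# and an implicit-feedback strength such as a purchase count.
customer_ids = np.array([0, 0, 1, 2])
item_ids = np.array([10, 11, 10, 12])
counts = np.array([3.0, 1.0, 2.0, 5.0])

# Rows are customers and columns are items, which is the orientation
# the snippet above passes to model.fit().
sparse_customer_item = csr_matrix(
    (counts, (customer_ids, item_ids)),
    shape=(customer_ids.max() + 1, item_ids.max() + 1),
)
```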

Hi, I am trying to save the index using the following ways:

1.
    ```python
    items = user_item_interactions.map(lambda x: x["item-text"])
    index = tfrs.layers.factorized_top_k.BruteForce(model.query_model)
    index.index_from_dataset(
        tf.data.Dataset.zip((items.batch(4), user_item_interactions.batch(4).map(model.candidate_model)))
    )
    tf.saved_model.save(index, path)
    ```

There is...
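For comparison, the pattern in the TFRS retrieval tutorial zips identifiers and embeddings taken from the same candidate dataset and calls the index once before saving. A sketch under that assumption, reusing the names from the comment (the example query string is hypothetical, and this assumes `query_model` accepts raw string input):

```python
import tensorflow as tf
import tensorflow_recommenders as tfrs

# Both sides of the zip come from the SAME items dataset so that each
# identifier batch lines up with its own embedding batch.
items = user_item_interactions.map(lambda x: x["item-text"])

index = tfrs.layers.factorized_top_k.BruteForce(model.query_model)
index.index_from_dataset(
    tf.data.Dataset.zip((items.batch(4), items.batch(4).map(model.candidate_model)))
)

# Call the index once so its variables and shapes are built, then save.
_ = index(tf.constant(["example-query"]))  # hypothetical raw query input
tf.saved_model.save(index, path)
```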

> Will you elaborate on your problem setting? If (a) you have a pre-existing graph which you trained on, and a new node is added to that graph, then no...

I also want to know. Could you reply?

Thanks! In step 3, it should be `scores[num_items:]`, right?

```python
{'factorized_top_k/top_1_categorical_accuracy': [0.04586755111813545, 0.012231347151100636, 0.014677616767585278, 0.01895858906209469, 0.02192905731499195, 0.04359601438045502, 0.038528744131326675, 0.03363620489835739, 0.03302463889122009, 0.035558272153139114],
 'factorized_top_k/top_5_categorical_accuracy': [0.04752752184867859, 0.02341429330408573, 0.04298444837331772, 0.027782632037997246, 0.03791717812418938, 0.058011531829833984, 0.07233968377113342, 0.05626419559121132, 0.06526297330856323, 0.06141883507370949],
 'factorized_top_k/top_10_categorical_accuracy': [0.04883802309632301, 0.03616984188556671, 0.0710291787981987, 0.03529617190361023, 0.056351564824581146, 0.06613664329051971, 0.1050148531794548, ...
```