ninesky110

Results 9 comments of ninesky110

> > Outputting after the concat is, in terms of the computed result, the same as the paper's sigmoid(y_FM + y_DNN), where the two terms are computed separately and then summed.
> > I think multiplying the first-order term by feature_bias is redundant. The embedding result is concatenated with the deep and second-order parts and fed into a final projection layer; the feat_value-projection part alone is already equivalent to the LR form (Equation 2 in the paper), so multiplying by a feature_bias up front without any non-linear activation is completely unnecessary.
> > PS: since we are discussing this on GitHub, wouldn't English be more appropriate?
>
> I have the same question.
> According to the paper, y_FM = reduce_sum(first_order, 1) + reduce_sum(second_order, 1) and
> y_DNN = reduce_sum(y_deep, 1); how does this compare with
> concat([first_order, second_order,...
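For what it's worth, here is a minimal sketch of the equivalence being debated (the tensor names and shapes below are made up for illustration, not taken from the repo): if the projection weights over the concatenated vector are all ones and there is no bias, concat-then-project gives exactly sigmoid(y_FM + y_DNN); with learned weights the concat form is at least as expressive.

```python
import tensorflow as tf

# Hypothetical shapes: a batch of 4 samples.
first_order = tf.random.normal([4, 10])   # first-order (LR-like) terms
second_order = tf.random.normal([4, 8])   # pairwise FM interaction terms
y_deep = tf.random.normal([4, 32])        # output of the DNN part

# Paper-style output: y_FM + y_DNN, then sigmoid.
y_fm = tf.reduce_sum(first_order, 1) + tf.reduce_sum(second_order, 1)
y_dnn = tf.reduce_sum(y_deep, 1)
out_sum = tf.sigmoid(y_fm + y_dnn)

# Concat-then-projection: with the projection weights fixed to ones and no
# bias, the linear layer collapses to the same sum over the concatenation.
concat = tf.concat([first_order, second_order, y_deep], axis=1)
w = tf.ones([concat.shape[1], 1])
out_concat = tf.sigmoid(tf.squeeze(tf.matmul(concat, w), axis=1))

print(float(tf.reduce_max(tf.abs(out_sum - out_concat))))  # ~0 up to float error
```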

Same question about this part:

```python
def call(self, inputs, mask=None, training=None, **kwargs):
    if self.supports_masking:
        if mask is None:
            raise ValueError(
                "When supports_masking=True,input must support masking")
        queries, keys = inputs
        key_masks = tf.expand_dims(mask[-1], axis=1)
```

How is `key_masks` obtained here? Surprisingly, the program does not even raise an error.
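As far as I can tell (this is a standalone sketch, not the library's own code; the ids and shapes below are made up), the `mask` argument is filled in by Keras itself: any upstream layer built with masking, such as an Embedding with mask_zero=True, produces a mask via compute_mask, and when a layer receives a list of inputs the mask arrives as a matching list, so mask[-1] would be the boolean mask of `keys`.

```python
import tensorflow as tf

# Standalone illustration of how the keys' mask could reach call():
# an Embedding with mask_zero=True produces a [batch, T] boolean mask,
# which Keras forwards to downstream layers as the `mask` argument.
keys_ids = tf.constant([[3, 7, 0, 0],
                        [5, 0, 0, 0]])            # 0 is the padding id
emb = tf.keras.layers.Embedding(10, 4, mask_zero=True)
keys = emb(keys_ids)                              # [2, 4, 4] embeddings
keys_mask = emb.compute_mask(keys_ids)            # [2, 4] boolean mask

key_masks = tf.expand_dims(keys_mask, axis=1)     # [2, 1, 4], as in the layer
print(key_masks)
```

That would also explain why nothing errors out: as long as the keys tensor comes from a mask-producing layer, mask[-1] is a valid [batch, T] tensor and the expand_dims succeeds.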

I have the same problem. Have you solved it?

![image](https://user-images.githubusercontent.com/38121533/125430018-8653c451-aa72-487e-9b0f-f51d069e00c6.png)

When I changed "data_batch_size" or "max_data_per_step", I got this error: ![image](https://user-images.githubusercontent.com/38121533/125432414-d9a4cfb0-a97a-46b4-b4c5-4dab9ada8889.png)

Me too! I used the unsupervised method and it did not work.

Have you tried adjusting the parameters or replacing the unsupervised tasks with supervised tasks?

I replaced the unsupervised tasks with supervised tasks and found that it worked.

> The negative sampling is not used in the supervised case. @ninesky110 will you elaborate on how it did not work? The embeddings learned during unsupervised should be different as...