Cot2g

Two issues with Cot2g

In `base_model.py`:

```python
def _cross_l_loss(self, hparams):
    cross_l_loss = tf.zeros([1], dtype=tf.float32)
    for param in self.cross_params:
        cross_l_loss = tf.add(cross_l_loss,
                              tf.multiply(hparams.cross_l1, tf.norm(param, ord=1)))
        cross_l_loss = tf.add(cross_l_loss,
                              tf.multiply(hparams.cross_l2, tf.norm(param, ord=1)))
    return cross_l_loss
```

I wonder about this: both terms call `tf.norm(param, ord=1)`, so the `cross_l2` coefficient is also being applied to the L1 norm. Shouldn't the second term use `ord=2`?
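For comparison, here is a minimal NumPy sketch (not the repo's code; the function name and coefficient names are just for illustration) of what a combined L1/L2 penalty conventionally looks like, with the second term based on the squared values rather than the absolute values:

```python
import numpy as np

def combined_l1_l2_penalty(params, l1=0.01, l2=0.01):
    """Combined L1/L2 regularization over a list of parameter arrays.

    Unlike the snippet above, the second term here uses the sum of
    squares (an L2-style penalty), not the L1 norm again.
    """
    loss = 0.0
    for p in params:
        loss += l1 * np.sum(np.abs(p))      # L1 penalty: sum of |w|
        loss += l2 * np.sum(np.square(p))   # L2 penalty: sum of w^2
    return loss
```

For example, for a single parameter vector `[1.0, -2.0]` with `l1=l2=0.1`, the L1 term contributes `0.1 * 3.0 = 0.3` and the L2 term `0.1 * 5.0 = 0.5`, for a total of `0.8`.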

`preprocess_word` prepends 'W' to each feature inside every vector feature. What is the motivation for this operation, and how does it help improve the model?