echoht

7 issues filed by echoht

I ran on a CPU machine with the default args and got the following output:

num_params: 167301123
INFO:root:Variables info has been saved.
5%|#####5 | 1533/30681 [3:21:15

Python version: 3.6.0. The failing line of code is: `_prepro = lambda x: [line.strip() for line in open(x, 'r').read().split("\n") \ if not line.startswith("`
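The quoted one-liner reads a file and drops lines starting with some prefix (the prefix itself is truncated in the issue text). A minimal sketch of an equivalent helper that is Python 3.6-compatible, where the prefix argument and the function name `prepro` are hypothetical, and a context manager replaces the bare `open()` so the file handle is closed:

```python
import os
import tempfile

def prepro(path, skip_prefix):
    # Hypothetical reconstruction of the truncated _prepro lambda:
    # read a file, strip each line, and drop lines starting with
    # skip_prefix (the real prefix is cut off in the issue).
    with open(path, 'r') as f:
        return [line.strip() for line in f.read().split("\n")
                if not line.startswith(skip_prefix)]

# Usage with a temporary file:
fd, tmp = tempfile.mkstemp()
with os.fdopen(fd, 'w') as f:
    f.write("keep me\n# skip me\nkeep too")
print(prepro(tmp, "#"))  # -> ['keep me', 'keep too']
os.remove(tmp)
```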

At https://github.com/lajanugen/zeshel/blob/584f15d5b0ae285df7dcd1a3e99ffae2203984dc/run_classifier.py#L203, BertModel has no such param.

![企业微信截图_16832967678761](https://user-images.githubusercontent.com/48375360/236485946-52f3cee5-2428-4d7c-889d-3e49f7b916a2.png) When the batch size here is set to 1, the logic is fine; when the batch size != 1, a tensor size mismatch occurs.

Fix a bug when batch size != 1.
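The mismatch described above is the typical pattern where a `squeeze()` (or similar shape collapse) happens to line up only when the batch dimension is 1. A minimal NumPy sketch with hypothetical shapes, not the project's actual tensors, showing how the failure appears only for batch size > 1 and how keeping the batch axis fixes it:

```python
import numpy as np

def loss_buggy(logits, targets):
    # squeeze() collapses a (1, 1) target to a scalar, so this
    # happens to broadcast against (1, n) logits when batch == 1 ...
    return logits - targets.squeeze()

def loss_fixed(logits, targets):
    # ... keeping targets as (batch, 1) broadcasts for any batch size.
    return logits - targets

logits1, targets1 = np.zeros((1, 3)), np.zeros((1, 1))
logits4, targets4 = np.zeros((4, 3)), np.zeros((4, 1))

print(loss_buggy(logits1, targets1).shape)  # (1, 3) -- works by accident
print(loss_fixed(logits4, targets4).shape)  # (4, 3)
try:
    loss_buggy(logits4, targets4)           # (4, 3) vs (4,): shapes clash
except ValueError as e:
    print("batch size 4 fails:", e)
```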

I read the WebRL and AutoGLM papers. The AutoGLM paper mentions using WebRL's training framework, so I suspect its dataset is not the portion open-sourced by VAB; it most likely covers tasks on more websites, especially Chinese ones. Has that data been open-sourced?

### Reminder

- [x] I have read the above rules and searched the existing issues.

### System Info

Referenced code:

```
# Run PPO step
self.model.train()
stats = self.step(queries, responses, rewards)...
```
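The quoted snippet runs one PPO step over (queries, responses, rewards). For context, the core of such a step is the clipped surrogate objective; a minimal NumPy sketch with toy numbers, not the project's actual trainer (function name, shapes, and values are all hypothetical):

```python
import numpy as np

def ppo_clip_objective(logp_new, logp_old, advantages, clip_eps=0.2):
    # Probability ratio between the updated policy and the rollout policy.
    ratio = np.exp(logp_new - logp_old)
    # Clipped surrogate: take the pessimistic minimum of the
    # unclipped and clipped terms, then average over tokens.
    unclipped = ratio * advantages
    clipped = np.clip(ratio, 1 - clip_eps, 1 + clip_eps) * advantages
    return np.minimum(unclipped, clipped).mean()

# Toy numbers: one token with positive advantage, one with negative.
logp_old = np.array([-1.0, -1.0])
logp_new = np.array([-0.5, -1.5])
adv = np.array([1.0, -1.0])
print(ppo_clip_objective(logp_new, logp_old, adv))
```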

Labels: bug, pending