wxam

Results: 5 issues of wxam

I have read all of your code but ran into some errors; it took me about a day to fix them. Also, I noticed that the speed bottleneck is data...

In the 3DMM setup I keep `ep` and `tp` at 'zero' and `sp` at 'random':

```python
sp = bfm.get_shape_para('random')
ep = bfm.get_exp_para('zero')
vertices = bfm.generate_vertices(sp, ep)
tp = bfm.get_tex_para('zero')
colors...
```

In the forward function, `x = torch.inverse(...)`:

```python
tensor = torch.FloatTensor([[1, 2], [3, 4]])
t_out = torch.inverse(tensor)
t_o = 1 / tensor
print(tensor)
print(t_out)
print(t_o)
```

Output:

```
 1  2
 3  4
[torch.FloatTensor of size 2x2]

-2.0000  1.0000
...
```
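To make the distinction concrete, here is a minimal sketch (using NumPy rather than PyTorch, purely for illustration) showing that the matrix inverse computed by `torch.inverse` is not the elementwise reciprocal `1 / tensor`:

```python
import numpy as np

a = np.array([[1.0, 2.0], [3.0, 4.0]])

# Matrix inverse: the matrix m such that a @ m equals the identity.
mat_inv = np.linalg.inv(a)      # [[-2.0, 1.0], [1.5, -0.5]]

# Elementwise reciprocal: simply 1/x applied to each entry.
elem_inv = 1.0 / a              # [[1.0, 0.5], [0.333..., 0.25]]

# Only the matrix inverse satisfies a @ inv == identity.
print(np.allclose(a @ mat_inv, np.eye(2)))   # True
print(np.allclose(a @ elem_inv, np.eye(2)))  # False
```

This matches the printed values in the snippet above: the `-2.0000 1.0000` row comes from the matrix inverse, not from taking reciprocals of the entries.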

In the official ChatGLM implementation, tokenization uses the sentencepiece library (`import sentencepiece as spm`). In `self.sp.EncodeAsPieces(text)`, an English word such as "hello" is turned into "▁hello" (note that the leading "▁" character is not an underscore). This should be the most standard approach, but this project does not seem to do any similar handling.
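As a quick check (a minimal sketch, not ChatGLM's actual tokenizer, which requires a trained SentencePiece model file), the word-boundary marker SentencePiece uses is U+2581 (LOWER ONE EIGHTH BLOCK), a different code point from the ASCII underscore U+005F:

```python
# SentencePiece prefixes word-initial pieces with "▁" (U+2581),
# which looks like, but is not, an underscore "_" (U+005F).
piece = "\u2581hello"      # the shape EncodeAsPieces produces for a word like "hello"
print(hex(ord(piece[0])))  # 0x2581
print(piece[0] == "_")     # False
# Detokenization maps the marker back to a space:
print(repr(piece.replace("\u2581", " ")))  # ' hello'
```

Any code that strips or matches a literal `_` when post-processing pieces will therefore miss this marker entirely.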