champson
The root cause of this issue is a pybind11 ABI mismatch between PyTorch and th-glm. Adding the pybind11 ABI macro defines at build time, the same way PyTorch's own extension builder does (https://github.com/pytorch/pytorch/blob/1fae179ee1b59c42c41f9dc7b55a2cba64737adb/torch/utils/cpp_extension.py#L1975), fixes it.
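Concretely, the fix is to compile th-glm with the same `PYBIND11_COMPILER_TYPE`, `PYBIND11_STDLIB`, and `PYBIND11_BUILD_ABI` macros that PyTorch itself was built with (on a real install these can be read from `torch._C`, e.g. `torch._C._PYBIND11_BUILD_ABI`, which is what `torch.utils.cpp_extension` reads internally). A minimal sketch of building those `-D` flags — the values shown are placeholders for illustration, not something to copy verbatim:

```python
# Sketch: assemble the extra -D flags that align a C++ extension's
# pybind11 ABI tag with the one PyTorch was built with. The three
# values are placeholders -- on a real system query them from torch,
# e.g. getattr(torch._C, "_PYBIND11_BUILD_ABI").
def pybind11_abi_flags(compiler_type: str, stdlib: str, build_abi: str) -> list[str]:
    values = {
        "COMPILER_TYPE": compiler_type,
        "STDLIB": stdlib,
        "BUILD_ABI": build_abi,
    }
    # Each macro must expand to a C string literal, hence the escaped
    # quotes (as needed when the flag is passed on a shell command line).
    return [f'-DPYBIND11_{name}=\\"{val}\\"' for name, val in values.items()]

# Placeholder values for illustration only:
flags = pybind11_abi_flags("_gcc", "_libstdcpp", "_cxxabi1011")
print(" ".join(flags))
```

Appending these flags to the th-glm compile command makes its pybind11 internals ABI tag match PyTorch's, so both sides agree on the shared interpreter state.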
I get an error; the run output is:

```
top_k: 1
top_p: 0.0
int8_mode: 0
random_seed: 5
temperature: 1
max_seq_len: 1024
max_batch_size: 1
repetition_penalty: 1
vocab_size: 50304
tensor_para_size: 1
pipeline_para_size: 1
lib_path: ./lib/libth_transformer.so
ckpt_path: ./models/345m/c-model/1-gpu
hf_config: {'activation_function':...
```