obito

Results: 19 comments of obito

> I'm using LoRA with a small training set of about 600 samples, and the loss drops to 0.01 very quickly. Is a lower loss always better? Is this overfitting?

My samples average 1000 tokens each, and the model fails to fit them.

> Hey @wccccp 👋
>
> The exception is not caused by `transformers` but by a `.json` file (or something similar). Your tokenizer checkpoint may be corrupted.
>
> See [this](https://stackoverflow.com/questions/16573332/jsondecodeerror-expecting-value-line-1-column-1-char-0) Stack Overflow question.

You are right, the question is solved.
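The diagnosis above (a bad `.json` in the tokenizer checkpoint, not `transformers` itself) can be checked with a short stdlib-only sketch. The helper name and path handling are mine, not from the thread: it simply tries to parse the file and reports where decoding fails, which is how a truncated or zero-byte download surfaces as `Expecting value: line 1 column 1 (char 0)`.

```python
import json

def check_tokenizer_json(path):
    """Try to parse a tokenizer JSON file; return None if it is valid,
    otherwise a message saying where parsing failed.

    An empty or truncated download typically fails immediately with
    'Expecting value' at line 1, column 1.
    """
    try:
        with open(path, encoding="utf-8") as f:
            json.load(f)
        return None  # file parses cleanly
    except json.JSONDecodeError as e:
        return f"{path}: {e.msg} at line {e.lineno}, column {e.colno}"
```

Running this over `tokenizer.json`, `tokenizer_config.json`, etc. pinpoints which file in the checkpoint is broken before `transformers` ever touches it.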

> > > > Hi @nature1949 @lxd941213 @matriim @Abandon339 @yeloveyou @MrPeterJin @Abandon339 @Xin-GY @fatboy1994 please make sure you are on a secure connection (Check the URL of the app https://...

> What command are you running to get this error?

I loaded the Hugging Face model for local inference and this happened.

> Did you pull a Baichuan model? Although the Baichuan and LLaMA models are near twins, their model files still differ, and the mapping step in the underlying code is written only for LLaMA, which is why you get this error. The fix is to switch to a LLaMA model.

So only LLaMA is supported for now? 😄
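Since the mapping code described above assumes a LLaMA layout, one way to catch the mismatch before loading any weights is to read the checkpoint's `config.json` and inspect the architecture class it declares. A minimal stdlib sketch, assuming a standard Hugging Face checkpoint layout (the helper names are mine):

```python
import json

def declared_architectures(config_path):
    """Return the architecture classes named in a checkpoint's config.json.

    LLaMA checkpoints typically declare "LlamaForCausalLM", while Baichuan
    checkpoints declare a Baichuan-specific class, so this distinguishes
    the two before any weight mapping is attempted.
    """
    with open(config_path, encoding="utf-8") as f:
        cfg = json.load(f)
    return cfg.get("architectures", [])

def looks_like_llama(config_path):
    # Case-insensitive match so "LlamaForCausalLM" variants are accepted.
    return any("llama" in a.lower() for a in declared_architectures(config_path))
```

Rejecting non-LLaMA checkpoints up front gives a clear error instead of a failure deep inside the mapping step.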

> [2023-12-04 16:49:56,456] [INFO] [launch.py:145:main] WORLD INFO DICT: {'localhost': [0]}
> [2023-12-04 16:49:56,456] [INFO] [launch.py:151:main] nnodes=1, num_local_procs=1, node_rank=0
> [2023-12-04 16:49:56,456] [INFO] [launch.py:162:main] global_rank_mapping=defaultdict(, {'localhost': [0]})
> [2023-12-04 16:49:56,456] [INFO] [launch.py:163:main] dist_world_size=1
> [2023-12-04...

> Solved. It was a CUDA environment issue that caused execution to fall back to the CPU.

How did you solve it?

* Serving Flask app 'run'
* Debug mode: on
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running...

> Please paste the program's error message.

* Serving Flask app "run" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI...
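The warning in that log is Flask's standard nudge to run the app behind a real WSGI server in production. A minimal deployment sketch using gunicorn, assuming the Flask object is named `app` inside a module `run` (both names are guesses based on the log's `Flask app "run"`):

```shell
# Install a production WSGI server (gunicorn is one common choice).
pip install gunicorn

# Serve run:app (module "run", Flask object "app" -- hypothetical names)
# with 2 worker processes, instead of Flask's built-in development server.
gunicorn --workers 2 --bind 0.0.0.0:8000 run:app
```

With gunicorn fronting the app, the development-server warning disappears and requests are handled by proper worker processes.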