Results: 4 issues from RDouglas

```python
import json
import os
import sys
import time
import warnings
from pathlib import Path
from typing import Optional

import lightning as L
import torch

from generate import generate
from ...
```

Labels: enhancement, generation

I attempted a workaround, but the output from finetuning still doesn't look right. Has anyone found a working fix for this issue?

I use FastChat as the framework for both training and dialog-based inference, and FastChat supports Meta/Llama. I was excited to try the 3B Open-Llama model, and the FastChat finetuning...
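For reference, a minimal sketch of loading an Open-Llama 3B checkpoint through Hugging Face transformers, independent of FastChat's own training scripts (the model id `openlm-research/open_llama_3b`, the prompt, and the generation settings are assumptions, not part of the original report):

```python
# Sketch: load an Open-Llama 3B checkpoint and run a quick generation.
# The model id below is an assumption; substitute the checkpoint you use.
import torch
from transformers import AutoModelForCausalLM, LlamaTokenizer

model_id = "openlm-research/open_llama_3b"

# OpenLLaMA's docs recommend the slow (SentencePiece) tokenizer.
tokenizer = LlamaTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

prompt = "Q: What is the capital of France?\nA:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

A quick sanity check like this can help separate checkpoint/tokenizer problems from problems in the finetuning pipeline itself.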

In run/.py, changing the line `from transformers import Trainer` to `from src.transformers import Trainer` solved the problem.
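Concretely, the fix amounts to a one-line import swap so the script picks up the repository's local `transformers` copy instead of the installed package (only the import change comes from the report; the surrounding file layout is assumed):

```python
# Before: imports the pip-installed transformers package.
# from transformers import Trainer

# After: imports the local copy vendored under src/.
from src.transformers import Trainer
```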