Anh (Luka) Pham

Results: 15 comments of Anh (Luka) Pham

@wen020 Yes, I use a 4090 to train. I think training requires more than 24 GB; otherwise it causes OOM.

My config works on a 4090 if you switch to a small model like the 1B one. The only problem I have is the LoRA result: it is bad.

@wen020 Yes, especially on faces :(((

@dome272 Actually, the 3B model trained with LoRA is still bad, especially on faces. Have you ever tried training faces with LoRA?

@wen020 I can only train the 1B model on a 4090; otherwise it causes OOM.
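A rough back-of-envelope sketch (my own assumption, not something stated in the thread) of why a 3B model tends to OOM on a 24 GB card while a 1B model fits: full fine-tuning with mixed-precision Adam needs roughly 16 bytes per parameter before activations are even counted.

```python
# Hypothetical estimate: bytes per parameter for full fine-tuning with
# mixed-precision Adam (fp16 weights + fp16 grads + fp32 master weights
# + fp32 moment estimates m and v). Ignores activations and framework
# overhead, so real usage is higher than this lower bound.

def training_vram_gb(n_params: float) -> float:
    bytes_per_param = (
        2      # fp16 weights
        + 2    # fp16 gradients
        + 4    # fp32 master copy of weights
        + 8    # Adam m and v states, fp32 each
    )
    return n_params * bytes_per_param / 1024**3

for label, n in [("1B", 1e9), ("3B", 3e9)]:
    gb = training_vram_gb(n)
    verdict = "fits" if gb <= 24 else "OOM"
    print(f"{label}: ~{gb:.0f} GB optimizer+weights -> {verdict} on a 24 GB 4090")
```

Under these assumptions the 1B model's optimizer state alone stays under 24 GB, while the 3B model's does not, which is consistent with only the 1B model training on a 4090.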

@DaveyBoy1970 Do you have any idea how we can call it?

@evesloan Have you run it? I always get conflict errors for CUDA and am unable to install because of CUDA, ... It would be nice if you could share your setup...

@CenekSanzak Hi, have you tried implementing your idea of multi-ControlNet?