Todd Mostak
@ienkovich how is this branch looking in terms of moving out of draft? Also saw the ASAN failure in GeospatialTest, any ideas on that?
@fexolm have you tried on latest master? I believe this commit should add support for varlen columnar conversion. https://github.com/omnisci/omniscidb/commit/5fe4ee1bea2cc2f10c2e67ae0f59a6d52a9b401a
I'd bump this task in priority since it will take away a way for people to sledgehammer our backend. Should be quite simple.
I believe this is necessary for getting the best results when fine-tuning Llama 3, although there seems to be some confusion (https://huggingface.co/meta-llama/Meta-Llama-3-8B/discussions/9).
Just as a follow-up, I implemented a hacky version of this to help with training Llama 3, and indeed adding BOS tokens to prompts and answers when fine-tuning the Llama...
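For reference, a minimal sketch of the idea (assuming the Hugging Face transformers tokenizer API; the example prompt/answer pair and the exact prepend logic here are illustrative, not the actual patch):

```python
# Sketch: explicitly prepend a single BOS token to each fine-tuning example,
# rather than relying on the tokenizer's default behavior, which has differed
# across tokenizer versions (see the linked discussion).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B")

prompt = "What is the capital of France?"   # hypothetical example pair
answer = " The capital of France is Paris."

# Prepend BOS ourselves, then tokenize with add_special_tokens=False so the
# tokenizer does not add a second BOS on top of ours.
text = tokenizer.bos_token + prompt + answer
ids = tokenizer(text, add_special_tokens=False)["input_ids"]

# Sanity check: the sequence starts with exactly one BOS id.
assert ids[0] == tokenizer.bos_token_id
assert ids[1] != tokenizer.bos_token_id
```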
Yes, good point... still seems desirable to have native support in the UX though.
I'd like to bump this... it seems DoRA yields accuracy basically on par with full fine-tuning.
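For context, a minimal sketch of what enabling DoRA already looks like with Hugging Face peft (which exposes it via `use_dora=True` on `LoraConfig`); the model id and hyperparameters below are illustrative only, not a recommendation for this project:

```python
# Sketch: wrap a base causal LM with a DoRA adapter via peft.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B")

config = LoraConfig(
    r=16,                                   # low-rank dimension (illustrative)
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],    # which projections to adapt
    use_dora=True,                          # weight-decomposed low-rank adaptation
)

model = get_peft_model(base, config)
model.print_trainable_parameters()          # only the adapter weights train
```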
I wanted to check and see if there was any update on this. We've hit this as well and would love to be able to construct models directly for model...
@napetrov yes, ideally all models would support serialization so we can store them on disk (we're adding ML training/inference support to our database). However, if it helps with prioritization, we're currently...
@napetrov Wanted to see if you had any update on this?