
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJ...

21 xTuring issues

`finetuning_config = model.finetuning_config()` `finetuning_config.output_dir = "model_cp/"` Changing the model's output directory as above (documented at https://xturing.stochastic.ai/configuration/finetune_configure/) has no effect in Databricks, and the model checkpoints are still...
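For anyone trying to reproduce the report, the config-override pattern from the docs looks roughly like the following. This is a minimal self-contained sketch: the `FinetuningConfig` dataclass and `finetune` function here are stand-ins, not xTuring's actual classes, and only illustrate that the attribute must be overridden before training starts for the checkpoint path to take effect.

```python
from dataclasses import dataclass

# Stand-in for the object returned by model.finetuning_config();
# xTuring's real config class has many more fields.
@dataclass
class FinetuningConfig:
    output_dir: str = "saved_model/"
    num_train_epochs: int = 3

def finetune(config: FinetuningConfig) -> str:
    # A real trainer would write checkpoints under config.output_dir;
    # here we just report the directory that would be used.
    return f"checkpoints written to {config.output_dir}"

finetuning_config = FinetuningConfig()
finetuning_config.output_dir = "model_cp/"   # override before training
print(finetune(finetuning_config))           # checkpoints written to model_cp/
```

If the override truly happens before the fine-tuning call and checkpoints still land elsewhere, the config object being trained with may not be the one that was modified.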

Issue with installing xTuring on Windows via `pip install xturing`. The install fails with: `[WARNING] Unable to import torch, pre-compiling ops will be disabled. Please visit https://pytorch.org/ to see how to properly install...`

Proposal to Add Mistral Model Family. Overview: I'd love to see the **Mistral model family** added to the list of supported models. These models show interesting performance improvements...

Bumps [@babel/traverse](https://github.com/babel/babel/tree/HEAD/packages/babel-traverse) from 7.21.3 to 7.23.2. Release notes sourced from @babel/traverse's releases. v7.23.2 (2023-10-11) NOTE: This release also re-publishes @babel/core, even if it does not appear in the linked release...

dependencies
javascript

Bumps [postcss](https://github.com/postcss/postcss) from 8.4.21 to 8.4.31. Release notes sourced from postcss's releases. 8.4.31: Fixed \r parsing to fix CVE-2023-44270. 8.4.30: Improved source map performance (by @romainmenke). 8.4.29: Fixed Node#source.offset (by...

dependencies
javascript

This PR adds support for the above-mentioned LLMs using LiteLLM (https://github.com/BerriAI/litellm/). LiteLLM lets you use any LLM as a drop-in replacement for gpt-3.5. Example: `from litellm...`
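The "drop-in replacement" idea can be sketched without the library itself: a single OpenAI-style `completion()` call is routed to different provider backends based on the model name. The routing table and backend functions below are hypothetical stand-ins for illustration; LiteLLM's real implementation and provider list differ.

```python
# Hypothetical sketch of the drop-in idea behind LiteLLM: one
# OpenAI-style completion() entry point, routed by model-name prefix.
def _call_openai(model, messages):
    return f"[openai:{model}] echo: {messages[-1]['content']}"

def _call_anthropic(model, messages):
    return f"[anthropic:{model}] echo: {messages[-1]['content']}"

ROUTES = {"gpt-": _call_openai, "claude-": _call_anthropic}

def completion(model, messages):
    for prefix, backend in ROUTES.items():
        if model.startswith(prefix):
            return backend(model, messages)
    raise ValueError(f"no backend for model {model!r}")

msgs = [{"role": "user", "content": "hello"}]
print(completion("gpt-3.5-turbo", msgs))  # [openai:gpt-3.5-turbo] echo: hello
print(completion("claude-2", msgs))       # [anthropic:claude-2] echo: hello
```

The appeal for xTuring is that callers keep one interface while the backing model is swapped by changing only the model string.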

Hey folks, I'm trying to get 4-bit fine-tuning working for the 13B/30B models. Any chance you could release the script used to convert the 7B version of the model to...

enhancement
help wanted
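The requested conversion script isn't in the thread, but the core idea of converting weights to 4-bit can be sketched. The example below is plain symmetric round-to-nearest quantization, an assumption for illustration only; it is not necessarily the scheme used for the 7B checkpoint.

```python
# Minimal symmetric 4-bit round-to-nearest quantization sketch.
# Real 4-bit LLM conversion (e.g. GPTQ-style) is considerably more involved.
def quantize_4bit(weights):
    # One scale per weight group; signed 4-bit range is -8..7.
    scale = (max(abs(w) for w in weights) or 1.0) / 7
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

w = [0.12, -0.7, 0.35, 0.01]
q, s = quantize_4bit(w)
w_hat = dequantize(q, s)  # approximate reconstruction of w
```

A production converter would quantize per-channel or per-group, pack two 4-bit values per byte, and store the scales alongside the packed tensor.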

Could xTuring support Open Assistant? https://github.com/LAION-AI/Open-Assistant/issues/2716

enhancement
help wanted

- Add more fine-tuning/generation parameters (gradient_accumulation_steps, save_total_limit, eval_steps, max_grad_norm,...) in config files.

enhancement
help wanted
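The proposed parameters could extend the config schema along these lines. The field names mirror common Hugging Face `TrainingArguments` conventions, and the class name and defaults below are illustrative placeholders, not xTuring's actual values.

```python
from dataclasses import dataclass

# Illustrative extension of a fine-tuning config with the requested fields.
@dataclass
class ExtendedFinetuningConfig:
    learning_rate: float = 1e-4
    num_train_epochs: int = 3
    gradient_accumulation_steps: int = 1   # accumulate grads over N batches
    save_total_limit: int = 2              # keep at most N checkpoints on disk
    eval_steps: int = 500                  # run evaluation every N steps
    max_grad_norm: float = 1.0             # gradient clipping threshold

# Users would override only what they need, as with output_dir today.
cfg = ExtendedFinetuningConfig(gradient_accumulation_steps=8)
```

Keeping the names aligned with `TrainingArguments` would make the options immediately familiar to users coming from Transformers.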