VT.ai
Multi-modal AI Assistant
Introduction
VT.ai is a multi-modal AI chatbot assistant that offers a chat interface for interacting with Large Language Models (LLMs) from various providers, either via remote APIs or locally with Ollama.
The application supports multi-modal conversations, seamlessly integrating text, images, and vision processing with LLMs.
[Beta] Multi-modal AI Assistant support via OpenAI's Assistant API function calling.
Key Features
- [Beta] Assistant support: a multi-modal AI assistant powered by OpenAI's Assistant API that can write and run code to answer math questions.
- Multi-Provider Support: Choose from a variety of LLM providers including OpenAI, Anthropic, and Google, with more to come.
- Multi-Modal Conversations: Experience rich, multi-modal interactions by uploading text and image files. You can even drag and drop images for the model to analyze.
- Real-time Responses: Stream responses from the LLM as they are generated.
- Dynamic Settings: Customize model parameters such as temperature and top-p during your chat session.
- Clean and Fast Interface: Built using Chainlit, ensuring a smooth and intuitive user experience.
- Advanced Conversation Routing: Utilizes SemanticRouter for accurate and efficient modality selection.
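
Multi-provider support typically works by mapping the model the user picks to a provider-prefixed model ID that a library like LiteLLM can dispatch. A minimal sketch of that idea; the labels and model IDs below are illustrative assumptions, not VT.ai's actual configuration:

```python
# Toy sketch: map a model-dropdown label to a LiteLLM-style model string.
# The labels and model IDs below are hypothetical examples, not VT.ai's real config.

MODEL_MAP = {
    "OpenAI GPT-4": "gpt-4",
    "Anthropic Claude": "claude-3-opus-20240229",
    "Ollama": "ollama/llama3",
}

def resolve_model(dropdown_label: str) -> str:
    """Return the provider-prefixed model ID for the selected label."""
    try:
        return MODEL_MAP[dropdown_label]
    except KeyError:
        raise ValueError(f"Unknown model selection: {dropdown_label!r}")
```

With a mapping like this, the rest of the app can stay provider-agnostic: it passes the resolved model string to the completion call and lets the abstraction layer handle authentication and request formats.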


Getting Started
Prerequisites
- Python 3.7 or higher
- (Optional, recommended) `rye` as the Python dependency manager (installation guide below)
- For using local models with Ollama:
  - Download the Ollama client from https://ollama.com/download
  - Download the desired Ollama models from https://ollama.com/library (e.g., `ollama pull llama3`)
  - Follow the Ollama installation and setup instructions
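
Since the app requires Python 3.7 or higher, a quick version guard can fail fast before any dependencies are imported. A small sketch (the helper name is hypothetical, not part of VT.ai):

```python
import sys

# Minimum interpreter version documented in the prerequisites.
REQUIRED = (3, 7)

def python_ok(version_info=sys.version_info) -> bool:
    """True if the running interpreter meets the documented minimum (3.7)."""
    return tuple(version_info[:2]) >= REQUIRED
```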
A note on Python package/environment management
- You can use native Python `pip` to install the dependencies without installing `rye`. If so, skip these steps and proceed to the Usage section below.
- [Recommended] If you want to use `rye` and already installed it in the Prerequisites step, skip these steps and proceed to the Usage section below. Otherwise, install `rye` as follows:
  a. Install `rye` (Python package manager): `curl -sSf https://rye-up.com/get | bash`
  b. Source the Rye env file to update your PATH (add this to your shell configuration file, e.g., `.zprofile` or `.zshrc`): `source "$HOME/.rye/env"`
Usage
- Rename the `.env.example` file to `.env` and configure your desired LLM provider API keys. If using Ollama, you can leave the API keys blank.
- Create a Python virtual environment: `python3 -m venv .venv`
- Activate the virtual environment: `source .venv/bin/activate`
- Package management:
  - Using `pip`, install the dependencies by running: `pip install -r requirements.txt`
  - [Recommended] If you use `rye`, sync the dependencies by running: `rye sync`
- (Optional) Run the semantic router trainer once: `python src/router/trainer.py`
- Run the app with optional hot reload: `chainlit run src/app.py -w`
- Open the provided URL in your web browser (e.g., `localhost:8000`).
- Select an LLM model and start chatting or uploading files for multi-modal processing. If using Ollama, select the `Ollama` option from the model dropdown.
- To run an Ollama server for serving local LLM models (example with Meta's Llama 3 model):
  - `ollama pull llama3` to download the `llama3` model (replace with the desired model name)
  - `ollama serve` to start the Ollama server
  - `ollama --help` for more options and details
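
The `.env` file holds `KEY=VALUE` pairs for provider API keys. As a rough illustration of the format (the variable names below are hypothetical examples, and real apps typically load such files with a library like python-dotenv), a minimal parser:

```python
# Minimal .env-style parser, for illustration only.
# Variable names in `example` are hypothetical, not VT.ai's actual keys.

def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines, skipping blanks and '#' comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

example = """
# Leave keys blank when using Ollama locally
OPENAI_API_KEY=sk-example
ANTHROPIC_API_KEY=
"""
```

Note that a blank value (as in `ANTHROPIC_API_KEY=` above) is still a valid entry, which matches the instruction to leave keys empty when running against Ollama.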
Technical Overview
Dependencies
- Chainlit: A powerful library for building chat applications with LLMs, providing a clean and fast front-end.
- LiteLLM: A versatile library for interacting with LLMs, abstracting away the complexities of different providers.
- SemanticRouter: A high-performance library for accurate conversation routing, enabling dynamic modality selection.
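
Semantic routing works by comparing an incoming utterance against example utterances for each route and picking the closest match. A toy illustration of that idea using bag-of-words cosine similarity; this is not the SemanticRouter API (which uses learned embeddings), and the route names and examples are made up:

```python
import math
from collections import Counter

# Toy illustration of semantic routing: score an utterance against example
# phrases per route and pick the best match. Real routers (e.g., SemanticRouter)
# use learned embeddings rather than bag-of-words counts.

ROUTES = {
    "vision": ["describe this image", "what is in the picture"],
    "chat": ["tell me a joke", "explain this concept"],
}

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def route(utterance: str) -> str:
    """Return the route whose example utterances best match the input."""
    words = Counter(utterance.lower().split())
    best, best_score = "chat", -1.0
    for name, examples in ROUTES.items():
        score = max(_cosine(words, Counter(e.split())) for e in examples)
        if score > best_score:
            best, best_score = name, score
    return best
```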
Contributing
Contributions are welcome! Here's how you can contribute:
- Fork the repository
- Create a new branch: `git checkout -b my-new-feature`
- Make your changes and commit them: `git commit -m 'Add some feature'`
- Push to the branch: `git push origin my-new-feature`
- Submit a pull request
Releases
See the release tags.
License
This project is licensed under the MIT License.
Contact
For questions, suggestions, or feedback, feel free to reach out:
- Twitter: @vinhnx