🧠 Ollama Client — Chat with Local LLMs in Your Browser
Ollama Client is a powerful, privacy-first Chrome extension that lets you chat with locally hosted LLMs using Ollama — no cloud, no tracking. It’s lightweight, open source, and designed for fast, offline-friendly AI conversations.
🚀 Get Started — Install Now
❤️ Upvote Us on Product Hunt!
🌐 Explore More
✨ Features
🤖 Model Management
- 🔌 Local Ollama Integration – Connect to a local Ollama server (no API keys required)
- 🌐 LAN/Local Network Support – Connect to Ollama servers on your local network using IP addresses (e.g., `http://192.168.x.x:11434`)
- 🔄 Model Switcher – Switch between models in real time with a beautiful UI
- 🔍 Model Search & Pull – Search and pull models directly from Ollama.com in the UI (with progress indicator)
- 🗑️ Model Deletion – Clean up unused models with confirmation dialogs
- 🧳 Load/Unload Models – Manage Ollama memory footprint efficiently
- 📦 Model Version Display – View and compare model versions easily
- 🎛️ Advanced Parameter Tuning – Per-model configuration: temperature, top_k, top_p, repeat penalty, stop sequences, system prompts
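Per-model settings like these map onto the `options` object of Ollama's REST API (`POST /api/generate`). A minimal sketch of how such a request body fits together; the `buildGenerateRequest` helper and its values are illustrative, not the extension's actual code:

```typescript
// Illustrative mapping of per-model tuning parameters onto an Ollama
// /api/generate request body. The option names (temperature, top_k,
// top_p, repeat_penalty, stop) come from the Ollama REST API; the
// helper itself is a hypothetical sketch.

interface ModelOptions {
  temperature?: number;
  top_k?: number;
  top_p?: number;
  repeat_penalty?: number;
  stop?: string[];
}

function buildGenerateRequest(
  model: string,
  prompt: string,
  system: string,
  options: ModelOptions
) {
  // stream: true asks Ollama to stream tokens as they are generated
  return { model, prompt, system, options, stream: true };
}

const body = buildGenerateRequest(
  "mistral",
  "Summarize this page.",
  "You are a concise assistant.",
  { temperature: 0.7, top_k: 40, top_p: 0.9, repeat_penalty: 1.1, stop: ["</s>"] }
);

// A plain (non-streaming-aware) call would then look like:
// await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(body),
// });
```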
💬 Chat & Conversations
- 💬 Beautiful Chat UI – Modern, polished interface built with Shadcn UI
- 🗂️ Multi-Chat Sessions – Create, manage, and switch between multiple chat sessions
- 📤 Export Chat Sessions – Export single or all chat sessions as PDF, JSON, Markdown, or Text
- 📥 Import Chat Sessions – Import single or multiple chat sessions from JSON files
- 📋 Copy & Regenerate – Quickly rerun or copy AI responses
- ⚡ Streaming Responses – Real-time streaming with typing indicators
- 🗑️ Message Deletion – Delete individual messages with cascading cleanup (v0.5.10)
- 📎 Export Individual Messages – Export single messages in PDF, Markdown, JSON, or Text formats (v0.5.10)
- 🌿 Conversation Branching – Fork conversations by editing messages and explore alternate paths
🧠 Embeddings & Semantic Search (Beta v0.3.0)
- 🔍 Semantic Chat Search – Search chat history by meaning, not just keywords
- 📊 Vector Database – IndexedDB-based vector storage with optimized cosine similarity
- 🎯 Smart Chunking – 3 strategies: fixed, semantic, hybrid (configurable)
- 🚀 Optimized Search – Pre-normalized vectors, caching, early termination
- 🔧 Configurable – Chunk size, overlap, similarity threshold, search limits
- 📁 Context-Aware – Search across all chats or within current session
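The "pre-normalized vectors" optimization above rests on a simple identity: once every stored embedding has unit length, cosine similarity reduces to a plain dot product, so no norms need to be recomputed at query time. A small sketch of the idea (function names are hypothetical, not the extension's implementation):

```typescript
// Why pre-normalization speeds up semantic search: for unit vectors,
// cosine(a, b) === dot(a, b), so each stored vector is normalized once
// at insert time instead of on every query.

function normalize(v: number[]): number[] {
  const len = Math.hypot(...v); // Euclidean norm
  return v.map((x) => x / len);
}

function dot(a: number[], b: number[]): number {
  let s = 0;
  for (let i = 0; i < a.length; i++) s += a[i] * b[i];
  return s;
}

const a = normalize([3, 4]);
const b = normalize([4, 3]);
const sim = dot(a, b); // cosine similarity = 24/25 = 0.96
```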
📎 File Upload & Processing (Beta v0.3.0+)
- 📄 Text Files – Support for `.txt`, `.md`, and other text-based files
- 📁 PDF Support – Extract and process text from PDF documents
- 📘 DOCX Support – Extract text from Word documents
- 📊 CSV Support – Parse CSV, TSV, PSV with custom delimiters and column extraction (Beta v0.5.0)
- 🌐 HTML Support – Convert HTML to Markdown for clean text extraction, with 50+ language support (Beta v0.5.0)
- ⚙️ Auto-Embedding – Automatic embedding generation for uploaded files
- 📊 Progress Tracking – Real-time progress indicators during processing
- 🎛️ Configurable Limits – User-defined max file size in settings
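Before auto-embedding, uploaded text has to be split into chunks. A hedged sketch of what a "fixed" chunking strategy with configurable size and overlap might look like (the function is hypothetical and assumes `overlap < chunkSize`):

```typescript
// Illustrative fixed-size chunking with overlap: slide a window of
// chunkSize characters, advancing by chunkSize - overlap each step so
// adjacent chunks share some context. Assumes overlap < chunkSize.

function chunkFixed(text: string, chunkSize: number, overlap: number): string[] {
  const chunks: string[] = [];
  const step = chunkSize - overlap;
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last window reached the end
  }
  return chunks;
}

const chunks = chunkFixed("abcdefghij", 4, 2);
// → ["abcd", "cdef", "efgh", "ghij"]
```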
🌐 Webpage Integration
- 🧠 Enhanced Content Extraction – Advanced extraction with multiple scroll strategies (none, instant, gradual, smart)
- 🔄 Lazy Loading Support – Automatically waits for dynamic content to load
- 📄 Site-Specific Overrides – Configure extraction settings per domain (scroll strategies, delays, timeouts)
- 🎯 Defuddle Integration – Smart content extraction with Defuddle fallback
- 📖 Mozilla Readability – Fallback extraction using Mozilla Readability
- 🎬 YouTube Transcripts – Automated YouTube transcript extraction
- 📊 Extraction Metrics – View scroll steps, mutations detected, and content length
⚙️ Customization & Settings
- 🎨 Professional UI – Modern design system with glassmorphism effects, gradients, and smooth animations
- 🌓 Dark Mode – Beautiful dark theme with smooth transitions
- 📝 Prompt Templates – Create, manage, and use custom prompt templates (Ctrl+/)
- 🔊 Advanced Text-to-Speech – Searchable voice selector with adjustable speech rate & pitch
- 🌍 Internationalization (i18n) – Full multi-language support with 9 languages: English, Hindi, Spanish, French, German, Italian, Chinese (Simplified), Japanese, Russian
- 🎚️ Cross-Browser Compatibility – Works with Chrome, Brave, Edge, Opera, Firefox
- 🧪 Voice Testing – Test voices before using them
🔒 Privacy & Performance
- 🛡️ 100% Local and Private – All storage and inference happen on your device
- 🧯 Declarative Net Request (DNR) – Automatic CORS handling
- 💾 IndexedDB Storage – Efficient local storage for chat sessions
- ⚡ Performance Optimized – Lazy loading, debounced operations, optimized re-renders
- 🔄 State Management – Clean Zustand-based state management
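The DNR-based CORS handling mentioned above works by rewriting response headers declaratively instead of proxying requests. A rule of roughly this shape illustrates the idea; the rule id, values, and filter are example assumptions, not the extension's actual rules:

```typescript
// Illustrative declarativeNetRequest rule: set a permissive
// Access-Control-Allow-Origin header on responses from a local Ollama
// server so extension pages can call it. Values are examples only.

const corsRule = {
  id: 1,
  priority: 1,
  action: {
    type: "modifyHeaders",
    responseHeaders: [
      { header: "Access-Control-Allow-Origin", operation: "set", value: "*" },
    ],
  },
  condition: {
    urlFilter: "localhost:11434",
    resourceTypes: ["xmlhttprequest"],
  },
};

// A rule like this would be registered via
// chrome.declarativeNetRequest.updateDynamicRules({ addRules: [corsRule] })
```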
🧩 Tech Stack
Built with React 18, TypeScript, Plasmo, Shadcn UI, Zustand, Dexie.js and Tailwind CSS.
For a deep dive into the code structure and design patterns, see ARCHITECTURE.md.
🛠️ Quick Setup
✅ 1. Install the Extension
✅ 2. Install Ollama on Your Machine
brew install ollama # macOS
# or visit https://ollama.com for Windows/Linux installers
ollama serve # starts at http://localhost:11434
💡 Quick Setup Script (Cross-platform):
For easier setup with LAN access and Firefox CORS support:
# Cross-platform bash script (macOS/Linux/Windows with Git Bash)
./tools/ollama-env.sh firefox # Firefox with CORS + LAN access
./tools/ollama-env.sh chrome # Chrome with LAN access
📄 Script file: tools/ollama-env.sh
This script automatically:
- Configures Ollama for LAN access (`0.0.0.0`)
- Sets up CORS for Firefox extensions (if needed)
- Shows your local IP address for network access
- Detects your OS (macOS, Linux, Windows) automatically
- Stops any running Ollama instances before starting
If you don't have the script file, you can download it directly or see the full setup guide: Ollama Setup Guide
More info: https://ollama.com
✅ 3. Pull a Model
ollama pull gemma3:1b
Other options: mistral, llama3:8b, codellama, etc.
⚙️ 4. Configure the Extension
- Click the Ollama Client icon
- Open ⚙️ Settings
- Set your:
  - Base URL: `http://localhost:11434` (default) or your local network IP (e.g., `http://192.168.1.100:11434`)
  - Default model (e.g., `gemma:2b`)
  - Theme & appearance
  - Model parameters
  - Prompt templates
💡 Tip: You can use Ollama on a local network server by entering its IP address (e.g., `http://192.168.x.x:11434`) in the Base URL field. Make sure Ollama is configured with `OLLAMA_HOST=0.0.0.0` for LAN access.
Advanced parameters like system prompts and stop sequences are available per model.
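When entering a Base URL by hand, small inconsistencies (trailing slashes, stray whitespace) are easy to introduce. A hypothetical helper for cleaning the value before use, together with a reachability check against Ollama's real `GET /api/tags` endpoint (which lists installed models); the helper is illustrative, not the extension's code:

```typescript
// Hypothetical helper: normalize a user-entered base URL by trimming
// whitespace and dropping trailing slashes.

function normalizeBaseUrl(input: string): string {
  return input.trim().replace(/\/+$/, "");
}

const base = normalizeBaseUrl(" http://192.168.1.100:11434/ ");
// → "http://192.168.1.100:11434"

// A quick reachability check could then hit the real Ollama endpoint:
// const res = await fetch(`${base}/api/tags`);
// if (res.ok) { /* server reachable; response lists installed models */ }
```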
🛠️ Local Development Setup
Want to contribute or customize? You can run and modify the Ollama Client extension locally using Plasmo.
⚙️ Prerequisites
📦 1. Clone the Repo
git clone https://github.com/Shishir435/ollama-client.git
cd ollama-client
📥 2. Install Dependencies
Using pnpm (recommended):
pnpm install
Or with npm:
npm install
🧪 3. Run the Extension (Dev Mode)
Start development mode with hot reload:
pnpm dev
Or with npm:
npm run dev
This launches the Plasmo dev server and gives instructions for loading the unpacked extension in Chrome:
- Open `chrome://extensions`
- Enable Developer mode
- Click Load unpacked
- Select the `dist/` folder generated by Plasmo
🛠 4. Build for Production
pnpm build
⛓️ 5. Package for Production
pnpm package
🧪 6. Run, Build, and Package for Firefox (Experimental)
Setup Ollama for Firefox:
Firefox requires manual CORS configuration. Use the helper script:
# Cross-platform bash script (macOS/Linux/Windows with Git Bash)
./tools/ollama-env.sh firefox
This configures OLLAMA_ORIGINS for Firefox extension support.
Build and run:
pnpm dev --target=firefox
pnpm build --target=firefox
pnpm package --target=firefox
Or with npm:
npm run dev -- --target=firefox
Load as a temporary extension.
📁 Code Structure
src/
├── background/ # Background service worker & API handlers
├── sidepanel/ # Main chat UI
├── options/ # Settings page
├── features/ # Feature modules
│ ├── chat/ # Chat components, hooks, semantic search
│ ├── model/ # Model management & settings
│ ├── sessions/ # Chat session management
│ ├── prompt/ # Prompt templates
│ └── tabs/ # Browser tab integration
├── lib/ # Shared utilities
│ └── embeddings/ # Vector embeddings & semantic search
├── components/ # Shared UI components (Shadcn)
└── hooks/ # Shared React hooks
Architecture: Feature-based organization with separation of concerns (components, hooks, stores). Zustand for global state, React hooks for local state.
✅ Tips
- Change manifest settings in `package.json`
- PRs welcome! Check issues for open tasks
💡 Recommended Models by Device
| System Specs | Suggested Models |
|---|---|
| 💻 8GB RAM (no GPU) | gemma:2b, mistral:7b-q4 |
| 💻 16GB RAM (no GPU) | gemma:3b-q4, mistral |
| 🎮 16GB+ with GPU (6GB VRAM) | llama3:8b-q4, gemma:3b |
| 🔥 RTX 3090+ or Apple M3 Max | llama3:70b, mixtral |
📦 Prefer quantized models (q4_0, q5_1, etc.) for better performance.
Explore: Ollama Model Library
🧪 Firefox Support
Ollama Client is a Chrome Manifest V3 extension. To use in Firefox:
- Go to `about:debugging`
- Click "Load Temporary Add-on"
- Select the `manifest.json` from the extension folder
- Manually allow CORS access (see setup guide)
🐛 Known Issues
- [ ] "Stop Pull" during model downloads may glitch
- [ ] Large chat histories in IndexedDB can impact performance
🗺️ Roadmap
See ROADMAP.md for planned features and upcoming releases.
🔗 Useful Links
- 🌐 Install Extension: Chrome Web Store
- 📘 Docs & Landing Page: ollama-client
- 🐙 GitHub Repo: GitHub Repo
- 📖 Setup Guide: Ollama Setup Guide
- 🔒 Privacy Policy: Privacy Policy
- 🐞 Issue Tracker: Report a Bug
- 🙋‍♂️ Portfolio: shishirchaurasiya.in
- 💡 Feature Requests: Email Me
📚 For Developers & Contributors
- 🏗️ ARCHITECTURE.md - Technical architecture, design patterns, and implementation details
- 🗺️ ROADMAP.md - Planned features and development timeline
- 📊 PROJECT_ANALYSIS.md - Comprehensive code quality analysis
- 🤝 CONTRIBUTING.md - Guidelines for contributing to the project
📢 Spread the Word!
If you find Ollama Client helpful, please consider:
- ⭐ Starring the repo
- 📝 Leaving a review on the Chrome Web Store
- 💬 Sharing on socials (tag #OllamaClient)
Built with ❤️ by @Shishir435