ollma
Here are 8 public repositories matching this topic...
CDoc lets you chat with your documents using local LLMs, combining Ollama, ChromaDB, and LangChain for offline, secure, and efficient information extraction. Perfect for researchers, developers, and professionals seeking quick insights from their documents.
Updated Aug 26, 2024 - Python
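The CDoc description combines Ollama, ChromaDB, and LangChain into a local retrieval-augmented pipeline. Below is a minimal sketch of that flow using the ollama and chromadb Python clients directly (without LangChain); the model names nomic-embed-text and llama3 and the example documents are assumptions, not taken from the CDoc repository.

```python
import chromadb
import ollama

# Local embedding and chat models served by Ollama (assumed model tags).
EMBED_MODEL = "nomic-embed-text"
CHAT_MODEL = "llama3"

# In-memory ChromaDB collection holding document chunks.
client = chromadb.Client()
collection = client.get_or_create_collection("docs")

documents = [
    "Invoices must be approved within 14 days of receipt.",
    "Quarterly reports are archived in the finance share.",
]

# Embed each chunk with Ollama and index it in ChromaDB.
for i, doc in enumerate(documents):
    emb = ollama.embeddings(model=EMBED_MODEL, prompt=doc)["embedding"]
    collection.add(ids=[str(i)], embeddings=[emb], documents=[doc])

def ask(question: str) -> str:
    # Retrieve the most relevant chunks for the question.
    q_emb = ollama.embeddings(model=EMBED_MODEL, prompt=question)["embedding"]
    hits = collection.query(query_embeddings=[q_emb], n_results=2)
    context = "\n".join(hits["documents"][0])
    # Answer with a local chat model, grounded on the retrieved context.
    reply = ollama.chat(
        model=CHAT_MODEL,
        messages=[{
            "role": "user",
            "content": f"Answer using only this context:\n{context}\n\nQuestion: {question}",
        }],
    )
    return reply["message"]["content"]

print(ask("How quickly must invoices be approved?"))
```

Because both the embeddings and the chat completion come from a local Ollama server and ChromaDB runs in-process, no document content leaves the machine.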
Windows Copilot falls short, which is why Optimus was created: to be your true Windows assistant.
Updated Aug 9, 2025 - C#
An AI model (LLaVA from Ollama) used to generate a description of an image, using only a 7B model. Documentation PDF: https://drive.google.com/file/d/1dHPw1Z0hPHerkl3kMPB_Inmaaovv0M1r/view?usp=drivesdk
Updated Nov 16, 2025 - Python
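A minimal sketch of asking a locally served LLaVA 7B model for an image description through the ollama Python client; the model tag llava:7b and the image path are assumptions, and the actual project may drive the API differently.

```python
import ollama

# Ask a locally served LLaVA 7B model to describe an image
# (assumed model tag and image path).
response = ollama.chat(
    model="llava:7b",
    messages=[{
        "role": "user",
        "content": "Describe this image in two or three sentences.",
        "images": ["photo.jpg"],  # file path; the client ships the image to Ollama
    }],
)
print(response["message"]["content"])
```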
A smart GPU-aware proxy designed for Proxmox VE that dynamically manages GPU resources between Ollama and GPU-intensive background processes.
Updated Nov 14, 2025 - Python
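The proxy idea above amounts to checking GPU memory pressure before letting an Ollama request proceed. Below is a hypothetical sketch of that gating logic, querying nvidia-smi and forwarding to Ollama's default endpoint at http://localhost:11434; the memory threshold and refuse-when-busy behaviour are illustrative assumptions, not the project's actual implementation.

```python
import subprocess
import requests

OLLAMA_URL = "http://localhost:11434"   # Ollama's default listen address
MAX_USED_MIB = 6000                     # assumed memory budget reserved for background jobs

def gpu_memory_used_mib() -> int:
    # Query current GPU memory usage (first GPU) via nvidia-smi.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0])

def generate(prompt: str, model: str = "llama3") -> str:
    # Refuse the request if background processes currently need the GPU.
    if gpu_memory_used_mib() > MAX_USED_MIB:
        raise RuntimeError("GPU busy with background processes; try again later")
    # Forward to Ollama's non-streaming generate endpoint.
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(generate("Summarize today's Proxmox backup log in one sentence."))
```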
This is a security-first NL-to-SQL solution designed for enterprises with small to medium-sized databases. It enables natural language querying of data while ensuring that all processing remains fully on-premises. No data is sent to external cloud environments, and no sensitive information is exposed to external LLM services.
Updated Jan 29, 2026 - Python
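A hedged sketch of the on-premises NL-to-SQL pattern that description implies: a local Ollama model turns a question into SQL, and a simple guard rejects anything other than a single read-only SELECT before it touches the database. The schema, model name, and guard are illustrative assumptions, not the repository's actual design.

```python
import sqlite3
import ollama

SCHEMA = "CREATE TABLE orders (id INTEGER, customer TEXT, total REAL, placed_at TEXT);"

def question_to_sql(question: str) -> str:
    # Ask a locally hosted model (assumed tag) for a single SQL query.
    reply = ollama.chat(
        model="llama3",
        messages=[{
            "role": "user",
            "content": (
                "Given this SQLite schema:\n"
                f"{SCHEMA}\n"
                "Return only one SQL SELECT statement answering: "
                f"{question}"
            ),
        }],
    )
    # Naive cleanup of surrounding whitespace and stray backticks.
    return reply["message"]["content"].strip().strip("`").strip()

def run_readonly(sql: str, conn: sqlite3.Connection):
    # Security-first guard: execute only single SELECT statements.
    if not sql.lstrip().lower().startswith("select") or ";" in sql.rstrip(";"):
        raise ValueError(f"Refusing non-SELECT or multi-statement SQL: {sql!r}")
    return conn.execute(sql).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
conn.execute("INSERT INTO orders VALUES (1, 'Acme', 120.0, '2024-08-01')")

sql = question_to_sql("What is the total value of all orders?")
print(sql, run_readonly(sql, conn))
```

Because the model runs under Ollama and the database stays local, the query text and results never leave the premises, which is the core of the security claim above.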