llm-chat
Here are 23 public repositories matching this topic...
Start a chat room between all, or some, of your models running on Ollama. All in a single Bash shell script.
Updated Aug 30, 2025 - Shell
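A multi-model chat room reduces to looping over participants and handing each one the shared transcript. A minimal Python sketch of that loop, assuming a placeholder `ask_model` (the real project is a Bash script that would instead POST the transcript to Ollama's `/api/chat` endpoint):

```python
# Round-robin "chat room" between local models.
# ask_model is a hypothetical stand-in so the sketch runs without Ollama;
# a real version would call Ollama's /api/chat for the named model.
def ask_model(model: str, transcript: list[dict]) -> str:
    return f"{model} saw {len(transcript)} message(s)"

def chat_room(models: list[str], opener: str, rounds: int = 2) -> list[dict]:
    transcript = [{"speaker": "user", "text": opener}]
    for _ in range(rounds):
        for model in models:  # each model takes one turn per round
            reply = ask_model(model, transcript)
            transcript.append({"speaker": model, "text": reply})
    return transcript

log = chat_room(["llama3", "mistral"], "Hello, models!", rounds=1)
```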
Chat locally using leading open models built by the community, optimized and accelerated by NVIDIA's enterprise-ready inference runtime
Updated Jul 25, 2024 - Python
Mem0Chat is an application showcasing an LLM-powered chat with a model-agnostic persistent memory layer powered by mem0. It enables context-aware conversations, using mem0's memory management to retain and reuse key memories across sessions and chats for a personalized, efficient user experience.
Updated Jul 3, 2025 - JavaScript
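The core idea of a persistent memory layer is that facts survive the session that produced them and get recalled into later chats. A toy sketch of that pattern, purely illustrative and not mem0's actual API (which uses embeddings and LLM-based extraction rather than keyword matching):

```python
import json
import os
import tempfile

class MemoryStore:
    """Toy persistent memory layer (illustrative only, not mem0's API):
    facts are saved per user on disk and re-queried in later sessions."""

    def __init__(self, path: str):
        self.path = path
        self.data: dict[str, list[str]] = {}
        if os.path.exists(path):
            with open(path) as f:
                self.data = json.load(f)

    def remember(self, user: str, fact: str) -> None:
        self.data.setdefault(user, []).append(fact)
        with open(self.path, "w") as f:  # persist across sessions
            json.dump(self.data, f)

    def recall(self, user: str, query: str) -> list[str]:
        # Naive substring match; a real layer would use semantic search.
        return [m for m in self.data.get(user, []) if query.lower() in m.lower()]

path = os.path.join(tempfile.mkdtemp(), "mem.json")
MemoryStore(path).remember("alice", "Prefers concise answers")
hits = MemoryStore(path).recall("alice", "concise")  # fresh session, same memory
```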
Scalable implementation of semantic search and an LLM-powered chatbot for an online store
Updated Jul 23, 2024 - Python
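Semantic search ranks items by the similarity of their embedding vectors to the query's embedding, typically via cosine similarity. A self-contained sketch with hypothetical pre-computed product vectors (a production system would use an embedding model and a vector index, not a Python dict):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity: dot product over the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical embeddings for store items (real ones come from a model).
products = {
    "red running shoes": [0.9, 0.1, 0.0],
    "blue winter coat":  [0.1, 0.8, 0.3],
    "trail sneakers":    [0.8, 0.2, 0.1],
}

def search(query_vec: list[float], k: int = 2) -> list[str]:
    ranked = sorted(products, key=lambda p: cosine(query_vec, products[p]),
                    reverse=True)
    return ranked[:k]

top = search([1.0, 0.0, 0.0])  # a query vector "close to" footwear
```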
A single HTML page offering split-screen LLM chat and web-app prototyping
Updated May 22, 2025 - JavaScript
Chat App with Ollama
Updated Nov 21, 2025 - TypeScript
llm-is-fun-for-js introduces front-end developers to Large Language Models (LLMs) with simple projects like a chatbot, a text summarizer, and a To-Do List app. It encourages developers to explore new technologies, enhance their resumes, and build innovative applications using LLMs.
Updated Mar 30, 2025 - JavaScript
Implement LLM streaming with page-reload support using the Vercel AI SDK
Updated Oct 22, 2025 - TypeScript
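Surviving a page reload mid-stream means the server must buffer the chunks it has already emitted so a reconnecting client can resume from an offset instead of restarting generation. A minimal sketch of that idea (names are illustrative; the Vercel AI SDK implements resumable streams with its own server-side machinery):

```python
# Server-side buffer of emitted chunks, keyed by stream id, so a client
# that reloads can re-subscribe and catch up from where it left off.
streams: dict[str, list[str]] = {}

def emit(stream_id: str, chunk: str) -> None:
    # Called as the model produces tokens; chunk order is preserved.
    streams.setdefault(stream_id, []).append(chunk)

def resume(stream_id: str, offset: int) -> list[str]:
    # After a reload, the client sends the count of chunks it already saw.
    return streams.get(stream_id, [])[offset:]

emit("s1", "Hel")
emit("s1", "lo ")
emit("s1", "world")
missed = resume("s1", 1)  # client had received one chunk before reloading
```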
Updated Jul 2, 2025 - Go
A modern real-time streaming chat application with a FastAPI backend and a React/TypeScript frontend
Updated Aug 31, 2025 - Python
Updated May 14, 2024 - JavaScript
JavaScript templating engine based on Jinja2 (forked to experiment with LLM Chat templates in Jinja2 syntax)
Updated Aug 26, 2024 - JavaScript
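A chat template's job is to flatten a role-tagged message list into the single prompt string a model expects. A crude stand-in for what a Jinja2 chat template renders (the delimiter tokens below are made up for illustration; real templates and their special tokens are model-specific):

```python
# Flatten chat messages into one prompt string, the way a chat template
# would. The <|role|> markers here are invented, not any model's tokens.
def apply_chat_template(messages: list[dict]) -> str:
    parts = [f"<|{m['role']}|>\n{m['content']}" for m in messages]
    return "\n".join(parts) + "\n<|assistant|>\n"  # cue the model to reply

prompt = apply_chat_template([
    {"role": "system", "content": "Be brief."},
    {"role": "user", "content": "Hi"},
])
```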
An AI-powered journal that determines your mood.
Updated Nov 12, 2024 - TypeScript
Asses-2 is a modern, responsive frontend prototype for an AI chat application, built with React, TypeScript, and Vite. It features a clean interface for managing conversations, a collapsible sidebar with chat history, debounced search, and a rich input area supporting file attachments with drag-and-drop. The backend is currently simulated.
Updated Oct 27, 2025 - TypeScript
Local LLM chat: a CLI interface for GGUF and Transformers models with CUDA support. Run Llama, Mistral, Gemma, Phi, and Qwen locally, with automatic model detection, system-message adaptation, RAG support, and more.
Updated Nov 5, 2025 - Python
A branching LLM chat application that allows users to revisit past turns and create new conversation branches. Supports multiple LLM providers (OpenAI, Gemini, Claude, Ollama) with a React/Next.js frontend and a Python/FastAPI backend.
Updated Jun 7, 2025 - TypeScript
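Branching conversations are naturally a tree: each turn points at its parent, revisiting a past turn just means attaching another child to it, and the prompt for any branch is the path from that node back to the root. A small sketch of that data structure (independent of any particular LLM provider):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Turn:
    """One message in a conversation tree; branching = multiple children."""
    text: str
    parent: Optional["Turn"] = None
    children: list["Turn"] = field(default_factory=list)

    def reply(self, text: str) -> "Turn":
        child = Turn(text, parent=self)
        self.children.append(child)
        return child

    def context(self) -> list[str]:
        # Walk back to the root to rebuild the prompt for this branch.
        node, path = self, []
        while node is not None:
            path.append(node.text)
            node = node.parent
        return list(reversed(path))

root = Turn("Tell me a joke")
first = root.reply("Why did the chicken...")
second = root.reply("Knock knock...")  # revisit the same turn, new branch
ctx = second.context()
```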