Personal AI Notebooks. Organize files & webpages and generate notes from them. Open source, local & open data, open model choice (incl. local).
A simple, locally hosted Web Search MCP server for use with local LLMs
Self-hosted personalized AI in a mirror.
A local, privacy-first résumé builder using LLMs and Markdown to generate ATS-ready DOCX files with Pandoc — no cloud, no tracking.
Versatile Almost Local, Eventually Reasonable Assistant 🔫
Chrome Extension to Summarize or Chat with Web Pages/Local Documents Using locally running LLMs. Keep all of your data and conversations private. 🔐
💻 A simple, practical, and lightweight local AI chat client, built with Tauri 2.0 and Next.js.
Code with AI in VS Code; bring your own AI.
TypeScript library for Local LLM in Chromium browsers
Open Source Local Data Analysis Assistant.
🤖 Visual AI agent workflow automation platform with local LLM integration - build intelligent workflows using drag-and-drop interface, no cloud dependencies required.
Vibecode Editor is a fullstack, web-based IDE built with Next.js and Monaco Editor. It features real-time code execution using WebContainers, AI-powered code suggestions via locally running Ollama models, multi-stack templates, an integrated terminal, and a developer-focused UI for seamless coding in the browser.
Run local LLMs from Hugging Face in React Native or Expo using onnxruntime.
Find the best models and how to run them locally.
An Autonomous AI Software Engineer
M-Courtyard: Local AI Model Fine-tuning Assistant for Apple Silicon. Zero-code, zero-cloud, privacy-first desktop app powered by Tauri + React + mlx-lm.
Unlock GitHub Copilot as a local API Gateway. Use Copilot with Cursor, LangChain, and any OpenAI-compatible tool.
On-device AI for iOS & Android
Local AI desktop app — chat, agent mode, image gen, video gen. Supports Ollama, Gemma 4, Llama, Qwen, OpenAI, Anthropic. Single .exe, no Docker.