The pretty much "official" DSPy framework for TypeScript
NativeMind: Your fully private, open-source, on-device AI assistant
TypeScript SDK for different in-browser AI model providers, built to make client-side AI integration simpler and more consistent across vendors.
Your users download a 4 GB AI model and the connection drops at 3.8 GB; verifyfetch resumes from 3.8 GB and verifies every byte. Drop-in integrity verification for Transformers.js, WebLLM, and any large file in the browser.
A lightweight recreation of OS1/Samantha from the movie Her, running locally in the browser
A Webapp that uses Retrieval Augmented Generation (RAG) and Large Language Models to interact with a PDF directly in the browser.
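The retrieval half of an in-browser RAG pipeline like the one above can be sketched as: embed the user's question, rank the PDF's text chunks by cosine similarity against that embedding, and pass the top-k chunks to the LLM as context. This is a minimal sketch of only the ranking step; it assumes chunk embeddings have already been produced by some in-browser embedding model, and the names `Chunk` and `topK` are illustrative, not taken from any listed project.

```typescript
// Minimal sketch of similarity-based retrieval for browser RAG.
// Assumes embeddings were computed elsewhere; names are illustrative.

interface Chunk {
  text: string;
  embedding: number[];
}

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks most similar to the query embedding.
function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}
```

The selected chunks are then concatenated into the prompt, which is how a browser-only app can answer questions about a PDF without sending its contents to a server.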
Local-first Safari Web Extension + native app for structure-first reading, with bundled WebLLM popup inference and token-aware long-document processing on Apple platforms.
Not-Only LLM Chat: an AI application that enhances creativity and user experience beyond plain LLM chat. Note: this is a beta version; if you hit a database issue, clear the site data in your browser's developer tools.
Blueshell AI - Chat privately with AI and your documents fully offline, directly in your browser.
TalkFlow-Gun-Vue-WebLLM-DApp: E2EE chat, E2EE video/voice calls, decentralized AI, robots, drones, and communication facilities.
A proof of concept of what can be done with the BlockNote editor, based on ProseMirror, and an LLM running in the browser.
🚀 A blazingly fast, modern chat interface built with Rust/WASM and Leptos, featuring local AI model execution via WebLLM. Privacy-first design with no backend dependencies.
ChatGPT style interface for open-source LLMs, allowing completely free, offline, and private AI use. Connect with models downloaded from Ollama, or load models directly in the browser with WebLLM.
Local-first AI extension that turns what you read into a searchable knowledge graph with hybrid search, temporal context, and proactive recall—fully private.