Building MIT-licensed, offline-first, privacy-friendly LLM tools for everyday use.
- Develop browser-based LLM apps that run entirely on the client (no servers, no tracking).
- Package ultra-light open LLMs to run smoothly on modest hardware.
- Create task-specific tools (e.g. translator, summarizer, document Q&A) designed for offline use.
- Publish everything under MIT license so others can freely reuse and build upon it.
- Privacy-first: tools should work offline without sending data anywhere.
- Accessible: usable on everyday devices without heavy GPUs or cloud.
- Simple deploys: static hosting or local use — no complicated infra.
- Open by default: all code permissively licensed.
- Building ready-to-use translators that run entirely in-browser.
- Exploring everyday AI tools (summarization, note rewriting, language Q&A).
- Documenting clear, reproducible how-to-run guides for non-technical users.
- transformers.js & WebAssembly (SIMD)
- Web Workers & browser-native APIs
- Quantized LLMs (GGUF, llama.cpp ecosystem)
- Model Context Protocol (MCP) for local integrations
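As a toy illustration of the idea behind quantized LLMs like the GGUF models above, here is a minimal block-wise symmetric quantizer. This is a conceptual sketch only: the function names are made up for the example, and real GGUF encodings use more elaborate block layouts than this.

```javascript
// Toy block-wise symmetric quantization: store small integers plus one
// scale per block, instead of full-precision floats. This is the core
// idea behind GGUF-style quantized weights, not the actual format.
function quantizeBlock(weights, bits = 4) {
  const qmax = (1 << (bits - 1)) - 1;                  // e.g. 7 for 4-bit
  const scale = Math.max(...weights.map(Math.abs)) / qmax || 1;
  const q = weights.map(w => Math.round(w / scale));   // ints in [-qmax, qmax]
  return { scale, q };
}

function dequantizeBlock({ scale, q }) {
  return q.map(v => v * scale);                        // approximate floats
}

const block = [0.12, -0.5, 0.33, 0.07];
const packed = quantizeBlock(block);
const restored = dequantizeBlock(packed);
// restored is close to block; the rounding error is at most scale / 2
```

Storing 4-bit integers plus a per-block scale is what lets multi-billion-parameter models fit in the memory of everyday devices.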
When I’m not building, I’m usually outdoors: I play basketball & football (often with kids!) and can spin a basketball on my thumb 🏀👍.
For more information and links to my social profiles, visit my Linktree.