
Haris Naeem

Building MIT-licensed, offline-first, privacy-friendly LLM tools for everyday use.


What I do

  • Develop browser-based LLM apps that run entirely on the client (no servers, no tracking).
  • Package ultra-light open LLMs to run smoothly on modest hardware.
  • Create task-specific tools (e.g. translator, summarizer, document Q&A) designed for offline use; a minimal sketch follows this list.
  • Publish everything under MIT license so others can freely reuse and build upon it.
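A rough sketch of what "entirely on the client" looks like with transformers.js. This assumes @xenova/transformers is resolved by a bundler or CDN import; the model and sample text are placeholders, not taken from a specific project listed here.

```js
// Minimal sketch: a translation pipeline that runs entirely in the browser.
import { pipeline } from '@xenova/transformers';

// Downloads a small OPUS-MT model once (placeholder choice);
// transformers.js caches the files in the browser for later runs.
const translator = await pipeline('translation', 'Xenova/opus-mt-en-fr');

// Inference happens locally; no text is sent to a server.
const [result] = await translator('Offline tools keep your text on your device.');
console.log(result.translation_text);
```

After the first load the model files sit in the browser cache, so later runs work without a connection.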

Principles

  • Privacy-first: tools should work offline without sending data anywhere.
  • Accessible: usable on everyday devices without heavy GPUs or cloud.
  • Simple deploys: static hosting or local use — no complicated infra (see the configuration sketch after this list).
  • Open by default: all code permissively licensed.
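In practice this often comes down to a few transformers.js environment flags. The sketch below assumes the quantized model files are copied into a /models/ folder on the same static host; the path and folder name are illustrative, not a fixed convention of any project here.

```js
// Sketch: point transformers.js at model files served with the static site,
// so nothing is fetched from third-party servers at runtime.
import { env, pipeline } from '@xenova/transformers';

env.allowRemoteModels = false;   // never fall back to the Hugging Face Hub
env.localModelPath = '/models/'; // model files deployed alongside the site
env.useBrowserCache = true;      // reuse downloaded files on later visits

// Resolves to /models/opus-mt-en-fr/... on the same origin (folder name is a placeholder).
const translator = await pipeline('translation', 'opus-mt-en-fr');
```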

Current focus

  • Building ready-to-use translators that run entirely in-browser.
  • Exploring everyday AI tools (summarization, note rewriting, language Q&A).
  • Documenting clear, reproducible how-to-run guides for non-technical users.

Tech stack

  • transformers.js & WebAssembly (SIMD)
  • Web Workers & browser-native APIs (see the worker sketch after this list)
  • Quantized LLMs (GGUF, llama.cpp ecosystem)
  • Model Context Protocol (MCP) for local integrations
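A common pattern that ties these together: run the model inside a Web Worker so the page stays responsive while WebAssembly does the heavy lifting. The file names, the summarization model, and the module-worker import below are assumptions for illustration, not a description of a specific repository.

```js
// worker.js: runs inference off the main thread.
// Assumes a bundler or import map resolves '@xenova/transformers'.
import { pipeline } from '@xenova/transformers';

let summarizer = null;

self.onmessage = async (event) => {
  // Load the model lazily on the first message so the page starts fast.
  if (!summarizer) {
    summarizer = await pipeline('summarization', 'Xenova/distilbart-cnn-6-6');
  }
  const [{ summary_text }] = await summarizer(event.data);
  self.postMessage(summary_text);
};
```

The page thread then only exchanges messages with the worker, so the UI never blocks on inference:

```js
// main.js: send text to the worker and render whatever comes back.
const worker = new Worker(new URL('./worker.js', import.meta.url), { type: 'module' });

worker.onmessage = (event) => {
  document.querySelector('#output').textContent = event.data;
};

worker.postMessage('Paste a long article here to summarize it locally.');
```

Loading the model lazily (on the first message) keeps the page itself lightweight until the user actually asks for a result.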

Beyond coding

When I’m not building, I’m usually outdoors —
I play basketball & football (often with kids!) and can spin a basketball on my thumb 🏀👍.

Contact

For more information and links to my social profiles, visit my Linktree.

Popular repositories

  1. multilingual-translator-offline (JavaScript)
     Offline multilingual translator using ONNX-quantised OPUS-MT models that run entirely in the browser via @xenova/transformers. Loads once, caches in IndexedDB, works without internet, and supports 43 languages.

  2. browser-based-LLM (JavaScript)
     Ultra-lightweight open-source LLM implementation running entirely client-side using transformers.js; deployable to GitHub Pages, with mobile optimization and browser memory management.

  3. Connect-local-LLM-to-use-local-datasets-via-MCP
     Connects a local LLM/SLM to local datasets via MCP (Model Context Protocol).

  4. Browser-Based-Client-Side-Multi-LLM (JavaScript)
     Browser-based, client-side deployment of multiple ultra-light language models (LLMs) for fast, secure, and private AI experiences. Supports streaming outputs, cancellation, and error handling.

  5. OfflineWebOCR (JavaScript)
     A privacy-focused browser app for OCR using Transformers.js. Extract text from images entirely offline – no internet needed! Supports TrOCR & Donut models. Deployable to GitHub Pages.

  6. Phi-4-mini-instruct-GGUF-Q3_K_M
     Q3_K_M (3-bit) quantized GGUF of Microsoft's SLM Phi-4-mini-instruct.