
# MiniVault API

A lightweight, local-first REST API simulating a core ModelVault feature: running a local LLM to respond to a prompt — fully offline.

Author: Do Minh Long ([email protected])


## Features

- `POST /generate` endpoint powered by a local LLM (via Ollama); a sketch follows this list
- Logs each interaction to `logs/log.jsonl`
- Command-line tool: `minivault --prompt "..."` or `minivault --status`
- `/status` endpoint reports memory usage and uptime
- Docker support
- Packaged with `pyproject.toml` for CLI installation
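
For orientation, here is a minimal sketch of what the `/generate` endpoint and JSONL logging could look like. The request/response body shapes, the `llama3` model name, and the log record fields are illustrative assumptions, not taken from this repository; only Ollama's default local endpoint (`http://localhost:11434/api/generate`) is standard.

```python
# Sketch of a /generate endpoint; body shapes and log fields are assumptions.
import json
import time
from pathlib import Path

import requests
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
LOG_PATH = Path("logs/log.jsonl")


class GenerateRequest(BaseModel):
    prompt: str


@app.post("/generate")
def generate(req: GenerateRequest):
    # Forward the prompt to the local Ollama server (default port 11434).
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": req.prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    answer = resp.json()["response"]

    # Append one JSON object per interaction to logs/log.jsonl.
    LOG_PATH.parent.mkdir(exist_ok=True)
    with LOG_PATH.open("a") as f:
        record = {"ts": time.time(), "prompt": req.prompt, "response": answer}
        f.write(json.dumps(record) + "\n")

    return {"response": answer}
```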

## Requirements

- Python 3.10+
- Ollama running locally (e.g., `ollama run llama3`)
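
Before starting the server, you can sanity-check that Ollama is listening on its default port (11434). Ollama's `GET /api/tags` endpoint lists the installed models:

```python
# Quick check that the local Ollama server is reachable before starting the API.
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5) as r:
        print("Ollama is up; installed models:", r.read().decode()[:200])
except OSError as e:
    print("Ollama does not appear to be running:", e)
```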

## Setup & Run

Install the dependencies and start the API server:

```bash
pip install -r requirements.txt
uvicorn minivault.main:app --reload
```
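
Once the server is running (uvicorn defaults to port 8000), you can exercise the endpoint from Python. The `{"prompt": ...}` body shape is an assumption inferred from the CLI flag, not confirmed by the repository:

```python
# Example client call against the running server; the request body shape is assumed.
import requests

r = requests.post("http://localhost:8000/generate", json={"prompt": "Hello, who are you?"})
print(r.json())
```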

## Run Methods

1. Run without packaging, as plain scripts (a sketch of the CLI appears after this list):

   ```bash
   python minivault/cli.py --prompt "Hello, who are you?"
   python minivault/cli.py --status
   ```

2. Run after packaging (via `pip install -e .`):

   ```bash
   pip install -e .
   minivault --prompt "Hello, who are you?"
   minivault --status
   ```

3. Run with Docker:

   ```bash
   docker build -t minivault .
   docker run -p 8000:8000 minivault
   ```
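
For reference, here is a hypothetical sketch of how `minivault/cli.py` might wrap the API with `argparse`; the server URL and flag wiring are assumptions, not taken from this repository:

```python
# Hypothetical CLI sketch: forwards --prompt to POST /generate and --status to GET /status.
import argparse

import requests


def main():
    parser = argparse.ArgumentParser(prog="minivault")
    parser.add_argument("--prompt", help="prompt to send to the local LLM")
    parser.add_argument("--status", action="store_true", help="print memory usage and uptime")
    args = parser.parse_args()

    if args.status:
        print(requests.get("http://localhost:8000/status").json())
    elif args.prompt:
        print(requests.post("http://localhost:8000/generate", json={"prompt": args.prompt}).json())
    else:
        parser.print_help()


if __name__ == "__main__":
    main()
```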
