TinkerTasker is an open-source, local-first agent that runs in your terminal.
An explanation of how TinkerTasker works is available on YouTube!
- Install uv
- Install Ollama
- Pull the default model from Ollama:

  ```sh
  ollama run qwen3:30b-a3b-q4_K_M
  ```

  You can configure other models in the `config.yaml` that is generated (TinkerTasker will tell you the path).
- Install as a uv tool:

  ```sh
  uv tool install git+https://github.com/DaveCoDev/TinkerTasker#subdirectory=cli-ux
  uv run tinkertasker-setup
  ```

- Run the CLI. It will use the current directory as the working directory, which enables file editing. Use at your own risk!

  ```sh
  tinkertasker
  ```
To update TinkerTasker, run:

```sh
uv tool install git+https://github.com/DaveCoDev/TinkerTasker#subdirectory=cli-ux --force
```

- WARNING: Currently your configuration will likely be reset when updating.
- Running `uv run tinkertasker-setup` again should not currently be necessary.
TinkerTasker uses a YAML configuration file that is automatically created on first run. The configuration is stored at:

- Windows: `%APPDATA%\tinkertasker\config.yaml`
- Linux: `~/.config/tinkertasker/config.yaml`
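The per-OS paths above follow the standard platform conventions, so they can be derived programmatically. The helper below is an illustrative sketch (it is not part of TinkerTasker's code) showing how the same location would be computed in Python:

```python
import os
from pathlib import Path


def config_path() -> Path:
    """Return the expected config.yaml location for the current OS.

    Mirrors the paths listed above; this helper is illustrative and
    not part of TinkerTasker itself.
    """
    if os.name == "nt":  # Windows: %APPDATA%\tinkertasker\config.yaml
        base = Path(os.environ["APPDATA"])
    else:  # Linux: honor XDG_CONFIG_HOME, falling back to ~/.config
        base = Path(os.environ.get("XDG_CONFIG_HOME", Path.home() / ".config"))
    return base / "tinkertasker" / "config.yaml"


print(config_path())
```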
```yaml
llm_config:
  model_name: "ollama_chat/qwen3:30b-a3b-q4_K_M" # LiteLLM model to use
  max_completion_tokens: 4000 # Max tokens per response
  ... # Any other LiteLLM parameters can be added here
agent_config:
  max_steps: 25 # Maximum steps per agent turn
  prompt_config:
    knowledge_cutoff: "2024-10" # Knowledge cutoff date - set this depending on the model used
    timezone: "America/New_York" # Timezone for time in prompts
  native_mcp_servers:
    - filesystem # File system operations
    - web # Web browsing and search
  mcp_servers: [] # External MCP servers (see below)
ux_config:
  number_tool_lines: 1 # Lines to show for tool outputs (-1 for all)
  max_arg_value_length: 100 # Max length for argument values in display
```
You can add external MCP servers to extend TinkerTasker's capabilities. For example:

```yaml
agent_config:
  mcp_servers:
    - identifier: "context7"
      command: "npx"
      args: ["-y", "@upstash/context7-mcp"]
      prefix: null # Optional prefix for tools
```
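Conceptually, each `mcp_servers` entry describes a process to launch: `command` is the executable and `args` are its arguments. The snippet below is a sketch of that mapping under an assumed entry shape (it is not TinkerTasker's actual launcher code):

```python
import shlex

# Entry mirroring the YAML example above; the launch logic shown here is
# illustrative, not TinkerTasker's real implementation.
entry = {
    "identifier": "context7",
    "command": "npx",
    "args": ["-y", "@upstash/context7-mcp"],
    "prefix": None,
}

# The argv the agent would spawn to start this MCP server process.
argv = [entry["command"], *entry["args"]]
print(shlex.join(argv))
```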
If your configuration becomes corrupted or you want to start fresh, simply delete the config file. TinkerTasker will recreate it with default values on the next run.
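This "delete and regenerate" behavior is a common config pattern: user-supplied values are overlaid on built-in defaults, so a missing file simply yields the defaults. The sketch below illustrates that pattern with hypothetical default values mirroring the sample config above (TinkerTasker's actual defaults and merge logic may differ):

```python
# Hypothetical defaults based on the sample config above; the real
# defaults shipped with TinkerTasker may differ.
DEFAULT_CONFIG = {
    "llm_config": {
        "model_name": "ollama_chat/qwen3:30b-a3b-q4_K_M",
        "max_completion_tokens": 4000,
    },
    "agent_config": {"max_steps": 25},
}


def deep_merge(defaults: dict, overrides: dict) -> dict:
    """Recursively overlay user-supplied values on top of the defaults."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged


# A user config that only overrides one field keeps every other default.
user_overrides = {"agent_config": {"max_steps": 50}}
config = deep_merge(DEFAULT_CONFIG, user_overrides)
print(config["agent_config"]["max_steps"])          # user value wins
print(config["llm_config"]["max_completion_tokens"])  # default retained
```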
```
.
|-- ai-core
|   |-- src
|   |   `-- ai_core
|   |       `-- mcp_servers
|   |           |-- filesystem
|   |           `-- web
|   `-- tests
|       `-- mcp_servers
|           |-- filesystem
|           `-- web
`-- cli-ux
    `-- src
        `-- cli_ux
```
- uv for Python package management
- make
  - On Windows, you can install winget then run `winget install ezwinports.make -e`
- ollama
- `make` - Run all commands below in sequence
- `make install` - Install and update dependencies for all Python packages
- `make lint` - Run ruff linting with auto-fix for all packages
- `make format` - Format code with ruff for all packages
- Install WSL following Developing in WSL
- Open a terminal and type `wsl.exe` to start WSL
- Run `cd /home/<your-username>` to navigate to your home directory
- Clone the repo:

  ```sh
  git clone https://github.com/DavidKoleczek/TinkerTasker.git
  ```

- Run `cd TinkerTasker` to enter the project directory
- Run `code .` to open the project in Visual Studio Code
  - If this doesn't work, you may need to add VSCode to path.
- The repo should open and automatically install the WSL extensions. If it does not, click in the bottom left corner of VSCode and select WSL
2. Install uv to Manage Python Environments
- Go to Installation and follow the instructions
- Check if it installed correctly by running `uv --version` in your terminal
- Set up GitHub Copilot for auto-completions and for quick questions.
- Set up Claude Code and also let it install the VSCode extension.
The project includes a VSCode workspace file (`TinkerTasker.code-workspace`) that configures the multi-package setup. Open it for the best development experience:

- In VSCode, go to File → Open Workspace from File
- Select `TinkerTasker.code-workspace`
- Install dependencies and set up all packages:

  ```sh
  make install
  ```

- Before committing code, run linting and formatting:

  ```sh
  make lint
  make format
  ```

- Or run everything at once:

  ```sh
  make
  ```
The project uses a Makefile to manage multiple Python packages (`ai-core` and `cli-ux`). Each command automatically handles virtual environments and runs the specified operations across all packages.
To be determined based on interest.
This project uses Crawl4AI (https://github.com/unclecode/crawl4ai) for web data extraction.