
Commit 533b58c

v5.6.28
1 parent: a099bef · commit: 533b58c

2 files changed: 18 additions & 7 deletions

deno.json

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 {
   "name": "@nshiab/simple-data-analysis",
-  "version": "5.6.27",
+  "version": "5.6.28",
   "exports": {
     ".": "./src/index.ts"
   },
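For anyone consuming the release, here is a minimal sketch of pinning the bumped version in a Deno project. The `jsr:` specifier and the `SimpleDB` usage are assumptions based on the scoped package name above and the surrounding llm.md examples, not something stated in this commit:

```typescript
// Hypothetical consumer-side import pinned to the new release.
import { SimpleDB } from "jsr:@nshiab/simple-data-analysis@5.6.28";

const sdb = new SimpleDB();
// ... load data and run methods here ...
await sdb.done(); // close the database when finished
```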

llm.md

Lines changed: 17 additions & 6 deletions
@@ -196,7 +196,7 @@ await sdb.selectTables(employeesTable);
 
 #### `getTableNames`
 
-Returns an array of all table names in the database.
+Returns an array of all table names in the database, sorted alphabetically.
 
 ##### Signature
 
@@ -218,7 +218,8 @@ console.log(tableNames); // Output: ["employees", "customers"]
 
 #### `logTableNames`
 
-Logs the names of all tables in the database to the console.
+Logs the names of all tables in the database to the console, sorted
+alphabetically.
 
 ##### Signature
 
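The two hunks above only change the documented ordering: both methods now report table names alphabetically. A quick sketch, assuming an existing `SimpleDB` instance named `sdb` with `customers` and `employees` tables as in the surrounding llm.md examples:

```typescript
// Table names now come back sorted alphabetically.
const tableNames = await sdb.getTableNames();
console.log(tableNames); // e.g. ["customers", "employees"]

// Same list, logged straight to the console.
await sdb.logTableNames();
```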
@@ -852,7 +853,7 @@ This method does not support tables containing geometries.
 ##### Signature
 
 ```typescript
-async aiRowByRow(column: string, newColumn: string, prompt: string, options?: { batchSize?: number; concurrent?: number; cache?: boolean; test?: (dataPoint: unknown) => any; retry?: number; model?: string; apiKey?: string; vertex?: boolean; project?: string; location?: string; ollama?: boolean | Ollama; verbose?: boolean; rateLimitPerMinute?: number; clean?: (response: unknown) => any; contextWindow?: number }): Promise<void>;
+async aiRowByRow(column: string, newColumn: string, prompt: string, options?: { batchSize?: number; concurrent?: number; cache?: boolean; test?: (dataPoint: unknown) => any; retry?: number; model?: string; apiKey?: string; vertex?: boolean; project?: string; location?: string; ollama?: boolean | Ollama; verbose?: boolean; rateLimitPerMinute?: number; clean?: (response: string) => any; contextWindow?: number; thinkingBudget?: number; extraInstructions?: string }): Promise<void>;
 ```
 
 ##### Parameters
@@ -892,11 +893,17 @@ async aiRowByRow(column: string, newColumn: string, prompt: string, options?: {
   pass it here too.
 - **`options.verbose`**: - If `true`, logs additional debugging information,
   including the full prompt sent to the AI. Defaults to `false`.
-- **`options.clean`**: - A function to clean the AI's response before testing,
-  caching, and storing. Defaults to `undefined`.
+- **`options.clean`**: - A function to clean the AI's response before JSON
+  parsing, testing, caching, and storing. Defaults to `undefined`.
 - **`options.contextWindow`**: - An option to specify the context window size
   for Ollama models. By default, Ollama sets this depending on the model, which
   can be lower than the actual maximum context window size of the model.
+- **`options.thinkingBudget`**: - Sets the reasoning token budget: 0 to disable
+  (default, though some models may reason regardless), -1 for a dynamic budget,
+  or > 0 for a fixed budget. For Ollama models, any non-zero value simply
+  enables reasoning, ignoring the specific budget amount.
+- **`options.extraInstructions`**: - Additional instructions to append to the
+  prompt, providing more context or guidance for the AI.
 
 ##### Returns
 
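To show the new `aiRowByRow` options in context, here is a hedged sketch of a call. The table, column names, prompt, and option values are hypothetical, and the receiver is assumed to be a table instance as elsewhere in llm.md; the option semantics follow the parameter descriptions above:

```typescript
// Hypothetical table with an "address" column; the model fills a new "city"
// column, processing the rows in batches.
await table.aiRowByRow(
  "address",
  "city",
  "Extract the city name from this address.",
  {
    batchSize: 10,
    cache: true,
    // 0 disables the reasoning budget (the default); -1 would make it dynamic,
    // and any value > 0 sets a fixed budget.
    thinkingBudget: 0,
    // Appended to the prompt as extra guidance for the model.
    extraInstructions: "Return only the city name, with no punctuation.",
    // Clean the raw string response before JSON parsing, testing, and caching.
    clean: (response: string) => response.trim(),
  },
);
```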
@@ -1170,7 +1177,7 @@ and time. Remember to add `.journalism-cache` to your `.gitignore`.
 ##### Signature
 
 ```typescript
-async aiQuery(prompt: string, options?: { cache?: boolean; model?: string; apiKey?: string; vertex?: boolean; project?: string; location?: string; ollama?: boolean | Ollama; contextWindow?: number; verbose?: boolean }): Promise<void>;
+async aiQuery(prompt: string, options?: { cache?: boolean; model?: string; apiKey?: string; vertex?: boolean; project?: string; location?: string; ollama?: boolean | Ollama; contextWindow?: number; thinkingBudget?: number; verbose?: boolean }): Promise<void>;
 ```
 
 ##### Parameters
@@ -1196,6 +1203,10 @@ async aiQuery(prompt: string, options?: { cache?: boolean; model?: string; apiKe
 - **`options.contextWindow`**: - An option to specify the context window size
   for Ollama models. By default, Ollama sets this depending on the model, which
   can be lower than the actual maximum context window size of the model.
+- **`options.thinkingBudget`**: - Sets the reasoning token budget: 0 to disable
+  (default, though some models may reason regardless), -1 for a dynamic budget,
+  or > 0 for a fixed budget. For Ollama models, any non-zero value simply
+  enables reasoning, ignoring the specific budget amount.
 - **`options.verbose`**: - If `true`, logs additional debugging information,
   including the full prompt sent to the AI. Defaults to `false`.
 
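A matching sketch for `aiQuery` with the new `thinkingBudget` option; the prompt and the receiving table are hypothetical, and the option values follow the descriptions above:

```typescript
// Hypothetical natural-language prompt passed to aiQuery, using the options
// documented above.
await table.aiQuery(
  "Keep only employees hired after 2020 and sort them by salary, descending.",
  {
    cache: true,
    thinkingBudget: -1, // dynamic reasoning budget
    verbose: true, // also logs the full prompt sent to the AI
  },
);
```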
