
Conversation

@MohamedBassem (Collaborator) commented Sep 28, 2025

An attempt for #1748

coderabbitai bot (Contributor) commented Sep 28, 2025

Walkthrough

Adds abort-aware control flow and streamed, size-enforced asset downloads in crawlerWorker.ts; switches asset handling from in-memory buffers to temporary file pipelines, improves cleanup on abort/quota/error, and includes runNumber in generated jobId.

Changes

| Cohort / File(s) | Summary of changes |
|---|---|
| Crawler abort + streaming downloads<br>`apps/workers/workers/crawlerWorker.ts` | Added `abortPromise` helper and wired abort checks into navigation, wait-for-load, content extraction, screenshot capture, metadata extraction, and asset flows. Raced navigation and other critical async tasks against abort; added `throwIfAborted` checks. |
| Streamed asset pipeline & temp-file handling<br>`apps/workers/workers/crawlerWorker.ts` | Introduced a `Transform` that tracks `bytesRead`, enforces a max asset size, validates content-length on flush, and pipes the response body into a temp file. Replaced in-memory buffering with file-based asset saving (`saveAssetFromFile`) and updated quota checks to use streamed `bytesRead`. Ensured temp files are cleaned up in `finally` blocks. |
| Archive / HTML storage changes<br>`apps/workers/workers/crawlerWorker.ts` | `archiveWebpage` and HTML/content storage now write to temp files (via `os.tmpdir`) and perform cleanup when archiving is skipped, fails, or quota prevents storage. Preserved inline-storage alternatives for very large HTML. |
| Job and control-flow adjustments<br>`apps/workers/workers/crawlerWorker.ts` | `runCrawler` jobId generation now includes `runNumber`. Enhanced content-type detection and propagated aborts through `crawlAndParseUrl`, including during content-type checks, image handling, downloads, and downstream processing. |
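Taken together, the abort wiring amounts to racing each long-running step against the signal. A minimal sketch of that pattern (helper and function names here are illustrative, not the PR's exact code):

```ts
// Minimal sketch of the abort-racing pattern, assuming a helper like this.
function abortPromise(signal: AbortSignal): Promise<never> {
  return new Promise((_, reject) => {
    const fail = () => reject(signal.reason ?? new Error("AbortError"));
    if (signal.aborted) return fail();
    signal.addEventListener("abort", fail, { once: true });
  });
}

// Race any long-running step (navigation, extraction, ...) against the signal,
// so an abort unwinds the crawl instead of leaving dangling work behind.
async function raceWithAbort<T>(work: Promise<T>, signal: AbortSignal): Promise<T> {
  const result = await Promise.race([work, abortPromise(signal)]);
  signal.throwIfAborted(); // defensive re-check after the race settles
  return result;
}
```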

Sequence Diagram(s)

```mermaid
sequenceDiagram
  autonumber
  participant W as Worker
  participant CW as CrawlerWorker
  participant P as BrowserPage
  participant N as HTTP Server
  participant FS as TempFileStore
  participant S as PersistentStore
  participant A as AbortSignal

  Note over CW,A #fff2cc: Abort-aware crawl with streamed asset downloads

  W->>CW: runCrawler(jobId + runNumber, A)
  CW->>P: page.goto(url)
  par Race navigation vs abort
    P-->>CW: navigate/loaded
    A-->>CW: abort signaled
  end
  alt Aborted
    CW-->>W: AbortError (cleanup temp files)
  else Loaded
    CW->>P: wait-for-load / extract HTML & metadata
    A-->>CW: abort?
    alt Aborted
      CW-->>W: AbortError
    else Continue
      CW->>P: optional screenshot (abort checked)
      CW->>N: fetch asset
      N-->>CW: response stream + headers
      CW->>FS: create temp file
      CW->>FS: pipeline(stream -> Transform[size-check] -> temp file)
      A-->>CW: abort?
      alt Aborted during download
        CW->>FS: cleanup temp file
        CW-->>W: AbortError
      else Download complete
        CW->>S: saveAssetFromFile(temp file, bytesRead)
        CW->>FS: cleanup temp file
        CW-->>W: result
      end
    end
  end
```

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Pre-merge checks and finishing touches

❌ Failed checks (1 warning, 1 inconclusive)

| Check name | Status | Explanation | Resolution |
|---|---|---|---|
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 0.00%, which is below the required threshold of 80.00%. | Run `@coderabbitai generate docstrings` to improve docstring coverage. |
| Description Check | ❓ Inconclusive | The description merely states "An attempt for #1748" and provides no information about what changes were made or why, making it too generic to convey the nature of the work in this pull request. | Expand the description with a brief summary of the key changes (for example, the `abortPromise` helper, stream-based asset size enforcement, and temp-file cleanup) so reviewers can quickly understand the scope and intent of the PR. |

✅ Passed checks (1 passed)

| Check name | Status | Explanation |
|---|---|---|
| Title Check | ✅ Passed | The title succinctly and accurately describes the primary change—ensuring that in-flight ("dangling") processing is aborted when the crawler's abort signal fires—without extraneous details, making it clear to reviewers what the PR accomplishes. |
✨ Finishing touches
  • 📝 Generate Docstrings
🧪 Generate unit tests
  • Create PR with unit tests
  • Post copyable unit tests in a comment
  • Commit unit tests in branch abort-dangling-processing

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

@cloudflare-workers-and-pages
Copy link

cloudflare-workers-and-pages bot commented Sep 28, 2025

Deploying karakeep-landing with Cloudflare Pages

Latest commit: 54b145e
Status: ✅  Deploy successful!
Preview URL: https://1dc89b2b.karakeep-landing.pages.dev
Branch Preview URL: https://abort-dangling-processing.karakeep-landing.pages.dev

View logs

coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
apps/workers/workers/crawlerWorker.ts (1)

1006-1011: Abort guard keeps Promise.all pending forever

abortPromise never resolves; it only rejects when the signal fires. Including it in Promise.all([...]) means the array never settles on the non-aborted path, so crawlAndParseUrl hangs indefinitely and the worker never moves forward. Please race the combined work against the abort signal instead of awaiting a never-resolving promise.

```diff
-  const [meta, readableContent, screenshotAssetInfo] = await Promise.all([
-    extractMetadata(htmlContent, browserUrl, jobId),
-    extractReadableContent(htmlContent, browserUrl, jobId),
-    storeScreenshot(screenshot, userId, jobId),
-    abortPromise(abortSignal),
-  ]);
+  const abortableWork = Promise.all([
+    extractMetadata(htmlContent, browserUrl, jobId),
+    extractReadableContent(htmlContent, browserUrl, jobId),
+    storeScreenshot(screenshot, userId, jobId),
+  ]);
+  await Promise.race([abortableWork, abortPromise(abortSignal)]);
+  const [meta, readableContent, screenshotAssetInfo] = await abortableWork;
```
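For readers unfamiliar with the pitfall, a tiny self-contained illustration (hypothetical code):

```ts
const never = new Promise<never>(() => {}); // stands in for abortPromise(signal)
const work = Promise.resolve("done");

// Never settles: Promise.all waits for *every* member, and `never` never settles.
// await Promise.all([work, never]); // would hang forever

// Settles immediately: Promise.race completes as soon as the first member
// settles, while an abort could still reject the race.
console.log(await Promise.race([work, never])); // "done"
```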
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 8dd84ef and 6f23df4.

📒 Files selected for processing (1)
  • apps/workers/workers/crawlerWorker.ts (14 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{ts,tsx,js,jsx}

📄 CodeRabbit inference engine (AGENTS.md)

Use oxlint as the linter for JavaScript/TypeScript code

Files:

  • apps/workers/workers/crawlerWorker.ts
apps/workers/**

📄 CodeRabbit inference engine (AGENTS.md)

Background workers code resides in apps/workers

Files:

  • apps/workers/workers/crawlerWorker.ts
🧬 Code graph analysis (1)
apps/workers/workers/crawlerWorker.ts (4)
apps/workers/utils.ts (1)
  • fetchWithProxy (62-71)
packages/shared/assetdb.ts (4)
  • newAssetId (126-128)
  • saveAssetFromFile (177-202)
  • saveAssetFromFile (369-395)
  • saveAssetFromFile (651-678)
packages/shared/tryCatch.ts (1)
  • tryCatch (15-24)
packages/shared-server/src/services/quotaService.ts (1)
  • QuotaService (22-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: tests
  • GitHub Check: typecheck

Comment on lines 643 to 676
```ts
    const expectedLength = response.headers.get("content-length");

    const assetId = newAssetId();
    assetPath = path.join(os.tmpdir(), assetId);

    let bytesRead = 0;
    const contentLengthEnforcer = new Transform({
      transform(chunk, _, callback) {
        bytesRead += chunk.length;

        if (abortSignal.aborted) {
          callback(new Error("AbortError"));
        } else if (bytesRead > serverConfig.maxAssetSizeMb * 1024 * 1024) {
          callback(
            new Error(
              `Content length exceeds maximum allowed size: ${serverConfig.maxAssetSizeMb}MB`,
            ),
          );
        } else {
          callback(null, chunk); // pass data along unchanged
        }
      },
      flush(callback) {
        if (expectedLength && bytesRead !== Number(expectedLength)) {
          callback(
            new Error(
              `Content length mismatch: expected ${expectedLength}, got ${bytesRead}`,
            ),
          );
        } else {
          callback();
        }
      },
    });
```
coderabbitai bot (Contributor) commented:
⚠️ Potential issue | 🔴 Critical

Compressed downloads always fail length validation

Response.body from undici is already decompressed. When a server sends a Content-Length for the compressed payload (e.g., gzip), bytesRead grows to the decompressed size, so the flush check throws Content length mismatch on every compressed asset. That breaks downloads for most real-world sites. Guard the comparison so it only runs when the header parses to a finite number and the response isn’t compressed.

```diff
-    const expectedLength = response.headers.get("content-length");
+    const rawContentLength = response.headers.get("content-length");
+    const parsedContentLength =
+      rawContentLength !== null ? Number(rawContentLength) : null;
+    const expectedLength =
+      parsedContentLength !== null &&
+      Number.isFinite(parsedContentLength) &&
+      !response.headers.get("content-encoding")
+        ? parsedContentLength
+        : null;
...
-        if (expectedLength && bytesRead !== Number(expectedLength)) {
+        if (expectedLength !== null && bytesRead !== expectedLength) {
```
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change

```diff
-    const expectedLength = response.headers.get("content-length");
+    const rawContentLength = response.headers.get("content-length");
+    const parsedContentLength =
+      rawContentLength !== null ? Number(rawContentLength) : null;
+    const expectedLength =
+      parsedContentLength !== null &&
+      Number.isFinite(parsedContentLength) &&
+      !response.headers.get("content-encoding")
+        ? parsedContentLength
+        : null;

     const assetId = newAssetId();
     assetPath = path.join(os.tmpdir(), assetId);

     let bytesRead = 0;
     const contentLengthEnforcer = new Transform({
       transform(chunk, _, callback) {
         bytesRead += chunk.length;

         if (abortSignal.aborted) {
           callback(new Error("AbortError"));
         } else if (bytesRead > serverConfig.maxAssetSizeMb * 1024 * 1024) {
           callback(
             new Error(
               `Content length exceeds maximum allowed size: ${serverConfig.maxAssetSizeMb}MB`,
             ),
           );
         } else {
           callback(null, chunk); // pass data along unchanged
         }
       },
       flush(callback) {
-        if (expectedLength && bytesRead !== Number(expectedLength)) {
+        if (expectedLength !== null && bytesRead !== expectedLength) {
           callback(
             new Error(
               `Content length mismatch: expected ${expectedLength}, got ${bytesRead}`,
             ),
           );
         } else {
           callback();
         }
       },
     });
```
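For intuition, a small hedged probe (assuming undici's `fetch` and a gzip-served URL) shows why the raw header can't be compared against decompressed bytes:

```ts
// Sketch only: Content-Length (when present) counts on-the-wire bytes,
// while fetch() hands back the *decompressed* body.
const res = await fetch("https://example.com/asset.js"); // assumed gzip-compressed
console.log(res.headers.get("content-length"));   // compressed byte count, or null
console.log(res.headers.get("content-encoding")); // e.g. "gzip" — skip the length check
const body = new Uint8Array(await res.arrayBuffer());
console.log(body.byteLength); // decompressed size; may legitimately exceed content-length
```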

coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6f23df4 and 2e16ff4.

📒 Files selected for processing (1)
  • apps/workers/workers/crawlerWorker.ts (15 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{ts,tsx,js,jsx}

📄 CodeRabbit inference engine (AGENTS.md)

Use oxlint as the linter for JavaScript/TypeScript code

Files:

  • apps/workers/workers/crawlerWorker.ts
apps/workers/**

📄 CodeRabbit inference engine (AGENTS.md)

Background workers code resides in apps/workers

Files:

  • apps/workers/workers/crawlerWorker.ts
🧬 Code graph analysis (1)
apps/workers/workers/crawlerWorker.ts (4)
apps/workers/utils.ts (1)
  • fetchWithProxy (62-71)
packages/shared/assetdb.ts (4)
  • newAssetId (126-128)
  • saveAssetFromFile (177-202)
  • saveAssetFromFile (369-395)
  • saveAssetFromFile (651-678)
packages/shared/tryCatch.ts (1)
  • tryCatch (15-24)
packages/shared-server/src/services/quotaService.ts (1)
  • QuotaService (22-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: tests
  • GitHub Check: typecheck

coderabbitai bot (Contributor) left a comment

Actionable comments posted: 2

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2e16ff4 and 8bebdcc.

📒 Files selected for processing (1)
  • apps/workers/workers/crawlerWorker.ts (15 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{ts,tsx,js,jsx}

📄 CodeRabbit inference engine (AGENTS.md)

Use oxlint as the linter for JavaScript/TypeScript code

Files:

  • apps/workers/workers/crawlerWorker.ts
apps/workers/**

📄 CodeRabbit inference engine (AGENTS.md)

Background workers code resides in apps/workers

Files:

  • apps/workers/workers/crawlerWorker.ts
🧬 Code graph analysis (1)
apps/workers/workers/crawlerWorker.ts (4)
apps/workers/utils.ts (1)
  • fetchWithProxy (62-71)
packages/shared/assetdb.ts (4)
  • newAssetId (126-128)
  • saveAssetFromFile (177-202)
  • saveAssetFromFile (369-395)
  • saveAssetFromFile (651-678)
packages/shared/tryCatch.ts (1)
  • tryCatch (15-24)
packages/shared-server/src/services/quotaService.ts (1)
  • QuotaService (22-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: tests
  • GitHub Check: typecheck

Comment on lines 681 to 685
```ts
    await streamPipeline(
      response.body,
      contentLengthEnforcer,
      fsSync.createWriteStream(assetPath),
    );
```
coderabbitai bot (Contributor) commented:

⚠️ Potential issue | 🔴 Critical

Fix: Convert fetch body to a Node stream before piping

Line 683: stream.pipeline expects a Node Readable, but fetchWithProxy returns an Undici/WHATWG ReadableStream. Passing it directly throws ERR_INVALID_ARG_TYPE, so every download aborts at runtime. Please wrap the body with Readable.fromWeb(...) (and import Readable) before piping it into the temp file.

```diff
-import { pipeline, Transform } from "stream";
+import { pipeline, Transform, Readable } from "stream";

     await streamPipeline(
-      response.body,
+      Readable.fromWeb(response.body),
       contentLengthEnforcer,
       fsSync.createWriteStream(assetPath),
     );
```
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change

```ts
// At the top of the file, include Readable
import { pipeline, Transform, Readable } from "stream";

// In the download pipeline, convert the web ReadableStream to a Node Readable
await streamPipeline(
  Readable.fromWeb(response.body),
  contentLengthEnforcer,
  fsSync.createWriteStream(assetPath),
);
```
🤖 Prompt for AI Agents
In apps/workers/workers/crawlerWorker.ts around lines 681 to 685, the code
passes the WHATWG/Undici Response.body directly into stream.pipeline which
expects a Node Readable, causing ERR_INVALID_ARG_TYPE; wrap the fetch body with
Readable.fromWeb(response.body) before piping and add the Node stream import
(import { Readable } from 'stream' or equivalent CommonJS require) at the top of
the file so streamPipeline receives a proper Node Readable.
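In isolation, the conversion looks roughly like this (a self-contained sketch; `downloadToFile` is a made-up name):

```ts
import { createWriteStream } from "node:fs";
import { Readable } from "node:stream";
import { pipeline } from "node:stream/promises";

// Hypothetical helper: persist a fetch body to disk. fetch() returns a
// WHATWG ReadableStream; Node's pipeline expects a Node Readable, so we
// bridge with Readable.fromWeb. (Under DOM typings a cast to stream/web's
// ReadableStream may be needed.)
async function downloadToFile(url: string, dest: string): Promise<void> {
  const res = await fetch(url);
  if (!res.ok || !res.body) throw new Error(`Fetch failed: ${res.status}`);
  await pipeline(Readable.fromWeb(res.body), createWriteStream(dest));
}
```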

Comment on lines 641 to +717
```ts
    logger.error(
      `[Crawler][${jobId}] Failed to download and store ${fileType}: ${e}`,
    );
    return null;
  } finally {
```
coderabbitai bot (Contributor) commented:
⚠️ Potential issue | 🔴 Critical

Propagate aborts instead of treating them as download failures

Line 715: When the abort signal fires, either fetch or the transform surfaces an AbortError, but this catch block logs it and returns null. In the PDF/image path (handleAsAssetBookmark), that means the crawler exits normally instead of honoring the abort—which is the regression this PR is trying to avoid. Please rethrow abort-driven failures (e.g., check abortSignal.aborted or e?.name === "AbortError" and throw the abort reason) so callers can stop the job.

```diff
   } catch (e) {
-    logger.error(
-      `[Crawler][${jobId}] Failed to download and store ${fileType}: ${e}`,
-    );
-    return null;
+    if (
+      abortSignal.aborted ||
+      (e instanceof Error && e.name === "AbortError")
+    ) {
+      throw abortSignal.reason ?? e;
+    }
+    logger.error(
+      `[Crawler][${jobId}] Failed to download and store ${fileType}: ${e}`,
+    );
+    return null;
   } finally {
```
🤖 Prompt for AI Agents
In apps/workers/workers/crawlerWorker.ts around lines 713-717, the catch
currently logs all errors from fetch/transform and returns null which swallows
abort-driven failures; change the handler to detect aborts (e.g., if
(abortSignal?.aborted) or if (e?.name === "AbortError")) and rethrow the abort
(preferably throw abortSignal.reason || e) so the caller can stop the job, while
still logging non-abort errors and returning null for those.
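A standalone sketch of the rethrow-on-abort shape being suggested (names hypothetical):

```ts
// Hypothetical downloader: ordinary failures degrade to null, but an
// abort must escape the catch so the caller can stop the whole job.
async function tryDownload(
  url: string,
  abortSignal: AbortSignal,
): Promise<string | null> {
  try {
    const res = await fetch(url, { signal: abortSignal });
    return await res.text();
  } catch (e) {
    if (abortSignal.aborted || (e instanceof Error && e.name === "AbortError")) {
      throw abortSignal.reason ?? e; // propagate the abort
    }
    console.error(`download failed: ${e}`); // swallow non-abort errors
    return null;
  }
}
```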

coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

🧹 Nitpick comments (1)
apps/workers/workers/crawlerWorker.ts (1)

746-747: Verify temporary file path generation

Using os.tmpdir() with just the assetId as filename could potentially cause conflicts if the same ID is generated multiple times.

Consider using a more unique temporary filename:

```diff
-  const assetPath = path.join(os.tmpdir(), assetId);
+  const assetPath = path.join(os.tmpdir(), `karakeep_archive_${assetId}.html`);
```

This adds a prefix and extension for better file management and debugging.
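If stronger collision-resistance is wanted than a prefix, `fs.mkdtemp` is another option — a sketch, assuming a per-download directory is acceptable:

```ts
import { mkdtemp } from "node:fs/promises";
import os from "node:os";
import path from "node:path";

// Each mkdtemp call returns a freshly created, uniquely named directory,
// so filename collisions cannot occur even for identical asset IDs.
async function tempAssetPath(assetId: string): Promise<string> {
  const dir = await mkdtemp(path.join(os.tmpdir(), "karakeep-"));
  return path.join(dir, `${assetId}.html`);
}
```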

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 8bebdcc and 54b145e.

📒 Files selected for processing (1)
  • apps/workers/workers/crawlerWorker.ts (15 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{ts,tsx,js,jsx}

📄 CodeRabbit inference engine (AGENTS.md)

Use oxlint as the linter for JavaScript/TypeScript code

Files:

  • apps/workers/workers/crawlerWorker.ts
apps/workers/**

📄 CodeRabbit inference engine (AGENTS.md)

Background workers code resides in apps/workers

Files:

  • apps/workers/workers/crawlerWorker.ts
🧬 Code graph analysis (1)
apps/workers/workers/crawlerWorker.ts (4)
apps/workers/utils.ts (1)
  • fetchWithProxy (62-71)
packages/shared/assetdb.ts (4)
  • newAssetId (126-128)
  • saveAssetFromFile (177-202)
  • saveAssetFromFile (369-395)
  • saveAssetFromFile (651-678)
packages/shared/tryCatch.ts (1)
  • tryCatch (15-24)
packages/shared-server/src/services/quotaService.ts (1)
  • QuotaService (22-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: tests
  • GitHub Check: typecheck
🔇 Additional comments (13)
apps/workers/workers/crawlerWorker.ts (13)

457-463: Verify null handling for navigation response

When abortPromise wins the race on line 462, response becomes null, but lines 523-524 still access response?.status() which correctly handles null. This is good defensive programming.

The null-safe access pattern response?.status() ?? 0 properly handles the abort case where response is null. Well done!


476-477: Good practice: Immediate abort check after async operations

Adding abortSignal.throwIfAborted() after the wait ensures the crawler stops immediately when aborted, preventing unnecessary processing.


715-719: Good practice: Cleanup temporary files in finally block

The finally block ensures temporary files are cleaned up even on errors, preventing disk space leaks.


1007-1018: Effective abort handling with race conditions

The implementation properly races extractors against abort and then awaits the work to get results or throw on abort. The pattern ensures work completes or aborts cleanly.

Excellent pattern for handling concurrent operations with abort support! The code races the work against abort, then awaits the work promise to either get results or propagate any errors that occurred during execution.


1174-1174: Good improvement: Include runNumber in jobId for uniqueness

Adding runNumber to the jobId ensures unique identifiers across retries, improving log traceability.


1201-1201: Consistent abort checking throughout the flow

The abort check after getContentType ensures the job stops if aborted during content-type detection.

Good placement of abort check after the async operation. Based on learnings


783-783: Good error handling: Cleanup on quota failure

Properly cleaning up the temporary file when quota check fails prevents orphaned files.


635-636: Stream-based architecture improves memory efficiency

The transition from in-memory buffers to stream-based file operations with temporary files is a significant improvement for handling large assets without memory exhaustion.

Excellent architectural improvement! The stream-based approach with temporary files prevents memory issues when downloading large assets.

Also applies to: 654-656


641-643: Fix needed: Convert WHATWG stream to Node stream

The fetchWithProxy returns a WHATWG ReadableStream, but pipeline expects a Node.js Readable stream. This will cause a runtime error.


657-677: Verify content-length validation for compressed responses

The content-length enforcer doesn't account for compressed responses where the Content-Length header refers to the compressed size, but bytesRead tracks decompressed bytes.


710-719: Critical: Propagate abort errors instead of swallowing them

The catch block logs all errors and returns null, which prevents abort propagation. This defeats the purpose of abort handling.


679-683: Stream conversion required before pipeline

The WHATWG stream needs to be converted to a Node.js stream.

Import Readable from stream and convert the response body:

```diff
-import { Transform } from "stream";
+import { Transform, Readable } from "stream";
 ...
     await pipeline(
-      response.body,
+      Readable.fromWeb(response.body),
       contentLengthEnforcer,
       fsSync.createWriteStream(assetPath),
     );
```

78-101: Critical fix needed: Prevent unhandled rejections in abortPromise

The current implementation has a critical issue where unhandled rejections can occur when the promise is rejected after being raced but before being awaited. The rejected promise created on line 80 isn't properly caught, and the main promise (line 87) can still reject after its race resolves.

Apply this fix to properly handle rejections and clean up listeners:

```diff
 function abortPromise(signal: AbortSignal): Promise<never> {
   if (signal.aborted) {
-    const p = Promise.reject(signal.reason ?? new Error("AbortError"));
-    p.catch(() => {
-      /* empty */
-    }); // suppress unhandledRejection if not awaited
-    return p;
+    return Promise.reject(signal.reason ?? new Error("AbortError"));
   }

-  const p = new Promise<never>((_, reject) => {
-    signal.addEventListener(
-      "abort",
-      () => {
-        reject(signal.reason ?? new Error("AbortError"));
-      },
-      { once: true },
-    );
-  });
+  let onAbort: (() => void) | undefined;
+  const promise = new Promise<never>((_, reject) => {
+    onAbort = () => {
+      signal.removeEventListener("abort", onAbort!);
+      reject(signal.reason ?? new Error("AbortError"));
+    };
+    signal.addEventListener("abort", onAbort, { once: true });
+  });

-  p.catch(() => {
-    /* empty */
-  });
-  return p;
+  // Attach a no-op catch handler to prevent unhandled rejection warnings
+  // when the promise is raced and loses
+  promise.catch(() => {});
+  return promise;
 }
```
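To see why the no-op catch matters, a minimal repro sketch (assuming the fixed `abortPromise` above):

```ts
// With no catch handler attached, a raced-and-lost abortPromise that later
// rejects would trigger Node's unhandledRejection; the fix above pre-attaches
// a no-op catch, so this script stays silent.
process.on("unhandledRejection", (r) => console.log("unhandled:", r));

const controller = new AbortController();
const work = new Promise((resolve) => setTimeout(resolve, 10, "done"));
console.log(await Promise.race([work, abortPromise(controller.signal)])); // "done"
controller.abort("too late"); // abortPromise rejects with nobody awaiting it
```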

Comment on lines +506 to +509
```ts
          abortPromise(abortSignal).then(() => Buffer.from("")),
        ]),
      );
      abortSignal.throwIfAborted();
```
coderabbitai bot (Contributor) commented:

⚠️ Potential issue | 🟡 Minor

Potential issue: Empty buffer on abort may cause issues downstream

When the abort promise wins on line 506, it returns an empty buffer Buffer.from(""). This could be misleading as it appears successful but with empty data.

Consider throwing the abort error instead of returning an empty buffer:

```diff
-          abortPromise(abortSignal).then(() => Buffer.from("")),
+          abortPromise(abortSignal),
```

This way, the error handling on line 510 will properly detect the abort.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change

```diff
-          abortPromise(abortSignal).then(() => Buffer.from("")),
+          abortPromise(abortSignal),
         ]),
       );
       abortSignal.throwIfAborted();
```
🤖 Prompt for AI Agents
In apps/workers/workers/crawlerWorker.ts around lines 506 to 509, returning an
empty Buffer when the abort promise wins hides the abort as a successful empty
result; change the abort branch so it throws/rejects the abort error instead of
resolving Buffer.from("") (e.g. have abortPromise reject with the AbortError or
propagate abortSignal.reason), so the race will surface the abort as an
exception and existing abortSignal.throwIfAborted() / surrounding error handling
will detect it.
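The difference between the two race shapes, sketched side by side (`fetchPdf` and `abortPromise` are assumed helpers):

```ts
declare function fetchPdf(url: string): Promise<Buffer>;       // assumed helper
declare function abortPromise(s: AbortSignal): Promise<never>; // assumed helper

// Masking shape: on abort the race *resolves* to an empty Buffer, so callers
// see a "successful" zero-byte download.
async function downloadMasked(url: string, signal: AbortSignal): Promise<Buffer> {
  return Promise.race([
    fetchPdf(url),
    abortPromise(signal).then(() => Buffer.from("")),
  ]);
}

// Suggested shape: on abort the race *rejects*, so the abort propagates as
// an exception that throwIfAborted()/catch blocks can react to.
async function downloadStrict(url: string, signal: AbortSignal): Promise<Buffer> {
  const buf = await Promise.race([fetchPdf(url), abortPromise(signal)]);
  signal.throwIfAborted();
  return buf;
}
```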

@MohamedBassem MohamedBassem merged commit 9eecda1 into main Sep 28, 2025
12 checks passed
alexlebens pushed a commit to alexlebens/infrastructure that referenced this pull request Nov 14, 2025
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [ghcr.io/karakeep-app/karakeep](https://github.com/karakeep-app/karakeep) | minor | `0.27.1` -> `0.28.0` |

---

### Release Notes

<details>
<summary>karakeep-app/karakeep (ghcr.io/karakeep-app/karakeep)</summary>

### [`v0.28.0`](https://github.com/karakeep-app/karakeep/releases/tag/v0.28.0): 0.28.0

[Compare Source](karakeep-app/karakeep@v0.27.1...v0.28.0)

### 0.28.0 (20k stars ⭐)

Welcome to the 0.28.0 release of Karakeep! We've hit 20k stars on GitHub 🎉 (well, 21k, because I was too late with the release)! Thanks a lot for your support throughout this journey! This release brings a refreshed import pipeline, custom attachment uploads, a revamped tags page, inline checklists, and a bunch of quality-of-life touches across the web app, extension and mobile app. Huge thanks to [@BOTkirial](https://github.com/BOTkirial), [@qixing-jk](https://github.com/qixing-jk), [@maya-doshi](https://github.com/maya-doshi), [@BenjaminMichaelis](https://github.com/BenjaminMichaelis), [@cloudchristoph](https://github.com/cloudchristoph), [@claytono](https://github.com/claytono), as usual [@xuatz](https://github.com/xuatz), and everyone who shipped code, triaged bugs, or shared feedback for this release.

> If you enjoy using Karakeep, consider supporting the project [here ☕️](https://buymeacoffee.com/mbassem) or via GitHub [here](https://github.com/sponsors/MohamedBassem).

<a href="https://www.buymeacoffee.com/mbassem" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" width="auto" height="50" ></a>

And in case you missed it, we now have a ☁️ managed offering ☁️ for those who don't want to self-host. We're still in private beta (you can signup for access [here](https://tally.so/r/wo8zzx)) and gradually letting more and more users in.

### New Features 🚀

- Revamped import experience with progress tracking ([#2001](karakeep-app/karakeep#2001))
- Revamped Tags page that adds search and pagination to better serve users with thousands of tags ([#1987](karakeep-app/karakeep#1987))
- You can now upload custom attachments to bookmarks ([#2100](karakeep-app/karakeep#2100))
- When deleting a list, you can now optionally delete all its children ([#1989](karakeep-app/karakeep#1989))
- Server overview highlights service dependency health.
- Inline checklist toggling for text bookmarks ([#1933](karakeep-app/karakeep#1933)) – [@BOTkirial](https://github.com/BOTkirial)
- With every release, you'll be prompted to view what's new in that release from inside the app.
- You can now pass custom headers from the mobile app to the server ([#2103](karakeep-app/karakeep#2103))
- Extension improvements:
  - Tab bookmark badge indicator by [@qixing-jk](https://github.com/qixing-jk) shows when a page is already bookmarked ([#1745](karakeep-app/karakeep#1745))
  - You can now write notes directly after saving a bookmark in the extension ([#2104](karakeep-app/karakeep#2104))

### UX Improvements ✨

- Grid view controls expose title/tag toggles and image fit options ([#1960](karakeep-app/karakeep#1960))
- Bookmark cards can surface saved notes across web and mobile ([#2083](karakeep-app/karakeep#2083)) – [@xuatz](https://github.com/xuatz)
- Manage Lists modal is searchable for faster sorting ([#2029](karakeep-app/karakeep#2029))
- The tags page now has a "Create Tag" button ([#1942](karakeep-app/karakeep#1942))
- You can now regenerate the API key without having to recreate it.
- New `title:` search qualifier for searching bookmarks by title ([#1940](karakeep-app/karakeep#1940))

### Fixes 🔧

- ⚠️ (Potentially breaking change) ⚠️ Stricter URL validation to protect against SSRF attacks ([#2082](karakeep-app/karakeep#2082))
  - Webhook requests now go through the proxy if there's one configured
  - All server-initiated requests (including webhooks) to internal IP addresses are now blocked by default unless explicitly allowed via `CRAWLER_ALLOWED_INTERNAL_HOSTNAMES`. If your webhooks are hitting internal services, you'll have to allowlist them via `CRAWLER_ALLOWED_INTERNAL_HOSTNAMES`.
  - Monolith now honors the configured crawler proxy.
  - Metascraper logo extraction now respects the crawler proxy.
- Crawler memory footprint shrinks with targeted optimizations ([#1748](karakeep-app/karakeep#1748))
- Allow Karakeep to use newer OpenAI models that were previously failing because of the deprecated max\_tokens ([#2000](karakeep-app/karakeep#2000)) - [@BenjaminMichaelis](https://github.com/BenjaminMichaelis)
  - You'll need to set `INFERENCE_USE_MAX_COMPLETION_TOKENS=true` in your `.env` file to use the new models. This is eventually going to become the default.
- Admin maintenance jobs respect abort signals to stop gracefully
- Search input no longer crashes on percent signs and also works correctly with IME composition
- Fixed a crash when sharing a list publicly that didn't have any bookmarks ([#1990](karakeep-app/karakeep#1990))
- Screenshots are now stored as JPEGs instead of PNGs to reduce file size
- Fixed a bug that was preventing tag merging ([#1938](karakeep-app/karakeep#1938))
- RSS imports can apply feed categories as tags ([#2031](karakeep-app/karakeep#2031))

### For Developers 🛠️

- Create bookmark API returns 200 instead of 201 when a bookmark already exists
- CLI Improvements:
  - New commands to migrate data from one server to another
  - New command to dump a full account archive
  - A new wipe command to selectively clean up data from the account

### Community Projects 💡

##### [Karakeeper](https://apps.apple.com/us/app/karakeeper-for-karakeep/id6746722790)

3rd Party iOS/Safari Client - *by [@simplytoast1](https://github.com/simplytoast1)*

Karakeeper now is providing an alternative iOS native mobile/desktop client for Karakeep beyond its existing functionality of providing a safari extension.

##### [Karakeep Sync](https://github.com/sidoshi/karakeep-sync)

A syncing tool for Karakeep - *by [@sidoshi](https://github.com/sidoshi)*

A rust-based syncing tool that syncs: Hacker News upvotes, Reddit saved posts, Github stars and Pinboard bookmarks automatically to Karakeep!

### Screenshots 📸

#### Inline Checklists

<img width="1230" height="806" alt="Screenshot 2025-11-08 at 8 55 18@2x" src="https://github.com/user-attachments/assets/c092d903-eb6f-40c6-aee6-1ce6127f67e8" />

#### Import Sessions
<img width="1814" height="762" alt="Screenshot 2025-11-08 at 8 58 21@2x" src="https://github.com/user-attachments/assets/dfcb856b-6a63-4d7a-ba4b-ce2ca83bc844" />

#### Service Health Indicators
<img width="1874" height="540" alt="Screenshot 2025-11-08 at 8 56 00@2x" src="https://github.com/user-attachments/assets/7835f1ad-239d-477c-8e00-951e4a09f8c6" />

### Upgrading 📦

To upgrade:

- If you're using `KARAKEEP_VERSION=release`, run `docker compose pull && docker compose up -d`.
- If you're pinning it to a specific version, bump the version and then run `docker compose pull && docker compose up -d`.

### All Commits

- fix: standardize US English translations to professional tone - [@MohamedBassem](https://github.com/MohamedBassem) in [`4f025f5`](karakeep-app/karakeep@4f025f5a)
- i18n: Sync weblate translations - [@weblate](https://github.com/weblate) in [`5387c98`](karakeep-app/karakeep@5387c982)
- tests: fix crawling and search e2e tests ([#2105](karakeep-app/karakeep#2105)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`c4bee9f`](karakeep-app/karakeep@c4bee9fe)
- feat(extension): Allow writing notes directly in the extension ([#2104](karakeep-app/karakeep#2104)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`098e56a`](karakeep-app/karakeep@098e56a8)
- fix(mobile): fix default address not correctly stored in settings - [@MohamedBassem](https://github.com/MohamedBassem) in [`a220319`](karakeep-app/karakeep@a2203196)
- feat(mobile): add custom headers configuration in sign-in screen ([#2103](karakeep-app/karakeep#2103)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`ec621bf`](karakeep-app/karakeep@ec621bf5)
- tests: Fix failing test - [@MohamedBassem](https://github.com/MohamedBassem) in [`27ed0a1`](karakeep-app/karakeep@27ed0a19)
- feat: Add what's new modal in the sidebar ([#2099](karakeep-app/karakeep#2099)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`474f642`](karakeep-app/karakeep@474f6429)
- feat: Add support for user uploaded files ([#2100](karakeep-app/karakeep#2100)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`31960fc`](karakeep-app/karakeep@31960fcd)
- refactor: consolidate multiple karakeep plugins into one package ([#2101](karakeep-app/karakeep#2101)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`99413db`](karakeep-app/karakeep@99413db0)
- fix: metascraper logo to go through proxy if one configured. fixes [#1863](karakeep-app/karakeep#1863) - [@MohamedBassem](https://github.com/MohamedBassem) in [`737b031`](karakeep-app/karakeep@737b0317)
- feat(extension): add tab bookmark badge indicator ([#1745](karakeep-app/karakeep#1745)) - [@qixing-jk](https://github.com/qixing-jk) in [`f0b0959`](karakeep-app/karakeep@f0b0959e)
- fix: restore image size in grid layout - [@MohamedBassem](https://github.com/MohamedBassem) in [`2056582`](karakeep-app/karakeep@2056582c)
- deps: Upgrade react-query to 5.90 - [@MohamedBassem](https://github.com/MohamedBassem) in [`560900b`](karakeep-app/karakeep@560900bb)
- feat: Support inline toggling for todos. fixes [#1931](karakeep-app/karakeep#1931) ([#1933](karakeep-app/karakeep#1933)) - [@BOTkirial](https://github.com/BOTkirial) in [`393bbd9`](karakeep-app/karakeep@393bbd9a)
- fix: fix monolith to respect crawler proxy - [@MohamedBassem](https://github.com/MohamedBassem) in [`085c832`](karakeep-app/karakeep@085c832c)
- feat(rss): Add import tags from RSS feed categories ([#2031](karakeep-app/karakeep#2031)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`5358682`](karakeep-app/karakeep@5358682a)
- fix: fix crash in search input when query contains a percent. fixes [#1941](karakeep-app/karakeep#1941) - [@MohamedBassem](https://github.com/MohamedBassem) in [`633686b`](karakeep-app/karakeep@633686b5)
- feat: Add view options to show tag/title and control image fit. Fixes [#1960](karakeep-app/karakeep#1960) - [@MohamedBassem](https://github.com/MohamedBassem) in [`34d2b48`](karakeep-app/karakeep@34d2b485)
- refactor: improve the userLocalSetting server functions - [@MohamedBassem](https://github.com/MohamedBassem) in [`bb00c99`](karakeep-app/karakeep@bb00c996)
- feat: Make search job timeout configurable - [@MohamedBassem](https://github.com/MohamedBassem) in [`965c603`](karakeep-app/karakeep@965c603d)
- feat: display notes on bookmark card ([#2083](karakeep-app/karakeep#2083)) - [@xuatz](https://github.com/xuatz) in [`33f4077`](karakeep-app/karakeep@33f40779)
- fix: Stricter SSRF validation ([#2082](karakeep-app/karakeep#2082)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`b63a49f`](karakeep-app/karakeep@b63a49fc)
- fix: correctly handle composition in search input. fixes [#2048](karakeep-app/karakeep#2048) - [@MohamedBassem](https://github.com/MohamedBassem) in [`c6ebceb`](karakeep-app/karakeep@c6ebceb9)
- fix: browser service connection check using dns instead. Fixes [#2080](karakeep-app/karakeep#2080) - [@MohamedBassem](https://github.com/MohamedBassem) in [`c9c73d4`](karakeep-app/karakeep@c9c73d41)
- fix: More memory optimizations for crawler worker. [#1748](karakeep-app/karakeep#1748) - [@MohamedBassem](https://github.com/MohamedBassem) in [`40d548b`](karakeep-app/karakeep@40d548bd)
- fix: fix screenshot filepath in crawler - [@MohamedBassem](https://github.com/MohamedBassem) in [`0704b8b`](karakeep-app/karakeep@0704b8bb)
- docs: Add Azure configuration details for OpenAI-compatible API ([#2072](karakeep-app/karakeep#2072)) - [@cloudchristoph](https://github.com/cloudchristoph) in [`bd9c933`](karakeep-app/karakeep@bd9c933b)
- fix: Respect abort signal in admin maintenance jobs - [@MohamedBassem](https://github.com/MohamedBassem) in [`8a330dc`](karakeep-app/karakeep@8a330dc2)
- deps: Upgrade metascraper plugins - [@MohamedBassem](https://github.com/MohamedBassem) in [`e43c7e0`](karakeep-app/karakeep@e43c7e0f)
- deps: Upgrade metascraper-readability 5.49.6 - [@MohamedBassem](https://github.com/MohamedBassem) in [`6d234de`](karakeep-app/karakeep@6d234de8)
- feat: Allow configuring inline asset size threshold - [@MohamedBassem](https://github.com/MohamedBassem) in [`cf3ffff`](karakeep-app/karakeep@cf3ffff0)
- feat: Add admin maintenance job to migrate large inline HTML ([#2071](karakeep-app/karakeep#2071)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`2b769cb`](karakeep-app/karakeep@2b769cba)
- fix(inferance): skip token slicing when content is already witin max length - [@MohamedBassem](https://github.com/MohamedBassem) in [`1713600`](karakeep-app/karakeep@17136006)
- refactor: generalize tidy assets queue into admin maintenance ([#2059](karakeep-app/karakeep#2059)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`6ea5dd1`](karakeep-app/karakeep@6ea5dd19)
- fix: update OpenAI API to use max\_completion\_tokens instead of max\_tokens ([#2000](karakeep-app/karakeep#2000)) - [@BenjaminMichaelis](https://github.com/BenjaminMichaelis) in [`046c29d`](karakeep-app/karakeep@046c29dc)
- fix(restate): Fix priority for restate queue - [@MohamedBassem](https://github.com/MohamedBassem) in [`8c0aae3`](karakeep-app/karakeep@8c0aae33)
- fix(restate): Ensure that the semaphore and idProvider services are ingress private - [@MohamedBassem](https://github.com/MohamedBassem) in [`cdf8121`](karakeep-app/karakeep@cdf81213)
- feat: Add source field to track bookmark creation sources ([#2037](karakeep-app/karakeep#2037)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`2defc24`](karakeep-app/karakeep@2defc247)
- feat: support passing multiple proxy values ([#2039](karakeep-app/karakeep#2039)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`c14b693`](karakeep-app/karakeep@c14b6934)
- deps: Upgrade oxlint to 1.22 - [@MohamedBassem](https://github.com/MohamedBassem) in [`88a7ffe`](karakeep-app/karakeep@88a7ffec)
- feat: Add service dependency checks in the server overview page - [@MohamedBassem](https://github.com/MohamedBassem) in [`fda1c85`](karakeep-app/karakeep@fda1c851)
- fix(web): Add w-full to tags editor to prevent unusable narrow width ([#2035](karakeep-app/karakeep#2035)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`7ee9416`](karakeep-app/karakeep@7ee9416e)
- fix(api): Return 200 when bookmark already exists instead of 200 - [@MohamedBassem](https://github.com/MohamedBassem) in [`f2dec26`](karakeep-app/karakeep@f2dec26f)
- tests: Add a test for the GET /bookmarks/bookmarkId/lists api - [@MohamedBassem](https://github.com/MohamedBassem) in [`d578038`](karakeep-app/karakeep@d5780388)
- fix(api): Document the API for getting lists of a bookmark. fixes [#2030](karakeep-app/karakeep#2030) - [@MohamedBassem](https://github.com/MohamedBassem) in [`7f138b9`](karakeep-app/karakeep@7f138b99)
- feat: make list dropdown searchable in Manage Lists modal ([#2029](karakeep-app/karakeep#2029)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`87053d2`](karakeep-app/karakeep@87053d2e)
- fix: fix dev script shebang for better compatibility ([#2019](karakeep-app/karakeep#2019)) - [@maya-doshi](https://github.com/maya-doshi) in [`dcddda5`](karakeep-app/karakeep@dcddda56)
- fix: Correct grammatical errors in prompts ([#2020](karakeep-app/karakeep#2020)) - [@atsggx](https://github.com/atsggx) in [`f1e8cea`](karakeep-app/karakeep@f1e8cea2)
- docs: Add karakeep-sync to community projects ([#1994](karakeep-app/karakeep#1994)) - [@sidoshi](https://github.com/sidoshi) in [`36ffbdf`](karakeep-app/karakeep@36ffbdf8)
- fix: round feed refresh hour for idempotency ([#2013](karakeep-app/karakeep#2013)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`bae8386`](karakeep-app/karakeep@bae8386d)
- fix: fix show no bookmark page when there isn't search results - [@MohamedBassem](https://github.com/MohamedBassem) in [`57d731b`](karakeep-app/karakeep@57d731ba)
- fix: Disable idempotency keys for search - [@MohamedBassem](https://github.com/MohamedBassem) in [`b6867be`](karakeep-app/karakeep@b6867be4)
- feat: Restate-based queue plugin ([#2011](karakeep-app/karakeep#2011)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`74a1f7b`](karakeep-app/karakeep@74a1f7b6)
- feat: Revamp import experience ([#2001](karakeep-app/karakeep#2001)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`4a580d7`](karakeep-app/karakeep@4a580d71)
- docs: Add doc updates for prometheus metrics ([#1957](karakeep-app/karakeep#1957)) - [@claytono](https://github.com/claytono) in [`5e331a7`](karakeep-app/karakeep@5e331a7d)
- fix: fix public list sharing for empty lists ([#1990](karakeep-app/karakeep#1990)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`7df6d94`](karakeep-app/karakeep@7df6d942)
- feat: recursive list delete ([#1989](karakeep-app/karakeep#1989)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`7d0b414`](karakeep-app/karakeep@7d0b414f)
- feat: use jpegs for screenshots instead of pngs - [@MohamedBassem](https://github.com/MohamedBassem) in [`ed1f24f`](karakeep-app/karakeep@ed1f24f2)
- feat: Stop downloading video/audio in playwright - [@MohamedBassem](https://github.com/MohamedBassem) in [`37845f9`](karakeep-app/karakeep@37845f99)
- fix: Abort dangling processing when crawler is aborted ([#1988](karakeep-app/karakeep#1988)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`9eecda1`](karakeep-app/karakeep@9eecda18)
- fix: Cleanup temp assets on monolith timeout - [@MohamedBassem](https://github.com/MohamedBassem) in [`8dd84ef`](karakeep-app/karakeep@8dd84ef5)
- chore: Silence lint on <a> and <img> tags when it's intentional - [@MohamedBassem](https://github.com/MohamedBassem) in [`cdbedf6`](karakeep-app/karakeep@cdbedf6c)
- fix: dont re-enqueue indexing for a bookmark already pending indexing - [@MohamedBassem](https://github.com/MohamedBassem) in [`e395ac2`](karakeep-app/karakeep@e395ac27)
- feat: Add tag search and pagination ([#1987](karakeep-app/karakeep#1987)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`62f7d90`](karakeep-app/karakeep@62f7d900)
- fix: optimize memory usage of tag listing - [@MohamedBassem](https://github.com/MohamedBassem) in [`9fe09bf`](karakeep-app/karakeep@9fe09bfa)
- deps: Upgrade oxlint to 1.16 - [@MohamedBassem](https://github.com/MohamedBassem) in [`bbc5e6c`](karakeep-app/karakeep@bbc5e6c2)
- fix: fix bundling liteque in the workers - [@MohamedBassem](https://github.com/MohamedBassem) in [`851d3e2`](karakeep-app/karakeep@851d3e29)
- refactor: Move callsites to liteque to be behind a plugin - [@MohamedBassem](https://github.com/MohamedBassem) in [`8d32055`](karakeep-app/karakeep@8d320554)
- fix(dev): worker not started properly in helper start script ([#1946](karakeep-app/karakeep#1946)) - [@xuatz](https://github.com/xuatz) in [`6ba61b4`](karakeep-app/karakeep@6ba61b46)
- feat: Regen api keys - [@MohamedBassem](https://github.com/MohamedBassem) in [`7671f4f`](karakeep-app/karakeep@7671f4ff)
- release(cli): Bump CLI version to 0.27.1 - [@MohamedBassem](https://github.com/MohamedBassem) in [`69ef2ff`](karakeep-app/karakeep@69ef2ffe)
- feat(cli): Give more targetting options for dump/migrate/wipe - [@MohamedBassem](https://github.com/MohamedBassem) in [`6501f69`](karakeep-app/karakeep@6501f69a)
- release(cli): Bump CLI version to 0.27.0 - [@MohamedBassem](https://github.com/MohamedBassem) in [`0700aab`](karakeep-app/karakeep@0700aab8)
- feat(cli): Implement a full account dump archive - [@MohamedBassem](https://github.com/MohamedBassem) in [`b9a8ca2`](karakeep-app/karakeep@b9a8ca29)
- feat(cli): Implement a wipe command in the CLI - [@MohamedBassem](https://github.com/MohamedBassem) in [`bc0e746`](karakeep-app/karakeep@bc0e7461)
- feat: Add scripts to migrate all content from one server to the other - [@MohamedBassem](https://github.com/MohamedBassem) in [`783f72c`](karakeep-app/karakeep@783f72cb)
- fix(web): Handle user deletion more gracefully - [@MohamedBassem](https://github.com/MohamedBassem) in [`92e357f`](karakeep-app/karakeep@92e357f1)
- feat: A better looking catch all error boundary - [@MohamedBassem](https://github.com/MohamedBassem) in [`d53b282`](karakeep-app/karakeep@d53b2826)
- fix(web): fix error when attempting to merge tags. fixes [#1938](karakeep-app/karakeep#1938) - [@MohamedBassem](https://github.com/MohamedBassem) in [`d173b10`](karakeep-app/karakeep@d173b101)
- feat: Add Create Tag button to tags page ([#1942](karakeep-app/karakeep#1942)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`820b7e6`](karakeep-app/karakeep@820b7e65)
- chore: fix claude code action - [@MohamedBassem](https://github.com/MohamedBassem) in [`c2dcb9d`](karakeep-app/karakeep@c2dcb9dc)
- refactor: strongly type the search plugin interface - [@MohamedBassem](https://github.com/MohamedBassem) in [`bf5bf99`](karakeep-app/karakeep@bf5bf996)
- feat(search): add title search qualifier ([#1940](karakeep-app/karakeep#1940)) - [@MohamedBassem](https://github.com/MohamedBassem) in [`a92ada7`](karakeep-app/karakeep@a92ada77)
- feat(extension): add current tab title while saving from extension ([#1930](karakeep-app/karakeep#1930)) - [@Abel](https://github.com/Abel) in [`b594ff0`](karakeep-app/karakeep@b594ff09)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you are satisfied.

♻ **Rebasing**: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update again.

---

 - [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check this box

---

This PR has been generated by [Renovate Bot](https://github.com/renovatebot/renovate).

Reviewed-on: https://gitea.alexlebens.dev/alexlebens/infrastructure/pulls/2006
Co-authored-by: Renovate Bot <[email protected]>
Co-committed-by: Renovate Bot <[email protected]>