fix: Abort dangling processing when crawler is aborted #1988
Walkthrough

Adds abort-aware control flow and streamed, size-enforced asset downloads in apps/workers/workers/crawlerWorker.ts.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant W as Worker
    participant CW as CrawlerWorker
    participant P as BrowserPage
    participant N as HTTP Server
    participant FS as TempFileStore
    participant S as PersistentStore
    participant A as AbortSignal
    Note over CW,A: Abort-aware crawl with streamed asset downloads
    W->>CW: runCrawler(jobId + runNumber, A)
    CW->>P: page.goto(url)
    par Race navigation vs abort
        P-->>CW: navigate/loaded
        A-->>CW: abort signaled
    end
    alt Aborted
        CW-->>W: AbortError (cleanup temp files)
    else Loaded
        CW->>P: wait-for-load / extract HTML & metadata
        A-->>CW: abort?
        alt Aborted
            CW-->>W: AbortError
        else Continue
            CW->>P: optional screenshot (abort checked)
            CW->>N: fetch asset
            N-->>CW: response stream + headers
            CW->>FS: create temp file
            CW->>FS: pipeline(stream -> Transform[size-check] -> temp file)
            A-->>CW: abort?
            alt Aborted during download
                CW->>FS: cleanup temp file
                CW-->>W: AbortError
            else Download complete
                CW->>S: saveAssetFromFile(temp file, bytesRead)
                CW->>FS: cleanup temp file
                CW-->>W: result
            end
        end
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

Pre-merge checks: ❌ 1 warning, 1 inconclusive; ✅ 1 passed
Deploying karakeep-landing with Cloudflare Pages

Latest commit: 54b145e
Status: ✅ Deploy successful!
Preview URL: https://1dc89b2b.karakeep-landing.pages.dev
Branch Preview URL: https://abort-dangling-processing.karakeep-landing.pages.dev
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
apps/workers/workers/crawlerWorker.ts (1)
1006-1011: Abort guard keeps Promise.all pending forever

`abortPromise` never resolves; it only rejects when the signal fires. Including it in `Promise.all([...])` means the array never settles on the non-aborted path, so `crawlAndParseUrl` hangs indefinitely and the worker never moves forward. Please race the combined work against the abort signal instead of awaiting a never-resolving promise.

```diff
-  const [meta, readableContent, screenshotAssetInfo] = await Promise.all([
-    extractMetadata(htmlContent, browserUrl, jobId),
-    extractReadableContent(htmlContent, browserUrl, jobId),
-    storeScreenshot(screenshot, userId, jobId),
-    abortPromise(abortSignal),
-  ]);
+  const abortableWork = Promise.all([
+    extractMetadata(htmlContent, browserUrl, jobId),
+    extractReadableContent(htmlContent, browserUrl, jobId),
+    storeScreenshot(screenshot, userId, jobId),
+  ]);
+  await Promise.race([abortableWork, abortPromise(abortSignal)]);
+  const [meta, readableContent, screenshotAssetInfo] = await abortableWork;
```
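For readers outside the diff, here is a minimal, self-contained sketch of the race-then-await pattern being suggested; `runAbortable` is an illustrative name, not an existing helper in this repo.

```ts
// Sketch only: race arbitrary work against an AbortSignal.
async function runAbortable<T>(work: Promise<T>, signal: AbortSignal): Promise<T> {
  signal.throwIfAborted();
  const abortRejection = new Promise<never>((_, reject) =>
    signal.addEventListener("abort", () => reject(signal.reason), { once: true }),
  );
  // Swallow the rejection if `work` wins the race, so the losing promise
  // doesn't trigger an unhandledRejection warning later.
  abortRejection.catch(() => {});
  // Throws the abort reason if the signal fires first; otherwise the second
  // await simply unwraps the already-settled work promise.
  await Promise.race([work, abortRejection]);
  return work;
}
```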
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
apps/workers/workers/crawlerWorker.ts (14 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{ts,tsx,js,jsx}
📄 CodeRabbit inference engine (AGENTS.md)
Use oxlint as the linter for JavaScript/TypeScript code
Files:
apps/workers/workers/crawlerWorker.ts
apps/workers/**
📄 CodeRabbit inference engine (AGENTS.md)
Background workers code resides in apps/workers
Files:
apps/workers/workers/crawlerWorker.ts
🧬 Code graph analysis (1)
apps/workers/workers/crawlerWorker.ts (4)
- apps/workers/utils.ts (1): fetchWithProxy (62-71)
- packages/shared/assetdb.ts (4): newAssetId (126-128), saveAssetFromFile (177-202), saveAssetFromFile (369-395), saveAssetFromFile (651-678)
- packages/shared/tryCatch.ts (1): tryCatch (15-24)
- packages/shared-server/src/services/quotaService.ts (1): QuotaService (22-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: tests
- GitHub Check: typecheck
```ts
const expectedLength = response.headers.get("content-length");

const assetId = newAssetId();
assetPath = path.join(os.tmpdir(), assetId);

let bytesRead = 0;
const contentLengthEnforcer = new Transform({
  transform(chunk, _, callback) {
    bytesRead += chunk.length;

    if (abortSignal.aborted) {
      callback(new Error("AbortError"));
    } else if (bytesRead > serverConfig.maxAssetSizeMb * 1024 * 1024) {
      callback(
        new Error(
          `Content length exceeds maximum allowed size: ${serverConfig.maxAssetSizeMb}MB`,
        ),
      );
    } else {
      callback(null, chunk); // pass data along unchanged
    }
  },
  flush(callback) {
    if (expectedLength && bytesRead !== Number(expectedLength)) {
      callback(
        new Error(
          `Content length mismatch: expected ${expectedLength}, got ${bytesRead}`,
        ),
      );
    } else {
      callback();
    }
  },
});
```
Compressed downloads always fail length validation
`Response.body` from undici is already decompressed. When a server sends a `Content-Length` for the compressed payload (e.g., gzip), `bytesRead` grows to the decompressed size, so the `flush` check throws "Content length mismatch" on every compressed asset. That breaks downloads for most real-world sites. Guard the comparison so it only runs when the header parses to a finite number and the response isn't compressed.
```diff
-  const expectedLength = response.headers.get("content-length");
+  const rawContentLength = response.headers.get("content-length");
+  const parsedContentLength =
+    rawContentLength !== null ? Number(rawContentLength) : null;
+  const expectedLength =
+    parsedContentLength !== null &&
+    Number.isFinite(parsedContentLength) &&
+    !response.headers.get("content-encoding")
+      ? parsedContentLength
+      : null;
  ...
-  if (expectedLength && bytesRead !== Number(expectedLength)) {
+  if (expectedLength !== null && bytesRead !== expectedLength) {
```
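As a standalone illustration of the guarded check the diff above implements, here is a sketch under the assumption (per undici's behavior) that `response.body` yields decoded bytes; the helper name is hypothetical.

```ts
// Returns the byte count to validate against, or null when validation is
// unsafe: a missing/unparseable header, or a compressed body whose decoded
// size will not match the Content-Length of the encoded payload.
function expectedBodyLength(headers: Headers): number | null {
  const raw = headers.get("content-length");
  if (raw === null) return null;
  const parsed = Number(raw);
  if (!Number.isFinite(parsed)) return null;
  if (headers.get("content-encoding")) return null; // e.g. gzip, br
  return parsed;
}
```

With this shape, a gzip response whose header advertises the compressed size simply skips the mismatch check instead of failing every download.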
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
apps/workers/workers/crawlerWorker.ts (15 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{ts,tsx,js,jsx}
📄 CodeRabbit inference engine (AGENTS.md)
Use oxlint as the linter for JavaScript/TypeScript code
Files:
apps/workers/workers/crawlerWorker.ts
apps/workers/**
📄 CodeRabbit inference engine (AGENTS.md)
Background workers code resides in apps/workers
Files:
apps/workers/workers/crawlerWorker.ts
🧬 Code graph analysis (1)
apps/workers/workers/crawlerWorker.ts (4)
- apps/workers/utils.ts (1): fetchWithProxy (62-71)
- packages/shared/assetdb.ts (4): newAssetId (126-128), saveAssetFromFile (177-202), saveAssetFromFile (369-395), saveAssetFromFile (651-678)
- packages/shared/tryCatch.ts (1): tryCatch (15-24)
- packages/shared-server/src/services/quotaService.ts (1): QuotaService (22-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: tests
- GitHub Check: typecheck
Actionable comments posted: 2
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
apps/workers/workers/crawlerWorker.ts (15 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{ts,tsx,js,jsx}
📄 CodeRabbit inference engine (AGENTS.md)
Use oxlint as the linter for JavaScript/TypeScript code
Files:
apps/workers/workers/crawlerWorker.ts
apps/workers/**
📄 CodeRabbit inference engine (AGENTS.md)
Background workers code resides in apps/workers
Files:
apps/workers/workers/crawlerWorker.ts
🧬 Code graph analysis (1)
apps/workers/workers/crawlerWorker.ts (4)
- apps/workers/utils.ts (1): fetchWithProxy (62-71)
- packages/shared/assetdb.ts (4): newAssetId (126-128), saveAssetFromFile (177-202), saveAssetFromFile (369-395), saveAssetFromFile (651-678)
- packages/shared/tryCatch.ts (1): tryCatch (15-24)
- packages/shared-server/src/services/quotaService.ts (1): QuotaService (22-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: tests
- GitHub Check: typecheck
```ts
await streamPipeline(
  response.body,
  contentLengthEnforcer,
  fsSync.createWriteStream(assetPath),
);
```
Fix: Convert fetch body to a Node stream before piping
Line 683: `stream.pipeline` expects a Node `Readable`, but `fetchWithProxy` returns an Undici/WHATWG `ReadableStream`. Passing it directly throws `ERR_INVALID_ARG_TYPE`, so every download aborts at runtime. Please wrap the body with `Readable.fromWeb(...)` (and import `Readable`) before piping it into the temp file.

```diff
-import { pipeline, Transform } from "stream";
+import { pipeline, Transform, Readable } from "stream";
 …
 await streamPipeline(
-  response.body,
+  Readable.fromWeb(response.body),
   contentLengthEnforcer,
   fsSync.createWriteStream(assetPath),
 );
```
🤖 Prompt for AI Agents
In apps/workers/workers/crawlerWorker.ts around lines 681 to 685, the code passes the WHATWG/Undici Response.body directly into stream.pipeline, which expects a Node Readable, causing ERR_INVALID_ARG_TYPE. Wrap the fetch body with Readable.fromWeb(response.body) before piping, and add the Node stream import (import { Readable } from "stream", or the CommonJS equivalent) at the top of the file so streamPipeline receives a proper Node Readable.
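As a self-contained illustration of this fix, assuming Node 18+ (where `Readable.fromWeb` and `stream/promises` are available) and a global `fetch`; `downloadToFile` is a hypothetical helper, not the repo's API:

```ts
import { createWriteStream } from "node:fs";
import { Readable } from "node:stream";
import { pipeline } from "node:stream/promises";
import type { ReadableStream as WebReadableStream } from "node:stream/web";

async function downloadToFile(url: string, destPath: string): Promise<void> {
  const response = await fetch(url);
  if (!response.ok || response.body === null) {
    throw new Error(`Download failed with status ${response.status}`);
  }
  // fetch returns a WHATWG ReadableStream; pipeline needs a Node Readable.
  // The cast bridges the DOM stream type to node:stream/web's declaration.
  const body = Readable.fromWeb(response.body as unknown as WebReadableStream);
  await pipeline(body, createWriteStream(destPath));
}
```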
```ts
  logger.error(
    `[Crawler][${jobId}] Failed to download and store ${fileType}: ${e}`,
  );
  return null;
} finally {
```
Propagate aborts instead of treating them as download failures
Line 715: When the abort signal fires, either fetch or the transform surfaces an AbortError, but this catch block logs it and returns null. In the PDF/image path (`handleAsAssetBookmark`), that means the crawler exits normally instead of honoring the abort, which is the regression this PR is trying to avoid. Please rethrow abort-driven failures (e.g., check `abortSignal.aborted` or `e?.name === "AbortError"` and throw the abort reason) so callers can stop the job.

```diff
 } catch (e) {
-  logger.error(
-    `[Crawler][${jobId}] Failed to download and store ${fileType}: ${e}`,
-  );
-  return null;
+  if (
+    abortSignal.aborted ||
+    (e instanceof Error && e.name === "AbortError")
+  ) {
+    throw abortSignal.reason ?? e;
+  }
+  logger.error(
+    `[Crawler][${jobId}] Failed to download and store ${fileType}: ${e}`,
+  );
+  return null;
 } finally {
```
} finally {🤖 Prompt for AI Agents
In apps/workers/workers/crawlerWorker.ts around lines 713-717, the catch
currently logs all errors from fetch/transform and returns null which swallows
abort-driven failures; change the handler to detect aborts (e.g., if
(abortSignal?.aborted) or if (e?.name === "AbortError")) and rethrow the abort
(preferably throw abortSignal.reason || e) so the caller can stop the job, while
still logging non-abort errors and returning null for those.
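A hedged sketch of the intended call-site behavior: aborts escape, ordinary failures are logged and swallowed. `downloadAsset` stands in for the real download function and is not the repo's API.

```ts
async function tryDownload(
  downloadAsset: () => Promise<string>,
  abortSignal: AbortSignal,
): Promise<string | null> {
  try {
    return await downloadAsset();
  } catch (e) {
    // Aborts must propagate so the worker actually stops the job.
    if (abortSignal.aborted || (e instanceof Error && e.name === "AbortError")) {
      throw abortSignal.reason ?? e;
    }
    // Anything else is a per-asset failure the crawl can survive.
    console.warn(`download failed: ${String(e)}`);
    return null;
  }
}
```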
Actionable comments posted: 1
🧹 Nitpick comments (1)
apps/workers/workers/crawlerWorker.ts (1)
746-747: Verify temporary file path generation

Using `os.tmpdir()` with just the `assetId` as the filename could potentially cause conflicts if the same ID is generated multiple times. Consider using a more unique temporary filename:

```diff
- const assetPath = path.join(os.tmpdir(), assetId);
+ const assetPath = path.join(os.tmpdir(), `karakeep_archive_${assetId}.html`);
```

This adds a prefix and extension for better file management and debugging.
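If collisions are a real concern, `fs.mkdtemp` sidesteps them entirely; a sketch (the prefix and helper name are illustrative):

```ts
import { mkdtemp } from "node:fs/promises";
import os from "node:os";
import path from "node:path";

// mkdtemp appends random characters to the prefix and creates the directory,
// so the returned path is unique even if two jobs share an asset ID.
async function tempAssetPath(assetId: string): Promise<string> {
  const dir = await mkdtemp(path.join(os.tmpdir(), "karakeep-archive-"));
  return path.join(dir, `${assetId}.html`);
}
```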
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
apps/workers/workers/crawlerWorker.ts (15 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.{ts,tsx,js,jsx}
📄 CodeRabbit inference engine (AGENTS.md)
Use oxlint as the linter for JavaScript/TypeScript code
Files:
apps/workers/workers/crawlerWorker.ts
apps/workers/**
📄 CodeRabbit inference engine (AGENTS.md)
Background workers code resides in apps/workers
Files:
apps/workers/workers/crawlerWorker.ts
🧬 Code graph analysis (1)
apps/workers/workers/crawlerWorker.ts (4)
- apps/workers/utils.ts (1): fetchWithProxy (62-71)
- packages/shared/assetdb.ts (4): newAssetId (126-128), saveAssetFromFile (177-202), saveAssetFromFile (369-395), saveAssetFromFile (651-678)
- packages/shared/tryCatch.ts (1): tryCatch (15-24)
- packages/shared-server/src/services/quotaService.ts (1): QuotaService (22-93)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: tests
- GitHub Check: typecheck
🔇 Additional comments (13)
apps/workers/workers/crawlerWorker.ts (13)
457-463: Verify null handling for navigation response

When `abortPromise` wins the race on line 462, `response` becomes `null`, but lines 523-524 still access `response?.status()`, which correctly handles null. This is good defensive programming: the null-safe pattern `response?.status() ?? 0` properly handles the abort case where the response is null. Well done!

476-477: Good practice: Immediate abort check after async operations

Adding `abortSignal.throwIfAborted()` after the wait ensures the crawler stops immediately when aborted, preventing unnecessary processing.

715-719: Good practice: Cleanup temporary files in finally block

The finally block ensures temporary files are cleaned up even on errors, preventing disk space leaks.

1007-1018: Effective abort handling with race conditions

Excellent pattern for handling concurrent operations with abort support: the implementation races the extractors against abort, then awaits the work promise to either get the results or propagate any errors that occurred during execution, so the work completes or aborts cleanly.

1174-1174: Good improvement: Include runNumber in jobId for uniqueness

Adding `runNumber` to the jobId ensures unique identifiers across retries, improving log traceability.

1201-1201: Consistent abort checking throughout the flow

The abort check after `getContentType` ensures the job stops if aborted during content-type detection. Good placement of the abort check after the async operation. Based on learnings.

783-783: Good error handling: Cleanup on quota failure

Properly cleaning up the temporary file when the quota check fails prevents orphaned files.

635-636: Stream-based architecture improves memory efficiency

The transition from in-memory buffers to stream-based file operations with temporary files is a significant improvement: large assets are downloaded without memory exhaustion (a compact sketch follows below). Also applies to: 654-656.
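A compact sketch of the pattern described here: stream the response through a size cap into a temp file instead of buffering it. Names are illustrative, assuming Node 18+.

```ts
import { createWriteStream } from "node:fs";
import type { Readable } from "node:stream";
import { Transform } from "node:stream";
import { pipeline } from "node:stream/promises";

function sizeCap(maxBytes: number): Transform {
  let bytesRead = 0;
  return new Transform({
    transform(chunk: Buffer, _enc, callback) {
      bytesRead += chunk.length;
      if (bytesRead > maxBytes) {
        // Erroring here destroys the whole pipeline, so an oversized asset
        // stops mid-download instead of ever sitting fully in memory.
        callback(new Error(`asset exceeds ${maxBytes} bytes`));
      } else {
        callback(null, chunk);
      }
    },
  });
}

async function streamToTempFile(body: Readable, tmpPath: string, maxBytes: number) {
  await pipeline(body, sizeCap(maxBytes), createWriteStream(tmpPath));
}
```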
641-643: Fix needed: Convert WHATWG stream to Node stream

`fetchWithProxy` returns a WHATWG ReadableStream, but `pipeline` expects a Node.js Readable stream. This will cause a runtime error.

657-677: Verify content-length validation for compressed responses

The content-length enforcer doesn't account for compressed responses, where the `Content-Length` header refers to the compressed size but `bytesRead` tracks decompressed bytes.

710-719: Critical: Propagate abort errors instead of swallowing them

The catch block logs all errors and returns `null`, which prevents abort propagation. This defeats the purpose of abort handling.

679-683: Stream conversion required before pipeline

The WHATWG stream needs to be converted to a Node.js stream. Import `Readable` from stream and convert the response body:

```diff
-import { Transform } from "stream";
+import { Transform, Readable } from "stream";
 ...
 await pipeline(
-  response.body,
+  Readable.fromWeb(response.body),
   contentLengthEnforcer,
   fsSync.createWriteStream(assetPath),
 );
```
78-101: Critical fix needed: Prevent unhandled rejections in abortPromise

The current implementation has a critical issue where unhandled rejections can occur when the promise is rejected after being raced but before being awaited. The rejected promise created on line 80 isn't properly caught, and the main promise (line 87) can still reject after its race resolves. Apply this fix to properly handle rejections and clean up listeners:

```diff
 function abortPromise(signal: AbortSignal): Promise<never> {
   if (signal.aborted) {
-    const p = Promise.reject(signal.reason ?? new Error("AbortError"));
-    p.catch(() => {
-      /* empty */
-    }); // suppress unhandledRejection if not awaited
-    return p;
+    return Promise.reject(signal.reason ?? new Error("AbortError"));
   }
-  const p = new Promise<never>((_, reject) => {
-    signal.addEventListener(
-      "abort",
-      () => {
-        reject(signal.reason ?? new Error("AbortError"));
-      },
-      { once: true },
-    );
-  });
+  let onAbort: (() => void) | undefined;
+  const promise = new Promise<never>((_, reject) => {
+    onAbort = () => {
+      signal.removeEventListener("abort", onAbort!);
+      reject(signal.reason ?? new Error("AbortError"));
+    };
+    signal.addEventListener("abort", onAbort, { once: true });
+  });
-  p.catch(() => {
-    /* empty */
-  });
-  return p;
+  // Attach a no-op catch handler to prevent unhandled rejection warnings
+  // when the promise is raced and loses
+  promise.catch(() => {});
+  return promise;
 }
```
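Outside the diff, the essential shape of a leak-safe helper looks roughly like this (a sketch, not the repo's final implementation):

```ts
function abortPromise(signal: AbortSignal): Promise<never> {
  const rejection = signal.aborted
    ? Promise.reject(signal.reason ?? new Error("AbortError"))
    : new Promise<never>((_, reject) =>
        // { once: true } removes the listener automatically after it fires.
        signal.addEventListener(
          "abort",
          () => reject(signal.reason ?? new Error("AbortError")),
          { once: true },
        ),
      );
  // A no-op catch keeps a raced-and-lost (or never-awaited) rejection from
  // surfacing as an unhandledRejection.
  rejection.catch(() => {});
  return rejection;
}
```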
```ts
    abortPromise(abortSignal).then(() => Buffer.from("")),
  ]),
);
abortSignal.throwIfAborted();
```
Potential issue: Empty buffer on abort may cause issues downstream
When the abort promise wins on line 506, it returns an empty buffer `Buffer.from("")`. This could be misleading, as it appears successful but with empty data. Consider throwing the abort error instead of returning an empty buffer:

```diff
- abortPromise(abortSignal).then(() => Buffer.from("")),
+ abortPromise(abortSignal),
```

This way, the error handling on line 510 will properly detect the abort.
🤖 Prompt for AI Agents
In apps/workers/workers/crawlerWorker.ts around lines 506 to 509, returning an empty Buffer when the abort promise wins hides the abort as a successful empty result. Change the abort branch so it throws/rejects the abort error instead of resolving Buffer.from("") (e.g. have abortPromise reject with the AbortError or propagate abortSignal.reason), so the race surfaces the abort as an exception and the existing abortSignal.throwIfAborted() / surrounding error handling will detect it.
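Sketch of the resulting shape: the screenshot race surfaces the abort as an exception rather than a bogus empty buffer. `takeScreenshot` is a stand-in for the page screenshot call, declared here only for illustration.

```ts
declare function takeScreenshot(): Promise<Buffer>;
declare function abortPromise(signal: AbortSignal): Promise<never>;

async function screenshotOrAbort(signal: AbortSignal): Promise<Buffer> {
  // If the signal fires first this rejects with the abort reason, so the
  // caller's error handling sees a real abort instead of Buffer.from("").
  return Promise.race([takeScreenshot(), abortPromise(signal)]);
}
```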
Referenced by a Renovate update PR bumping ghcr.io/karakeep-app/karakeep from 0.27.1 to 0.28.0; the v0.28.0 release notes include this change: "fix: Abort dangling processing when crawler is aborted (#1988)".
An attempt at #1748