Switch to raw fetch #66
Conversation
Walkthrough

Replaced the higher-level OData fetch API with a low-level `createRawFetch` (returns `{ baseURL, fetch }`), switched the adapter to build OData query strings via `odata-query` and use RESTful endpoints (GET/POST/PATCH/DELETE by id), updated migrate/CLI/tests to the new fetch shape and endpoints, and updated package deps.
Sequence Diagram(s)

```mermaid
sequenceDiagram
  autonumber
  actor App
  participant Adapter
  participant OData as OData Server
  App->>Adapter: findMany({ model, where, orderBy, limit, offset })
  Adapter->>Adapter: buildQuery({ $filter, $orderby, $top, $skip })
  Adapter->>OData: GET /{model}{?query}
  OData-->>Adapter: 200 JSON { value: [...] }
  Adapter-->>App: records[]
  App->>Adapter: update({ model, where, data })
  Adapter->>Adapter: Resolve id via GET /{model}?$select=id&$top=1&$filter=...
  Adapter->>OData: PATCH /{model}('id') body: data
  OData-->>Adapter: 200/204
  Adapter->>OData: GET /{model}('id')
  OData-->>Adapter: 200 JSON
  Adapter-->>App: updated record
```

```mermaid
sequenceDiagram
  autonumber
  actor Dev
  participant CLI
  participant Migrate as Migration Module
  participant OData as OData Server
  Dev->>CLI: run migration
  CLI->>Migrate: planMigration(fetch, schema, db)
  Migrate->>OData: GET /$metadata
  OData-->>Migrate: 200 or error
  Migrate-->>CLI: plan
  CLI->>Migrate: executeMigration(fetch, plan)
  alt create table
    Migrate->>OData: POST /FileMaker_Tables body: {...}
    OData-->>Migrate: 201 or error
  end
  alt update table
    Migrate->>OData: PATCH /FileMaker_Tables/{tableName}
    OData-->>Migrate: 200/204 or error
  end
  Migrate-->>CLI: result
```
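To make the new shape concrete, here is a minimal, hypothetical sketch of a findMany-style call built on `createRawFetch` and `odata-query`. The `serverUrl`/`auth`/`logging` option names, the `user` table, and the validation schema are illustrative and inferred from the review, not taken verbatim from the PR.

```ts
import { z } from "zod/v4";
import buildQuery from "odata-query";
import { createRawFetch } from "./odata";

// Illustrative config values; real option names come from the adapter config.
const { baseURL, fetch } = createRawFetch({
  serverUrl: "https://fms.example.com/fmi/odata/v4/MyDatabase",
  auth: { username: "admin", password: "secret" },
  logging: false,
});

// findMany-style call: build the OData query string, then GET the collection.
const query = buildQuery({
  top: 10,
  filter: { email: "[email protected]" }, // => $filter=email eq '[email protected]'
  orderBy: [{ createdAt: "desc" }],       // structured form (see review notes below)
});

const result = await fetch(`/user${query}`, {
  method: "GET",
  // Optional runtime validation of the OData response envelope.
  output: z.object({ value: z.array(z.any()) }),
});

if (result.error) throw new Error(`findMany failed: ${String(result.error)}`);
console.log(baseURL, result.data?.value);
```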
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
Actionable comments posted: 7
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
packages/better-auth/src/cli/index.ts (1)
67-78: Harden initialization and make verbose logging opt-in via a CLI flag.

Right now, createRawFetch can throw (invalid URL), and verbose logs are always on. Wrap initialization in try/catch and add a `--verbose` flag to avoid noisy output by default and to exit non-zero on failures.

Apply within this block:

```diff
-  const { fetch } = createRawFetch({
+  let fetch;
+  try {
+    ({ fetch } = createRawFetch({
       ...adapterConfig.odata,
       auth:
         // If the username and password are provided in the CLI, use them to authenticate instead of what's in the config file.
         options.username && options.password
           ? {
               username: options.username,
               password: options.password,
             }
           : adapterConfig.odata.auth,
-      logging: "verbose", // Enable logging for CLI operations
-    });
+      logging: options.verbose ? "verbose" : true,
+    }));
+  } catch (e) {
+    logger.error(
+      `Failed to initialize FileMaker connection: ${
+        e instanceof Error ? e.message : String(e)
+      }`,
+    );
+    process.exit(1);
+  }
```

And define the flag (outside this hunk, near other options):

```diff
 .option("-p, --password <password>", "Full Access Password")
+.option("-v, --verbose", "Enable verbose HTTP logging", false)
 .option("-y, --yes", "Skip confirmation", false)
```

packages/better-auth/src/adapter.ts (1)
102-145: Guard against empty/invalid "in" clauses to avoid malformed filters.

If `operator === "in"` but `value` is not an array or is empty, `clause` stays empty and you still append connectors, yielding invalid OData.

Apply this diff:

```diff
-      clauses.push(clause);
-      // Add connector if not last
-      if (i < where.length - 1) {
-        clauses.push((cond.connector || "and").toLowerCase());
-      }
+      if (!clause) {
+        // Skip empty clauses (e.g., "in" with empty/non-array values)
+        continue;
+      }
+      clauses.push(clause);
+      // Add connector if not last
+      if (i < where.length - 1) {
+        clauses.push((cond.connector || "and").toLowerCase());
+      }
```
🧹 Nitpick comments (16)
packages/better-auth/package.json (1)
66-66: Consider adding a test:ui script for @vitest/ui.

Nice addition for interactive debugging. Add a dedicated script so CI stays on `vitest run` while local devs can run the UI.

Proposed script addition (outside this hunk):

```diff
 "scripts": {
   "dev": "pnpm build:watch",
   "test": "vitest run",
+  "test:ui": "vitest --ui",
   "build": "vite build && publint --strict",
   "build:watch": "vite build --watch",
   "ci": "pnpm run build && pnpm run test",
   "prepublishOnly": "pnpm run ci"
 }
```

packages/better-auth/tests/adapter.test.ts (2)
20-28: Make verbose logging in tests conditional to reduce noise.

Verbose logs are helpful for debugging, but they drown test output. Consider toggling via an env var (e.g., `process.env.VITEST_VERBOSE`) or Vitest's built-in debug flags.

Example:

```diff
-      logging: "verbose", // Enable verbose logging to see the response details
+      logging: process.env.VITEST_VERBOSE ? "verbose" : false,
```
34-56: Database reset logic is correct; consider bulk delete via OData filter to speed up.

Looping and deleting per-record is fine but slow. If the backend supports collection deletes with `$filter`, issue one DELETE per table using `buildQuery` for safety. Ensure it's supported by your FM OData gateway before switching.

Illustrative change (outside this block):

```ts
const q = buildQuery({ filter: { id: { ne: "0" } } });
await fetch(`/${table}${q}`, { method: "DELETE" });
```

packages/better-auth/tests/migrate.test.ts (3)
31-33: Use a unique table name per run to avoid flakiness across parallel/previous runs.

Static names can collide if a previous run failed mid-flight.

Apply:

```diff
-  const tableName = "test_table";
+  const tableName = `test_table_${Date.now()}`;
```
41-57: Add output schemas or shape checks to catch server-side changes early.

You're checking result.error; also assert minimal shape (e.g., `{ ok: true }` or created resource fields) when available, using createRawFetch's `output` option.

Example (illustrative, adjust to actual API):

```ts
const createResult = await fetch("/FileMaker_Tables", {
  method: "POST",
  body: { tableName, fields: [/* ... */] },
  // output: z.object({ tableName: z.string() }).passthrough(),
});
expect(createResult.error).toBeUndefined();
```
77-95: Good explicit error checks on delete/patch flows; consider wrapping cleanup in finally.

If an earlier assertion throws, the table may remain. A try/finally around the body ensures cleanup runs.

Sketch (outside this hunk):

```ts
await (async () => {
  const tableName = `test_table_${Date.now()}`;
  try {
    // create + update + delete field
  } finally {
    await fetch(`/FileMaker_Tables/${tableName}`, { method: "DELETE" });
  }
})();
```

packages/better-auth/src/migrate.ts (1)
12-23: Consider a fallback for metadata JSON by using $format or XML parsing.

Some OData servers return $metadata as XML by default regardless of Accept. If JSON isn't available, attempt `/$metadata?$format=application/json` before returning null. Optionally, parse XML as a last resort.

Example (outside this hunk):

```ts
let result = await fetch("/$metadata", {
  method: "GET",
  headers: { accept: "application/json" },
  /* ... */
});
if (result.error) {
  const alt = await fetch("/$metadata?$format=application/json", {
    method: "GET",
    headers: { accept: "application/json" },
    /* ... */
  });
  if (!alt.error) result = alt;
}
```

packages/better-auth/src/adapter.ts (6)
60-91: Improve ISO date string detection and avoid misclassification.

The current regex misses timezone offsets and may misclassify strings that happen to match the pattern. Support offsets and tighten the matching.

Apply this diff:

```diff
-    const isoDateRegex = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d{3})?Z?$/;
+    const isoDateRegex =
+      /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d{1,3})?(?:Z|[+\-]\d{2}:\d{2})?$/;
```

Optional: Prefer Date objects over strings for date filtering to avoid ambiguous detection.
Would you like me to add adapter tests that exercise offset-based timestamps to ensure this works end-to-end?
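For a quick sanity check, here is a standalone snippet exercising the tightened regex against offset-based and UTC timestamps (sample values are illustrative):

```ts
// Proposed tightened pattern from the diff above.
const isoDateRegex =
  /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d{1,3})?(?:Z|[+\-]\d{2}:\d{2})?$/;

const samples = [
  "2025-01-05T12:30:00Z",          // UTC -> true
  "2025-01-05T12:30:00.123-07:00", // offset with millis -> true
  "2025-01-05",                    // date only, not a timestamp -> false
  "not-a-date",                    // -> false
];

for (const s of samples) {
  console.log(s, isoDateRegex.test(s));
}
```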
256-283: Replace console logs with the library logger and include error context.

Use `logger` for consistency and to respect configured log levels. Also preserve the original error message.

Apply this diff:

```diff
-    console.log("DELETE", { model, where, filter });
+    logger.debug("DELETE", { model, where, filter });
@@
-    if (result.error) {
-      console.log("DELETE ERROR", result.error);
-      throw new Error("Failed to delete record");
-    }
+    if (result.error) {
+      logger.error("DELETE ERROR", result.error);
+      throw new Error(`Failed to delete record: ${result.error}`);
+    }
```
287-309: Speed up deleteMany with concurrent requests.

The current sequential loop will be slow for many IDs. Use a bounded concurrency approach or `Promise.allSettled`.

Apply this diff:

```diff
-    const ids = rows.data?.value?.map((r: any) => r.id) ?? [];
-    let deleted = 0;
-    for (const id of ids) {
-      const res = await fetch(`/${model}('${id}')`, {
-        method: "DELETE",
-      });
-      if (!res.error) deleted++;
-    }
-    return deleted;
+    const ids = rows.data?.value?.map((r: any) => r.id) ?? [];
+    const results = await Promise.allSettled(
+      ids.map((id) => fetch(`/${model}('${id}')`, { method: "DELETE" })),
+    );
+    return results.filter(
+      (r) => r.status === "fulfilled" && !(r.value as any).error,
+    ).length;
```

Also replace:

```diff
-    console.log("DELETE MANY", { model, where, filter });
+    logger.debug("DELETE MANY", { model, where, filter });
```
345-367: Speed up updateMany with concurrent requests; remove the unnecessary `as any`.

Similar to deleteMany, patching sequentially is slow. Also, the `as any` cast on the return value is unnecessary if the signature returns a number.

Apply this diff:

```diff
-    const ids = rows.data?.value?.map((r: any) => r.id) ?? [];
-    let updated = 0;
-    for (const id of ids) {
-      const res = await fetch(`/${model}('${id}')`, {
-        method: "PATCH",
-        body: update,
-      });
-      if (!res.error) updated++;
-    }
-    return updated as any;
+    const ids = rows.data?.value?.map((r: any) => r.id) ?? [];
+    const results = await Promise.allSettled(
+      ids.map((id) =>
+        fetch(`/${model}('${id}')`, { method: "PATCH", body: update }),
+      ),
+    );
+    return results.filter(
+      (r) => r.status === "fulfilled" && !(r.value as any).error,
+    ).length;
```
176-185: Remove leftover session-only console log or guard it with debugLogs.

`console.log("session", data)` is noisy and bypasses configured log levels.

Apply this diff:

```diff
-    if (model === "session") {
-      console.log("session", data);
-    }
+    if (config.debugLogs) {
+      logger.debug("CREATE", { model, data });
+    }
```
166-172: Comment/documentation mismatch on supports flags.

Comments say "Default: true" while the values are `false`. Align comments or values to avoid confusion for consumers of the adapter.

packages/better-auth/src/odata/index.ts (3)
121-126: Avoid logging sensitive response headers.

`Set-Cookie` and other sensitive headers may be included in logs. Redact or omit them when verbose logging is enabled.

Example:

```ts
const headersForLog = Object.fromEntries(
  [...response.headers.entries()].filter(([k]) => !/^set-cookie$/i.test(k)),
);
betterAuthLogger.info("raw-fetch", "Response headers:", headersForLog);
```
199-208: Coerce numeric text responses to numbers (helps with $count).

For `text/plain` numeric payloads, map them to numbers to simplify callers and validations.

Apply this diff:

```diff
-        responseData = await response.text();
+        responseData = await response.text();
+        const trimmed = responseData.trim();
+        if (/^\d+$/.test(trimmed)) {
+          responseData = Number(trimmed);
+        }
```
30-42: Base URL construction assumes no path in serverUrl.

Using `result.value.origin` discards any path segments provided in `serverUrl`. If users pass a URL with a base path, it will be ignored. Consider documenting this expectation or supporting paths.
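If supporting paths is the preferred route, here is a minimal sketch of one approach (the helper name is hypothetical and not part of this PR):

```ts
// Keep origin + path from serverUrl instead of only the origin,
// dropping query/hash and any trailing slash so request paths can
// be appended with a leading "/".
function buildBaseURL(serverUrl: string): string {
  const url = new URL(serverUrl); // assumes serverUrl already passed validation
  const path = url.pathname.replace(/\/+$/, "");
  return `${url.origin}${path}`;
}

// "https://fms.example.com/fmi/odata/v4/MyDatabase" keeps its base path;
// "https://fms.example.com/" resolves to just the origin.
```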
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (1)
`pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (7)
- packages/better-auth/package.json (1 hunks)
- packages/better-auth/src/adapter.ts (3 hunks)
- packages/better-auth/src/cli/index.ts (3 hunks)
- packages/better-auth/src/migrate.ts (3 hunks)
- packages/better-auth/src/odata/index.ts (1 hunks)
- packages/better-auth/tests/adapter.test.ts (3 hunks)
- packages/better-auth/tests/migrate.test.ts (5 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (5)
packages/better-auth/tests/adapter.test.ts (1)
packages/better-auth/src/odata/index.ts (1)
createRawFetch(30-263)
packages/better-auth/src/cli/index.ts (1)
packages/better-auth/src/odata/index.ts (1)
createRawFetch(30-263)
packages/better-auth/src/migrate.ts (1)
packages/better-auth/src/odata/index.ts (1)
createRawFetch(30-263)
packages/better-auth/src/adapter.ts (1)
packages/better-auth/src/odata/index.ts (1)
createRawFetch(30-263)
packages/better-auth/tests/migrate.test.ts (1)
packages/better-auth/src/odata/index.ts (1)
createRawFetch(30-263)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: build
- GitHub Check: build
🔇 Additional comments (12)
packages/better-auth/package.json (1)
58-58: Runtime usage of odata-query confirmed

Found an import of `buildQuery` from `"odata-query"` in `packages/better-auth/src/adapter.ts` (line 9), which is part of the package's runtime code. Keeping `"odata-query": "^8.0.4"` under `dependencies` is therefore correct.

packages/better-auth/src/cli/index.ts (1)

16-16: Import swap to createRawFetch is correct.

The CLI now aligns with the new `{ baseURL, fetch }` shape. Good.
packages/better-auth/tests/adapter.test.ts (3)
111-118: Good: validate POST response with a schema and throw on error.

This keeps test failures actionable and explicit.

138-145: Cleanup step looks fine; keep it non-throwing.

Deleting the test record at the end without failing the test on delete errors is pragmatic for cleanup.

4-6: Ignore the zod/v4 import fix

Our codebase intentionally uses Zod v4 via the `"zod/v4"` subpath. A project-wide search shows dozens of imports from `"zod/v4"` (e.g. packages/better-auth, packages/cli, packages/typegen, apps/docs, etc.), and our long-term conventions require `import { z } from "zod/v4"` for all schemas. No changes are needed here.

Likely an incorrect or invalid review comment.
packages/better-auth/tests/migrate.test.ts (1)
15-22: Switch to createRawFetch is correct; verbose logging is helpful during test runs.

No issues here.
packages/better-auth/src/migrate.ts (5)
8-9: Type constraint on fetch matches createRawFetch; good.

Updating signatures to `ReturnType<typeof createRawFetch>["fetch"]` keeps the surface consistent across callers.

24-27: Graceful handling when metadata retrieval fails.

Returning null and letting planMigration handle the empty state avoids crashes in the CLI.

170-185: Create step: improved error reporting and fail-fast behavior is solid.

Clear log + throwing preserves a good DX when migrations fail.

187-199: Update step: same as above; good fail-fast path with detailed context.

No issues here.
5-5: Critical: Replace all "zod/v4" imports and `.looseObject()` usage across the codebase

The verification command uncovered numerous remaining imports from `zod/v4` and calls to `.looseObject(...)`, including in the core packages you maintain. These must all be updated to the standard `import { z } from "zod"` and use `.object(...).passthrough()` instead of `.looseObject(...)`.

Files requiring urgent fixes:

- packages/better-auth/src/migrate.ts (lines 4–5, 15–18)
- packages/better-auth/src/adapter.ts (lines 7, 184)
- packages/better-auth/src/odata/index.ts (line 3)
- Any other modules under packages/better-auth still importing from zod/v4
- Core utilities in packages/cli and packages/typegen (multiple files)
- Demo/tests and MDX docs that reference `import { z } from "zod/v4"`

Example patch for packages/better-auth/src/migrate.ts:

```diff
-import z from "zod/v4";
+import { z } from "zod";
 import { createRawFetch } from "./odata";
 …
-    output: z
-      .looseObject({
-        $Version: z.string(),
-        "@ServerVersion": z.string(),
-      })
-      .or(z.null())
-      .catch(null),
+    output: z
+      .object({
+        $Version: z.string(),
+        "@ServerVersion": z.string(),
+      })
+      .passthrough()
+      .or(z.null())
+      .catch(null),
```

And for packages/better-auth/src/adapter.ts:

```diff
-import { prettifyError, z } from "zod/v4";
+import { prettifyError, z } from "zod";
 …
-    output: z.looseObject({ id: z.string() }),
+    output: z
+      .object({ id: z.string() })
+      .passthrough(),
```

Please apply equivalent updates in all listed modules to prevent runtime errors under Zod 3.x and ensure consistency.
⛔ Skipped due to learnings
Learnt from: eluce2
PR: proofgeist/proofkit#59
File: apps/docs/src/registry/lib/types.ts:1-1
Timestamp: 2025-08-13T15:42:49.653Z
Learning: In the proofgeist/proofkit codebase, `import { z } from "zod/v4";` is the correct import pattern for Zod, not the standard `import { z } from "zod";`.

packages/better-auth/src/adapter.ts (1)
259-269: Double-quoted "id" in $select may not be portable across OData servers.

Passing `select: ['"id"']` can yield `$select=%22id%22`. Some servers expect unquoted property names. Consider using `select: ["id"]` unless quoting is strictly required by FileMaker's OData dialect.

Would you like me to adjust and run a quick query against your test harness to confirm that `$select="id"` behaves identically to `$select=id`?
```ts
const query = buildQuery({
  top: limit,
  skip: offset,
  orderBy: sortBy
    ? `${sortBy.field} ${sortBy.direction ?? "asc"}`
    : undefined,
  filter: filter.length > 0 ? filter : undefined,
});
logger.debug("QUERY", query);
```
💡 Verification agent
🧩 Analysis chain
Confirm odata-query accepts raw string for filter/orderBy.
You're passing a prebuilt filter string and a raw 'field dir' string to buildQuery. Some versions of odata-query expect structured inputs. Verify compatibility to avoid silent query mishandling.
🌐 Web query:
Does the 'odata-query' package accept a raw string for 'filter' and 'orderBy' (e.g., filter: "Foo eq 1" and orderBy: "Foo asc") when calling buildQuery?
💡 Result:
The odata-query package does not accept raw OData strings directly for the filter or orderBy parameters when calling buildQuery. Instead, both parameters expect an object (for filter) or an array/object (for orderBy), which the library then converts into OData query syntax.
Details and Examples:
- `filter` parameter:
  - You cannot simply pass `filter: "Foo eq 1"`.
  - You must use an object or a supported odata-query "expression object." For example:

    ```ts
    const filter = { Foo: 1 }; // Becomes 'Foo eq 1'
    buildQuery({ filter }); // => '?$filter=Foo eq 1'
    ```

  - You may mix object notation with limited custom string snippets in logical groups, but only inside operators (`and`, `or`, `not`). For example:

    ```ts
    const filter = { and: [{ Foo: 1 }, 'startswith(Name, "foo")'] };
    buildQuery({ filter }); // => '?$filter=Foo eq 1 and startswith(Name, "foo")'
    ```

    Raw strings are only allowed inside logical group arrays, not as the top-level filter[3].

- `orderBy` parameter:
  - You cannot simply pass `orderBy: "Foo asc"`.
  - You must use an array or object notation describing the order, such as:

    ```ts
    buildQuery({ orderBy: [{ Foo: 'asc' }] }); // => '?$orderby=Foo asc'
    ```

    or

    ```ts
    buildQuery({ orderBy: { Foo: 'asc' } }); // => '?$orderby=Foo asc'
    ```

    The value can also be `'desc'` for descending sort[3].
Conclusion:
- Raw filter and orderBy strings are not supported as top-level values.
- You must use the object or array syntax provided by the library. Limited use of raw snippets is possible for complex case filters, but never directly as the filter/orderBy root value[3].
- This reflects the goal of the package: to help avoid hand-written OData strings and provide type safety.
For reference, see the npm documentation for [odata-query][3].
If you need to use a fully raw filter/orderBy string, you must build the full query string manually or use another library that supports direct string injection.
Key facts:
- `buildQuery({ filter: "Foo eq 1" })` → ❌ Not supported
- `buildQuery({ orderBy: "Foo asc" })` → ❌ Not supported
- `buildQuery({ filter: { Foo: 1 } })` → ✅ Supported
- `buildQuery({ orderBy: [{ Foo: 'asc' }] })` → ✅ Supported
[3]
Citations:
- 1: https://pub.dev/packages/odata_query
- 2: https://learn.microsoft.com/en-us/azure/search/search-query-odata-orderby
- 3: https://www.npmjs.com/package/odata-query
- 4: https://learn.microsoft.com/en-us/azure/search/query-odata-filter-orderby-syntax
- 5: https://www.odata.org/documentation/odata-version-2-0/uri-conventions/
- 6: https://cran.r-project.org/web/packages/ODataQuery/ODataQuery.pdf
- 7: https://www.odata.org/getting-started/basic-tutorial/
- 8: https://github.com/NuGet/Home/wiki/Filter-OData-query-requests
- 9: https://learn.microsoft.com/en-us/odata/concepts/queryoptions-overview
- 10: https://docs.oasis-open.org/odata/odata/v4.01/odata-v4.01-part2-url-conventions.html
Use structured filter and orderBy inputs with buildQuery
The odata-query package does not accept raw OData strings as the top-level values for filter or orderBy. Passing "Foo eq 1" or "Foo asc" directly will be ignored or mis-serialized. You must supply an object (for filter) or an object/array (for orderBy) so the library can generate valid query syntax.
- packages/better-auth/src/adapter.ts (around lines 232–241):
  - Replace the raw `filter` array with a structured expression (e.g. `{ and: filter }`).
  - Replace the raw "field dir" string for `orderBy` with an array or object mapping field to direction.

Suggested diff:

```diff
 const query = buildQuery({
   top: limit,
   skip: offset,
-  orderBy: sortBy
-    ? `${sortBy.field} ${sortBy.direction ?? "asc"}`
-    : undefined,
-  filter: filter.length > 0 ? filter : undefined,
+  orderBy: sortBy
+    ? [{ [sortBy.field]: sortBy.direction ?? "asc" }]
+    : undefined,
+  filter: filter.length > 0
+    ? { and: filter }
+    : undefined,
 });
 logger.debug("QUERY", query);
```

This ensures your filters and sort orders are serialized properly (e.g. ?$filter=Foo eq 1 and ...&$orderby=Foo asc).
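For reference, a small usage sketch of the structured forms above (illustrative values; exact parenthesization and encoding may vary by odata-query version):

```ts
import buildQuery from "odata-query";

const sortBy = { field: "createdAt", direction: "desc" as const };
const filter = [{ email: "[email protected]" }, { status: "active" }];

const query = buildQuery({
  top: 10,
  orderBy: [{ [sortBy.field]: sortBy.direction }],
  filter: { and: filter },
});
// roughly: ?$top=10&$orderby=createdAt desc&$filter=(email eq '[email protected]') and (status eq 'active')
```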
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In packages/better-auth/src/adapter.ts around lines 232 to 241, update the
buildQuery inputs so you pass structured objects instead of raw OData strings:
wrap the filter array as a logical expression (e.g. pass filter: filter.length ?
{ and: filter } : undefined) and convert the sortBy string into an object/array
form that odata-query expects (e.g. orderBy: sortBy ? [{ [sortBy.field]:
(sortBy.direction ?? "asc") }] : undefined); keep top/skip as-is and preserve
the debug log.
```ts
// Create authentication headers
const authHeaders: Record<string, string> = {};
if ("apiKey" in args.auth) {
  authHeaders.Authorization = `Bearer ${args.auth.apiKey}`;
} else {
  const credentials = btoa(`${args.auth.username}:${args.auth.password}`);
  authHeaders.Authorization = `Basic ${credentials}`;
}
```
Make Basic auth header cross-runtime (Node + Browser).
btoa is not available in Node.js. Use a safe fallback to Buffer when btoa is missing.
Apply this diff:
- if ("apiKey" in args.auth) {
- authHeaders.Authorization = `Bearer ${args.auth.apiKey}`;
- } else {
- const credentials = btoa(`${args.auth.username}:${args.auth.password}`);
- authHeaders.Authorization = `Basic ${credentials}`;
- }
+ if ("apiKey" in args.auth) {
+ authHeaders.Authorization = `Bearer ${args.auth.apiKey}`;
+ } else {
+ const raw = `${args.auth.username}:${args.auth.password}`;
+ const credentials =
+ typeof btoa === "function"
+ ? btoa(raw)
+ : Buffer.from(raw, "utf8").toString("base64");
+ authHeaders.Authorization = `Basic ${credentials}`;
+ }📝 Committable suggestion
🤖 Prompt for AI Agents
In packages/better-auth/src/odata/index.ts around lines 43 to 51, the code uses
btoa() to create Basic auth credentials which fails in Node; replace that with a
cross-runtime fallback: compute credentials string
`${args.auth.username}:${args.auth.password}` and base64-encode it using typeof
globalThis.btoa === 'function' ? globalThis.btoa(creds) : Buffer.from(creds,
'utf8').toString('base64'), then set authHeaders.Authorization = `Basic
${encoded}`; ensure you reference globalThis to avoid undefined btoa and do not
add new runtime-only imports.
```ts
// Merge headers
const headers = {
  "Content-Type": "application/json",
  ...authHeaders,
  ...(options?.headers || {}),
};

const requestInit: RequestInit = {
  ...options,
  headers,
  body: processedBody,
};
```
🛠️ Refactor suggestion
Only set Content-Type when a body is present; add Accept header.
Always sending Content-Type: application/json (even for GET/DELETE) can trigger unnecessary preflights. Also set Accept to prefer JSON responses.
Apply this diff:
```diff
-  const headers = {
-    "Content-Type": "application/json",
-    ...authHeaders,
-    ...(options?.headers || {}),
-  };
+  const headers = {
+    ...(processedBody !== undefined ? { "Content-Type": "application/json" } : {}),
+    Accept: "application/json",
+    ...authHeaders,
+    ...(options?.headers || {}),
+  };
```
🤖 Prompt for AI Agents
In packages/better-auth/src/odata/index.ts around lines 89 to 101, the code
always sets "Content-Type": "application/json" which causes unnecessary
preflight requests for methods without a body; modify the header merge so
Content-Type is only added when processedBody is not undefined (or when method
implies a body), add an "Accept": "application/json" header by default, and
ensure existing headers passed in options and authHeaders keep precedence (merge
defaults first, then authHeaders, then options.headers) so callers can override
these defaults.
| it("should sort descending", async () => { | ||
| const result = await adapter({}).findMany({ | ||
| model: "verification", | ||
| where: [ | ||
| { | ||
| field: "identifier", | ||
| operator: "eq", | ||
| value: "zyzaUHEsETWiuORCCdyguVVlVPcnduXk", | ||
| }, | ||
| ], | ||
| limit: 1, | ||
| sortBy: { direction: "desc", field: "createdAt" }, | ||
| }); | ||
|
|
||
| console.log(result); | ||
|
|
||
| // expect(result.data).toHaveLength(1); | ||
| }); |
Turn the “order by” smoke test into a real assertion.
The assertion is commented out, defeating the PR’s “Adds an order by test” objective. Also, use limit > 1 to verify sort ordering.
Apply:
- it("should sort descending", async () => {
- const result = await adapter({}).findMany({
+ it("should sort descending", async () => {
+ const result = await adapter({}).findMany({
model: "verification",
where: [
{
field: "identifier",
operator: "eq",
value: "zyzaUHEsETWiuORCCdyguVVlVPcnduXk",
},
],
- limit: 1,
+ limit: 5,
sortBy: { direction: "desc", field: "createdAt" },
});
- console.log(result);
-
- // expect(result.data).toHaveLength(1);
+ expect(result.error).toBeUndefined();
+ const recs = result.data ?? [];
+ for (let i = 1; i < recs.length; i++) {
+ expect(new Date(recs[i - 1].createdAt).getTime())
+ .toBeGreaterThanOrEqual(new Date(recs[i].createdAt).getTime());
+ }
});📝 Committable suggestion
🤖 Prompt for AI Agents
In packages/better-auth/tests/adapter.test.ts around lines 79 to 96, the "should
sort descending" test currently has its assertion commented out and uses limit:
1 which cannot verify ordering; update the test to use limit: 2 (or >1), remove
the console.log, assert the returned data length equals the new limit, and add
an explicit ordering assertion that the first item's createdAt is >= the second
item's createdAt (or compare the expected identifier order if deterministic), so
the test verifies descending sort behavior.
```ts
// delete all users - using buildQuery to construct the filter properly
const deleteAllResult = await fetch(`/user?$filter="id" ne '0'`, {
  method: "DELETE",
  query: {
    $filter: `"id" ne '0'`,
  },
});

if (deleteAllResult.error) {
  console.log("Error deleting all users:", deleteAllResult.error);
}
```
🛠️ Refactor suggestion
Fix the OData filter for DELETE; don’t quote the field name; use odata-query.
"$filter="id" ne '0'" treats "id" as a string literal. Build it safely with odata-query.
Apply:
```diff
-  // delete all users - using buildQuery to construct the filter properly
-  const deleteAllResult = await fetch(`/user?$filter="id" ne '0'`, {
+  // delete all users - use buildQuery to construct the filter properly
+  const deleteAllResult = await fetch(
+    `/user${buildQuery({ filter: { id: { ne: "0" } } })}`,
+    {
       method: "DELETE",
-  });
+    },
+  );
```
🤖 Prompt for AI Agents
In packages/better-auth/tests/adapter.test.ts around lines 100 to 107, the
DELETE request uses the filter string "$filter=\"id\" ne '0'" which treats "id"
as a string literal; replace this with a properly constructed OData query using
the odata-query builder (import/buildQuery from odata-query), construct a filter
object like { id: { ne: '0' } }, generate the query string and append it to the
fetch URL so the field name is not quoted and the filter is safely encoded for
the DELETE call.
```ts
const result = await fetch(`/user?$filter=createdAt ge 2025-01-05`, {
  method: "GET",
  query: {
    $filter: `createdAt ge 2025-01-05`,
  },
  output: z.object({ value: z.array(z.any()) }),
});
```
🛠️ Refactor suggestion
Date filter should use an ISO timestamp for DateTimeOffset; prefer odata-query to serialize.
If createdAt is a DateTimeOffset, plain “2025-01-05” may not filter correctly. Let odata-query serialize a Date.
Apply:
```diff
-  const result = await fetch(`/user?$filter=createdAt ge 2025-01-05`, {
+  const result = await fetch(
+    `/user${buildQuery({
+      filter: { createdAt: { ge: new Date("2025-01-05T00:00:00.000Z") } },
+    })}`,
+    {
       method: "GET",
       output: z.object({ value: z.array(z.any()) }),
-  });
+    },
+  );
```
🤖 Prompt for AI Agents
In packages/better-auth/tests/adapter.test.ts around lines 124 to 127, the test
builds the $filter using a plain date string ("2025-01-05") which is not a
proper DateTimeOffset; replace the hard-coded string with a proper ISO timestamp
generated/serialized via odata-query (or Date.toISOString()) so the request uses
a full DateTimeOffset. Update the test to construct the query using
odata-query's filter builder for the createdAt ge comparison (or call new
Date("2025-01-05").toISOString()) and pass that serialized value into the fetch
URL so the OData service receives an ISO 8601 DateTimeOffset.
add order by test
Update pnpm-lock.yaml and package.json for better-auth package; integrate odata-query and @vitest/ui dependencies, refactor fetch logic in adapter and migrate files for improved error handling and query construction.
Summary by CodeRabbit
New Features
Refactor
Chores
Tests