fix(react): resolve infinite Suspense retry loop on promise rejection#2323

Open
schettn wants to merge 1 commit into gqty-dev:main from schettn:main

Conversation

@schettn schettn commented Apr 24, 2026

Introduce a stable resource cache for Suspense to ensure consistent
thrown promises across renders. This specifically addresses issues
when using `suspense: true` with the `prepare` helper.

The stable cache ensures the fix works in both CSR and SSR environments.
While a simple `setState({ error })` might suffice for CSR, it fails on
the server because state is lost during the retry phase. By caching the
resource, we guarantee that rejected promises are correctly thrown as
errors, allowing them to be caught by an ErrorBoundary instead of
triggering an infinite retry loop.

- Add `suspense-resource.ts` for managing stable Suspense resources.
- Update `useQuery` to utilize `suspenseResourceCache` for `prepare` calls.
- Add detailed comments on hashing, SSR state persistence, and error handling.
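The caching behavior described above can be sketched in plain TypeScript. This is a minimal illustration, not the PR's actual code; the names `Resource`, `resourceCache`, and `readResource` are assumptions for the sketch. The key idea is that a render reads a cached resource and re-throws the *same* pending promise or rejected error each time, so a rejection surfaces to an ErrorBoundary instead of restarting the fetch on every retry.

```ts
// Hypothetical sketch of a stable Suspense resource cache.
// A resource records the settled state of one fetch, keyed by query hash.
type Resource<T> =
  | { status: "pending"; promise: Promise<T> }
  | { status: "resolved"; value: T }
  | { status: "rejected"; error: unknown };

const resourceCache = new Map<string, Resource<unknown>>();

function readResource<T>(key: string, fetch: () => Promise<T>): T {
  let resource = resourceCache.get(key) as Resource<T> | undefined;
  if (!resource) {
    // First render for this key: start the fetch and record its outcome.
    const promise = fetch().then(
      (value) => {
        resourceCache.set(key, { status: "resolved", value });
        return value;
      },
      (error) => {
        resourceCache.set(key, { status: "rejected", error });
        throw error;
      }
    );
    resource = { status: "pending", promise };
    resourceCache.set(key, resource);
  }
  switch (resource.status) {
    case "pending":
      // Suspense: throw the same promise on every render until it settles.
      throw resource.promise;
    case "rejected":
      // Throw the cached error so an ErrorBoundary catches it, instead of
      // suspending again and looping forever.
      throw resource.error;
    case "resolved":
      return resource.value;
  }
}
```

Because the cache lives outside component state, the rejected status survives the server-side retry phase where `setState` would be lost.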
In a review thread on `suspense-resource.ts`, the hashing code under discussion reads (truncated in the diff context):

```ts
// We create a unique hash for the current query based on the selections
// and operation parameters. This ensures that we can identify if we're
// fetching the same data across renders.
const queryHash = Array.from(selections)
```
schettn (Author) commented:

@vicary maybe there is a better way to build the cache key
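For illustration, one common way to build such a key is to serialize the sorted selections together with the operation variables, so equivalent queries produce identical keys regardless of selection order. This is a hedged sketch under that assumption, not the PR's implementation; `buildCacheKey` is a hypothetical name.

```ts
// Hypothetical cache-key builder: sort the serialized selections so the
// key is order-independent, and include variables so different arguments
// never collide under the same key.
function buildCacheKey(
  selections: Iterable<string>,
  variables?: Record<string, unknown>
): string {
  const parts = Array.from(selections).sort();
  return JSON.stringify({ parts, variables: variables ?? null });
}
```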


vicary commented Apr 25, 2026

@schettn Thanks for the PR.

There has been a boundary tension between the imperative and reactive styles (i.e. with and without `prepare`), and some internals require rethinking. I need some time to think it through, and to write lots of tests around that.

Please do let me know if your fork is working for your use case, that information helps.


schettn commented Apr 27, 2026

@vicary would you be open to publishing a canary build? It would save me from having to maintain a separate package in the meantime.

Decided to go with a packed build instead. Thanks anyway!
