SUP-140: fix MAXVALUE on snowflake-converted sequences. #38

Open
ibrarahmad wants to merge 1 commit into main from SUP-140

Conversation

@ibrarahmad

convert_sequence_to_snowflake() previously set MAXVALUE to (old last_value + 1). snowflake.nextval() bypasses MAXVALUE, so the sequence stored huge snowflake IDs against a tiny ceiling; pg_dump then captured both, and the restore failed with
"setval: value <snowflake_id> is out of bounds for sequence ...".

Fix in every embedded copy: ALTER SEQUENCE ... AS bigint NO CYCLE MAXVALUE 9223372036854775807. AS bigint keeps the fix safe for int4-backed serial sequences.

Adds snowflake--2.4--2.5.sql which re-installs the corrected function and repairs already-converted sequences in existing databases. Bumps default_version to 2.5. Regression test sql/maxvalue.sql exercises the pg_dump setval path.
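In sketch form, the corrected adjustment inside convert_sequence_to_snowflake() amounts to something like the following (illustrative only; the exact function body lives in the migration scripts):

```sql
-- Illustrative sketch, not the verbatim function body.
-- Previously: MAXVALUE was computed as last_value + 1, far below real
-- snowflake IDs. Now: pin the ceiling at the bigint maximum, and force
-- the sequence to bigint so int4-backed serial sequences can hold it too.
EXECUTE format(
    'ALTER SEQUENCE %s AS bigint NO CYCLE MAXVALUE 9223372036854775807',
    p_seqid::regclass);
```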

@ibrarahmad ibrarahmad requested a review from mason-sharp May 16, 2026 01:54
@coderabbitai

coderabbitai Bot commented May 16, 2026


ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: a6a50e1b-b3bd-4cd0-84b9-a98f52adf7d2

📥 Commits

Reviewing files that changed from the base of the PR and between 565ce74 and 11d109d.

⛔ Files ignored due to path filters (2)
  • expected/conversion.out is excluded by !**/*.out
  • expected/maxvalue.out is excluded by !**/*.out
📒 Files selected for processing (11)
  • Makefile
  • snowflake--1.2--2.0.sql
  • snowflake--2.0--2.2.sql
  • snowflake--2.0.sql
  • snowflake--2.2--2.3.sql
  • snowflake--2.2.sql
  • snowflake--2.3--2.4.sql
  • snowflake--2.3.sql
  • snowflake--2.4--2.5.sql
  • snowflake.control
  • sql/maxvalue.sql
📝 Walkthrough

This PR repairs sequence MAXVALUE handling across the snowflake PostgreSQL extension. The change updates existing migration scripts to use a fixed 64-bit bigint maximum value (9223372036854775807) instead of computing MAXVALUE dynamically based on current last_value, and introduces a new migration with repair logic for sequences already converted under the old approach.

Changes

Sequence MAXVALUE Repair across Extension Versions

  • Prior migration files updated with MAXVALUE fix
    Files: snowflake--1.2--2.0.sql, snowflake--2.0--2.2.sql, snowflake--2.0.sql, snowflake--2.2--2.3.sql, snowflake--2.2.sql, snowflake--2.3--2.4.sql, snowflake--2.3.sql
    All of these migration files receive the same fix pattern: the v_last_value variable is removed from the convert_sequence_to_snowflake() declarations, and the sequence-adjustment logic is replaced with a direct ALTER SEQUENCE that sets AS bigint NO CYCLE and a constant MAXVALUE 9223372036854775807 instead of computing MAXVALUE from the sequence's current last_value + 1.
  • New 2.4→2.5 migration with function update and repair block
    Files: snowflake--2.4--2.5.sql
    The new migration defines the updated convert_sequence_to_snowflake(p_seqid regclass) function with the fixed MAXVALUE approach. It also runs a one-shot DO $repair$ block that scans for sequences already converted to snowflake (identified by column defaults referencing snowflake.nextval(...)) that still have max_value below the 64-bit maximum, and issues ALTER SEQUENCE statements to raise their MAXVALUE to 9223372036854775807.
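A repair block of the kind described above might be sketched like this (hypothetical: the catalog joins and query shape are assumptions, not the shipped migration):

```sql
-- Hypothetical sketch of the one-shot repair scan in snowflake--2.4--2.5.sql.
-- Finds sequences whose owning column's default calls snowflake.nextval()
-- but whose max_value is still below the 64-bit ceiling, then raises it.
DO $repair$
DECLARE
    v_seq regclass;
BEGIN
    FOR v_seq IN
        SELECT s.seqrelid::regclass
          FROM pg_sequence s
          JOIN pg_depend d  ON d.classid = 'pg_class'::regclass
                           AND d.objid = s.seqrelid
                           AND d.refclassid = 'pg_class'::regclass
                           AND d.deptype IN ('a', 'i')
          JOIN pg_attrdef ad ON ad.adrelid = d.refobjid
                            AND ad.adnum = d.refobjsubid
         WHERE pg_get_expr(ad.adbin, ad.adrelid) LIKE '%snowflake.nextval%'
           AND s.seqmax < 9223372036854775807
    LOOP
        EXECUTE format(
            'ALTER SEQUENCE %s AS bigint NO CYCLE MAXVALUE 9223372036854775807',
            v_seq);
    END LOOP;
END
$repair$;
```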
  • Regression test and build configuration
    Files: Makefile, snowflake.control, sql/maxvalue.sql
    The Makefile adds snowflake--2.4--2.5.sql to the DATA list and maxvalue to the REGRESS test targets. Extension metadata in snowflake.control bumps default_version to 2.5. The new regression test creates test sequences, verifies that conversion sets max_value to the correct 64-bit ceiling, and confirms that pg_catalog.setval() accepts snowflake-sized values without raising bounds errors.
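The shape of such a regression test could look roughly like this (statements and identifiers are assumed for illustration, including the schema qualification of the conversion function; they are not copied from the PR):

```sql
-- Hypothetical outline of a test along the lines of sql/maxvalue.sql.
CREATE SEQUENCE maxvalue_test_seq;
SELECT snowflake.convert_sequence_to_snowflake('maxvalue_test_seq');

-- The conversion should have raised the ceiling to the bigint maximum.
SELECT seqmax FROM pg_sequence
 WHERE seqrelid = 'maxvalue_test_seq'::regclass;

-- pg_dump restores sequence state via setval(); a snowflake-sized value
-- must no longer be "out of bounds".
SELECT pg_catalog.setval('maxvalue_test_seq', 1234567890123456789);
```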

Poem

A sequence once bounded so tight,
Now stretches to bigint's full height,
No more does it stumble,
On values that crumble—
🐰 Fixed maxvalue shines ever so bright!

🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)
  • Title check: Passed. The title clearly and specifically identifies the main change, fixing MAXVALUE handling in snowflake-converted sequences, and references the relevant issue (SUP-140).
  • Description check: Passed. The description is directly related to the changeset, explaining the problem, the solution, and all file changes included in the PR.
  • Docstring Coverage: Passed. No functions found in the changed files to evaluate; docstring coverage check skipped.
  • Linked Issues check: Passed. Check skipped because no linked issues were found for this pull request.
  • Out of Scope Changes check: Passed. Check skipped because no linked issues were found for this pull request.




@codacy-production

codacy-production Bot commented May 16, 2026

Up to standards ✅

🟢 Issues: 0 new issues

View in Codacy



@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

Inline comments:
In `@snowflake--2.4--2.5.sql`:
- Around lines 269-274: the repair query only follows pg_depend rows with deptype IN ('a','i'), but convert_sequence_to_snowflake() also handles non-owned sequences (deptype = 'n'). Update the pg_depend filter (the clause using d.deptype IN ('a','i')) to cover the same sequence classes the conversion supports, e.g. by adding 'n', so the repair scan also reaches non-owned/explicit nextval() sequences instead of leaving their stale max_value in place.
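As a sketch, the suggested widening of the filter could look like this (the surrounding query shape is an assumption for illustration, not the shipped SQL):

```sql
-- Sketch: widen the repair scan so it matches the sequence classes the
-- conversion itself handles.
SELECT s.seqrelid::regclass
  FROM pg_sequence s
  JOIN pg_depend d ON d.classid = 'pg_class'::regclass
                  AND d.objid = s.seqrelid
                  AND d.refclassid = 'pg_class'::regclass
                  AND d.deptype IN ('a', 'i', 'n')  -- 'n' added per review
 WHERE s.seqmax < 9223372036854775807;
```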

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: d2bbe946-d103-4d69-9523-8f4a461262f5

📥 Commits

Reviewing files that changed from the base of the PR and between 9fbb938 and 565ce74.


