Adds overwrite script and timestamps migration #171
Conversation
- Get basic working
- adds rss feeds
- added stablewatch founder and stablecoin intern from messaria as approvers on stablecoins feed
- fixes to config for grants, sui, and telegram channels (#102)
- add query param for selective processing (#103)
- adds query param to process
- add query param for processing
- simplify
- add tags
- Feat: implement frontend leaderboard (#93)
- feat: implement frontend leaderboard
- feat: implement a leaderboard in frontend
- feat: implemented leaderboard
- fix: rebuild implement leaderboard
- fix: prettier
- fix: prettier
- fix: reimplement frontend leaderboard
- fix: implement frontend leaderboard
- approval rate
- sets approval rate and hides curator

  Co-authored-by: Elliot Braem <elliot@ejlbraem.com>

- remove tailwind-scrollbar
- added bob to desci feed
- Get basic working
- set .env.example
- nitpicks

Co-authored-by: Elliot Braem <elliot@ejlbraem.com>
Co-authored-by: codingshot <45281667+codingshot@users.noreply.github.com>
Co-authored-by: Elliot Braem <elliot@everything.dev>
Co-authored-by: Louis <112561517+louisdevzz@users.noreply.github.com>
- Explore Page - commit-1
- Explore Page - commit-2
- explore page - commit-3
- explore page - commit - 4
- explore page - commit - prettier
- explore page responsiveness + code Rabbit Comments
- code Rabbit Comments resolved
- css updates
- css changes 2
- header update + mobile responsive
- conflicts resolved
- Rebase and changes
- Fix fmt

- Explore Page - commit-1
- Explore Page - commit-2
- explore page - commit-3
- explore page - commit - 4
- explore page - commit - prettier
- explore page responsiveness + code Rabbit Comments
- code Rabbit Comments resolved
- css updates
- css changes 2
- header update + mobile responsive
- conflicts resolved
- Rebase and changes
- Fix fmt
- Header Updates + Web3Auth getUserInfo + Explore Page changes
- fmt
- coderabbit comments resolved

- Update the FE to have the profile page (header and tabs init)
- Move tabs to it's own component
- Add stats and top badges to overview
- Finish the overview tab
- Update overview page and init content page
- feat(profile-page): add Content and My Feed tags
- feat(profile-page): finish profile page static UI
- refactor: fmt

- Curate Engine Step 1
- content-progress-configuration step-1
- Curation Settings Part 2 and 3
- CodeRabbit Comments Resolved + Mobile Responsive
- Responsiveness: empty State, JSON check

Co-authored-by: Elliot Braem <elliot@everything.dev>

- Submissions Page + Feed Page + Mobile Responsiveness
- fixes
- craete-feed authenticated user condition
- fmt
- remove sqlite

Co-authored-by: Elliot Braem <elliot@ejlbraem.com>

- feat: Add connect button to feed page
- fix: recommit
- added back to stablecoins feed since stablewatch forked their own
- fix packages, update pg, and ignore cloudflare sockets
- fmt

Co-authored-by: ethnclark <ethanclark1310@gmail.com>
Co-authored-by: codingshot <45281667+codingshot@users.noreply.github.com>
Co-authored-by: Elliot Braem <elliot@ejlbraem.com>
This reverts commit cd12908.
- fix: Sort By Oldest
- fix: Sort By Oldest
- fix: Fix RecentSubmissions Sort Order Update
- fix: All feed should be hidden, remove double title

Co-authored-by: vohuunhan1310@gmail.com <ethanclark1310@gmail.com>

- UI fixes
- fmt

Co-authored-by: Elliot Braem <elliot@everything.dev>

- fix: Leaderboard improvements
- fix: fmt
- reorganize
- remove unused
- rename
- clean up
- fmt

Co-authored-by: Elliot Braem <elliot@ejlbraem.com>
Walkthrough

This update standardizes timestamp columns across the database to use timezone-aware types and removes the `moderationResponseTweetId` field from submission data and responses.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant API
    participant DB
    participant Validator
    API->>Validator: Validate submission insert (no id/createdAt/updatedAt)
    Validator-->>API: Validated data
    API->>DB: Insert submission (without explicit timestamps)
    DB-->>API: Record with default timestamps
    Note over API,DB: moderationResponseTweetId is not included in data or responses
```
Actionable comments posted: 7
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (9)
- apps/api/src/routes/api/submission.ts (0 hunks)
- apps/api/test/setup/seed-test.ts (1 hunks)
- packages/shared-db/migrations/0009_aromatic_stephen_strange.sql (1 hunks)
- packages/shared-db/migrations/meta/0009_snapshot.json (1 hunks)
- packages/shared-db/migrations/meta/_journal.json (1 hunks)
- packages/shared-db/package.json (2 hunks)
- packages/shared-db/scripts/overwrite-vars.ts (1 hunks)
- packages/shared-db/src/schema/common.ts (1 hunks)
- packages/types/src/domain/submission.ts (0 hunks)
💤 Files with no reviewable changes (2)
- packages/types/src/domain/submission.ts
- apps/api/src/routes/api/submission.ts
🔇 Additional comments (7)
apps/api/test/setup/seed-test.ts (1)
3-3: LGTM! Good modularization.

The import path change from relative to external package import follows good modularization practices and improves code organization.
packages/shared-db/package.json (2)
14-15: LGTM! Well-structured database tooling scripts.

The new overwrite-vars scripts with up/down operations follow good database tooling patterns and will support the new overwrite functionality.
33-33: LGTM! Appropriate dependency addition.

The `pg-connection-string` dependency is a standard and well-maintained library for parsing PostgreSQL connection strings, which aligns with the database tooling enhancements.

packages/shared-db/migrations/meta/_journal.json (1)
68-74: LGTM! Proper migration journal entry.

The new migration entry follows the established pattern with correct index sequencing and consistent formatting.
packages/shared-db/src/schema/common.ts (1)
15-21: LGTM! Excellent timezone-aware timestamp configuration.

The addition of timezone awareness to timestamp columns is a best practice that will prevent timezone-related issues in multi-timezone deployments. The configuration properly uses JavaScript Date objects with timezone support.
packages/shared-db/migrations/0009_aromatic_stephen_strange.sql (1)

1-56: ❓ Verification inconclusive

LGTM! Comprehensive timezone-aware timestamp migration.

This migration systematically updates all timestamp columns to be timezone-aware, which is a best practice for robust timestamp handling. The consistent pattern across all tables ensures data integrity.
Verify timezone handling in existing application code.
Ensure that existing application code properly handles the new timezone-aware timestamps, particularly in date comparisons and formatting operations.
🏁 Script executed:
```shell
#!/bin/bash
# Description: Check for potential timezone-related code that might need updates
# Expected: Find date/time operations that might need timezone consideration

echo "Searching for potential timezone-sensitive operations..."

# Look for date operations that might be affected
rg -A 3 -B 1 "new Date\(|\.toISOString\(\)|\.getTime\(\)|Date\.now\(\)" --type ts --type js

echo -e "\n=== Searching for timestamp comparisons ==="
rg -A 2 -B 1 "(createdAt|updatedAt).*[<>=]" --type ts --type js

echo -e "\n=== Searching for date formatting operations ==="
rg -A 2 "toLocaleDateString|toLocaleTimeString|Intl\.DateTimeFormat" --type ts --type js
```

Length of output: 39806
Approve migration – ensure application-wide timezone consistency
The SQL migration correctly converts all `created_at`/`updated_at` columns to `TIMESTAMP WITH TIME ZONE` with `now()` defaults. A code search surfaced extensive use of `new Date()`, `toISOString()`, `Date.now()`, and locale-based formatting across the codebase. Please review these areas to confirm they handle the new timezone-aware values as intended:

• Database repositories (packages/shared-db/src/repositories/**/*.ts)
– e.g. plugin.repository.ts, twitter.repository.ts, activity.repository.ts, submission.repository.ts
• API services and routes (apps/api/src/services/**/*.ts, apps/api/src/routes/**/*.ts)
– instances of `createdAt: new Date()`, `.toISOString()`
• Front-end utilities and components (apps/app/src/utils/datetime.ts, apps/app/src/components/**/*.tsx)
– `toLocaleDateString()`, `toLocaleTimeString()`, manual `Date.UTC` usage
• Test seed and helper scripts (apps/api/test/)
– default `new Date()`/`.toISOString()` in test data

packages/shared-db/migrations/meta/0009_snapshot.json (1)
1-1425: Well-structured database schema with good practices.

The migration snapshot demonstrates several good practices:
- Comprehensive foreign key relationships with appropriate cascade rules
- Strategic indexing for performance optimization
- JSONB columns for flexible data storage
- Proper unique constraints and composite primary keys
- Consistent use of `timestamp with time zone` for audit fields

The schema appears well-designed for the application's requirements.
```ts
dotenv.config({ path: path.resolve(__dirname, "../.env") });

// Define your list of transformations
const replacements = [{ oldValue: "OLD", newValue: "{{NEW}}" }];
```
Replace placeholder values with actual transformation data.
The replacements array contains placeholder values that appear to be for demonstration purposes. This should not be deployed to production without actual transformation mappings.
```diff
-const replacements = [{ oldValue: "OLD", newValue: "{{NEW}}" }];
+const replacements = [
+  // Add your actual string replacements here
+  // { oldValue: "actual_old_value", newValue: "actual_new_value" }
+];
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Original line:

```ts
const replacements = [{ oldValue: "OLD", newValue: "{{NEW}}" }];
```

Suggested replacement:

```ts
// packages/shared-db/scripts/overwrite-vars.ts
// … other imports and helper functions …
async function up(knex: Knex): Promise<void> {
  // Define your actual replacements here
  const replacements = [
    // Add your actual string replacements here
    // { oldValue: "actual_old_value", newValue: "actual_new_value" }
  ];
  for (const { oldValue, newValue } of replacements) {
    await knex.raw(
      `
      UPDATE your_table
      SET jsonb_column = replace(jsonb_column::text, ?, ?)::jsonb
      WHERE jsonb_column::text LIKE ?
      `,
      [oldValue, newValue, `%${oldValue}%`]
    );
  }
}
// … down migration and exports …
```
🤖 Prompt for AI Agents
In packages/shared-db/scripts/overwrite-vars.ts at line 11, the replacements
array currently uses placeholder values "OLD" and "{{NEW}}". Replace these
placeholders with the actual old and new string values that reflect the real
transformation mappings needed for your use case before deploying to production.
```ts
try {
  console.log(`Connecting to remote database specified by DATABASE_URL...`);
  await dbClient.connect();
  const dbInstance = drizzle(dbClient);

  if (operation === "up") {
    console.log("Running UP operation...");
    await performUpOperation(dbInstance);
  } else {
    // operation === 'down'
    console.log("Running DOWN operation...");
    await performDownOperation(dbInstance);
  }
  console.log(`Operation '${operation}' completed successfully.`);
} catch (error) {
  console.error(`Error during '${operation}' operation:`, error);
  process.exit(1);
} finally {
  if (dbClient) {
    console.log("Closing database connection.");
    await dbClient.end();
  }
}
```
🛠️ Refactor suggestion
Add dry-run capability and affected rows logging.
Consider adding a dry-run option to preview changes and log the number of affected rows for better visibility.
```diff
   try {
     console.log(`Connecting to remote database specified by DATABASE_URL...`);
     await dbClient.connect();
     const dbInstance = drizzle(dbClient);
+    // Add dry-run capability
+    const isDryRun = process.argv.includes('--dry-run');
+    if (isDryRun) {
+      console.log("DRY RUN MODE - No changes will be made");
+    }
+
     if (operation === "up") {
       console.log("Running UP operation...");
-      await performUpOperation(dbInstance);
+      if (isDryRun) {
+        await previewUpOperation(dbInstance);
+      } else {
+        await performUpOperation(dbInstance);
+      }
     } else {
       // operation === 'down'
       console.log("Running DOWN operation...");
-      await performDownOperation(dbInstance);
+      if (isDryRun) {
+        await previewDownOperation(dbInstance);
+      } else {
+        await performDownOperation(dbInstance);
+      }
     }
     console.log(`Operation '${operation}' completed successfully.`);
```

Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In packages/shared-db/scripts/overwrite-vars.ts around lines 62 to 84, add a
dry-run option to allow previewing changes without applying them. Modify the
performUpOperation and performDownOperation functions to return the number of
affected rows, then log this count after the operation completes. Ensure that
when dry-run is enabled, the operations do not modify the database but still
report what would have been changed.
```ts
async function performDownOperation(db: NodePgDatabase) {
  for (let i = replacements.length - 1; i >= 0; i--) {
    const { oldValue, newValue } = replacements[i];
    console.log(`Reverting replacement: "${newValue}" -> "${oldValue}"`);
    await db.execute(sql`
      UPDATE ${feeds}
      SET
        config = REPLACE(config::text, ${newValue}, ${oldValue})::jsonb
      WHERE
        config::text LIKE ${`%${newValue}%`};
    `);
  }
  console.log("All JSONB string replacements reverted.");
}
```
Apply same transaction safety to down operation.
The down operation has the same issues as the up operation and needs identical transaction safety and validation.
```diff
 async function performDownOperation(db: NodePgDatabase) {
+  return await db.transaction(async (tx) => {
   for (let i = replacements.length - 1; i >= 0; i--) {
     const { oldValue, newValue } = replacements[i];
     console.log(`Reverting replacement: "${newValue}" -> "${oldValue}"`);
-    await db.execute(sql`
+
+    // Validate JSON before replacement
+    const testResult = await tx.execute(sql`
+      SELECT REPLACE(config::text, ${newValue}, ${oldValue}) as new_config
+      FROM ${feeds}
+      WHERE config::text LIKE ${`%${newValue}%`}
+      LIMIT 1;
+    `);
+
+    if (testResult.rows.length > 0) {
+      try {
+        JSON.parse(testResult.rows[0].new_config as string);
+      } catch (e) {
+        throw new Error(`Revert would create invalid JSON: ${e.message}`);
+      }
+    }
+
+    await tx.execute(sql`
       UPDATE ${feeds}
       SET
-        config = REPLACE(config::text, ${newValue}, ${oldValue})::jsonb
+        config = REPLACE(config::text, ${newValue}, ${oldValue})::jsonb
       WHERE
-        config::text LIKE ${`%${newValue}%`};
+        config::text LIKE ${`%${newValue}%`};
     `);
   }
   console.log("All JSONB string replacements reverted.");
+  });
 }
```

📝 Committable suggestion
Original:

```ts
async function performDownOperation(db: NodePgDatabase) {
  for (let i = replacements.length - 1; i >= 0; i--) {
    const { oldValue, newValue } = replacements[i];
    console.log(`Reverting replacement: "${newValue}" -> "${oldValue}"`);
    await db.execute(sql`
      UPDATE ${feeds}
      SET
        config = REPLACE(config::text, ${newValue}, ${oldValue})::jsonb
      WHERE
        config::text LIKE ${`%${newValue}%`};
    `);
  }
  console.log("All JSONB string replacements reverted.");
}
```

Suggested replacement:

```ts
async function performDownOperation(db: NodePgDatabase) {
  return await db.transaction(async (tx) => {
    for (let i = replacements.length - 1; i >= 0; i--) {
      const { oldValue, newValue } = replacements[i];
      console.log(`Reverting replacement: "${newValue}" -> "${oldValue}"`);
      // Validate JSON before replacement
      const testResult = await tx.execute(sql`
        SELECT REPLACE(config::text, ${newValue}, ${oldValue}) AS new_config
        FROM ${feeds}
        WHERE config::text LIKE ${`%${newValue}%`}
        LIMIT 1;
      `);
      if (testResult.rows.length > 0) {
        try {
          JSON.parse(testResult.rows[0].new_config as string);
        } catch (e) {
          throw new Error(`Revert would create invalid JSON: ${e.message}`);
        }
      }
      await tx.execute(sql`
        UPDATE ${feeds}
        SET
          config = REPLACE(config::text, ${newValue}, ${oldValue})::jsonb
        WHERE
          config::text LIKE ${`%${newValue}%`};
      `);
    }
    console.log("All JSONB string replacements reverted.");
  });
}
```
🤖 Prompt for AI Agents
In packages/shared-db/scripts/overwrite-vars.ts between lines 27 and 40, the
performDownOperation function lacks transaction safety and validation, similar
to the up operation. Wrap the entire for-loop inside a database transaction to
ensure atomicity, and add validation checks before executing the updates. This
will prevent partial updates and maintain data integrity during the down
operation.
```ts
async function performUpOperation(db: NodePgDatabase) {
  for (const { oldValue, newValue } of replacements) {
    console.log(`Applying replacement: "${oldValue}" -> "${newValue}"`);
    await db.execute(sql`
      UPDATE ${feeds}
      SET
        config = REPLACE(config::text, ${oldValue}, ${newValue})::jsonb
      WHERE
        config::text LIKE ${`%${oldValue}%`};
    `);
  }
  console.log("All JSONB string replacements applied.");
}
```
Add transaction safety and JSON validation.
The current implementation has several critical issues:
- No transaction boundaries - partial failures could leave data in inconsistent state
- No validation that string replacement results in valid JSON
- REPLACE operation on text-cast JSONB could corrupt data structure
```diff
 async function performUpOperation(db: NodePgDatabase) {
+  return await db.transaction(async (tx) => {
   for (const { oldValue, newValue } of replacements) {
     console.log(`Applying replacement: "${oldValue}" -> "${newValue}"`);
-    await db.execute(sql`
+
+    // First, check if replacement would create valid JSON
+    const testResult = await tx.execute(sql`
+      SELECT REPLACE(config::text, ${oldValue}, ${newValue}) as new_config
+      FROM ${feeds}
+      WHERE config::text LIKE ${`%${oldValue}%`}
+      LIMIT 1;
+    `);
+
+    if (testResult.rows.length > 0) {
+      try {
+        JSON.parse(testResult.rows[0].new_config as string);
+      } catch (e) {
+        throw new Error(`Replacement would create invalid JSON: ${e.message}`);
+      }
+    }
+
+    await tx.execute(sql`
       UPDATE ${feeds}
       SET
-        config = REPLACE(config::text, ${oldValue}, ${newValue})::jsonb
+        config = REPLACE(config::text, ${oldValue}, ${newValue})::jsonb
       WHERE
-        config::text LIKE ${`%${oldValue}%`};
+        config::text LIKE ${`%${oldValue}%`};
     `);
   }
   console.log("All JSONB string replacements applied.");
+  });
 }
```

📝 Committable suggestion
Original:

```ts
async function performUpOperation(db: NodePgDatabase) {
  for (const { oldValue, newValue } of replacements) {
    console.log(`Applying replacement: "${oldValue}" -> "${newValue}"`);
    await db.execute(sql`
      UPDATE ${feeds}
      SET
        config = REPLACE(config::text, ${oldValue}, ${newValue})::jsonb
      WHERE
        config::text LIKE ${`%${oldValue}%`};
    `);
  }
  console.log("All JSONB string replacements applied.");
}
```

Suggested replacement:

```ts
async function performUpOperation(db: NodePgDatabase) {
  return await db.transaction(async (tx) => {
    for (const { oldValue, newValue } of replacements) {
      console.log(`Applying replacement: "${oldValue}" -> "${newValue}"`);
      // First, check if replacement would create valid JSON
      const testResult = await tx.execute(sql`
        SELECT REPLACE(config::text, ${oldValue}, ${newValue}) AS new_config
        FROM ${feeds}
        WHERE config::text LIKE ${`%${oldValue}%`}
        LIMIT 1;
      `);
      if (testResult.rows.length > 0) {
        try {
          JSON.parse(testResult.rows[0].new_config as string);
        } catch (e) {
          throw new Error(`Replacement would create invalid JSON: ${e.message}`);
        }
      }
      await tx.execute(sql`
        UPDATE ${feeds}
        SET
          config = REPLACE(config::text, ${oldValue}, ${newValue})::jsonb
        WHERE
          config::text LIKE ${`%${oldValue}%`};
      `);
    }
    console.log("All JSONB string replacements applied.");
  });
}
```
🤖 Prompt for AI Agents
In packages/shared-db/scripts/overwrite-vars.ts around lines 13 to 25, the
function performUpOperation lacks transaction handling and JSON validation,
risking partial updates and data corruption. Wrap the entire replacement loop in
a database transaction to ensure atomicity, and after each REPLACE operation,
validate that the resulting config is valid JSON before committing. If
validation fails, rollback the transaction and handle the error appropriately to
maintain data integrity.