
Commit 620fab7

feat(migrate): AllowInvalidHash (#114)
1 parent: a83b0e6

File tree: 8 files changed, +290 / -15 lines

FORMATS.md

Lines changed: 11 additions & 0 deletions
@@ -39,6 +39,17 @@ Comments elsewhere in the file are ignored - we do not implement an SQL parser
 so we do not know if the comment is within a SQL string or similar. It's easiest
 just not parse that far.
 
+## `--! AllowInvalidHash`
+
+Should you need to go back and edit a _committed_ migration you can opt out of
+Graphile Migrate's consistency checks by adding this comment to the very top of
+your committed migration. Please note that editing the migration **WILL NOT**
+cause the migration to run again. This is primarily useful where there was a
+mistake in your migration that prevents it running on production but you don't
+want to reset your staging database, or where an update to PostgreSQL has made
+the syntax or commands in an older migration invalid and thus you must edit them
+to make the migration run against a clean database again.
+
 ## `--! no-transaction`
 
 This is treated as a body comment for backwards compatibility reasons. This
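For illustration, here is a sketch of what the top of a committed migration that opts out of the hash check might look like. The `Previous`, `Hash`, and `Message` header names are the ones handled elsewhere in this commit; the hash values and the table itself are made-up placeholders, not taken from a real migration:

```sql
--! AllowInvalidHash
--! Previous: sha1:1111111111111111111111111111111111111111
--! Hash: sha1:2222222222222222222222222222222222222222
--! Message: create the foo table

-- The body may now differ from the recorded hash, but remember:
-- editing it will NOT cause the migration to run again anywhere.
create table foo (
  id serial primary key
);
```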

README.md

Lines changed: 22 additions & 4 deletions
@@ -705,6 +705,28 @@ so `--! no-transaction` migrations must contain exactly one statement. You might
 be able to work around this with a `DO $$` block? (If this works, please send a
 PR to this paragraph.)
 
+## Editing a committed migration
+
+Graphile Migrate deliberately performs cryptographic hashing to avoid/detect
+accidental editing of committed migrations and to ensure there is a strict
+linear progression in migrations. By default, Graphile Migrate will refuse to
+run a migration if its hash does not match what it declares; this is generally
+desired (and you shouldn't have to worry about it).
+
+Should you need to go back and edit a _committed_ migration you can opt out of
+Graphile Migrate's consistency checks by adding the comment
+`--! AllowInvalidHash` to the very top of the committed migration. Please note
+that editing the migration **WILL NOT** cause the migration to run again on
+yours or any other system.
+
+The need to edit a previous migration generally arises if there was a mistake in
+your migration that prevents it running on production but you don't want to
+reset your staging database, or where an update to PostgreSQL has made the
+syntax or commands in an older migration invalid and thus you must edit them to
+make the migration run against a clean database again. Most users should never
+need this functionality. If you find yourself using it more than once or twice,
+please get in touch and we can discuss how the tool can better serve your needs.
+
 ## Terminology
 
 ### The current migration
@@ -751,10 +773,6 @@ supported interface ─ we'd love to know how you're using it!
 The project as a whole is stable, but the approach is still "experimental", in
 particular:
 
-- because committed migrations are hashed you cannot edit old migrations; this
-  may cause you issues should you upgrade PostgreSQL and it drops support for a
-  syntax or feature you were previously using. We plan to fix this issue _if and
-  when_ it occurs, so if this affects you please open a detailed issue.
 - the approach of up-only and re-runnable migrations is not for the faint of
   heart ─ it requires solid SQL knowledge and if insufficient attention is paid
   it could result in your migrations and your local database state drifting
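The "strict linear progression" described above is recorded in the `graphile_migrate.migrations` table, which the new tests below query directly (`filename`, `hash`, `previous_hash`). As a rough way to eyeball the chain on a database, something along these lines should work:

```sql
-- Each row's previous_hash should equal the hash of the row before it;
-- hashes are stored in the "sha1:<hex>" form visible in the test snapshots.
select filename, hash, previous_hash
from graphile_migrate.migrations
order by filename;
```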

__tests__/helpers.ts

Lines changed: 18 additions & 6 deletions
@@ -115,13 +115,25 @@ export async function resetDb() {
   const parsedSettings = await parsedSettingsPromise;
   await withClient(TEST_DATABASE_URL, parsedSettings, async client => {
     await client.query("drop schema if exists graphile_migrate cascade;");
-    const { rows } = await client.query(
-      `select relname from pg_class where relkind = 'r' and relnamespace = (select oid from pg_namespace where nspname = 'public')`,
-    );
-    for (const row of rows) {
-      await client.query(
-        `drop table if exists ${escapeIdentifier(row.relname)} cascade;`,
+    {
+      const { rows } = await client.query(
+        `select relname from pg_class where relkind = 'r' and relnamespace = 'public'::regnamespace`,
+      );
+      for (const row of rows) {
+        await client.query(
+          `drop table if exists ${escapeIdentifier(row.relname)} cascade;`,
+        );
+      }
+    }
+    {
+      const { rows } = await client.query(
+        `select typname from pg_type where typtype = 'e' and typnamespace = 'public'::regnamespace`,
       );
+      for (const row of rows) {
+        await client.query(
+          `drop type if exists ${escapeIdentifier(row.typname)} cascade;`,
+        );
+      }
     }
   });
 }

__tests__/migrate.test.ts

Lines changed: 216 additions & 0 deletions
@@ -0,0 +1,216 @@
import "./helpers"; // Has side-effects; must come first

import * as mockFs from "mock-fs";

import { migrate } from "../src";
import { withClient } from "../src/pg";
import { ParsedSettings, parseSettings } from "../src/settings";
import { makeMigrations, resetDb, settings } from "./helpers";

beforeEach(resetDb);
beforeEach(async () => {
  mockFs({ migrations: mockFs.directory() });
});
afterEach(() => {
  mockFs.restore();
});
const {
  MIGRATION_1_TEXT,
  MIGRATION_1_COMMITTED,
  MIGRATION_ENUM_COMMITTED,
  MIGRATION_NOTRX_TEXT,
  MIGRATION_NOTRX_COMMITTED,
} = makeMigrations();

function getStuff(parsedSettings: ParsedSettings) {
  return withClient(
    parsedSettings.connectionString,
    parsedSettings,
    async (pgClient, _context) => {
      const { rows: migrations } = await pgClient.query(
        "select * from graphile_migrate.migrations",
      );
      const { rows: tables } = await pgClient.query(
        "select * from pg_class where relnamespace = 'public'::regnamespace and relkind = 'r'",
      );
      const { rows: enums } = await pgClient.query(
        "select typname, (select count(*) from pg_enum where enumtypid = pg_type.oid) as value_count from pg_type where typnamespace = 'public'::regnamespace and typtype = 'e'",
      );
      return { migrations, tables, enums };
    },
  );
}

it("runs migrations", async () => {
  mockFs({
    "migrations/current.sql": MIGRATION_1_TEXT,
  });

  await migrate(settings);

  const parsedSettings = await parseSettings(settings);

  {
    const { migrations, tables, enums } = await getStuff(parsedSettings);
    expect(migrations).toHaveLength(0);
    expect(tables).toHaveLength(0);
    expect(enums).toHaveLength(0);
  }

  mockFs({
    [`migrations/committed/000001.sql`]: MIGRATION_1_COMMITTED,
    [`migrations/committed/000002.sql`]: MIGRATION_ENUM_COMMITTED,
    "migrations/current.sql": MIGRATION_NOTRX_TEXT,
  });

  await migrate(settings);

  {
    const { migrations, tables, enums } = await getStuff(parsedSettings);

    expect(migrations).toHaveLength(2);
    expect(migrations.map(({ date, ...rest }) => rest)).toMatchInlineSnapshot(`
      Array [
        Object {
          "filename": "000001.sql",
          "hash": "sha1:e00ec93314a423ee5cc68d1182ad52f16442d7df",
          "previous_hash": null,
        },
        Object {
          "filename": "000002.sql",
          "hash": "sha1:bddc1ead3310dc1c42cdc7f63537ebdff2e9fd7b",
          "previous_hash": "sha1:e00ec93314a423ee5cc68d1182ad52f16442d7df",
        },
      ]
    `);
    expect(tables).toHaveLength(1);
    expect(tables.map(t => t.relname)).toMatchInlineSnapshot(`
      Array [
        "foo",
      ]
    `);
    expect(enums).toHaveLength(1);
    expect(enums).toMatchInlineSnapshot(`
      Array [
        Object {
          "typname": "user_role",
          "value_count": "1",
        },
      ]
    `);
  }

  mockFs({
    [`migrations/committed/000001.sql`]: MIGRATION_1_COMMITTED,
    [`migrations/committed/000002.sql`]: MIGRATION_ENUM_COMMITTED,
    [`migrations/committed/000003.sql`]: MIGRATION_NOTRX_COMMITTED,
    "migrations/current.sql": "",
  });

  await migrate(settings);

  {
    const { migrations, tables, enums } = await getStuff(parsedSettings);

    expect(migrations).toHaveLength(3);
    expect(migrations.map(({ date, ...rest }) => rest)).toMatchInlineSnapshot(`
      Array [
        Object {
          "filename": "000001.sql",
          "hash": "sha1:e00ec93314a423ee5cc68d1182ad52f16442d7df",
          "previous_hash": null,
        },
        Object {
          "filename": "000002.sql",
          "hash": "sha1:bddc1ead3310dc1c42cdc7f63537ebdff2e9fd7b",
          "previous_hash": "sha1:e00ec93314a423ee5cc68d1182ad52f16442d7df",
        },
        Object {
          "filename": "000003.sql",
          "hash": "sha1:2d248344ac299ebbad2aeba5bfec2ae3c3cb0a4f",
          "previous_hash": "sha1:bddc1ead3310dc1c42cdc7f63537ebdff2e9fd7b",
        },
      ]
    `);
    expect(tables).toHaveLength(1);
    expect(tables.map(t => t.relname)).toMatchInlineSnapshot(`
      Array [
        "foo",
      ]
    `);
    expect(enums).toHaveLength(1);
    expect(enums).toMatchInlineSnapshot(`
      Array [
        Object {
          "typname": "user_role",
          "value_count": "2",
        },
      ]
    `);
  }
});

it("refuses to run migration with invalid hash", async () => {
  mockFs({
    [`migrations/committed/000001.sql`]: MIGRATION_1_COMMITTED,
    [`migrations/committed/000002.sql`]:
      MIGRATION_ENUM_COMMITTED +
      "\ncomment on type user_role is 'this invalidates the hash';",
    [`migrations/committed/000003.sql`]: MIGRATION_NOTRX_COMMITTED,
    "migrations/current.sql": "",
  });

  await expect(migrate(settings)).rejects.toThrowErrorMatchingInlineSnapshot(
    `"Hash for 000002.sql does not match - sha1:cbed240dda7dfa510ff785783bbe6af7743b3a11 !== sha1:bddc1ead3310dc1c42cdc7f63537ebdff2e9fd7b; has the file been tampered with?"`,
  );
});

it("will run a migration with invalid hash if told to do so", async () => {
  const parsedSettings = await parseSettings(settings);

  mockFs({
    [`migrations/committed/000001.sql`]: MIGRATION_1_COMMITTED,
    [`migrations/committed/000002.sql`]:
      "--! AllowInvalidHash\n" +
      MIGRATION_ENUM_COMMITTED +
      "\ncomment on type user_role is 'this invalidates the hash';",
    [`migrations/committed/000003.sql`]: MIGRATION_NOTRX_COMMITTED,
    "migrations/current.sql": "",
  });

  await migrate(settings);

  {
    const { migrations, enums } = await getStuff(parsedSettings);

    expect(migrations).toHaveLength(3);
    expect(migrations.map(({ date, ...rest }) => rest)).toMatchInlineSnapshot(`
      Array [
        Object {
          "filename": "000001.sql",
          "hash": "sha1:e00ec93314a423ee5cc68d1182ad52f16442d7df",
          "previous_hash": null,
        },
        Object {
          "filename": "000002.sql",
          "hash": "sha1:bddc1ead3310dc1c42cdc7f63537ebdff2e9fd7b",
          "previous_hash": "sha1:e00ec93314a423ee5cc68d1182ad52f16442d7df",
        },
        Object {
          "filename": "000003.sql",
          "hash": "sha1:2d248344ac299ebbad2aeba5bfec2ae3c3cb0a4f",
          "previous_hash": "sha1:bddc1ead3310dc1c42cdc7f63537ebdff2e9fd7b",
        },
      ]
    `);
    expect(enums).toHaveLength(1);
    expect(enums).toMatchInlineSnapshot(`
      Array [
        Object {
          "typname": "user_role",
          "value_count": "2",
        },
      ]
    `);
  }
});

__tests__/migration.test.ts

Lines changed: 1 addition & 0 deletions
@@ -87,6 +87,7 @@ it("calls beforeAllMigrations and afterAllMigrations action (only) if we did som
         previous: null,
         message: "TEST MESSAGE",
         messageSlug: "test-message",
+        allowInvalidHash: false,
       },
     ];
   },

src/commands/uncommit.ts

Lines changed: 2 additions & 2 deletions
@@ -42,8 +42,8 @@ export async function _uncommit(parsedSettings: ParsedSettings): Promise<void> {
   const contents = await fsp.readFile(lastMigrationFilepath, "utf8");
   const { headers, body } = parseMigrationText(lastMigrationFilepath, contents);
 
-  // Drop Hash and Previous from headers; then write out
-  const { Hash, Previous, ...otherHeaders } = headers;
+  // Drop Hash, Previous and AllowInvalidHash from headers; then write out
+  const { Hash, Previous, AllowInvalidHash, ...otherHeaders } = headers;
   const completeBody = serializeMigration(body, otherHeaders);
   await writeCurrentMigration(parsedSettings, currentLocation, completeBody);

src/instrumentation.ts

Lines changed: 0 additions & 1 deletion
@@ -60,7 +60,6 @@ export async function runQueryWithErrorInstrumentation(
 }
 
 export const logDbError = ({ logger }: ParsedSettings, e: Error): void => {
-  /* eslint-disable no-console */
   e["_gmlogged"] = true;
   const messages = [""];
   if (e["_gmMessageOverride"]) {

src/migration.ts

Lines changed: 20 additions & 2 deletions
@@ -30,6 +30,11 @@ export interface Migration {
    * The hash of the previous migration, or null if there was no previous migration
    */
   previousHash: string | null;
+
+  /**
+   * True if we should allow the hash to be invalid; false otherwise.
+   */
+  allowInvalidHash: boolean;
 }
 
 export interface DbMigration extends Migration {
@@ -340,6 +345,9 @@ export async function getAllMigrations(
       // --! Message:
       const message = headers["Message"];
 
+      // --! AllowInvalidHash
+      const allowInvalidHash = "AllowInvalidHash" in headers;
+
       return {
         realFilename,
         filename: migrationNumberString + ".sql",
@@ -348,6 +356,7 @@ export async function getAllMigrations(
         fullPath,
         hash,
         previousHash,
+        allowInvalidHash,
         body,
         previous: null,
       };
@@ -464,16 +473,25 @@ export async function runCommittedMigration(
     filename,
     body,
     previousHash,
+    allowInvalidHash,
   } = committedMigration;
   // Check the hash
   const newHash = calculateHash(body, previousHash);
-  if (newHash !== hash) {
+  const hashIsInvalid = newHash !== hash;
+  if (hashIsInvalid && !allowInvalidHash) {
     throw new Error(
       `Hash for ${realFilename} does not match - ${newHash} !== ${hash}; has the file been tampered with?`,
     );
   }
+  if (allowInvalidHash && !hashIsInvalid) {
+    throw new Error(
+      `Invalid hash allowed for ${realFilename}; but hash matches.`,
+    );
+  }
   parsedSettings.logger.info(
-    `graphile-migrate${logSuffix}: Running migration '${realFilename}'`,
+    `graphile-migrate${logSuffix}: Running migration '${realFilename}'${
+      hashIsInvalid ? " (🚨 INVALID HASH, allowed via AllowInvalidHash 🚨)" : ""
+    }`,
   );
   await runStringMigration(
     pgClient,
