Release Notes
This adds a scrubber program that can take a SQL file and scrub all information from it, replacing it with random data (while still preserving referential integrity). Additionally, this adds a scrubbed SQL file and creates a new GitHub action, since these tests can take a bit longer (SQL files can be megabytes in size).
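The key idea above, replacing real values with random data while keeping references consistent, can be sketched roughly as follows. This is a minimal illustration, not the actual scrubber: it assumes a simple per-value replacement map so that any repeated value (e.g. a foreign-key reference) scrubs to the same random output everywhere it appears.

```python
import random
import string

def make_scrubber(seed=0):
    """Return a scrub(value) function that replaces each distinct value
    with random lowercase text of the same length, reusing the same
    replacement for repeated values so references stay consistent."""
    rng = random.Random(seed)
    mapping = {}

    def scrub(value):
        if value not in mapping:
            mapping[value] = "".join(
                rng.choices(string.ascii_lowercase, k=len(value))
            )
        return mapping[value]

    return scrub

scrub = make_scrubber()
# The same input always scrubs to the same output, so a value used as a
# foreign key in one table still matches its scrubbed counterpart elsewhere.
assert scrub("alice@example.com") == scrub("alice@example.com")
assert scrub("alice@example.com") != "alice@example.com"
```

A real tool would additionally parse the SQL, respect column types, and skip structural tokens, but the consistency-via-mapping trick is the core of referential-integrity-preserving scrubbing.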
This fixes numerous issues found while importing customer dumps. I'm currently scrubbing the dump of all sensitive or identifying information of any kind, which I'll include in a follow-up PR alongside the tool built to scrub any future dumps, so that we may keep them within our test suite. As a result, I didn't include any direct tests in this PR, since the dump will fulfill that role.
Requires:
Some dumps include columns named "index", which is forbidden in CockroachDB. Our parser was initially based on an open-licensed version of CockroachDB's parser (see dolthub/doltgresql#19), and we therefore inherited some of CockroachDB's restrictions. We want our customers to be able to use standard Postgres dumps, so this removes the CockroachDB extensions and restores the behavior expected by Postgres users. We never implemented functionality for those extensions anyway, so this should be a harmless removal. For reference, this is the page discussing the extensions:
https://www.cockroachlabs.com/docs/stable/order-by
Depends on "allow setting session variable default value" (dolthub/go-mysql-server#3203)
One thing that we'll be adding is tests to ensure that imports from all over not only work in Doltgres, but continue to work. To achieve this, this PR adds a new framework that allows us to specify files to be imported, along with any setup that an import may need (such as user creation). Additionally, as import files may be very large, we needed a good way to attach errors to their origin queries. This framework gives an experience that is as close to standard Go debugging as we can get, considering imports must be done through psql (or pg_restore, which isn't supported yet but would function very similarly to the psql path). To reiterate, this gives us:
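Attaching errors to their origin queries in a multi-megabyte file comes down to remembering where each statement started. A minimal sketch of that bookkeeping (not the framework's actual implementation, and using a naive splitter that assumes statements end with `;` and ignores semicolons inside string literals):

```python
def split_statements(sql_text):
    """Split a SQL file into (start_line, statement) pairs so that an
    error raised while executing a statement can be reported against
    the line in the original file where that statement begins.
    Naive: assumes ';' terminates statements; trailing text without a
    ';' is dropped."""
    statements = []
    current = []
    start_line = None
    line = 1
    for ch in sql_text:
        if start_line is None and not ch.isspace():
            start_line = line  # first non-space char begins a statement
        if start_line is not None:
            current.append(ch)
        if ch == ";":
            statements.append((start_line, "".join(current)))
            current, start_line = [], None
        if ch == "\n":
            line += 1
    return statements
```

With this index, a failure in statement N can be surfaced as `file.sql:<start_line>: <error>`, which is what makes debugging a huge import feel like ordinary source-level debugging.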
See "Refactorings to support index scans for pg catalog tables" (dolthub/go-mysql-server#3190)
This is a proof of concept for pg_class. Other pg_catalog tables are next.
This adds support for some sequence-related statements from a customer-provided dump for the purposes of testing imports.
Closed Issues
dolt_history_$tablename sometimes returns wrong rows for tables with the same name in different schemas

View the full release notes at https://github.com/dolthub/doltgresql/releases/tag/v0.52.0.