
Techniques for Populating CC Database with Large Amounts of Test Data



This wiki page is still a work in progress.

In both Postgres and MySQL you can load large amounts of data relatively quickly by providing it in a CSV file.
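Before loading anything you need a CSV to load, and a quick shell loop is often enough to generate one. The sketch below is a minimal example; the columns (guid, name, created_at) and the output path are hypothetical, and a real table needs a value for every column in column order unless you pass an explicit column list at load time.

# Minimal sketch: write 100,000 rows of fake CSV data to /tmp/test-data.csv.
# The columns (guid, name, created_at) are hypothetical; match your table's schema.
for i in $(seq 1 100000); do
  echo "guid-${i},app-${i},2018-01-10 00:00:00"
done > /tmp/test-data.csv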

Loading Data via CSV in MySQL

First, connect to the database with the --local-infile option. E.g.:

mysql -u username -p database_name --local-infile

Depending on your MySQL version and configuration, the server may also need local_infile enabled (e.g. SET GLOBAL local_infile = 1;). Then load the data with LOAD DATA LOCAL INFILE, which will look like:

mysql> LOAD DATA LOCAL INFILE "<local-file-path>" INTO TABLE <table-name> FIELDS TERMINATED BY ",";

Example:

mysql> LOAD DATA LOCAL INFILE "/tmp/apps-mysql.csv" INTO TABLE apps FIELDS TERMINATED BY ",";
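If your CSV has a header row, or covers only some of the table's columns, you can skip the first line and give an explicit column list. A hedged sketch (the column list here is illustrative; match it to your file):

mysql> LOAD DATA LOCAL INFILE "/tmp/apps-mysql.csv" INTO TABLE apps FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' IGNORE 1 LINES (guid, name, created_at);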

If the operation was successful, the output will report the number of rows affected. If not, it will usually say "0 rows affected" and list a count of warnings. You can view these by executing: show warnings;

Loading Data via CSV in Postgres

In Postgres you'll use psql's \copy meta-command, which reads the file from the client side (unlike the server-side COPY).

cloud_controller=> \copy <table-name>(optional, column, names) from <local-file-path> (format csv);

Example:

cloud_controller=> \copy apps(guid, created_at, updated_at, space_guid, name, droplet_guid, desired_state, encrypted_environment_variables, salt, max_task_sequence_id, buildpack_cache_sha256_checksum, enable_ssh) from /tmp/apps-postgres.csv (format csv);
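If the CSV was exported with a header row, \copy can skip it via the header option, and a quick count afterwards confirms the load. A minimal sketch, assuming the same apps table and file:

cloud_controller=> \copy apps from /tmp/apps-postgres.csv with (format csv, header true)
cloud_controller=> select count(*) from apps;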