
Commit 1467dba

Merge pull request #30 from startupturbo/dmius-pg-config

s3cfg, help, pg-config

2 parents 3e6b962 + b30ec9a

1 file changed: +210 −9 lines


nancy_run.sh (210 additions, 9 deletions)
@@ -9,6 +9,190 @@ DEBUG_TIMEOUT=0
 ## Get command line params
 while true; do
   case "$1" in
+    help )
+      echo -e "\033[1mCOMMAND\033[22m
+
+  run
+
+\033[1mDESCRIPTION\033[22m
+
+  Nancy is a member of Postgres.ai's Artificial DBA team
+  responsible for conducting experiments.
+
+  Use 'nancy run' to request a new run for some experiment
+  being conducted.
+
+  An experiment consists of one or more 'runs'. For instance,
+  if Nancy is being used to verify that a new index will
+  affect performance only in a positive way, two runs are needed.
+  If one only needs to collect query plans for each query group,
+  a single run is enough. And finally, if there is a goal to
+  find an optimal value for some PostgreSQL setting,
+  multiple runs will be needed to check how various
+  values of the specified setting affect the performance
+  of the specified database and workload.
+
+  The 4 main parts of each run are:
+  - environment: where it will happen, PostgreSQL version, etc;
+  - database: copy or clone of some database;
+  - workload: 'real' workload or custom SQL;
+  - target: PostgreSQL config changes or some DDL such as
+    'CREATE INDEX ...'.
+
+\033[1mOPTIONS\033[22m
+
+  NOTICE: A value for a string option that starts with 'file://'
+  is treated as a path to a local file. A string value
+  starting with 's3://' is treated as a path
+  to a remote file located in S3 (AWS S3 or its analog).
+  Otherwise, a string value is considered 'content',
+  not a link to a file.
+
+  \033[1m--debug\033[22m (boolean)
+
+    Turn on debug logging.
+
+  \033[1m--debug-timeout\033[22m (string)
+
+    How many seconds the entity (Docker container, Docker machine)
+    where the experimental run is being made will stay alive after the
+    main activity is finished. This is useful for various debugging:
+    one can access the container via ssh / docker exec and see PostgreSQL
+    with data, logs, etc.
+
+  \033[1m--run-on\033[22m (string)
+
+    Specify where the experimental run will take place:
+
+    * 'localhost' (default)
+
+    * aws
+
+    * gcp (WIP)
+
+    If 'localhost' is specified (or --run-on is omitted),
+    Nancy will perform the run on the localhost in a Docker container
+    (so `docker run` must work locally).
+
+    If 'aws' is specified, Nancy will use a Docker machine with a single
+    container running on an EC2 Spot instance.
+
+  \033[1m--pg-version\033[22m (string)
+
+    Specify the major PostgreSQL version:
+
+    * 9.6 (default)
+
+    * 10
+
+    * 11devel (WIP)
+
+  \033[1m--pg-config\033[22m (string)
+
+    Specify the PostgreSQL config to be used (may be partial).
+
+  \033[1m--db-prepared-snapshot\033[22m (string)
+
+    Reserved / Not yet implemented.
+
+  \033[1m--db-dump-path\033[22m (string)
+
+    Specify the path to a database dump (created by pg_dump) to be used
+    as an input.
+
+  \033[1m--after-db-init-code\033[22m (string)
+
+    Specify additional commands to be executed after the database
+    is initiated (dump loaded or snapshot attached).
+
+  \033[1m--workload-full-path\033[22m (string)
+
+    Path to a 'real' workload prepared using `nancy prepare-workload`.
+
+  \033[1m--workload-basis-path\033[22m (string)
+
+    Reserved / Not yet implemented.
+
+  \033[1m--workload-custom-sql\033[22m (string)
+
+    Specify custom SQL queries to be used as an input.
+
+  \033[1m--workload-replay-speed\033[22m (string)
+
+    Reserved / Not yet implemented.
+
+  \033[1m--target-ddl-do\033[22m (string)
+
+    SQL changing the database somehow before the workload is applied.
+    'Do DDL' example:
+
+      create index i_t1_experiment on t1 using btree(col1);
+      vacuum analyze t1;
+
+  \033[1m--target-ddl-undo\033[22m (string)
+
+    SQL reverting the changes produced by the value of the
+    `--target-ddl-do` option. Reverting makes it possible
+    to serialize multiple runs, but it might not be possible
+    in some cases. 'Undo DDL' example:
+
+      drop index i_t1_experiment;
+
+  \033[1m--target-config\033[22m (string)
+
+    Config changes to be applied to postgresql.conf before
+    the workload is applied. Once configuration changes are made,
+    PostgreSQL is restarted. Example:
+
+      random_page_cost = 1.1
+
+  \033[1m--artifacts-destination\033[22m (string)
+
+    Path to a local ('file://...') or S3 ('s3://...') directory
+    where Nancy will put all collected results of the run,
+    including:
+
+    * detailed performance report in JSON format
+
+    * whole PostgreSQL log, gzipped
+
+  \033[1m--aws-ec2-type\033[22m (string)
+
+    EC2 instance type where the run will be performed. An EC2 Spot
+    instance will be used. WARNING: 'i3-metal' instances are
+    not currently supported (WIP).
+
+    The option may be used only with `--run-on aws`.
+
+  \033[1m--aws-keypair-name\033[22m (string)
+
+    The name of the key pair used on the EC2 instance to allow
+    accessing it. Must correspond to the value of the
+    `--aws-ssh-key-path` option.
+
+    The option may be used only with `--run-on aws`.
+
+  \033[1m--aws-ssh-key-path\033[22m (string)
+
+    Path to the SSH key file (usually has a '.pem' extension).
+
+    The option may be used only with `--run-on aws`.
+
+  \033[1m--s3cfg-path\033[22m
+
+    The path to the '.s3cfg' configuration file to be used when
+    accessing files in S3. This file must be local and must
+    be specified if some options' values are in 's3://***'
+    format.
+
+    See also: https://github.com/s3tools/s3cmd
+
+\033[1mSEE ALSO\033[22m
+
+  nancy help
+
+" | less -RFX
+      exit ;;
     -d | --debug ) DEBUG=1; shift ;;
     --run-on )
       RUN_ON="$2"; shift 2 ;;
@@ -57,13 +241,12 @@ while true; do
     --aws-ssh-key-path )
       AWS_KEY_PATH="$2"; shift 2 ;;
 
-    --s3-cfg-path )
+    --s3cfg-path )
       S3_CFG_PATH="$2"; shift 2 ;;
     --tmp-path )
       TMP_PATH="$2"; shift 2 ;;
     --debug-timeout )
       DEBUG_TIMEOUT="$2"; shift 2 ;;
-
     -- ) shift; break ;;
     * ) break ;;
   esac
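The rename matters because the same spelling, --s3cfg-path, is now used both in the help text and in the option parser. A hypothetical AWS run with S3-hosted input might look like this (the bucket, key pair, and instance type are made-up examples; the help above notes that only some EC2 types are supported):

  nancy run \
    --run-on aws \
    --aws-ec2-type i3.large \
    --aws-keypair-name my-keypair \
    --aws-ssh-key-path file://./my-keypair.pem \
    --s3cfg-path file://$HOME/.s3cfg \
    --db-dump-path s3://my-bucket/dumps/test_dump.sql \
    --artifacts-destination s3://my-bucket/nancy-results/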
@@ -203,15 +386,21 @@ function checkParams() {
     exit 1
   fi
 
-  if [ ! -z ${DB_DUMP_PATH+x} ]
+  [ ! -z ${DB_DUMP_PATH+x} ] && ! checkPath DB_DUMP_PATH && >&2 echo "ERROR: file $DB_DUMP_PATH given by db_dump_path not found" && exit 1
+
+  if [ -z ${PG_CONFIG+x} ]
   then
-    echo "DB_DUMP_PATH found"
+    >&2 echo "WARNING: Initial database server configuration not given. Will use the default."
   else
-    echo "DB_DUMP_PATH NOT found"
+    checkPath PG_CONFIG
+    if [ "$?" -ne "0" ]
+    then
+      >&2 echo "WARNING: The value given as pg_config ('$PG_CONFIG') was not found as a file; will use it as content."
+      echo "$PG_CONFIG" > $TMP_PATH/pg_config_tmp.conf
+      PG_CONFIG="$TMP_PATH/pg_config_tmp.conf"
+    fi
   fi
 
-  [ ! -z ${DB_DUMP_PATH+x} ] && ! checkPath DB_DUMP_PATH && >&2 echo "ERROR: file $DB_DUMP_PATH given by db_dump_path not found" && exit 1
-
   if (([ -z ${TARGET_DDL_UNDO+x} ] && [ ! -z ${TARGET_DDL_DO+x} ]) || ([ -z ${TARGET_DDL_DO+x} ] && [ ! -z ${TARGET_DDL_UNDO+x} ]))
   then
     >&2 echo "ERROR: DDL code must have do and undo part."
@@ -275,7 +464,7 @@ function checkParams() {
     fi
   fi
 
-  if [ ! -z ${TARGET_DDL_UNDO} ]
+  if [ ! -z ${TARGET_DDL_UNDO+x} ]
   then
     checkPath TARGET_DDL_UNDO
     if [ "$?" -ne "0" ]
@@ -403,6 +592,7 @@ function cleanup {
   rm -f "$TMP_PATH/target_ddl_do_tmp.sql"
   rm -f "$TMP_PATH/target_ddl_undo_tmp.sql"
   rm -f "$TMP_PATH/target_config_tmp.conf"
+  rm -f "$TMP_PATH/pg_config_tmp.conf"
 
   if [ "$RUN_ON" = "localhost" ]; then
     rm -rf "$TMP_PATH/pg_nancy_home_${CURRENT_TS}"
@@ -440,6 +630,7 @@ function copyFile() {
 [ ! -z ${S3_CFG_PATH+x} ] && copyFile $S3_CFG_PATH && docker_exec cp /machine_home/.s3cfg /root/.s3cfg
 
 [ ! -z ${DB_DUMP_PATH+x} ] && copyFile $DB_DUMP_PATH
+[ ! -z ${PG_CONFIG+x} ] && copyFile $PG_CONFIG
 [ ! -z ${TARGET_CONFIG+x} ] && copyFile $TARGET_CONFIG
 [ ! -z ${TARGET_DDL_DO+x} ] && copyFile $TARGET_DDL_DO
 [ ! -z ${TARGET_DDL_UNDO+x} ] && copyFile $TARGET_DDL_UNDO
@@ -469,8 +660,18 @@ if ([ ! -z ${TARGET_DDL_DO+x} ] && [ "$TARGET_DDL_DO" != "" ]); then
   TARGET_DDL_DO_FILENAME=$(basename $TARGET_DDL_DO)
   docker_exec bash -c "psql -U postgres test -E -f /machine_home/$TARGET_DDL_DO_FILENAME"
 fi
+# Apply initial postgres configuration
+echo "Apply initial postgres configuration"
+if ([ ! -z ${PG_CONFIG+x} ] && [ "$PG_CONFIG" != "" ]); then
+  PG_CONFIG_FILENAME=$(basename $PG_CONFIG)
+  docker_exec bash -c "cat /machine_home/$PG_CONFIG_FILENAME >> /etc/postgresql/$PG_VERSION/main/postgresql.conf"
+  if [ -z ${TARGET_CONFIG+x} ]
+  then
+    docker_exec bash -c "sudo /etc/init.d/postgresql restart"
+  fi
+fi
 # Apply postgres configuration
-echo "Apply postgres conf"
+echo "Apply postgres configuration"
 if ([ ! -z ${TARGET_CONFIG+x} ] && [ "$TARGET_CONFIG" != "" ]); then
   TARGET_CONFIG_FILENAME=$(basename $TARGET_CONFIG)
   docker_exec bash -c "cat /machine_home/$TARGET_CONFIG_FILENAME >> /etc/postgresql/$PG_VERSION/main/postgresql.conf"
