
Commit d45a2c8

Bump dependencies and defaults; drop support for Python 3.7 (#367)

* bump all deps
* drop support for python 3.7 and enable ci for python 3.11 and 3.12
* update defaults: spark -> 3.5.0, hadoop -> 3.3.6
* clean up pip-tools section
* add test for normalize_keys and fix lint problems
* update changelog

1 parent 0a7821b

File tree: 17 files changed, +168 −176 lines

`.github/workflows/flintrock.yaml` (2 additions, 1 deletion)

```diff
@@ -17,10 +17,11 @@ jobs:
           - ubuntu-20.04
           - macos-11
         python-version:
-          - "3.7"
           - "3.8"
           - "3.9"
           - "3.10"
+          - "3.11"
+          - "3.12"
     name: ${{ matrix.os }} / Python ${{ matrix.python-version }}
     steps:
       - uses: actions/checkout@v3
```

`CHANGES.md` (3 additions, 2 deletions)

```diff
@@ -6,15 +6,16 @@
 
 ### Changed
 
-* [#348]: Bumped default Spark to 3.2; dropped support for Python 3.6; added CI build for Python 3.10.
+* [#348], [#367]: Bumped default Spark to 3.5.0 and default Hadoop to 3.3.6; dropped support for Python 3.6 and 3.7; added CI builds for Python 3.10, 3.11, and 3.12.
 * [#361]: Migrated from AdoptOpenJDK, which is deprecated, to Adoptium OpenJDK.
-* [#362][#366]: Improved Flintrock's ability to cleanup after launch failures.
+* [#362], [#366]: Improved Flintrock's ability to cleanup after launch failures.
 * [#366]: Deprecated `--ec2-spot-request-duration`, which is not needed for one-time spot instances launched using the RunInstances API.
 
 [#348]: https://github.com/nchammas/flintrock/pull/348
 [#361]: https://github.com/nchammas/flintrock/pull/361
 [#362]: https://github.com/nchammas/flintrock/pull/362
 [#366]: https://github.com/nchammas/flintrock/pull/366
+[#367]: https://github.com/nchammas/flintrock/pull/367
 
 ## [2.0.0] - 2021-06-10
 
```
`CONTRIBUTING.md` (5 additions, 9 deletions)

````diff
@@ -80,9 +80,11 @@ There are a few things you should do before diving in to write a new feature or
 
 If you are changing anything about Flintrock's dependencies, be sure to update the compiled requirements using [pip-tools]:
 
+[pip-tools]: https://github.com/jazzband/pip-tools
+
 ```shell
 function update-deps() {
-    pip install -U "pip-tools==6.8.0"
+    pip install -U "pip-tools==7.3.0"
 
     pip-compile -U requirements/user.in -o requirements/user.pip
     pip-compile -U requirements/developer.in -o requirements/developer.pip
@@ -91,19 +93,13 @@ function update-deps() {
     # Uncomment whichever set of requirements makes sense for you.
     # pip-sync requirements/user.pip
     # pip-sync requirements/developer.pip
-    pip-sync requirements/maintainer.pip
+    # pip-sync requirements/maintainer.pip
 }
 
 update-deps
 ```
 
-After doing that, make sure your environment matches what's in the compiled requirements by running `pip-sync` against the appropriate requirements file:
-
-```
-pip-sync requirements/[user|developer|maintainer].pip
-```
-
-[pip-tools]: https://github.com/jazzband/pip-tools
+`pip-compile` takes the provided set of input requirements, like `user.in`, and compiles them into a full list of pinned transitive dependencies, like `user.pip`. This is similar to a lock file. `pip-sync` ensures that the current active virtual environment has exactly the dependencies listed in the provided pip file, no more and no less.
 
 #### Coordinate first
 
````

`README.md` (6 additions, 6 deletions)

````diff
@@ -30,7 +30,7 @@ Here's a quick way to launch a cluster on EC2, assuming you already have an [AWS
 ```sh
 flintrock launch test-cluster \
     --num-slaves 1 \
-    --spark-version 3.3.0 \
+    --spark-version 3.5.0 \
     --ec2-key-name key_name \
     --ec2-identity-file /path/to/key.pem \
     --ec2-ami ami-0aeeebd8d2ab47354 \
@@ -87,12 +87,12 @@ these steps:
    better performance.
 3. Make sure Flintrock is configured to use Hadoop/HDFS 2.7+. Earlier
    versions of Hadoop do not have solid implementations of `s3a://`.
-   Flintrock's default is Hadoop 3.3.4, so you don't need to do anything
+   Flintrock's default is Hadoop 3.3.6, so you don't need to do anything
    here if you're using a vanilla configuration.
 4. Call Spark with the hadoop-aws package to enable `s3a://`. For example:
    ```sh
-   spark-submit --packages org.apache.hadoop:hadoop-aws:3.3.4 my-app.py
-   pyspark --packages org.apache.hadoop:hadoop-aws:3.3.4
+   spark-submit --packages org.apache.hadoop:hadoop-aws:3.3.6 my-app.py
+   pyspark --packages org.apache.hadoop:hadoop-aws:3.3.6
    ```
    If you have issues using the package, consult the [hadoop-aws troubleshooting
   guide](http://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html)
@@ -114,7 +114,7 @@ Before using Flintrock, take a quick look at the
 notice and [license](https://github.com/nchammas/flintrock/blob/master/LICENSE)
 and make sure you're OK with their terms.
 
-**Flintrock requires Python 3.7 or newer**, unless you are using one
+**Flintrock requires Python 3.8 or newer**, unless you are using one
 of our **standalone packages**. Flintrock has been thoroughly tested
 only on OS X, but it should run on all POSIX systems.
 A motivated contributor should be able to add
@@ -252,7 +252,7 @@ provider: ec2
 
 services:
   spark:
-    version: 3.3.0
+    version: 3.5.0
 
 launch:
   num-slaves: 1
````
`flintrock/config.yaml.template` (2 additions, 2 deletions)

```diff
@@ -1,6 +1,6 @@
 services:
   spark:
-    version: 3.3.0
+    version: 3.5.0
     # git-commit: latest  # if not 'latest', provide a full commit SHA; e.g. d6dc12ef0146ae409834c78737c116050961f350
     # git-repository:  # optional; defaults to https://github.com/apache/spark
     # optional; defaults to download from a dynamically selected Apache mirror
@@ -12,7 +12,7 @@ services:
   #   download-source: "s3://some-bucket/spark/{v}/"
   # executor-instances: 1
   hdfs:
-    version: 3.3.4
+    version: 3.3.6
     # optional; defaults to download from a dynamically selected Apache mirror
     # - can be http, https, or s3 URL
     # - must contain a {v} template corresponding to the version
```

`flintrock/flintrock.py` (6 additions, 3 deletions)

```diff
@@ -292,7 +292,7 @@ def cli(cli_context, config, provider, debug):
 @click.option('--num-slaves', type=click.IntRange(min=1), required=True)
 @click.option('--java-version', type=click.IntRange(min=8), default=11)
 @click.option('--install-hdfs/--no-install-hdfs', default=False)
-@click.option('--hdfs-version', default='3.3.4')
+@click.option('--hdfs-version', default='3.3.6')
 @click.option('--hdfs-download-source',
     help=(
         "URL to download Hadoop from. If an S3 URL, Flintrock will use the "
@@ -1121,10 +1121,13 @@ def normalize_keys(obj):
     """
     Used to map keys from config files to Python parameter names.
     """
-    if type(obj) != dict:
+    if not isinstance(obj, dict):
         return obj
     else:
-        return {k.replace('-', '_'): normalize_keys(v) for k, v in obj.items()}
+        return {
+            k.replace('-', '_'): normalize_keys(v)
+            for k, v in obj.items()
+        }
 
 
 def config_to_click(config: dict) -> dict:
```
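The refactored `normalize_keys` is easy to exercise in isolation. A minimal standalone sketch, copying the function as it appears in the new `flintrock/flintrock.py`:

```python
def normalize_keys(obj):
    """
    Used to map keys from config files to Python parameter names.
    """
    # isinstance() also accepts dict subclasses (e.g. OrderedDict),
    # which the old `type(obj) != dict` check would have rejected.
    if not isinstance(obj, dict):
        return obj
    else:
        return {
            k.replace('-', '_'): normalize_keys(v)
            for k, v in obj.items()
        }


# Hyphenated config keys become valid Python parameter names, recursively:
print(normalize_keys({'num-slaves': 1, 'launch': {'install-hdfs': True}}))
# → {'num_slaves': 1, 'launch': {'install_hdfs': True}}
```

Non-dict values pass through unchanged, so the recursion bottoms out at leaf values like strings and integers.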

`requirements/developer.in` (1 addition, 2 deletions)

```diff
@@ -1,6 +1,5 @@
 -r user.pip
 pytest >= 3.5.0
 pytest-cov >= 2.5.1
-flake8 == 3.8.4
-freezegun == 1.1.0
+flake8 == 6.1.0
 # PyYAML  # requirement already covered by setup.py
```

`requirements/developer.pip` (27 additions, 43 deletions)

```diff
@@ -1,119 +1,103 @@
 #
-# This file is autogenerated by pip-compile with python 3.7
-# To update, run:
+# This file is autogenerated by pip-compile with Python 3.8
+# by the following command:
 #
 #    pip-compile --output-file=requirements/developer.pip requirements/developer.in
 #
 -e file:.#egg=Flintrock
     # via -r requirements/user.pip
-attrs==22.1.0
-    # via pytest
-bcrypt==3.2.2
+bcrypt==4.0.1
     # via
     #   -r requirements/user.pip
     #   paramiko
-boto3==1.21.44
+boto3==1.29.4
     # via
     #   -r requirements/user.pip
     #   flintrock
-botocore==1.24.44
+botocore==1.32.4
     # via
     #   -r requirements/user.pip
     #   boto3
     #   flintrock
     #   s3transfer
-cffi==1.15.1
+cffi==1.16.0
     # via
     #   -r requirements/user.pip
-    #   bcrypt
     #   cryptography
     #   pynacl
-click==7.1.2
+click==8.1.7
     # via
     #   -r requirements/user.pip
     #   flintrock
-coverage[toml]==6.4.3
-    # via pytest-cov
-cryptography==37.0.4
+coverage[toml]==7.3.2
+    # via
+    #   coverage
+    #   pytest-cov
+cryptography==41.0.5
     # via
     #   -r requirements/user.pip
     #   flintrock
     #   paramiko
-flake8==3.8.4
-    # via -r requirements/developer.in
-freezegun==1.1.0
+exceptiongroup==1.2.0
+    # via pytest
+flake8==6.1.0
     # via -r requirements/developer.in
-importlib-metadata==4.12.0
-    # via
-    #   flake8
-    #   pluggy
-    #   pytest
-iniconfig==1.1.1
+iniconfig==2.0.0
     # via pytest
 jmespath==1.0.1
     # via
     #   -r requirements/user.pip
     #   boto3
     #   botocore
-mccabe==0.6.1
+mccabe==0.7.0
     # via flake8
-packaging==21.3
+packaging==23.2
     # via pytest
-paramiko==2.11.0
+paramiko==3.3.1
     # via
     #   -r requirements/user.pip
     #   flintrock
-pluggy==1.0.0
+pluggy==1.3.0
     # via pytest
-py==1.11.0
-    # via pytest
-pycodestyle==2.6.0
+pycodestyle==2.11.1
     # via flake8
 pycparser==2.21
     # via
     #   -r requirements/user.pip
     #   cffi
-pyflakes==2.2.0
+pyflakes==3.1.0
     # via flake8
 pynacl==1.5.0
     # via
     #   -r requirements/user.pip
     #   paramiko
-pyparsing==3.0.9
-    # via packaging
-pytest==7.1.2
+pytest==7.4.3
     # via
     #   -r requirements/developer.in
     #   pytest-cov
-pytest-cov==3.0.0
+pytest-cov==4.1.0
     # via -r requirements/developer.in
 python-dateutil==2.8.2
     # via
     #   -r requirements/user.pip
     #   botocore
-    #   freezegun
-pyyaml==6.0
+pyyaml==6.0.1
     # via
     #   -r requirements/user.pip
     #   flintrock
-s3transfer==0.5.2
+s3transfer==0.7.0
     # via
     #   -r requirements/user.pip
     #   boto3
 six==1.16.0
     # via
     #   -r requirements/user.pip
-    #   paramiko
     #   python-dateutil
 tomli==2.0.1
     # via
     #   coverage
     #   pytest
-typing-extensions==4.3.0
-    # via importlib-metadata
-urllib3==1.26.11
+urllib3==1.26.18
     # via
     #   -r requirements/user.pip
     #   botocore
-zipp==3.8.1
-    # via importlib-metadata
```

`requirements/maintainer.in` (2 additions, 2 deletions)

```diff
@@ -1,4 +1,4 @@
 -r developer.pip
 wheel >= 0.31.0
-twine == 3.8.0
-PyInstaller == 5.3
+twine == 4.0.2
+PyInstaller == 6.2.0
```
