
Commit 8633a0a

feat: update all dependencies and project configs

Changes:
1. Update the GitHub Actions base image to 22.04, since GitHub is retiring 20.04.
2. Use pip-tools to compile all Python requirements.
3. Update all Python dependency versions.
4. Regenerate all pb2 files.
1 parent 2bcb53d commit 8633a0a
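Step 2's lock files (requirements.lock and requirements-dev.lock, referenced throughout the diff) are compiled from loose .in inputs. A sketch of the compile commands — the commit message says pip-tools, while the header of the generated requirements.lock records uv's pip-tools-compatible interface; the requirements-dev.in file name is an assumption inferred from the lock name:

```shell
# Compile fully pinned lock files from loose requirement inputs.
# With pip-tools:
pip-compile requirements.in -o requirements.lock
pip-compile requirements-dev.in -o requirements-dev.lock

# Or with uv's pip-tools-compatible interface, as recorded in the
# generated lock file header:
uv pip compile requirements.in -o requirements.lock
```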


42 files changed: +1,091 −7,826 lines

.github/workflows/build-docker.yml

Lines changed: 3 additions & 3 deletions

@@ -3,17 +3,17 @@ name: build-docker
 on:
   push:
     branches:
-      - 'master'
+      - "master"
     tags:
-      - '*'
+      - "*"

 env:
   REGISTRY: ghcr.io
   IMAGE_NAME: ${{ github.repository }}

 jobs:
   build-and-push-image:
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-22.04
     permissions:
       contents: read
       packages: write

.github/workflows/coverage.yml

Lines changed: 7 additions & 9 deletions

@@ -2,24 +2,24 @@ name: coverage
 on:
   pull_request:
     branches:
-      - 'master'
+      - "master"
   push:
     branches:
-      - 'master'
+      - "master"
 jobs:
   code-coverage:
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-22.04
     env:
       ENVIRONMENT: TEST_RUNNER
-      OS: ubuntu-20.04
-      PYTHON: '3.9'
+      OS: ubuntu-22.04
+      PYTHON: "3.9"
       COVERAGE_TOTAL: 49 # Coverage threshold percentage
     steps:
       - name: Checkout (admin token)
         if: ${{github.event_name != 'pull_request'}} # We don't want to use the admin token for PR flows
         uses: actions/checkout@master
         with:
-          token: '${{ secrets.GIT_ADMIN_WORKFLOW_TOKEN }}'
+          token: "${{ secrets.GIT_ADMIN_WORKFLOW_TOKEN }}"
           fetch-depth: "2" # Original commit + code cov badge commit
       - name: Checkout (normal flow)
         if: ${{github.event_name == 'pull_request'}}

@@ -34,10 +34,8 @@ jobs:
         id: coverage-installer
         run: |
           python -m pip install --upgrade pip
-          pip install cython==0.29.21 numpy==1.23.2
           sudo apt-get install jq
-          pip install -r requirements.txt
-          pip install -r requirements-dev.txt
+          pip install -r requirements-dev.lock
           pip install coverage-badge
       - name: Run tests and calculate coverage
         id: test-runner

.github/workflows/doc-gen.yml

Lines changed: 8 additions & 9 deletions

@@ -2,17 +2,17 @@ name: doc-gen
 on:
   push:
     branches:
-      - 'master'
+      - "master"
   pull_request:
     branches:
-      - 'master'
+      - "master"

 jobs:
   run:
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-22.04
     env:
-      OS: ubuntu-20.04
-      PYTHON: '3.9'
+      OS: ubuntu-22.04
+      PYTHON: "3.9"
     steps:
       - uses: actions/checkout@master
         with:

@@ -26,9 +26,8 @@ jobs:
       - name: Setup requirements and run sphinx
         run: |
           python -m pip install --upgrade pip
-          pip install cython==0.29.21 numpy==1.23.2
-          pip install -r requirements.txt
-          pip install -r requirements-dev.txt
+          pip install -r requirements.lock
+          pip install -r requirements-dev.lock
           pip install -r docs/requirements-doc.txt
           cd docs
           make html

@@ -39,4 +38,4 @@ jobs:
         with:
           branch: gh-pages
           folder: ./docs/build/html
-          commit-message: 'docs: update build documentation'
+          commit-message: "docs: update build documentation"

.github/workflows/pre-merge.yml

Lines changed: 2 additions & 4 deletions

@@ -9,7 +9,7 @@ on:

 jobs:
   pre-merge-tests:
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-22.04
     env:
       ENVIRONMENT: TEST_RUNNER
     steps:

@@ -28,9 +28,7 @@ jobs:
       - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
-         pip install cython==0.29.21 numpy==1.23.2
-         pip install -r requirements.txt
-         pip install -r requirements-dev.txt
+         pip install -r requirements-dev.lock
      - name: Lint all files with pre-commit
        run: |
          pre-commit install

Dockerfile

Lines changed: 2 additions & 4 deletions

@@ -45,10 +45,8 @@ RUN pip install --no-cache-dir \
 # Install python dependencies
 ARG WORKSPACE=/home/dgp
 WORKDIR ${WORKSPACE}
-COPY requirements.txt requirements-dev.txt /tmp/
-RUN pip install --no-cache-dir cython==0.29.30 numpy==1.20.3
-RUN pip install --no-cache-dir -r /tmp/requirements.txt
-RUN pip install --no-cache-dir -r /tmp/requirements-dev.txt
+COPY requirements-dev.lock /tmp/
+RUN pip install --no-cache-dir -r /tmp/requirements-dev.lock

 # Settings for S3
 RUN aws configure set default.s3.max_concurrent_requests 100 && \

dgp/__init__.py

Lines changed: 21 additions & 21 deletions

@@ -2,51 +2,51 @@
 import os
 from collections import OrderedDict

-__version__ = '1.0'
+__version__ = "1.6.0"

-DGP_PATH = os.getenv('DGP_PATH', default=os.getenv('HOME', os.getcwd()))
-DGP_DATA_DIR = os.path.join(DGP_PATH, '.dgp')
-DGP_CACHE_DIR = os.path.join(DGP_DATA_DIR, 'cache')
-DGP_DATASETS_CACHE_DIR = os.path.join(DGP_DATA_DIR, 'datasets')
+DGP_PATH = os.getenv("DGP_PATH", default=os.getenv("HOME", os.getcwd()))
+DGP_DATA_DIR = os.path.join(DGP_PATH, ".dgp")
+DGP_CACHE_DIR = os.path.join(DGP_DATA_DIR, "cache")
+DGP_DATASETS_CACHE_DIR = os.path.join(DGP_DATA_DIR, "datasets")

 TRI_DGP_FOLDER_PREFIX = "dgp/"
 TRI_RAW_FOLDER_PREFIX = "raw/"
 TRI_DGP_JSON_PREFIX = "dataset_v"

 # DGP Directory structure constants
-RGB_FOLDER = 'rgb'
-POINT_CLOUD_FOLDER = 'point_cloud'
+RGB_FOLDER = "rgb"
+POINT_CLOUD_FOLDER = "point_cloud"
 RADAR_POINT_CLOUD_FOLDER = "radar_point_cloud"
-BOUNDING_BOX_2D_FOLDER = 'bounding_box_2d'
-BOUNDING_BOX_3D_FOLDER = 'bounding_box_3d'
-SEMANTIC_SEGMENTATION_2D_FOLDER = 'semantic_segmentation_2d'
-SEMANTIC_SEGMENTATION_3D_FOLDER = 'semantic_segmentation_3d'
-INSTANCE_SEGMENTATION_2D_FOLDER = 'instance_segmentation_2d'
-INSTANCE_SEGMENTATION_3D_FOLDER = 'instance_segmentation_3d'
-DEPTH_FOLDER = 'depth'
+BOUNDING_BOX_2D_FOLDER = "bounding_box_2d"
+BOUNDING_BOX_3D_FOLDER = "bounding_box_3d"
+SEMANTIC_SEGMENTATION_2D_FOLDER = "semantic_segmentation_2d"
+SEMANTIC_SEGMENTATION_3D_FOLDER = "semantic_segmentation_3d"
+INSTANCE_SEGMENTATION_2D_FOLDER = "instance_segmentation_2d"
+INSTANCE_SEGMENTATION_3D_FOLDER = "instance_segmentation_3d"
+DEPTH_FOLDER = "depth"
 EXTRA_DATA_FOLDER = "extra_data"
 FEATURE_ONTOLOGY_FOLDER = "feature_ontology"
 AGENT_FOLDER = "agent"
 CLASSIFICATION_FOLDER = "classification"

 # Scene Directory structure constants
-CALIBRATION_FOLDER = 'calibration'
-ONTOLOGY_FOLDER = 'ontology'
-SCENE_JSON_FILENAME = 'scene.json'
+CALIBRATION_FOLDER = "calibration"
+ONTOLOGY_FOLDER = "ontology"
+SCENE_JSON_FILENAME = "scene.json"

 # DGP file naming conventions
 TRI_DGP_SCENE_DATASET_JSON_NAME = "scene_dataset_v{version}.json"
 TRI_DGP_SCENE_JSON_NAME = "scene_{scene_hash}.json"
-ANNOTATION_FILE_NAME = '{image_content_hash}_{annotation_content_hash}.json'
+ANNOTATION_FILE_NAME = "{image_content_hash}_{annotation_content_hash}.json"

 # DGP file naming conventions
 TRI_DGP_SCENE_DATASET_JSON_NAME = "scene_dataset_v{version}.json"
 TRI_DGP_AGENT_TRACKS_JSON_NAME = "agent_tracks_{track_hash}.json"
 TRI_DGP_SCENE_JSON_NAME = "scene_{scene_hash}.json"
-ANNOTATION_FILE_NAME = '{image_content_hash}_{annotation_content_hash}.json'
+ANNOTATION_FILE_NAME = "{image_content_hash}_{annotation_content_hash}.json"
 TRI_DGP_AGENTS_JSON_NAME = "agents_{agent_hash}.json"
 TRI_DGP_AGENTS_SLICES_JSON_NAME = "agents_slices_{slice_hash}.json"

 # Autolabel constants
-AUTOLABEL_FOLDER = 'autolabels'
-AUTOLABEL_SCENE_JSON_NAME = 'scene.json'
+AUTOLABEL_FOLDER = "autolabels"
+AUTOLABEL_SCENE_JSON_NAME = "scene.json"
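The file-naming constants in the diff above are plain `str.format` templates; a quick illustration (the version and hash values here are hypothetical, for demonstration only):

```python
# Naming templates copied from dgp/__init__.py in this diff.
TRI_DGP_SCENE_DATASET_JSON_NAME = "scene_dataset_v{version}.json"
TRI_DGP_SCENE_JSON_NAME = "scene_{scene_hash}.json"

# Hypothetical values, for illustration only.
print(TRI_DGP_SCENE_DATASET_JSON_NAME.format(version="1.6.0"))
# → scene_dataset_v1.6.0.json
print(TRI_DGP_SCENE_JSON_NAME.format(scene_hash="abc123"))
# → scene_abc123.json
```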
dgp/contribs/dgp2wicker/requirements.in

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
+grpcio>=1.51.1
+grpcio-tools>=1.51.1
+protobuf>=3.20.3,<5.0.0
+wicker[spark]
+retry
dgp/contribs/dgp2wicker/requirements.lock

Lines changed: 50 additions & 0 deletions

@@ -0,0 +1,50 @@
+# This file was autogenerated by uv via the following command:
+#    uv pip compile requirements.in -o requirements.lock --resolver=backtracking
+boto3==1.37.36
+    # via wicker
+botocore==1.37.36
+    # via
+    #   boto3
+    #   s3transfer
+decorator==5.2.1
+    # via retry
+grpcio==1.70.0
+    # via
+    #   -r requirements.in
+    #   grpcio-tools
+grpcio-tools==1.62.3
+    # via -r requirements.in
+jmespath==1.0.1
+    # via
+    #   boto3
+    #   botocore
+numpy==1.24.4
+    # via
+    #   pyarrow
+    #   wicker
+protobuf==4.25.6
+    # via
+    #   -r requirements.in
+    #   grpcio-tools
+py==1.11.0
+    # via retry
+py4j==0.10.9.7
+    # via pyspark
+pyarrow==17.0.0
+    # via wicker
+pyspark==3.5.5
+    # via wicker
+python-dateutil==2.9.0.post0
+    # via botocore
+retry==0.9.2
+    # via -r requirements.in
+s3transfer==0.11.5
+    # via boto3
+setuptools==75.3.2
+    # via grpcio-tools
+six==1.17.0
+    # via python-dateutil
+urllib3==1.26.20
+    # via botocore
+wicker==0.0.16
+    # via -r requirements.in

dgp/contribs/dgp2wicker/requirements.txt

Lines changed: 0 additions & 2 deletions
This file was deleted.

dgp/contribs/dgp2wicker/setup.py

Lines changed: 12 additions & 13 deletions

@@ -19,35 +19,34 @@ def run(self):
         develop.run(self)


-__version__ = importlib.import_module('dgp2wicker').__version__
+__version__ = importlib.import_module("dgp2wicker").__version__

-with open('requirements.txt', 'r', encoding='utf-8') as f:
+with open("requirements.lock", "r", encoding="utf-8") as f:
     requirements = f.read().splitlines()

-with open('README.md', 'r', encoding='utf-8') as f:
+with open("README.md", "r", encoding="utf-8") as f:
     readme = f.read()

-packages = find_packages(exclude=['tests'])
+packages = find_packages(exclude=["tests"])
 setup(
     name="dgp2wicker",
     version=__version__,
     description="Tools to convert TRI's DGP to L5's Wicker format.",
     long_description=readme,
-    long_description_content_type='text/markdown',
+    long_description_content_type="text/markdown",
     author="Chris Ochoa, Kuan Lee",
-    author_email='charles.ochoa@woven-planet.global, kuan-hui.lee@woven-planet.global',
+    author_email="charles.ochoa@woven-planet.global, kuan-hui.lee@woven-planet.global",
     url="https://github.com/TRI-ML/dgp/tree/master/dgp/contribs/dgp2wicker",
     packages=packages,
-    entry_points={'console_scripts': [
-        'dgp2wicker=dgp2wicker.cli:cli',
+    entry_points={"console_scripts": [
+        "dgp2wicker=dgp2wicker.cli:cli",
     ]},
     include_package_data=True,
-    setup_requires=['cython==0.29.21', 'grpcio==1.41.0', 'grpcio-tools==1.41.0'],
     install_requires=requirements,
     zip_safe=False,
-    python_requires='>=3.7',
+    python_requires=">=3.8",
     cmdclass={
-        'install': CustomInstallCommand,
-        'develop': CustomDevelopCommand,
-    }
+        "install": CustomInstallCommand,
+        "develop": CustomDevelopCommand,
+    },
 )
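One caveat with the switch to reading requirements.lock in setup.py: `f.read().splitlines()` on a pip-tools/uv lock file also yields the indented `# via ...` annotation lines, which are not valid requirement specifiers; whether setuptools tolerates them varies by version. A defensive sketch (hypothetical helper, not part of this commit) that filters them out:

```python
# Hypothetical helper (not part of this commit): read a lock file
# for install_requires, dropping blank lines and the comment-only
# "# via ..." annotations that pip-tools/uv emit.
def read_lock(text):
    return [
        line for line in text.splitlines()
        if line.strip() and not line.lstrip().startswith("#")
    ]

lock = """\
# autogenerated
grpcio==1.70.0
    # via grpcio-tools
wicker==0.0.16
"""
print(read_lock(lock))  # → ['grpcio==1.70.0', 'wicker==0.0.16']
```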
