This repository was archived by the owner on Sep 2, 2025. It is now read-only.

Commit 8c0a192

Move code quality deps to precommit (#1291)
* move linter config into .pre-commit-config.yaml
* make updates from mypy: remove all unneeded type ignore comments
* update contributing guide to better reflect current state
1 parent ea848b0 commit 8c0a192

File tree

21 files changed (+121, -150 lines)
Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+kind: Under the Hood
+body: Simplify linting environment and dev dependencies
+time: 2024-07-18T19:32:06.044016-04:00
+custom:
+  Author: mikealfare
+  Issue: "1291"

.flake8

Lines changed: 0 additions & 16 deletions
This file was deleted.

.github/workflows/main.yml

Lines changed: 0 additions & 1 deletion
@@ -58,7 +58,6 @@ jobs:
           python -m pip install -r dev-requirements.txt
           python -m pip --version
           pre-commit --version
-          mypy --version
           dbt --version
       - name: Run pre-comit hooks
         run: pre-commit run --all-files --show-diff-on-failure

.pre-commit-config.yaml

Lines changed: 53 additions & 61 deletions
@@ -1,66 +1,58 @@
 # For more on configuring pre-commit hooks (see https://pre-commit.com/)
-
 default_language_version:
-  python: python3
+  python: python3

 repos:
-- repo: https://github.com/pre-commit/pre-commit-hooks
-  rev: v4.4.0
-  hooks:
-  - id: check-yaml
-    args: [--unsafe]
-  - id: check-json
-  - id: end-of-file-fixer
-  - id: trailing-whitespace
-  - id: check-case-conflict
-- repo: https://github.com/dbt-labs/pre-commit-hooks
-  rev: v0.1.0a1
-  hooks:
-  - id: dbt-core-in-adapters-check
-- repo: https://github.com/psf/black
-  rev: 23.1.0
-  hooks:
-  - id: black
-    additional_dependencies: ['click~=8.1']
-    args:
-    - "--line-length=99"
-    - "--target-version=py38"
-  - id: black
-    alias: black-check
-    stages: [manual]
-    additional_dependencies: ['click~=8.1']
-    args:
-    - "--line-length=99"
-    - "--target-version=py38"
-    - "--check"
-    - "--diff"
-- repo: https://github.com/pycqa/flake8
-  rev: 6.0.0
-  hooks:
-  - id: flake8
-  - id: flake8
-    alias: flake8-check
-    stages: [manual]
-- repo: https://github.com/pre-commit/mirrors-mypy
-  rev: v1.1.1
-  hooks:
-  - id: mypy
-    # N.B.: Mypy is... a bit fragile.
-    #
-    # By using `language: system` we run this hook in the local
-    # environment instead of a pre-commit isolated one. This is needed
-    # to ensure mypy correctly parses the project.
+    - repo: https://github.com/pre-commit/pre-commit-hooks
+      rev: v4.6.0
+      hooks:
+        - id: check-yaml
+          args: [--unsafe]
+        - id: check-json
+        - id: end-of-file-fixer
+        - id: trailing-whitespace
+        - id: check-case-conflict
+
+    - repo: https://github.com/dbt-labs/pre-commit-hooks
+      rev: v0.1.0a1
+      hooks:
+        - id: dbt-core-in-adapters-check
+
+    - repo: https://github.com/psf/black
+      rev: 24.4.2
+      hooks:
+        - id: black
+          args:
+            - --line-length=99
+            - --target-version=py38
+            - --target-version=py39
+            - --target-version=py310
+            - --target-version=py311
+          additional_dependencies: [flaky]
+
+    - repo: https://github.com/pycqa/flake8
+      rev: 7.0.0
+      hooks:
+        - id: flake8
+          exclude: tests/
+          args:
+            - --max-line-length=99
+            - --select=E,F,W
+            - --ignore=E203,E501,E741,W503,W504
+            - --per-file-ignores=*/__init__.py:F401

-    # It may cause trouble in that it adds environmental variables out
-    # of our control to the mix. Unfortunately, there's nothing we can
-    # do about per pre-commit's author.
-    # See https://github.com/pre-commit/pre-commit/issues/730 for details.
-    args: [--show-error-codes, --ignore-missing-imports, --explicit-package-bases]
-    files: ^dbt/adapters/.*
-    language: system
-  - id: mypy
-    alias: mypy-check
-    stages: [manual]
-    args: [--show-error-codes, --pretty, --ignore-missing-imports, --explicit-package-bases]
-    files: ^dbt/adapters
-    language: system
+    - repo: https://github.com/pre-commit/mirrors-mypy
+      rev: v1.10.0
+      hooks:
+        - id: mypy
+          args:
+            - --explicit-package-bases
+            - --ignore-missing-imports
+            - --pretty
+            - --show-error-codes
+            - --warn-unused-ignores
+          files: ^dbt/adapters/bigquery
+          additional_dependencies:
+            - types-protobuf
+            - types-pytz
+            - types-requests

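Two of the hooks this config keeps, `trailing-whitespace` and `end-of-file-fixer`, are plain text normalizers. A rough Python sketch of their combined effect (an illustration of the behavior, not the hooks' actual implementation):

```python
from pathlib import Path
import tempfile

def fix_file(path: Path) -> bool:
    """Strip trailing whitespace and ensure exactly one trailing newline.

    Returns True if the file was modified, mirroring how pre-commit
    reports "files were modified by this hook".
    """
    original = path.read_text()
    lines = [line.rstrip() for line in original.splitlines()]
    fixed = "\n".join(lines) + "\n"
    if fixed != original:
        path.write_text(fixed)
        return True
    return False

# A file with trailing spaces and no final newline gets normalized;
# a second run is a no-op.
with tempfile.TemporaryDirectory() as tmp:
    f = Path(tmp) / "example.py"
    f.write_text("x = 1   \ny = 2")
    assert fix_file(f) is True
    assert f.read_text() == "x = 1\ny = 2\n"
    assert fix_file(f) is False
```

Because the hooks are idempotent, re-running `pre-commit run --all-files` on a clean tree reports no changes.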
CONTRIBUTING.md

Lines changed: 1 addition & 1 deletion
@@ -67,7 +67,7 @@ $EDITOR test.env
 There are a few methods for running tests locally.

 #### `tox`
-`tox` takes care of managing Python virtualenvs and installing dependencies in order to run tests. You can also run tests in parallel, for example you can run unit tests for Python 3.8, Python 3.9, and `flake8` checks in parallel with `tox -p`. Also, you can run unit tests for specific python versions with `tox -e py38`. The configuration of these tests are located in `tox.ini`.
+`tox` takes care of managing Python virtualenvs and installing dependencies in order to run tests. You can also run tests in parallel, for example you can run unit tests for Python 3.8, Python 3.9, Python 3.10, and Python 3.11 in parallel with `tox -p`. Also, you can run unit tests for specific python versions with `tox -e py38`. The configuration of these tests are located in `tox.ini`.

 #### `pytest`
 Finally, you can also run a specific test or group of tests using `pytest` directly. With a Python virtualenv active and dev dependencies installed you can do things like:

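As a concrete example of the `pytest` workflow the guide describes, here is a hypothetical test file; the helper function, its behavior, and the file path are invented for illustration and are not taken from the dbt-bigquery test suite:

```python
# tests/unit/test_example.py (illustrative path)

def render_dataset(project: str, dataset: str) -> str:
    """Toy helper: render a BigQuery-style `project`.`dataset` reference."""
    return f"`{project}`.`{dataset}`"

def test_render_dataset():
    assert render_dataset("my-project", "analytics") == "`my-project`.`analytics`"

if __name__ == "__main__":
    # Runnable directly; pytest discovers test_render_dataset by name.
    test_render_dataset()
```

With dev dependencies installed, you could target just this test with `python -m pytest tests/unit/test_example.py::test_render_dataset`.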
dbt/adapters/bigquery/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -8,5 +8,5 @@
 from dbt.include import bigquery

 Plugin = AdapterPlugin(
-    adapter=BigQueryAdapter, credentials=BigQueryCredentials, include_path=bigquery.PACKAGE_PATH  # type: ignore[arg-type]
+    adapter=BigQueryAdapter, credentials=BigQueryCredentials, include_path=bigquery.PACKAGE_PATH
 )

dbt/adapters/bigquery/column.py

Lines changed: 3 additions & 3 deletions
@@ -18,7 +18,7 @@ class BigQueryColumn(Column):
         "INTEGER": "INT64",
     }
     fields: List[Self]  # type: ignore
-    mode: str  # type: ignore
+    mode: str

     def __init__(
         self,
@@ -110,7 +110,7 @@ def is_numeric(self) -> bool:
     def is_float(self):
         return self.dtype.lower() == "float64"

-    def can_expand_to(self: Self, other_column: Self) -> bool:  # type: ignore
+    def can_expand_to(self: Self, other_column: Self) -> bool:
         """returns True if both columns are strings"""
         return self.is_string() and other_column.is_string()

@@ -124,7 +124,7 @@ def column_to_bq_schema(self) -> SchemaField:
         fields = [field.column_to_bq_schema() for field in self.fields]  # type: ignore[attr-defined]
         kwargs = {"fields": fields}

-        return SchemaField(self.name, self.dtype, self.mode, **kwargs)  # type: ignore[arg-type]
+        return SchemaField(self.name, self.dtype, self.mode, **kwargs)


 def get_nested_column_data_types(

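The `can_expand_to` change above only drops a stale ignore comment; the logic itself is a simple both-sides-are-strings check. A self-contained sketch of that comparison (a simplified stand-in class, not the real `BigQueryColumn`, and the string-type list is an assumption for illustration):

```python
class SketchColumn:
    """Minimal stand-in for a column with a dtype and string-expansion check."""

    def __init__(self, name: str, dtype: str):
        self.name = name
        self.dtype = dtype

    def is_string(self) -> bool:
        # Illustrative set of string-ish type names, not dbt's exact list.
        return self.dtype.lower() in ("string", "text", "varchar", "character varying")

    def can_expand_to(self, other_column: "SketchColumn") -> bool:
        """True only when both columns are strings, as in the hunk above."""
        return self.is_string() and other_column.is_string()

a = SketchColumn("name", "STRING")
b = SketchColumn("name", "STRING")
c = SketchColumn("age", "INT64")
assert a.can_expand_to(b) is True
assert a.can_expand_to(c) is False
```

The design point: string columns can widen to hold any other string, while numeric widths can conflict, so only string-to-string expansion is allowed.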
dbt/adapters/bigquery/connections.py

Lines changed: 3 additions & 3 deletions
@@ -116,8 +116,8 @@ class BigQueryCredentials(Credentials):

     # BigQuery allows an empty database / project, where it defers to the
     # environment for the project
-    database: Optional[str] = None  # type: ignore
-    schema: Optional[str] = None  # type: ignore
+    database: Optional[str] = None
+    schema: Optional[str] = None
     execution_project: Optional[str] = None
     location: Optional[str] = None
     priority: Optional[Priority] = None
@@ -568,7 +568,7 @@ def execute(
         else:
             message = f"{code}"

-        response = BigQueryAdapterResponse(  # type: ignore[call-arg]
+        response = BigQueryAdapterResponse(
             _message=message,
             rows_affected=num_rows,
             code=code,

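The `database`/`schema` fields above stay `Optional[str] = None` because, per the comment in the hunk, BigQuery lets the project come from the environment. A minimal sketch of that deferral pattern; the resolver and its fallback order are illustrative, not dbt's code, though `GOOGLE_CLOUD_PROJECT` is the standard Google Cloud environment variable:

```python
import os
from dataclasses import dataclass
from typing import Optional

@dataclass
class SketchCredentials:
    database: Optional[str] = None  # BigQuery project; may defer to the env
    schema: Optional[str] = None    # BigQuery dataset

    def resolve_project(self) -> Optional[str]:
        # Explicit value wins; otherwise fall back to the environment.
        return self.database or os.environ.get("GOOGLE_CLOUD_PROJECT")

os.environ["GOOGLE_CLOUD_PROJECT"] = "env-project"
assert SketchCredentials(schema="analytics").resolve_project() == "env-project"
assert SketchCredentials(database="explicit").resolve_project() == "explicit"
```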
dbt/adapters/bigquery/dataproc/batch.py

Lines changed: 6 additions & 6 deletions
@@ -20,9 +20,9 @@ def create_batch_request(
     batch: Batch, batch_id: str, project: str, region: str
 ) -> CreateBatchRequest:
     return CreateBatchRequest(
-        parent=f"projects/{project}/locations/{region}",  # type: ignore
-        batch_id=batch_id,  # type: ignore
-        batch=batch,  # type: ignore
+        parent=f"projects/{project}/locations/{region}",
+        batch_id=batch_id,
+        batch=batch,
     )


@@ -35,10 +35,10 @@ def poll_batch_job(
     run_time = 0
     while state in _BATCH_RUNNING_STATES and run_time < timeout:
         time.sleep(1)
-        response = job_client.get_batch(  # type: ignore
-            request=GetBatchRequest(name=batch_name),  # type: ignore
+        response = job_client.get_batch(
+            request=GetBatchRequest(name=batch_name),
         )
-        run_time = datetime.now().timestamp() - response.create_time.timestamp()  # type: ignore
+        run_time = datetime.now().timestamp() - response.create_time.timestamp()
         state = response.state
     if not response:
         raise ValueError("No response from Dataproc")

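`poll_batch_job` above sleeps, re-fetches the batch, and tracks elapsed time until the batch leaves a running state or the timeout expires. A stripped-down version of that loop with a stubbed client (no Dataproc dependency; the state names and shortened sleep are stand-ins, not the real enum or interval):

```python
import time

_RUNNING_STATES = {"PENDING", "RUNNING"}  # stand-ins for Dataproc's Batch.State

def poll(job_client, batch_name: str, timeout: float) -> str:
    """Re-fetch the batch until it leaves a running state or times out."""
    start = time.monotonic()
    state = "PENDING"
    while state in _RUNNING_STATES and time.monotonic() - start < timeout:
        time.sleep(0.01)  # the real loop sleeps 1 second per iteration
        state = job_client.get_batch(batch_name)
    return state

class FakeClient:
    """Reports RUNNING twice, then SUCCEEDED."""
    def __init__(self):
        self.calls = 0
    def get_batch(self, name):
        self.calls += 1
        return "RUNNING" if self.calls < 3 else "SUCCEEDED"

assert poll(FakeClient(), "projects/p/locations/r/batches/b", timeout=5) == "SUCCEEDED"
```

If the client never leaves a running state, the loop exits via the timeout and returns the last observed state, which is why the real function also checks the final response before returning.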
dbt/adapters/bigquery/impl.py

Lines changed: 18 additions & 14 deletions
@@ -22,7 +22,7 @@
 from dbt.adapters.contracts.relation import RelationConfig

 import dbt_common.exceptions.base
-from dbt.adapters.base import (  # type: ignore
+from dbt.adapters.base import (
     AdapterConfig,
     BaseAdapter,
     BaseRelation,
@@ -33,11 +33,15 @@
     available,
 )
 from dbt.adapters.base.impl import FreshnessResponse
-from dbt.adapters.cache import _make_ref_key_dict  # type: ignore
+from dbt.adapters.cache import _make_ref_key_dict
 from dbt.adapters.capability import Capability, CapabilityDict, CapabilitySupport, Support
 from dbt.adapters.contracts.connection import AdapterResponse
 from dbt.adapters.contracts.macros import MacroResolverProtocol
-from dbt_common.contracts.constraints import ColumnLevelConstraint, ConstraintType, ModelLevelConstraint  # type: ignore
+from dbt_common.contracts.constraints import (
+    ColumnLevelConstraint,
+    ConstraintType,
+    ModelLevelConstraint,
+)
 from dbt_common.dataclass_schema import dbtClassMixin
 from dbt.adapters.events.logging import AdapterLogger
 from dbt_common.events.functions import fire_event
@@ -163,7 +167,7 @@ def is_cancelable(cls) -> bool:
         return False

     def drop_relation(self, relation: BigQueryRelation) -> None:
-        is_cached = self._schema_is_cached(relation.database, relation.schema)  # type: ignore[arg-type]
+        is_cached = self._schema_is_cached(relation.database, relation.schema)
         if is_cached:
             self.cache_dropped(relation)

@@ -258,7 +262,7 @@ def add_time_ingestion_partition_column(self, partition_by, columns) -> List[Big
         )
         return columns

-    def expand_column_types(self, goal: BigQueryRelation, current: BigQueryRelation) -> None:  # type: ignore[override]
+    def expand_column_types(self, goal: BigQueryRelation, current: BigQueryRelation) -> None:
         # This is a no-op on BigQuery
         pass

@@ -323,7 +327,7 @@ def get_relation(
     # TODO: the code below is copy-pasted from SQLAdapter.create_schema. Is there a better way?
     def create_schema(self, relation: BigQueryRelation) -> None:
         # use SQL 'create schema'
-        relation = relation.without_identifier()  # type: ignore
+        relation = relation.without_identifier()

         fire_event(SchemaCreation(relation=_make_ref_key_dict(relation)))
         kwargs = {
@@ -410,7 +414,7 @@ def _agate_to_schema(
         for idx, col_name in enumerate(agate_table.column_names):
             inferred_type = self.convert_agate_type(agate_table, idx)
             type_ = column_override.get(col_name, inferred_type)
-            bq_schema.append(SchemaField(col_name, type_))  # type: ignore[arg-type]
+            bq_schema.append(SchemaField(col_name, type_))
         return bq_schema

     @available.parse(lambda *a, **k: "")
@@ -736,8 +740,8 @@ def _get_catalog_schemas(self, relation_config: Iterable[RelationConfig]) -> Sch
         for candidate, schemas in candidates.items():
             database = candidate.database
             if database not in db_schemas:
-                db_schemas[database] = set(self.list_schemas(database))  # type: ignore[index]
-            if candidate.schema in db_schemas[database]:  # type: ignore[index]
+                db_schemas[database] = set(self.list_schemas(database))
+            if candidate.schema in db_schemas[database]:
                 result[candidate] = schemas
             else:
                 logger.debug(
@@ -844,7 +848,7 @@ def describe_relation(
         return None

     @available.parse_none
-    def grant_access_to(self, entity, entity_type, role, grant_target_dict):
+    def grant_access_to(self, entity, entity_type, role, grant_target_dict) -> None:
         """
         Given an entity, grants it access to a dataset.
         """
@@ -873,7 +877,7 @@ def get_dataset_location(self, relation):
         dataset = client.get_dataset(dataset_ref)
         return dataset.location

-    def get_rows_different_sql(  # type: ignore[override]
+    def get_rows_different_sql(
         self,
         relation_a: BigQueryRelation,
         relation_b: BigQueryRelation,
@@ -921,7 +925,7 @@ def run_sql_for_tests(self, sql, fetch, conn=None):
             return list(res)

     def generate_python_submission_response(self, submission_result) -> BigQueryAdapterResponse:
-        return BigQueryAdapterResponse(_message="OK")  # type: ignore[call-arg]
+        return BigQueryAdapterResponse(_message="OK")

     @property
     def default_python_submission_method(self) -> str:
@@ -961,7 +965,7 @@ def render_raw_columns_constraints(cls, raw_columns: Dict[str, Dict[str, Any]])

     @classmethod
     def render_column_constraint(cls, constraint: ColumnLevelConstraint) -> Optional[str]:
-        c = super().render_column_constraint(constraint)  # type: ignore
+        c = super().render_column_constraint(constraint)
         if (
             constraint.type == ConstraintType.primary_key
             or constraint.type == ConstraintType.foreign_key
@@ -971,7 +975,7 @@ def render_column_constraint(cls, constraint: ColumnLevelConstraint) -> Optional

     @classmethod
     def render_model_constraint(cls, constraint: ModelLevelConstraint) -> Optional[str]:
-        c = super().render_model_constraint(constraint)  # type: ignore
+        c = super().render_model_constraint(constraint)
         if (
             constraint.type == ConstraintType.primary_key
             or constraint.type == ConstraintType.foreign_key

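Both `render_*_constraint` overrides in impl.py follow the same pattern: call the base renderer, then special-case primary and foreign keys (the branch bodies fall outside the hunks shown). A sketch of that call-super-then-adjust pattern; the `not enforced` suffix is an assumption about what the branch does, motivated by BigQuery not enforcing these constraints, and is not copied from the source:

```python
from enum import Enum
from typing import Optional

class ConstraintType(Enum):
    not_null = "not_null"
    primary_key = "primary_key"
    foreign_key = "foreign_key"

class BaseRenderer:
    @classmethod
    def render(cls, ctype: ConstraintType) -> Optional[str]:
        # Generic SQL rendering for a constraint type.
        return {
            "not_null": "not null",
            "primary_key": "primary key",
            "foreign_key": "foreign key",
        }.get(ctype.value)

class SketchBigQueryRenderer(BaseRenderer):
    @classmethod
    def render(cls, ctype: ConstraintType) -> Optional[str]:
        c = super().render(ctype)  # no cast or ignore needed once bases are typed
        if c and ctype in (ConstraintType.primary_key, ConstraintType.foreign_key):
            c = f"{c} not enforced"  # illustrative BigQuery-specific adjustment
        return c

assert SketchBigQueryRenderer.render(ConstraintType.not_null) == "not null"
assert SketchBigQueryRenderer.render(ConstraintType.primary_key) == "primary key not enforced"
```

Once the base class annotates its return type, the `super()` call type-checks cleanly, which is exactly why the `# type: ignore` comments in this commit became removable under `--warn-unused-ignores`.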
0 commit comments
