
Commit 75ff846

Merge pull request #151 from jacobm001/issue-127
Issue #127 - Respect SQL Server's Parameter Limit
2 parents 4f2004c + 3fc968e commit 75ff846

File tree

3 files changed: +40 −3 lines changed


CHANGELOG.md

Lines changed: 11 additions & 0 deletions

```diff
@@ -1,4 +1,15 @@
 # Changelog
+
+### v0.20.0
+
+#### features
+
+- users can now declare a custom `max_batch_size` in the project configuration to set the batch size used by the seed file loader. [#127](https://github.com/dbt-msft/dbt-sqlserver/issues/127) and [#151](https://github.com/dbt-msft/dbt-sqlserver/pull/151) thanks [@jacobm001](https://github.com/jacobm001)
+
+#### under the hood
+
+- `sqlserver__load_csv_rows` now has a safety check provided by `calc_batch_size()` to ensure the insert statements won't exceed SQL Server's 2100 parameter limit. [#127](https://github.com/dbt-msft/dbt-sqlserver/issues/127) and [#151](https://github.com/dbt-msft/dbt-sqlserver/pull/151) thanks [@jacobm001](https://github.com/jacobm001)
+
 ### v0.19.2

 #### fixes
```

README.md

Lines changed: 9 additions & 0 deletions

```diff
@@ -137,6 +137,15 @@ client_secret: clientsecret

 ### Seeds

+By default, dbt-sqlserver will attempt to insert seed files in batches of 400 rows. If this exceeds SQL Server's 2100 parameter limit, the adapter will automatically limit to the highest safe value possible.
+
+To set a different default batch size, you can set the variable `max_batch_size` in your project configuration.
+
+```yaml
+vars:
+  max_batch_size: 200 # Any integer less than or equal to 2100 will do.
+```
+
 ### Hooks

 ### Custom schemas
```
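As a sanity check on the numbers above (an illustrative sketch, not adapter code): a batched INSERT binds one parameter per cell, so rows times columns must stay within SQL Server's 2100-parameter cap.

```python
# Illustrative sketch only (not part of the adapter): each cell in a
# batched INSERT consumes one bind parameter, so the parameter count
# for a batch is simply rows * columns.
SQL_SERVER_PARAM_LIMIT = 2100

def params_needed(rows: int, columns: int) -> int:
    """Bind parameters consumed by one batched INSERT."""
    return rows * columns

# The default batch of 400 rows fits a 5-column seed
# (400 * 5 = 2000 <= 2100) but not a 6-column one (2400 > 2100).
print(params_needed(400, 5) <= SQL_SERVER_PARAM_LIMIT)  # True
print(params_needed(400, 6) <= SQL_SERVER_PARAM_LIMIT)  # False
```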

dbt/include/sqlserver/macros/materializations/seed/seed.sql

Lines changed: 20 additions & 3 deletions

```diff
@@ -1,7 +1,23 @@
-{% macro sqlserver__basic_load_csv_rows(model, batch_size, agate_table) %}
+{% macro calc_batch_size(num_columns,max_batch_size) %}
+    {#
+        SQL Server allows for a max of 2100 parameters in a single statement.
+        Check if the max_batch_size fits with the number of columns, otherwise
+        reduce the batch size so it fits.
+    #}
+    {% if num_columns * max_batch_size < 2100 %}
+        {% set batch_size = max_batch_size %}
+    {% else %}
+        {% set batch_size = (2100 / num_columns)|int %}
+    {% endif %}
+
+    {{ return(batch_size) }}
+{% endmacro %}
+
+{% macro sqlserver__basic_load_csv_rows(model, max_batch_size, agate_table) %}
     {% set cols_sql = get_seed_column_quoted_csv(model, agate_table.column_names) %}
-    {% set bindings = [] %}

+    {% set batch_size = calc_batch_size(cols_sql|length, max_batch_size) %}
+    {% set bindings = [] %}
     {% set statements = [] %}

     {% for chunk in agate_table.rows | batch(batch_size) %}
@@ -34,5 +50,6 @@
 {% endmacro %}

 {% macro sqlserver__load_csv_rows(model, agate_table) %}
-    {{ return(sqlserver__basic_load_csv_rows(model, 200, agate_table) )}}
+    {% set max_batch_size = var("max_batch_size", 400) %}
+    {{ return(sqlserver__basic_load_csv_rows(model, max_batch_size, agate_table) )}}
 {% endmacro %}
```
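The clamping arithmetic in `calc_batch_size()` can be re-expressed in plain Python (a hedged sketch of the macro's logic, not code from the adapter). Unlike the macro, which is passed `cols_sql|length`, this sketch takes the column count directly as an integer:

```python
# Plain-Python sketch (illustrative, not adapter code) of the
# calc_batch_size() macro above: keep the requested batch size unless
# columns * batch would reach SQL Server's 2100-parameter limit, then
# fall back to integer division, mirroring the (2100 / num_columns)|int
# Jinja filter.
SQL_SERVER_PARAM_LIMIT = 2100

def calc_batch_size(num_columns: int, max_batch_size: int) -> int:
    if num_columns * max_batch_size < SQL_SERVER_PARAM_LIMIT:
        return max_batch_size
    return SQL_SERVER_PARAM_LIMIT // num_columns

# A 3-column seed keeps the default batch of 400 (1200 parameters),
# while a 7-column seed is clamped to 2100 // 7 = 300 rows.
print(calc_batch_size(3, 400))  # 400
print(calc_batch_size(7, 400))  # 300
```

Per the final hunk above, `sqlserver__load_csv_rows` obtains `max_batch_size` from `var("max_batch_size", 400)`, so the 400-row default applies whenever the project configuration does not override it.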
