Commit eb0a0c8

Increase max bulk_batch_size
The previous limit was a max of 1000 query parameters. This is changed to a max of 1000 rows (the maximum allowed rows for an insert) or 2050 query parameters (MS SQL reports a maximum of 2100 parameters, but a few are reserved for executing the query).
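The sizing rule described in the commit message can be sketched as a small standalone function. This is an illustration of the arithmetic, not the patched method itself; the constant names and `batch_size` helper are hypothetical:

```python
# Sketch of the new batch-size rule: a batch is bounded both by SQL Server's
# 1000-row cap on INSERT ... VALUES and by a parameter budget of 2050
# (2100 reported by the server, minus headroom for sp_executesql's own
# parameters).

MAX_INSERT_ROWS = 1000
MAX_QUERY_PARAMS = 2050

def batch_size(field_count, obj_count):
    """Largest batch that stays within both server limits."""
    rows = min(obj_count, MAX_INSERT_ROWS)
    if rows * field_count <= MAX_QUERY_PARAMS:
        return rows
    return MAX_QUERY_PARAMS // field_count

# e.g. a model with 5 concrete fields and 10_000 objects to insert:
print(batch_size(5, 10_000))   # 2050 // 5 = 410 rows per batch
print(batch_size(1, 10_000))   # capped by the 1000-row insert limit
```

Under the old code, 5 fields would have allowed only 1000 // 5 = 200 rows per batch; the new parameter budget roughly doubles the batch size for wide models while still respecting the 1000-row insert cap for narrow ones.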
1 parent: 7982506 · commit: eb0a0c8

1 file changed: +11 −3 lines changed
sql_server/pyodbc/operations.py

Lines changed: 11 additions & 3 deletions
@@ -43,11 +43,19 @@ def bulk_batch_size(self, fields, objs):
         are the fields going to be inserted in the batch, the objs contains
         all the objects to be inserted.
         """
-        objs_len, fields_len, max_row_values = len(objs), len(fields), 1000
-        if (objs_len * fields_len) <= max_row_values:
+        # inserts are capped at 1000 rows. Other operations do not have this
+        # limit.
+        objs_len = min(len(objs), 1000)
+        fields_len = len(fields)
+        # MSSQL allows a query to have 2100 parameters but some parameters are
+        # taken up defining `NVARCHAR` parameters to store the query text and
+        # query parameters for the `sp_executesql` call. This should only take
+        # up 2 parameters but I've had this error when sending 2098 parameters.
+        max_query_params = 2050
+        if (objs_len * fields_len) <= max_query_params:
             size = objs_len
         else:
-            size = max_row_values // fields_len
+            size = max_query_params // fields_len
         return size
 
     def bulk_insert_sql(self, fields, placeholder_rows):
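To see the effect of the patched logic end to end, the following sketch chunks a list of objects by the computed batch size and checks that every batch respects both server limits. All names here are illustrative; this is not code from the repository:

```python
# Sketch: chunk 3000 "objects" for a hypothetical 7-field model and verify
# that each batch stays within SQL Server's row and parameter limits.

MAX_INSERT_ROWS = 1000
MAX_QUERY_PARAMS = 2050

def batch_size(field_count, obj_count):
    # Mirrors the sizing rule introduced by this commit.
    rows = min(obj_count, MAX_INSERT_ROWS)
    if rows * field_count <= MAX_QUERY_PARAMS:
        return rows
    return MAX_QUERY_PARAMS // field_count

objs = list(range(3000))
fields = 7
size = batch_size(fields, len(objs))            # 2050 // 7 = 292
batches = [objs[i:i + size] for i in range(0, len(objs), size)]

for batch in batches:
    assert len(batch) <= MAX_INSERT_ROWS        # insert row cap
    assert len(batch) * fields <= MAX_QUERY_PARAMS  # parameter cap
print(len(batches), "batches of up to", size, "rows")
```

With the old 1000-parameter cutoff the same model would have been limited to 1000 // 7 = 142 rows per batch, so the change roughly doubles the rows sent per round trip.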
