Commit 61e50ea

Increase max bulk_batch_size
The previous limit was a maximum of 1000 query parameters. This is changed to a maximum of 1000 rows (the most rows allowed in a single INSERT) or 2050 query parameters (MS SQL reports a maximum of 2100 parameters, but a few of those are reserved for executing the query).
1 parent 7982506 commit 61e50ea

File tree

1 file changed

+9
-6
lines changed


sql_server/pyodbc/operations.py

Lines changed: 9 additions & 6 deletions
@@ -43,12 +43,15 @@ def bulk_batch_size(self, fields, objs):
         are the fields going to be inserted in the batch, the objs contains
         all the objects to be inserted.
         """
-        objs_len, fields_len, max_row_values = len(objs), len(fields), 1000
-        if (objs_len * fields_len) <= max_row_values:
-            size = objs_len
-        else:
-            size = max_row_values // fields_len
-        return size
+        fields_len = len(fields)
+        # MSSQL allows a query to have 2100 parameters but some parameters are
+        # taken up defining `NVARCHAR` parameters to store the query text and
+        # query parameters for the `sp_executesql` call. This should only take
+        # up 2 parameters but I've had this error when sending 2098 parameters.
+        max_query_params = 2050
+        # inserts are capped at 1000 rows. Other operations do not have this
+        # limit.
+        return min(1000, max_query_params // fields_len)
 
     def bulk_insert_sql(self, fields, placeholder_rows):
         placeholder_rows_sql = (", ".join(row) for row in placeholder_rows)
