We tried to build a Benthos flow that reads records from a file and inserts them into a table in Microsoft SQL Server 2019.
We used an `unarchive` processor with `json_array` and then sent the messages to the output section:
```yaml
...
output:
  sql_insert:
    ...
```
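For context, a minimal sketch of the input side of the flow described above (the file path and codec are assumptions, not taken from our actual config):

```yaml
input:
  file:
    paths: [ ./records.json ]  # hypothetical path to the file of records
    codec: all-bytes           # read the whole file as one message

pipeline:
  processors:
    # Split the top-level JSON array into one message per element
    - unarchive:
        format: json_array
```

Each array element then reaches the `sql_insert` output as an individual message.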
We tried to use `batching` with `sql_insert`, but it failed both with and without that section.
ERROR:
The incoming request has too many parameters. The server supports a maximum of 2100 parameters
Our workaround for now is to use `sql_raw` and issue one INSERT query per row. But we know this is not an efficient way to load the database (no batched inserts).
Is there a way to insert multiple rows in bulk while limiting the maximum number of rows per INSERT?
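One possible direction, sketched here as an assumption rather than a confirmed fix: `sql_insert` accepts a `batching` block, and since SQL Server allows at most 2100 parameters per request, the batch `count` could be capped so that rows × columns stays under that limit. The driver DSN, table, and column names below are placeholders:

```yaml
output:
  sql_insert:
    driver: mssql
    dsn: "sqlserver://user:pass@host:1433?database=mydb"  # placeholder DSN
    table: my_table                                       # placeholder table
    columns: [ col_a, col_b, col_c ]                      # 3 parameters per row
    args_mapping: root = [ this.col_a, this.col_b, this.col_c ]
    batching:
      # 2100 parameters / 3 columns = 700 rows max per insert;
      # 500 leaves some headroom below the server limit.
      count: 500
```

Whether the error persists would depend on the batch actually being split before the insert is built.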