Hey, I noticed a bug in the `replace_data_table_rows` method: the `row_sizes` list is never populated. The validation loop computes each row's size but never appends it:
row_sizes = []
# Validate each row isn't too large before processing
for i, row in enumerate(rows):
    row_size = _estimate_row_json_size(row)
    if row_size > 4000000:
        raise SecOpsError(
            "Single row is too large to process "
            f"(>{row_size} bytes): {rows[i][:100]}..."
        )
all_responses = []
but the list is read later, so it is always empty at that point:

first_batch_sizes = row_sizes[:first_batch_size]
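A minimal sketch of the fix would be to append each computed size inside the validation loop. The helper below is illustrative only: `_estimate_row_json_size` is re-implemented here as a rough `json.dumps` length so the sketch is self-contained, and `ValueError` stands in for the library's `SecOpsError`.

```python
import json


def _estimate_row_json_size(row):
    # Stand-in for the library helper: rough size of the row once JSON-encoded.
    return len(json.dumps(row))


MAX_ROW_SIZE = 4000000  # per-row limit from the original check


def validate_and_measure_rows(rows):
    """Sketch of the fix: populate row_sizes while validating each row."""
    row_sizes = []
    for i, row in enumerate(rows):
        row_size = _estimate_row_json_size(row)
        if row_size > MAX_ROW_SIZE:
            # ValueError used here in place of SecOpsError for a runnable sketch.
            raise ValueError(
                "Single row is too large to process "
                f"(>{row_size} bytes): {rows[i][:100]}..."
            )
        row_sizes.append(row_size)  # the missing line
    return row_sizes
```

With the append in place, the later `row_sizes[:first_batch_size]` slice actually sees one size per row instead of an empty list.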