Question
I wrote an application that syncs data from BigQuery to a MySQL database. Every 3 hours it inserts roughly 10-20k rows in batches (up to 10 items per batch). For some reason, when it tries to upsert these rows into MySQL, I get the following error:

Can't create more than max_prepared_stmt_count statements:
Error 1461: Can't create more than max_prepared_stmt_count statements (current value: 2000)
My "relevant code":
// ProcessProjectSkuCost receives the given sku cost entries and sends them in batches to upsertProjectSkuCosts()
func ProcessProjectSkuCost(done <-chan bigquery.SkuCost) {
    var skuCosts []bigquery.SkuCost
    var rowsAffected int64
    for skuCostRow := range done {
        skuCosts = append(skuCosts, skuCostRow)
        if len(skuCosts) == 10 {
            rowsAffected += upsertProjectSkuCosts(skuCosts)
            skuCosts = []bigquery.SkuCost{}
        }
    }
    if len(skuCosts) > 0 {
        rowsAffected += upsertProjectSkuCosts(skuCosts)
    }
    log.Infof("Completed upserting project sku costs. Affected rows: '%d'", rowsAffected)
}

// upsertProjectSkuCosts inserts or updates ProjectSkuCosts into SQL in batches
func upsertProjectSkuCosts(skuCosts []bigquery.SkuCost) int64 {
    // properties are table fields
    tableFields := []string{"project_name", "sku_id", "sku_description", "usage_start_time", "usage_end_time",
        "cost", "currency", "usage_amount", "usage_unit", "usage_amount_in_pricing_units", "usage_pricing_unit",
        "invoice_month"}
    tableFieldString := fmt.Sprintf("(%s)", strings.Join(tableFields, ","))
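The snippet ends before the query is actually assembled. Purely as an illustration (the table name and query shape below are assumptions, not taken from the original), a multi-row upsert for one batch can be built as a single statement with one `(?, ?, ...)` placeholder group per row, so each batch needs only one prepared statement:

```go
package main

import (
	"fmt"
	"strings"
)

// buildUpsertQuery assembles one multi-row INSERT ... ON DUPLICATE KEY UPDATE
// statement: a placeholder group per row, and every field updated on conflict.
func buildUpsertQuery(table string, fields []string, rows int) string {
	// "(?,?,...)" with one "?" per field
	placeholderRow := "(" + strings.TrimRight(strings.Repeat("?,", len(fields)), ",") + ")"
	placeholders := make([]string, rows)
	for i := range placeholders {
		placeholders[i] = placeholderRow
	}
	// "field=VALUES(field)" for each field
	updates := make([]string, len(fields))
	for i, f := range fields {
		updates[i] = fmt.Sprintf("%s=VALUES(%s)", f, f)
	}
	return fmt.Sprintf("INSERT INTO %s (%s) VALUES %s ON DUPLICATE KEY UPDATE %s",
		table, strings.Join(fields, ","), strings.Join(placeholders, ","),
		strings.Join(updates, ","))
}

func main() {
	// hypothetical table name; the original does not show it
	q := buildUpsertQuery("project_sku_costs", []string{"project_name", "sku_id", "cost"}, 2)
	fmt.Println(q)
}
```

The arguments for all rows in the batch are then flattened into one `db.Exec(query, args...)` call.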
My questions:

Why am I hitting max_prepared_stmt_count when I execute the prepared statements immediately (see the function upsertProjectSkuCosts)?

I can only imagine this is some kind of concurrency that creates a large number of prepared statements in the window between preparing and executing all of them. On the other hand, I don't understand why there would be so much concurrency, since the channel in ProcessProjectSkuCost is a buffered channel of size 20.
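One possibility worth checking (an assumption, since the full body of upsertProjectSkuCosts is not shown): if each batch calls db.Prepare without a matching stmt.Close(), every batch leaks one server-side prepared statement, and no concurrency is needed at all; the numbers line up exactly with the limit:

```go
package main

import "fmt"

func main() {
	const rows = 20000     // upper end of the 10-20k rows synced per run
	const batchSize = 10   // batch size used in ProcessProjectSkuCost
	const stmtLimit = 2000 // MySQL's default max_prepared_stmt_count

	// one Prepare per batch, never closed => one leaked statement per batch
	batches := rows / batchSize
	fmt.Println(batches, batches >= stmtLimit)
}
```

Calling `defer stmt.Close()` immediately after a successful Prepare (or preparing the statement once and reusing it across batches) keeps the server-side statement count flat instead of growing by one per batch.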