
bulk insert errors, data gets lost #91

Closed
@WangXiangUSTC

Description


I'm running go-mysql-elasticsearch: it first mysqldumps the existing rows in the table, then inserts them into Elasticsearch. The table only has a few hundred thousand rows, so the data volume isn't large, but the process frequently fails with:

2017/03/30 17:52:25 sync.go:340: [error] index index: rolename_test, type: rolename, id: 9099892, status: 429, error: {"type":"es_rejected_execution_exception","reason":"rejected execution of org.elasticsearch.transport.TransportService$4@526d2921 on EsThreadPoolExecutor[bulk, queue capacity = 50, org.elasticsearch.common.util.concurrent.EsThreadPoolExecutor@730af97f[Running, pool size = 32, active threads = 32, queued tasks = 50, completed tasks = 1258381273]]"}
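The `queue capacity = 50` in the log above is Elasticsearch's bulk thread pool queue: once 32 threads are busy and 50 requests are queued, further bulk requests are rejected with status 429. One server-side mitigation is raising that queue size in elasticsearch.yml (the setting name below is for ES 5.x; older 2.x versions spell it `threadpool.bulk.queue_size` — check your version's docs):

```
# elasticsearch.yml — enlarge the bulk thread pool queue (ES 5.x naming)
thread_pool.bulk.queue_size: 500
```

This only buys headroom; if the client keeps sending faster than the cluster can index, requests will eventually be rejected again.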

My Elasticsearch deployment is a cluster, and existing workloads already ingest over a thousand log entries per second. Could the problem be that go-mysql-elasticsearch doesn't set a bulk size when bulk-inserting data?
