Rails + Searchkick: fixing the circuit_breaking_exception "Data too large" error by splitting the reindex (Parallel Reindexing)

When I tried to reindex Elasticsearch with Searchkick in a Rails app, it failed with an error saying the data was too large:

Elasticsearch::Transport::Transport::ServerError · [429] {"error":{"root_cause":[{"type":"circuit_breaking_exception","reason":"[parent] Data too large, data for [<http_request>] would be [7474977728/6.9gb], which is larger than the limit of [7429763891/6.9gb], real usage: [7473437560/6.9gb], new bytes reserved: [1540168/1.4mb]","bytes_wanted":7474977728,"bytes_limit":7429763891,"durability":"PERMANENT"}],"type":"circuit_breaking_exception","reason":"[parent] Data too large, data for [<http_request>] would be [7474977728/6.9gb], which is larger than the limit of [7429763891/6.9gb], real usage: [7473437560/6.9gb], new bytes reserved: [1540168/1.4mb]","bytes_wanted":7474977728,"bytes_limit":7429763891,"durability":"PERMANENT"},"status":429}
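Before splitting the work up, it can help to confirm which circuit breaker is tripping and how close the heap is to its limit. A minimal check, assuming the elasticsearch-ruby client that Searchkick exposes as Searchkick.client and the standard nodes stats API:

# Show parent circuit breaker usage per node (the breaker named in the error above)
stats = Searchkick.client.nodes.stats(metric: "breaker")
stats["nodes"].each_value do |node|
  parent = node["breakers"]["parent"]
  puts "#{node['name']}: #{parent['estimated_size']} used / #{parent['limit_size']} limit (tripped #{parent['tripped']} times)"
end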

Apparently this can be solved with Parallel Reindexing, which pushes the work to background jobs (Sidekiq in this case):

GitHub - ankane/searchkick: Intelligent search made easy with Rails and Elasticsearch

Product.reindex(async: true)

# There is no built-in callback for when the async reindex finishes,
# so promote (swap) the alias to the new index once it is done
Product.search_index.promote(index_name)
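Put together, a manual flow might look like the sketch below. The index_name comes from the return value of reindex(async: true), which in recent Searchkick versions is a hash containing the new index name (it is also printed in the "Created index:" line of the output):

# Kick off the async reindex; Sidekiq workers fill the new index in batches
result = Product.reindex(async: true)
index_name = result[:index_name]  # e.g. "products_20201109161642890" (key name per the Searchkick README)

# ... later, once all batches have been processed ...
Product.search_index.promote(index_name)  # point the products alias at the new index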

To wait for the reindex to finish and switch the alias over automatically:

Searchkick.redis = Redis.new
Product.reindex(async: {wait: true})

That is what the docs say, but "Batches left: 1" kept printing forever:

Product.reindex(async: {wait: true})
=> Enqueued Searchkick::BulkReindexJob (Job ID: 12345) to Sidekiq(searchkick) with arguments: {:class_name=>"Product", :index_name=>"products_20201109161642890", :batch_id=>1, :min_id=>1, :max_id=>1000}
Created index: products_20201109161642890
Batches left: 1
Batches left: 1
Batches left: 1
Batches left: 1
Batches left: 1
...
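Incidentally, once Searchkick.redis is set, the number of outstanding batches can also be checked programmatically, which is handy for seeing whether anything is moving at all. The index name here is the one from the "Created index:" line above:

Searchkick.reindex_status("products_20201109161642890")
# => something like {completed: false, batches_left: 1}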

After some digging, it turned out that the Searchkick.redis = Redis.new setting also has to be applied on the Sidekiq side, i.e. in a file that the worker process loads.

Adding the following to config/initializers/elasticsearch.rb (which both the Rails server and the Sidekiq workers load) got Parallel Reindexing working:

Searchkick.redis = Redis.new
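Note that Redis.new with no arguments connects to localhost. In a real deployment you would likely point it at REDIS_URL and, since the connection is shared across Puma and Sidekiq threads, wrap it in a connection pool as the Searchkick README suggests. A sketch, assuming REDIS_URL is set in production and the connection_pool gem is available:

# config/initializers/elasticsearch.rb
require "connection_pool"

# Searchkick uses this Redis connection to track async reindex batches.
# Both the Rails server and the Sidekiq workers load this initializer,
# so they all agree on where the batch counters live.
Searchkick.redis = ConnectionPool.new(size: 5) do
  Redis.new(url: ENV.fetch("REDIS_URL", "redis://localhost:6379"))
end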