Otis Gospodnetic 2011-11-16, 00:39
What you read is valid advice. Just don't commit that often, or even at all until the very end if you can wait. :)
And make sure you are indexing to a machine that doesn't have to warm caches and open a new searcher every time you commit.
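For illustration, here is a minimal SolrJ sketch of that approach (a sketch only, assuming Solr 3.x; the URL, field names, and document counts below are placeholders, not taken from your setup). StreamingUpdateSolrServer buffers adds on its own background threads, so the client just keeps adding and issues a single commit at the very end:

    import java.util.ArrayList;
    import java.util.Collection;

    import org.apache.solr.client.solrj.impl.StreamingUpdateSolrServer;
    import org.apache.solr.common.SolrInputDocument;

    public class BulkIndexer {
        public static void main(String[] args) throws Exception {
            // Placeholder URL; internal queue of 500 docs, 5 writer threads.
            StreamingUpdateSolrServer server = new StreamingUpdateSolrServer(
                    "http://localhost:8983/solr", 500, 5);

            Collection<SolrInputDocument> batch = new ArrayList<SolrInputDocument>();
            for (int i = 0; i < 1000000; i++) {   // stand-in for the DB read loop
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", Integer.toString(i));
                doc.addField("name", "doc " + i);
                batch.add(doc);
                if (batch.size() == 500) {
                    server.add(batch);            // add WITHOUT committing
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                server.add(batch);                // flush the final partial chunk
            }
            server.blockUntilFinished();          // drain the internal queue
            server.commit();   // one commit at the end opens a single new searcher
        }
    }

If you can't hold off on committing until the very end, a middle ground is to stop committing from the client entirely and let the server commit on a schedule via autoCommit in solrconfig.xml.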
Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch
Lucene ecosystem search :: http://search-lucene.com/
>From: Jason Toy <[EMAIL PROTECTED]>
>To: [EMAIL PROTECTED]
>Sent: Tuesday, November 15, 2011 3:41 PM
>Subject: getting lots of errors doing bulk insertion
>I've written a script that does bulk insertion from my database; it
>grabs chunks of 500 docs (out of 100 million) and inserts them into
>Solr over HTTP. I have 5 threads that are inserting from a queue.
>After each insert I issue a commit.
>Every 20 or so inserts, I get this error message:
>Error: Error opening new searcher. exceeded limit of
>maxWarmingSearchers=2, try again later.
>I saw that people suggest reducing the commit frequency to fix this.
>Is this really the way I need to fix this? The reason I was
>committing after every insert of 500 docs is so there would not be
>too much uncommitted data.
>- sent from my mobile