lucene-solr-user mailing list archives

From "anand.mahajan" <an...@zerebral.co.in>
Subject Re: SolrCloud Scale Struggle
Date Fri, 01 Aug 2014 13:56:34 GMT
Thanks for the reply Shalin.

1. I'll try increasing the softCommit interval and the autoSoftCommit settings.
One mistake I realized just now is that I am querying /solr/select and
expecting it to behave like NRT - for real-time get it's the /get handler
that needs to be used. Please confirm.
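For reference, here is a minimal sketch of the relevant solrconfig.xml pieces as I understand them - the interval values are placeholders, not recommendations:

```xml
<!-- solrconfig.xml: soft commits open a new searcher periodically,
     which is what makes recent docs visible to /select queries -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxTime>600000</maxTime>            <!-- hard commit every 10 min: flush to disk -->
    <openSearcher>false</openSearcher>   <!-- don't open a searcher on hard commit -->
  </autoCommit>
  <autoSoftCommit>
    <maxTime>5000</maxTime>              <!-- soft commit every 5 s: NRT visibility -->
  </autoSoftCommit>
</updateHandler>

<!-- real-time get: fetches the latest version of a doc by id
     from the update log, even before any commit has happened -->
<requestHandler name="/get" class="solr.RealTimeGetHandler"/>
```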

2. Also, on the number of shards - I created 36 (even with only 6 machines)
because I was hoping I'd get more hardware and could redistribute the
existing shards across the new boxes. That has not happened yet. But even
with the current deployment - fewer shards would mean more docs per shard;
would that slow down search queries?
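For what it's worth, my understanding is that if the extra hardware does arrive, shards can be moved by adding a replica on the new box and dropping the old one, rather than re-sharding - a rough sketch only (collection, host, and core names here are made up, and this obviously needs a live cluster):

```shell
# 1. add a replica of shard1 on the new box via the Collections API
curl 'http://newbox:8983/solr/admin/collections?action=ADDREPLICA&collection=mycoll&shard=shard1&node=newbox:8983_solr'

# 2. once it has caught up and is active, drop the replica on the old box
curl 'http://oldbox:8983/solr/admin/collections?action=DELETEREPLICA&collection=mycoll&shard=shard1&replica=core_node3'
```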

3. Increasing the commit interval would mean more RAM usage - could that
make the situation worse, given there is already less RAM in there than the
total doc size (with all fields stored)? [FYI - ramBufferSizeMB and
maxBufferedDocs are set to their defaults - 100MB and 1000 respectively]
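For reference, these are the settings I mean, in the form they take in solrconfig.xml - as I understand it, whichever limit is hit first triggers a flush of the in-memory buffer to a new segment:

```xml
<indexConfig>
  <!-- flush the in-memory index buffer once it reaches this size -->
  <ramBufferSizeMB>100</ramBufferSizeMB>
  <!-- ...or once this many docs are buffered, whichever comes first -->
  <maxBufferedDocs>1000</maxBufferedDocs>
</indexConfig>
```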

4. I read that DataStax Enterprise (DSE) could be an answer here. Is there an
easy way to migrate to DSE - one that would not require too many code
changes? (I had a discussion with the DSE folks a few weeks ago and they
mentioned migration from Solr to DSE would be a breeze, with no code changes
required on the ingestion and search side either. (Perhaps I was talking to
a sales guy?)) With DSE, the data would sit in Cassandra and search would
still be Solr plugged into DSE - but would that work with a 6-node cluster?
(Sorry if I'm deviating a bit here from the core problem I'm trying to fix -
but if DSE could work with minimal time and effort, I wouldn't mind trying
it out.)



