lucene-solr-user mailing list archives

From Shawn Heisey <>
Subject Re: adding documents to a secured solr server.
Date Thu, 02 Nov 2017 02:41:41 GMT
On 11/1/2017 7:59 PM, Phil Scadden wrote:
> After some digging, I tried this approach...
>     solr = new ConcurrentUpdateSolrClient.Builder(solrUrl)
>             .withQueueSize(20)
>             .build();
>     SolrInputDocument up = new SolrInputDocument();
>     up.addField("id", f.getCanonicalPath());
>     up.addField("title", title);
>     up.addField("author", author);
>     String content = textHandler.toString();
>     up.addField("_text_", content);
>     UpdateRequest req = new UpdateRequest();
>     req.setCommitWithin(1000);
>     req.add(up);
>     req.setBasicAuthCredentials("solrAdmin", password);
>     UpdateResponse ur = req.process(solr);

You need to create an UpdateRequest object and set the auth credentials 
on that object, rather than using the convenience methods on the client 
to add the documents directly.

See this:

An alternative approach is to create a custom HttpClient object (using 
its Builder methods) with the authentication credentials baked in, and 
build the Solr client with that object.  If you do that, you won't need 
to add authentication to each request object.
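A rough sketch of that idea, assuming SolrJ's HttpClientUtil helper (the
property names and the exact Builder methods may differ between SolrJ
versions, and solrUrl/password are placeholders):

```java
import org.apache.http.client.HttpClient;
import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrClient;
import org.apache.solr.client.solrj.impl.HttpClientUtil;
import org.apache.solr.common.params.ModifiableSolrParams;

public class AuthClientExample {
  public static ConcurrentUpdateSolrClient build(String solrUrl, String password) {
    // Bake basic-auth credentials into the HttpClient, so every request
    // sent through this client is authenticated automatically.
    ModifiableSolrParams params = new ModifiableSolrParams();
    params.set(HttpClientUtil.PROP_BASIC_AUTH_USER, "solrAdmin");
    params.set(HttpClientUtil.PROP_BASIC_AUTH_PASS, password);
    HttpClient httpClient = HttpClientUtil.createClient(params);

    // Hand the pre-authenticated HttpClient to the Solr client builder.
    return new ConcurrentUpdateSolrClient.Builder(solrUrl)
        .withHttpClient(httpClient)
        .withQueueSize(20)
        .build();
  }
}
```

With this in place, plain req.process(solr) calls need no per-request
credentials.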

Side note about custom HttpClient objects:  If you intend to use your 
Solr client object from multiple threads, you will need a custom 
HttpClient anyway.  The HttpClient that SolrJ creates in the background 
by default allows only two threads; that limit comes from HttpClient 
itself, not from Solr code.  To allow more, the HttpClient object must 
be custom-built.  I suspect you chose ConcurrentUpdateSolrClient for its 
automatic handling of several threads (you set the queue size to 20) ... 
but with a default HttpClient, that is not what you will actually get.  
I have filed the following issue to try to improve the default 
situation:
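To illustrate, a custom HttpClient can also raise the connection limits
so that multiple client threads actually get concurrency.  This is a
sketch only; the limit values here are illustrative, not recommendations,
and withThreadCount may not exist in every SolrJ version:

```java
import org.apache.http.client.HttpClient;
import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrClient;
import org.apache.solr.client.solrj.impl.HttpClientUtil;
import org.apache.solr.common.params.ModifiableSolrParams;

public class PooledClientExample {
  public static ConcurrentUpdateSolrClient build(String solrUrl) {
    // Raise the connection-pool limits that the default HttpClient
    // would otherwise cap at a very small number.
    ModifiableSolrParams params = new ModifiableSolrParams();
    params.set(HttpClientUtil.PROP_MAX_CONNECTIONS, 100);
    params.set(HttpClientUtil.PROP_MAX_CONNECTIONS_PER_HOST, 20);
    HttpClient httpClient = HttpClientUtil.createClient(params);

    return new ConcurrentUpdateSolrClient.Builder(solrUrl)
        .withHttpClient(httpClient)
        .withQueueSize(20)
        .withThreadCount(4)  // background threads draining the queue
        .build();
  }
}
```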

Something else to add as a strong caution:  ConcurrentUpdateSolrClient 
swallows all indexing errors.  If the Solr server were completely down, 
you would not see any exceptions on "add" calls even though the requests 
all fail; the program would only get an error on the "commit" call, and 
it is fairly common for developers to leave the commit out and let Solr 
handle all commits.  If you want your program to be aware of every 
indexing error, you will need to use HttpSolrClient or CloudSolrClient 
and handle multiple threads in your own code.
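A sketch of what that looks like with HttpSolrClient, where every failed
add surfaces as an exception at the call site (solrUrl, password, and
the document fields are placeholders from the earlier code):

```java
import java.io.IOException;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.UpdateRequest;
import org.apache.solr.common.SolrInputDocument;

public class ErrorAwareIndexing {
  public static void index(String solrUrl, String password, SolrInputDocument doc) {
    HttpSolrClient solr = new HttpSolrClient.Builder(solrUrl).build();
    try {
      UpdateRequest req = new UpdateRequest();
      req.setBasicAuthCredentials("solrAdmin", password);
      req.setCommitWithin(1000);
      req.add(doc);
      req.process(solr);  // throws immediately if the request fails
    } catch (SolrServerException | IOException e) {
      // Unlike ConcurrentUpdateSolrClient, the error is visible here,
      // so the program can retry, log, or abort per document.
      System.err.println("Indexing failed for " + doc.getFieldValue("id"));
      e.printStackTrace();
    }
  }
}
```

Throughput comparable to ConcurrentUpdateSolrClient is then a matter of
running several of these indexing calls on your own thread pool, backed
by a custom HttpClient with enough connections.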

