lucene-dev mailing list archives

From Policeman Jenkins Server <jenk...@thetaphi.de>
Subject [JENKINS] Lucene-Solr-master-Linux (32bit/jdk1.8.0_112) - Build # 18711 - Unstable!
Date Fri, 06 Jan 2017 12:42:08 GMT
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Linux/18711/
Java: 32bit/jdk1.8.0_112 -client -XX:+UseParallelGC

1 test failed.
FAILED:  org.apache.solr.cloud.PeerSyncReplicationTest.test

Error Message:
timeout waiting to see all nodes active

Stack Trace:
java.lang.AssertionError: timeout waiting to see all nodes active
	at __randomizedtesting.SeedInfo.seed([84F4EA3AA629126F:CA0D5E008D57F97]:0)
	at org.junit.Assert.fail(Assert.java:93)
	at org.apache.solr.cloud.PeerSyncReplicationTest.waitTillNodesActive(PeerSyncReplicationTest.java:311)
	at org.apache.solr.cloud.PeerSyncReplicationTest.bringUpDeadNodeAndEnsureNoReplication(PeerSyncReplicationTest.java:262)
	at org.apache.solr.cloud.PeerSyncReplicationTest.forceNodeFailureAndDoPeerSync(PeerSyncReplicationTest.java:244)
	at org.apache.solr.cloud.PeerSyncReplicationTest.test(PeerSyncReplicationTest.java:133)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1713)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:907)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:943)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:957)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:811)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:462)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:916)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:802)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:852)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
	at java.lang.Thread.run(Thread.java:745)
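The assertion above is thrown when a bounded polling loop gives up: `waitTillNodesActive` repeatedly checks replica state and calls `fail(...)` once its deadline passes. As a minimal sketch of that wait-until-active pattern (hypothetical names and signature, not Solr's actual implementation):

```java
import java.util.List;
import java.util.function.Supplier;

// Hypothetical sketch of a poll-until-active loop like the one behind
// "timeout waiting to see all nodes active". The class name, method name,
// and state strings are illustrative, not Solr's real API.
public class WaitForActive {
    /** Polls nodeStates until every node reports "active" or timeoutMs elapses. */
    public static boolean waitTillNodesActive(Supplier<List<String>> nodeStates,
                                              long timeoutMs, long pollMs)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            List<String> states = nodeStates.get();
            if (!states.isEmpty() && states.stream().allMatch("active"::equals)) {
                return true; // every node reported active before the deadline
            }
            Thread.sleep(pollMs); // back off briefly before re-checking
        }
        return false; // caller would turn this into fail("timeout waiting to see all nodes active")
    }

    public static void main(String[] args) throws InterruptedException {
        // All nodes active: succeeds on the first poll.
        System.out.println(waitTillNodesActive(
                () -> List.of("active", "active"), 1000, 10)); // true
        // One node stuck (e.g. still recovering): the loop times out.
        System.out.println(waitTillNodesActive(
                () -> List.of("active", "recovering"), 200, 10)); // false
    }
}
```

A failure like this one means the second case held for the full timeout: at least one replica never left its recovering/down state within the window the test allows.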

Build Log:
[...truncated 11219 lines...]
   [junit4] Suite: org.apache.solr.cloud.PeerSyncReplicationTest
   [junit4]   2> Creating dataDir: /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/init-core-data-001
   [junit4]   2> 281282 INFO  (SUITE-PeerSyncReplicationTest-seed#[84F4EA3AA629126F]-worker) [    ] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (true) via: @org.apache.solr.util.RandomizeSSL(reason=, ssl=NaN, value=NaN, clientAuth=NaN)
   [junit4]   2> 281283 INFO  (SUITE-PeerSyncReplicationTest-seed#[84F4EA3AA629126F]-worker) [    ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /my/k
   [junit4]   2> 281284 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 281285 INFO  (Thread-373) [    ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 281285 INFO  (Thread-373) [    ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 281385 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ZkTestServer start zk server on port:42987
   [junit4]   2> 281395 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 281397 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 281398 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 281400 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 281401 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 281402 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 281403 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 281405 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 281406 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 281407 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 281408 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 281492 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.SolrTestCaseJ4 Writing core.properties file to /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/control-001/cores/collection1
   [junit4]   2> 281494 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 281495 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1a013a{/my/k,null,AVAILABLE}
   [junit4]   2> 281497 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@1098e69{SSL,[ssl, http/1.1]}{127.0.0.1:41056}
   [junit4]   2> 281497 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.Server Started @282776ms
   [junit4]   2> 281497 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/tempDir-001/control/data, hostContext=/my/k, hostPort=41056, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/control-001/cores}
   [junit4]   2> 281498 ERROR (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 281498 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 7.0.0
   [junit4]   2> 281498 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 281498 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 281498 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-01-06T12:03:51.638Z
   [junit4]   2> 281501 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 281501 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/control-001/solr.xml
   [junit4]   2> 281509 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42987/solr
   [junit4]   2> 281529 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:41056_my%2Fk
   [junit4]   2> 281529 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.Overseer Overseer (id=97236040510668805-127.0.0.1:41056_my%2Fk-n_0000000000) starting
   [junit4]   2> 281534 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:41056_my%2Fk
   [junit4]   2> 281536 INFO  (zkCallback-233-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 281865 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/control-001/cores
   [junit4]   2> 281865 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.CorePropertiesLocator Cores are: [collection1]
   [junit4]   2> 281870 INFO  (OverseerStateUpdate-97236040510668805-127.0.0.1:41056_my%2Fk-n_0000000000) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.o.ReplicaMutator Assigning new node to shard shard=shard1
   [junit4]   2> 282887 WARN  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.c.Config Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> instead.
   [junit4]   2> 282888 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.0.0
   [junit4]   2> 282906 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=test
   [junit4]   2> 283020 WARN  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.s.IndexSchema [collection1] default search field in schema is text. WARNING: Deprecated, please use 'df' on request instead.
   [junit4]   2> 283023 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 283044 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from collection control_collection
   [junit4]   2> 283045 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/control-001/cores/collection1], dataDir=[/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/control-001/cores/collection1/data/]
   [junit4]   2> 283045 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@de3858
   [junit4]   2> 283047 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=16, maxMergeAtOnceExplicit=39, maxMergedSegmentMB=80.634765625, floorSegmentMB=1.9072265625, forceMergeDeletesPctAllowed=29.643981740979196, segmentsPerTier=12.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.43336023902830456
   [junit4]   2> 283122 WARN  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 283142 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 283142 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 283144 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 283144 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 283145 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=30, maxMergeAtOnceExplicit=47, maxMergedSegmentMB=18.94921875, floorSegmentMB=2.080078125, forceMergeDeletesPctAllowed=28.70662462224781, segmentsPerTier=40.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.5672675486129973
   [junit4]   2> 283146 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@7ebf94[collection1] main]
   [junit4]   2> 283148 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 283148 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 283148 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 283150 INFO  (searcherExecutor-623-thread-1-processing-n:127.0.0.1:41056_my%2Fk x:collection1 c:control_collection) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@7ebf94[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 283152 INFO  (coreLoadExecutor-622-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk c:control_collection   x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1555776650128392192
   [junit4]   2> 283165 INFO  (coreZkRegister-615-thread-1-processing-n:127.0.0.1:41056_my%2Fk x:collection1 c:control_collection) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 283165 INFO  (coreZkRegister-615-thread-1-processing-n:127.0.0.1:41056_my%2Fk x:collection1 c:control_collection) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 283165 INFO  (coreZkRegister-615-thread-1-processing-n:127.0.0.1:41056_my%2Fk x:collection1 c:control_collection) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:41056/my/k/collection1/
   [junit4]   2> 283166 INFO  (coreZkRegister-615-thread-1-processing-n:127.0.0.1:41056_my%2Fk x:collection1 c:control_collection) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 283166 INFO  (coreZkRegister-615-thread-1-processing-n:127.0.0.1:41056_my%2Fk x:collection1 c:control_collection) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy https://127.0.0.1:41056/my/k/collection1/ has no replicas
   [junit4]   2> 283172 INFO  (coreZkRegister-615-thread-1-processing-n:127.0.0.1:41056_my%2Fk x:collection1 c:control_collection) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:41056/my/k/collection1/ shard1
   [junit4]   2> 283324 INFO  (coreZkRegister-615-thread-1-processing-n:127.0.0.1:41056_my%2Fk x:collection1 c:control_collection) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 283453 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 283454 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42987/solr ready
   [junit4]   2> 283455 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 283561 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.SolrTestCaseJ4 Writing core.properties file to /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-1-001/cores/collection1
   [junit4]   2> 283562 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-1-001
   [junit4]   2> 283565 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 283566 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@16d7b88{/my/k,null,AVAILABLE}
   [junit4]   2> 283566 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@1e1fe6c{SSL,[ssl, http/1.1]}{127.0.0.1:40787}
   [junit4]   2> 283567 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.Server Started @284846ms
   [junit4]   2> 283567 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/tempDir-001/jetty1, solrconfig=solrconfig.xml, hostContext=/my/k, hostPort=40787, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-1-001/cores}
   [junit4]   2> 283567 ERROR (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 283569 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 7.0.0
   [junit4]   2> 283569 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 283569 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 283569 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-01-06T12:03:53.709Z
   [junit4]   2> 283573 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 283573 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-1-001/solr.xml
   [junit4]   2> 283584 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42987/solr
   [junit4]   2> 283598 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 283607 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:40787_my%2Fk
   [junit4]   2> 283609 INFO  (zkCallback-237-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 283610 INFO  (zkCallback-233-thread-1-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 283610 INFO  (zkCallback-242-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 283816 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-1-001/cores
   [junit4]   2> 283816 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.c.CorePropertiesLocator Cores are: [collection1]
   [junit4]   2> 283819 INFO  (OverseerStateUpdate-97236040510668805-127.0.0.1:41056_my%2Fk-n_0000000000) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.o.ReplicaMutator Assigning new node to shard shard=shard1
   [junit4]   2> 284837 WARN  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.c.Config Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> instead.
   [junit4]   2> 284838 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.0.0
   [junit4]   2> 284868 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=test
   [junit4]   2> 284977 WARN  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.s.IndexSchema [collection1] default search field in schema is text. WARNING: Deprecated, please use 'df' on request instead.
   [junit4]   2> 284981 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 285001 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from collection collection1
   [junit4]   2> 285001 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-1-001/cores/collection1], dataDir=[/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-1-001/cores/collection1/data/]
   [junit4]   2> 285002 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@de3858
   [junit4]   2> 285006 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=16, maxMergeAtOnceExplicit=39, maxMergedSegmentMB=80.634765625, floorSegmentMB=1.9072265625, forceMergeDeletesPctAllowed=29.643981740979196, segmentsPerTier=12.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.43336023902830456
   [junit4]   2> 285060 WARN  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 285089 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 285089 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 285090 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 285090 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 285091 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=30, maxMergeAtOnceExplicit=47, maxMergedSegmentMB=18.94921875, floorSegmentMB=2.080078125, forceMergeDeletesPctAllowed=28.70662462224781, segmentsPerTier=40.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.5672675486129973
   [junit4]   2> 285092 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@caec[collection1] main]
   [junit4]   2> 285093 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 285094 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 285094 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 285095 INFO  (coreLoadExecutor-633-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1555776652165775360
   [junit4]   2> 285099 INFO  (searcherExecutor-634-thread-1-processing-n:127.0.0.1:40787_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:40787_my%2Fk c:collection1   x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@caec[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 285104 INFO  (coreZkRegister-628-thread-1-processing-n:127.0.0.1:40787_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 285104 INFO  (coreZkRegister-628-thread-1-processing-n:127.0.0.1:40787_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 285104 INFO  (coreZkRegister-628-thread-1-processing-n:127.0.0.1:40787_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:40787/my/k/collection1/
   [junit4]   2> 285104 INFO  (coreZkRegister-628-thread-1-processing-n:127.0.0.1:40787_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 285104 INFO  (coreZkRegister-628-thread-1-processing-n:127.0.0.1:40787_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy https://127.0.0.1:40787/my/k/collection1/ has no replicas
   [junit4]   2> 285111 INFO  (coreZkRegister-628-thread-1-processing-n:127.0.0.1:40787_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:40787/my/k/collection1/ shard1
   [junit4]   2> 285264 INFO  (coreZkRegister-628-thread-1-processing-n:127.0.0.1:40787_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 285509 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.SolrTestCaseJ4 Writing core.properties file to /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-2-001/cores/collection1
   [junit4]   2> 285510 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-2-001
   [junit4]   2> 285512 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 285513 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@77da1b{/my/k,null,AVAILABLE}
   [junit4]   2> 285514 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@163639{SSL,[ssl, http/1.1]}{127.0.0.1:34596}
   [junit4]   2> 285514 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.Server Started @286793ms
   [junit4]   2> 285514 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/tempDir-001/jetty2, solrconfig=solrconfig.xml, hostContext=/my/k, hostPort=34596, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-2-001/cores}
   [junit4]   2> 285515 ERROR (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 285515 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 7.0.0
   [junit4]   2> 285515 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 285515 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 285515 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-01-06T12:03:55.655Z
   [junit4]   2> 285520 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn caught end of stream exception
   [junit4]   2> EndOfStreamException: Unable to read additional data from client sessionid 0x15973a9e601000a, likely client has closed socket
   [junit4]   2> 	at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:228)
   [junit4]   2> 	at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:208)
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
   [junit4]   2> 285521 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 285521 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-2-001/solr.xml
   [junit4]   2> 285530 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42987/solr
   [junit4]   2> 285561 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:34596_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 285569 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:34596_my%2Fk    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:34596_my%2Fk
   [junit4]   2> 285572 INFO  (zkCallback-233-thread-2-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 285572 INFO  (zkCallback-237-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 285572 INFO  (zkCallback-242-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 285573 INFO  (zkCallback-248-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 285974 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:34596_my%2Fk    ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-2-001/cores
   [junit4]   2> 285974 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:34596_my%2Fk    ] o.a.s.c.CorePropertiesLocator Cores are: [collection1]
   [junit4]   2> 285978 INFO  (OverseerStateUpdate-97236040510668805-127.0.0.1:41056_my%2Fk-n_0000000000) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.o.ReplicaMutator Assigning new node to shard shard=shard1
   [junit4]   2> 286990 WARN  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.c.Config Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> instead.
   [junit4]   2> 286991 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.0.0
   [junit4]   2> 287013 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=test
   [junit4]   2> 287121 WARN  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.s.IndexSchema [collection1] default search field in schema is text. WARNING: Deprecated, please use 'df' on request instead.
   [junit4]   2> 287123 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 287136 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from collection collection1
   [junit4]   2> 287136 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-2-001/cores/collection1], dataDir=[/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-2-001/cores/collection1/data/]
   [junit4]   2> 287136 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@de3858
   [junit4]   2> 287138 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=16, maxMergeAtOnceExplicit=39, maxMergedSegmentMB=80.634765625, floorSegmentMB=1.9072265625, forceMergeDeletesPctAllowed=29.643981740979196, segmentsPerTier=12.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.43336023902830456
   [junit4]   2> 287170 WARN  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 287196 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 287196 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 287196 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 287196 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 287198 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=30, maxMergeAtOnceExplicit=47, maxMergedSegmentMB=18.94921875, floorSegmentMB=2.080078125, forceMergeDeletesPctAllowed=28.70662462224781, segmentsPerTier=40.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.5672675486129973
   [junit4]   2> 287198 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@aa8cab[collection1] main]
   [junit4]   2> 287200 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 287200 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 287200 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 287202 INFO  (searcherExecutor-645-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@aa8cab[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 287203 INFO  (coreLoadExecutor-644-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1   x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1555776654376173568
   [junit4]   2> 287206 INFO  (coreZkRegister-639-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.ZkController Core needs to recover:collection1
   [junit4]   2> 287206 INFO  (updateExecutor-245-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DefaultSolrCoreState Running recovery
   [junit4]   2> 287206 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true
   [junit4]   2> 287207 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy ###### startupVersions=[[]]
   [junit4]   2> 287207 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1]
   [junit4]   2> 287207 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null}
   [junit4]   2> 287207 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Publishing state of core [collection1] as recovering, leader is [https://127.0.0.1:40787/my/k/collection1/] and I am [https://127.0.0.1:34596/my/k/collection1/]
   [junit4]   2> 287209 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Sending prep recovery command to [https://127.0.0.1:40787/my/k]; [WaitForState: action=PREPRECOVERY&core=collection1&nodeName=127.0.0.1:34596_my%252Fk&coreNodeName=core_node2&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
   [junit4]   2> 287314 INFO  (qtp24575723-1751) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node2, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true
   [junit4]   2> 287314 INFO  (qtp24575723-1751) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.h.a.PrepRecoveryOp Will wait a max of 183 seconds to see collection1 (shard1 of collection1) have state: recovering
   [junit4]   2> 287314 INFO  (qtp24575723-1751) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:34596_my%2Fk, coreNodeName=core_node2, onlyIfActiveCheckResult=false, nodeProps: core_node2:{"core":"collection1","base_url":"https://127.0.0.1:34596/my/k","node_name":"127.0.0.1:34596_my%2Fk","state":"recovering"}
   [junit4]   2> 287314 INFO  (qtp24575723-1751) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.h.a.PrepRecoveryOp Waited coreNodeName: core_node2, state: recovering, checkLive: true, onlyIfLeader: true for: 0 seconds.
   [junit4]   2> 287315 INFO  (qtp24575723-1751) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:34596_my%252Fk&onlyIfLeaderActive=true&core=collection1&coreNodeName=core_node2&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=0
   [junit4]   2> 287642 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.SolrTestCaseJ4 Writing core.properties file to /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-3-001/cores/collection1
   [junit4]   2> 287643 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-3-001
   [junit4]   2> 287645 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.Server jetty-9.3.14.v20161028
   [junit4]   2> 287646 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@978934{/my/k,null,AVAILABLE}
   [junit4]   2> 287648 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@28f74{SSL,[ssl, http/1.1]}{127.0.0.1:45541}
   [junit4]   2> 287648 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.Server Started @288927ms
   [junit4]   2> 287648 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/tempDir-001/jetty3, solrconfig=solrconfig.xml, hostContext=/my/k, hostPort=45541, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-3-001/cores}
   [junit4]   2> 287648 ERROR (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 287648 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 7.0.0
   [junit4]   2> 287648 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 287648 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 287648 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2017-01-06T12:03:57.788Z
   [junit4]   2> 287652 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 287652 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-3-001/solr.xml
   [junit4]   2> 287660 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42987/solr
   [junit4]   2> 287667 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:45541_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
   [junit4]   2> 287671 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:45541_my%2Fk    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:45541_my%2Fk
   [junit4]   2> 287672 INFO  (zkCallback-233-thread-2-processing-n:127.0.0.1:41056_my%2Fk) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 287672 INFO  (zkCallback-242-thread-1-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 287672 INFO  (zkCallback-248-thread-1-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 287673 INFO  (zkCallback-255-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 287672 INFO  (zkCallback-237-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 287923 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:45541_my%2Fk    ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-3-001/cores
   [junit4]   2> 287923 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [n:127.0.0.1:45541_my%2Fk    ] o.a.s.c.CorePropertiesLocator Cores are: [collection1]
   [junit4]   2> 287925 INFO  (OverseerStateUpdate-97236040510668805-127.0.0.1:41056_my%2Fk-n_0000000000) [n:127.0.0.1:41056_my%2Fk    ] o.a.s.c.o.ReplicaMutator Assigning new node to shard shard=shard1
   [junit4]   2> 288936 WARN  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.c.Config Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> instead.
   [junit4]   2> 288937 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.0.0
   [junit4]   2> 288954 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=test
   [junit4]   2> 289043 WARN  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.s.IndexSchema [collection1] default search field in schema is text. WARNING: Deprecated, please use 'df' on request instead.
   [junit4]   2> 289046 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 289057 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from collection collection1
   [junit4]   2> 289058 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-3-001/cores/collection1], dataDir=[/home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001/shard-3-001/cores/collection1/data/]
   [junit4]   2> 289058 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@de3858
   [junit4]   2> 289061 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=16, maxMergeAtOnceExplicit=39, maxMergedSegmentMB=80.634765625, floorSegmentMB=1.9072265625, forceMergeDeletesPctAllowed=29.643981740979196, segmentsPerTier=12.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.43336023902830456
   [junit4]   2> 289093 WARN  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 289112 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 289112 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 289113 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 289113 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 289114 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=30, maxMergeAtOnceExplicit=47, maxMergedSegmentMB=18.94921875, floorSegmentMB=2.080078125, forceMergeDeletesPctAllowed=28.70662462224781, segmentsPerTier=40.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.5672675486129973
   [junit4]   2> 289115 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@11160a6[collection1] main]
   [junit4]   2> 289116 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 289116 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 289116 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for  10000
   [junit4]   2> 289118 INFO  (searcherExecutor-656-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@11160a6[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 289119 INFO  (coreLoadExecutor-655-thread-1-processing-n:127.0.0.1:45541_my%2Fk) [n:127.0.0.1:45541_my%2Fk c:collection1   x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1555776656385245184
   [junit4]   2> 289122 INFO  (coreZkRegister-650-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 c:collection1) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.ZkController Core needs to recover:collection1
   [junit4]   2> 289122 INFO  (updateExecutor-252-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DefaultSolrCoreState Running recovery
   [junit4]   2> 289122 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true
   [junit4]   2> 289123 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy ###### startupVersions=[[]]
   [junit4]   2> 289123 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1]
   [junit4]   2> 289123 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null}
   [junit4]   2> 289123 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Publishing state of core [collection1] as recovering, leader is [https://127.0.0.1:40787/my/k/collection1/] and I am [https://127.0.0.1:45541/my/k/collection1/]
   [junit4]   2> 289125 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Sending prep recovery command to [https://127.0.0.1:40787/my/k]; [WaitForState: action=PREPRECOVERY&core=collection1&nodeName=127.0.0.1:45541_my%252Fk&coreNodeName=core_node3&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
   [junit4]   2> 289133 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node3, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true
   [junit4]   2> 289133 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.h.a.PrepRecoveryOp Will wait a max of 183 seconds to see collection1 (shard1 of collection1) have state: recovering
   [junit4]   2> 289133 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:45541_my%2Fk, coreNodeName=core_node3, onlyIfActiveCheckResult=false, nodeProps: core_node3:{"core":"collection1","base_url":"https://127.0.0.1:45541/my/k","node_name":"127.0.0.1:45541_my%2Fk","state":"down"}
   [junit4]   2> 289478 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.SolrTestCaseJ4 ###Starting test
   [junit4]   2> 289478 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractFullDistribZkTestBase Wait for recoveries to finish - wait 30 for each attempt
   [junit4]   2> 289478 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractDistribZkTestBase Wait for recoveries to finish - collection: collection1 failOnTimeout:true timeout (sec):30
   [junit4]   2> 290134 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:45541_my%2Fk, coreNodeName=core_node3, onlyIfActiveCheckResult=false, nodeProps: core_node3:{"core":"collection1","base_url":"https://127.0.0.1:45541/my/k","node_name":"127.0.0.1:45541_my%2Fk","state":"recovering"}
   [junit4]   2> 290134 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.h.a.PrepRecoveryOp Waited coreNodeName: core_node3, state: recovering, checkLive: true, onlyIfLeader: true for: 1 seconds.
   [junit4]   2> 290134 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:45541_my%252Fk&onlyIfLeaderActive=true&core=collection1&coreNodeName=core_node3&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=1001
   [junit4]   2> 294316 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [https://127.0.0.1:40787/my/k/collection1/] - recoveringAfterStartup=[true]
   [junit4]   2> 294317 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.PeerSync PeerSync: core=collection1 url=https://127.0.0.1:34596/my/k START replicas=[https://127.0.0.1:40787/my/k/collection1/] nUpdates=1000
   [junit4]   2> 294325 INFO  (qtp24575723-1752) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, maxDoc=0}
   [junit4]   2> 294325 INFO  (qtp24575723-1752) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request [collection1]  webapp=/my/k path=/get params={distrib=false&qt=/get&getFingerprint=9223372036854775807&wt=javabin&version=2} status=0 QTime=1
   [junit4]   2> 294326 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, maxDoc=0}
   [junit4]   2> 294326 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.PeerSync We are already in sync. No need to do a PeerSync 
   [junit4]   2> 294327 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 294327 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 294328 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
   [junit4]   2> 294328 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy PeerSync stage of recovery was successful.
   [junit4]   2> 294328 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Replaying updates buffered during PeerSync.
   [junit4]   2> 294328 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy No replay needed.
   [junit4]   2> 294328 INFO  (recoveryExecutor-246-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Registering as Active after recovery.
   [junit4]   2> 297136 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [https://127.0.0.1:40787/my/k/collection1/] - recoveringAfterStartup=[true]
   [junit4]   2> 297136 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.PeerSync PeerSync: core=collection1 url=https://127.0.0.1:45541/my/k START replicas=[https://127.0.0.1:40787/my/k/collection1/] nUpdates=1000
   [junit4]   2> 297149 INFO  (qtp24575723-1758) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, maxDoc=0}
   [junit4]   2> 297149 INFO  (qtp24575723-1758) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request [collection1]  webapp=/my/k path=/get params={distrib=false&qt=/get&getFingerprint=9223372036854775807&wt=javabin&version=2} status=0 QTime=2
   [junit4]   2> 297159 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint millis:1.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, maxDoc=0}
   [junit4]   2> 297159 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.PeerSync We are already in sync. No need to do a PeerSync 
   [junit4]   2> 297159 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 297159 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 297161 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
   [junit4]   2> 297161 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy PeerSync stage of recovery was successful.
   [junit4]   2> 297161 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Replaying updates buffered during PeerSync.
   [junit4]   2> 297161 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy No replay needed.
   [junit4]   2> 297161 INFO  (recoveryExecutor-253-thread-1-processing-n:127.0.0.1:45541_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Registering as Active after recovery.
   [junit4]   2> 297479 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.AbstractDistribZkTestBase Recoveries finished - collection: collection1
   [junit4]   2> 297656 INFO  (qtp15855452-1714) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 297656 INFO  (qtp15855452-1714) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 297657 INFO  (qtp15855452-1714) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
   [junit4]   2> 297657 INFO  (qtp15855452-1714) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 1
   [junit4]   2> 297773 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 297773 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 297774 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
   [junit4]   2> 297774 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=https://127.0.0.1:40787/my/k/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 1
   [junit4]   2> 297958 INFO  (qtp32011800-1785) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 297961 INFO  (qtp32011800-1785) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 297962 INFO  (qtp32011800-1785) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
   [junit4]   2> 297962 INFO  (qtp32011800-1785) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=https://127.0.0.1:40787/my/k/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 4
   [junit4]   2> 297990 INFO  (qtp32438619-1821) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 297991 INFO  (qtp32438619-1821) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
   [junit4]   2> 297992 INFO  (qtp32438619-1821) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
   [junit4]   2> 297992 INFO  (qtp32438619-1821) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=https://127.0.0.1:40787/my/k/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 2
   [junit4]   2> 297997 INFO  (qtp24575723-1758) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 329
   [junit4]   2> 298006 INFO  (qtp24575723-1757) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request [collection1]  webapp=/my/k path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=0
   [junit4]   2> 298027 INFO  (qtp32011800-1785) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.S.Request [collection1]  webapp=/my/k path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=0
   [junit4]   2> 298039 INFO  (qtp32438619-1816) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.S.Request [collection1]  webapp=/my/k path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=0
   [junit4]   2> 300046 INFO  (qtp15855452-1719) [n:127.0.0.1:41056_my%2Fk c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={wt=javabin&version=2}{deleteByQuery=*:* (-1555776667839889408)} 0 3
   [junit4]   2> 300059 INFO  (qtp32011800-1779) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&_version_=-1555776667845132288&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{deleteByQuery=*:* (-1555776667845132288)} 0 4
   [junit4]   2> 300059 INFO  (qtp32438619-1822) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&_version_=-1555776667845132288&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{deleteByQuery=*:* (-1555776667845132288)} 0 4
   [junit4]   2> 300060 INFO  (qtp24575723-1754) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={wt=javabin&version=2}{deleteByQuery=*:* (-1555776667845132288)} 0 12
   [junit4]   2> 300068 INFO  (qtp32438619-1820) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[0 (1555776667858763776)]} 0 2
   [junit4]   2> 300068 INFO  (qtp32011800-1784) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[0 (1555776667858763776)]} 0 2
   [junit4]   2> 300069 INFO  (qtp24575723-1751) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={wt=javabin&version=2}{add=[0 (1555776667858763776)]} 0 7
   [junit4]   2> 300073 INFO  (qtp32011800-1786) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[1 (1555776667868200960)]} 0 1
   [junit4]   2> 300074 INFO  (qtp32438619-1821) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[1 (1555776667868200960)]} 0 1
   [junit4]   2> 300075 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={wt=javabin&version=2}{add=[1 (1555776667868200960)]} 0 4
   [junit4]   2> 300080 INFO  (qtp32438619-1816) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[2 (1555776667876589568)]} 0 1
   [junit4]   2> 300080 INFO  (qtp32011800-1785) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[2 (1555776667876589568)]} 0 1
   [junit4]   2> 300081 INFO  (qtp24575723-1758) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={wt=javabin&version=2}{add=[2 (1555776667876589568)]} 0 3
   [junit4]   2> 300085 INFO  (qtp32011800-1779) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[3 (1555776667880783872)]} 0 1
   [junit4]   2> 300085 INFO  (qtp32438619-1822) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[3 (1555776667880783872)]} 0 1
   [junit4]   2> 300086 INFO  (qtp24575723-1754) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={wt=javabin&version=2}{add=[3 (1555776667880783872)]} 0 3
   [junit4]   2> 300091 INFO  (qtp32011800-1784) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[4 (1555776667886026752)]} 0 0
   [junit4]   2> 300091 INFO  (qtp32438619-1820) [n:127.0.0.1:45541_my%2Fk c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:40787/my/k/collection1/&wt=javabin&version=2}{add=[4 (1555776667886026752)]} 0 1
   [junit4]   2> 300092 INFO  (qtp24575723-1751) [n:127.0.0.1:40787_my%2Fk c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1]  webapp=/my/k path=/update params={wt=javabin&version=2}{add=[4 (1555776667886026752)]} 0 4
   [junit4]   2> 300096 INFO  (qtp32011800-1786) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r

[...truncated too long message...]

read.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
   [junit4]   2> 	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
   [junit4]   2> 	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
   [junit4]   2> 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
   [junit4]   2> 	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:745)
   [junit4]   2> 
   [junit4]   2> 486038 INFO  (qtp24575723-1906) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:34596_my%252Fk&onlyIfLeaderActive=true&core=collection1&coreNodeName=core_node2&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=400 QTime=162480
   [junit4]   2> 486040 INFO  (qtp24575723-1756) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:34596_my%252Fk&onlyIfLeaderActive=true&core=collection1&coreNodeName=core_node2&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=400 QTime=172501
   [junit4]   2> 486933 WARN  (zkCallback-242-thread-2-processing-n:127.0.0.1:40787_my%2Fk) [n:127.0.0.1:40787_my%2Fk    ] o.a.s.c.c.ZkStateReader ZooKeeper watch triggered, but Solr cannot talk to ZK: [KeeperErrorCode = Session expired for /live_nodes]
   [junit4]   2> 486933 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for: solr.node
   [junit4]   2> 486934 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.AbstractConnector Stopped ServerConnector@1e1fe6c{SSL,[ssl, http/1.1]}{127.0.0.1:0}
   [junit4]   2> 486935 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@16d7b88{/my/k,null,UNAVAILABLE}
   [junit4]   2> 486935 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ChaosMonkey monkey: stop shard! 34596
   [junit4]   2> 486936 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=22118352
   [junit4]   2> 486936 WARN  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.RecoveryStrategy Stopping recovery for core=[collection1] coreNodeName=[core_node2]
   [junit4]   2> 486938 WARN  (updateExecutor-259-thread-2-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DefaultSolrCoreState Skipping recovery because Solr is shutdown
   [junit4]   2> 490791 INFO  (recoveryExecutor-260-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy RecoveryStrategy has been closed
   [junit4]   2> 490792 INFO  (recoveryExecutor-260-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[false]
   [junit4]   2> 490792 INFO  (recoveryExecutor-260-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.SolrCore [collection1]  CLOSING SolrCore org.apache.solr.core.SolrCore@d8ae6a
   [junit4]   2> 490793 WARN  (recoveryExecutor-260-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Stopping recovery for core=[collection1] coreNodeName=[core_node2]
   [junit4]   2> 490816 INFO  (recoveryExecutor-260-thread-1-processing-n:127.0.0.1:34596_my%2Fk x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:34596_my%2Fk c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.m.SolrMetricManager Closing metric reporters for: solr.core.collection1
   [junit4]   2> 490816 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.Overseer Overseer (id=97236040510668818-127.0.0.1:34596_my%2Fk-n_0000000004) closing
   [junit4]   2> 490816 INFO  (OverseerStateUpdate-97236040510668818-127.0.0.1:34596_my%2Fk-n_0000000004) [n:127.0.0.1:34596_my%2Fk    ] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:34596_my%2Fk
   [junit4]   2> 490818 WARN  (zkCallback-262-thread-3-processing-n:127.0.0.1:34596_my%2Fk) [n:127.0.0.1:34596_my%2Fk    ] o.a.s.c.c.ZkStateReader ZooKeeper watch triggered, but Solr cannot talk to ZK: [KeeperErrorCode = Session expired for /live_nodes]
   [junit4]   2> 490818 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.m.SolrMetricManager Closing metric reporters for: solr.node
   [junit4]   2> 490819 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.AbstractConnector Stopped ServerConnector@a652dd{SSL,[ssl, http/1.1]}{127.0.0.1:34596}
   [junit4]   2> 490819 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@1e70974{/my/k,null,UNAVAILABLE}
   [junit4]   2> 490819 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ChaosMonkey monkey: stop shard! 45541
   [junit4]   2> 490820 INFO  (TEST-PeerSyncReplicationTest.test-seed#[84F4EA3AA629126F]) [    ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:42987 42987
   [junit4]   2> 490856 INFO  (Thread-373) [    ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:42987 42987
   [junit4]   2> 490856 WARN  (Thread-373) [    ] o.a.s.c.ZkTestServer Watch limit violations: 
   [junit4]   2> Maximum concurrent create/delete watches above limit:
   [junit4]   2> 
   [junit4]   2> 	6	/solr/aliases.json
   [junit4]   2> 	5	/solr/security.json
   [junit4]   2> 	5	/solr/configs/conf1
   [junit4]   2> 	4	/solr/collections/collection1/state.json
   [junit4]   2> 
   [junit4]   2> Maximum concurrent data watches above limit:
   [junit4]   2> 
   [junit4]   2> 	6	/solr/clusterstate.json
   [junit4]   2> 	6	/solr/clusterprops.json
   [junit4]   2> 	2	/solr/collections/collection1/leader_elect/shard1/election/97236040510668809-core_node1-n_0000000000
   [junit4]   2> 	2	/solr/overseer_elect/election/97236040510668809-127.0.0.1:40787_my%2Fk-n_0000000001
   [junit4]   2> 
   [junit4]   2> Maximum concurrent children watches above limit:
   [junit4]   2> 
   [junit4]   2> 	209	/solr/overseer/collection-queue-work
   [junit4]   2> 	29	/solr/overseer/queue
   [junit4]   2> 	6	/solr/collections
   [junit4]   2> 	5	/solr/live_nodes
   [junit4]   2> 	5	/solr/overseer/queue-work
   [junit4]   2> 
   [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=PeerSyncReplicationTest -Dtests.method=test -Dtests.seed=84F4EA3AA629126F -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=es-MX -Dtests.timezone=Africa/Monrovia -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
   [junit4] FAILURE  210s J0 | PeerSyncReplicationTest.test <<<
   [junit4]    > Throwable #1: java.lang.AssertionError: timeout waiting to see all nodes active
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([84F4EA3AA629126F:CA0D5E008D57F97]:0)
   [junit4]    > 	at org.apache.solr.cloud.PeerSyncReplicationTest.waitTillNodesActive(PeerSyncReplicationTest.java:311)
   [junit4]    > 	at org.apache.solr.cloud.PeerSyncReplicationTest.bringUpDeadNodeAndEnsureNoReplication(PeerSyncReplicationTest.java:262)
   [junit4]    > 	at org.apache.solr.cloud.PeerSyncReplicationTest.forceNodeFailureAndDoPeerSync(PeerSyncReplicationTest.java:244)
   [junit4]    > 	at org.apache.solr.cloud.PeerSyncReplicationTest.test(PeerSyncReplicationTest.java:133)
   [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:985)
   [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:960)
   [junit4]    > 	at java.lang.Thread.run(Thread.java:745)
   [junit4]   2> 490860 INFO  (SUITE-PeerSyncReplicationTest-seed#[84F4EA3AA629126F]-worker) [    ] o.a.s.SolrTestCaseJ4 ###deleteCore
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-master-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.PeerSyncReplicationTest_84F4EA3AA629126F-001
   [junit4]   2> Jan 06, 2017 12:07:21 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Lucene70, sim=RandomSimilarity(queryNorm=true): {}, locale=es-MX, timezone=Africa/Monrovia
   [junit4]   2> NOTE: Linux 4.4.0-53-generic i386/Oracle Corporation 1.8.0_112 (32-bit)/cpus=12,threads=1,free=49403416,total=251133952
   [junit4]   2> NOTE: All tests run in this JVM: [SolrRequestParserTest, TestAtomicUpdateErrorCases, LeaderFailureAfterFreshStartTest, TestDistribIDF, HttpPartitionTest, BufferStoreTest, CopyFieldTest, HdfsWriteToMultipleCollectionsTest, TestNumericTerms32, TestRecoveryHdfs, CursorMarkTest, TestJsonFacetRefinement, TestDistributedSearch, TestDownShardTolerantSearch, TestFieldCollectionResource, JavabinLoaderTest, TestSchemaSimilarityResource, TestLegacyNumericRangeQueryBuilder, ZkStateReaderTest, NumericFieldsTest, TestSlowCompositeReaderWrapper, PeerSyncReplicationTest]
   [junit4] Completed [162/675 (1!)] on J0 in 209.86s, 1 test, 1 failure <<< FAILURES!

[...truncated 56258 lines...]