lucene-dev mailing list archives

From Policeman Jenkins Server <jenk...@thetaphi.de>
Subject [JENKINS] Lucene-Solr-master-Windows (64bit/jdk-10.0.1) - Build # 7499 - Failure!
Date Sat, 01 Sep 2018 22:42:02 GMT
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-Windows/7499/
Java: 64bit/jdk-10.0.1 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC
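
A note on the JVM flags: -XX:+UseConcMarkSweepGC is the reason every forked test JVM below prints the same "Option UseConcMarkSweepGC was deprecated" warning; CMS was deprecated in JDK 9 and this build runs on JDK 10. As a small standalone illustration (plain JDK management API, not part of this build), the following sketch prints which collectors a running JVM actually selected:

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;

    public class PrintActiveCollectors {
      public static void main(String[] args) {
        // With -XX:+UseConcMarkSweepGC this typically lists "ParNew" and
        // "ConcurrentMarkSweep"; with the JDK 10 default (G1) it lists the G1 collectors.
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
          System.out.println(gc.getName());
        }
      }
    }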

1 test failed.
FAILED:  org.apache.solr.cloud.cdcr.CdcrBidirectionalTest.testBiDir

Error Message:
Captured an uncaught exception in thread: Thread[id=11405, name=cdcr-replicator-4181-thread-1, state=RUNNABLE, group=TGRP-CdcrBidirectionalTest]

Stack Trace:
com.carrotsearch.randomizedtesting.UncaughtExceptionError: Captured an uncaught exception in thread: Thread[id=11405, name=cdcr-replicator-4181-thread-1, state=RUNNABLE, group=TGRP-CdcrBidirectionalTest]
Caused by: java.lang.AssertionError: 1610443849825517568 != 1610443849824468992
	at __randomizedtesting.SeedInfo.seed([F99E706FED9975AE]:0)
	at org.apache.solr.update.CdcrUpdateLog$CdcrLogReader.forwardSeek(CdcrUpdateLog.java:611)
	at org.apache.solr.handler.CdcrReplicator.run(CdcrReplicator.java:125)
	at org.apache.solr.handler.CdcrReplicatorScheduler.lambda$start$0(CdcrReplicatorScheduler.java:81)
	at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:209)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:844)
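
Note that the assertion does not fail inside the test method itself: the version-equality assert trips on a background cdcr-replicator executor thread, and the randomized-testing runner surfaces anything that escapes such threads as the UncaughtExceptionError shown above. A minimal, self-contained sketch of that general mechanism using plain JDK APIs (not Solr or randomizedtesting code; run with -ea so the assert is enabled, and the long values are simply copied from the failure):

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class UncaughtOnWorkerThread {
      public static void main(String[] args) throws InterruptedException {
        // Capture whatever escapes a worker thread, the way a test runner would.
        Thread.setDefaultUncaughtExceptionHandler((thread, error) ->
            System.err.println("Captured an uncaught exception in thread: " + thread + " -> " + error));

        ExecutorService pool = Executors.newSingleThreadExecutor();
        pool.execute(() -> {
          long expected = 1610443849825517568L;   // values copied from the failure above
          long actual   = 1610443849824468992L;
          // With -ea this throws AssertionError on the pool thread, not in main():
          assert expected == actual : expected + " != " + actual;
        });

        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
      }
    }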




Build Log:
[...truncated 2004 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\test\temp\junit4-J1-20180901_211050_45515210943047583230613.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\core\test\temp\junit4-J0-20180901_211050_4558589081336946145269.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 316 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\test\temp\junit4-J1-20180901_211700_6579075508798453849934.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\test-framework\test\temp\junit4-J0-20180901_211700_65710870789107684935757.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 1081 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\common\test\temp\junit4-J0-20180901_211805_1213948917319459829115.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\common\test\temp\junit4-J1-20180901_211805_12118161937204984273841.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 255 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\icu\test\temp\junit4-J0-20180901_211947_4851278446077628787780.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 5 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\icu\test\temp\junit4-J1-20180901_211947_48517823013621627465341.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 254 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\kuromoji\test\temp\junit4-J0-20180901_211959_55610256127213619681548.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\kuromoji\test\temp\junit4-J1-20180901_211959_5561214725277875054977.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 163 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\morfologik\test\temp\junit4-J1-20180901_212027_4676251310113527403973.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\morfologik\test\temp\junit4-J0-20180901_212027_46712377200360921926336.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 211 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\nori\test\temp\junit4-J0-20180901_212032_1596179746348096847406.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\nori\test\temp\junit4-J1-20180901_212032_15914576378959027081491.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 172 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\opennlp\test\temp\junit4-J0-20180901_212040_4111830162211448326532.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\opennlp\test\temp\junit4-J1-20180901_212040_4116934588920096432803.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 180 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\phonetic\test\temp\junit4-J0-20180901_212044_1678260689252107705898.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\phonetic\test\temp\junit4-J1-20180901_212044_16811444583608130668557.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 160 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\smartcn\test\temp\junit4-J0-20180901_212053_71813600283070663576895.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\smartcn\test\temp\junit4-J1-20180901_212053_7188746361911892933610.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 168 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\stempel\test\temp\junit4-J0-20180901_212059_69316933531153791481367.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\stempel\test\temp\junit4-J1-20180901_212059_69310225268018234863481.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 161 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\backward-codecs\test\temp\junit4-J1-20180901_212103_6674430289895830866206.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 19 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\backward-codecs\test\temp\junit4-J0-20180901_212103_66810482922335738531221.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 1409 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\benchmark\test\temp\junit4-J0-20180901_212135_5729783610998323311893.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\benchmark\test\temp\junit4-J1-20180901_212135_57212232419040353770467.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 252 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\classification\test\temp\junit4-J0-20180901_212146_8581437493182715511128.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\classification\test\temp\junit4-J1-20180901_212146_8595261853586375068823.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 262 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\test\temp\junit4-J0-20180901_212202_1858565840894208352768.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\codecs\test\temp\junit4-J1-20180901_212202_18516361337033861388071.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 235 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\demo\test\temp\junit4-J0-20180901_212324_9243409093708268147497.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\demo\test\temp\junit4-J1-20180901_212324_9246224451950223344751.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 177 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\expressions\test\temp\junit4-J1-20180901_212328_71816583732265680228886.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\expressions\test\temp\junit4-J0-20180901_212328_71811142274144391869696.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 238 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\facet\test\temp\junit4-J0-20180901_212334_5609344832092625398631.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\facet\test\temp\junit4-J1-20180901_212334_56014653794666735324846.syserr
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 186 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\grouping\test\temp\junit4-J0-20180901_212400_5139913472055006303616.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\grouping\test\temp\junit4-J1-20180901_212400_5125940706687846316029.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 259 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\highlighter\test\temp\junit4-J1-20180901_212406_79211350325319496003470.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\highlighter\test\temp\junit4-J0-20180901_212406_79216135139316027991633.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 167 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\join\test\temp\junit4-J0-20180901_212424_280913753039892797348.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\join\test\temp\junit4-J1-20180901_212424_28012503078202493698238.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 156 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\memory\test\temp\junit4-J0-20180901_212434_591742422776898791697.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\memory\test\temp\junit4-J1-20180901_212434_5913619341128800549861.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 196 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\misc\test\temp\junit4-J0-20180901_212440_8418153915314398552531.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\misc\test\temp\junit4-J1-20180901_212440_8418203833475733795717.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 310 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\queries\test\temp\junit4-J0-20180901_212459_81511867843798372290254.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\queries\test\temp\junit4-J1-20180901_212459_81513334624089266870942.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 235 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\queryparser\test\temp\junit4-J0-20180901_212508_9061588058890182761903.syserr
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\queryparser\test\temp\junit4-J1-20180901_212508_9065988289131770882237.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 220 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\replicator\test\temp\junit4-J1-20180901_212518_3806065558666024124935.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\replicator\test\temp\junit4-J0-20180901_212518_380618957336946480949.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 201 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\sandbox\test\temp\junit4-J0-20180901_212535_4854444180523091865009.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 14 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\sandbox\test\temp\junit4-J1-20180901_212535_4852308928280221669191.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 297 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial-extras\test\temp\junit4-J1-20180901_212624_05914029636569266202426.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial-extras\test\temp\junit4-J0-20180901_212624_0598552427637765061548.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 187 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial3d\test\temp\junit4-J0-20180901_212657_844658767452631003384.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 11 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial3d\test\temp\junit4-J1-20180901_212657_84416007153807319608496.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 152 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\spatial\test\temp\junit4-J0-20180901_212716_09513627490373045155193.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 256 lines...]
   [junit4] JVM J0: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\suggest\test\temp\junit4-J0-20180901_212719_8094824753893601140102.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\suggest\test\temp\junit4-J1-20180901_212719_8094534193547745836760.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3002 lines...]
   [junit4] Suite: org.apache.solr.cloud.cdcr.CdcrBidirectionalTest
   [junit4]   2> Creating dataDir: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\init-core-data-001
   [junit4]   2> 1596059 WARN  (SUITE-CdcrBidirectionalTest-seed#[F99E706FED9975AE]-worker) [    ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=2 numCloses=2
   [junit4]   2> 1596059 INFO  (SUITE-CdcrBidirectionalTest-seed#[F99E706FED9975AE]-worker) [    ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 1596060 INFO  (SUITE-CdcrBidirectionalTest-seed#[F99E706FED9975AE]-worker) [    ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 1596060 INFO  (SUITE-CdcrBidirectionalTest-seed#[F99E706FED9975AE]-worker) [    ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 1596062 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.SolrTestCaseJ4 ###Starting testBiDir
   [junit4]   2> 1596062 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.MiniSolrCloudCluster Starting cluster of 1 servers in C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster2-001
   [junit4]   2> 1596062 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 1596062 INFO  (Thread-1632) [    ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 1596062 INFO  (Thread-1632) [    ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 1596066 ERROR (Thread-1632) [    ] o.a.z.s.ZooKeeperServer ZKShutdownHandler is not registered, so ZooKeeper server won't take any action on ERROR or SHUTDOWN server state changes
   [junit4]   2> 1596163 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.ZkTestServer start zk server on port:63695
   [junit4]   2> 1596166 INFO  (zkConnectionManagerCallback-5131-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1596170 INFO  (jetty-launcher-5128-thread-1) [    ] o.e.j.s.Server jetty-9.4.11.v20180605; built: 2018-06-05T18:24:03.829Z; git: d5fc0523cfa96bfebfbda19606cad384d772f04c; jvm 10.0.1+10
   [junit4]   2> 1596170 INFO  (jetty-launcher-5128-thread-1) [    ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1596170 INFO  (jetty-launcher-5128-thread-1) [    ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1596171 INFO  (jetty-launcher-5128-thread-1) [    ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1596171 INFO  (jetty-launcher-5128-thread-1) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3cfa4ca3{/solr,null,AVAILABLE}
   [junit4]   2> 1596173 INFO  (jetty-launcher-5128-thread-1) [    ] o.e.j.s.AbstractConnector Started ServerConnector@439073a8{HTTP/1.1,[http/1.1]}{127.0.0.1:63699}
   [junit4]   2> 1596173 INFO  (jetty-launcher-5128-thread-1) [    ] o.e.j.s.Server Started @1596204ms
   [junit4]   2> 1596173 INFO  (jetty-launcher-5128-thread-1) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=63699}
   [junit4]   2> 1596173 ERROR (jetty-launcher-5128-thread-1) [    ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1596174 INFO  (jetty-launcher-5128-thread-1) [    ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1596174 INFO  (jetty-launcher-5128-thread-1) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 1596174 INFO  (jetty-launcher-5128-thread-1) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1596174 INFO  (jetty-launcher-5128-thread-1) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1596174 INFO  (jetty-launcher-5128-thread-1) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2018-09-01T21:55:12.073359Z
   [junit4]   2> 1596176 INFO  (zkConnectionManagerCallback-5133-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1596176 INFO  (jetty-launcher-5128-thread-1) [    ] o.a.s.s.SolrDispatchFilter solr.xml found in ZooKeeper. Loading...
   [junit4]   2> 1596186 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn Unable to read additional data from client sessionid 0x1005583b7d60001, likely client has closed socket
   [junit4]   2> 1596659 INFO  (jetty-launcher-5128-thread-1) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:63695/solr
   [junit4]   2> 1596661 INFO  (zkConnectionManagerCallback-5137-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1596663 INFO  (zkConnectionManagerCallback-5139-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1596766 INFO  (jetty-launcher-5128-thread-1) [n:127.0.0.1:63699_solr    ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:63699_solr
   [junit4]   2> 1596767 INFO  (jetty-launcher-5128-thread-1) [n:127.0.0.1:63699_solr    ] o.a.s.c.Overseer Overseer (id=72151618251259907-127.0.0.1:63699_solr-n_0000000000) starting
   [junit4]   2> 1596776 INFO  (zkConnectionManagerCallback-5146-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1596778 INFO  (jetty-launcher-5128-thread-1) [n:127.0.0.1:63699_solr    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:63695/solr ready
   [junit4]   2> 1596779 INFO  (jetty-launcher-5128-thread-1) [n:127.0.0.1:63699_solr    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:63699_solr
   [junit4]   2> 1596779 INFO  (OverseerStateUpdate-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [n:127.0.0.1:63699_solr    ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:63699_solr
   [junit4]   2> 1596779 DEBUG (OverseerAutoScalingTriggerThread-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Adding .auto_add_replicas and .scheduled_maintenance triggers
   [junit4]   2> 1596781 DEBUG (OverseerAutoScalingTriggerThread-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Refreshing /autoscaling.json with znode version 1
   [junit4]   2> 1596782 DEBUG (OverseerAutoScalingTriggerThread-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Current znodeVersion 1, lastZnodeVersion -1
   [junit4]   2> 1596782 DEBUG (OverseerAutoScalingTriggerThread-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Processed trigger updates upto znodeVersion 1
   [junit4]   2> 1596782 INFO  (OverseerStateUpdate-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [n:127.0.0.1:63699_solr    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1596785 INFO  (zkCallback-5145-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1596787 DEBUG (OverseerAutoScalingTriggerThread-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [    ] o.a.s.c.a.NodeLostTrigger NodeLostTrigger .auto_add_replicas - Initial livenodes: [127.0.0.1:63699_solr]
   [junit4]   2> 1596787 DEBUG (OverseerAutoScalingTriggerThread-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread -- clean old nodeAdded markers
   [junit4]   2> 1596787 DEBUG (OverseerAutoScalingTriggerThread-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Current znodeVersion 1, lastZnodeVersion 1
   [junit4]   2> 1596789 DEBUG (ScheduledTrigger-4127-thread-1) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1596801 INFO  (jetty-launcher-5128-thread-1) [n:127.0.0.1:63699_solr    ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1596819 INFO  (jetty-launcher-5128-thread-1) [n:127.0.0.1:63699_solr    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63699.solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1596825 INFO  (jetty-launcher-5128-thread-1) [n:127.0.0.1:63699_solr    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63699.solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1596825 INFO  (jetty-launcher-5128-thread-1) [n:127.0.0.1:63699_solr    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63699.solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1596826 INFO  (jetty-launcher-5128-thread-1) [n:127.0.0.1:63699_solr    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster2-001\node1\.
   [junit4]   2> 1596845 INFO  (zkConnectionManagerCallback-5149-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1596846 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn Unable to read additional data from client sessionid 0x1005583b7d60005, likely client has closed socket
   [junit4]   2> 1596852 INFO  (zkConnectionManagerCallback-5152-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1596852 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn Unable to read additional data from client sessionid 0x1005583b7d60006, likely client has closed socket
   [junit4]   2> 1596853 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.MiniSolrCloudCluster Starting cluster of 1 servers in C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster1-001
   [junit4]   2> 1596853 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 1596853 INFO  (Thread-1642) [    ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 1596853 INFO  (Thread-1642) [    ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 1596859 ERROR (Thread-1642) [    ] o.a.z.s.ZooKeeperServer ZKShutdownHandler is not registered, so ZooKeeper server won't take any action on ERROR or SHUTDOWN server state changes
   [junit4]   2> 1596954 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.ZkTestServer start zk server on port:63722
   [junit4]   2> 1596957 INFO  (zkConnectionManagerCallback-5156-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1596961 INFO  (jetty-launcher-5153-thread-1) [    ] o.e.j.s.Server jetty-9.4.11.v20180605; built: 2018-06-05T18:24:03.829Z; git: d5fc0523cfa96bfebfbda19606cad384d772f04c; jvm 10.0.1+10
   [junit4]   2> 1596961 INFO  (jetty-launcher-5153-thread-1) [    ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1596961 INFO  (jetty-launcher-5153-thread-1) [    ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1596961 INFO  (jetty-launcher-5153-thread-1) [    ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1596962 INFO  (jetty-launcher-5153-thread-1) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@2ab369f{/solr,null,AVAILABLE}
   [junit4]   2> 1596962 INFO  (jetty-launcher-5153-thread-1) [    ] o.e.j.s.AbstractConnector Started ServerConnector@5d2875e9{HTTP/1.1,[http/1.1]}{127.0.0.1:63726}
   [junit4]   2> 1596963 INFO  (jetty-launcher-5153-thread-1) [    ] o.e.j.s.Server Started @1596993ms
   [junit4]   2> 1596963 INFO  (jetty-launcher-5153-thread-1) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=63726}
   [junit4]   2> 1596963 ERROR (jetty-launcher-5153-thread-1) [    ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1596963 INFO  (jetty-launcher-5153-thread-1) [    ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1596963 INFO  (jetty-launcher-5153-thread-1) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.0.0
   [junit4]   2> 1596963 INFO  (jetty-launcher-5153-thread-1) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1596963 INFO  (jetty-launcher-5153-thread-1) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1596963 INFO  (jetty-launcher-5153-thread-1) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2018-09-01T21:55:12.862719900Z
   [junit4]   2> 1596965 INFO  (zkConnectionManagerCallback-5158-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1596966 INFO  (jetty-launcher-5153-thread-1) [    ] o.a.s.s.SolrDispatchFilter solr.xml found in ZooKeeper. Loading...
   [junit4]   2> 1597105 INFO  (jetty-launcher-5153-thread-1) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:63722/solr
   [junit4]   2> 1597107 INFO  (zkConnectionManagerCallback-5162-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1597110 INFO  (zkConnectionManagerCallback-5164-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1597222 INFO  (jetty-launcher-5153-thread-1) [n:127.0.0.1:63726_solr    ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:63726_solr
   [junit4]   2> 1597223 INFO  (jetty-launcher-5153-thread-1) [n:127.0.0.1:63726_solr    ] o.a.s.c.Overseer Overseer (id=72151618303229955-127.0.0.1:63726_solr-n_0000000000) starting
   [junit4]   2> 1597232 INFO  (zkConnectionManagerCallback-5171-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1597234 INFO  (jetty-launcher-5153-thread-1) [n:127.0.0.1:63726_solr    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:63722/solr ready
   [junit4]   2> 1597235 INFO  (OverseerStateUpdate-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [n:127.0.0.1:63726_solr    ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:63726_solr
   [junit4]   2> 1597235 INFO  (jetty-launcher-5153-thread-1) [n:127.0.0.1:63726_solr    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:63726_solr
   [junit4]   2> 1597236 DEBUG (OverseerAutoScalingTriggerThread-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Adding .auto_add_replicas and .scheduled_maintenance triggers
   [junit4]   2> 1597237 INFO  (OverseerStateUpdate-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [n:127.0.0.1:63726_solr    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1597238 DEBUG (OverseerAutoScalingTriggerThread-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Refreshing /autoscaling.json with znode version 1
   [junit4]   2> 1597239 DEBUG (OverseerAutoScalingTriggerThread-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Current znodeVersion 1, lastZnodeVersion -1
   [junit4]   2> 1597239 DEBUG (OverseerAutoScalingTriggerThread-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Processed trigger updates upto znodeVersion 1
   [junit4]   2> 1597244 DEBUG (OverseerAutoScalingTriggerThread-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [    ] o.a.s.c.a.NodeLostTrigger NodeLostTrigger .auto_add_replicas - Initial livenodes: []
   [junit4]   2> 1597245 DEBUG (OverseerAutoScalingTriggerThread-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread -- clean old nodeAdded markers
   [junit4]   2> 1597250 DEBUG (OverseerAutoScalingTriggerThread-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [    ] o.a.s.c.a.OverseerTriggerThread Current znodeVersion 1, lastZnodeVersion 1
   [junit4]   2> 1597251 INFO  (zkCallback-5170-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1597251 DEBUG (ScheduledTrigger-4140-thread-1) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1597264 INFO  (jetty-launcher-5153-thread-1) [n:127.0.0.1:63726_solr    ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1597286 INFO  (jetty-launcher-5153-thread-1) [n:127.0.0.1:63726_solr    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63726.solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1597292 INFO  (jetty-launcher-5153-thread-1) [n:127.0.0.1:63726_solr    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63726.solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1597292 INFO  (jetty-launcher-5153-thread-1) [n:127.0.0.1:63726_solr    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63726.solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1597293 INFO  (jetty-launcher-5153-thread-1) [n:127.0.0.1:63726_solr    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster1-001\node1\.
   [junit4]   2> 1597317 INFO  (zkConnectionManagerCallback-5174-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1597321 INFO  (zkConnectionManagerCallback-5177-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1597322 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.c.CdcrBidirectionalTest cluster2 zkHost = 127.0.0.1:63695/solr
   [junit4]   2> 1597322 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.c.CdcrBidirectionalTest cluster1 zkHost = 127.0.0.1:63722/solr
   [junit4]   2> 1597325 INFO  (zkConnectionManagerCallback-5179-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1597328 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn Unable to read additional data from client sessionid 0x1005583baef0007, likely client has closed socket
   [junit4]   2> 1597331 INFO  (zkConnectionManagerCallback-5183-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1597332 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1597332 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:63722/solr ready
   [junit4]   2> 1597338 INFO  (qtp704377026-11271) [n:127.0.0.1:63726_solr    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=cdcr-cluster1&maxShardsPerNode=2&name=cdcr-cluster1&nrtReplicas=1&action=CREATE&numShards=2&property.solr.directoryFactory=solr.StandardDirectoryFactory&wt=javabin&version=2 and sendToOCPQueue=true
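
For reference, the parameter string above corresponds to the test's collection CREATE call against cluster1 (zkHost 127.0.0.1:63722/solr, per the earlier "cluster1 zkHost" log line). A rough SolrJ equivalent is sketched below; this is an illustration only, not the test's actual source, and the client setup and class name are assumptions:

    import java.util.Collections;
    import java.util.Optional;
    import org.apache.solr.client.solrj.impl.CloudSolrClient;
    import org.apache.solr.client.solrj.request.CollectionAdminRequest;

    public class CreateCdcrCluster1Sketch {
      public static void main(String[] args) throws Exception {
        try (CloudSolrClient client = new CloudSolrClient.Builder(
                Collections.singletonList("127.0.0.1:63722"), Optional.of("/solr")).build()) {
          // name=cdcr-cluster1, collection.configName=cdcr-cluster1, numShards=2, nrtReplicas=1
          CollectionAdminRequest.Create create =
              CollectionAdminRequest.createCollection("cdcr-cluster1", "cdcr-cluster1", 2, 1);
          create.setMaxShardsPerNode(2);  // maxShardsPerNode=2
          // the logged request also sets property.solr.directoryFactory=solr.StandardDirectoryFactory
          create.process(client);
        }
      }
    }
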
   [junit4]   2> 1597342 INFO  (OverseerThreadFactory-4142-thread-1-processing-n:127.0.0.1:63726_solr) [n:127.0.0.1:63726_solr    ] o.a.s.c.a.c.CreateCollectionCmd Create collection cdcr-cluster1
   [junit4]   2> 1597452 INFO  (OverseerStateUpdate-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [n:127.0.0.1:63726_solr    ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"cdcr-cluster1",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "core":"cdcr-cluster1_shard1_replica_n1",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:63726/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 1597455 INFO  (OverseerStateUpdate-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [n:127.0.0.1:63726_solr    ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"cdcr-cluster1",
   [junit4]   2>   "shard":"shard2",
   [junit4]   2>   "core":"cdcr-cluster1_shard2_replica_n2",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:63726/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 1597660 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr    x:cdcr-cluster1_shard1_replica_n1] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=cdcr-cluster1&newCollection=true&collection=cdcr-cluster1&version=2&replicaType=NRT&coreNodeName=core_node3&name=cdcr-cluster1_shard1_replica_n1&action=CREATE&numShards=2&shard=shard1&property.solr.directoryFactory=solr.StandardDirectoryFactory&wt=javabin
   [junit4]   2> 1597660 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr    x:cdcr-cluster1_shard2_replica_n2] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=cdcr-cluster1&newCollection=true&collection=cdcr-cluster1&version=2&replicaType=NRT&coreNodeName=core_node4&name=cdcr-cluster1_shard2_replica_n2&action=CREATE&numShards=2&shard=shard2&property.solr.directoryFactory=solr.StandardDirectoryFactory&wt=javabin
   [junit4]   2> 1597661 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr    x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 2147483647 transient cores
   [junit4]   2> 1597790 DEBUG (ScheduledTrigger-4127-thread-3) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1598254 DEBUG (ScheduledTrigger-4140-thread-2) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1598672 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 1598672 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 1598684 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.s.IndexSchema [cdcr-cluster1_shard1_replica_n1] Schema name=minimal
   [junit4]   2> 1598684 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.s.IndexSchema [cdcr-cluster1_shard2_replica_n2] Schema name=minimal
   [junit4]   2> 1598685 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.s.IndexSchema Loaded schema minimal/1.1 with uniqueid field id
   [junit4]   2> 1598685 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.s.IndexSchema Loaded schema minimal/1.1 with uniqueid field id
   [junit4]   2> 1598685 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.CoreContainer Creating SolrCore 'cdcr-cluster1_shard1_replica_n1' using configuration from collection cdcr-cluster1, trusted=true
   [junit4]   2> 1598685 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.CoreContainer Creating SolrCore 'cdcr-cluster1_shard2_replica_n2' using configuration from collection cdcr-cluster1, trusted=true
   [junit4]   2> 1598686 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63726.solr.core.cdcr-cluster1.shard2.replica_n2' (registry 'solr.core.cdcr-cluster1.shard2.replica_n2') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1598686 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63726.solr.core.cdcr-cluster1.shard1.replica_n1' (registry 'solr.core.cdcr-cluster1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1598686 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 1598686 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 1598686 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.SolrCore [[cdcr-cluster1_shard2_replica_n2] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster1-001\node1\cdcr-cluster1_shard2_replica_n2], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster1-001\node1\.\cdcr-cluster1_shard2_replica_n2\data\]
   [junit4]   2> 1598686 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.SolrCore [[cdcr-cluster1_shard1_replica_n1] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster1-001\node1\cdcr-cluster1_shard1_replica_n1], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster1-001\node1\.\cdcr-cluster1_shard1_replica_n1\data\]
   [junit4]   2> 1598792 DEBUG (ScheduledTrigger-4127-thread-4) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1598811 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.CdcrUpdateLog
   [junit4]   2> 1598811 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.CdcrUpdateLog
   [junit4]   2> 1598811 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1598811 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1598813 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1598813 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1598813 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1598813 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1598816 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.s.SolrIndexSearcher Opening [Searcher@2c50cff5[cdcr-cluster1_shard2_replica_n2] main]
   [junit4]   2> 1598816 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.s.SolrIndexSearcher Opening [Searcher@7bed168c[cdcr-cluster1_shard1_replica_n1] main]
   [junit4]   2> 1598818 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/cdcr-cluster1
   [junit4]   2> 1598818 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/cdcr-cluster1
   [junit4]   2> 1598818 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/cdcr-cluster1
   [junit4]   2> 1598818 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/cdcr-cluster1
   [junit4]   2> 1598818 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/cdcr-cluster1/managed-schema
   [junit4]   2> 1598818 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/cdcr-cluster1/managed-schema
   [junit4]   2> 1598819 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.s.ZkIndexSchemaReader Current schema version 0 is already the latest
   [junit4]   2> 1598819 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.s.ZkIndexSchemaReader Current schema version 0 is already the latest
   [junit4]   2> 1598819 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 1598819 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 1598822 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.h.CdcrBufferStateManager Created znode /collections/cdcr-cluster1/cdcr/state/buffer
   [junit4]   2> 1598822 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.h.CdcrProcessStateManager Created znode /collections/cdcr-cluster1/cdcr/state/process
   [junit4]   2> 1598824 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1610443825844584448
   [junit4]   2> 1598824 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1610443825844584448
   [junit4]   2> 1598824 INFO  (searcherExecutor-4148-thread-1-processing-n:127.0.0.1:63726_solr x:cdcr-cluster1_shard1_replica_n1 c:cdcr-cluster1 s:shard1 r:core_node3) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.SolrCore [cdcr-cluster1_shard1_replica_n1] Registered new searcher Searcher@7bed168c[cdcr-cluster1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1598825 INFO  (searcherExecutor-4147-thread-1-processing-n:127.0.0.1:63726_solr x:cdcr-cluster1_shard2_replica_n2 c:cdcr-cluster1 s:shard2 r:core_node4) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.SolrCore [cdcr-cluster1_shard2_replica_n2] Registered new searcher Searcher@2c50cff5[cdcr-cluster1_shard2_replica_n2] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1598829 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.ZkShardTerms Successful update of terms at /collections/cdcr-cluster1/terms/shard2 to Terms{values={core_node4=0}, version=0}
   [junit4]   2> 1598830 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.ZkShardTerms Successful update of terms at /collections/cdcr-cluster1/terms/shard1 to Terms{values={core_node3=0}, version=0}
   [junit4]   2> 1598831 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1598831 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1598831 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:63726/solr/cdcr-cluster1_shard2_replica_n2/
   [junit4]   2> 1598831 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1598831 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1598831 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1598831 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:63726/solr/cdcr-cluster1_shard1_replica_n1/
   [junit4]   2> 1598831 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.SyncStrategy http://127.0.0.1:63726/solr/cdcr-cluster1_shard2_replica_n2/ has no replicas
   [junit4]   2> 1598831 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 1598835 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1598836 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.SyncStrategy http://127.0.0.1:63726/solr/cdcr-cluster1_shard1_replica_n1/ has no replicas
   [junit4]   2> 1598836 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 1598839 INFO  (zkCallback-5163-thread-1) [    ] o.a.s.h.CdcrLeaderStateManager Received new leader state @ cdcr-cluster1:shard2
   [junit4]   2> 1598840 INFO  (zkCallback-5163-thread-2) [    ] o.a.s.h.CdcrLeaderStateManager Received new leader state @ cdcr-cluster1:shard1
   [junit4]   2> 1598843 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:63726/solr/cdcr-cluster1_shard2_replica_n2/ shard2
   [junit4]   2> 1598844 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:63726/solr/cdcr-cluster1_shard1_replica_n1/ shard1
   [junit4]   2> 1598947 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 1598950 INFO  (qtp704377026-11267) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard1 r:core_node3 x:cdcr-cluster1_shard1_replica_n1] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=cdcr-cluster1&newCollection=true&collection=cdcr-cluster1&version=2&replicaType=NRT&coreNodeName=core_node3&name=cdcr-cluster1_shard1_replica_n1&action=CREATE&numShards=2&shard=shard1&property.solr.directoryFactory=solr.StandardDirectoryFactory&wt=javabin} status=0 QTime=1290
   [junit4]   2> 1598996 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 1598999 INFO  (qtp704377026-11270) [n:127.0.0.1:63726_solr c:cdcr-cluster1 s:shard2 r:core_node4 x:cdcr-cluster1_shard2_replica_n2] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=cdcr-cluster1&newCollection=true&collection=cdcr-cluster1&version=2&replicaType=NRT&coreNodeName=core_node4&name=cdcr-cluster1_shard2_replica_n2&action=CREATE&numShards=2&shard=shard2&property.solr.directoryFactory=solr.StandardDirectoryFactory&wt=javabin} status=0 QTime=1339
   [junit4]   2> 1599002 INFO  (qtp704377026-11271) [n:127.0.0.1:63726_solr    ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 30 seconds. Check all shard replicas
   [junit4]   2> 1599101 INFO  (zkCallback-5163-thread-2) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/cdcr-cluster1/state.json] for collection [cdcr-cluster1] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1599253 DEBUG (ScheduledTrigger-4140-thread-1) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1599345 INFO  (OverseerCollectionConfigSetProcessor-72151618303229955-127.0.0.1:63726_solr-n_0000000000) [n:127.0.0.1:63726_solr    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1599793 DEBUG (ScheduledTrigger-4127-thread-3) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1600002 INFO  (qtp704377026-11271) [n:127.0.0.1:63726_solr    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=cdcr-cluster1&maxShardsPerNode=2&name=cdcr-cluster1&nrtReplicas=1&action=CREATE&numShards=2&property.solr.directoryFactory=solr.StandardDirectoryFactory&wt=javabin&version=2} status=0 QTime=2663
   [junit4]   2> 1600005 INFO  (zkConnectionManagerCallback-5187-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1600007 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [    ] o.a.z.s.NIOServerCnxn Unable to read additional data from client sessionid 0x1005583b7d60007, likely client has closed socket
   [junit4]   2> 1600012 INFO  (zkConnectionManagerCallback-5191-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1600013 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1600013 INFO  (TEST-CdcrBidirectionalTest.testBiDir-seed#[F99E706FED9975AE]) [    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:63695/solr ready
   [junit4]   2> 1600015 INFO  (qtp1359337204-11215) [n:127.0.0.1:63699_solr    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=cdcr-cluster2&maxShardsPerNode=2&name=cdcr-cluster2&nrtReplicas=1&action=CREATE&numShards=2&property.solr.directoryFactory=solr.StandardDirectoryFactory&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1600018 INFO  (OverseerThreadFactory-4129-thread-1-processing-n:127.0.0.1:63699_solr) [n:127.0.0.1:63699_solr    ] o.a.s.c.a.c.CreateCollectionCmd Create collection cdcr-cluster2
   [junit4]   2> 1600128 INFO  (OverseerStateUpdate-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [n:127.0.0.1:63699_solr    ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"cdcr-cluster2",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "core":"cdcr-cluster2_shard1_replica_n1",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:63699/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 1600129 INFO  (OverseerStateUpdate-72151618251259907-127.0.0.1:63699_solr-n_0000000000) [n:127.0.0.1:63699_solr    ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"cdcr-cluster2",
   [junit4]   2>   "shard":"shard2",
   [junit4]   2>   "core":"cdcr-cluster2_shard2_replica_n2",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:63699/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 1600254 DEBUG (ScheduledTrigger-4140-thread-4) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1600334 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr    x:cdcr-cluster2_shard2_replica_n2] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=cdcr-cluster2&newCollection=true&collection=cdcr-cluster2&version=2&replicaType=NRT&coreNodeName=core_node4&name=cdcr-cluster2_shard2_replica_n2&action=CREATE&numShards=2&shard=shard2&property.solr.directoryFactory=solr.StandardDirectoryFactory&wt=javabin
   [junit4]   2> 1600334 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr    x:cdcr-cluster2_shard1_replica_n1] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=cdcr-cluster2&newCollection=true&collection=cdcr-cluster2&version=2&replicaType=NRT&coreNodeName=core_node3&name=cdcr-cluster2_shard1_replica_n1&action=CREATE&numShards=2&shard=shard1&property.solr.directoryFactory=solr.StandardDirectoryFactory&wt=javabin
   [junit4]   2> 1600334 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr    x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 2147483647 transient cores
   [junit4]   2> 1600794 DEBUG (ScheduledTrigger-4127-thread-4) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1601255 DEBUG (ScheduledTrigger-4140-thread-1) [    ] o.a.s.c.a.NodeLostTrigger Running NodeLostTrigger: .auto_add_replicas with currently live nodes: 1
   [junit4]   2> 1601355 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 1601355 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.0.0
   [junit4]   2> 1601366 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.s.IndexSchema [cdcr-cluster2_shard1_replica_n1] Schema name=minimal
   [junit4]   2> 1601366 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.s.IndexSchema [cdcr-cluster2_shard2_replica_n2] Schema name=minimal
   [junit4]   2> 1601370 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.s.IndexSchema Loaded schema minimal/1.1 with uniqueid field id
   [junit4]   2> 1601370 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.s.IndexSchema Loaded schema minimal/1.1 with uniqueid field id
   [junit4]   2> 1601370 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.CoreContainer Creating SolrCore 'cdcr-cluster2_shard1_replica_n1' using configuration from collection cdcr-cluster2, trusted=true
   [junit4]   2> 1601370 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.CoreContainer Creating SolrCore 'cdcr-cluster2_shard2_replica_n2' using configuration from collection cdcr-cluster2, trusted=true
   [junit4]   2> 1601370 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63699.solr.core.cdcr-cluster2.shard1.replica_n1' (registry 'solr.core.cdcr-cluster2.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1601370 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 1601370 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_63699.solr.core.cdcr-cluster2.shard2.replica_n2' (registry 'solr.core.cdcr-cluster2.shard2.replica_n2') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@61742141
   [junit4]   2> 1601370 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.SolrCore [[cdcr-cluster2_shard1_replica_n1] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster2-001\node1\cdcr-cluster2_shard1_replica_n1], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster2-001\node1\.\cdcr-cluster2_shard1_replica_n1\data\]
   [junit4]   2> 1601370 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 1601370 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.SolrCore [[cdcr-cluster2_shard2_replica_n2] ] Opening new SolrCore at [C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster2-001\node1\cdcr-cluster2_shard2_replica_n2], dataDir=[C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\J0\temp\solr.cloud.cdcr.CdcrBidirectionalTest_F99E706FED9975AE-001\cdcr-cluster2-001\node1\.\cdcr-cluster2_shard2_replica_n2\data\]
   [junit4]   2> 1601473 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.CdcrUpdateLog
   [junit4]   2> 1601473 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1601473 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.CdcrUpdateLog
   [junit4]   2> 1601473 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1601478 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1601478 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1601478 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1601478 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1601482 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.s.SolrIndexSearcher Opening [Searcher@12967cfc[cdcr-cluster2_shard1_replica_n1] main]
   [junit4]   2> 1601482 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.s.SolrIndexSearcher Opening [Searcher@378f6e34[cdcr-cluster2_shard2_replica_n2] main]
   [junit4]   2> 1601483 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/cdcr-cluster2
   [junit4]   2> 1601483 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/cdcr-cluster2
   [junit4]   2> 1601483 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/cdcr-cluster2
   [junit4]   2> 1601483 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/cdcr-cluster2
   [junit4]   2> 1601483 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/cdcr-cluster2/managed-schema
   [junit4]   2> 1601483 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/cdcr-cluster2/managed-schema
   [junit4]   2> 1601483 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.s.ZkIndexSchemaReader Current schema version 0 is already the latest
   [junit4]   2> 1601483 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.s.ZkIndexSchemaReader Current schema version 0 is already the latest
   [junit4]   2> 1601483 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 1601483 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 1601487 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.h.CdcrBufferStateManager Created znode /collections/cdcr-cluster2/cdcr/state/buffer
   [junit4]   2> 1601488 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.h.CdcrProcessStateManager Created znode /collections/cdcr-cluster2/cdcr/state/process
   [junit4]   2> 1601489 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1610443828639039488
   [junit4]   2> 1601489 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1610443828639039488
   [junit4]   2> 1601489 INFO  (searcherExecutor-4160-thread-1-processing-n:127.0.0.1:63699_solr x:cdcr-cluster2_shard2_replica_n2 c:cdcr-cluster2 s:shard2 r:core_node4) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.SolrCore [cdcr-cluster2_shard2_replica_n2] Registered new searcher Searcher@378f6e34[cdcr-cluster2_shard2_replica_n2] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1601489 INFO  (searcherExecutor-4159-thread-1-processing-n:127.0.0.1:63699_solr x:cdcr-cluster2_shard1_replica_n1 c:cdcr-cluster2 s:shard1 r:core_node3) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.SolrCore [cdcr-cluster2_shard1_replica_n1] Registered new searcher Searcher@12967cfc[cdcr-cluster2_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1601495 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.ZkShardTerms Successful update of terms at /collections/cdcr-cluster2/terms/shard2 to Terms{values={core_node4=0}, version=0}
   [junit4]   2> 1601496 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.ZkShardTerms Successful update of terms at /collections/cdcr-cluster2/terms/shard1 to Terms{values={core_node3=0}, version=0}
   [junit4]   2> 1601498 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1601498 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1601498 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:63699/solr/cdcr-cluster2_shard2_replica_n2/
   [junit4]   2> 1601498 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1601498 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.SyncStrategy http://127.0.0.1:63699/solr/cdcr-cluster2_shard2_replica_n2/ has no replicas
   [junit4]   2> 1601498 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 1601499 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1601499 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1601499 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:63699/solr/cdcr-cluster2_shard1_replica_n1/
   [junit4]   2> 1601499 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1601500 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.SyncStrategy http://127.0.0.1:63699/solr/cdcr-cluster2_shard1_replica_n1/ has no replicas
   [junit4]   2> 1601500 INFO  (qtp1359337204-11216) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard1 r:core_node3 x:cdcr-cluster2_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 1601501 INFO  (zkCallback-5138-thread-1) [    ] o.a.s.h.CdcrLeaderStateManager Received new leader state @ cdcr-cluster2:shard2
   [junit4]   2> 1601502 INFO  (zkCallback-5138-thread-1) [    ] o.a.s.h.CdcrLeaderStateManager Received new leader state @ cdcr-cluster2:shard1
   [junit4]   2> 1601504 INFO  (qtp1359337204-11213) [n:127.0.0.1:63699_solr c:cdcr-cluster2 s:shard2 r:core_node4 x:cdcr-cluster2_shard2_replica_n2] o.a.s.c.ShardLeaderElectionContext I am the new leader: ht

[...truncated too long message...]

s\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-interceptors-exception-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-interceptors-journal-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-interceptors-normalization-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-interceptors-operational-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-interceptors-referral-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-interceptors-schema-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-interceptors-subtree-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-interceptors-trigger-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-jdbm-partition-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-jdbm1-2.0.0-M2.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-kerberos-codec-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-ldif-partition-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-mavibot-partition-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-protocol-kerberos-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-protocol-ldap-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-protocol-shared-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\apacheds-xdbm-partition-2.0.0-M15.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\api-all-1.0.0-M20.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\bcprov-jdk15on-1.54.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\byte-buddy-1.6.2.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\commons-collections-3.2.2.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\commons-math3-3.6.1.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\ehcache-core-2.4.4.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\hadoop-common-2.7.4-tests.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\hadoop-hdfs-2.7.4-tests.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\hadoop-minikdc-2.7.4.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\htrace-core-3.2.0-incubating.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\jersey-core-1.9.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\jersey-server-1.9.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\jetty-6.1.26.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\jetty-sslengine-6.1.26.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\jetty-util-6.1.26.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\mina-core-2.0.0-M5.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test
-lib\mockito-core-2.6.2.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\netty-all-4.0.36.Final.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\core\test-lib\objenesis-2.5.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\analysis\icu\lucene-analyzers-icu-8.0.0-SNAPSHOT.jar;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\contrib\solr-analysis-extras\classes\java;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\queryparser\classes\test;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\build\backward-codecs\classes\test;C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\contrib\analysis-extras\lib\icu4j-62.1.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-launcher.jar;C:\Users\jenkins\.ant\lib\ivy-2.4.0.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-antlr.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-apache-bcel.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-apache-bsf.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-apache-log4j.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-apache-oro.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-apache-regexp.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-apache-resolver.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-apache-xalan2.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-commons-logging.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-commons-net.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-jai.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-javamail.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-jdepend.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-jmf.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-jsch.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-junit.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-junit4.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-netrexx.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-swing.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant-testutil.jar;C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2\lib\ant.jar;C:\Users\jenkins\.ivy2\cache\com.carrotsearch.randomizedtesting\junit4-ant\jars\junit4-ant-2.6.0.jar com.carrotsearch.ant.tasks.junit4.slave.SlaveMainSafe -eventsfile C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\temp\junit4-J1-20180901_212835_801495707084595812642.events @C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\temp\junit4-J1-20180901_212835_80116187403350568917641.suites -stdin
   [junit4] ERROR: JVM J1 ended with an exception: Forked process returned with error code: 1. Very likely a JVM crash.  See process stdout at: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\temp\junit4-J1-20180901_212835_80116092720935537144356.sysout See process stderr at: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\temp\junit4-J1-20180901_212835_8015323821094269289828.syserr
   [junit4] 	at com.carrotsearch.ant.tasks.junit4.JUnit4.executeSlave(JUnit4.java:1519)
   [junit4] 	at com.carrotsearch.ant.tasks.junit4.JUnit4.access$000(JUnit4.java:126)
   [junit4] 	at com.carrotsearch.ant.tasks.junit4.JUnit4$2.call(JUnit4.java:982)
   [junit4] 	at com.carrotsearch.ant.tasks.junit4.JUnit4$2.call(JUnit4.java:979)
   [junit4] 	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
   [junit4] 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
   [junit4] 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
   [junit4] 	at java.base/java.lang.Thread.run(Thread.java:844)

BUILD FAILED
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\build.xml:633: The following error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\build.xml:577: The following error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\build.xml:59: The following error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build.xml:267: The following error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\common-build.xml:558: The following error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\common-build.xml:1568: The following error occurred while executing this line:
C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\lucene\common-build.xml:1092: At least one slave process threw an exception, first: Forked process returned with error code: 1. Very likely a JVM crash.  See process stdout at: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\temp\junit4-J1-20180901_212835_80116092720935537144356.sysout See process stderr at: C:\Users\jenkins\workspace\Lucene-Solr-master-Windows\solr\build\solr-core\test\temp\junit4-J1-20180901_212835_8015323821094269289828.syserr

Total time: 91 minutes 21 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Setting ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Setting ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
Setting ANT_1_8_2_HOME=C:\Users\jenkins\tools\hudson.tasks.Ant_AntInstallation\ANT_1.8.2
