spark-issues mailing list archives

From "Jungtaek Lim (Jira)" <j...@apache.org>
Subject [jira] [Issue Comment Deleted] (SPARK-29322) History server is stuck reading incomplete event log file compressed with zstd
Date Wed, 02 Oct 2019 03:23:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-29322?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jungtaek Lim updated SPARK-29322:
---------------------------------
    Comment: was deleted

(was: I'll work on a PR proposing to remove zstd from the supported compressions for event logs.
We may want to take a different approach; we can discuss further in the PR.)

> History server is stuck reading incomplete event log file compressed with zstd
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-29322
>                 URL: https://issues.apache.org/jira/browse/SPARK-29322
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Jungtaek Lim
>            Priority: Major
>         Attachments: history-server-1.jstack, history-server-2.jstack, history-server-3.jstack, history-server-4.jstack
>
>
> While working on SPARK-28869, I discovered that reading an in-progress event log file compressed with zstd can leave the reading thread stuck. I experimented with the Spark History Server and observed the same issue; I'll attach the jstack files.
> Only the stack trace of the stuck thread is listed below (it is identical across the jstack files):
> {code}
> 2019-10-02 11:32:36
> Full thread dump Java HotSpot(TM) 64-Bit Server VM (25.191-b12 mixed mode):
> ...
> "qtp2072313080-30" #30 daemon prio=5 os_prio=31 tid=0x00007ff5b90e7800 nid=0x9703 runnable [0x000070000f220000]
>    java.lang.Thread.State: RUNNABLE
> 	at java.io.FileInputStream.readBytes(Native Method)
> 	at java.io.FileInputStream.read(FileInputStream.java:255)
> 	at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileInputStream.read(RawLocalFileSystem.java:156)
> 	at java.io.BufferedInputStream.read1(BufferedInputStream.java:284)
> 	at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
> 	- locked <0x00000007b5f97c60> (a org.apache.hadoop.fs.BufferedFSInputStream)
> 	at java.io.DataInputStream.read(DataInputStream.java:149)
> 	at org.apache.hadoop.fs.FSInputChecker.readFully(FSInputChecker.java:436)
> 	at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.readChunk(ChecksumFileSystem.java:257)
> 	at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:276)
> 	at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:228)
> 	at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:196)
> 	- locked <0x00000007b5f97b58> (a org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker)
> 	at java.io.DataInputStream.read(DataInputStream.java:149)
> 	at java.io.BufferedInputStream.read1(BufferedInputStream.java:284)
> 	at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
> 	- locked <0x00000007b5f97af8> (a java.io.BufferedInputStream)
> 	at com.github.luben.zstd.ZstdInputStream.readInternal(ZstdInputStream.java:129)
> 	at com.github.luben.zstd.ZstdInputStream.read(ZstdInputStream.java:107)
> 	- locked <0x00000007b5f97ac0> (a com.github.luben.zstd.ZstdInputStream)
> 	at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> 	at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
> 	at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
> 	- locked <0x00000007b5cd3bd0> (a java.io.BufferedInputStream)
> 	at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
> 	at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
> 	at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
> 	- locked <0x00000007b5f94a00> (a java.io.InputStreamReader)
> 	at java.io.InputStreamReader.read(InputStreamReader.java:184)
> 	at java.io.BufferedReader.fill(BufferedReader.java:161)
> 	at java.io.BufferedReader.readLine(BufferedReader.java:324)
> 	- locked <0x00000007b5f94a00> (a java.io.InputStreamReader)
> 	at java.io.BufferedReader.readLine(BufferedReader.java:389)
> 	at scala.io.BufferedSource$BufferedLineIterator.hasNext(BufferedSource.scala:74)
> 	at scala.collection.Iterator$$anon$20.hasNext(Iterator.scala:884)
> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:511)
> 	at org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:80)
> 	at org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:58)
> 	at org.apache.spark.deploy.history.FsHistoryProvider.$anonfun$rebuildAppStore$5(FsHistoryProvider.scala:976)
> 	at org.apache.spark.deploy.history.FsHistoryProvider.$anonfun$rebuildAppStore$5$adapted(FsHistoryProvider.scala:975)
> 	at org.apache.spark.deploy.history.FsHistoryProvider$$Lambda$662/1267867461.apply(Unknown Source)
> 	at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2567)
> 	at org.apache.spark.deploy.history.FsHistoryProvider.rebuildAppStore(FsHistoryProvider.scala:975)
> 	at org.apache.spark.deploy.history.FsHistoryProvider.createInMemoryStore(FsHistoryProvider.scala:1093)
> 	at org.apache.spark.deploy.history.FsHistoryProvider.getAppUI(FsHistoryProvider.scala:346)
> 	at org.apache.spark.deploy.history.HistoryServer.getAppUI(HistoryServer.scala:188)
> 	at org.apache.spark.deploy.history.ApplicationCache.$anonfun$loadApplicationEntry$2(ApplicationCache.scala:163)
> 	at org.apache.spark.deploy.history.ApplicationCache$$Lambda$592/2060065989.apply(Unknown Source)
> 	at org.apache.spark.deploy.history.ApplicationCache.time(ApplicationCache.scala:135)
> 	at org.apache.spark.deploy.history.ApplicationCache.org$apache$spark$deploy$history$ApplicationCache$$loadApplicationEntry(ApplicationCache.scala:161)
> 	at org.apache.spark.deploy.history.ApplicationCache$$anon$1.load(ApplicationCache.scala:56)
> 	at org.apache.spark.deploy.history.ApplicationCache$$anon$1.load(ApplicationCache.scala:52)
> 	at org.sparkproject.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
> 	at org.sparkproject.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
> 	at org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
> 	- locked <0x00000007b5cd3de0> (a org.sparkproject.guava.cache.LocalCache$StrongAccessEntry)
> 	at org.sparkproject.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
> 	at org.sparkproject.guava.cache.LocalCache.get(LocalCache.java:4000)
> 	at org.sparkproject.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
> 	at org.sparkproject.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
> 	at org.apache.spark.deploy.history.ApplicationCache.get(ApplicationCache.scala:89)
> 	at org.apache.spark.deploy.history.ApplicationCache.withSparkUI(ApplicationCache.scala:101)
> 	at org.apache.spark.deploy.history.HistoryServer.org$apache$spark$deploy$history$HistoryServer$$loadAppUi(HistoryServer.scala:245)
> 	at org.apache.spark.deploy.history.HistoryServer$$anon$1.doGet(HistoryServer.scala:98)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
> 	at org.sparkproject.jetty.servlet.ServletHolder.handle(ServletHolder.java:873)
> 	at org.sparkproject.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1623)
> 	at org.apache.spark.ui.HttpSecurityFilter.doFilter(HttpSecurityFilter.scala:95)
> 	at org.sparkproject.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1610)
> 	at org.sparkproject.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
> 	at org.sparkproject.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
> 	at org.sparkproject.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1345)
> 	at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
> 	at org.sparkproject.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
> 	at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
> 	at org.sparkproject.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
> 	at org.sparkproject.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
> 	at org.sparkproject.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:753)
> 	at org.sparkproject.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
> 	at org.sparkproject.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
> 	at org.sparkproject.jetty.server.Server.handle(Server.java:505)
> 	at org.sparkproject.jetty.server.HttpChannel.handle(HttpChannel.java:370)
> 	at org.sparkproject.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
> 	at org.sparkproject.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
> 	at org.sparkproject.jetty.io.FillInterest.fillable(FillInterest.java:103)
> 	at org.sparkproject.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
> 	at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
> 	at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
> 	at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
> 	at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
> 	at org.sparkproject.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
> 	at org.sparkproject.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:698)
> 	at org.sparkproject.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:804)
> 	at java.lang.Thread.run(Thread.java:748)
> 2019-10-02 11:33:13
> 2019-10-02 11:34:00
> 2019-10-02 11:38:33
> (The full thread dumps captured at these three later timestamps show the identical stack for thread "qtp2072313080-30", still RUNNABLE in the same read path; the repeated traces are elided here.)
> {code}
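The failure mode in the stack trace above is a read that never returns: the zstd decompressor keeps waiting for the rest of a frame that the still-in-progress event log has not yet written, so the History Server's request thread blocks inside `read` instead of seeing EOF. A minimal stdlib-only sketch of that hazard, using a `PipedInputStream` whose writer never closes as a stand-in for the incomplete compressed stream (this is an illustration, not Spark's or zstd-jni's actual code):

```java
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class StuckReadDemo {
    public static void main(String[] args) throws Exception {
        // A pipe whose writer never closes stands in for an incomplete
        // zstd frame: read() blocks waiting for bytes that may never
        // arrive, instead of returning -1 (EOF).
        PipedOutputStream writer = new PipedOutputStream();
        PipedInputStream truncated = new PipedInputStream(writer);
        writer.write("partial".getBytes());  // stream started, never finished

        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<Integer> pending = pool.submit(() -> {
            byte[] buf = new byte[1024];
            truncated.read(buf);        // consumes the available "partial" bytes
            return truncated.read(buf); // blocks indefinitely, like the read in the dump
        });
        try {
            pending.get(500, TimeUnit.MILLISECONDS);
            System.out.println("read returned");
        } catch (TimeoutException e) {
            // Without a timeout guard like this one, the caller is stuck
            // for as long as the stream stays open -- which is what the
            // History Server thread in the jstack output is doing.
            System.out.println("read is stuck");
        }
        pool.shutdownNow();  // interrupts the blocked reader so the JVM can exit
    }
}
```

The sketch times out and prints that the read is stuck; the real History Server code path has no such guard, so the servlet thread simply never comes back.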



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


