spark-user mailing list archives

From Tim Chou <timchou....@gmail.com>
Subject Re: My task is finished successfully, however, I find some exceptions in webpage.
Date Sat, 04 Oct 2014 16:10:28 GMT
Can anyone help me?
I have found that if I don't use an HDFS file as the input, these exceptions
don't appear.

I searched online and found nothing. How do I debug a Spark program?
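(Editor's note: a common first step for this kind of debugging is to raise Spark's log level. This is a sketch assuming the stock log4j setup that ships as `conf/log4j.properties.template` in Spark 1.x; copy it to `conf/log4j.properties` and change the root level.)

```properties
# conf/log4j.properties -- turn up driver/executor logging for debugging
# (copied from conf/log4j.properties.template; only the root level changed)
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

With DEBUG logging, the driver and executor logs usually show the full exception that the web UI only summarizes.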

Thanks,
Tim

2014-10-03 17:46 GMT-05:00 Tim Chou <timchou.hit@gmail.com>:

> Hi All,
>
> Sorry to disturb you.
>
> I have built a Spark cluster on Mesos.
> I ran some tests in the Spark shell.
>
> It works. However, I can see some exceptions in the web UI.
> scala> val textFile = sc.textFile("hdfs://10.1.2.12:9000/README.md")
> scala> textFile.count()
> 14/10/03 15:20:54 INFO mapred.FileInputFormat: Total input paths to
> process : 1
> 14/10/03 15:20:54 INFO spark.SparkContext: Starting job: count at
> <console>:15
> 14/10/03 15:20:54 INFO scheduler.DAGScheduler: Got job 0 (count at
> <console>:15) with 2 output partitions (allowLocal=false)
> 14/10/03 15:20:54 INFO scheduler.DAGScheduler: Final stage: Stage 0(count
> at <console>:15)
> 14/10/03 15:20:54 INFO scheduler.DAGScheduler: Parents of final stage:
> List()
> 14/10/03 15:20:54 INFO scheduler.DAGScheduler: Missing parents: List()
> 14/10/03 15:20:54 INFO scheduler.DAGScheduler: Submitting Stage 0
> (hdfs://10.1.2.12:9000/README.md MappedRDD[1] at textFile at <console>:12),
> which has no missing parents
> ......
> res0: Long = 141
>
>
> What's the problem? How can I solve it?
>
> Thanks,
> Tim
>
> The error information (from the stage's +details link in the web UI):
> count at <console>:15
> org.apache.spark.rdd.RDD.count(RDD.scala:904)
> $line9.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)
> $line9.$read$$iwC$$iwC$$iwC.<init>(<console>:20)
> $line9.$read$$iwC$$iwC.<init>(<console>:22)
> $line9.$read$$iwC.<init>(<console>:24)
> $line9.$read.<init>(<console>:26)
> $line9.$read$.<init>(<console>:30)
> $line9.$read$.<clinit>(<console>)
> $line9.$eval$.<init>(<console>:7)
> $line9.$eval$.<clinit>(<console>)
> $line9.$eval.$print(<console>)
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> java.lang.reflect.Method.invoke(Method.java:606)
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:789)
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1062)
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:615)
> org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:646)
>
>
