From: Ted Yu
To: Madabhattula Rajesh Kumar
Cc: user@spark.apache.org, user@spark.incubator.apache.org
Date: Tue, 22 Dec 2015 10:39:36 -0800
Subject: Re: Stand Alone Cluster - Strange issue

This should be related:
https://issues.apache.org/jira/browse/SPARK-4170

On Tue, Dec 22, 2015 at 9:34 AM, Madabhattula Rajesh Kumar <mrajaforu@gmail.com> wrote:

> Hi,
>
> I have a standalone cluster: one master and one slave. I'm getting the
> NullPointerException shown below.
>
> Could you please help me with this issue?
>
> *Code Block :-*
>
>     val accum = sc.accumulator(0)
>     sc.parallelize(Array(1, 2, 3, 4)).foreach(x => accum += x)   // ==> this line throws the exception
>
> *Exception :-*
>
> 15/12/22 09:18:26 WARN scheduler.TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, 172.25.111.123): java.lang.NullPointerException
>         at com.cc.ss.etl.Main$$anonfun$1.apply$mcVI$sp(Main.scala:25)
>         at com.cc.ss.etl.Main$$anonfun$1.apply(Main.scala:25)
>         at com.cc.ss.etl.Main$$anonfun$1.apply(Main.scala:25)
>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>         at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
>         at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$28.apply(RDD.scala:890)
>         at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$28.apply(RDD.scala:890)
>         at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
>         at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>         at org.apache.spark.scheduler.Task.run(Task.scala:88)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
>
> 15/12/22 09:18:26 INFO scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) on executor 172.25.111.123: java.lang.NullPointerException (null) [duplicate 1]
> 15/12/22 09:18:26 INFO scheduler.TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, 172.25.111.123, PROCESS_LOCAL, 2155 bytes)
> 15/12/22 09:18:26 INFO scheduler.TaskSetManager: Starting task 1.1 in stage 0.0 (TID 3, 172.25.111.123, PROCESS_LOCAL, 2155 bytes)
> 15/12/22 09:18:26 WARN scheduler.TaskSetManager: Lost task 1.1 in stage 0.0 (TID 3, 172.25.111.123):
>
> Regards,
> Rajesh
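SPARK-4170 covers closures misbehaving when the driver program is declared as an object that uses `extends App`: because `App` relies on Scala's DelayedInit, fields such as the accumulator can still be null when the closure is serialized and shipped to an executor, which would be consistent with the NPE inside the anonymous function at Main.scala:25. If that is the cause here, the usual workaround is to define an explicit main() method instead. Below is a minimal sketch against the Spark 1.x API used in this thread; the object name and app name are illustrative, not taken from the original job:

    import org.apache.spark.{SparkConf, SparkContext}

    // Using an explicit main() instead of `extends App`, so the accumulator
    // field is fully initialized before the foreach closure is serialized
    // to the executors (see SPARK-4170 for the `extends App` pitfall).
    object Main {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("AccumulatorExample")
        val sc = new SparkContext(conf)

        val accum = sc.accumulator(0)
        sc.parallelize(Array(1, 2, 3, 4)).foreach(x => accum += x)

        // Accumulator values are only reliable when read on the driver.
        println("accum = " + accum.value)

        sc.stop()
      }
    }

Packaging this and submitting it to the standalone master with spark-submit should run the same foreach without the NullPointerException, assuming `extends App` was indeed the trigger.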