From: kant kodali <kanth909@gmail.com>
To: Fridtjof Sander <fridtjof.sander@googlemail.com>
Cc: Tal Grynbaum <tal.grynbaum@gmail.com>, user@spark.apache.org
Subject: Re: any idea what this error could be?
Date: Sat, 03 Sep 2016 21:59:51 +0000
Message-Id: <756e4411-36b3-0c68-cddc-f9263ec24f53@mixmax.com>
@Fridtjof, you are right!

Changing it to this fixed it:

compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.0.0'
compile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: '2.0.0'

On Sat, Sep 3, 2016 12:30 PM, kant kodali <kanth909@gmail.com> wrote:
I increased the memory but nothing has changed; I still get the same error.

@Fridtjof, on my driver side I am using the following dependencies:
compile group: 'org.apache.spark', name: 'spark-core_2.10', version: '2.0.0'
compile group: 'org.apache.spark', name: 'spark-streaming_2.10', version: '2.0.0'

On the executor side I don't know what jars are being used, but I installed Spark from the spark-2.0.0-bin-hadoop2.7.tgz distribution.
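
A quick way to compare the two sides is to print the Spark and Scala versions on the driver and inside an executor; org.apache.spark.SPARK_VERSION and scala.util.Properties.versionString carry the relevant values. A minimal Scala sketch, assuming a SparkContext that can actually connect (the app name is a placeholder):

import scala.util.Properties
import org.apache.spark.{SparkConf, SparkContext, SPARK_VERSION}

object VersionCheck extends App {
  val sc = new SparkContext(new SparkConf().setAppName("version-check"))
  // Versions on the driver's classpath.
  println(s"driver:   Spark $SPARK_VERSION, ${Properties.versionString}")
  // The same values evaluated inside an executor JVM; a mismatch here is
  // exactly the kind of thing that breaks serialization between the two.
  val remote = sc.parallelize(Seq(0), 1)
    .map(_ => s"Spark $SPARK_VERSION, ${Properties.versionString}")
    .first()
  println(s"executor: $remote")
  sc.stop()
}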



On Sat, Sep 3, 2016 4:20 AM, Fridtjof Sander <fridtjof.sander@googlemail.com> wrote:

There is an InvalidClassException complaining about non-matching serialVersionUIDs. Shouldn't that be caused by different jars on executors and driver?
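
To illustrate the mechanism: when a class carries no explicit @SerialVersionUID, the JVM derives one from the class signature, so the "same" class built into two different jars (e.g. _2.10 vs _2.11 artifacts) can end up with different UIDs. A minimal Scala sketch; MessageV1/MessageV2 are hypothetical stand-ins for the two builds:

import java.io.ObjectStreamClass

@SerialVersionUID(1L)
class MessageV1(val body: String) extends Serializable

// No explicit UID: the JVM derives one from the signature, so any
// structural difference between builds yields a different value.
class MessageV2(val body: String, val retries: Int) extends Serializable

object UidDemo extends App {
  println(ObjectStreamClass.lookup(classOf[MessageV1]).getSerialVersionUID) // 1
  println(ObjectStreamClass.lookup(classOf[MessageV2]).getSerialVersionUID) // derived, build-dependent
  // Deserializing a stream written against one UID with a local class that
  // carries another throws java.io.InvalidClassException, as in the log below.
}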


On Sep 3, 2016 at 1:04 PM, "Tal Grynbaum" <tal.grynbaum@gmail.com> wrote:

My guess is that you're running out of memory somewhere. Try to increase the driver memory and/or executor memory.
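
For what it's worth, executor memory can be raised on the SparkConf, while driver memory has to be set before the driver JVM starts, e.g. via spark-submit. A minimal Scala sketch (the app name and sizes are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("my-app")               // placeholder
  .set("spark.executor.memory", "4g") // memory per executor JVM
// spark.driver.memory is read before the driver JVM launches, so pass it
// on the command line instead: spark-submit --driver-memory 4g ...
val sc = new SparkContext(conf)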


On Sat, Sep 3, 2016, 11:42 kant kodali <kanth909@gmail.com> wrote:
I am running this on AWS.



On Fri, Sep 2, 2016 11:49 PM, kant kodali <kanth909@gmail.com> wrote:
I am running Spark in standalone mode. I get this error when I run my driver program. I am using Spark 2.0.0. Any idea what this error could be?


                  
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/09/02 23:44:44 INFO SparkContext: Running Spark version 2.0.0
16/09/02 23:44:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/02 23:44:45 INFO SecurityManager: Changing view acls to: kantkodali
16/09/02 23:44:45 INFO SecurityManager: Changing modify acls to: kantkodali
16/09/02 23:44:45 INFO SecurityManager: Changing view acls groups to:
16/09/02 23:44:45 INFO SecurityManager: Changing modify acls groups to:
16/09/02 23:44:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(kantkodali); groups with view permissions: Set(); users with modify permissions: Set(kantkodali); groups with modify permissions: Set()
16/09/02 23:44:45 INFO Utils: Successfully started service 'sparkDriver' on port 62256.
16/09/02 23:44:45 INFO SparkEnv: Registering MapOutputTracker
16/09/02 23:44:45 INFO SparkEnv: Registering BlockManagerMaster
16/09/02 23:44:45 INFO DiskBlockManager: Created local directory at /private/var/folders/_6/lfxt933j3bd_xhq0m7dwm8s00000gn/T/blockmgr-b56eea49-0102-4570-865a-1d3d230f0ffc
16/09/02 23:44:45 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
16/09/02 23:44:45 INFO SparkEnv: Registering OutputCommitCoordinator
16/09/02 23:44:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/09/02 23:44:45 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.191:4040
16/09/02 23:44:45 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://52.43.37.223:7077...
16/09/02 23:44:46 INFO TransportClientFactory: Successfully created connection to /52.43.37.223:7077 after 70 ms (0 ms spent in bootstraps)
16/09/02 23:44:46 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 52.43.37.223:7077
org.apache.spark.SparkException: Exception thrown in awaitResult
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:109)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -2221986757032131007, local class serialVersionUID = -5447855329526097695
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:
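
For context, the driver side is just the usual standalone-mode setup pointing at the master URL from the log; a minimal Scala sketch (app name and job are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

object MyDriver extends App {
  // Master URL as reported in the log above; registration fails there
  // because driver and cluster were built against different Scala versions.
  val conf = new SparkConf()
    .setAppName("my-driver") // placeholder
    .setMaster("spark://52.43.37.223:7077")
  val sc = new SparkContext(conf)
  println(sc.parallelize(1 to 100).sum()) // trivial placeholder job
  sc.stop()
}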