Continuing this thread beyond standalone mode, on to clusters: does anyone have experience successfully running a Spark cluster on IPv6-only (not dual-stack) machines? More companies are moving to IPv6, and some, such as Facebook, now allocate new clusters only on IPv6-only networks, so this is getting more relevant.

YARN still doesn't support IPv6, per http://wiki.apache.org/hadoop/HadoopIPv6

Mesos is questionable, per https://issues.apache.org/jira/browse/MESOS-1027 . Did anyone get it working?

Standalone: even though the setup below worked in single-node mode, connecting to a remote master failed with the client-side and server-side errors shown below. It also did not work with the IPv6 address passed directly, as in:

./bin/spark-shell --master spark://[2401:db00:2030:709b:face:0:9:0]:7078
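[Editor's note: the workaround this thread eventually converges on is to avoid raw IPv6 literals altogether and give every node a resolvable hostname. A minimal sketch, assuming the host names and addresses quoted in this thread, and assuming the same /etc/hosts mapping exists on every node; the --conf placement is illustrative, not an officially documented IPv6 recipe:

    # /etc/hosts on every node (assumed identical cluster-wide)
    2401:db00:2030:709b:face:0:9:0    dispark001.ash3
    2401:db00:2030:709b:face:0:f:0    dispark002.ash3

    # conf/spark-env.sh on the master
    export SPARK_MASTER_IP="dispark001.ash3"

    # connect by hostname, and advertise the driver by hostname as well
    ./bin/spark-shell --master spark://dispark001.ash3:7078 \
      --conf spark.driver.host=dispark002.ash3
]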
Client side:

[root@dispark002.ash3 ~/spark-1.4.0-bin-hadoop2.6]# ./bin/spark-shell --master spark://dispark001:7078
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/24 10:34:03 INFO SecurityManager: Changing view acls to: root
15/06/24 10:34:03 INFO SecurityManager: Changing modify acls to: root
15/06/24 10:34:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/06/24 10:34:03 INFO HttpServer: Starting HTTP Server
15/06/24 10:34:03 INFO Utils: Successfully started service 'HTTP class server' on port 49189.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_25)
Type in expressions to have them evaluated.
Type :help for more information.
15/06/24 10:34:05 INFO SparkContext: Running Spark version 1.4.0
15/06/24 10:34:05 INFO SecurityManager: Changing view acls to: root
15/06/24 10:34:05 INFO SecurityManager: Changing modify acls to: root
15/06/24 10:34:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/06/24 10:34:06 INFO Slf4jLogger: Slf4jLogger started
15/06/24 10:34:06 INFO Remoting: Starting remoting
15/06/24 10:34:06 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@dispark002:59150]
15/06/24 10:34:06 INFO Utils: Successfully started service 'sparkDriver' on port 59150.
15/06/24 10:34:06 INFO SparkEnv: Registering MapOutputTracker
15/06/24 10:34:06 INFO SparkEnv: Registering BlockManagerMaster
15/06/24 10:34:06 INFO DiskBlockManager: Created local directory at /tmp/spark-b4248e03-80c2-4d54-b3af-5044c8228f68/blockmgr-bb240921-31bf-48da-b96a-7120f118d002
15/06/24 10:34:06 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
15/06/24 10:34:06 INFO HttpFileServer: HTTP File server directory is /tmp/spark-b4248e03-80c2-4d54-b3af-5044c8228f68/httpd-a7cbeb43-aefd-4da8-8df2-89a528b35c9e
15/06/24 10:34:06 INFO HttpServer: Starting HTTP Server
15/06/24 10:34:06 INFO Utils: Successfully started service 'HTTP file server' on port 57293.
15/06/24 10:34:06 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/24 10:34:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/06/24 10:34:06 INFO SparkUI: Started SparkUI at http://[2401:db00:2030:709b:face:0:f:0]:4040
15/06/24 10:34:06 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@dispark001:7078/user/Master...
15/06/24 10:34:06 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150624103406-0004
15/06/24 10:34:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 64775.
15/06/24 10:34:06 INFO NettyBlockTransferService: Server created on 64775
15/06/24 10:34:06 ERROR SparkContext: Error initializing SparkContext.
java.lang.AssertionError: assertion failed: Expected hostname
    at scala.Predef$.assert(Predef.scala:179)
    at org.apache.spark.util.Utils$.checkHost(Utils.scala:882)
    at org.apache.spark.storage.BlockManagerId.<init>(BlockManagerId.scala:48)
    at org.apache.spark.storage.BlockManagerId$.apply(BlockManagerId.scala:107)
    at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:188)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)
    at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
    at $line3.$read$$iwC$$iwC.<init>(<console>:9)
    at $line3.$read$$iwC.<init>(<console>:18)
    at $line3.$read.<init>(<console>:20)
    at $line3.$read$.<init>(<console>:24)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.<init>(<console>:7)
    at $line3.$eval$.<clinit>(<console>)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Server side:

15/06/24 10:34:06 INFO Master: Registering app Spark shell
15/06/24 10:34:06 INFO Master: Registered app Spark shell with ID app-20150624103406-0004
15/06/24 10:34:06 INFO Master: Received unregister request from application app-20150624103406-0004
15/06/24 10:34:06 INFO Master: Removing app app-20150624103406-0004
15/06/24 10:34:07 INFO Master: akka.tcp://sparkDriver@dispark002:59150 got disassociated, removing it.

From: Kevin Liu
Date: Wednesday, June 17, 2015 at 11:21 AM
To: Akhil Das
Cc: user@spark.apache.org
Subject: Re: IPv6 support

You, sir, are a genius. Thank you so much, it works now... Still wondering why I have to do this on IPv6-only machines when the default just works on dual-stack machines, but I am happy enough for now.

Kevin

From: Akhil Das
Date: Wednesday, June 17, 2015 at 12:33 AM
To: Kevin Liu
Cc: user@spark.apache.org
Subject: Re: IPv6 support

If you look at this:

15/06/16 22:25:14 INFO Executor: Starting executor ID driver on host localhost
15/06/16 22:25:14 ERROR SparkContext: Error initializing SparkContext.
java.lang.AssertionError: assertion failed: Expected hostname

your spark.driver.host is being set to localhost, which fails the host.indexOf(':') == -1 check in Utils.checkHost. Try setting spark.driver.host to dispark001.ash3 (from whichever machine you are running the code).

Thanks
Best Regards
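[Editor's note: the "Expected hostname" assertion in the traces above is thrown by org.apache.spark.util.Utils.checkHost, which, as Akhil says, rejects any value containing a ':'. A bare IPv6 literal can therefore never pass it. A minimal Scala sketch of the effect, as an illustration rather than the exact Spark source:

    // What Spark 1.4's Utils.checkHost effectively does (with assertions enabled):
    def checkHost(host: String): Unit =
      assert(host.indexOf(':') == -1, "Expected hostname")

    checkHost("dispark001.ash3")                // passes: no colon in a plain hostname
    checkHost("2401:db00:2030:709b:face:0:9:0") // java.lang.AssertionError: assertion failed: Expected hostname

This is why setting spark.driver.host (and SPARK_MASTER_IP) to a resolvable name rather than an address literal sidesteps the failure.]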
On Wed, Jun 17, 2015 at 10:57 AM, Kevin Liu wrote:

Thanks Akhil, it seems that what you said should have fixed it.

1) There was a known issue directly related to this, but it has been fixed in 1.4: https://issues.apache.org/jira/browse/SPARK-6440
2) Now, with 1.4, I see the following errors even after setting SPARK_MASTER_IP. Your description seems to be right on. Any more thoughts?
[root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]# ping6 dispark001.ash3
PING dispark001.ash3(dispark001.ash3.facebook.com) 56 data bytes
64 bytes from dispark001.ash3.facebook.com: icmp_seq=1 ttl=64 time=0.012 ms
64 bytes from dispark001.ash3.facebook.com: icmp_seq=2 ttl=64 time=0.021 ms
64 bytes from dispark001.ash3.facebook.com: icmp_seq=3 ttl=64 time=0.011 ms
^C
--- dispark001.ash3 ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2113ms
rtt min/avg/max/mdev = 0.011/0.014/0.021/0.006 ms
[root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]# export SPARK_MASTER_IP="dispark001.ash3"
[root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]# ./bin/run-example SparkPi 10
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/16 22:25:13 INFO SparkContext: Running Spark version 1.4.0
15/06/16 22:25:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/06/16 22:25:13 INFO SecurityManager: Changing view acls to: root
15/06/16 22:25:13 INFO SecurityManager: Changing modify acls to: root
15/06/16 22:25:13 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/06/16 22:25:13 INFO Slf4jLogger: Slf4jLogger started
15/06/16 22:25:13 INFO Remoting: Starting remoting
15/06/16 22:25:13 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@2401:db00:2030:709b:face:0:9:0:50916]
15/06/16 22:25:13 INFO Utils: Successfully started service 'sparkDriver' on port 50916.
15/06/16 22:25:13 INFO SparkEnv: Registering MapOutputTracker
15/06/16 22:25:13 INFO SparkEnv: Registering BlockManagerMaster
15/06/16 22:25:13 INFO DiskBlockManager: Created local directory at /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66/blockmgr-4bcbf896-03ad-4b11-8db5-a6eaa5f0222b
15/06/16 22:25:13 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
15/06/16 22:25:13 INFO HttpFileServer: HTTP File server directory is /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66/httpd-2b6a5162-0686-4cc5-accb-1bb66fddf705
15/06/16 22:25:13 INFO HttpServer: Starting HTTP Server
15/06/16 22:25:13 INFO Utils: Successfully started service 'HTTP file server' on port 35895.
15/06/16 22:25:13 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/16 22:25:13 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/06/16 22:25:13 INFO SparkUI: Started SparkUI at http://[2401:db00:2030:709b:face:0:9:0]:4040
15/06/16 22:25:14 INFO SparkContext: Added JAR file:/root/spark-1.4.0-bin-hadoop2.6/lib/spark-examples-1.4.0-hadoop2.6.0.jar at http://[2401:db00:2030:709b:face:0:9:0]:35895/jars/spark-examples-1.4.0-hadoop2.6.0.jar with timestamp 1434518714122
15/06/16 22:25:14 INFO Executor: Starting executor ID driver on host localhost
15/06/16 22:25:14 ERROR SparkContext: Error initializing SparkContext.
java.lang.AssertionError: assertion failed: Expected hostname
    at scala.Predef$.assert(Predef.scala:179)
    at org.apache.spark.util.Utils$.checkHost(Utils.scala:882)
    at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:35)
    at org.apache.spark.executor.Executor.<init>(Executor.scala:413)
    at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:53)
    at org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:103)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/16 22:25:14 INFO SparkUI: Stopped Spark web UI at http://[2401:db00:2030:709b:face:0:9:0]:4040
15/06/16 22:25:14 INFO DAGScheduler: Stopping DAGScheduler
15/06/16 22:25:14 ERROR SparkContext: Error stopping SparkContext after init error.
java.lang.NullPointerException
    at org.apache.spark.scheduler.local.LocalBackend.stop(LocalBackend.scala:107)
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:416)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1404)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1642)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:565)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" java.lang.AssertionError: assertion failed: Expected hostname
    at scala.Predef$.assert(Predef.scala:179)
    at org.apache.spark.util.Utils$.checkHost(Utils.scala:882)
    at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:35)
    at org.apache.spark.executor.Executor.<init>(Executor.scala:413)
    at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:53)
    at org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:103)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/16 22:25:14 INFO DiskBlockManager: Shutdown hook called
15/06/16 22:25:14 INFO Utils: path = /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66/blockmgr-4bcbf896-03ad-4b11-8db5-a6eaa5f0222b, already present as root for deletion.
15/06/16 22:25:14 INFO Utils: Shutdown hook called
15/06/16 22:25:14 INFO Utils: Deleting directory /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66
[root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]#

From: Akhil Das
Date: Monday, May 25, 2015 at 9:33 AM
To: Kevin Liu
Cc: user@spark.apache.org
Subject: Re: IPv6 support

Hi Kevin,

Did you try adding a host name for the IPv6 address? I have a few IPv6 boxes; Spark failed for me when I used just the IPv6 addresses, but it works fine when I use host names.

Here's an entry in my /etc/hosts:

2607:5300:0100:0200:0000:0000:0000:0a4d hacked.work

My spark-env.sh file:

export SPARK_MASTER_IP="hacked.work"

Here's the master listening on my v6 address: [inline image 1]
The Master UI with a running spark-shell: [inline image 2]

I even ran a simple sc.parallelize(1 to 100).collect().

Thanks
Best Regards

On Wed, May 20, 2015 at 11:09 PM, Kevin Liu wrote:

Hello, I have to work with IPv6-only servers, and when I installed the 1.3.1 Hadoop 2.6 build, I couldn't get the example to run due to IPv6 issues (errors below). I tried adding the -Djava.net.preferIPv6Addresses=true setting, but it still doesn't work. A search on Spark's support for IPv6 is inconclusive. Can someone help clarify the current status of IPv6 support?

Thanks
Kevin
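[Editor's note: for completeness, the conventional places to hand such a JVM flag to Spark are sketched below. This is a hedged sketch of where the flag would normally go, and, as Kevin reports above, the flag alone did not make the 1.3.1 build work on IPv6:

    # conf/spark-defaults.conf (or the equivalent --conf flags on spark-submit)
    spark.driver.extraJavaOptions    -Djava.net.preferIPv6Addresses=true
    spark.executor.extraJavaOptions  -Djava.net.preferIPv6Addresses=true

    # for the spark-shell JVM itself
    export SPARK_SUBMIT_OPTS="-Djava.net.preferIPv6Addresses=true"
]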
---- errors ----

15/05/20 10:17:30 INFO Executor: Fetching http://2401:db00:2030:709b:face:0:9:0:51453/jars/spark-examples-1.3.1-hadoop2.6.0.jar with timestamp 1432142250197
15/05/20 10:17:30 INFO Executor: Fetching http://2401:db00:2030:709b:face:0:9:0:51453/jars/spark-examples-1.3.1-hadoop2.6.0.jar with timestamp 1432142250197
15/05/20 10:17:30 ERROR Executor: Exception in task 5.0 in stage 0.0 (TID 5)
java.net.MalformedURLException: For input string: "db00:2030:709b:face:0:9:0:51453"
    at java.net.URL.<init>(URL.java:620)
    at java.net.URL.<init>(URL.java:483)
    at java.net.URL.<init>(URL.java:432)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:603)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:431)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:374)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:366)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:366)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:184)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NumberFormatException: For input string: "db00:2030:709b:face:0:9:0:51453"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Integer.parseInt(Integer.java:580)
    at java.lang.Integer.parseInt(Integer.java:615)
    at java.net.URLStreamHandler.parseURL(URLStreamHandler.java:216)
    at java.net.URL.<init>(URL.java:615)
    ... 18 more
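[Editor's note: this last failure is plain java.net.URL behavior rather than anything Spark-specific. Without the RFC 2732 bracket syntax, everything after the first ':' in the authority is parsed as a port number, which is exactly the NumberFormatException for "db00:2030:709b:face:0:9:0:51453" above. A minimal Scala sketch, reusing the address and jar name from this thread:

    import java.net.URL
    import scala.util.Try

    // Unbracketed IPv6 literal: the parser takes "2401" as the host and the rest as the port
    println(Try(new URL("http://2401:db00:2030:709b:face:0:9:0:51453/jars/spark-examples-1.3.1-hadoop2.6.0.jar")))
    // => Failure(java.net.MalformedURLException: For input string: "db00:2030:709b:face:0:9:0:51453")

    // Bracketed per RFC 2732: parses correctly
    println(Try(new URL("http://[2401:db00:2030:709b:face:0:9:0]:51453/jars/spark-examples-1.3.1-hadoop2.6.0.jar")))
    // => Success(http://[2401:db00:2030:709b:face:0:9:0]:51453/jars/spark-examples-1.3.1-hadoop2.6.0.jar)

The fetch URL Spark 1.3 built here lacked the brackets, so executors could never download the jar; advertising hostnames instead of raw v6 literals, as worked out earlier in this thread, avoids the problem entirely.]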