spark-issues mailing list archives

From "Yuming Wang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-19394) "assertion failed: Expected hostname" on macOS when self-assigned IP contains a percent sign
Date Wed, 01 Aug 2018 00:15:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-19394?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16564532#comment-16564532 ]

Yuming Wang commented on SPARK-19394:
-------------------------------------

Try adding {{::1             localhost}} to {{/etc/hosts}}.
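A minimal sketch of that fix, shown against a copy of the file so nothing needs root (assumptions: a POSIX shell with `grep` and `printf`; `hosts.tmp` is just an illustrative name — the real change goes into /etc/hosts itself, with sudo):

```shell
# Work on a copy first; applying to the real /etc/hosts requires sudo.
cp /etc/hosts hosts.tmp

# Append the IPv6 loopback mapping only if no ::1 entry exists yet.
grep -q '^::1' hosts.tmp || printf '::1\tlocalhost\n' >> hosts.tmp

# Verify the entry is present.
grep '^::1' hosts.tmp
```

The same conditional append, run with sudo against /etc/hosts itself, implements the suggestion above.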

> "assertion failed: Expected hostname" on macOS when self-assigned IP contains a percent sign
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19394
>                 URL: https://issues.apache.org/jira/browse/SPARK-19394
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Jacek Laskowski
>            Priority: Minor
>
> See [this question on StackOverflow|http://stackoverflow.com/q/41914586/1305344].
> {quote}
> When I am not connected to the internet, spark-shell fails to start in local mode. I am running Apache Spark 2.1.0, downloaded from the internet, on my Mac. When I run ./bin/spark-shell, it gives me the error below.
> I have read the Spark code: it uses Java's InetAddress.getLocalHost() to find the local host's IP address. When I am connected to the internet, I get back an IPv4 address with my local hostname:
> scala> InetAddress.getLocalHost
> res9: java.net.InetAddress = AliKheyrollahis-MacBook-Pro.local/192.168.1.26
> but the key point is that, when disconnected, I get an IPv6 address containing a percent sign (it is scoped):
> scala> InetAddress.getLocalHost
> res10: java.net.InetAddress = AliKheyrollahis-MacBook-Pro.local/fe80:0:0:0:2b9a:4521:a301:e9a5%10
> And this IP is the same as the one in the error message. I believe the problem is that Spark cannot handle the %10 in the result.
> ...
> 17/01/28 22:03:28 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://fe80:0:0:0:2b9a:4521:a301:e9a5%10:4040
> 17/01/28 22:03:28 INFO Executor: Starting executor ID driver on host localhost
> 17/01/28 22:03:28 INFO Executor: Using REPL class URI: spark://fe80:0:0:0:2b9a:4521:a301:e9a5%10:56107/classes
> 17/01/28 22:03:28 ERROR SparkContext: Error initializing SparkContext.
> java.lang.AssertionError: assertion failed: Expected hostname
>     at scala.Predef$.assert(Predef.scala:170)
>     at org.apache.spark.util.Utils$.checkHost(Utils.scala:931)
>     at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:31)
>     at org.apache.spark.executor.Executor.<init>(Executor.scala:121)
>     at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:59)
>     at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:126)
>     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
>     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
>     at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
>     at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
>     at scala.Option.getOrElse(Option.scala:121)
>     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
>     at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
> {quote}
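The stack trace points at org.apache.spark.util.Utils.checkHost as the source of the assertion. A minimal sketch of why a zone-scoped IPv6 literal trips a hostname check (the checkHost body here is a simplified stand-in, not Spark's actual code; the host strings are taken from the report above):

```java
public class CheckHostDemo {
    // Simplified stand-in (assumption) for Spark's Utils.checkHost: accept
    // only values that look like a bare hostname, i.e. contain no ':'.
    static void checkHost(String host) {
        if (host == null || host.contains(":")) {
            throw new AssertionError("assertion failed: Expected hostname");
        }
    }

    public static void main(String[] args) {
        // The hostname from the report passes the check.
        checkHost("AliKheyrollahis-MacBook-Pro.local");

        // The zone-scoped IPv6 literal from the report is full of ':'
        // characters (and ends in "%10"), so the check rejects it.
        boolean rejected = false;
        try {
            checkHost("fe80:0:0:0:2b9a:4521:a301:e9a5%10");
        } catch (AssertionError e) {
            rejected = true;
        }
        System.out.println(rejected); // prints "true"
    }
}
```

This is why mapping {{::1}} to {{localhost}} helps: name resolution can then yield a plain hostname instead of the scoped link-local literal.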



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

