spark-user mailing list archives

From "Shixiong(Ryan) Zhu" <shixi...@databricks.com>
Subject Re: Failed to connect to master ...
Date Wed, 08 Mar 2017 07:38:35 GMT
The Spark master may bind to a different address than the one you are using. Take a
look at the master's web UI to find the correct spark:// URL: http://VM_IPAddress:8080/
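A common slip here is dropping the spark:// scheme or the port, or using a hostname the master did not actually bind to. As a rough illustration (plain Java, a hypothetical helper, not part of the Spark API), the value handed to .master(...) should match the exact spark://host:port string printed at the top of the master's web UI:

```java
// Hypothetical sanity check for a standalone master URL (not a Spark API).
// The string passed to .master(...) must be the exact spark://host:port
// value shown at the top of the master web UI, e.g. http://VM_IPAddress:8080/.
public class MasterUrlCheck {
    public static boolean isStandaloneMasterUrl(String url) {
        // Requires the spark:// scheme, a host (word chars, dots, dashes),
        // a colon, and a numeric port.
        return url != null && url.matches("spark://[\\w.-]+:\\d+");
    }

    public static void main(String[] args) {
        System.out.println(isStandaloneMasterUrl("spark://10.0.0.5:7077")); // valid
        System.out.println(isStandaloneMasterUrl("10.0.0.5:7077"));         // missing scheme
        System.out.println(isStandaloneMasterUrl("spark://10.0.0.5"));      // missing port
    }
}
```

If the check-style URL is right but the connection still fails, the master is likely bound to a different interface than the one in the URL, which is exactly what the web UI reveals.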

On Tue, Mar 7, 2017 at 10:13 PM, Mina Aslani <aslanimina@gmail.com> wrote:

> Master and worker processes are running!
>
> On Wed, Mar 8, 2017 at 12:38 AM, ayan guha <guha.ayan@gmail.com> wrote:
>
>> You need to start Master and worker processes before connecting to them.
>>
>> On Wed, Mar 8, 2017 at 3:33 PM, Mina Aslani <aslanimina@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I am writing a Spark Transformer in IntelliJ in Java and trying to
>>> connect to the Spark master in a VM using setMaster. I get "Failed to
>>> connect to master ...".
>>>
>>> I get:
>>>
>>> 17/03/07 16:20:55 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master VM_IPAddress:7077
>>> org.apache.spark.SparkException: Exception thrown in awaitResult
>>> at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
>>> at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
>>> at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
>>> at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
>>>
>>> SparkSession spark = SparkSession
>>>       .builder()
>>>       .appName("Java Spark SQL")
>>>       //.master("local[1]")
>>>       .master("spark://VM_IPAddress:7077")
>>>       .getOrCreate();
>>>
>>> Dataset<String> lines = spark
>>>       .readStream()
>>>       .format("kafka")
>>>       .option("kafka.bootstrap.servers", brokers)
>>>       .option("subscribe", topic)
>>>       .load()
>>>       .selectExpr("CAST(value AS STRING)")
>>>       .as(Encoders.STRING());
>>>
>>>
>>>
>>> I get the same error when I try master("spark://spark-master:7077").
>>>
>>> However, with .master("local[1]") no exception is thrown.
>>>
>>> My Kafka is in the same VM, and being new to Spark I am still trying to understand:
>>>
>>> - Why do I get the above exception and how can I fix it (connect to Spark
>>> in the VM and read from Kafka in the VM)?
>>>
>>> - Why is no exception thrown when using "local[1]", and how do I set it up
>>> to read from Kafka in the VM?
>>>
>>> - How do I stream from Kafka (the data in the topic is in JSON format)?
>>> Your input is appreciated!
>>>
>>> Best regards,
>>> Mina
>>>
>>>
>>>
>>>
>>
>>
>> --
>> Best Regards,
>> Ayan Guha
>>
>
>
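Putting the two replies together, the usual sequence on a standalone setup is: start the master on the VM, start a worker against it, then copy the exact spark://host:port URL from the web UI into .master(...). A rough ops sketch, assuming a standard Spark 2.x install with $SPARK_HOME set on the VM (in Spark 2.x the worker launcher is named start-slave.sh; VM_IPAddress is a placeholder):

```shell
# On the VM: start the standalone master. It binds to the address in
# SPARK_MASTER_HOST if that is set in conf/spark-env.sh.
$SPARK_HOME/sbin/start-master.sh

# Copy the spark://host:port URL shown at http://VM_IPAddress:8080/
# and start a worker against it.
$SPARK_HOME/sbin/start-slave.sh spark://VM_IPAddress:7077
```

If the driver runs outside the VM (e.g. from IntelliJ on the host), port 7077 must also be reachable from the host machine, not just from inside the VM.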
