spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Multiple exceptions in Spark Streaming
Date Wed, 01 Oct 2014 07:36:40 GMT
In that case, fire up a spark-shell and try the following:

scala> import org.apache.spark.streaming.{Seconds, StreamingContext}
scala> import org.apache.spark.streaming.StreamingContext._
scala> val ssc = new StreamingContext("spark://YOUR-SPARK-MASTER-URI",
     |   "Streaming Job", Seconds(5), "/home/akhld/mobi/localclusterxxx/spark-1")
scala> val lines = ssc.socketTextStream("PUT-YOUR-MASTERs-PRIVATE-IP", 12345)
scala> lines.print()
scala> ssc.start()
scala> ssc.awaitTermination()


Then open another terminal window on the master node, run *nc -lp 12345*,
and type some data for the program to consume. If it is a configuration
issue, you will hit the same issue here as well. :)
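A note for anyone trying this from the archive: the exact nc flags vary between netcat variants, and the data format is simply newline-delimited text. A quick sketch (port 12345 here is the one passed to socketTextStream above; "nc -l 12345" is the BSD spelling):

```shell
# Terminal 1 (master node): listen on the port given to socketTextStream.
#   nc -lp 12345      # GNU/traditional netcat; BSD netcat uses: nc -l 12345
# Terminal 2: anything typed or piped into that listener is split on
# newlines, and each line becomes one record in the DStream, echoed by
# lines.print() every 5-second batch. For example, this produces two records:
printf 'event one\nevent two\n'
```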



Thanks
Best Regards

On Wed, Oct 1, 2014 at 12:45 PM, Shaikh Riyaz <shaikh.r.a@gmail.com> wrote:

> Hi Akhil,
>
> Thanks for your reply.
>
> We are using CDH 5.1.3, and the Spark configuration is taken care of by
> Cloudera's configuration management. Please let me know if you would like
> to review the configuration.
>
> Regards,
> Riyaz
>
> On Wed, Oct 1, 2014 at 10:10 AM, Akhil Das <akhil@sigmoidanalytics.com>
> wrote:
>
>> Looks like a configuration issue; can you paste the spark-env.sh from the
>> worker?
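For archive readers: on Spark 1.x standalone clusters, the bind-address settings in spark-env.sh are a common cause of the Akka "Association failed" errors seen later in this thread. A hypothetical sketch of a worker-side file (placeholders only, not Riyaz's actual configuration):

```shell
# conf/spark-env.sh -- hypothetical sketch, not the actual file from this thread
export SPARK_MASTER_IP=master-host        # must resolve the same on every node
export SPARK_LOCAL_IP=this-workers-ip     # address Akka binds to; a wrong or
                                          # unresolvable value here can produce
                                          # "Association failed" errors
export SPARK_WORKER_MEMORY=5g             # 5 GB per worker, as described below
export SPARK_EXECUTOR_MEMORY=4g           # 4 GB per executor, as described below
```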
>>
>> Thanks
>> Best Regards
>>
>> On Wed, Oct 1, 2014 at 8:27 AM, Tathagata Das <
>> tathagata.das1565@gmail.com> wrote:
>>
>>> It would help to turn on debug-level logging in log4j and look at the
>>> logs. The error logs alone aren't telling me much. :(
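For reference, a minimal way to do that on Spark 1.x (assuming the stock conf/log4j.properties.template shipped with the install) is to copy the template and raise the root level:

```properties
# conf/log4j.properties (start from conf/log4j.properties.template)
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```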
>>>
>>> TD
>>>
>>> On Tue, Sep 30, 2014 at 4:30 PM, Shaikh Riyaz <shaikh.r.a@gmail.com>
>>> wrote:
>>>
>>>> Hi TD,
>>>>
>>>> Thanks for your reply.
>>>>
>>>> The attachment in the previous email was from the master.
>>>>
>>>> Below is the log message from one of the worker.
>>>>
>>>> -----------------------------------------------------------------------------------------------
>>>> 2014-10-01 01:49:22,348 ERROR akka.remote.EndpointWriter:
>>>> AssociationError [akka.tcp://sparkWorker@*<worker4>*:7078] ->
>>>> [akka.tcp://sparkExecutor@*<worker4>*:34010]: Error [Association
>>>> failed with [akka.tcp://sparkExecutor@*<worker4>*:34010]] [
>>>> akka.remote.EndpointAssociationException: Association failed with
>>>> [akka.tcp://sparkExecutor@*<worker4>*:34010]
>>>> Caused by:
>>>> akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2:
>>>> Connection refused: *<worker4>*:34010
>>>> ]
>>>> 2014-10-01 02:14:54,868 ERROR akka.remote.EndpointWriter:
>>>> AssociationError [akka.tcp://sparkWorker@*<worker4>*:7078] ->
>>>> [akka.tcp://sparkExecutor@*<worker4>*:33184]: Error [Association
>>>> failed with [akka.tcp://sparkExecutor@*<worker4>*:33184]] [
>>>> akka.remote.EndpointAssociationException: Association failed with
>>>> [akka.tcp://sparkExecutor@*<worker4>*:33184]
>>>> Caused by:
>>>> akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2:
>>>> Connection refused: *<worker4>*:33184
>>>> ]
>>>> 2014-10-01 02:14:54,878 ERROR akka.remote.EndpointWriter:
>>>> AssociationError [akka.tcp://sparkWorker@*<worker4>*:7078] ->
>>>> [akka.tcp://sparkExecutor@*<worker4>*:33184]: Error [Association
>>>> failed with [akka.tcp://sparkExecutor@*<worker4>*:33184]] [
>>>> akka.remote.EndpointAssociationException: Association failed with
>>>> [akka.tcp://sparkExecutor@*<worker4>*:33184]
>>>> Caused by:
>>>> akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2:
>>>> Connection refused: *<worker4>*:33184
>>>> ]
>>>> 2014-10-01 02:14:54,887 ERROR akka.remote.EndpointWriter:
>>>> AssociationError [akka.tcp://sparkWorker@*<worker4>*:7078] ->
>>>> [akka.tcp://sparkExecutor@*<worker4>*:33184]: Error [Association
>>>> failed with [akka.tcp://sparkExecutor@*<worker4>*:33184]] [
>>>> akka.remote.EndpointAssociationException: Association failed with
>>>> [akka.tcp://sparkExecutor@*<worker4>*:33184]
>>>> Caused by:
>>>> akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2:
>>>> Connection refused: *<worker4>*:33184
>>>> ]
>>>>
>>>> -----------------------------------------------------------------------------
>>>>
>>>> Your support will be highly appreciated.
>>>>
>>>> Regards,
>>>> Riyaz
>>>>
>>>> On Wed, Oct 1, 2014 at 1:16 AM, Tathagata Das <
>>>> tathagata.das1565@gmail.com> wrote:
>>>>
>>>>> Are these the logs of the worker where the failure occurred? I think
>>>>> issues similar to these have since been solved in later versions of Spark.
>>>>>
>>>>> TD
>>>>>
>>>>> On Tue, Sep 30, 2014 at 11:33 AM, Shaikh Riyaz <shaikh.r.a@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Dear All,
>>>>>>
>>>>>> We are using Spark Streaming version 1.0.0 in our Cloudera Hadoop
>>>>>> cluster (CDH 5.1.3).
>>>>>>
>>>>>> Spark streaming is reading messages from Kafka using
>>>>>> https://github.com/dibbhatt/kafka-spark-consumer.
>>>>>>
>>>>>> We have allocated 4 GB of memory to each executor and 5 GB to each
>>>>>> worker. We have a total of 6 workers spread across 6 machines.
>>>>>>
>>>>>> Please find the attach log file for detailed error messages.
>>>>>>
>>>>>>
>>>>>> Thanks in advance.
>>>>>>
>>>>>> --
>>>>>> Regards,
>>>>>>
>>>>>> Riyaz
>>>>>>
>>>>>>
>>>>>>
>>>>>> ---------------------------------------------------------------------
>>>>>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>>>>>> For additional commands, e-mail: user-help@spark.apache.org
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Regards,
>>>>
>>>> Riyaz
>>>>
>>>>
>>>
>>
>
>
> --
> Regards,
>
> Riyaz
>
>
