spark-user mailing list archives

From Charles Allen <charles.al...@metamarkets.com>
Subject Re: Spark 1.5 on Mesos
Date Wed, 02 Mar 2016 22:28:54 GMT
Re: Spark on Mesos.... Warning regarding disk space:
https://issues.apache.org/jira/browse/SPARK-12330

That's a Spark flaw I encountered on a very regular basis on Mesos. That
and a few other annoyances are fixed in
https://github.com/metamx/spark/tree/v1.5.2-mmx

Here's another mild annoyance I've encountered:
https://issues.apache.org/jira/browse/SPARK-11714
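
If anyone wants to try that branch, a rough sketch of building it (this
assumes the stock Spark 1.5 build works unchanged on the branch; the Hadoop
profile below is just an example, not a branch-specific requirement):

    git clone https://github.com/metamx/spark.git
    cd spark
    git checkout v1.5.2-mmx
    # standard Spark 1.5 distribution build
    ./make-distribution.sh --tgz -Phadoop-2.6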

On Wed, Mar 2, 2016 at 1:31 PM Ashish Soni <asoni.learn@gmail.com> wrote:

> I have had no luck, and I would like to ask the Spark committers: will
> this ever be designed to run on Mesos?
>
> A Spark app as a Docker container is not working at all on Mesos. If
> anyone would like the code, I can send it over to have a look.
>
> Ashish
>
> On Wed, Mar 2, 2016 at 12:23 PM, Sathish Kumaran Vairavelu <vsathishkumaran@gmail.com> wrote:
>
>> Try passing the jar using the --jars option.
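>>
>> For example, something along these lines (the jar URL and class are taken
>> from your command below; the jar is passed both via --jars and as the
>> application jar, and the name has no spaces in case the quoting is lost
>> inside the container):
>>
>>   ./bin/spark-submit --master mesos://10.0.2.15:7077 --deploy-mode cluster \
>>     --name PiExample --class org.apache.spark.examples.SparkPi \
>>     --driver-memory 512m --executor-memory 512m --executor-cores 1 \
>>     --jars http://10.0.2.15/spark-examples-1.6.0-hadoop2.6.0.jar \
>>     http://10.0.2.15/spark-examples-1.6.0-hadoop2.6.0.jar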
>>
>> On Wed, Mar 2, 2016 at 10:17 AM Ashish Soni <asoni.learn@gmail.com>
>> wrote:
>>
>>> I made some progress, but now I am stuck at this point. Please help, as
>>> it looks like I am close to getting it working.
>>>
>>> I have everything running in Docker containers, including the Mesos
>>> slave and master.
>>>
>>> When I try to submit the Pi example, I get the error below:
>>> Error: Cannot load main class from JAR file:/opt/spark/Example
>>>
>>> Below is the command I use to submit it as a Docker container:
>>>
>>> docker run -it --rm -e SPARK_MASTER="mesos://10.0.2.15:7077" \
>>>   -e SPARK_IMAGE="spark_driver:latest" spark_driver:latest \
>>>   ./bin/spark-submit --deploy-mode cluster --name "PI Example" \
>>>   --class org.apache.spark.examples.SparkPi --driver-memory 512m \
>>>   --executor-memory 512m --executor-cores 1 \
>>>   http://10.0.2.15/spark-examples-1.6.0-hadoop2.6.0.jar
>>>
>>>
>>> On Tue, Mar 1, 2016 at 2:59 PM, Timothy Chen <tim@mesosphere.io> wrote:
>>>
>>>> Can you go through the Mesos UI and look at the driver/executor log
>>>> in the stderr file and see what the problem is?
>>>>
>>>> Tim
>>>>
>>>> On Mar 1, 2016, at 8:05 AM, Ashish Soni <asoni.learn@gmail.com> wrote:
>>>>
>>>> Not sure what the issue is, but I am getting the error below when I
>>>> try to run the Spark Pi example:
>>>>
>>>> Blacklisting Mesos slave value: "5345asdasdasdkas234234asdasdasdasd"
>>>> due to too many failures; is Spark installed on it?
>>>> WARN TaskSchedulerImpl: Initial job has not accepted any resources;
>>>> check your cluster UI to ensure that workers are registered and have
>>>> sufficient resources
>>>>
>>>>
>>>> On Mon, Feb 29, 2016 at 1:39 PM, Sathish Kumaran Vairavelu <vsathishkumaran@gmail.com> wrote:
>>>>
>>>>> Maybe the Mesos executor couldn't find the Spark image, or the
>>>>> constraints are not satisfied. Check your Mesos UI to see whether the
>>>>> Spark application appears in the Frameworks tab.
>>>>>
>>>>> On Mon, Feb 29, 2016 at 12:23 PM Ashish Soni <asoni.learn@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> What is the best practice? I have everything running as Docker
>>>>>> containers on a single host (Mesos and Marathon also as Docker
>>>>>> containers), and everything comes up fine, but when I try to launch
>>>>>> the Spark shell I get the error below:
>>>>>>
>>>>>>
>>>>>> SQL context available as sqlContext.
>>>>>>
>>>>>> scala> val data = sc.parallelize(1 to 100)
>>>>>> data: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at
>>>>>> parallelize at <console>:27
>>>>>>
>>>>>> scala> data.count
>>>>>> [Stage 0:>                                                  (0 + 0) / 2]
>>>>>> 16/02/29 18:21:12 WARN TaskSchedulerImpl: Initial job has not accepted
>>>>>> any resources; check your cluster UI to ensure that workers are
>>>>>> registered and have sufficient resources
>>>>>> 16/02/29 18:21:27 WARN TaskSchedulerImpl: Initial job has not accepted
>>>>>> any resources; check your cluster UI to ensure that workers are
>>>>>> registered and have sufficient resources
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Mon, Feb 29, 2016 at 12:04 PM, Tim Chen <tim@mesosphere.io> wrote:
>>>>>>
>>>>>>> No, you don't have to run Mesos in Docker containers to run Spark
>>>>>>> in Docker containers.
>>>>>>>
>>>>>>> Once you have the Mesos cluster running, you can then specify the
>>>>>>> Spark configurations in your Spark job (e.g.,
>>>>>>> spark.mesos.executor.docker.image=mesosphere/spark:1.6) and Mesos
>>>>>>> will automatically launch Docker containers for you.
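>>>>>>>
>>>>>>> For example, a minimal sketch of a submission with that setting (the
>>>>>>> master address, image tag, and jar path below are placeholders):
>>>>>>>
>>>>>>>   ./bin/spark-submit --master mesos://mesos-master:5050 \
>>>>>>>     --conf spark.mesos.executor.docker.image=mesosphere/spark:1.6 \
>>>>>>>     --class org.apache.spark.examples.SparkPi \
>>>>>>>     /path/to/spark-examples.jar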
>>>>>>>
>>>>>>> Tim
>>>>>>>
>>>>>>> On Mon, Feb 29, 2016 at 7:36 AM, Ashish Soni <asoni.learn@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Yes, I read that, but there are not many details there.
>>>>>>>>
>>>>>>>> Is it true that we need to have Spark installed on each Mesos
>>>>>>>> Docker container (master and slave)?
>>>>>>>>
>>>>>>>> Ashish
>>>>>>>>
>>>>>>>> On Fri, Feb 26, 2016 at 2:14 PM, Tim Chen <tim@mesosphere.io>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> https://spark.apache.org/docs/latest/running-on-mesos.html should
>>>>>>>>> be the best source. What problems were you running into?
>>>>>>>>>
>>>>>>>>> Tim
>>>>>>>>>
>>>>>>>>> On Fri, Feb 26, 2016 at 11:06 AM, Yin Yang <yy201602@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>> Have you read this?
>>>>>>>>>> https://spark.apache.org/docs/latest/running-on-mesos.html
>>>>>>>>>>
>>>>>>>>>> On Fri, Feb 26, 2016 at 11:03 AM, Ashish Soni <asoni.learn@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi all,
>>>>>>>>>>>
>>>>>>>>>>> Is there any proper documentation on how to run Spark on Mesos?
>>>>>>>>>>> I have been trying for the last few days and am not able to make
>>>>>>>>>>> it work.
>>>>>>>>>>>
>>>>>>>>>>> Please help
>>>>>>>>>>>
>>>>>>>>>>> Ashish
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>
>>>
>
