spark-user mailing list archives

From Andrew Or <>
Subject Re: pyspark on yarn hdp hortonworks
Date Wed, 03 Sep 2014 21:19:44 GMT
Hi Oleg,

There isn't much you need to do to set up a Yarn cluster to run PySpark. You
need to make sure all machines have python installed, and... that's about
it. Your assembly jar will be shipped to all containers along with all the
pyspark and py4j files needed. One caveat, however, is that the jar needs
to be built with Maven, and not on a Red Hat-based OS.
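
For reference, a submission might look like the sketch below. This is a hypothetical example, not from the original thread: the executor count, script path, and `HADOOP_CONF_DIR` location are assumptions to adapt to your cluster.

```shell
# Point Spark at the YARN/Hadoop client configs (path is an assumption)
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Submit the bundled Pi example in yarn-client mode; the assembly jar
# and the pyspark/py4j files are shipped to the containers automatically
./bin/spark-submit \
  --master yarn-client \
  --num-executors 4 \
  examples/src/main/python/pi.py 10
```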

In addition, it should be built with Java 6, because of a known issue with
including python files in jars built with Java 7. Lastly, if you have
trouble getting it to work, you can follow the steps I have listed in a
different thread to figure out what's wrong.
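
The Java 6 build caveat above can be sketched as follows; the JDK path and Hadoop version are assumptions for illustration, so substitute the values matching your machine and cluster:

```shell
# Build the assembly with a Java 6 JDK on JAVA_HOME (path varies by distro)
export JAVA_HOME=/usr/lib/jvm/java-1.6.0

# -Pyarn enables the YARN profile; pick the hadoop.version
# that matches your HDP cluster
mvn -Pyarn -Dhadoop.version=2.4.0 -DskipTests clean package
```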

Let me know if you can get it working,

2014-09-03 5:03 GMT-07:00 Oleg Ruchovets <>:

> Hi all.
>    I have been trying to run pyspark on yarn for a couple of days now.
> I posted the exceptions in previous posts. It looks like I didn't do the
> configuration correctly.
>   I googled quite a lot and I can't find the steps that should be done to
> configure PySpark to run on Yarn.
> Can you please share the steps (critical points) that should be configured
> to use PySpark on Yarn (hortonworks distribution):
>   Environment variables.
>   Classpath
>   copying jars to all machines
>   other configuration.
> Thanks
> Oleg.
