spark-user mailing list archives

From Kelum Perera <kelum0...@gmail.com>
Subject Re: spark-shell not starting ( in a Kali linux 2 OS)
Date Sun, 13 Nov 2016 12:18:00 GMT
Thanks Marco, Sean, & Oshadha,

Changed the permissions on the files in the Spark directory using "chmod", and
now it works.
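
Roughly what I ran, in case it helps anyone else (a sketch; adjust the path if
Spark is unpacked somewhere other than /root/spark):

# make the Spark directory readable and its launch scripts executable
chmod -R a+rX /root/spark
chmod +x /root/spark/bin/*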

Thank you very much for the help.

Kelum


On Sun, Nov 13, 2016 at 5:31 PM, Marco Mistroni <mmistroni@gmail.com> wrote:

> Hi
>   not a Linux expert... but how did you install Spark? As the root user?
> The error above seems to indicate you don't have permissions to access that
> directory. If you have full control of the host, you can try to do a chmod
> 777 on the directory where you installed Spark and its subdirectories.
>
> Anyway, my 2 cents here:
> There are 2 options for installing Spark.
>
> 1 - Get the pre-built zipped version and unpack it anywhere you want (even in
> your home folder). Set the SPARK_HOME variable to where you installed it, then
> go to <SPARK_HOME>/bin and launch spark-shell (I am *guessing* this might
> rely on having Scala installed on your host).
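>
> A minimal sketch of that route (the download URL/version here is an assumption - use whatever release you actually need):
> # download and unpack a pre-built package (example: Spark 1.6.3 for Hadoop 2.6)
> wget https://archive.apache.org/dist/spark/spark-1.6.3/spark-1.6.3-bin-hadoop2.6.tgz
> tar -xzf spark-1.6.3-bin-hadoop2.6.tgz -C ~/
> # point SPARK_HOME at the unpacked directory and launch the shell
> export SPARK_HOME=~/spark-1.6.3-bin-hadoop2.6
> $SPARK_HOME/bin/spark-shell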
>
> 2 - Build it from source. This might take a little longer, but if you do
> it this way Spark will download Scala for you. For that, try the following
> commands on your Linux box (I have built Spark on Ubuntu... so there might be
> some tweaks you need to make to get it working on your Linux version):
>
> # Install Git (Ubuntu)
> apt-get install -y git
> # Getting Spark
> git clone git://github.com/apache/spark.git
> # Build Spark
> ./build/mvn -Pyarn  -DskipTests clean package
> # Export variables to spark home and spark's bin directory
> export SPARK_HOME="/spark"    # this is the directory where you installed Spark
> export PATH="$SPARK_HOME/bin:${PATH}"
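>
> A quick sanity check once the build finishes (a sketch; assumes the exports above are in place):
> spark-submit --version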
>
>
> Please note that on my small laptop, zinc (used by Spark to speed up the
> compilation) somehow gets jammed, so I have to split the ./build/mvn command
> into two:
> 1. ./build/mvn -Pyarn -DskipTests clean compile
> 2. ./build/mvn -Pyarn -DskipTests package
>
> Hope this helps. Good luck.
>
> kr
>  Marco
>
> On Sun, Nov 13, 2016 at 10:44 AM, Kelum Perera <kelum0823@gmail.com>
> wrote:
>
>> Thanks Oshadha & Sean,
>>
>> Now, when I enter "spark-shell", this error pops up:
>>
>> bash: /root/spark/bin/pyspark: Permission denied
>>
>> The same error comes for "pyspark" too.
>>
>> Any help on this?
>>
>> Thanks for your help.
>> Kelum
>>
>>
>>
>> On Sun, Nov 13, 2016 at 2:14 PM, Oshadha Gunawardena <
>> oshadha.rocky@gmail.com> wrote:
>>
>>> On Nov 13, 2016 10:20 AM, "Kelum Perera" <kelum0823@gmail.com> wrote:
>>> >
>>> > Dear Users,
>>> >
>>> > I'm a newbie, trying to get spark-shell running on Kali Linux, but I'm
>>> getting the error "spark-shell: command not found"
>>> >
>>> > I'm running on Kali Linux 2 (64bit)
>>> >
>>> > I followed several tutorials, including:
>>> > https://www.tutorialspoint.com/apache_spark/apache_spark_installation.htm
>>> > https://www.youtube.com/watch?v=wo8Q_j8bnQU
>>> >
>>> > Scala (2.11.8), Python (2.7), Java (1.8.0_111), and Spark (1.6.3) are
>>> available in /usr/local/
>>> >
>>> > I have amended ".bashrc" with the paths to the above folders & sourced
>>> it.
>>> >
>>> > export SCALA_HOME=/root/scala
>>> > export PATH=$SCALA_HOME/bin:$PATH
>>> >
>>> > export SCALA_HOME=/root/spark
>>> > export PATH=$SPARK_HOME/bin:$PATH
>>> >
>>> >
>>> > When I run "echo $SCALA_HOME" it shows the path correctly,
>>> > but for "echo $SPARK_HOME" it just prints an empty line; no error
>>> pops up and the cursor moves to the next line.
>>> >
>>> > I tried keeping the files in the "/usr/local/" folder too, with the same result.
>>> > I also tried with "pyspark", again with the same result.
>>> >
>>> > It would be great if someone could help me with this.
>>> >
>>> >
>>> > Thanks for your time & effort.
>>> > Regards,
>>> > kelum
>>> >
>>> >
>>>
>>> In your Spark path configuration it should be 'SPARK_HOME=/root/spark' (you
>>> have SCALA_HOME exported twice).
>>>
>>> Then do a 'source ~/.bashrc'
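>>>
>>> A minimal corrected snippet for ~/.bashrc (assuming Scala is in /root/scala
>>> and Spark in /root/spark, as in your message):
>>>
>>> export SCALA_HOME=/root/scala
>>> export PATH=$SCALA_HOME/bin:$PATH
>>> export SPARK_HOME=/root/spark
>>> export PATH=$SPARK_HOME/bin:$PATH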
>>>
>>
>>
>
