spark-user mailing list archives

From Alonso Isidoro Roman <>
Subject Re: installation of spark
Date Wed, 05 Jun 2019 10:38:12 GMT
When using macOS, it is recommended to install Java, Scala and Spark using Homebrew.

Run these commands on a terminal:

brew update

brew install scala

brew install sbt

brew cask install java

brew install spark

There is no need to install HDFS; you can use your local file system
without a problem.
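As a quick sanity check (a sketch, assuming `brew install spark` put spark-shell on your PATH), you can count the lines of a plain local file with no HDFS involved:

```shell
# Create a small local text file.
printf 'one\ntwo\nthree\n' > /tmp/sample.txt

# Count its lines with Spark, reading straight from the local file system.
# 'local[*]' runs Spark on all local cores; no cluster or HDFS needed.
echo 'println(spark.read.textFile("file:///tmp/sample.txt").count())' \
  | spark-shell --master 'local[*]'
```

The `file://` prefix makes it explicit that the path is on the local file system rather than HDFS.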

*How to set JAVA_HOME on Mac OS X temporarily*

   1. Open *Terminal*.
   2. Confirm you have a JDK by typing “which java”.
   3. Check you have the needed version of Java by typing “java -version”.
   4. *Set JAVA_HOME* using this command in *Terminal*: export JAVA_HOME=$(/usr/libexec/java_home)
   5. Type echo $JAVA_HOME in *Terminal* to confirm the path.
   6. You should now be able to run your application.
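The steps above boil down to four commands (a sketch for macOS, where /usr/libexec/java_home prints the path of the newest installed JDK):

```shell
which java                                  # step 2: confirm a java binary is on PATH
java -version                               # step 3: check the installed version
export JAVA_HOME=$(/usr/libexec/java_home)  # step 4: set JAVA_HOME for this session
echo "$JAVA_HOME"                           # step 5: confirm the path
```

Note that the export only lasts for the current shell session; close the terminal and it is gone.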

*How to set JAVA_HOME on Mac OS X permanently*

$ vim ~/.bash_profile

Add this line to the file and save it:

export JAVA_HOME=$(/usr/libexec/java_home)

Then reload the profile and confirm the path:

$ source ~/.bash_profile

$ echo $JAVA_HOME
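If you prefer not to open vim, the same result can be had from the command line (again a sketch, assuming macOS and /usr/libexec/java_home):

```shell
# Append the export line to ~/.bash_profile so every new login shell
# picks it up. Single quotes keep $(...) from expanding now, so
# java_home is evaluated each time the profile is loaded.
echo 'export JAVA_HOME=$(/usr/libexec/java_home)' >> ~/.bash_profile

# Reload the profile in the current shell and confirm the path.
source ~/.bash_profile
echo "$JAVA_HOME"
```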

Have fun!


On Wed, Jun 5, 2019 at 6:10, Jack Kolokasis (<>) wrote:

> Hello,
>     at first you will need to make sure that Java is installed, or install
> it otherwise. Then install Scala and a build tool (sbt or Maven). In my
> point of view, IntelliJ IDEA is a good option for creating your Spark
> applications. At the end you have to install a distributed file system,
> e.g. HDFS.
>     I think there is no all-in-one configuration, but there are
> examples of how to configure your Spark cluster.
> Best,
> --Iacovos
> On 5/6/19 5:50 a.m., ya wrote:
> Dear list,
> I am very new to Spark, and I am having trouble installing it on my Mac. I
> have the following questions; please give me some guidance. Thank you very much.
> 1. How many and which software packages should I install before installing
> Spark? I have been searching online, and people discuss their experiences on
> this topic with different opinions: some say there is no need to install
> Hadoop before installing Spark, while others say Hadoop has to be installed
> before Spark. Some say Scala has to be installed, whereas others say Scala
> is included in Spark and is installed automatically once Spark is installed.
> So I am confused about what to install for a start.
> 2. Is there a simple way to configure this software? For instance, an
> all-in-one configuration file? It takes forever for me to configure things
> before I can really use it for data analysis.
> I hope my questions make sense. Thank you very much.
> Best regards,
> YA

Alonso Isidoro Roman
