spark-user mailing list archives

From Mich Talebzadeh <mich.talebza...@gmail.com>
Subject Re: What do I lose if I run Spark without using HDFS or Zookeeper?
Date Thu, 25 Aug 2016 16:18:03 GMT
You can use Spark on Oracle as a query tool.

It all depends on the mode of operation.

If you are running Spark in yarn-client or yarn-cluster mode, then you will
need YARN. It comes as part of Hadoop core (HDFS, MapReduce and YARN).
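As a sketch of what that mode looks like in practice (assuming Spark and Hadoop are installed, and that the class name, JAR name and config path below are placeholders, not anything from this thread):

```shell
# Submitting to YARN requires a running ResourceManager, and Spark will
# stage the application JAR through HDFS, so Hadoop core must be present.
# Assumption: a typical Hadoop client configuration location.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Placeholder application class and JAR.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  --num-executors 4 \
  --executor-memory 2g \
  myapp.jar
```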

I have not tried installing YARN without installing the rest of Hadoop.

What is the overriding reason to run Spark on its own?

You can use Spark in local or standalone mode if you do not want Hadoop
core.
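A minimal sketch of both Hadoop-free options (class and JAR names and the master hostname are placeholders):

```shell
# Local mode: everything runs in a single JVM on one machine.
# No HDFS, YARN or ZooKeeper involved; input can be local files.
spark-submit --master "local[*]" --class com.example.MyApp myapp.jar

# Standalone mode: Spark's own built-in cluster manager, still no Hadoop.
# From $SPARK_HOME, start a master, then point a worker at it.
# (In Spark 3.x the worker script was renamed to start-worker.sh.)
./sbin/start-master.sh
./sbin/start-slave.sh spark://master-host:7077

# Submit against the standalone master.
spark-submit --master spark://master-host:7077 \
  --class com.example.MyApp myapp.jar
```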

HTH

Dr Mich Talebzadeh



LinkedIn https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 24 August 2016 at 21:54, kant kodali <kanth909@gmail.com> wrote:

> What do I lose if I run Spark without using HDFS or ZooKeeper? Which of
> them is almost a must in practice?
>
