spark-user mailing list archives

From Patrick Wendell <>
Subject Re: EC2 scripts documentations lacks how to actually run applications
Date Wed, 08 Jan 2014 19:57:51 GMT
Hey Aureliano,

Yes, people run long-running applications with standalone mode and run
them in production. spark-class is a utility script for convenience.
If you want to run a long-running application, you would write a Spark
application, bundle it, and submit it to the cluster. You can then
launch your application with nohup, or however else you want to
daemonize it.

Here is an example of a standalone application.
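As a rough sketch, such a standalone application could look like the following (the master URL, app name, and input data are placeholder assumptions, not values from this thread; the API shown is the SparkContext constructor style of that era):

```scala
// Minimal standalone Spark application sketch.
// "spark://<master-host>:7077" is a placeholder for the standalone
// master URL that the EC2 scripts print when the cluster comes up.
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    // Connect to the standalone master set up by spark_ec2.
    val sc = new SparkContext("spark://<master-host>:7077", "SimpleApp")
    // Toy workload: sum the integers 1..1000 across the cluster.
    val sum = sc.parallelize(1 to 1000).reduce(_ + _)
    println("Sum: " + sum)
    sc.stop()
  }
}
```

After bundling it (for example with `sbt package`), the resulting jar can be launched on the cluster gateway under `nohup` so it survives logout, per the advice above.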

The pull request Mark referred to adds some support for submitting
your driver program to the cluster... but it's just an extra feature.
Launching packaged applications is the way you want to go for your use
case.

- Patrick

On Wed, Jan 8, 2014 at 10:31 AM, Mark Hamstra <> wrote:
> On Wed, Jan 8, 2014 at 10:12 AM, Aureliano Buendia <>
> wrote:
>> Here is a refactored version of the question:
>> How do you run spark-class for long-running applications? Why doesn't
>> spark-class launch a daemon?
>> On Wed, Jan 8, 2014 at 3:21 AM, Aureliano Buendia <>
>> wrote:
>>> Hi,
>>> The EC2 documentation has a section called 'Running Applications', but it
>>> actually lacks the step which should describe how to run the application.
>>> The spark_ec2 script seems to set up a standalone cluster, although I'm
>>> not sure why AMI_PREFIX points to a mesos AMI list.
>>> Assuming that the cluster type is standalone, we could run the app by the
>>> spark-class script. Is this the missing step in the documentation?
>>> The spark-class script does not launch a daemon; is it supposed to be used
>>> with nohup for long-running applications?
>>> Finally, is the standalone cluster type used for real-world applications,
>>> or do people use Spark on YARN and Mesos when it comes to production?
