spark-issues mailing list archives

From "Apache Spark (JIRA)" <>
Subject [jira] [Commented] (SPARK-15600) Make local mode as default mode
Date Fri, 27 May 2016 10:14:12 GMT


Apache Spark commented on SPARK-15600:

User 'zjffdu' has created a pull request for this issue:

> Make local mode as default mode
> -------------------------------
>                 Key: SPARK-15600
>                 URL:
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Jeff Zhang
>            Priority: Minor
> Usually I write a Spark application in the IDE and run it locally on a small dataset.
But I have to specify the master as local, otherwise I get the following error. If I then
want to run it on a large dataset in a cluster, I have to remove the "SparkConf.setMaster(local[4])"
call, which is inconvenient. So I created this ticket to propose that if no master is
specified, local mode is used.
> {code}
> org.apache.spark.SparkException: A master URL must be set in your configuration
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
> 	at com.zjffdu.tutorial.spark.kaggle.crime.SFCrime$.main(SFCrime.scala:14)
> 	at com.zjffdu.tutorial.spark.kaggle.crime.SFCrime.main(SFCrime.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> 	at java.lang.reflect.Method.invoke(
> 	at com.intellij.rt.execution.application.AppMain.main(
> 16/05/27 14:31:33 INFO spark.SparkContext: Successfully stopped SparkContext
> {code}
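The proposed fallback can be sketched in plain Scala. This is an illustrative sketch, not the actual patch: the helper name `resolveMaster` is hypothetical, and `local[*]` (use all local cores) is shown as one common default choice. In a real application the same effect is usually achieved by checking `SparkConf.contains("spark.master")` before calling `setMaster`, so a master supplied by spark-submit is never overridden.

```scala
// Hypothetical helper mirroring the proposed behavior: fall back to
// local mode when no master URL was provided. Not Spark API.
def resolveMaster(configured: Option[String]): String =
  configured.getOrElse("local[*]")

// In a real application this would feed SparkConf, e.g.:
//   val conf = new SparkConf().setAppName("SFCrime")
//   if (!conf.contains("spark.master")) conf.setMaster("local[*]")

println(resolveMaster(None))         // no master configured -> local mode
println(resolveMaster(Some("yarn"))) // cluster master is kept as-is
```

With this pattern the same jar runs unchanged in the IDE and on a cluster, which is exactly the inconvenience the ticket describes.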

This message was sent by Atlassian JIRA

