Change this:

spark-submit --master local[8] ~/main/py/file --py-files ~/some/other/files

to this:

spark-submit --master spark://blurred-part:7077 --py-files ~/some/other/files ~/main/py/file

(Note that --py-files has to come before the main script; spark-submit treats everything after the primary file as arguments to your application, so a trailing --py-files is silently ignored.)
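If you'd rather not pass --master on every run, the same default can also go in conf/spark-defaults.conf. A minimal sketch, assuming the master URL shown on your cluster's web UI:

spark.master    spark://blurred-part:7077

spark-submit picks this file up automatically, and an explicit --master on the command line still overrides it.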


Thanks
Best Regards

On Mon, Nov 10, 2014 at 4:55 PM, Akhil Das <akhil@sigmoidanalytics.com> wrote:
You could be running your application in local mode. In the application specify the master as spark://blurred-part:7077 and then it will appear in the running list.
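For example, a minimal PySpark sketch (the app name is just a placeholder; use the master URL from your cluster's web UI):

from pyspark import SparkConf, SparkContext

# Point the application at the standalone master instead of local mode.
conf = SparkConf().setMaster("spark://blurred-part:7077").setAppName("my-app")
sc = SparkContext(conf=conf)
# ... your job logic here ...
sc.stop()

A master set programmatically like this takes precedence over the --master flag passed to spark-submit.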

Thanks
Best Regards

On Mon, Nov 10, 2014 at 4:25 PM, Samarth Mailinglist <mailinglistsamarth@gmail.com> wrote:

There are no applications being shown in the dashboard (I am attaching a screenshot):

[inline screenshot: Spark master web UI with an empty application list]

This is my spark-env.sh:

SPARK_MASTER_WEBUI_PORT=8888

SPARK_WORKER_INSTANCES=8  # number of worker processes per node

SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=/usr/local/spark/history-logs/"  # config properties for the history server (e.g. "-Dx=y")

I have started the history server too.
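(Side note: the history server only lists applications that wrote event logs, so the applications themselves need event logging enabled as well. A sketch for conf/spark-defaults.conf, assuming the same directory as spark.history.fs.logDirectory above:

spark.eventLog.enabled    true
spark.eventLog.dir        /usr/local/spark/history-logs/

Without these, completed applications won't show up in the history UI even though the server is running.)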