spark-user mailing list archives

From Nitin Kalra <nitinkalra2...@gmail.com>
Subject Re: Apache Spark : spark.eventLog.dir on Windows Environment
Date Tue, 21 Jul 2015 08:27:57 GMT
Hi Akhil,

I don't have HADOOP_HOME or HADOOP_CONF_DIR set, and I don't have winutils.exe
either. What configuration is required for this? Where can I get winutils.exe?
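For reference, the usual setup on Windows (a sketch based on common practice, not something confirmed in this thread) is to obtain a winutils.exe build matching your Hadoop version, place it in a hadoop\bin directory, and point HADOOP_HOME at that directory before launching Spark. The C:\hadoop path below is just an example location:

```shell
:: Assumed layout: winutils.exe has been downloaded into C:\hadoop\bin
:: (prebuilt binaries are commonly distributed via third-party
:: winutils repositories on GitHub)
set HADOOP_HOME=C:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%
```

With HADOOP_HOME set, Hadoop's shell utilities stop looking for cygpath. Also make sure the event log directory (c:\sparklogs in the configuration below) exists before the application starts.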

Thanks and Regards,
Nitin Kalra


On Tue, Jul 21, 2015 at 1:30 PM, Akhil Das <akhil@sigmoidanalytics.com>
wrote:

> Do you have HADOOP_HOME, HADOOP_CONF_DIR and hadoop's winutils.exe in the
> environment?
>
> Thanks
> Best Regards
>
> On Mon, Jul 20, 2015 at 5:45 PM, nitinkalra2000 <nitinkalra2000@gmail.com>
> wrote:
>
>> Hi All,
>>
>> I am working with Spark 1.4 in a Windows environment. I have to set the
>> event log directory so that I can reopen the Spark UI after the
>> application has finished.
>>
>> But I am not able to set spark.eventLog.dir; it gives an error on
>> Windows.
>>
>> The configuration is:
>>
>> <entry key="spark.eventLog.enabled" value="true" />
>> <entry key="spark.eventLog.dir" value="file:///c:/sparklogs" />
>>
>> The exception I get is:
>>
>> java.io.IOException: Cannot run program "cygpath": CreateProcess error=2,
>> The system cannot find the file specified
>>     at java.lang.ProcessBuilder.start(Unknown Source)
>>     at org.apache.hadoop.util.Shell.runCommand(Shell.java:206)
>>
>> I have also tried installing Cygwin, but the error persists.
>>
>> Can anybody give me any advice on this?
>>
>> I have posted the same question on Stackoverflow as well :
>>
>> http://stackoverflow.com/questions/31468716/apache-spark-spark-eventlog-dir-on-windows-environment
>>
>> Thanks
>> Nitin
>>
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Apache-Spark-spark-eventLog-dir-on-Windows-Environment-tp23913.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>>
>
