I am trying to create an RDD using swebhdfs against a remote Hadoop cluster that is protected by Knox and uses SSL.
The code looks like this -
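(The original snippet was not included; the code is along these lines — a minimal sketch, where the Knox gateway host, port, and file path are hypothetical placeholders. It requires a running spark-shell with its built-in `sc` SparkContext and a reachable cluster, so it is illustrative only.)

```scala
// Read a file over SSL-secured WebHDFS (the swebhdfs:// scheme) through the
// Knox gateway. Host, port, and path below are placeholders, not real values.
val rdd = sc.textFile("swebhdfs://knox-gateway.example.com:8443/path/to/file.txt")

// Any action (count, collect, etc.) triggers the actual HTTPS requests,
// which is where the SSL handshake failure surfaces.
rdd.count()
```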
I'm passing the truststore and truststore password through extra Java options when starting the spark-shell:

spark-shell --conf "spark.executor.extraJavaOptions=-Djavax.net.ssl.trustStore=truststore.jks -Djavax.net.ssl.trustStorePassword=<password>" --conf "spark.driver.extraJavaOptions=-Djavax.net.ssl.trustStore=truststore.jks -Djavax.net.ssl.trustStorePassword=<password>"

But I always get this error:

Message: Remote host closed connection during handshake
Am I passing the truststore and truststore password the right way?