spark-user mailing list archives

From "Uchoa, Rodrigo" <>
Subject [SPARK-CORE] JVM Properties passed as -D, not being found inside UDAF classes
Date Tue, 09 Jan 2018 13:59:08 GMT
Hi everyone!

I'm trying to pass -D properties (JVM system properties) to a Spark application that includes some
UDAFs (User Defined Aggregate Functions) which read those properties via System.getProperty().
The problem is that the properties are never set when we try to read them.
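For reference, the read inside the UDAF boils down to something like the sketch below (the class name, the `my.key` property, and the `<missing>` fallback are illustrative, not our actual code):

```java
public class UdafPropertySketch {
    // Mirrors how the UDAF reads the JVM property. When the -D flag
    // never reached this JVM (as happens on the executors), the
    // property is absent and we get the fallback instead.
    static String readKey() {
        String value = System.getProperty("my.key");
        return value != null ? value : "<missing>";
    }

    public static void main(String[] args) {
        System.out.println(readKey());
    }
}
```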

According to the docs, we have two ways of passing -D variables:

--conf 'spark.driver.extraJavaOptions=-Dmy.key=value1'
--conf 'spark.executor.extraJavaOptions=-Dmy.key=value1'

The first is for the driver, the second for the executors. The first one works: code running
outside the UDAF can read the property. But the second apparently does not.
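For completeness, the full submit looks roughly like this (class and jar names are placeholders, not our actual application):

```shell
spark-submit \
  --deploy-mode client \
  --conf 'spark.driver.extraJavaOptions=-Dmy.key=value1' \
  --conf 'spark.executor.extraJavaOptions=-Dmy.key=value1' \
  --class com.example.MyApp \
  my-app.jar
```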

Could this have something to do with the fact that we're running in client mode? Or could it
be something specific to the UDAF classes? Passing '--driver-java-options' also didn't
work. To summarise: only the UDAF classes don't see the JVM properties.

I'm using Spark 2.2.0.



