spark-user mailing list archives

From Amit Joshi <>
Subject Missing required configuration "partition.assignment.strategy" [ Kafka + Spark Structured Streaming ]
Date Sun, 06 Dec 2020 07:00:19 GMT
Hi All,

I am running Spark Structured Streaming along with Kafka.
Below is the pom.xml:

    <!-- Put the Scala version of the cluster -->


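(The dependency listing itself did not survive; for context, a Structured Streaming + Kafka build of this kind typically declares dependencies along these lines. This is a sketch only — the artifact versions and Scala suffix below are assumptions, not the actual pom:)

```xml
<!-- Sketch only: versions/suffixes are assumptions; match them to the cluster's Spark and Scala versions -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.0.1</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <!-- Kafka source for Structured Streaming; pulls in kafka-clients transitively -->
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql-kafka-0-10_2.12</artifactId>
    <version>3.0.1</version>
  </dependency>
</dependencies>
```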

I am building the fat jar with the shade plugin. The jar runs as
expected in my local setup with this command:

spark-submit --master local[*] --class
--num-executors 3 --driver-memory 2g --executor-cores 2
--executor-memory 3g prism-event-synch-rta.jar
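(On the shading step: a commonly reported cause of this particular ConfigException on YARN is an older kafka-clients on the cluster classpath winning over the one bundled in the fat jar, and relocating the Kafka packages in the shade plugin is the usual workaround. A minimal sketch — the plugin version and shaded prefix here are assumptions:)

```xml
<!-- Sketch only: a maven-shade-plugin relocation commonly suggested when the
     cluster ships its own (older) kafka-clients -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- Hypothetical shaded prefix -->
            <pattern>org.apache.kafka</pattern>
            <shadedPattern>shaded.org.apache.kafka</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```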

But when I try to run the same jar on the Spark cluster using YARN with this command:

spark-submit --master yarn --deploy-mode cluster --class
--num-executors 4 --driver-memory 2g --executor-cores 1
--executor-memory 4g gs://jars/prism-event-synch-rta.jar

I get this exception:

Caused by: org.apache.kafka.common.config.ConfigException: Missing required
configuration "partition.assignment.strategy" which has no default value.
	at org.apache.kafka.common.config.ConfigDef.parse

I have tried setting "partition.assignment.strategy" explicitly, but it
still does not work.
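(For reference, in Structured Streaming, Kafka consumer properties have to go through the source options with a "kafka." prefix to reach the underlying consumer. A sketch of how that setting would be applied — the app name, broker, topic, and strategy class below are placeholders, not the original job's code:)

```scala
// Sketch only: names are placeholders, not the original job's code
import org.apache.spark.sql.SparkSession

object KafkaReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kafka-read-sketch").getOrCreate()

    // Options prefixed "kafka." are forwarded to the Kafka consumer config
    val df = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")           // placeholder
      .option("subscribe", "some-topic")                          // placeholder
      .option("kafka.partition.assignment.strategy",
              "org.apache.kafka.clients.consumer.RangeAssignor")  // explicit strategy
      .load()

    df.printSchema()
  }
}
```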

Please help.


Amit Joshi
