spark-user mailing list archives

From nib...@free.fr
Subject Spark Streaming over YARN
Date Fri, 02 Oct 2015 15:40:47 GMT
Hello,
I have a job receiving data from Kafka (4 partitions) and persisting it into MongoDB.
It works fine, but when I deploy it on a YARN cluster (4 nodes with 2 cores each), only one
node receives all the Kafka partitions, and only one node processes my RDD treatment (the
foreach function).
How can I force YARN to use all the nodes and cores to process the data (both the receiver
and the RDD treatment)?

Tks a lot
Nicolas
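
For context, the usual fix with the receiver-based Kafka API is to create several receivers (e.g. one per Kafka partition) and union them, since a single receiver runs on a single executor core; repartitioning the unioned stream then spreads the downstream foreach work across the cluster. A minimal sketch, assuming Spark 1.x with spark-streaming-kafka; the host names, topic, group id, and partition counts below are placeholders:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaToMongo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaToMongo")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // Placeholder connection settings -- adjust to your cluster.
    val zkQuorum = "zk-host:2181"
    val groupId  = "mongo-writer"
    val topics   = Map("my-topic" -> 1) // topic -> threads per receiver

    // One receiver per Kafka partition (4 here). Each receiver pins one
    // executor core, so YARN can place them on different nodes.
    val streams = (1 to 4).map { _ =>
      KafkaUtils.createStream(ssc, zkQuorum, groupId, topics)
    }
    val unified = ssc.union(streams)

    // Repartition so the per-record work (e.g. the MongoDB writes in
    // foreachRDD) is spread over all 8 cores, not just the receiver nodes.
    unified.repartition(8).foreachRDD { rdd =>
      rdd.foreachPartition { records =>
        // open one MongoDB connection per partition and write the records
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that the job must also be submitted with enough resources for YARN to hand out, e.g. `--num-executors 4 --executor-cores 2`; with too few executor cores, all receivers land on one node regardless of the code.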

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

