spark-user mailing list archives

From "Susan X. Huynh" <>
Subject Re: Advice on multiple streaming job
Date Sun, 06 May 2018 14:57:08 GMT
Hi Dhaval,

Not sure if you have considered this: port 4040 sounds like a driver UI
port. By default Spark will try ports up to 4056, but you can increase that
range with "spark.port.maxRetries". Try setting it to 32. This would help if
the only conflict is among the driver UI ports (e.g. if you have > 16
drivers running on the same host).
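A sketch of how that setting might be applied, either cluster-wide in
spark-defaults.conf or per job on the spark-submit command line (the value
32 follows the suggestion above; "spark.ui.port" is the property behind the
explicit-port workaround mentioned in the quoted question below):

```
# conf/spark-defaults.conf -- applies to every job submitted from this host
spark.port.maxRetries   32

# Or per job, on the command line:
#   spark-submit --conf spark.port.maxRetries=32 ...
#   spark-submit --conf spark.ui.port=4100 ...   # pin the UI port explicitly
```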


On Sun, May 6, 2018 at 12:32 AM, vincent gromakowski <> wrote:

> Use a scheduler that abstracts the network away, for instance with a CNI
> or other mechanisms (Mesos, Kubernetes, YARN). The CNI will allow you to
> always bind to the same ports, because each container will have its own
> IP. Some other solutions, like Mesos with Marathon, can work without a
> CNI, with host IP binding, but will manage the ports for you, ensuring
> there isn't any conflict.
> On Sat, May 5, 2018 at 17:10, Dhaval Modi <> wrote:
>> Hi All,
>> Need advice on executing multiple streaming jobs.
>> Problem: We have hundreds of streaming jobs, and every job uses a new
>> port. Spark automatically tries ports from 4040 up to 4056; after that it
>> fails. One workaround is to provide the port explicitly.
>> Is there a way to tackle this situation, or am I missing anything?
>> Thanking you in advance.
>> Regards,
>> Dhaval Modi
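For the scheduler route vincent describes, a minimal sketch of a
cluster-mode submit on Kubernetes (available since Spark 2.3); the API
server URL, image name, class, and jar path are all placeholders, not
values from this thread:

```shell
# Each driver runs in its own pod with its own IP, so every job can keep
# the default UI port 4040 without colliding with other drivers on the host.
spark-submit \
  --master k8s://https://<api-server-host>:6443 \
  --deploy-mode cluster \
  --name my-streaming-job \
  --class com.example.MyStreamingJob \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  local:///opt/spark/jars/my-streaming-job.jar
```

Without a CNI, Mesos with Marathon can instead manage host-port assignments
for you, as noted above.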

Susan X. Huynh
Software engineer, Data Agility
