spark-user mailing list archives

From vincent gromakowski <vincent.gromakow...@gmail.com>
Subject Re: Advice on multiple streaming job
Date Sun, 06 May 2018 07:32:28 GMT
Use a scheduler that abstracts the network away, for instance with a CNI, or
other mechanisms (Mesos, Kubernetes, YARN). A CNI lets every job bind to the
same port because each container gets its own IP. Other solutions such as
Mesos with Marathon can work without a CNI, using host IP binding, but will
manage the ports for you, ensuring there are no conflicts.
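As a rough sketch of this approach, a Spark-on-Kubernetes submission in cluster mode gives each driver its own pod IP, so every job can keep the default UI port. The master URL, image name, class, and jar path below are hypothetical placeholders, not values from this thread:

```shell
# Sketch only: cluster endpoint, image, class, and jar are placeholders.
spark-submit \
  --master k8s://https://my-cluster:6443 \
  --deploy-mode cluster \
  --name stream-job-1 \
  --conf spark.kubernetes.container.image=my-registry/spark:2.3.0 \
  --class com.example.StreamJob \
  local:///opt/jobs/stream-job.jar
# Each driver runs in its own pod with its own IP, so all jobs can bind
# the default spark.ui.port (4040) without colliding.
```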

On Sat, May 5, 2018 at 17:10, Dhaval Modi <dhavalmodi24@gmail.com> wrote:

> Hi All,
>
> Need advice on executing multiple streaming jobs.
>
> Problem: We have hundreds of streaming jobs, and every streaming job needs
> its own UI port. Spark automatically tries ports from 4040 to 4056 and
> fails after that. One workaround is to provide the port explicitly.
>
> Is there a way to tackle this situation, or am I missing anything?
>
> Thanks in advance.
>
> Regards,
> Dhaval Modi
> dhavalmodi24@gmail.com
>
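If the jobs must share one host, the explicit-port workaround mentioned above can be expressed with standard Spark settings; the 4040-4056 range in the question matches the default `spark.port.maxRetries` of 16. These are configuration sketches, with the trailing arguments elided:

```shell
# Widen the retry window beyond the default 16 attempts:
spark-submit --conf spark.port.maxRetries=128 ...

# Or pin a distinct UI port for each job:
spark-submit --conf spark.ui.port=4140 ...

# Or disable the per-job UI entirely if it is not needed:
spark-submit --conf spark.ui.enabled=false ...
```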
