spark-user mailing list archives

From Mich Talebzadeh <mich.talebza...@gmail.com>
Subject Re: Unable to set cores while submitting Spark job
Date Fri, 01 Apr 2016 05:59:12 GMT
Hi Shridhar

Can you check in the Spark GUI whether the number of cores shown per worker is
the same as you set? It appears under the "Cores" column.

HTH

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 1 April 2016 at 05:59, vetal king <greenvetal@gmail.com> wrote:

> Ted, Mich,
>
> Thanks for your replies. I ended up using sparkConf.set(<cores>) and
> accepting cores as a parameter. But I am still not sure why spark-submit's
> executor-cores or driver-cores option did not work; setting cores within
> the main method seems a bit cumbersome.
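A minimal sketch of the workaround described above (a sketch only, assuming Spark
standalone mode with the master URL supplied via spark-submit; the object name,
argument handling, and default value are illustrative and not taken from the thread):

    import org.apache.spark.{SparkConf, SparkContext}

    object CoreLimitedJob {
      def main(args: Array[String]): Unit = {
        // Illustrative: take the core cap as the first program argument, defaulting to "1".
        val maxCores = args.headOption.getOrElse("1")

        // The cap must be set on the SparkConf before the SparkContext is created;
        // calling sc.getConf().set(...) after creation does not affect scheduling.
        // The master URL is assumed to be supplied via spark-submit --master.
        val conf = new SparkConf()
          .setAppName("CoreLimitedJob")
          .set("spark.cores.max", maxCores) // total cores for this app in standalone mode

        val sc = new SparkContext(conf)
        // ... job logic ...
        sc.stop()
      }
    }

The job would then be submitted with the desired core count as the first application argument.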
>
> Thanks again,
> Shridhar
>
>
>
> On Wed, Mar 30, 2016 at 8:42 PM, Mich Talebzadeh <
> mich.talebzadeh@gmail.com> wrote:
>
>> Hi Ted
>>
>> Can one specify the cores as follows, for example 12 cores?
>>
>>   val conf = new SparkConf().
>>                setAppName("ImportStat").
>>                setMaster("local[12]").
>>                set("spark.driver.allowMultipleContexts", "true").
>>                set("spark.hadoop.validateOutputSpecs", "false")
>>   val sc = new SparkContext(conf)
>>
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>>
>> On 30 March 2016 at 14:59, Ted Yu <yuzhihong@gmail.com> wrote:
>>
>>> -c CORES, --cores CORES Total CPU cores to allow Spark applications to
>>> use on the machine (default: all available); only on worker
>>>
>>> Regarding sc.getConf().set():
>>>
>>> I think you should use this pattern (shown in
>>> https://spark.apache.org/docs/latest/spark-standalone.html):
>>>
>>> val conf = new SparkConf()
>>>              .setMaster(...)
>>>              .setAppName(...)
>>>              .set("spark.cores.max", "1")
>>> val sc = new SparkContext(conf)
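As an aside (not from the thread): in standalone mode, spark.cores.max caps the total
number of cores an application may take across the cluster, and the same property can
also be supplied at submit time via spark-submit's --conf flag (e.g. --conf
spark.cores.max=1) instead of being hard-coded in the application.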
>>>
>>>
>>> On Wed, Mar 30, 2016 at 5:46 AM, vetal king <greenvetal@gmail.com>
>>> wrote:
>>>
>>>> Hi all,
>>>>
>>>> While submitting a Spark job I am specifying the options --executor-cores
>>>> 1 and --driver-cores 1. However, when the job was submitted, it used all
>>>> available cores. So I tried to limit the cores within my main function with
>>>> sc.getConf().set("spark.cores.max", "1"); however, it still used all
>>>> available cores.
>>>>
>>>> I am using Spark in standalone mode (spark://<hostname>:7077)
>>>>
>>>> Any idea what I am missing?
>>>> Thanks in Advance,
>>>>
>>>> Shridhar
>>>>
>>>>
>>>
>>
>
