spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: PermGen Space Error
Date Wed, 29 Jul 2015 10:07:24 GMT
Yes, I think this was asked because you didn't say what flags you set
before, and it's worth verifying they're the correct ones.

Although I'd be kind of surprised if 512m isn't enough, did you try more?
You could also try -XX:+CMSClassUnloadingEnabled -XX:+CMSPermGenSweepingEnabled
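For example, combining those with the PermGen size in one setting (values
illustrative; note the CMS flags only take effect when the CMS collector is
enabled with -XX:+UseConcMarkSweepGC):

  spark.executor.extraJavaOptions="-XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled -XX:+CMSPermGenSweepingEnabled -XX:MaxPermSize=512m"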

Also verify your executor/driver actually started with this option to
rule out a config problem.
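A quick way to check (a rough sketch, assuming a live SparkContext named sc,
e.g. from spark-shell) is to print the JVM arguments the driver and the
executors actually started with:

  import java.lang.management.ManagementFactory

  // Flags the driver JVM was started with
  println(ManagementFactory.getRuntimeMXBean.getInputArguments)

  // Flags on the executor JVMs (one task per partition)
  sc.parallelize(1 to 2, 2)
    .map(_ => ManagementFactory.getRuntimeMXBean.getInputArguments.toString)
    .collect()
    .foreach(println)

If -XX:MaxPermSize doesn't show up there, the option never reached the JVM.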

On Wed, Jul 29, 2015 at 10:45 AM, Sarath Chandra
<sarathchandra.josyam@algofusiontech.com> wrote:
> Yes.
>
> As mentioned at the end of my mail below, I tried both the 256m and 512m
> options, but the issue persists.
>
> I'm giving following parameters to spark configuration -
> spark.core.connection.ack.wait.timeout=600
> spark.akka.timeout=1000
> spark.akka.frameSize=50
> spark.executor.memory=2g
> spark.task.cpus=2
> spark.scheduler.mode=fair
> spark.driver.extraJavaOptions="-XX:MaxPermSize=256m"
> spark.executor.extraJavaOptions="-XX:MaxPermSize=256m"
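For reference, the executor settings could also be set in code, a rough Scala
sketch for the controller that builds the context; spark.driver.extraJavaOptions
is left out on purpose, since by the time this code runs the driver JVM has
already started (see the note further down):

  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .set("spark.executor.memory", "2g")
    .set("spark.executor.extraJavaOptions", "-XX:MaxPermSize=256m")
  val sc = new SparkContext(conf)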
>
> The jars being included are of about 21MB, the data being processed by the
> job is around 1000 rows with 25 columns. I'm running on a single node mesos
> cluster on my laptop having 4 CPUs and 12GB RAM.
>
> On Wed, Jul 29, 2015 at 2:49 PM, fightfate@163.com <fightfate@163.com>
> wrote:
>>
>> Hi, Sarath
>>
>> Did you try using spark.executor.extraJavaOptions to increase
>> -XX:PermSize= and -XX:MaxPermSize=?
>>
>>
>> ________________________________
>> fightfate@163.com
>>
>>
>> From: Sarath Chandra
>> Date: 2015-07-29 17:39
>> To: user@spark.apache.org
>> Subject: PermGen Space Error
>> Dear All,
>>
>> I'm using -
>>  => Spark 1.2.0
>>  => Hive 0.13.1
>>  => Mesos 0.18.1
>>  => Spring
>>  => JDK 1.7
>>
>> I've written a Scala program which
>>   => instantiates a spark and hive context
>>   => parses an XML file which provides the where clauses for queries
>>   => generates full fledged hive queries to be run on hive tables
>>   => registers obtained SchemaRDD as temp tables to get reduced data sets
>> to be queried further
>>   => prints the count of finally obtained data set
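In outline, that flow would look something like this (a rough Scala sketch
against the Spark 1.2 API; table, column, and where-clause values are made up):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.hive.HiveContext

  val sc = new SparkContext(new SparkConf().setAppName("HiveQueryRunner"))
  val hc = new HiveContext(sc)

  // In the real program the where clause comes from the parsed XML
  val whereClause = "some_col = 'some_value'"

  val reduced = hc.sql(s"SELECT * FROM some_hive_table WHERE $whereClause")
  reduced.registerTempTable("reduced_data")  // SchemaRDD registered as temp table

  val finalSet = hc.sql("SELECT * FROM reduced_data WHERE other_col IS NOT NULL")
  println(finalSet.count())  // the error surfaces at this action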
>>
>> I'm running this Scala program programmatically through a java command (the
>> command invokes a controller program that creates some useful value objects
>> from input parameters and properties files, and then calls the above Scala
>> program).
>>
>> I'm getting a PermGen space error when it hits the last line, which prints
>> the count.
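One thing worth noting here: since the driver JVM is started directly with a
java command rather than through spark-submit, spark.driver.extraJavaOptions
cannot help the driver, as that JVM is already running by the time the setting
is read. The PermGen size for the driver has to go on the java command line
itself, something like (class name hypothetical):

  java -XX:MaxPermSize=512m -cp <your-app-classpath> com.yourcompany.Controller <args>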
>>
>> I'm printing the generated Hive queries from the Scala program to the
>> console. When I run the same queries from a spark-shell, they work fine.
>>
>> As suggested in some posts and blogs, I tried using the option
>> spark.driver.extraJavaOptions to increase the size, with both 256m and
>> 512m, but still no luck.
>>
>> Please help me resolve this space issue.
>>
>> Thanks & Regards,
>> Sarath.
>
>
