Hi,

I am getting a weird error when running a Spark job on an EMR cluster. The same program runs fine on my local machine. Is there anything I need to do to resolve this?

21/04/12 18:48:45 ERROR SparkContext: Error initializing SparkContext.
java.lang.NumberFormatException: For input string: "30s"

I tried the solution mentioned in the link below, but it didn't work for me.

https://hadooptutorials.info/2020/10/11/part-5-using-spark-as-execution-engine-for-hive-2/
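In case it helps to see what I have already attempted: as I understand that post, the workaround is to force a time-suffixed HDFS client property back to a plain number so the older HDFS client jars on the Spark classpath can parse it. Below is a minimal sketch of that idea, assuming the offending property is dfs.client.datanode-restart.timeout (my guess from the "30s" value in the error, not something I have confirmed from a full stack trace):

import org.apache.spark.sql.SparkSession

// Sketch only: spark.hadoop.* settings are copied into the Hadoop Configuration,
// so this forces the suspected property to a plain number ("30" instead of "30s").
// The property name and app name here are assumptions on my part.
val spark = SparkSession.builder()
  .appName("EmrNumberFormatRepro")
  .config("spark.hadoop.dfs.client.datanode-restart.timeout", "30")
  .getOrCreate()

The same override can of course be passed with --conf on spark-submit instead of being hard-coded; I am only showing it in code to keep the example self-contained.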

Thanks,
Asmath