spark-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: LZO configuration can not affect
Date Fri, 20 Mar 2015 01:54:45 GMT
jeanlyn92:
I wasn't very clear in my previous reply: I meant to refer to
/home/hadoop/mylib/hadoop-lzo-SNAPSHOT.jar

But it looks like the distro already includes hadoop-lzo-0.4.15.jar

Cheers
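
For anyone hitting the same issue: a quick way to confirm which LZO jar the distro actually ships is something like the following (a rough sketch; the HADOOP_HOME layout is an assumption and may differ on your install):

# list any hadoop-lzo jars bundled with the Hadoop distribution (paths assumed)
find $HADOOP_HOME -name "hadoop-lzo*.jar"
# and check the separately built jar mentioned above
ls -l /home/hadoop/mylib/hadoop-lzo-SNAPSHOT.jar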

On Thu, Mar 19, 2015 at 6:26 PM, jeanlyn92 <jeanlyn92@gmail.com> wrote:

> That's not enough. The config must point to the specific jar instead of the
> folder.
>
> 2015-03-19 21:27 GMT+08:00 Ted Yu <yuzhihong@gmail.com>:
>
>> If I read the screenshot correctly, the Hadoop LZO jar is under
>> /home/hadoop/mylib
>>
>> Cheers
>>
>>
>>
>> On Mar 19, 2015, at 5:37 AM, jeanlyn92 <jeanlyn92@gmail.com> wrote:
>>
>> You should configure it as follows:
>> export SPARK_LIBRARY_PATH="$HADOOP_HOME/lib/native:$HADOOP_HOME/share/hadoop/common/lib/hadoop-lzo-0.4.15.jar"
>>
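
If editing spark-env.sh does not take effect, an alternative sketch is to hand the jar to spark-shell at launch time; --driver-class-path, --jars and --driver-library-path are standard options, but the jar path below is just the one from the example above and may differ on your cluster:

# put the LZO jar on the driver classpath, ship it to executors, and expose the native libs
spark-shell \
  --driver-class-path $HADOOP_HOME/share/hadoop/common/lib/hadoop-lzo-0.4.15.jar \
  --jars $HADOOP_HOME/share/hadoop/common/lib/hadoop-lzo-0.4.15.jar \
  --driver-library-path $HADOOP_HOME/lib/native

Executors may also need spark.executor.extraLibraryPath pointed at the native LZO libraries, depending on how the cluster is set up.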
>>
>> On 03/19/2015 05:25 PM, Ted Yu wrote:
>>
>> How did you generate the hadoop-lzo jar?
>>
>> Thanks
>>
>>
>>
>>
>> On Mar 17, 2015, at 2:36 AM, 唯我者 <878223104@qq.com> wrote:
>>
>> Hi everybody,
>> I have configured the environment for LZO like this:
>> [two screenshot attachments showing the LZO configuration, not rendered]
>>
>>
>> But when I execute code in spark-shell, this error still comes out:
>>
>> scala> val hdfsfile=sc.textFile("/xiaoming/gps_info")
>>
>> scala> hdfsfile.map(_.split(","))
>>
>> scala> res0.collect
>> java.lang.RuntimeException: Error in configuring object
>>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
>>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
>>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>>         at org.apache.spark.rdd.HadoopRDD.getInputFormat(HadoopRDD.scala:184)
>>         at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:197)
>>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:222)
>>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:220)
>>         at scala.Option.getOrElse(Option.scala:120)
>>         at org.apache.spark.rdd.RDD.partitions(RDD.scala:220)
>>         at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
>>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:222)
>>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:220)
>>         at scala.Option.getOrElse(Option.scala:120)
>>         at org.apache.spark.rdd.RDD.partitions(RDD.scala:220)
>>         at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
>>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:222)
>>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:220)
>>         at scala.Option.getOrElse(Option.scala:120)
>>         at org.apache.spark.rdd.RDD.partitions(RDD.scala:220)
>>         at org.apache.spark.SparkContext.runJob(SparkContext.scala:1367)
>>         at org.apache.spark.rdd.RDD.collect(RDD.scala:797)
>>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:17)
>>         at $iwC$$iwC$$iwC.<init>(<console>:22)
>>         at $iwC$$iwC.<init>(<console>:24)
>>         at $iwC.<init>(<console>:26)
>>         at <init>(<console>:28)
>>         at .<init>(<console>:32)
>>         at .<clinit>(<console>)
>>         at .<init>(<console>:7)
>>         at .<clinit>(<console>)
>>         at $print(<console>)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
>>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
>>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
>>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
>>         at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
>>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
>>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
>>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
>>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>>         at org.apache.spark.repl.Main.main(Main.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
>>         ... 60 more
>> Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
>>         at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:135)
>>         at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:175)
>>         at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)
>>         ... 65 more
>> Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found
>>         at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1801)
>>         at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:128)
>>         ... 67 more
>>
>>
>>
>> I have tuned the env more than ten times... I cannot find what's wrong.
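
A couple of quick checks that might narrow this down (the paths are the ones mentioned earlier in this thread, so treat them as assumptions about this particular cluster):

# confirm the codec class is actually inside the jar the config points at
unzip -l /home/hadoop/mylib/hadoop-lzo-SNAPSHOT.jar | grep LzoCodec
# confirm which codecs Hadoop is configured to load (core-site.xml location assumed)
grep -A 1 io.compression.codecs $HADOOP_HOME/etc/hadoop/core-site.xml

If LzoCodec is missing from the jar, the jar itself needs rebuilding (or use the distro's hadoop-lzo-0.4.15.jar); if it is present, the next thing to inspect is whether that exact path ends up on the classpath spark-shell actually starts with.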
>>
>>
>>
>>
>>
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>>
>>
>>
>>
>
