Does sqoop-export support an --as-sequencefile option? I know sqoop-import does.
-Deepak
On Thu, Aug 15, 2013 at 11:34 PM, Abraham Elmahrek <abe@cloudera.com> wrote:
> Hey There,
>
> I believe you're missing the --as-sequencefile directive!
>
> -Abe
>
>
> On Thu, Aug 15, 2013 at 7:16 PM, Deepak Konidena <deepakkoni@gmail.com> wrote:
>
>> Hi,
>>
>> I have a sequence file with both (key, value) as
>> org.apache.hadoop.io.Text.
>>
>> I am trying to export the data into a MySQL table with (key, value) mapped
>> to (varchar, blob), since the value is pretty big, and I get the following
>> error:
>>
>> (command) - sqoop export -m "1" --connect
>> "jdbc:mysql://<host>:3306/database" --username "sqoop" --password
>> "sqooppwd" --table "tablename" --export-dir "/path/to/sequencefile"
>> --verbose
>>
>> java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to
>> org.apache.hadoop.io.LongWritable
>> at
>> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:95)
>> at
>> org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:38)
>> at
>> org.apache.sqoop.mapreduce.CombineFileRecordReader.getCurrentKey(CombineFileRecordReader.java:79)
>> at
>> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.getCurrentKey(MapTask.java:461)
>> at
>> org.apache.hadoop.mapreduce.task.MapContextImpl.getCurrentKey(MapContextImpl.java:66)
>> at
>> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.getCurrentKey(WrappedMapper.java:75)
>> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
>> at
>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>
>> The export works fine when I create a text file like so:
>>
>> <key,value1,value2,value3>
>>
>> and upload it to HDFS using -copyFromLocal.
>>
>> But it's only with sequence files that the export doesn't seem to work. Any
>> thoughts?
>>
>> Thanks,
>> Deepak
>>
>>
>
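The stack trace above points at a likely cause: when Sqoop exports plain text, the underlying input format produces LongWritable byte-offset keys, and CombineShimRecordReader appears to cast the current key to that type unconditionally, so a SequenceFile whose keys are org.apache.hadoop.io.Text fails the cast at runtime. A minimal plain-Java sketch of that failure mode (the class and variable names here are stand-ins for illustration, not actual Sqoop code):

```java
public class CastFailureDemo {
    public static void main(String[] args) {
        // Stand-in for a Text key arriving where LongWritable is expected:
        // the static type is Object, the runtime type is String.
        Object key = "a Text key";
        try {
            // Compiles fine (downcast from Object), but throws at runtime
            // because the object is not actually a Long.
            Long offset = (Long) key;
            System.out.println("offset = " + offset);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException caught");
        }
    }
}
```

The compiler cannot catch this because the cast from Object is only checked at runtime, which is why the job starts normally and fails in the mapper once the first record is read.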