sqoop-user mailing list archives

From Sarath <sarathchandra.jos...@algofusiontech.com>
Subject Re: Export not working
Date Fri, 26 Oct 2012 14:45:08 GMT
Sure, will check that part. But I'm still stuck making export work. I'm 
facing issues exporting data to an Oracle database.

The issue is with the Timestamp fields. The Sqoop-generated code uses 
java.sql.Timestamp, which expects the date field value in a particular 
format. But in our data on Hadoop, the format of the date field is not 
guaranteed, as the data is placed in that location by multiple sources.

I even tried putting the Hadoop data in the format expected by Timestamp, 
but it still complains with an IllegalArgumentException.
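
For illustration, here is a minimal sketch (not the actual Sqoop-generated 
code) of the behaviour I mean: java.sql.Timestamp.valueOf accepts only the 
yyyy-mm-dd hh:mm:ss[.fffffffff] layout and rejects anything else.

import java.sql.Timestamp;

public class TimestampFormatCheck {
    public static void main(String[] args) {
        // Parses fine: JDBC escape format yyyy-mm-dd hh:mm:ss[.fffffffff]
        Timestamp ok = Timestamp.valueOf("2012-10-26 14:45:08.0");
        System.out.println(ok);

        // A different layout (slashes instead of dashes) is rejected with
        // java.lang.IllegalArgumentException, e.g. "Timestamp format must be
        // yyyy-mm-dd hh:mm:ss[.fffffffff]"
        try {
            Timestamp.valueOf("26/10/2012 14:45:08");
        } catch (IllegalArgumentException e) {
            System.out.println("Rejected: " + e.getMessage());
        }
    }
}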

Is there any workaround?

~Sarath.

On Friday 26 October 2012 08:06 PM, Jarek Jarcec Cecho wrote:
> Hi Sarath,
> I'm glad that Sqoop has started working for you.
>
> The internet advice with the parameter -Dhadoopversion=100 should indeed do the trick,
> so I'm not sure what went wrong with your build. Maybe some previous classes were found
> and not recompiled; would you mind trying "ant clean package -Dhadoopversion=100" to see
> if that helps?
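>
> For reference, the full clean rebuild would be run roughly like this from the Sqoop
> source directory (the checkout path below is just a placeholder):
>
>     cd /path/to/sqoop-1.4.2                 # your Sqoop 1.4.2 source checkout (path assumed)
>     ant clean package -Dhadoopversion=100   # rebuild against the Hadoop 1.0 profile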
>
> Jarcec
>
> On Fri, Oct 26, 2012 at 10:30:06AM +0530, Sarath wrote:
>> Thanks Jarcec. I downloaded the binary artifact and it's working now.
>>
>> I actually built my previous sqoop binary from the sources using the
>> option -Dhadoopversion=100 (since my hadoop version was 1.0.3) after
>> reading some blogs on the net. Not sure why it was still giving me
>> that exception.
>>
>> Sarath.
>>
>> On Thursday 25 October 2012 09:35 PM, Jarek Jarcec Cecho wrote:
>>> Hi Sarath,
>>> this exception is very typical when someone is mixing incompatible Hadoop binaries and
>>> applications (for example, Sqoop compiled for Hadoop 2 running on Hadoop 1). Would you
>>> mind checking that you've downloaded the appropriate binary distribution for your
>>> cluster? You have to use the binary artifact sqoop-1.4.2.bin__hadoop-1.0.0.tar.gz for
>>> Hadoop 1.0.3.
>>>
>>> Jarcec
>>>
>>> On Thu, Oct 25, 2012 at 07:23:49PM +0530, Sarath wrote:
>>>> Hi,
>>>>
>>>> I'm new to Sqoop. I have Sqoop 1.4.2 with Hadoop 1.0.3. I have both
>>>> the Hadoop and Sqoop home environment variables set.
>>>>
>>>> I'm trying to export a file on HDFS to a table in an Oracle database. I
>>>> included all the required parameters inside a file and then ran:
>>>> sqoop --options-file export_params
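>>>>
>>>> For context, a Sqoop options file lists one option or value per line; mine looks
>>>> roughly like the sketch below (the connection string, user, table name and export
>>>> directory here are just placeholders, not my real values):
>>>>
>>>> export
>>>> --connect
>>>> jdbc:oracle:thin:@//dbhost:1521/ORCL
>>>> --username
>>>> scott
>>>> --password
>>>> ********
>>>> --table
>>>> EXPORT_TABLE
>>>> --export-dir
>>>> /user/sarath/export_data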
>>>>
>>>> I got the exception below:
>>>> Exception in thread "main" java.lang.IncompatibleClassChangeError:
>>>> Found class org.apache.hadoop.mapreduce.JobContext, but interface
>>>> was expected
>>>>     at org.apache.sqoop.mapreduce.ExportOutputFormat.checkOutputSpecs(ExportOutputFormat.java:57)
>>>>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:887)
>>>>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     ....
>>>>
>>>> Is there anything more to be configured?
>>>>
>>>> Regards,
>>>> Sarath.
