sqoop-user mailing list archives

From Kate Ting <k...@cloudera.com>
Subject Re: [sqoop-user] Fwd: Error occur while sqoop import
Date Tue, 13 Dec 2011 18:12:47 GMT
For faster response, please subscribe to sqoop-user@incubator.apache.org.

Hi Bhavesh -

Please go ahead and upgrade to CDH3u2 for both Hadoop and Sqoop.
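
Once the upgrade is done, a quick sanity check (assuming the CDH packages put
hadoop and sqoop on your PATH) is to confirm the versions that will actually
be picked up:

    # Print the Hadoop build that Sqoop will run against
    hadoop version
    # Print the installed Sqoop release
    sqoop version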

Regards, Kate
On Mon, Dec 12, 2011 at 10:44 PM, Bhavesh Shah <bhavesh25shah@gmail.com> wrote:
> Hello Kate,
> I got the point. Please suggest which version of Hadoop supports
> Sqoop 1.1.0.
>
> Thanks and Regards,
> Bhavesh Shah.
>
> On Tue, Dec 13, 2011 at 5:04 AM, Kate Ting <kate@cloudera.com> wrote:
>> [Moving the thread to sqoop-user@incubator.apache.org. Please
>> subscribe to this list.]
>> Hi Bhavesh -
>>
>> As Arvind mentioned on this mailing list on Dec 7, this is likely
>> because the HADOOP_HOME environment variable is pointing to a version
>> of Hadoop that does not have the necessary backports needed by Sqoop.
>> You can work around this issue by upgrading to Cloudera's distribution
>> (CDH). Alternatively, you can wait for a resolution of SQOOP-384, which
>> will address this issue:
>> https://issues.apache.org/jira/browse/SQOOP-384
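>>
>> In the meantime, a quick way to confirm which Hadoop build Sqoop is
>> actually picking up is to check HADOOP_HOME directly (the CDH path below
>> is only an example; adjust it to your installation):
>>
>>     # Show where HADOOP_HOME currently points
>>     echo $HADOOP_HOME
>>     # Print the version of that Hadoop installation
>>     $HADOOP_HOME/bin/hadoop version
>>     # Example only: point HADOOP_HOME at the CDH install instead
>>     export HADOOP_HOME=/usr/lib/hadoop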
>> Regards, Kate
>> On Sun, Dec 11, 2011 at 11:35 PM, Bhavesh Shah <bhavesh25shah@gmail.com> wrote:
>>> Hello,
>>>
>>> 1) When I try a Sqoop import, I get the following error:
>>>
>>> Exception in thread "main" java.lang.IncompatibleClassChangeError:
>>> Found interface org.apache.hadoop.mapreduce.JobContext, but class was
>>> expected
>>>        at com.cloudera.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:198)
>>>        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:401)
>>>        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:418)
>>>        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:338)
>>>        at org.apache.hadoop.mapreduce.Job.submit(Job.java:960)
>>>        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:976)
>>>        at com.cloudera.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:107)
>>>        at com.cloudera.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:166)
>>>        at com.cloudera.sqoop.manager.SqlManager.importTable(SqlManager.java:385)
>>>        at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:350)
>>>        at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
>>>        at com.cloudera.sqoop.Sqoop.run(Sqoop.java:134)
>>>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:69)
>>>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:83)
>>>        at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:170)
>>>        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:196)
>>>        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:205)
>>>
>>> 2) All commands other than import are working; I have tried eval,
>>> list-tables, create-hive-table, and version.
>>> However, I am not able to access the directory under hdfs://localhost:54310/user
>>> because it does not give me permission, which is why I cannot see the
>>> created Hive table.
>>>
>>> Please suggest a solution for this.
>>>
>>>
>>>
>>> --
>>> Thanks and Regards,
>>> Bhavesh Shah
>>>
>>
>
>
>
> --
> Regards,
> Bhavesh Shah
>
