sqoop-user mailing list archives

From Abraham Elmahrek <...@cloudera.com>
Subject Re: sqoop 1.4.5 for hadoop 2.4 question (can't find method)
Date Thu, 20 Nov 2014 02:54:15 GMT
Incremental import in Sqoop2 is being tracked here:
https://issues.apache.org/jira/browse/SQOOP-1168

We're shooting for 1.99.5 (see the Sqoop 2 roadmap:
https://cwiki.apache.org/confluence/display/SQOOP/Sqoop+2+Roadmap). Sqoop2
development is quite active these days; 1.99.4 is in the process of being
released, and hopefully we'll see 1.99.5 in the next few months.

Until then, though, if you just need incremental import, try using an Apache
Bigtop distribution of Sqoop: http://bigtop.apache.org/,
http://mirror.cogentco.com/pub/apache/bigtop/bigtop-0.8.0/repos/. It will
give you packages to install using your favorite flavor of Linux.
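For reference, a minimal incremental import with Sqoop 1 looks roughly like
the sketch below. The connection string, table, and check column are
placeholders taken from the command earlier in this thread; `--incremental
append`, `--check-column`, and `--last-value` are the standard Sqoop 1
incremental-import flags:

```shell
# Sketch, not a tested invocation: import only rows of `user1` whose `id`
# exceeds the last recorded value. Uses -P (password prompt) rather than
# --password, per the warning Sqoop itself prints.
sqoop import \
  --connect jdbc:mysql://localhost:3306/test \
  --username xxx -P \
  --table user1 -m 1 \
  --incremental append \
  --check-column id \
  --last-value 0
```

Sqoop prints the new `--last-value` at the end of the run; you pass that to
the next invocation, or define a saved `sqoop job` so it is tracked for you.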

If the above doesn't work, you can download binary distributions of Sqoop:
http://www.apache.org/dyn/closer.cgi/sqoop/1.4.5. Choose a mirror and
download the distribution marked hadoop-2.0.4-alpha.

If that doesn't work either, you can try one of the many vendors that
provide Hadoop.
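If you want to confirm whether the Hadoop build you are running against still
exposes the method from the stack trace, one quick check (a sketch; it assumes
the `hadoop` command is on your PATH and a JDK provides `javap`) is:

```shell
# Sketch: list HttpConfig's members and look for getSchemePrefix.
# If grep prints nothing, the method is gone from this Hadoop build, and a
# client compiled against an older Hadoop will hit NoSuchMethodError.
javap -classpath "$(hadoop classpath)" org.apache.hadoop.http.HttpConfig \
  | grep getSchemePrefix
```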

-Abe

On Wed, Nov 19, 2014 at 6:45 PM, 呂佳鴻 <chiahung1227@gmail.com> wrote:

> I need to use Sqoop's incremental import.
> But I noticed that Sqoop2 doesn't support it?
> Or can it support incremental import?
>
> Thank you
>
> Luke
>
>
>
>
> On Thu, Nov 20, 2014 at 10:36:37 AM Abraham Elmahrek <abe@cloudera.com>
> wrote:
>
>> Building against a later version of Hadoop is highly discouraged. Certain
>> client jars have changed, which will probably cause issues (likely this one).
>>
>> What's your use case? Maybe we can figure something else out?
>>
>> Also, check out Sqoop2 for a better Java client, if that's what you need.
>>
>> NOTE: Hadoop maintains wire compatibility for major versions. This means
>> that old clients can be used against newer servers as long as they share
>> the same major version number.
>>
>> On Tue, Nov 18, 2014 at 9:27 PM, 呂佳鴻 <chiahung1227@gmail.com> wrote:
>>
>>> Hi, I tried not changing build.xml this time, and I still have the same
>>> problem (I just used "ant package" to build).
>>>
>>> I modified sqoop-env.sh to add HADOOP_COMMON_HOME and
>>> HADOOP_MAPRED_HOME.
>>>
>>> Then I added mysql-connector-java-5.1.34-bin.jar to the Sqoop lib
>>> directory and tried the same command:
>>>
>>> sqoop import --connect jdbc:mysql://localhost:3306/test --username xxx
>>> --password xxxx --table user1 -m 1
>>>
>>> How can I fix this error so that I can run Sqoop on Hadoop 2.4.1?
>>> Is there another solution, or a file for Hadoop 2.4.1 that I can check?
>>>
>>> java.lang.NoSuchMethodError:
>>> org.apache.hadoop.http.HttpConfig.getSchemePrefix()Ljava/lang/String;
>>>
>>>
>>> Thank you!
>>>
>>> Luke
>>>
>>>
>>>
>>>
>>> ----------------------------
>>> The following messages appeared:
>>> Warning: /opt/sqoop-1.4.5/../hcatalog does not exist! HCatalog jobs will
>>> fail.
>>> Please set $HCAT_HOME to the root of your HCatalog installation.
>>> Warning: /opt/sqoop-1.4.5/../accumulo does not exist! Accumulo imports
>>> will fail.
>>> Please set $ACCUMULO_HOME to the root of your Accumulo installation.
>>> Warning: /opt/sqoop-1.4.5/../zookeeper does not exist! Accumulo imports
>>> will fail.
>>> Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
>>> 14/11/19 13:21:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
>>> 14/11/19 13:21:47 WARN tool.BaseSqoopTool: Setting your password on the
>>> command-line is insecure. Consider using -P instead.
>>> 14/11/19 13:21:48 INFO manager.MySQLManager: Preparing to use a MySQL
>>> streaming resultset.
>>> 14/11/19 13:21:48 INFO tool.CodeGenTool: Beginning code generation
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in
>>> [jar:file:/opt/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/opt/hbase-0.98.4/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> [jar:file:/opt/sqoop-1.4.5/build/ivy/lib/sqoop/hadoop200/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> 14/11/19 13:21:48 INFO manager.SqlManager: Executing SQL statement:
>>> SELECT t.* FROM `user1` AS t LIMIT 1
>>> 14/11/19 13:21:48 INFO manager.SqlManager: Executing SQL statement:
>>> SELECT t.* FROM `user1` AS t LIMIT 1
>>> 14/11/19 13:21:48 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
>>> /opt/hadoop-2.4.1/share/hadoop/mapreduce
>>> Note:
>>> /tmp/sqoop-hadoop/compile/4e64e5e221ace9afdb32576604ed074b/user1.java uses
>>> or overrides a deprecated API.
>>> Note: Recompile with -Xlint:deprecation for details.
>>> 14/11/19 13:21:49 INFO orm.CompilationManager: Writing jar file:
>>> /tmp/sqoop-hadoop/compile/4e64e5e221ace9afdb32576604ed074b/user1.jar
>>> 14/11/19 13:21:50 WARN manager.MySQLManager: It looks like you are
>>> importing from mysql.
>>> 14/11/19 13:21:50 WARN manager.MySQLManager: This transfer can be
>>> faster! Use the --direct
>>> 14/11/19 13:21:50 WARN manager.MySQLManager: option to exercise a
>>> MySQL-specific fast path.
>>> 14/11/19 13:21:50 INFO manager.MySQLManager: Setting zero DATETIME
>>> behavior to convertToNull (mysql)
>>> 14/11/19 13:21:50 INFO mapreduce.ImportJobBase: Beginning import of user1
>>> Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library
>>> /opt/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0 which might have disabled
>>> stack guard. The VM will try to fix the stack guard now.
>>> It's highly recommended that you fix the library with 'execstack -c
>>> <libfile>', or link it with '-z noexecstack'.
>>> 14/11/19 13:21:50 WARN util.NativeCodeLoader: Unable to load
>>> native-hadoop library for your platform... using builtin-java classes where
>>> applicable
>>> 14/11/19 13:21:50 INFO Configuration.deprecation: mapred.jar is
>>> deprecated. Instead, use mapreduce.job.jar
>>> 14/11/19 13:21:50 INFO Configuration.deprecation: mapred.map.tasks is
>>> deprecated. Instead, use mapreduce.job.maps
>>> 14/11/19 13:21:50 INFO client.RMProxy: Connecting to ResourceManager at /
>>> 0.0.0.0:8032
>>> 14/11/19 13:21:52 INFO db.DBInputFormat: Using read commited transaction
>>> isolation
>>> 14/11/19 13:21:52 INFO mapreduce.JobSubmitter: number of splits:1
>>> 14/11/19 13:21:52 INFO mapreduce.JobSubmitter: Submitting tokens for
>>> job: job_1416370328669_0005
>>> 14/11/19 13:21:52 INFO impl.YarnClientImpl: Submitted application
>>> application_1416370328669_0005
>>> 14/11/19 13:21:52 INFO mapreduce.JobSubmitter: Cleaning up the staging
>>> area /tmp/hadoop-yarn/staging/hadoop/.staging/job_1416370328669_0005
>>> Exception in thread "main" java.lang.NoSuchMethodError:
>>> org.apache.hadoop.http.HttpConfig.getSchemePrefix()Ljava/lang/String;
>>> at
>>> org.apache.hadoop.mapred.ClientServiceDelegate.getJobStatus(ClientServiceDelegate.java:428)
>>> at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:302)
>>> at
>>> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:430)
>>> at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
>>> at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>> at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
>>> at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
>>> at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
>>> at
>>> org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
>>> at
>>> org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
>>> at
>>> org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:247)
>>> at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:665)
>>> at
>>> org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
>>> at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
>>> at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
>>> at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>> at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
>>> at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
>>> at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
>>> at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
>>> On Tue, Nov 18, 2014 at 1:38:33 AM Abraham Elmahrek <abe@cloudera.com>
>>> wrote:
>>>
>>>> You shouldn't need to change the versions in build.xml. It should just
>>>> work out of the box with any Hadoop 2 distribution.
>>>>
>>>> -Abe
>>>>
>>>> On Mon, Nov 17, 2014 at 2:33 AM, lu <chiahung1227@gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>> I downloaded sqoop-1.4.5.tar.gz and modified build.xml
>>>>> (just changed the Hadoop version to 2.4.1):
>>>>> <equals arg1="${hadoopversion}" arg2="200" />
>>>>>       <then>
>>>>>         <property name="hadoop.version" value="2.4.1" />
>>>>>         <property name="hbase94.version" value="0.94.2" />
>>>>>         <property name="zookeeper.version" value="3.4.2" />
>>>>>         <property name="hadoop.version.full" value="2.4.1" />
>>>>>         <property name="hcatalog.version" value="0.13.0" />
>>>>>         <property name="hbasecompatprofile" value="2" />
>>>>>         <property name="avrohadoopprofile" value="2" />
>>>>>       </then>
>>>>>
>>>>> After building and installing, I tried to import data from MySQL to
>>>>> HDFS:
>>>>> "sqoop import --connect jdbc:mysql://localhost:3306/test --username
>>>>> xxx --password xxxx --table user1 -m 1 --verbose"
>>>>> It shows:
>>>>>
>>>>> *Exception in thread "main" java.lang.NoSuchMethodError:
>>>>> org.apache.hadoop.http.HttpConfig.getSchemePrefix()Ljava/lang/String;*
>>>>>
>>>>> This method seems to have changed in Hadoop 2.4.
>>>>> How can I fix this, and does Sqoop 1.4.5 support Hadoop 2.4.1?
>>>>> I can't find a solution on the web.
>>>>>
>>>>>
>>>>> Thank you!
>>>>>
>>>>> Luke
>>>>>
>>>>
>>>>
>>
