sqoop-user mailing list archives

From mark pasterkamp <markpasterk...@hotmail.com>
Subject Re: sqoop 1.4.7 nullpointer during import
Date Tue, 05 Mar 2019 18:23:49 GMT
Dear Markus,

Thank you for your reply; I think I managed to make it work (albeit with a few quirks).
I was following https://sqoop.apache.org/docs/1.99.7/admin/Installation.html to set up Sqoop,
and somewhere it says that you need to set up a Hadoop proxy user. Since the version I have
is 1.4.7, I thought I might remove this part, and it started to work again.
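
For reference, the proxy-user setup that the 1.99.7 installation guide asks for is a pair of entries in Hadoop's core-site.xml along these lines (the `sqoop2` user name is taken from that guide's example; this is needed by the Sqoop2 server, not by the Sqoop 1.4.7 client, which runs as an ordinary Hadoop client):

```xml
<!-- core-site.xml: proxy-user entries for the Sqoop2 (1.99.x) server.
     The Sqoop 1.4.7 command-line client does not need these. -->
<property>
  <name>hadoop.proxyuser.sqoop2.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.sqoop2.groups</name>
  <value>*</value>
</property>
```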

I am very grateful for the reply and thank you for your comments; they helped give me
a bit more understanding.


From: Markus Kemper <markus@cloudera.com>
Sent: Tuesday, 5 March 2019 14:39
To: user@sqoop.apache.org
Subject: Re: sqoop 1.4.7 nullpointer during import

Hey Mark,

Couple of comments

  *   Generally I would discourage using the dot "." character in params; you will likely
observe mixed behavior
  *   When using --hive-import Sqoop expects
     *   Hive targets (--hive-database and --hive-table)
     *   The --target-dir becomes transient, a "temp" location not the Hive table path

If you are trying to use Hive Import try

sqoop import \
--bindir ./ \
--connect {database_string} \
--username {user} \
--password {password} \
--table {table_name} \
--hive-import \
--hive-database {hive_database} \
--hive-table {hive_table} \
[--hive-overwrite] \
--target-dir /{temporary_staging_dir}/ \
--delete-target-dir

(--hive-overwrite is optional; append is the default behavior.)

If you are trying to bypass Hive Import and write directly to the Hive table path try

sqoop import \
--bindir ./ \
--connect {database_string} \
--username {user} \
--password {password} \
--table {table_name} \
--target-dir /{hive_table_path}/ \
[--delete-target-dir | --append]

Markus Kemper
Cloudera Support

On Tue, Mar 5, 2019 at 4:54 AM mark pasterkamp <markpasterkamp@hotmail.com> wrote:
Hi all,

I have been having a lot of issues importing data from Postgres into Hive.
Using things like "--bindir ./" I have sometimes been able to get around the "Class {table_name}
not found" exception.
I have also had a few runs of the import script complete without exceptions, which then resulted
in my Hive metastore throwing exceptions and forced me to recreate the metastore.

However, my latest issue is a nullpointer while trying to run the script:

at java.util.Objects.requireNonNull(Objects.java:203)
at java.util.Arrays$ArrayList.<init>(Arrays.java:3813)
at java.util.Arrays.asList(Arrays.java:3800)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:76)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListing(FileListing.java:67)
at com.cloudera.sqoop.util.FileListing.getFileListing(FileListing.java:39)
at org.apache.sqoop.orm.CompilationManager.addClassFilesFromDir(CompilationManager.java:293)
at org.apache.sqoop.orm.CompilationManager.jar(CompilationManager.java:378)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:108)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
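
For what it's worth, the top frames of that trace suggest `java.io.File.listFiles()` returning null inside Sqoop's `FileListing`: `listFiles()` returns null (rather than an empty array) when a path does not exist or is not a readable directory, and passing that null into `Arrays.asList()` raises exactly this NullPointerException via `Objects.requireNonNull`. A minimal sketch (the directory path is made up):

```java
import java.io.File;
import java.util.Arrays;
import java.util.List;

public class ListFilesNullDemo {
    public static void main(String[] args) {
        // listFiles() returns null, not an empty array, when the path is
        // missing or unreadable -- e.g. an odd code-gen output directory.
        File missing = new File("/no/such/dir");  // hypothetical path
        File[] entries = missing.listFiles();
        System.out.println(entries == null);  // prints "true"

        // Arrays.asList on a null array hits Objects.requireNonNull,
        // matching the NPE in FileListing.getFileListingNoSort above.
        try {
            List<File> files = Arrays.asList(entries);
            System.out.println(files.size());
        } catch (NullPointerException e) {
            System.out.println("NullPointerException");  // this branch runs
        }
    }
}
```

So one plausible trigger is the code-generation output directory (derived from --bindir / --outdir) not existing or not being readable when Sqoop builds the jar; it may be worth double-checking that --bindir points at an existing, writable directory.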

and the import script:
sqoop import \
--bindir ./ \
--connect {database_string} \
--username {user} \
--password {password} \
--table {table_name} \
--hive-import \
--target-dir /user/hive/warehouse/{table_name}/ \
--delete-target-dir

Looking online, I have not found a cause or solution for this latest issue with Sqoop. Would
any of you perhaps know the cause and the solution?

With kind regards,

