sqoop-dev mailing list archives

From "jarcec@apache.org" <jar...@apache.org>
Subject Re: Run sqoop from .sh script
Date Thu, 10 Apr 2014 14:16:32 GMT
+CC dev@sqoop.apache.org

On Thu, Apr 10, 2014 at 07:52:01AM +0000, Sandipan.Ghosh wrote:
> Hi,
> 
> I am facing a problem running multiple Sqoop commands from a .sh file.
> 
> Here is my .sh file (saved as try_bash_sqoop.sh):
> sqoop --options-file /home_dir/z070061/sqoop_import_param_td.txt --fields-terminated-by '\t' --warehouse-dir ../../user/z070061/UDC_DPCI/ --table inovbidt.UDC_DPCI
> sqoop --options-file /home_dir/z070061/sqoop_import_param_td.txt --fields-terminated-by '\t' --warehouse-dir ../../user/z070061/UDC_STR_DC_MAP/ --table inovbidt.UDC_STR_DC_MAP
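> 
> For reference, the same two imports can be written as a loop that reports each command's exit status, which makes it easier to see which import fails (a sketch assuming the same options file and paths; the loop variable name is arbitrary):
> 
>     #!/bin/bash
>     # Run each import in turn and report its exit status.
>     for tbl in UDC_DPCI UDC_STR_DC_MAP; do
>         sqoop --options-file /home_dir/z070061/sqoop_import_param_td.txt \
>               --fields-terminated-by '\t' \
>               --warehouse-dir "../../user/z070061/$tbl/" \
>               --table "inovbidt.$tbl"
>         echo "inovbidt.$tbl import exited with status $?"
>     done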
> 
> When I run it with the command below:
> bash try_bash_sqoop.sh
> 
> it runs only the last sqoop command successfully; the first one fails. If I run the commands separately, both run fine without any error.
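> 
> One way to narrow this down is to run the script with bash tracing enabled, so each command is printed as it executes and the output is captured to a file (plain bash, nothing Sqoop-specific):
> 
>     bash -x try_bash_sqoop.sh 2>&1 | tee sqoop_run.log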
> Attached are the script and the log file (copied from the screen output).
> Thanks
> Sandipan
> 
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
> Sent: Tuesday, April 08, 2014 7:29 PM
> To: user@sqoop.apache.org
> Subject: Re: Run sqoop from .sh script
> 
> Hi Sandipan,
> I feel that it should be pretty straightforward to run Sqoop from a shell script. Are you seeing any trouble with that?
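> 
> Something along these lines should work (a minimal sketch; the JDBC URL, table names, and target directories are placeholders, not values from a real cluster):
> 
>     #!/bin/bash
>     # run_imports.sh: several Sqoop imports executed one after another
>     sqoop import --connect jdbc:mysql://dbhost/mydb --table orders --target-dir /user/me/orders
>     sqoop import --connect jdbc:mysql://dbhost/mydb --table customers --target-dir /user/me/customers
> 
> Save it, make it executable, and run it:
> 
>     chmod +x run_imports.sh
>     ./run_imports.sh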
> 
> Jarcec
> 
> On Tue, Apr 08, 2014 at 10:51:19AM +0000, Sandipan.Ghosh wrote:
> > 
> > Hi,
> > 
> > I want to save multiple Sqoop commands into a .sh file and then execute it from the bash shell.
> > 
> > How do I do that?
> > 
> > Thanks
> > Sandipan

> bash-4.1$ bash try_bash_sqoop.sh
> Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
> Please set $HCAT_HOME to the root of your HCatalog installation.
> 14/04/10 00:58:05 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.5.0
> 14/04/10 00:58:06 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> 14/04/10 00:58:06 INFO manager.SqlManager: Using default fetchSize of 1000
> 14/04/10 00:58:06 INFO tool.CodeGenTool: Beginning code generation
> 14/04/10 00:58:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM inovbidt.UDC_DPCI AS t WHERE 1=0
> 14/04/10 00:58:07 ERROR tool.ImportTool: Imported Failed: Attempted to generate class with no columns!
> Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
> Please set $HCAT_HOME to the root of your HCatalog installation.
> 14/04/10 00:58:08 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.5.0
> 14/04/10 00:58:08 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> 14/04/10 00:58:08 INFO manager.SqlManager: Using default fetchSize of 1000
> 14/04/10 00:58:08 INFO tool.CodeGenTool: Beginning code generation
> 14/04/10 00:58:33 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM inovbidt.UDC_STR_DC_MAP AS t WHERE 1=0
> 14/04/10 00:58:34 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /apps/tdp/software/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/bin/../lib/hadoop-0.20-mapreduce
> 14/04/10 00:58:34 INFO orm.CompilationManager: Found hadoop core jar at: /apps/tdp/software/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/bin/../lib/hadoop-0.20-mapreduce/hadoop-core.jar
> Note: /tmp/sqoop-z070061/compile/289b7772769910039ed315db9a7557a8/inovbidt_UDC_STR_DC_MAP.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 14/04/10 00:58:35 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-z070061/compile/289b7772769910039ed315db9a7557a8/inovbidt.UDC_STR_DC_MAP.jar
> 14/04/10 00:58:35 INFO teradata.TeradataManager: Beginning Teradata import
> 14/04/10 00:58:36 INFO util.TeradataUtil: JDBC URL used by Teradata manager: jdbc:teradata://10.67.192.180/DATABASE=INOVBIDT,LOGMECH=LDAP,TYPE=FASTEXPORT
> 14/04/10 00:58:36 INFO mapreduce.ImportJobBase: Beginning import of inovbidt.UDC_STR_DC_MAP
> 14/04/10 00:58:36 INFO util.TeradataUtil: Current database used by Teradata manager: INOVBIDT
> 14/04/10 00:58:36 INFO imports.TeradataInputFormat: Staging table is turned OFF
> 14/04/10 00:58:37 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> 14/04/10 00:58:39 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN("UDC"), MAX("UDC") FROM inovbidt.UDC_STR_DC_MAP
> 14/04/10 00:58:39 INFO mapred.JobClient: Running job: job_201403121312_6012
> 14/04/10 00:58:40 INFO mapred.JobClient:  map 0% reduce 0%
> 14/04/10 00:58:56 INFO mapred.JobClient:  map 75% reduce 0%
> 14/04/10 00:58:59 INFO mapred.JobClient:  map 100% reduce 0%
> 14/04/10 00:59:00 INFO mapred.JobClient: Job complete: job_201403121312_6012
> 14/04/10 00:59:00 INFO mapred.JobClient: Counters: 23
> 14/04/10 00:59:00 INFO mapred.JobClient:   File System Counters
> 14/04/10 00:59:00 INFO mapred.JobClient:     FILE: Number of bytes read=0
> 14/04/10 00:59:00 INFO mapred.JobClient:     FILE: Number of bytes written=831512
> 14/04/10 00:59:00 INFO mapred.JobClient:     FILE: Number of read operations=0
> 14/04/10 00:59:00 INFO mapred.JobClient:     FILE: Number of large read operations=0
> 14/04/10 00:59:00 INFO mapred.JobClient:     FILE: Number of write operations=0
> 14/04/10 00:59:00 INFO mapred.JobClient:     HDFS: Number of bytes read=425
> 14/04/10 00:59:00 INFO mapred.JobClient:     HDFS: Number of bytes written=52946
> 14/04/10 00:59:00 INFO mapred.JobClient:     HDFS: Number of read operations=4
> 14/04/10 00:59:00 INFO mapred.JobClient:     HDFS: Number of large read operations=0
> 14/04/10 00:59:00 INFO mapred.JobClient:     HDFS: Number of write operations=4
> 14/04/10 00:59:00 INFO mapred.JobClient:   Job Counters
> 14/04/10 00:59:00 INFO mapred.JobClient:     Launched map tasks=4
> 14/04/10 00:59:00 INFO mapred.JobClient:     Total time spent by all maps in occupied slots (ms)=48401
> 14/04/10 00:59:00 INFO mapred.JobClient:     Total time spent by all reduces in occupied slots (ms)=0
> 14/04/10 00:59:00 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 14/04/10 00:59:00 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 14/04/10 00:59:00 INFO mapred.JobClient:   Map-Reduce Framework
> 14/04/10 00:59:00 INFO mapred.JobClient:     Map input records=1883
> 14/04/10 00:59:00 INFO mapred.JobClient:     Map output records=1883
> 14/04/10 00:59:00 INFO mapred.JobClient:     Input split bytes=425
> 14/04/10 00:59:00 INFO mapred.JobClient:     Spilled Records=0
> 14/04/10 00:59:00 INFO mapred.JobClient:     CPU time spent (ms)=13040
> 14/04/10 00:59:00 INFO mapred.JobClient:     Physical memory (bytes) snapshot=1226002432
> 14/04/10 00:59:00 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=19536457728
> 14/04/10 00:59:00 INFO mapred.JobClient:     Total committed heap usage (bytes)=8232108032
> 14/04/10 00:59:00 INFO mapreduce.ImportJobBase: Transferred 51.7051 KB in 24.0544 seconds (2.1495 KB/sec)
> 14/04/10 00:59:00 INFO mapreduce.ImportJobBase: Retrieved 1883 records.
> bash-4.1$


