sqoop-dev mailing list archives

From SHANKAR REDDY <sankara.teluku...@gmail.com>
Subject Re: Error : java.lang.ArithmeticException: / by zero while importing data from MySQL and running Sqoop2 Job
Date Tue, 02 Jun 2015 00:34:38 GMT
Abe,
It works, and I was able to start the job successfully.


Kind Regards,
Sankara Telukutla


On Mon, Jun 1, 2015 at 9:08 AM, Abraham Elmahrek <abe@cloudera.com> wrote:

> Seems like there's a bug in the logic here:
>
> https://github.com/apache/sqoop/blob/branch-1.99.5/connector/connector-generic-jdbc/src/main/java/org/apache/sqoop/connector/jdbc/GenericJdbcPartitioner.java#L77
>
> The issue is that when null values are allowed for the partition column,
> 1 is subtracted from the configured number of max extractors. If you set
> the number of max extractors to 1, that value becomes 0, and the
> partitioner then divides the column range by it. This isn't hard to fix;
> we just need to account for this edge case in the code. I created
> https://issues.apache.org/jira/browse/SQOOP-2382 to track this.
>
> To get around the issue, try increasing the number of extractors by 1.
>
> -Abe
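
The arithmetic Abe points to reduces to a few lines. Below is a minimal
Java sketch of the failure mode and one possible guard; the identifier
names (partitionIntervalFor, numberPartitions) and the exact boundary math
are assumptions for illustration, not the actual GenericJdbcPartitioner
source or the fix committed under SQOOP-2382.

    // Simplified reconstruction of the partitioning logic described above.
    public class PartitionerSketch {

        // numberPartitions corresponds to the configured max extractors.
        static long partitionIntervalFor(long min, long max,
                                         int maxExtractors,
                                         boolean nullValueAllowed) {
            int numberPartitions = maxExtractors;
            if (nullValueAllowed) {
                // One partition is reserved for rows whose partition column
                // is NULL, so the range is split across one fewer partition.
                numberPartitions -= 1;  // with maxExtractors == 1 this is 0
            }
            // Hypothetical guard for the SQOOP-2382 edge case: never allow
            // the divisor to reach zero.
            if (numberPartitions < 1) {
                numberPartitions = 1;
            }
            // Without the guard, this is where the / by zero occurred.
            return (max - min) / numberPartitions;
        }

        public static void main(String[] args) {
            // Extractors: 1 with a nullable partition column, as in the job
            // shown further down the thread.
            System.out.println(partitionIntervalFor(0, 1000, 1, true));
        }
    }

With the unguarded version, this configuration divides by zero; raising
the extractors to 2 (Abe's workaround) leaves one partition for the column
range after the NULL partition is reserved.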
>
> On Sun, May 31, 2015 at 7:59 AM, SHANKAR REDDY <sankara.telukutla@gmail.com> wrote:
>
> > Hi Abe,
> > Hi Abe,
> > I was running Sqoop from Hue, which per your document means it is
> > Sqoop2, and I saw that the logs were updating.
> >
> > sqoop:000> show job -jid 2
> > 1 job(s) to show:
> > Job with id 2 and name Test Job-copy (Enabled: true, Created by null at 5/21/15 8:45 AM, Updated by null at 5/30/15 4:26 PM)
> > Using link id 1 and Connector id 4
> >   From database configuration
> >     Schema name: clp_sandbox
> >     Table name: HADOOP_TEST
> >     Table SQL statement:
> >     Table column names:
> >     Partition column name: EMPLOYEE_ID
> >     Null value allowed for the partition column: true
> >     Boundary query:
> >   Throttling resources
> >     Extractors: 1
> >     Loaders: 1
> >   ToJob configuration
> >     Override null value:
> >     Null value:
> >     Output format: SEQUENCE_FILE
> >     Compression format: NONE
> >     Custom compression format:
> >     Output directory: /test
> >
> >
> > Please let me know if you need any information.
> >
> >
> > Kind Regards,
> > Sankara Telukutla
> >
> >
> > On Sat, May 30, 2015 at 8:30 PM, Abraham Elmahrek <abe@cloudera.com> wrote:
> >
> > > Hey man,
> > >
> > > I'd be careful not to mix up Sqoop1 and Sqoop2. Check out
> > > http://ingest.tips/2014/10/21/sqoop-1-or-sqoop-2/ for more details on
> > > that.
> > >
> > > It seems like the partitioner might have a bug in it. Could you send
> > > the details of "show job --jid <job id>"?
> > >
> > > -Abe
> > >
> > > On Sun, May 31, 2015 at 7:22 AM, SHANKAR REDDY <sankara.telukutla@gmail.com> wrote:
> > >
> > > > And I got the version below after running sqoop version.
> > > >
> > > > ubuntu@ip-172-31-1-201:~$ sqoop version
> > > > Warning: /opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
> > > > Please set $ACCUMULO_HOME to the root of your Accumulo installation.
> > > > 15/05/31 02:19:57 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.2
> > > > Sqoop 1.4.5-cdh5.4.2
> > > > git commit id
> > > > Compiled by  on Tue May 19 16:54:41 PDT 2015
> > > >
> > > >
> > > > Kind Regards,
> > > > Sankara Telukutla
> > > > +1 510 936 0999
> > > >
> > > > On Sat, May 30, 2015 at 12:43 PM, SHANKAR REDDY <sankara.telukutla@gmail.com> wrote:
> > > >
> > > > > Hi Abe,
> > > > > I am using 5.4.2-1.cdh5.4.2.p0.2
> > > > >
> > > > > Is there any way to find the version number directly from this
> > > > > distribution?
> > > > >
> > > > > Thanks,
> > > > > Shankar
> > > > >
> > > > >
> > > > > On Friday, May 29, 2015, Abraham Elmahrek <abe@cloudera.com> wrote:
> > > > >
> > > > >> What version of Sqoop2 are you using?
> > > > >>
> > > > >> On Fri, May 29, 2015 at 1:52 AM, SHANKAR REDDY <sankara.telukutla@gmail.com> wrote:
> > > > >>
> > > > >> > Team,
> > > > >> > I am facing the below error while running a Sqoop2 job that
> > > > >> > imports data from MySQL. Please let me know if you need any
> > > > >> > more information.
> > > > >> > 2015-05-29 08:46:44,773 INFO org.apache.sqoop.repository.JdbcRepositoryTransaction: Attempting transaction commit
> > > > >> > 2015-05-29 08:46:45,025 ERROR org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine: Error in submitting job
> > > > >> > java.lang.ArithmeticException: / by zero
> > > > >> >         at org.apache.sqoop.connector.jdbc.GenericJdbcPartitioner.partitionIntegerColumn(GenericJdbcPartitioner.java:317)
> > > > >> >         at org.apache.sqoop.connector.jdbc.GenericJdbcPartitioner.getPartitions(GenericJdbcPartitioner.java:86)
> > > > >> >         at org.apache.sqoop.connector.jdbc.GenericJdbcPartitioner.getPartitions(GenericJdbcPartitioner.java:38)
> > > > >> >         at org.apache.sqoop.job.mr.SqoopInputFormat.getSplits(SqoopInputFormat.java:74)
> > > > >> >         at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
> > > > >> >         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
> > > > >> >         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
> > > > >> >         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
> > > > >> >         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
> > > > >> >         at java.security.AccessController.doPrivileged(Native Method)
> > > > >> >         at javax.security.auth.Subject.doAs(Subject.java:415)
> > > > >> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
> > > > >> >         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
> > > > >> >         at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submitToCluster(MapreduceSubmissionEngine.java:274)
> > > > >> >         at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submit(MapreduceSubmissionEngine.java:255)
> > > > >> >         at org.apache.sqoop.driver.JobManager.start(JobManager.java:288)
> > > > >> >         at org.apache.sqoop.handler.JobRequestHandler.startJob(JobRequestHandler.java:379)
> > > > >> >         at org.apache.sqoop.handler.JobRequestHandler.handleEvent(JobRequestHandler.java:115)
> > > > >> >         at org.apache.sqoop.server.v1.JobServlet.handlePutRequest(JobServlet.java:96)
> > > > >> >         at org.apache.sqoop.server.SqoopProtocolServlet.doPut(SqoopProtocolServlet.java:79)
> > > > >> >         at javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
> > > > >> >         at javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
> > > > >> >         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
> > > > >> >         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
> > > > >> >         at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
> > > > >> >         at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.doFilter(DelegationTokenAuthenticationFilter.java:277)
> > > > >> >         at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:555)
> > > > >> >         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
> > > > >> >         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
> > > > >> >         at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
> > > > >> >         at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
> > > > >> >         at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
> > > > >> >         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> > > > >> >         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
> > > > >> >         at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
> > > > >> >         at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
> > > > >> >         at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:620)
> > > > >> >         at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
> > > > >> >         at java.lang.Thread.run(Thread.java:745)
> > > > >> > 2015-05-29 08:46:45,025 INFO org.apache.sqoop.connector.jdbc.GenericJdbcFromDestroyer: Running generic JDBC connector destroyer
> > > > >> > 2015-05-29 08:46:45,027 INFO org.apache.sqoop.repository.JdbcRepositoryTransaction: Attempting transaction commit
> > > > >> >
> > > > >> >
> > > > >> > - Shankar
> > > > >> >
> > > > >>
> > > > >
> > > > >
> > > > > --
> > > > > Regards,
> > > > > Sankara Reddy Telukutla
> > > > >
> > > >
> > >
> >
>
