sqoop-user mailing list archives

From Andy Srine <andy.sr...@gmail.com>
Subject Re: Fwd: Class invariant violation
Date Wed, 02 Apr 2014 21:28:44 GMT
Thanks Abe. That was the problem and it works now with FQDN.
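In other words, the fix was to point the connection string at the database
host by its fully qualified name rather than localhost, for example (where
db01.example.com is a placeholder FQDN, not the actual host from this thread):

    JDBC Connection String: jdbc:mysql://db01.example.com/test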




On Wed, Mar 26, 2014 at 4:29 PM, Abraham Elmahrek <abe@cloudera.com> wrote:

> Hey there,
>
> Your job has a problem in its JDBC connection string
> (jdbc:mysql://localhost/test). With localhost, each node resolves a
> different machine when Sqoop parallelizes the process. Does the machine
> your MySQL instance runs on have an FQDN on the network? You should use
> that instead.
>
> -Abe
>
>
> On Wed, Mar 26, 2014 at 4:24 PM, Andy Srine <andy.srine@gmail.com> wrote:
>
>> Thanks Jarek. I got further, but now the job itself fails. I am testing
>> with MySQL, and the problem looks like it is with the connection. The
>> sqoop connection/job details are below and the log is attached. Please
>> let me know if you spot any clues.
>>
>> sqoop:000> show connection -all
>>
>> 1 connection(s) to show:
>>
>> Connection with id 1 and name First Connection (Enabled: true, Created by
>> andy at 3/26/14 10:23 PM, Updated by andy at 3/26/14 10:23 PM)
>>
>> Using Connector id 1
>>
>>   Connection configuration
>>
>>     JDBC Driver Class: com.mysql.jdbc.Driver
>>
>>     JDBC Connection String: jdbc:mysql://localhost/test
>>
>>     Username:
>>
>>     Password:
>>
>>     JDBC Connection Properties:
>>
>>   Security related configuration options
>>
>>     Max connections: 0
>>
>>
>>
>> sqoop:000> show job -all
>>
>> 1 job(s) to show:
>>
>> Job with id 1 and name First Job (Enabled: true, Created by andy at
>> 3/26/14 10:26 PM, Updated by andy at 3/26/14 10:26 PM)
>>
>> Using Connection id 1 and Connector id 1
>>
>>   Database configuration
>>
>>     Schema name: test
>>
>>     Table name: test_table
>>
>>     Table SQL statement:
>>
>>     Table column names:
>>
>>     Partition column name: id
>>
>>     Nulls in partition column:
>>
>>     Boundary query:
>>
>>   Output configuration
>>
>>     Storage type: HDFS
>>
>>     Output format: TEXT_FILE
>>
>>     Compression format: NONE
>>
>>     Output directory: /user/andy/sqoop
>>
>>   Throttling resources
>>
>>     Extractors:
>>
>>     Loaders:
>>
>>
>> Thanks,
>> Andy
>>
>>
>>
>> On Sat, Mar 22, 2014 at 9:37 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
>>
>>> Did you properly configure the path to your Hadoop configuration files
>>> (and the file sqoop.properties) as described in:
>>>
>>> http://sqoop.apache.org/docs/1.99.3/Installation.html#configuring-server
>>>
>>> Sqoop will run all jobs on LocalJobRunner if the proper Hadoop configs
>>> are missing. By default, LocalJobRunner's "start job" operation is
>>> blocking, which might explain the behaviour you are seeing.
>>>
>>> Jarcec
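For reference, the setting the linked guide describes lives in
conf/sqoop.properties; one sketch, where the Hadoop config path
(/etc/hadoop/conf/) is an assumption about your layout:

    # Point the MapReduce submission engine at the cluster's Hadoop config
    # so jobs run on the cluster rather than on LocalJobRunner.
    org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/etc/hadoop/conf/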
>>>
>>> On Fri, Mar 21, 2014 at 07:17:32PM -0700, Andy Srine wrote:
>>> > Thanks Jarek. I added some paths to the libraries and now it works
>>> > better. But after creating a connection and a job, "start job --jid 1"
>>> > just hangs. I think it is still some kind of Hadoop issue, but there
>>> > are no clues except the following lines repeating in a loop in
>>> > sqoop.log.
>>> >
>>> > 2014-03-21 19:08:59,218 DEBUG repository.JdbcRepositoryTransaction
>>> > [org.apache.sqoop.repository.JdbcRepositoryTransaction.commit(JdbcRepositoryTransaction.java:84)]
>>> > Tx count-commit: 1, rollback: false
>>> >
>>> > 2014-03-21 19:08:59,218 DEBUG repository.JdbcRepositoryTransaction
>>> > [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:103)]
>>> > Tx count-close: 0, rollback: false
>>> >
>>> > 2014-03-21 19:08:59,218 INFO  repository.JdbcRepositoryTransaction
>>> > [org.apache.sqoop.repository.JdbcRepositoryTransaction.close(JdbcRepositoryTransaction.java:111)]
>>> > Attempting transaction commit
>>> >
>>> > 2014-03-21 19:09:56,510 TRACE core.PropertiesConfigurationProvider
>>> > [org.apache.sqoop.core.PropertiesConfigurationProvider$ConfigFilePoller.run(PropertiesConfigurationProvider.java:172)]
>>> > Checking config file for changes: server/bin/../conf/sqoop.properties
>>> >
>>> > 2014-03-21 19:10:56,511 TRACE core.PropertiesConfigurationProvider
>>> > [org.apache.sqoop.core.PropertiesConfigurationProvider$ConfigFilePoller.run(PropertiesConfigurationProvider.java:172)]
>>> > Checking config file for changes: server/bin/../conf/sqoop.properties
>>> >
>>> >
>>> >
>>> >
>>> > On Thu, Mar 20, 2014 at 8:09 PM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
>>> >
>>> > > Yes, it actually does - it seems that you do not have all the
>>> > > dependencies on the classpath. Did you correctly configure the
>>> > > common.loader?
>>> > >
>>> > >
>>> > >
>>> http://sqoop.apache.org/docs/1.99.3/Installation.html#installing-dependencies
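As a sketch of what the linked guide describes, common.loader in
server/conf/catalina.properties would be extended with the Hadoop jar
directories; the /usr/lib/hadoop paths below are assumptions about the
install layout, not values from this thread:

    common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,\
    ${catalina.home}/lib,${catalina.home}/lib/*.jar,\
    /usr/lib/hadoop/*.jar,/usr/lib/hadoop/lib/*.jar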
>>> > >
>>> > > Jarcec
>>> > >
>>> > > On Thu, Mar 20, 2014 at 05:24:46PM -0700, Andy Srine wrote:
>>> > > > Thanks Vasanth and Jarek. Removing the jar doesn't seem to work,
>>> > > > but I do think it's the Hadoop setup on my laptop causing issues
>>> > > > for Sqoop. I am trying to run the Hadoop 2 binaries and they seem
>>> > > > to work, but they also warn me "Not a native build." The only
>>> > > > other error message I see is in the localhost.log (below). Not
>>> > > > sure if this confirms it's a Hadoop issue or gives you guys more
>>> > > > clues?
>>> > > >
>>> > > > Mar 20, 2014 4:33:50 PM org.apache.catalina.core.StandardContext
>>> > > > listenerStart
>>> > > >
>>> > > > SEVERE: Exception sending context initialized event to listener
>>> > > > instance of class org.apache.sqoop.server.ServerInitializer
>>> > > >
>>> > > > java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
>>> > > >   at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:165)
>>> > > >   at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.initialize(MapreduceSubmissionEngine.java:78)
>>> > > >   at org.apache.sqoop.framework.JobManager.initialize(JobManager.java:215)
>>> > > >   at org.apache.sqoop.core.SqoopServer.initialize(SqoopServer.java:53)
>>> > > >   at org.apache.sqoop.server.ServerInitializer.contextInitialized(ServerInitializer.java:36)
>>> > > >   at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4206)
>>> > > >   at org.apache.catalina.core.StandardContext.start(StandardContext.java:4705)
>>> > > >   at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
>>> > > >   at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
>>> > > >   at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
>>> > > >   at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
>>> > > >   at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
>>> > > >   at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
>>> > > >   at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
>>> > > >   at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
>>> > > >   at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
>>> > > >   at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
>>> > > >   at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
>>> > > >   at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
>>> > > >   at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
>>> > > >   at org.apache.catalina.core.StandardService.start(StandardService.java:525)
>>> > > >   at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
>>> > > >
>>> > > >
>>> > > > --
>>> > > > Thanks,
>>> > > > Andy
>>> > > >
>>> > > > On Thu, Mar 20, 2014 at 1:30 PM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
>>> > > >
>>> > > > > Hi Andy,
>>> > > > > the class invariant is a known problem with log4j on Tomcat and
>>> > > > > is nothing you have to worry about. I would also recommend
>>> > > > > checking the Catalina log file (usually named catalina.$date.log).
>>> > > > >
>>> > > > > Based on sqoop.log, my guess is that you are missing either the
>>> > > > > Hadoop libraries or the proper Hadoop configuration. You might
>>> > > > > want to check that. You should be able to find a proper exception
>>> > > > > containing more details in one of the logs.
>>> > > > >
>>> > > > > Jarcec
>>> > > > >
>>> > > > > On Wed, Mar 19, 2014 at 05:47:18PM -0700, Andy Srine wrote:
>>> > > > > > Hi Guys,
>>> > > > > >
>>> > > > > > I am new to Sqoop and am stuck at the 5-minute demo. This is
>>> > > > > > with sqoop-1.99.3-bin-hadoop200 on a Mac. The error I see is
>>> > > > > > in catalina.out. I searched the web, and the suggested workaround
>>> > > > > > ("org.apache.catalina.loader.WebappClassLoader.ENABLE_CLEAR_REFERENCES
>>> > > > > > = false") doesn't seem to help. The sqoop.log has no errors,
>>> > > > > > but says "shutting down". Any ideas?
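One common way to apply that workaround is as a JVM system property passed
to Tomcat, e.g. in server/bin/setenv.sh (the file location is an assumption
about this Sqoop bundle's layout):

    # Disable Tomcat's clearReferences checks on webapp stop (workaround sketch).
    export CATALINA_OPTS="$CATALINA_OPTS -Dorg.apache.catalina.loader.WebappClassLoader.ENABLE_CLEAR_REFERENCES=false"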
>>> > > > > >
>>> > > > > >
>>> > > > > > SQOOP.LOG
>>> > > > > >
>>> > > > > > ---------
>>> > > > > >
>>> > > > > > 2014-03-18 18:25:04,796 INFO  framework.FrameworkManager
>>> > > > > > [org.apache.sqoop.framework.FrameworkManager.initialize(FrameworkManager.java:159)]
>>> > > > > > Submission manager initialized: OK
>>> > > > > >
>>> > > > > > 2014-03-18 18:25:04,807 INFO  mapreduce.MapreduceSubmissionEngine
>>> > > > > > [org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.initialize(MapreduceSubmissionEngine.java:75)]
>>> > > > > > Initializing Map-reduce Submission Engine
>>> > > > > >
>>> > > > > > 2014-03-18 18:25:04,846 INFO  core.SqoopServer
>>> > > > > > [org.apache.sqoop.core.SqoopServer.destroy(SqoopServer.java:35)]
>>> > > > > > Shutting down Sqoop server
>>> > > > > >
>>> > > > > >
>>> > > > > > CATALINA.OUT
>>> > > > > >
>>> > > > > > ------------
>>> > > > > >
>>> > > > > > Mar 18, 2014 6:25:04 PM org.apache.catalina.core.StandardContext start
>>> > > > > >
>>> > > > > > SEVERE: Error listenerStart
>>> > > > > >
>>> > > > > > Mar 18, 2014 6:25:04 PM org.apache.catalina.core.StandardContext start
>>> > > > > >
>>> > > > > > SEVERE: Context [/sqoop] startup failed due to previous errors
>>> > > > > >
>>> > > > > > Mar 18, 2014 6:25:04 PM org.apache.catalina.loader.WebappClassLoader
>>> > > > > > clearReferencesJdbc
>>> > > > > >
>>> > > > > > SEVERE: The web application [/sqoop] registered the JDBC driver
>>> > > > > > [org.apache.derby.jdbc.AutoloadedDriver40] but failed to
>>> > > > > > unregister it when the web application was stopped. To prevent a
>>> > > > > > memory leak, the JDBC Driver has been forcibly unregistered.
>>> > > > > >
>>> > > > > > Mar 18, 2014 6:25:04 PM org.apache.catalina.loader.WebappClassLoader
>>> > > > > > clearReferencesThreads
>>> > > > > >
>>> > > > > > SEVERE: The web application [/sqoop] appears to have started a
>>> > > > > > thread named [sqoop-config-file-poller] but has failed to stop
>>> > > > > > it. This is very likely to create a memory leak.
>>> > > > > >
>>> > > > > > Mar 18, 2014 6:25:04 PM org.apache.catalina.loader.WebappClassLoader
>>> > > > > > checkThreadLocalMapForLeaks
>>> > > > > >
>>> > > > > > SEVERE: The web application [/sqoop] created a ThreadLocal with
>>> > > > > > key of type [java.lang.ThreadLocal] (value
>>> > > > > > [java.lang.ThreadLocal@7b24c107]) and a value of type
>>> > > > > > [org.apache.derby.iapi.services.context.ContextManager] (value
>>> > > > > > [org.apache.derby.iapi.services.context.ContextManager@63c39669])
>>> > > > > > but failed to remove it when the web application was stopped.
>>> > > > > > This is very likely to create a memory leak.
>>> > > > > >
>>> > > > > > Mar 18, 2014 6:25:04 PM org.apache.catalina.loader.WebappClassLoader
>>> > > > > > checkThreadLocalMapForLeaks
>>> > > > > >
>>> > > > > > SEVERE: The web application [/sqoop] created a ThreadLocal with
>>> > > > > > key of type [java.lang.ThreadLocal] (value
>>> > > > > > [java.lang.ThreadLocal@7b24c107]) and a value of type
>>> > > > > > [org.apache.derby.iapi.services.context.ContextManager] (value
>>> > > > > > [org.apache.derby.iapi.services.context.ContextManager@2887d605])
>>> > > > > > but failed to remove it when the web application was stopped.
>>> > > > > > This is very likely to create a memory leak.
>>> > > > > >
>>> > > > > > log4j: log4j called after unloading, see
>>> > > > > > http://logging.apache.org/log4j/1.2/faq.html#unload.
>>> > > > > >
>>> > > > > > java.lang.IllegalStateException: Class invariant violation
>>> > > > > >   at org.apache.log4j.LogManager.getLoggerRepository(LogManager.java:199)
>>> > > > > >   at org.apache.log4j.LogManager.getLogger(LogManager.java:228)
>>> > > > > >   at org.apache.log4j.Logger.getLogger(Logger.java:117)
>>> > > > > >   at org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer.<clinit>(GenericJdbcImportInitializer.java:42)
>>> > > > > >   at sun.misc.Unsafe.ensureClassInitialized
>>> > > > > >
>>> > > > > >
>>> > > > > >
>>> > > > > > Thanks,
>>> > > > > >
>>> > > > > > Andy
>>> > > > >
>>> > >
>>> >
>>> >
>>> >
>>> > --
>>> > Thanks,
>>> > Andy
>>>
>>
>>
>>
>> --
>> Thanks,
>> Andy
>>
>>
>


-- 
Thanks,
Andy
