Can't get Sqoop 1.99.3 working with Apache Hadoop 2.4.0 on 64 bit Centos 6.5

Posted: 2014-05-11 14:28:47

I'm running on a Centos 6.5 KVM virtual server and have installed Apache hadoop. It's installed in

/home/hduser/yarn/hadoop-2.4.0 and the config files are in /home/hduser/yarn/hadoop-2.4.0/etc/hadoop.

I was getting complaints from hadoop about the libraries being 32 bit (I guess the binary install ships with those by default), so I did a full source build to get 64 bit libraries. But it seems sqoop 1.99.3 won't pick up the hadoop jars anyway..(?)

This appears to be the main error, and it seems to be a popular one too, but I can't find any suggestions that work. addtowar.sh doesn't exist in my sqoop installation.

    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
        at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1680)

Sqoop is in /home/hduser/sqoop-1.99.3-bin-hadoop200, and in catalina.properties:-

common.loader=$catalina.base/lib,$catalina.base/lib/*.jar,$catalina.home/lib,$catalina.home/lib/*.jar,$catalina.home/../lib/*.jar,$HADOOP_PREFIX/share/hadoop/common/*.jar,$HADOOP_PREFIX/share/hadoop/mapreduce/*.jar


    vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/conf>echo $HADOOP_PREFIX
    /home/hduser/yarn/hadoop-2.4.0

I run ./sqoop.sh server start..

Sqoop home directory: /home/hduser/sqoop-1.99.3-bin-hadoop200
Setting SQOOP_HTTP_PORT:     12000
Setting SQOOP_ADMIN_PORT:     12001
Using   CATALINA_OPTS:       
Adding to CATALINA_OPTS:    -Dsqoop.http.port=12000 -Dsqoop.admin.port=12001
Using CATALINA_BASE:   /home/hduser/sqoop-1.99.3-bin-hadoop200/server
Using CATALINA_HOME:   /home/hduser/sqoop-1.99.3-bin-hadoop200/server
Using CATALINA_TMPDIR: /home/hduser/sqoop-1.99.3-bin-hadoop200/server/temp
Using JRE_HOME:        /usr/java/jdk1.7.0_15
Using CLASSPATH:       /home/hduser/sqoop-1.99.3-bin-hadoop200/server/bin/bootstrap.jar
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/bin>

    vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/bin>netstat -aln | grep 12000
    tcp        0      0 0.0.0.0:12000               0.0.0.0:*                   LISTEN      
    vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/bin>

sqoop.war is deployed to webapps/sqoop

/lib:
total 4092
-rw-r--r-- 1 hduser hadoop  160519 Oct 15  2013 commons-dbcp-1.4.jar
-rw-r--r-- 1 hduser hadoop  279193 Oct 15  2013 commons-lang-2.5.jar
-rw-r--r-- 1 hduser hadoop   96221 Oct 15  2013 commons-pool-1.5.4.jar
-rw-r--r-- 1 hduser hadoop    6734 Oct 18  2013 connector-sdk-1.99.3.jar
-rw-r--r-- 1 hduser hadoop 2671577 Oct 15  2013 derby-10.8.2.2.jar
-rw-r--r-- 1 hduser hadoop   16046 Oct 15  2013 json-simple-1.1.jar
-rw-r--r-- 1 hduser hadoop  481535 Oct 15  2013 log4j-1.2.16.jar
-rw-r--r-- 1 hduser hadoop  130387 Oct 18  2013 sqoop-common-1.99.3.jar
-rw-r--r-- 1 hduser hadoop   51382 Oct 18  2013 sqoop-connector-generic-jdbc-1.99.3.jar
-rw-r--r-- 1 hduser hadoop  119652 Oct 18  2013 sqoop-core-1.99.3.jar
-rw-r--r-- 1 hduser hadoop   70692 Oct 18  2013 sqoop-execution-mapreduce-1.99.3-hadoop200.jar
-rw-r--r-- 1 hduser hadoop   41462 Oct 18  2013 sqoop-repository-derby-1.99.3.jar
-rw-r--r-- 1 hduser hadoop   16156 Oct 18  2013 sqoop-spi-1.99.3.jar
-rw-r--r-- 1 hduser hadoop   16590 Oct 18  2013 sqoop-submission-mapreduce-1.99.3-hadoop200.jar
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/webapps/sqoop/WEB-INF>

And then the logs:-

    vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>ls -l
    total 24
    -rw-r--r-- 1 hduser hadoop 3766 May 11 10:15 catalina.2014-05-11.log
    -rw-r--r-- 1 hduser hadoop 8629 May 11 10:15 catalina.out
    -rw-r--r-- 1 hduser hadoop    0 May 11 10:15 host-manager.2014-05-11.log
    -rw-r--r-- 1 hduser hadoop 5032 May 11 10:15 localhost.2014-05-11.log
    -rw-r--r-- 1 hduser hadoop    0 May 11 10:15 manager.2014-05-11.log
    vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>

-------------localhost*.log --------------

vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>cat localhost.2014-05-11.log
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext listenerStart
SEVERE: Exception sending context initialized event to listener instance of class org.apache.sqoop.server.ServerInitializer
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
    at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.initialize(MapreduceSubmissionEngine.java:78)
    at org.apache.sqoop.framework.JobManager.initialize(JobManager.java:215)
    at org.apache.sqoop.core.SqoopServer.initialize(SqoopServer.java:53)
    at org.apache.sqoop.server.ServerInitializer.contextInitialized(ServerInitializer.java:36)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4206)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4705)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
    at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
    at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
    at org.apache.catalina.core.StandardService.start(StandardService.java:525)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1680)
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1526)
    ... 28 more

May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext listenerStop
SEVERE: Exception sending context destroyed event to listener instance of class org.apache.sqoop.server.ServerInitializer
java.lang.NullPointerException
    at org.apache.sqoop.framework.JobManager.destroy(JobManager.java:176)
    at org.apache.sqoop.core.SqoopServer.destroy(SqoopServer.java:36)
    at org.apache.sqoop.server.ServerInitializer.contextDestroyed(ServerInitializer.java:32)
    at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4245)
    at org.apache.catalina.core.StandardContext.stop(StandardContext.java:4886)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4750)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
    at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
    at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
    at org.apache.catalina.core.StandardService.start(StandardService.java:525)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)


-----------------------catalina log ----------------------------

vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>cat catalina.2014-05-11.log
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/lib], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/$HADOOP_PREFIX/share/hadoop/common], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/$HADOOP_PREFIX/share/hadoop/mapreduce], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
May 11, 2014 10:15:54 AM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-12000
May 11, 2014 10:15:54 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 399 ms
May 11, 2014 10:15:54 AM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
May 11, 2014 10:15:54 AM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.36
May 11, 2014 10:15:54 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive sqoop.war
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext start
SEVERE: Error listenerStart
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext start
SEVERE: Context [/sqoop] startup failed due to previous errors
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
SEVERE: The web application [/sqoop] registered the JDBC driver [org.apache.derby.jdbc.AutoloadedDriver40] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/sqoop] appears to have started a thread named [sqoop-config-file-poller] but has failed to stop it. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/sqoop] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@3f782da8]) and a value of type [org.apache.derby.iapi.services.context.ContextManager] (value [org.apache.derby.iapi.services.context.ContextManager@6495dc5a]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/sqoop] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@3f782da8]) and a value of type [org.apache.derby.iapi.services.context.ContextManager] (value [org.apache.derby.iapi.services.context.ContextManager@3e8a0821]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
May 11, 2014 10:15:56 AM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-12000
May 11, 2014 10:15:56 AM org.apache.catalina.startup.Catalina start
INFO: Server startup in 1656 ms
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>
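The ClassLoaderFactory warnings at the top of this log are the key clue: Tomcat does not expand shell environment variables such as $HADOOP_PREFIX in common.loader (it only resolves ${catalina.base}-style properties), so the literal string got appended to the webapp path and none of the hadoop jars were loaded. A sketch of the workaround, expanding the variable into catalina.properties yourself (the path is the one from the question, the demo file stands in for server/conf/catalina.properties):

```shell
# Tomcat treats $HADOOP_PREFIX in common.loader as a literal string,
# so substitute the expanded absolute path into the file ourselves.
HADOOP_PREFIX=/home/hduser/yarn/hadoop-2.4.0   # value from the question
PROPS=catalina.properties                      # really server/conf/catalina.properties

# demo line so the sketch is self-contained
printf 'common.loader=$HADOOP_PREFIX/share/hadoop/common/*.jar\n' > "$PROPS"

# substitute the variable with its value ('|' delimiter avoids escaping '/')
sed -i "s|\$HADOOP_PREFIX|$HADOOP_PREFIX|g" "$PROPS"

cat "$PROPS"
```

After this, restarting the server makes Tomcat scan real directories instead of `/…/sqoop-1.99.3-bin-hadoop200/$HADOOP_PREFIX/…`.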

--------------------- catalina.out -------------------------

vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>cat catalina.out
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/lib], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/$HADOOP_PREFIX/share/hadoop/common], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.startup.ClassLoaderFactory validateFile
WARNING: Problem with directory [/home/hduser/sqoop-1.99.3-bin-hadoop200/$HADOOP_PREFIX/share/hadoop/mapreduce], exists: [false], isDirectory: [false], canRead: [false]
May 11, 2014 10:15:54 AM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
May 11, 2014 10:15:54 AM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-12000
May 11, 2014 10:15:54 AM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 399 ms
May 11, 2014 10:15:54 AM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
May 11, 2014 10:15:54 AM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.36
May 11, 2014 10:15:54 AM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deploying web application archive sqoop.war
log4j:WARN No appenders could be found for logger (org.apache.sqoop.core.SqoopServer).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
log4j: Parsing for [root] with value=[WARN, file].
log4j: Level token is [WARN].
log4j: Category root set to WARN
log4j: Parsing appender named "file".
log4j: Parsing layout options for "file".
log4j: Setting property [conversionPattern] to [%dISO8601 %-5p %c2 [%l] %m%n].
log4j: End of parsing for "file".
log4j: Setting property [file] to [@LOGDIR@/sqoop.log].
log4j: Setting property [maxBackupIndex] to [5].
log4j: Setting property [maxFileSize] to [25MB].
log4j: setFile called: @LOGDIR@/sqoop.log, true
log4j: setFile ended
log4j: Parsed "file" options.
log4j: Parsing for [org.apache.sqoop] with value=[DEBUG].
log4j: Level token is [DEBUG].
log4j: Category org.apache.sqoop set to DEBUG
log4j: Handling log4j.additivity.org.apache.sqoop=[null]
log4j: Parsing for [org.apache.derby] with value=[INFO].
log4j: Level token is [INFO].
log4j: Category org.apache.derby set to INFO
log4j: Handling log4j.additivity.org.apache.derby=[null]
log4j: Finished configuring.
log4j: Could not find root logger information. Is this OK?
log4j: Parsing for [default] with value=[INFO,defaultAppender].
log4j: Level token is [INFO].
log4j: Category default set to INFO
log4j: Parsing appender named "defaultAppender".
log4j: Parsing layout options for "defaultAppender".
log4j: Setting property [conversionPattern] to [%d %-5p %c: %m%n].
log4j: End of parsing for "defaultAppender".
log4j: Setting property [file] to [@LOGDIR@/default.audit].
log4j: setFile called: @LOGDIR@/default.audit, true
log4j: setFile ended
log4j: Parsed "defaultAppender" options.
log4j: Handling log4j.additivity.default=[null]
log4j: Finished configuring.
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext start
SEVERE: Error listenerStart
May 11, 2014 10:15:56 AM org.apache.catalina.core.StandardContext start
SEVERE: Context [/sqoop] startup failed due to previous errors
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
SEVERE: The web application [/sqoop] registered the JDBC driver [org.apache.derby.jdbc.AutoloadedDriver40] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/sqoop] appears to have started a thread named [sqoop-config-file-poller] but has failed to stop it. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/sqoop] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@3f782da8]) and a value of type [org.apache.derby.iapi.services.context.ContextManager] (value [org.apache.derby.iapi.services.context.ContextManager@6495dc5a]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
May 11, 2014 10:15:56 AM org.apache.catalina.loader.WebappClassLoader checkThreadLocalMapForLeaks
SEVERE: The web application [/sqoop] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@3f782da8]) and a value of type [org.apache.derby.iapi.services.context.ContextManager] (value [org.apache.derby.iapi.services.context.ContextManager@3e8a0821]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
log4j: log4j called after unloading, see http://logging.apache.org/log4j/1.2/faq.html#unload.
java.lang.IllegalStateException: Class invariant violation
    at org.apache.log4j.LogManager.getLoggerRepository(LogManager.java:199)
    at org.apache.log4j.LogManager.getLogger(LogManager.java:228)
    at org.apache.log4j.Logger.getLogger(Logger.java:117)
    at org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer.<clinit>(GenericJdbcImportInitializer.java:42)
    at sun.misc.Unsafe.ensureClassInitialized(Native Method)
    at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
    at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:140)
    at java.lang.reflect.Field.acquireFieldAccessor(Field.java:949)
    at java.lang.reflect.Field.getFieldAccessor(Field.java:930)
    at java.lang.reflect.Field.get(Field.java:372)
    at org.apache.catalina.loader.WebappClassLoader.clearReferencesStaticFinal(WebappClassLoader.java:2066)
    at org.apache.catalina.loader.WebappClassLoader.clearReferences(WebappClassLoader.java:1929)
    at org.apache.catalina.loader.WebappClassLoader.stop(WebappClassLoader.java:1833)
    at org.apache.catalina.loader.WebappLoader.stop(WebappLoader.java:740)
    at org.apache.catalina.core.StandardContext.stop(StandardContext.java:4920)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4750)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
    at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
    at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
    at org.apache.catalina.core.StandardService.start(StandardService.java:525)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
May 11, 2014 10:15:56 AM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
May 11, 2014 10:15:56 AM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-12000
May 11, 2014 10:15:56 AM org.apache.catalina.startup.Catalina start
INFO: Server startup in 1656 ms
vmcentos01:/home/hduser/sqoop-1.99.3-bin-hadoop200/server/logs>

Question comments:

Answer 1:

I ran into the same problem you did, and I found that you need to include all of the jars in the share/hadoop folder of your hadoop installation. One example is that you had included share/hadoop/mapreduce/*.jar but missed share/hadoop/mapreduce/lib/*.jar one level deeper. Here is the common.loader setting that worked for me (just swap in your own hadoop location prefix):

common.loader=$catalina.base/lib,$catalina.base/lib/*.jar,$catalina.home/lib,$catalina.home/lib/*.jar,$catalina.home/../lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/common/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/common/lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/hdfs/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/hdfs/lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/mapreduce/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/mapreduce/lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/tools/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/tools/lib/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/yarn/*.jar,/localdisk/hadoop-2.4.0/share/hadoop/yarn/lib/*.jar
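That long value can be generated rather than hand-edited; a small sketch (assuming the same share/hadoop layout) that builds the same common.loader line from a single prefix:

```shell
# Build the common.loader value above from one hadoop prefix.
HADOOP=/localdisk/hadoop-2.4.0   # swap in your own prefix

# single quotes keep the $catalina.* placeholders literal for Tomcat
LOADER='common.loader=$catalina.base/lib,$catalina.base/lib/*.jar,$catalina.home/lib,$catalina.home/lib/*.jar,$catalina.home/../lib/*.jar'
for d in common hdfs mapreduce tools yarn; do
  LOADER="$LOADER,$HADOOP/share/hadoop/$d/*.jar,$HADOOP/share/hadoop/$d/lib/*.jar"
done

echo "$LOADER"
```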

Comments:

This fixed my problem with sqoop2 1.99.6 and Hadoop 2.7.2. I added the Hive jars to the end, e.g. ... ,/usr/local/hive/lib/*.jar . It prints the Verification was successful. message, but with JDBCREPO_0009:Failed to finalize transaction . I don't think this JDBC-related part matters for verification purposes.

Answer 2:

I had the same problem with Hadoop 2.4.0. Have you tried adding all the hadoop jars (simply all the jars present in the folders share/hadoop/*) to the sqoop lib folder?

That's what finally got my sqoop server running (on an earlier version, though). It's not the cleanest way to fix this, but at least you can run sqoop.

Maybe it will solve your problem too.
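A sketch of that brute-force copy, flattening every hadoop jar (including the nested lib/ directories) into the sqoop server's lib folder. Both paths default to throwaway demo directories so the sketch runs anywhere; with a real install you would use something like /home/hduser/yarn/hadoop-2.4.0 and /home/hduser/sqoop-1.99.3-bin-hadoop200/server/lib:

```shell
# Brute-force fix: copy every jar under share/hadoop into sqoop's lib dir.
HADOOP_PREFIX=${HADOOP_PREFIX:-$(mktemp -d)}
SQOOP_LIB=${SQOOP_LIB:-$(mktemp -d)}

# demo payload so the sketch is self-contained; a real hadoop tree has these
mkdir -p "$HADOOP_PREFIX/share/hadoop/mapreduce/lib"
touch "$HADOOP_PREFIX/share/hadoop/mapreduce/lib/demo-0.1.jar"

# cp -n: never overwrite jars already present (skips the interactive prompts)
find "$HADOOP_PREFIX/share/hadoop" -name '*.jar' -exec cp -n {} "$SQOOP_LIB" \;
ls "$SQOOP_LIB"
```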

Comments:

No, I'd been avoiding that - but now is the time to try it - no matter what I do with it, common.loader doesn't work - thanks

I added every jar I could find to the catalina server/lib directory - it complained about log4j and some others missing - added those too - and now I'm back to the original ClassNotFoundException: org.apache.hadoop.conf.Configuration

Hi, sorry for the delay - I had to put this problem aside - there was a production issue. It wasn't working when I left it. I expect to be frustrated by it again soon..

Answer 3:

Using sqoop2 1.99.6 and Hadoop-2.7.1, I was hit by the same failure:

[root@some_server sqoop2]# sqoop2-tool verify
Setting conf dir: /sw/apache/sqoop2/bin/../conf
Sqoop home directory: /sw/apache/sqoop2
Sqoop tool executor:
    Version: 2.0.0-SNAPSHOT
    Revision: 81778c37a413eb64aa38ffc397af2ca695909013
    Compiled on Thu Dec 10 22:20:08 WAT 2015 by root
Running tool: class org.apache.sqoop.tools.tool.VerifyTool
0    [main] INFO  org.apache.sqoop.core.SqoopServer  - Initializing Sqoop server.
17   [main] INFO  org.apache.sqoop.core.PropertiesConfigurationProvider  -    Starting config file poller thread
Exception in thread "main" java.lang.NoClassDefFoundError:   org/apache/hadoop/conf/Configuration
at org.apache.sqoop.security.authentication.SimpleAuthenticationHandler.secureLogin(SimpleAuthenticationHandler.java:36)
at org.apache.sqoop.security.AuthenticationManager.initialize(AuthenticationManager.java:98)
at org.apache.sqoop.core.SqoopServer.initialize(SqoopServer.java:55)
at org.apache.sqoop.tools.tool.VerifyTool.runTool(VerifyTool.java:36)
at org.apache.sqoop.tools.ToolRunner.main(ToolRunner.java:72)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

The solution: I tried the "common.loader" property approach, but for some reason I couldn't get it right. In the end I copied all my hadoop jar files from $HADOOP_HOME/share/hadoop and $HADOOP_HOME/share/hadoop*/lib/ into $SQOOP2_HOME/server/lib.

[root@vggdw01 sqoop2]# cp  $HADOOP_HOME/share/hadoop/*.jar  server/lib/
cp: overwrite ‘server/lib/commons-cli-1.2.jar’? n
cp: overwrite ‘server/lib/commons-io-2.4.jar’? n
cp: overwrite ‘server/lib/commons-logging-1.1.3.jar’? n
cp: overwrite ‘server/lib/guava-11.0.2.jar’? n
cp: overwrite ‘server/lib/jackson-core-asl-1.9.13.jar’? n
cp: overwrite ‘server/lib/jackson-mapper-asl-1.9.13.jar’? n
cp: overwrite ‘server/lib/paranamer-2.3.jar’? n
cp: overwrite ‘server/lib/zookeeper-3.4.6.jar’? n

As shown above, I chose not to overwrite some jar files that were already there.

I wonder if it's a Tomcat thing... so I had to resort to these tactics.

Comments:

Answer 4:

I'm on Hadoop 2.6.2 and Sqoop 1.99.6, and did the following:

 edit $SQOOP_HOME/server/conf/catalina.properties
 modified common.loader and set the paths to my hadoop and hive dir's  

(or) the lazy fix (as sudo) -

ln -s $HADOOP_HOME/share/hadoop/common /usr/lib/hadoop 
ln -s $HADOOP_HOME/share/hadoop/hdfs /usr/lib/hadoop-hdfs 
ln -s $HADOOP_HOME/share/hadoop/mapreduce /usr/lib/hadoop-mapreduce 
ln -s $HADOOP_HOME/share/hadoop/yarn /usr/lib/hadoop-yarn 
mkdir -p /usr/lib/hive
chmod 775 /usr/lib/hive
ln -s $HIVE_HOME/lib /usr/lib/hive/lib 

Also, modify the org.apache.sqoop.submission.engine.mapreduce.configuration.directory property in $SQOOP_HOME/server/conf/sqoop.properties and set it to the appropriate hadoop conf directory.
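That edit can be scripted; a sketch, where the hadoop conf path is an assumption and the demo file stands in for a real $SQOOP_HOME/server/conf/sqoop.properties:

```shell
# Point sqoop at the hadoop conf directory in sqoop.properties.
SQOOP_HOME=${SQOOP_HOME:-$(mktemp -d)}
HADOOP_CONF=/usr/local/hadoop/etc/hadoop   # assumed location
PROP=org.apache.sqoop.submission.engine.mapreduce.configuration.directory

# demo file so the sketch is self-contained; a real install already has it
mkdir -p "$SQOOP_HOME/server/conf"
echo "$PROP=/etc/hadoop/conf/" > "$SQOOP_HOME/server/conf/sqoop.properties"

# rewrite the property to the hadoop conf directory
sed -i "s|^$PROP=.*|$PROP=$HADOOP_CONF/|" "$SQOOP_HOME/server/conf/sqoop.properties"
grep "$PROP" "$SQOOP_HOME/server/conf/sqoop.properties"
```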

Hope this helps.

Comments:

Answer 5:

Since this problem comes down to sqoop not being able to find the hadoop configuration and libraries automatically, I pointed it at the absolute paths of those libraries.

Below is my .profile; after this, everything passed verification with: sqoop2-tool verify


#SQOOP settings
export SQOOP_HOME=/usr/lib/sqoop
export SQOOP_CONF_DIR=$SQOOP_HOME/conf
export PATH=$PATH:$SQOOP_HOME/bin
SQOOP_SERVER_EXTRA_LIB=/var/lib/sqoop2

#Hadoop settings
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=/usr/local/hadoop/share/hadoop/mapreduce
export HADOOP_COMMON_HOME=/usr/local/hadoop/share/hadoop/common
export HADOOP_HDFS_HOME=/usr/local/hadoop/share/hadoop/hdfs
export YARN_HOME=/usr/local/hadoop/share/hadoop/yarn
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
#export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
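Every path in this .profile is load-bearing, so a quick sanity check that each directory actually exists can save time before blaming sqoop. A sketch (the default prefix matches the answer's layout):

```shell
# Print ok/MISSING for each directory the .profile points at.
check() {
  for d in "$@"; do
    if [ -d "$d" ]; then echo "ok      $d"; else echo "MISSING $d"; fi
  done
}

HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}
check "$HADOOP_HOME/share/hadoop/mapreduce" \
      "$HADOOP_HOME/share/hadoop/common" \
      "$HADOOP_HOME/share/hadoop/hdfs" \
      "$HADOOP_HOME/share/hadoop/yarn" \
      "$HADOOP_HOME/lib/native"
```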

Comments:
