Karaf OSGi camel-hdfs2

I'm having a tough time getting the camel-hdfs2 component to function as expected in a Karaf 4.0 OSGi container. It's a very simple Camel route that reads files from HDFS and simply writes each file name to a new file in /tmp.

I've got it to work outside of the Karaf OSGi container just by running the main method (included below), but when I try to start it up in Karaf, I get:

java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.LocalFileSystem not found
 at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1882)
 at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2298)
 at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2311)
 at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:90)
 at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2350)
 at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2332)
 at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:369)
 at cas.example.camel_hdfs.LocalRouteBuilder.start(LocalRouteBuilder.java:83)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)[:1.8.0_51]
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)[:1.8.0_51]
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.8.0_51]
 at java.lang.reflect.Method.invoke(Method.java:497)[:1.8.0_51]
 at org.apache.felix.scr.impl.helper.BaseMethod.invokeMethod(BaseMethod.java:231)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.helper.BaseMethod.access$500(BaseMethod.java:39)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.helper.BaseMethod$Resolved.invoke(BaseMethod.java:624)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.helper.BaseMethod.invoke(BaseMethod.java:508)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.helper.ActivateMethod.invoke(ActivateMethod.java:149)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.SingleComponentManager.createImplementationObject(SingleComponentManager.java:315)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.SingleComponentManager.createComponent(SingleComponentManager.java:127)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.SingleComponentManager.getService(SingleComponentManager.java:871)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.SingleComponentManager.getServiceInternal(SingleComponentManager.java:838)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.AbstractComponentManager.activateInternal(AbstractComponentManager.java:850)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.manager.AbstractComponentManager.enable(AbstractComponentManager.java:419)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.config.ConfigurableComponentHolder.enableComponents(ConfigurableComponentHolder.java:376)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.BundleComponentActivator.initialize(BundleComponentActivator.java:172)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.BundleComponentActivator.<init>(BundleComponentActivator.java:120)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.Activator.loadComponents(Activator.java:258)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.Activator.access$000(Activator.java:45)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.scr.impl.Activator$ScrExtension.start(Activator.java:185)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.utils.extender.AbstractExtender.createExtension(AbstractExtender.java:259)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.utils.extender.AbstractExtender.modifiedBundle(AbstractExtender.java:232)[23:org.apache.felix.scr:1.8.2]
 at org.osgi.util.tracker.BundleTracker$Tracked.customizerModified(BundleTracker.java:479)[23:org.apache.felix.scr:1.8.2]
 at org.osgi.util.tracker.BundleTracker$Tracked.customizerModified(BundleTracker.java:414)[23:org.apache.felix.scr:1.8.2]
 at org.osgi.util.tracker.AbstractTracked.track(AbstractTracked.java:232)[23:org.apache.felix.scr:1.8.2]
 at org.osgi.util.tracker.BundleTracker$Tracked.bundleChanged(BundleTracker.java:443)[23:org.apache.felix.scr:1.8.2]
 at org.apache.felix.framework.util.EventDispatcher.invokeBundleListenerCallback(EventDispatcher.java:913)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.util.EventDispatcher.fireEventImmediately(EventDispatcher.java:834)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.util.EventDispatcher.fireBundleEvent(EventDispatcher.java:516)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.Felix.fireBundleEvent(Felix.java:4544)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.Felix.startBundle(Felix.java:2166)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:977)[org.apache.felix.framework-5.0.1.jar:]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundle(DirectoryWatcher.java:1245)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundles(DirectoryWatcher.java:1217)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.startAllBundles(DirectoryWatcher.java:1207)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.doProcess(DirectoryWatcher.java:504)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.process(DirectoryWatcher.java:358)[4:org.apache.felix.fileinstall:3.5.0]
 at org.apache.felix.fileinstall.internal.DirectoryWatcher.run(DirectoryWatcher.java:310)[4:org.apache.felix.fileinstall:3.5.0]
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.LocalFileSystem not found
 at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1788)
 at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1880)
... 46 more

I know the class is available at runtime (bundle 59 was mine). In that bundle I'm defining a Camel route in a RouteBuilder (class shown below) that makes use of the camel-hdfs2 component:

 karaf@root()> list 59
 START LEVEL 100 , List Threshold: 50
 ID | State  | Lvl | Version        | Name
 ---------------------------------------------------
 59 | Active |  80 | 0.0.1.SNAPSHOT | cas-camel-hdfs
 karaf@root()> bundle:classes 59 | grep LocalFileSystem
 org/apache/hadoop/fs/LocalFileSystem.class
 org/apache/hadoop/fs/LocalFileSystemConfigKeys.class
 org/apache/hadoop/fs/RawLocalFileSystem$1.class
 org/apache/hadoop/fs/RawLocalFileSystem$DeprecatedRawLocalFileStatus.class
 org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileInputStream.class
 org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileOutputStream.class
 org/apache/hadoop/fs/RawLocalFileSystem.class
 karaf@root()> 

Here is my RouteBuilder/Activator:

package cas.example.camel_hdfs;

import java.net.URI;

import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.osgi.framework.BundleContext;

import aQute.bnd.annotation.component.Activate;
import aQute.bnd.annotation.component.Component;
import aQute.bnd.annotation.component.Deactivate;

@Component
public class LocalRouteBuilder extends RouteBuilder {

    private final String hdfsHost;
    private final String path;
    private static final String MARKED_SUFFIX = "ingested";

    /**
     * If running in OSGi...
     */
    private CamelContext cContext = null;

    public LocalRouteBuilder() {
        this("10.10.1.20", "/user/cloud-user/cas-docs", "cloud-user");
    }

    /**
     * If you use this constructor, make sure the HADOOP_USER_NAME is set via a
     * jvm property.
     * 
     * @param hdfsHost
     * @param path
     */
    public LocalRouteBuilder(final String hdfsHost, final String path) {
        this(hdfsHost, path, null);
    }

    /**
     * 
     * @param hdfsHost
     * @param path
     * @param userName
     */
    public LocalRouteBuilder(final String hdfsHost, final String path, final String userName) {
        this.cContext = this.getContext();
        this.hdfsHost = hdfsHost;
        this.path = path;
        if (userName != null) {
            System.setProperty("HADOOP_USER_NAME", userName);
        }
    }

    /**
     * {@inheritDoc}
     */
    @Override
    public void configure() throws Exception {

        from("hdfs2://" + hdfsHost + path + "?delay=5000&chunkSize=4096&connectOnStartup=true&readSuffix=" + MARKED_SUFFIX)

        .setBody(simple(path + "/${header[CamelFileName]}." + MARKED_SUFFIX))

        .to("log:cas.example.camel_hdfs.BasicRouteBuilder")

        .to("file:/tmp/RECEIVED")

        .stop().end();

    }

    @Activate
    public void start(BundleContext context) throws Exception {
        Configuration conf = new Configuration();
        conf.setClass("fs.file.impl", LocalFileSystem.class, FileSystem.class);
        conf.setClass("fs.hdfs.impl", DistributedFileSystem.class, FileSystem.class);
        FileSystem.get(URI.create("file:///"), conf);
        FileSystem.get(URI.create("hdfs://10.10.1.20:9000/"), conf);

        if (cContext != null) {
            cContext.stop();
            cContext = null;
        }
        // cContext = new OsgiDefaultCamelContext(context);
        cContext.addRoutes(this);
        cContext.start();
        cContext.startAllRoutes();
    }

    @Deactivate
    public void stop(BundleContext context) throws Exception {
        System.out.println("Stopping hdfs camel bundle");
        if (cContext != null) {
            cContext.stop();
            cContext = null;
        }
    }

    public static void main(String[] args) {
        try {
            Main m = new Main();
            m.addRouteBuilder(new LocalRouteBuilder("10.10.1.20", "/user/cloud-user/cas-docs", "cloud-user"));
            m.enableHangupSupport();
            m.enableTrace();
            m.run();
        } catch (Exception e) {
            e.printStackTrace();
            System.exit(-1);
        }
    }

}

Just in case it helps, here is the bundle list:

karaf@root()> list
START LEVEL 100 , List Threshold: 50
 ID | State    | Lvl | Version            | Name
----------------------------------------------------------------------------------------------
 58 | Active   |  80 | 0.0.1.SNAPSHOT     | karaf-feature-export
 59 | Active   |  80 | 0.0.1.SNAPSHOT     | cas-camel-hdfs
 60 | Active   |  80 | 2.4.0.201411031534 | bndlib
 61 | Active   |  80 | 2.15.2             | camel-blueprint
 62 | Active   |  80 | 2.15.2             | camel-catalog
 63 | Active   |  80 | 2.15.2             | camel-commands-core
 64 | Active   |  80 | 2.15.2             | camel-core
 65 | Active   |  80 | 2.15.2             | camel-spring
 66 | Active   |  80 | 2.15.2             | camel-karaf-commands
 67 | Active   |  80 | 1.1.1              | geronimo-jta_1.1_spec
 72 | Active   |  80 | 2.2.6.1            | Apache ServiceMix :: Bundles :: jaxb-impl
 84 | Active   |  80 | 3.1.4              | Stax2 API
 85 | Active   |  80 | 4.4.1              | Woodstox XML-processor
 86 | Active   |  80 | 2.15.2             | camel-core-osgi
 87 | Active   |  80 | 18.0.0             | Guava: Google Core Libraries for Java
 88 | Active   |  80 | 2.6.1              | Protocol Buffer Java API
 89 | Active   |  80 | 1.9.12             | Jackson JSON processor
 90 | Active   |  80 | 1.9.12             | Data mapper for Jackson JSON processor
 91 | Active   |  80 | 2.15.2             | camel-hdfs2
 92 | Active   |  80 | 1.2                | Commons CLI
 93 | Active   |  80 | 1.10.0             | Apache Commons Codec
 94 | Active   |  80 | 3.2.1              | Commons Collections
 95 | Active   |  80 | 1.5.0              | Commons Compress
 96 | Active   |  80 | 1.9.0              | Commons Configuration
 97 | Active   |  80 | 2.4.0              | Commons IO
 98 | Active   |  80 | 2.6                | Commons Lang
 99 | Active   |  80 | 3.3.0              | Apache Commons Math
100 | Active   |  80 | 3.3.0              | Commons Net
101 | Active   |  80 | 3.4.6              | ZooKeeper Bundle
102 | Active   |  80 | 1.7.7.1            | Apache ServiceMix :: Bundles :: avro
103 | Active   |  80 | 3.1.0.7            | Apache ServiceMix :: Bundles :: commons-httpclient
104 | Active   |  80 | 3.0.0.1            | Apache ServiceMix :: Bundles :: guice
105 | Active   |  80 | 2.3.0.2            | Apache ServiceMix :: Bundles :: hadoop-client
106 | Active   |  80 | 0.1.51.1           | Apache ServiceMix :: Bundles :: jsch
107 | Active   |  80 | 2.6.0.1            | Apache ServiceMix :: Bundles :: paranamer
108 | Active   |  80 | 0.52.0.1           | Apache ServiceMix :: Bundles :: xmlenc
109 | Active   |  80 | 1.2.0.5            | Apache ServiceMix :: Bundles :: xmlresolver
110 | Active   |  80 | 3.9.6.Final        | Netty
111 | Resolved |  80 | 1.1.0.1            | Snappy for Java
karaf@root()>

Thanks for your help!

-Ben

EDIT:

So, I added the bundle headers for my custom bundle to this post (I did a karaf clean, so the bundle ID changed from 59 to 109).

karaf@root()> bundle:headers 109

cas-camel-hdfs (109)
--------------------
Bnd-LastModified = 1440904390702
Build-Jdk = 1.8.0_51
Built-By = bdgould
Created-By = Apache Maven Bundle Plugin
Manifest-Version = 1.0
Service-Component = OSGI-INF/cas.example.camel_hdfs.Hdfs2RouteBuilder.xml,OSGI-INF/cas.example.camel_hdfs.SimpleRouteBuilder.xml
Tool = Bnd-2.4.1.201501161923

Bundle-ManifestVersion = 2
Bundle-Name = cas-camel-hdfs
Bundle-SymbolicName = com.inovexcorp.cas_cas-camel-hdfs
Bundle-Version = 0.0.1.SNAPSHOT

Require-Capability = 
    osgi.ee;filter:=(&(osgi.ee=JavaSE)(version=1.8))

DynamicImport-Package = 
    *
Export-Package = 
    cas.example.camel_hdfs;uses:="org.apache.camel.builder,org.osgi.framework";version=0.0.1.SNAPSHOT
Import-Package = 
    org.apache.camel;version="[2.15,3)",
    org.apache.camel.builder;version="[2.15,3)",
    org.apache.camel.main;version="[2.15,3)",
    org.apache.camel.model;version="[2.15,3)",
    org.apache.hadoop.conf,
    org.apache.hadoop.fs,
    org.apache.hadoop.hdfs,
    org.osgi.framework;version="[1.6,2)",
    org.apache.camel.component.hdfs2;version="[2.15,3)"

I'm still not sure why it can't find the LocalFileSystem class, as it's definitely exported from:

102 | Active    |  80 | 2.3.0.2            | Apache ServiceMix :: Bundles :: hadoop-client

This is the hadoop bundle installed as part of the camel-hdfs2 feature.
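
For reference, the lookup that fails seems to happen inside Hadoop's Configuration, which (as far as I can tell) resolves the fs.file.impl value by class name through its own class loader, defaulting to the thread context class loader. A minimal sketch of a possible workaround, assuming the Hadoop 2.x Configuration API and using the same imports as the class above, would be to point the Configuration at a class loader that can actually see the Hadoop classes:

    // Sketch only: hand Hadoop's Configuration a class loader that can see the
    // hadoop-client classes before asking for a FileSystem. Whether this is the
    // right fix for the OSGi wiring problem is an assumption on my part.
    Configuration conf = new Configuration();
    conf.setClassLoader(LocalRouteBuilder.class.getClassLoader());
    conf.setClass("fs.file.impl", LocalFileSystem.class, FileSystem.class);
    conf.setClass("fs.hdfs.impl", DistributedFileSystem.class, FileSystem.class);
    FileSystem local = FileSystem.get(URI.create("file:///"), conf);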

EDIT 2: Hmm, I'm actually not sure why bundle:classes is showing me all of those classes. I just opened up my JAR, and I'm seeing this:

" zip.vim version v27
" Browsing zipfile /opt/apache-karaf-4.0.1/deploy/cas-camel-hdfs-0.0.1-SNAPSHOT.jar
" Select a file with cursor and press ENTER

META-INF/MANIFEST.MF
META-INF/
META-INF/maven/
META-INF/maven/com.inovexcorp.cas/
META-INF/maven/com.inovexcorp.cas/cas-camel-hdfs/
META-INF/maven/com.inovexcorp.cas/cas-camel-hdfs/pom.properties
META-INF/maven/com.inovexcorp.cas/cas-camel-hdfs/pom.xml
META-INF/services/
META-INF/services/org.apache.hadoop.fs.FileSystem
OSGI-INF/
OSGI-INF/cas.example.camel_hdfs.Hdfs2RouteBuilder.xml
OSGI-INF/cas.example.camel_hdfs.SimpleRouteBuilder.xml
cas/
cas/example/
cas/example/camel_hdfs/
cas/example/camel_hdfs/Hdfs2RouteBuilder.class
cas/example/camel_hdfs/SimpleRouteBuilder.class
core-default.xml
hdfs-default.xml
log4j.xml

The classes listed by Karaf don't seem to match what's actually in the JAR (but maybe it's showing the classes my bundle references?). Here is my POM, just in case it helps:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.inovexcorp.cas</groupId>
    <artifactId>cas-camel-hdfs</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>Example Camel-HDFS Integration</name>
    <packaging>bundle</packaging>

    <properties>
        <project.build.sourceEncoding>UTF8</project.build.sourceEncoding>
        <camel.version>2.15.2</camel.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-core</artifactId>
            <version>${camel.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-hdfs2</artifactId>
            <version>${camel.version}</version>
            <scope>provided</scope>
        </dependency><!-- <dependency> <groupId>org.apache.camel</groupId> <artifactId>camel-core-osgi</artifactId> 
            <version>${camel.version}</version> </dependency> -->
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.ops4j.pax.logging</groupId>
            <artifactId>pax-logging-api</artifactId>
            <version>1.7.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>biz.aQute.bnd</groupId>
            <artifactId>bndlib</artifactId>
            <version>2.3.0</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>


    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.3</version>
                <configuration>
                    <!-- http://maven.apache.org/plugins/maven-compiler-plugin/ -->
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.felix</groupId>
                <artifactId>maven-bundle-plugin</artifactId>
                <extensions>true</extensions>
                <configuration>
                    <instructions>
                        <Bundle-SymbolicName>${project.groupId}_${project.artifactId}</Bundle-SymbolicName>
                        <Bundle-Name>${project.artifactId}</Bundle-Name>
                        <Bundle-Version>${project.version}</Bundle-Version>
                        <Import-Package>org.apache.camel.component.hdfs2,*;resolution:=required</Import-Package>
                        <Service-Component>*</Service-Component>
                    </instructions>
                </configuration>
            </plugin>
        </plugins>
    </build>

</project>

Looks like a classic Import-Package/Export-Package issue. Does your bundle 59, which obviously contains the class in question, actually export it?

Export-Package: org.apache.hadoop.fs

After the update: obviously you import the right packages, but according to your listing

59 | Active |  80 | 0.0.1.SNAPSHOT | cas-camel-hdfs
karaf@root()> bundle:classes 59 | grep LocalFileSystem
 org/apache/hadoop/fs/LocalFileSystem.class
 org/apache/hadoop/fs/LocalFileSystemConfigKeys.class
 org/apache/hadoop/fs/RawLocalFileSystem$1.class
 org/apache/hadoop/fs/RawLocalFileSystem$DeprecatedRawLocalFileStatus.class
 org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileInputStream.class
 org/apache/hadoop/fs/RawLocalFileSystem$LocalFSFileOutputStream.class
 org/apache/hadoop/fs/RawLocalFileSystem.class

your own bundle also contains those classes, so make sure you don't package them along. Most likely your dependency on org.apache.servicemix.bundles.hadoop-client is marked as a compile dependency in your POM.
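
If that is the case, here is a sketch of how the dependency could be declared so the container-provided bundle is used instead of embedding the classes (the coordinates and version below are assumptions based on your bundle listing):

    <!-- Sketch: let Karaf provide the Hadoop classes via the ServiceMix
         hadoop-client bundle installed by the camel-hdfs2 feature, instead of
         embedding them. GroupId, artifactId and version are assumptions. -->
    <dependency>
        <groupId>org.apache.servicemix.bundles</groupId>
        <artifactId>org.apache.servicemix.bundles.hadoop-client</artifactId>
        <version>2.3.0_2</version>
        <scope>provided</scope>
    </dependency>

Alternatively, check which transitive dependency actually pulls the Hadoop classes into your JAR (for example with mvn dependency:tree) and mark that one as provided.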
