
How to use the Kafka 0.8 Log4j appender

I am trying to get the Kafka 0.8 Log4j appender running, but I cannot make it work. I want my application to send logs directly to Kafka via the Log4j appender.

Here is my log4j.properties. Since I could not find a suitable encoder, I configured it to use the default encoder (i.e., I commented out that line):

log4j.rootLogger=INFO, stdout, KAFKA

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%5p [%t] (%F:%L) - %m%n

log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%-5p: %c - %m%n
log4j.appender.KAFKA.BrokerList=hnode01:9092
log4j.appender.KAFKA.Topic=DKTestEvent

#log4j.appender.KAFKA.SerializerClass=kafka.log4j.AppenderStringEncoder

And this is my sample application:

import org.apache.log4j.Logger;
import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.PropertyConfigurator;

public class HelloWorld {

        static Logger logger = Logger.getLogger(HelloWorld.class.getName());

        public static void main(String[] args) {
            PropertyConfigurator.configure(args[0]);

            logger.info("Entering application.");
            logger.debug("Debugging!.");
            logger.info("Exiting application.");
        }
}

I used Maven to build it, and included kafka_2.8.2-0.8.0 and log4j_1.2.17 in my pom.xml.
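For reference, the corresponding pom.xml section might look roughly like this (a sketch, not taken from the question; the exact Maven coordinates of the kafka_2.8.2 0.8.0 artifact are an assumption based on the versions named above):

```
<dependencies>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.8.2</artifactId>
    <version>0.8.0</version>
  </dependency>
  <dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
  </dependency>
</dependencies>
```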

And I am getting these errors:

INFO [main] (Logging.scala:67) - Verifying properties
INFO [main] (Logging.scala:67) - Property metadata.broker.list is overridden to hnode01:9092
INFO [main] (Logging.scala:67) - Property serializer.class is overridden to kafka.serializer.StringEncoder
INFO [main] (HelloWorld.java:14) - Entering application.
INFO [main] (HelloWorld.java:14) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 0 for 1 topic(s) Set(DKTestEvent)
INFO [main] (HelloWorld.java:14) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 1 for 1 topic(s) Set(DKTestEvent)
INFO [main] (HelloWorld.java:14) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 2 for 1 topic(s) Set(DKTestEvent)
INFO [main] (HelloWorld.java:14) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 3 for 1 topic(s) Set(DKTestEvent)
INFO [main] (HelloWorld.java:14) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 4 for 1 topic(s) Set(DKTestEvent)
INFO [main] (HelloWorld.java:14) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 5 for 1 topic(s) Set(DKTestEvent)
.
.
.
INFO [main] (HelloWorld.java:14) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 60 for 1 topic(s) Set(DKTestEvent)
INFO [main] (HelloWorld.java:14) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 61 for 1 topic(s) Set(DKTestEvent)
INFO [main] (HelloWorld.java:14) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 62 for 1 topic(s) Set(DKTestEvent)
INFO [main] (Logging.scala:67) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 63 for 1 topic(s) Set(DKTestEvent)
INFO [main] (Logging.scala:67) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 64 for 1 topic(s) Set(DKTestEvent)
INFO [main] (Logging.scala:67) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 65 for 1 topic(s) Set(DKTestEvent)
INFO [main] (Logging.scala:67) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 66 for 1 topic(s) Set(DKTestEvent)
INFO [main] (Logging.scala:67) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 67 for 1 topic(s) Set(DKTestEvent)
.
.
.
INFO [main] (Logging.scala:67) - Fetching metadata from broker id:0,host:hnode01,port:9092 with correlation id 534 for 1 topic(s) Set(DKTestEvent)
ERROR [main] (Logging.scala:67) - 
ERROR [main] (Logging.scala:67) - 
ERROR [main] (Logging.scala:67) - 
ERROR [main] (Logging.scala:67) - 
ERROR [main] (Logging.scala:67) - 
ERROR [main] (Logging.scala:67) - 
java.lang.StackOverflowError
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:643)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:643)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
    at org.apache.log4j.spi.ThrowableInformation.getThrowableStrRep(ThrowableInformation.java:87)
    at org.apache.log4j.spi.LoggingEvent.getThrowableStrRep(LoggingEvent.java:413)
    at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:313)
    at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
    at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
    at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
    at org.apache.log4j.Category.callAppenders(Category.java:206)
    at org.apache.log4j.Category.forcedLog(Category.java:391)
    at org.apache.log4j.Category.error(Category.java:322)
    at kafka.utils.Logging$$anonfun$swallowError$1.apply(Logging.scala:105)
    at kafka.utils.Logging$$anonfun$swallowError$1.apply(Logging.scala:105)
    at kafka.utils.Utils$.swallow(Utils.scala:189)
    at kafka.utils.Logging$class.swallowError(Logging.scala:105)
    at kafka.utils.Utils$.swallowError(Utils.scala:46)
    at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:67)
    at kafka.producer.Producer.send(Producer.scala:76)
    at kafka.producer.KafkaLog4jAppender.append(KafkaLog4jAppender.scala:96)
    at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
    at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
    at org.apache.log4j.Category.callAppenders(Category.java:206)
    at org.apache.log4j.Category.forcedLog(Category.java:391)
    at org.apache.log4j.Category.info(Category.java:666)
    at kafka.utils.Logging$class.info(Logging.scala:67)
    at kafka.client.ClientUtils$.info(ClientUtils.scala:31)
    at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:51)
    at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:82)
    at kafka.producer.async.DefaultEventHandler$$anonfun$handle$1.apply$mcV$sp(DefaultEventHandler.scala:67)
    at kafka.utils.Utils$.swallow(Utils.scala:187)
    at kafka.utils.Logging$class.swallowError(Logging.scala:105)
    at kafka.utils.Utils$.swallowError(Utils.scala:46)
    at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:67)
    at kafka.producer.Producer.send(Producer.scala:76)
    at kafka.producer.KafkaLog4jAppender.append(KafkaLog4jAppender.scala:96)
    at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
    at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
.
.
.

The errors above keep occurring continuously unless I terminate the program.

Please let me know if I am missing something.

9
style95

I think Jonas has identified the problem: the Kafka producer's own logging is also routed to the Kafka appender, causing an infinite loop and eventually a stack overflow (no pun intended). You can configure all Kafka logs to go to a different appender. The following sends the output to stdout:

log4j.logger.kafka=INFO, stdout

So you should end up with something like the following in log4j.properties:

log4j.rootLogger=INFO, stdout, KAFKA
log4j.logger.kafka=INFO, stdout
log4j.logger.HelloWorld=INFO, KAFKA
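One caveat worth adding (not part of the original answer): log4j loggers are additive by default, so events logged under the kafka logger would still propagate up to the root logger's KAFKA appender and could re-trigger the loop. Disabling additivity for the kafka logger avoids that; a sketch:

```
log4j.rootLogger=INFO, stdout, KAFKA
log4j.logger.kafka=INFO, stdout
log4j.additivity.kafka=false
log4j.logger.HelloWorld=INFO, KAFKA
```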

I was able to produce events via log4j with Kafka 0.8.2.2. Here is my log4j configuration:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">

<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">

   <appender name="console" class="org.apache.log4j.ConsoleAppender">
      <param name="Target" value="System.out" />
      <layout class="org.apache.log4j.PatternLayout">
         <param name="ConversionPattern" value="%-5p %c{1} - %m%n" />
      </layout>
   </appender>
   <appender name="fileAppender" class="org.apache.log4j.RollingFileAppender">
      <param name="Threshold" value="INFO" />
      <param name="MaxBackupIndex" value="100" />
      <param name="File" value="/tmp/agna-LogFile.log" />
      <layout class="org.apache.log4j.PatternLayout">
         <param name="ConversionPattern" value="%d  %-5p  [%c{1}] %m %n" />
      </layout>
   </appender>
   <appender name="kafkaAppender" class="kafka.producer.KafkaLog4jAppender">
      <param name="Topic" value="kafkatopic" />
      <param name="BrokerList" value="localhost:9092" />
      <param name="syncSend" value="true" />
      <layout class="org.apache.log4j.PatternLayout">
         <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L %% - %m%n" />
      </layout>
   </appender>
   <logger name="org.apache.kafka">
      <level value="error" />
      <appender-ref ref="console" />
   </logger>
   <logger name="com.example.kafkaLogger">
      <level value="debug" />
      <appender-ref ref="kafkaAppender" />
   </logger>
   <root>
      <priority value="debug" />
      <appender-ref ref="console" />
   </root>
</log4j:configuration>

And here is the source code:

package com.example;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import org.json.simple.JSONArray;
import org.json.simple.JSONObject;
import java.util.Properties;
import java.util.concurrent.ExecutionException;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.KafkaProducer;

import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class JsonProducer {
    static Logger defaultLogger = LoggerFactory.getLogger(JsonProducer.class);
    static Logger kafkaLogger = LoggerFactory.getLogger("com.example.kafkaLogger");

    public static void main(String args[]) {

        JsonProducer obj = new JsonProducer();

        String str = obj.getJsonObjAsString();

        // Use the logger
        kafkaLogger.info(str);

        try {
            // Construct and send message
            obj.constructAndSendMessage();
        } catch (InterruptedException e) {
            defaultLogger.error("Caught interrupted exception " + e);
        } catch (ExecutionException e) {
            defaultLogger.error("Caught execution exception " + e);
        }   
    }

    private String getJsonObjAsString() {
        JSONObject obj = new JSONObject();
        obj.put("name", "John");
        obj.put("age", new Integer(55));
        obj.put("address", "123 MainSt, Palatine, IL");

        JSONArray list = new JSONArray();
        list.add("msg 1");
        list.add("msg 2");
        list.add("msg 3");

        obj.put("messages", list);

        return obj.toJSONString();
    }

    private void constructAndSendMessage() throws InterruptedException, ExecutionException {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);

        boolean sync = false;
        String topic = "kafkatopic";
        String key = "mykey";
        String value = "myvalue1 mayvalue2 myvalue3";
        ProducerRecord<String, String> producerRecord = new ProducerRecord<String, String>(topic, key, value);
        if (sync) {
            producer.send(producerRecord).get();
        } else {
            producer.send(producerRecord);
        }
        producer.close();
    }
}

The whole project is available at the following link:

https://github.com/ypant/kafka-json-producer.git

2
Yagna Pant

Try setting the appender to asynchronous, like this: log4j.appender.KAFKA.ProducerType=async
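Put together with the appender configuration from the top of the question, that would look roughly like this (a sketch; ProducerType is the async/sync switch this answer refers to):

```
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%-5p: %c - %m%n
log4j.appender.KAFKA.BrokerList=hnode01:9092
log4j.appender.KAFKA.Topic=DKTestEvent
log4j.appender.KAFKA.ProducerType=async
```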

It seems plausible that this enters an infinite loop, since the Kafka producer logs through log4j itself.

1