
Kafka Asynchronous Send API


1. Create a Maven project named kafka.

2. Add the kafka-clients dependency to pom.xml:
<dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>3.0.0</version>
        </dependency>
</dependencies>

3. Create the package com.kafka.producer.

4. Write the producer code without a callback.
package com.kafka.producer;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class CustomProducer {
    public static void main(String[] args) {
        // Producer configuration
        Properties properties = new Properties();
        // Connect to the Kafka cluster
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,"hadoop102:9092");
        // Specify the key and value serializer classes
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,StringSerializer.class.getName());

        // Create the Kafka producer object
        KafkaProducer<String, String> kafkaProducer = new KafkaProducer<>(properties);
        // Send data asynchronously
        for (int i = 0; i < 5; i++) {
            kafkaProducer.send(new ProducerRecord<>("first","Kafka"+i));
        }
        // Close the producer and release resources
        kafkaProducer.close();
    }
}
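The send() call above is asynchronous: it returns as soon as the record has been placed in the producer's buffer, and the actual network transmission happens on the producer's background sender thread. send() returns a Future<RecordMetadata>, so blocking on it with get() effectively turns the call into a synchronous send. Below is a minimal sketch of that idea; the class name and the get() usage are illustrative and not part of the original steps.

package com.kafka.producer;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;
import java.util.concurrent.Future;

public class CustomProducerFuture {
    public static void main(String[] args) throws Exception {
        Properties properties = new Properties();
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "hadoop102:9092");
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> kafkaProducer = new KafkaProducer<>(properties)) {
            // send() only buffers the record and returns immediately
            Future<RecordMetadata> future = kafkaProducer.send(new ProducerRecord<>("first", "Kafka-future"));
            // get() blocks until the broker acknowledges the record, i.e. makes the send synchronous
            RecordMetadata metadata = future.get();
            System.out.println("partition: " + metadata.partition() + " offset: " + metadata.offset());
        }
    }
}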

5. Start a Kafka console consumer on hadoop102:

bin/kafka-console-consumer.sh --bootstrap-server hadoop102:9092 --topic first

6. Run the code in IDEA and check whether the messages arrive in the consumer console on hadoop102.
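If everything is wired up correctly, the consumer started in step 5 should print the five record values, roughly:

Kafka0
Kafka1
Kafka2
Kafka3
Kafka4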

7. Write the producer code with a callback.

Copy the CustomProducer class and rename the copy to CustomProducerCallBack:

package com.kafka.producer;

import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class CustomProducerCallBack {
    public static void main(String[] args) {
        // Producer configuration
        Properties properties = new Properties();
        // Connect to the Kafka cluster
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "hadoop102:9092");
        // Specify the key and value serializer classes
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Create the Kafka producer object
        KafkaProducer<String, String> kafkaProducer = new KafkaProducer<>(properties);
        // Send data, passing a callback that is invoked when the broker acknowledges the record
        for (int i = 0; i < 5; i++) {
            kafkaProducer.send(new ProducerRecord<>("first", "Kafka" + i), new Callback() {
                @Override
                public void onCompletion(RecordMetadata metadata, Exception exception) {
                    if (exception == null) {
                        // On success, print the topic and partition the record was written to
                        System.out.println("topic: " + metadata.topic() + " partition: " + metadata.partition());
                    } else {
                        // On failure, log the exception
                        exception.printStackTrace();
                    }
                }
            });
        }
        // Close the producer and release resources
        kafkaProducer.close();
    }
}
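Since Callback has a single abstract method (onCompletion), the anonymous class can also be written as a lambda. A sketch of the same send loop in that style, with identical behaviour:

        for (int i = 0; i < 5; i++) {
            kafkaProducer.send(new ProducerRecord<>("first", "Kafka" + i), (metadata, exception) -> {
                if (exception == null) {
                    System.out.println("topic: " + metadata.topic() + " partition: " + metadata.partition());
                } else {
                    exception.printStackTrace();
                }
            });
        }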

8. Run the code in IDEA and check whether the messages arrive in the consumer console on hadoop102.

At the same time, IDEA prints the callback results.
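Because the send is asynchronous, the callbacks fire as the broker acknowledges the buffered records (close() waits for them to complete), so the IDEA console should show five lines in the format printed by the callback, for example (partition numbers depend on how the first topic is partitioned):

topic: first partition: 0
topic: first partition: 0
...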

 
