Install nginx Kafka plug-in

Keywords: kafka Nginx git yum

With this plug-in, nginx can write the data it receives (the bodies of POST requests) directly into Kafka.

1. Install git

    yum install -y git

2. Switch to the directory /usr/local/src, then clone the Kafka C client library (librdkafka) to the local machine

    cd /usr/local/src
    git clone https://github.com/edenhill/librdkafka

3. Enter the librdkafka directory, install the build dependencies, then compile and install the library

    cd librdkafka
    yum install -y gcc gcc-c++ pcre-devel zlib-devel
    ./configure
    make && make install
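
If make install finishes without errors, the library should land under the default /usr/local prefix (an assumption here; adjust the path if you passed --prefix to configure). A quick check:

    ls /usr/local/lib/librdkafka*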

4. Install the nginx Kafka integration plug-in: go back to /usr/local/src and clone the ngx_kafka_module source code

    cd /usr/local/src
    git clone https://github.com/brg-liuwei/ngx_kafka_module

5. Enter the nginx source directory and compile nginx together with the plug-in

    cd /usr/local/src/nginx-1.12.2
    ./configure --add-module=/usr/local/src/ngx_kafka_module/
    make
    make install
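
To confirm the plug-in was compiled in, print the configure arguments of the installed binary; the path below assumes nginx's default install prefix of /usr/local/nginx:

    /usr/local/nginx/sbin/nginx -V

The output should list --add-module=/usr/local/src/ngx_kafka_module/ among the configure arguments.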

6. Modify the nginx configuration file, nginx.conf. The complete file is shown below.

#user  nobody;
worker_processes  1;

#error_log  logs/error.log;
#error_log  logs/error.log  notice;
#error_log  logs/error.log  info;

#pid        logs/nginx.pid;


events {
    worker_connections  1024;
}


http {
    include       mime.types;
    default_type  application/octet-stream;

    #log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
    #                  '$status $body_bytes_sent "$http_referer" '
    #                  '"$http_user_agent" "$http_x_forwarded_for"';
    #access_log  logs/access.log  main;
    sendfile        on;
    #tcp_nopush     on;
    #keepalive_timeout  0;
    keepalive_timeout  65;
    #gzip  on;

    kafka;
    kafka_broker_list node-1.xiaoniu.com:9092 node-2.xiaoniu.com:9092 node-3.xiaoniu.com:9092;  

    server {
        listen       80;
        server_name  node-6.xiaoniu.com;
        #charset koi8-r;
        #access_log  logs/host.access.log  main;

        location = /kafka/track {
                kafka_topic track;
        }

        location = /kafka/user {
                kafka_topic user;
        }

        #error_page  404              /404.html;

        # redirect server error pages to the static page /50x.html
        #
        error_page   500 502 503 504  /50x.html;
        location = /50x.html {
            root   html;
        }

    }

}

The main additions are the kafka and kafka_broker_list directives in the http block and the kafka_topic directives in the location blocks; they are described in the README of brg-liuwei's ngx_kafka_module repository.
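
Before moving on, it is worth validating the edited configuration; a sketch, again assuming the default install prefix:

    /usr/local/nginx/sbin/nginx -t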

7. Start the ZooKeeper and Kafka clusters, then create the topics (a topic-creation sketch follows the start commands)

    /bigdata/zookeeper-3.4.9/bin/zkServer.sh start
    /bigdata/kafka_2.11-0.10.2.1/bin/kafka-server-start.sh -daemon /bigdata/kafka_2.11-0.10.2.1/config/server.properties
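
The topics referenced in the nginx configuration (track and user) have to exist before data is sent. A sketch of creating them with the Kafka 0.10 tooling, assuming ZooKeeper is reachable at node-1.xiaoniu.com:2181 (adjust the address, partition count, and replication factor to your cluster):

    /bigdata/kafka_2.11-0.10.2.1/bin/kafka-topics.sh --create --zookeeper node-1.xiaoniu.com:2181 --replication-factor 1 --partitions 1 --topic track
    /bigdata/kafka_2.11-0.10.2.1/bin/kafka-topics.sh --create --zookeeper node-1.xiaoniu.com:2181 --replication-factor 1 --partitions 1 --topic user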

8. Start nginx. It reports an error: librdkafka.so.1 cannot be found

    error while loading shared libraries: librdkafka.so.1: cannot open shared object file: No such file or directory
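
For reference, nginx here is started with the binary under its install prefix (assuming the default /usr/local/nginx):

    /usr/local/nginx/sbin/nginx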

The reason is that the freshly installed library lives in /usr/local/lib, which is not in the dynamic linker's search path.

9. Add the library path to the dynamic linker configuration and refresh the cache

    echo "/usr/local/lib" >> /etc/ld.so.conf
    ldconfig
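
To verify the linker now resolves the library, a quick check:

    ldconfig -p | grep librdkafka

After this, nginx should start without the error.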

10. Before testing, start nginx again (make sure the host is reachable, e.g. with ping, and that the relevant port is open in the firewall). Then run the test: write data to nginx and check whether a Kafka consumer can consume it (a console-consumer sketch follows the curl command below)

    curl localhost/kafka/track -d "message send to kafka topic"
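
To observe the messages on the Kafka side, attach a console consumer to the track topic; a sketch, assuming the broker list from the configuration above:

    /bigdata/kafka_2.11-0.10.2.1/bin/kafka-console-consumer.sh --bootstrap-server node-1.xiaoniu.com:9092 --topic track --from-beginning

Each curl request body should show up as one message in the consumer's output.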

Posted by KevinMG on Sat, 04 Jan 2020 23:44:35 -0800