Install nginx Kafka plug-in
nginx can write data directly into Kafka.
1. Install git
yum install -y git
2. Switch to the directory /usr/local/src and clone the Kafka C client (librdkafka) source code locally
cd /usr/local/src
git clone https://github.com/edenhill/librdkafka
3. Enter the librdkafka directory and compile it (see the build sketch below)
...
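The excerpt breaks off at the compile step. As a hedged sketch of the usual librdkafka build flow (it assumes a build toolchain such as gcc and make is already installed and that root access is available for the install step):
cd librdkafka
./configure
make
sudo make install   # installs the headers and shared library under /usr/local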
Posted by KevinMG on Sat, 04 Jan 2020 23:44:35 -0800
Kafka 2.4 release - introduction to new features (with Java API demo code)
New features
Allow consumers to fetch from the closest replica
Add support for incremental cooperative rebalancing to the consumer rebalance protocol (see the sketch after this list)
New MirrorMaker 2.0 (MM2), a new multi-cluster, cross-data-center replication engine
Introduce a new Java Authorizer interface
Support non-key joins in KTable
Adminis ...
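As an illustration of two of the features above, here is a minimal, hedged consumer-configuration sketch; the broker address, group id, and rack name are placeholders, and this is not the demo code from the original post:

import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.CooperativeStickyAssignor;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class Kafka24FeaturesSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");                // placeholder group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Incremental cooperative rebalancing: opt in via the assignor added in 2.4
        props.put(ConsumerConfig.PARTITION_ASSIGNMENT_STRATEGY_CONFIG,
                  CooperativeStickyAssignor.class.getName());

        // Fetch from the closest replica: the consumer advertises its rack
        props.put("client.rack", "rack-1");                                      // placeholder rack id

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // subscribe(...) and poll(...) as in any ordinary consumer
        }
    }
}

On the broker side, rack-aware fetching additionally requires broker.rack and replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector; without them the consumer simply keeps reading from the partition leader.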
Posted by tex1820 on Mon, 30 Dec 2019 12:34:49 -0800
Kafka Golang client introduction
At present there are several commonly used Golang Kafka clients, each with its own advantages and disadvantages:
Client name | Advantages and disadvantages
sarama | Relatively large user base and good performance, but comparatively hard to use
confluent-kafka-go | A Go wrapper around the C-language Kafka client; has st ...
Posted by Stressed on Wed, 25 Dec 2019 21:56:55 -0800
Kubernetes+docker-DIY-kafka+zookeeper+manager cluster deployment
Foreword: for a recent project I containerized Kafka and Zookeeper and deployed them with Rancher. I consulted related websites and books and found many existing approaches, but in the end I chose to build a custom deployment. Let me briefly explain the reasons for that final choice: (Because k8s+kafka te ...
Posted by harty83 on Fri, 20 Dec 2019 09:48:27 -0800
kafka_2.12-2.0.0 installation process
1. Download
Download kafka_2.12-2.0.0.tgz and upload it to /usr/local/kafka/
Extract it, then remove the installation package
2. Configuration
Modify config/server.properties on each server
cd /usr/local/kafka/kafka_2.12-2.0.0/config
vim server.properties
Modify several settings: broker.id must be unique per broker; fill in a number such as 10,11,12,13,1 ... (a sketch of the typical edits follows)
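As a hedged sketch (host names, directories, and the Zookeeper addresses are placeholders, not values from the original post), the server.properties settings that typically need changing on each broker are:
# unique id for this broker (placeholder)
broker.id=10
# placeholder host name for this broker
listeners=PLAINTEXT://node10.example.com:9092
# placeholder data directory for the commit log
log.dirs=/usr/local/kafka/kafka-logs
# placeholder Zookeeper ensemble
zookeeper.connect=zk1:2181,zk2:2181,zk3:2181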
Posted by N350CA on Sun, 15 Dec 2019 12:29:42 -0800
Kafka+Zookeeper cluster construction
Environment preparation
Kafka
Version: kafka_2.10-0.10.2.0 (2.10 is the Scala version, 0.10.2.0 is the Kafka version)
Official download address: http://kafka.apache.org/downloads
Zookeeper (an ensemble config sketch follows this list)
Version: zookeeper-3.4.13
Official download address: http://zookeeper.apache.org/releases.html
JDK
Version: jdk1.8.0_
Official download address: h ...
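Before building the Kafka cluster, the Zookeeper ensemble itself needs a config. As a hedged sketch (hosts and dataDir are placeholders, not values from the original post), a minimal three-node conf/zoo.cfg looks like this:
tickTime=2000
initLimit=10
syncLimit=5
# placeholder data directory
dataDir=/data/zookeeper
clientPort=2181
# one line per ensemble member: server.<id>=<host>:<peer-port>:<election-port>
server.1=zk1:2888:3888
server.2=zk2:2888:3888
server.3=zk3:2888:3888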
Posted by NuLL[PL] on Tue, 10 Dec 2019 14:49:00 -0800
Kubernetes builds Zookeeper and Kafka clusters
Mainly based on the blog post https://www.cnblogs.com/00986014w/p/9561901.html, but that post does not use the official Zookeeper image.
I use a three-node Zookeeper cluster here; the node count can be adjusted along the same lines.
## Build Zookeeper cluster
zookeeper-svc.yaml for the cluster Service (a fuller sketch follows the excerpt)
apiVersion: v1
kind: Service
metadata:
  labels:
    app: zookeeper ...
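The excerpt breaks off above. As a hedged sketch (the Service name and port names are common conventions, not necessarily what the referenced post uses), a headless Service for the Zookeeper ensemble might look like:
apiVersion: v1
kind: Service
metadata:
  name: zookeeper          # placeholder name
  labels:
    app: zookeeper
spec:
  clusterIP: None          # headless, so each pod gets a stable DNS entry
  selector:
    app: zookeeper
  ports:
  - name: client
    port: 2181
  - name: follower
    port: 2888
  - name: election
    port: 3888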
Posted by tunage on Tue, 10 Dec 2019 07:43:35 -0800
filebeat collects IIS logs into es
If you need to collect IIS logs, the Filebeat component is a good fit; a minimal filebeat.yml sketch follows the log example below.
IIS log format parsing:
Log example:
#Software: Microsoft Internet Information Services 7.5
#Version: 1.0
#Date: 2019-03-14 00:00:00
#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc- ...
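As a hedged sketch (the log path and Elasticsearch address are placeholders rather than values from the original post), a minimal filebeat.yml that ships IIS logs straight to Elasticsearch and skips the '#' header lines shown above could look like:
filebeat.inputs:
- type: log
  paths:
    - 'C:\inetpub\logs\LogFiles\W3SVC*\*.log'   # placeholder IIS log path
  exclude_lines: ['^#']                         # drop the W3C header lines

output.elasticsearch:
  hosts: ["http://localhost:9200"]              # placeholder ES address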
Posted by timc37 on Mon, 02 Dec 2019 20:29:53 -0800
brew installation path on macOS, with Kafka as an example
Great buildings are always built up from individual timbers and stones, so why not lay a few of them ourselves? That is why I often do these small, odd jobs.
"That's right, but I never said it!" -- Lu Xun
brew is the package manager for macOS, similar to yum or apt-get.
So where does brew install its packages? The following is an exam ...
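As a hedged illustration, using the Kafka formula from the title (the exact prefix differs between Intel and Apple Silicon Macs), the install location can be queried directly:
brew install kafka                    # install the formula
brew --prefix kafka                   # print the formula's install prefix, e.g. /usr/local/opt/kafka
ls "$(brew --prefix)"/Cellar/kafka    # versioned install directories live under the Cellar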
Posted by geus on Sun, 01 Dec 2019 22:37:23 -0800
[Flume] - Collect Log4j logs and send them to Kafka
Flume collects Log4j logs and sends them to Kafka for storage
Environment preparation
Download Flume: http://flume.apache.org/
Install: extract the downloaded package to a path of your choice
Configure the agent (a fuller config sketch follows the excerpt)
# Function: filter JSON messages and store them in Kafka
agent.sources = s1
agent.channels = c1
agent.sinks = k1
agent.sources.s1.type = avro
agen ...
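The excerpt stops mid-configuration. As a hedged sketch (bind address, port, topic, and broker list are placeholders, not the original post's values), a complete agent along these lines wires an Avro source, a memory channel, and Flume's Kafka sink together:
agent.sources = s1
agent.channels = c1
agent.sinks = k1

# Avro source: receives events from the Log4j Flume appender
agent.sources.s1.type = avro
agent.sources.s1.bind = 0.0.0.0
agent.sources.s1.port = 4141
agent.sources.s1.channels = c1

# Memory channel
agent.channels.c1.type = memory
agent.channels.c1.capacity = 10000

# Kafka sink
agent.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
agent.sinks.k1.kafka.bootstrap.servers = localhost:9092
agent.sinks.k1.kafka.topic = log4j-events
agent.sinks.k1.channel = c1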
Posted by mirana on Fri, 29 Nov 2019 12:57:00 -0800