Flink notes: saving Flink data to Redis (custom Redis Sink)

This post mainly walks through how Flink reads data from Kafka and sinks it to Redis in real time. From the following link, the Flink official documentation, we know that the fault-tolerance guarantee for writing data to Redis is at-least-once, so we use idempotent operations (see the sketch after this entry) and the principle ...

Posted by quak on Wed, 19 Feb 2020 20:38:23 -0800
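
As a rough illustration of the custom Redis sink this post describes, here is a minimal, hedged Java sketch, assuming a Tuple2<String, String> stream and a local Redis instance reached via Jedis; the post's actual class name, field layout, and connection settings may differ:

```java
// Minimal sketch of an idempotent Redis sink (illustrative; the host,
// port, and key/value layout of the stream are assumptions).
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
import redis.clients.jedis.Jedis;

public class MyRedisSink extends RichSinkFunction<Tuple2<String, String>> {
    private transient Jedis jedis;

    @Override
    public void open(Configuration parameters) {
        // One connection per parallel subtask.
        jedis = new Jedis("localhost", 6379);
    }

    @Override
    public void invoke(Tuple2<String, String> value, Context context) {
        // SET is idempotent: replaying the same record under Flink's
        // at-least-once guarantee leaves Redis in the same final state.
        jedis.set(value.f0, value.f1);
    }

    @Override
    public void close() {
        if (jedis != null) {
            jedis.close();
        }
    }
}
```

The sink would then be attached with stream.addSink(new MyRedisSink()); the idempotent write is what makes at-least-once delivery safe here.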

Flink notes: Flink uses EventTime to process real-time data

Building on the previous post, Introduction to Time and Window in Flink, we learned that Flink's streaming model involves three notions of time: Event Time, Ingestion Time, and Processing Time (a short Java sketch follows this entry). If these three time concepts are still unclear, it is suggested that you jump back to the pre ...

Posted by PHPSpirit on Wed, 12 Feb 2020 03:14:26 -0800
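
For context on the excerpt above, here is a minimal, hedged sketch of enabling Event Time and assigning watermarks using the Flink 1.9/1.10-era API; the element values and the 5-second out-of-orderness bound are illustrative assumptions, not the post's own example:

```java
// Illustrative sketch: Event Time with bounded-out-of-orderness watermarks.
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
import org.apache.flink.streaming.api.windowing.time.Time;

public class EventTimeDemo {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Interpret timestamps as Event Time rather than the default
        // Processing Time (Ingestion Time is the third option).
        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);

        DataStream<Tuple2<String, Long>> events =
            env.fromElements(Tuple2.of("a", 1_000L), Tuple2.of("b", 2_000L));

        events.assignTimestampsAndWatermarks(
            // The watermark lags the highest timestamp seen by 5 seconds,
            // tolerating that much out-of-order arrival.
            new BoundedOutOfOrdernessTimestampExtractor<Tuple2<String, Long>>(Time.seconds(5)) {
                @Override
                public long extractTimestamp(Tuple2<String, Long> e) {
                    return e.f1; // the event's own timestamp in millis
                }
            })
            .print();

        env.execute("event-time-demo");
    }
}
```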

Big data technology: Spark Streaming

Big data technology: Spark Streaming. 1: Overview. 1. Definition: Spark Streaming is used for stream processing. Spark Streaming supports many data input sources, such as Kafka, Flume, Twitter, ZeroMQ, and simple TCP sockets (a minimal socket example follows this entry). Once data is ingested, it can be processed with Spark's high-level primitives such ...

Posted by croakingtoad on Mon, 10 Feb 2020 07:28:21 -0800
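
To make the "simple TCP sockets" input source above concrete, here is a minimal, hedged Spark Streaming word count in Java; the host, port, and 1-second batch interval are illustrative assumptions:

```java
// Illustrative sketch: a minimal Spark Streaming job reading lines
// from a TCP socket and counting words per micro-batch.
import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SocketWordCount {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
            .setMaster("local[2]").setAppName("SocketWordCount");
        // Micro-batch interval of 1 second.
        JavaStreamingContext jssc =
            new JavaStreamingContext(conf, Durations.seconds(1));

        JavaDStream<String> lines = jssc.socketTextStream("localhost", 9999);
        JavaDStream<String> words =
            lines.flatMap(line -> Arrays.asList(line.split(" ")).iterator());

        words.mapToPair(w -> new scala.Tuple2<>(w, 1))
             .reduceByKey(Integer::sum)
             .print();

        jssc.start();
        jssc.awaitTermination();
    }
}
```

Running nc -lk 9999 in another terminal and typing lines of text would feed the job.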

Setting up a Kafka 0.10.2 source-reading environment and running it locally on Windows

Setting up a Kafka 0.10.2 source-reading environment and running it locally on Windows. 1. Version information; 2. Building the Kafka source environment; 3. Configuring the Kafka source environment; 4. Building a bin package. Version information: Kafka 0.10.2, Scala 2.10.6, Java 1.8.0_221, IntelliJ IDEA 2019.2, Zookeeper 3.4 ...

Posted by crazydip on Thu, 30 Jan 2020 18:23:30 -0800

Follow-up to the 4.5-billion-record data migration: daily data volumes at the 10-million level into ClickHouse

Follow-up to the 4.5-billion-record data migration: daily data volumes at the 10-million level into ClickHouse. Related documentation: flume reference, waterdrop reference, clickhouse reference, kafka reference. Environment: the logs are on one server and the ClickHouse cluster is on another. Thinking: previously, when m ...

Posted by gszauer on Mon, 20 Jan 2020 18:52:36 -0800

Install nginx Kafka plug-in

Install the nginx Kafka plug-in. With it, nginx can write data directly into Kafka. 1. Install git: yum install -y git. 2. Switch to the directory /usr/local/src, then clone the Kafka C client source code locally: cd /usr/local/src and git clone https://github.com/edenhill/librdkafka. 3. Enter librdkafka and compile ...

Posted by KevinMG on Sat, 04 Jan 2020 23:44:35 -0800

Kafka 2.4 release - Introduction to new features (with Java API demo code)

New features: allow consumers to fetch from the closest replica (a consumer-config sketch follows this entry); added support for incremental cooperative rebalancing in the consumer rebalance protocol; new MirrorMaker 2.0 (MM2), a new multi-cluster, cross-datacenter replication engine; introduction of a new Java authorizer interface; support for non-key joins in KTable; Adminis ...

Posted by tex1820 on Mon, 30 Dec 2019 12:34:49 -0800
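
As a rough, hedged illustration of the first feature above (fetching from the closest replica, KIP-392), here is a minimal Java consumer sketch; the broker address, rack ID, group, and topic name are illustrative assumptions:

```java
// Illustrative sketch: a Kafka 2.4 consumer declaring its rack via
// client.rack so it may fetch from a nearby replica (KIP-392).
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ClosestReplicaConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringDeserializer");
        // New in 2.4: with a matching broker-side replica selector, the
        // consumer may fetch from a replica in the same rack instead of
        // always reading from the partition leader.
        props.put(ConsumerConfig.CLIENT_RACK_CONFIG, "us-east-1a");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("demo-topic"));
    }
}
```

Note that rack-aware fetching also requires the broker-side setting replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector to take effect.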

Kafka Golang client introduction

At present, there are several commonly used Golang Kafka clients, each with its own strengths and weaknesses. sarama: has a relatively large user base and relatively good performance, but is comparatively difficult to use. confluent-kafka-go: a wrapper around Kafka's C-language client, it has st ...

Posted by Stressed on Wed, 25 Dec 2019 21:56:55 -0800

Kubernetes+docker-DIY-kafka+zookeeper+manager cluster deployment

Foreword: a recent project involved containerizing Kafka and ZooKeeper with Rancher. I consulted related websites and books and found that, where a company's standardization requirements are relatively strict, there are many reasons to build a custom deployment. Let me briefly explain why I ultimately chose to customize it: (because k8s+kafka te ...

Posted by harty83 on Fri, 20 Dec 2019 09:48:27 -0800

kafka_2.12-2.0.0 installation process

1. Download kafka_2.12-2.0.0.tgz and upload it to /usr/local/kafka/, then extract it and remove the installation package. 2. Configuration: modify config/server.properties on each server: cd /usr/local/kafka/kafka_2.12-2.0.0/config, then vim server.properties. Modify several settings: broker.id must be unique; fill in a number, e.g. 10,11,12,13,1 ...

Posted by N350CA on Sun, 15 Dec 2019 12:29:42 -0800