How to avoid registering the Kafka broker machine's hostname in ZooKeeper

Background: when using MirrorMaker to copy data from the production cluster to the test cluster, the following error occurs: [2018-10-23 10:21:47,821] FATAL [mirrormaker-thread-2] Mirror maker thread failure due to  (kafka.tools.MirrorMaker$MirrorMakerThread) java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutEx ...
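The usual way to keep a hostname out of ZooKeeper is to have the broker advertise an IP address instead. A server.properties sketch (the IP and port are placeholders, and it is an assumption that the post resolves the error this way):

# server.properties: advertise an IP so no hostname is registered in ZooKeeper
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://192.168.1.10:9092   # placeholder broker IP
# older brokers use the deprecated equivalent:
# advertised.host.name=192.168.1.10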

Posted by rish1103 on Sat, 26 Jan 2019 14:00:16 -0800

Uploading a byte Array with OkHttp on Android

Recently we had to retrofit an old project for Android 9.0 compatibility. The app's network requests were previously built on the bundled org.apache.http.legacy.jar, which is no longer accessible on a 9.0 emulator, so the request layer has to be rebuilt. My plan is to swap the underlying implementation straight to OkHttp, which is simple and fast. ...
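A minimal sketch of such an upload with OkHttp 3.x (the endpoint URL is a placeholder; on Android, run this off the main thread or use enqueue() instead of execute()):

import okhttp3.MediaType;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;
import okhttp3.Response;
import java.io.IOException;

public class ByteUpload {
    private static final OkHttpClient CLIENT = new OkHttpClient();

    // POSTs the raw bytes and returns the response body as a string
    public static String upload(byte[] data) throws IOException {
        RequestBody body = RequestBody.create(
                MediaType.parse("application/octet-stream"), data);
        Request request = new Request.Builder()
                .url("https://example.com/upload") // placeholder endpoint
                .post(body)
                .build();
        try (Response response = CLIENT.newCall(request).execute()) {
            return response.body().string();
        }
    }
}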

Posted by vampke on Fri, 25 Jan 2019 18:00:14 -0800

Kafka - Java extension: querying all consumer groups of a specified topic

Reference article: https://www.bbsmax.com/A/n2d9bqDvzD/   The corresponding Kafka version: kafka_2.12-2.0.0.jar, i.e. Scala 2.12, Kafka 2.0.0.   Mind the Kafka version; 1.0.0+ is recommended. We usually query all consumer group.ids like this:   kafka-consumer-groups.sh --bootstra ...
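One way to do the same from Java (a sketch against the Kafka 2.0 AdminClient; whether the post takes this exact route is an assumption, and the broker address and topic name are placeholders) is to list every group and keep those whose committed offsets include the target topic:

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ConsumerGroupListing;
import org.apache.kafka.common.TopicPartition;
import java.util.Properties;

public class GroupsForTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        String topic = "my-topic"; // placeholder topic
        try (AdminClient admin = AdminClient.create(props)) {
            for (ConsumerGroupListing g : admin.listConsumerGroups().all().get()) {
                // a group "consumes" the topic if any committed offset is on it
                boolean consumesTopic = admin
                        .listConsumerGroupOffsets(g.groupId())
                        .partitionsToOffsetAndMetadata().get()
                        .keySet().stream()
                        .map(TopicPartition::topic)
                        .anyMatch(topic::equals);
                if (consumesTopic) System.out.println(g.groupId());
            }
        }
    }
}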

Posted by speckledapple on Fri, 25 Jan 2019 15:21:13 -0800

Hadoop trash (recycle bin)

In production, the HDFS recycle bin must be enabled, usually with a 7-day retention. fs.trash.interval is how long the trash keeps deleted files; a value of 0 disables the trash feature entirely. fs.trash.checkpoint.interval is the trash checkpoint interval and is generally set less than or equal to fs.trash.interval. If 0, t ...
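A core-site.xml sketch matching the 7-day recommendation (both values are in minutes):

<property>
  <name>fs.trash.interval</name>
  <value>10080</value> <!-- 7 days; 0 would disable the trash -->
</property>
<property>
  <name>fs.trash.checkpoint.interval</name>
  <value>60</value> <!-- must be <= fs.trash.interval -->
</property>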

Posted by NJordan72 on Fri, 25 Jan 2019 14:51:13 -0800

Installing Hive 2.3.3 with Hadoop 2.9.1 on Ubuntu 16.04

Preface: http://hive.apache.org/downloads.html has the compatibility instructions: Hadoop 3.x needs Hive 3.0.0, and Hadoop 2.x needs Hive 2.3.3. Since my Hadoop is 2.9, I chose to download Hive 2.3.3. Hive is a Hadoop tool, so it only needs to be installed on the NameNode; there is no need to install it on the DataN ...

Posted by longtone on Fri, 25 Jan 2019 04:45:13 -0800

Spark reduceByKey from the Java API does not aggregate (custom type as key)

When writing Spark with the Java API, if a PairRDD's key is a custom type, you need to override the hashCode and equals methods; otherwise you will find that equal key values are not aggregated. For example, using a User type as the key: public class User { private String name; private String age; public String getName() { return name; } pu ...
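A sketch of the fix (the name/age fields follow the excerpt; the rest is illustrative): override equals and hashCode consistently, and implement Serializable so Spark can shuffle the key.

import java.io.Serializable;
import java.util.Objects;

public class User implements Serializable {
    private String name;
    private String age;

    public String getName() { return name; }

    // Without these two overrides, two Users with identical fields land in
    // different hash buckets and reduceByKey never merges their values.
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof User)) return false;
        User other = (User) o;
        return Objects.equals(name, other.name) && Objects.equals(age, other.age);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, age);
    }
}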

Posted by harrisonad on Thu, 24 Jan 2019 20:18:13 -0800

Spark Core Comprehensive Exercise: IP Matching

ip.txt data (one record per line):
220.177.248.0|220.177.255.255|3702650880|3702652927|Asia|China|Jiangxi|Nanchang|Telecom|360100|China|CN|115.892151|28.676493
220.178.0.0|220.178.56.113|3702652928|3702667377|Asia|China|Anhui|Hefei|Telecom|340100|China|CN|117.283042|31.86119
220.178.56.114|220.178.57.33|37026 ...
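The heart of the exercise, independent of the Spark plumbing (a sketch; the field layout is assumed from the sample above): convert a dotted IP to the same decimal form as columns 3 and 4, then binary-search the ranges sorted by start value.

public class IpMatch {
    // "220.177.248.0" -> 3702650880, matching columns 3/4 of ip.txt
    static long ip2long(String ip) {
        long result = 0;
        for (String part : ip.split("\\.")) {
            result = (result << 8) | Long.parseLong(part);
        }
        return result;
    }

    // rows[i] = {startLong, endLong}, sorted ascending by start.
    // Returns the index of the matching range, or -1 if none contains ip.
    static int search(long ip, long[][] rows) {
        int lo = 0, hi = rows.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;
            if (ip < rows[mid][0]) hi = mid - 1;
            else if (ip > rows[mid][1]) lo = mid + 1;
            else return mid;
        }
        return -1;
    }
}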

Posted by penguin_powered on Thu, 24 Jan 2019 16:18:14 -0800

5. First Acquaintance with Spring Cloud: the Hystrix Circuit Breaker

Preface: 1. Introduction to Hystrix. In a distributed system, calls to dependencies inevitably fail: timeouts, exceptions, and so on. Keeping the whole service from failing when a single dependency fails is exactly what Hystrix is for. Hystrix provides circuit breaking, isolation, fallback, caching, monitoring and so on. It can ensu ...
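A minimal command-with-fallback sketch (the group key, class name, and body of run() are placeholders): when run() keeps failing or timing out, Hystrix opens the circuit and serves getFallback() instead.

import com.netflix.hystrix.HystrixCommand;
import com.netflix.hystrix.HystrixCommandGroupKey;

public class GreetingCommand extends HystrixCommand<String> {
    private final String name;

    public GreetingCommand(String name) {
        super(HystrixCommandGroupKey.Factory.asKey("GreetingGroup"));
        this.name = name;
    }

    @Override
    protected String run() {
        // placeholder for the real remote call; an exception or timeout
        // here counts as a failure and can trip the circuit breaker
        return "Hello, " + name;
    }

    @Override
    protected String getFallback() {
        // served while the circuit is open or when run() fails
        return "Hello, fallback";
    }
}

Calling new GreetingCommand("world").execute() then returns either the real result or the fallback.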

Posted by mrdonrule on Thu, 24 Jan 2019 14:48:13 -0800

6. First Acquaintance with Spring Cloud: Zuul Routing

Preface: In a production environment we cannot expose the real information of each service; that would be far too unsafe. Instead, we use a routing proxy to forward requests to the real services. Create a new Zuul module: 1. Add the dependencies <?xml version="1.0" encoding="UTF-8"?> <project xmlns="http://maven.apache ...
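Once the dependencies are in, the proxy itself can be as small as the sketch below (the class name is a placeholder; this assumes the spring-cloud-starter-netflix-zuul dependency). Routes mapping public prefixes to service ids then go under zuul.routes.* in the configuration, so the real addresses stay hidden.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.netflix.zuul.EnableZuulProxy;

@SpringBootApplication
@EnableZuulProxy // turns this Spring Boot app into a Zuul routing proxy
public class ZuulApplication {
    public static void main(String[] args) {
        SpringApplication.run(ZuulApplication.class, args);
    }
}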

Posted by jengle52 on Thu, 24 Jan 2019 14:33:15 -0800

Lucene Notes 17: Introduction to Chinese Word Segmentation in Lucene

I. The Function of the Analyzer (Word Segmenter). The analyzer's job is to produce a TokenStream, which stores information about the tokens; detailed token information can be obtained through its attributes. II. A Custom Stop-Word Analyzer package com.wsy; import org.apache.lucene.analysis.*; import org.a ...
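A sketch of reading a TokenStream through its attributes (StandardAnalyzer stands in for the post's custom stop-word analyzer, and the text is a placeholder):

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class TokenDump {
    public static void main(String[] args) throws Exception {
        Analyzer analyzer = new StandardAnalyzer(); // placeholder analyzer
        try (TokenStream ts = analyzer.tokenStream("field", "hello token stream")) {
            // the attribute object is updated in place on each incrementToken()
            CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
            ts.reset();                   // required before the first token
            while (ts.incrementToken()) { // advance to the next token
                System.out.println(term.toString());
            }
            ts.end();                     // completes the stream contract
        }
    }
}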

Posted by programmingjeff on Thu, 24 Jan 2019 06:24:14 -0800