Common operations of docker swarm

1. Description: this document covers common Docker Swarm operations. The system is a local test environment, and 172.16.1.13 is the swarm manager. Machine list for the local test (host name / simulated extranet IP / intranet IP / module to deploy): mini01, 10.0.0.11, 172.16.1.11, tomcat [swar ...
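A minimal sketch of the basic swarm setup implied above, using the article's manager address. The commands are echoed as a dry run here, since executing them needs a live Docker daemon on each node:

```shell
# Dry-run sketch: 172.16.1.13 is the manager per the article's machine list.
# On real nodes, run the echoed commands instead of echoing them.
MANAGER_IP=172.16.1.13
echo "docker swarm init --advertise-addr $MANAGER_IP"   # run on the manager node
echo "docker swarm join-token worker"                   # prints the join command for workers
```

`docker swarm init` prints a ready-made `docker swarm join --token … 172.16.1.13:2377` line that each worker then runs.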

Posted by ryanschefke on Tue, 10 Dec 2019 11:09:12 -0800

The Observer coprocessor of HBase

Customize a Java class that inherits BaseRegionObserver: package com.charley.example.hbase2es; import org.apache.hadoop.hbase.Cell; import org.apache.hadoop.hbase.CellUtil; import org.apache.hadoop.hbase.CoprocessorEnvironment; import org.apache.hadoop.hbase.client.Durability; import org.apache.hadoop.hbase.client.Put; import ...
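Once such an Observer is compiled into a jar, it is typically attached to a table from the HBase shell. A dry-run sketch of that step follows; the jar path, table name, and observer class name are hypothetical (only the package com.charley.example.hbase2es comes from the article):

```shell
# Build the table_att alter statement for attaching a coprocessor.
# Format: jar-path|observer-class|priority|args
COPROC="hdfs:///coprocessors/hbase2es.jar|com.charley.example.hbase2es.MyObserver|1001|"
echo "alter 'my_table', METHOD => 'table_att', 'coprocessor' => '${COPROC}'"
# On a real cluster, pipe the printed statement into `hbase shell`.
```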

Posted by kurtsu on Tue, 10 Dec 2019 04:29:07 -0800

Big data tutorial (8.2) wordcount program principle and code implementation / operation

The last post covered MapReduce programming ideas. In this section, the blogger walks through the principle of the wordcount program and the details of its code implementation and execution, which gives a general understanding of MapReduce programs. In fact, the map and reduce programs in hadoop are only two of them. ...
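The map/shuffle/reduce flow the tutorial describes can be mimicked with a plain shell pipeline (illustrative only, no Hadoop involved; the sample input is made up):

```shell
# "map": split each line into words; "shuffle": sort groups identical keys
# together; "reduce": uniq -c sums the occurrences per word.
printf 'hello world\nhello hadoop\n' | tr ' ' '\n' | sort | uniq -c | sort -rn
# "hello" appears twice in the input, so it ends up with count 2.
```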

Posted by jek on Thu, 05 Dec 2019 09:47:53 -0800

Hive Installation, Configuration, and Use

Overview of Hive: Hive is a Hadoop-based data warehouse tool that maps structured data files to tables and provides SQL-like query capabilities. Hive essentially converts HQL into MapReduce programs. Data processed by Hive is stored in HDFS, and the underlying engine for analyzing the data can be MapReduce, Tez, or Spark, with its executo ...
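A minimal sketch of the kind of hive-site.xml such an installation uses, written to a scratch file here; the property names are standard Hive keys, but the values (local Derby metastore, default warehouse path) are illustrative assumptions:

```shell
# Generate a minimal hive-site.xml: metastore in a local Derby database,
# warehouse directory on HDFS.
cat > /tmp/hive-site.xml <<'EOF'
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=/tmp/metastore_db;create=true</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
</configuration>
EOF
```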

Posted by Wayniac on Tue, 03 Dec 2019 17:47:05 -0800

HBase installation configuration, using independent zookeeper

1. HBase installation and configuration, using an independent ZooKeeper. 2. Modify environment variables. The plan: the first machine is the master, the second machine is a RegionServer, and a RegionServer is also started on the first machine, making a cluster of 1 master and 2 RegionServers. Execute vi /etc/profile on the machine, and add the following: export HBASE_HOME=/usr/ ...
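The /etc/profile additions the article describes look like the following, written to a scratch file here for illustration; the install path is an assumption, since the article's path is truncated:

```shell
# Environment variables for HBase (path /usr/local/hbase is assumed).
# On a real machine these lines go into /etc/profile, then `source /etc/profile`.
cat > /tmp/hbase_profile.sh <<'EOF'
export HBASE_HOME=/usr/local/hbase
export PATH=$PATH:$HBASE_HOME/bin
EOF
```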

Posted by vadercole on Tue, 03 Dec 2019 01:49:01 -0800

Hadoop MapReduce case - statistics of mobile phone uplink traffic, downlink traffic and total traffic

The log format is as follows. Required fields: the second column is the mobile number (user), the third-from-last column is the uplink traffic, and the second-from-last column is the downlink traffic. Train of thought: encapsulate the uplink traffic, downlink traffic, and total traffic into a bean object. In the map, context.write (cell phone number ...
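The aggregation logic can be sketched in plain shell with awk: the phone number (2nd column) is the key, and the 3rd-from-last and 2nd-from-last columns are summed per key, matching the field positions the article names. The two sample log lines are made up:

```shell
# Per-phone sums of uplink ($(NF-2)), downlink ($(NF-1)), and their total,
# imitating what the MapReduce job computes.
printf '1 13726230503 a b 2481 24681 200\n1 13726230503 a b 100 200 200\n' \
  | awk '{up[$2]+=$(NF-2); down[$2]+=$(NF-1)}
         END {for (p in up) print p, up[p], down[p], up[p]+down[p]}'
# For this input: uplink 2481+100=2581, downlink 24681+200=24881, total 27462.
```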

Posted by cloudbase on Fri, 29 Nov 2019 09:44:30 -0800

Hadoop HA double namenode construction

Machine distribution: hadoop1 192.168.56.121, hadoop2 192.168.56.122, hadoop3 192.168.56.123. Prepare the installation packages jdk-7u71-linux-x64.tar.gz, zookeeper-3.4.9.tar.gz, hadoop-2.9.2.tar.gz. Upload the installation packages to the /usr/local directory of all three machines and extract them. Configure hosts: echo "192.168.56.121 hadoop1" >> /etc/ ...
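The hosts entries from the machine distribution above can be generated in one go; they are written to a scratch file here, whereas on the real machines they are appended to /etc/hosts:

```shell
# Name-to-IP mapping for the three HA nodes listed in the article.
cat > /tmp/hosts.hadoop <<'EOF'
192.168.56.121 hadoop1
192.168.56.122 hadoop2
192.168.56.123 hadoop3
EOF
```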

Posted by bungychicago on Fri, 29 Nov 2019 09:16:17 -0800

Running Hadoop in Docker and building the image

A Hadoop cluster depends on the following software: JDK, ssh, etc., so only these two items plus the Hadoop-related packages need to be put into the image. Profile preparation: 1. Hadoop-related configuration files: core-site.xm ...
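A minimal Dockerfile sketch matching the dependencies listed (JDK, ssh, Hadoop), written to a scratch file here; the base image and version numbers are illustrative assumptions, not the article's exact choices:

```shell
# Generate a minimal Hadoop image definition: JDK + sshd + the Hadoop tarball.
cat > /tmp/Dockerfile.hadoop <<'EOF'
FROM ubuntu:18.04
RUN apt-get update && apt-get install -y openjdk-8-jdk openssh-server
# ADD auto-extracts the tarball into /usr/local
ADD hadoop-2.9.2.tar.gz /usr/local/
ENV HADOOP_HOME=/usr/local/hadoop-2.9.2
ENV PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
EOF
```

The configuration files the article prepares (core-site.xml and friends) would be COPYed into `$HADOOP_HOME/etc/hadoop/` in the same Dockerfile.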

Posted by Elhombrebala on Mon, 25 Nov 2019 06:31:54 -0800

Configuring Hadoop in Linux Environment

Configure Hadoop 2.8.5 on Ubuntu 16. Set up SSH password-free login: $ sudo apt-get install openssh-server # install the SSH server $ ssh localhost # log in over SSH; enter yes the first time $ exit # exit the ssh localhost session $ cd ~/.ssh/ # If you ...

Posted by accu on Sat, 23 Nov 2019 12:30:15 -0800

Eclipse integrated hadoop plug-in development environment

First, set up the hadoop environment under win10 so that hadoop can run. Extract the installation package and source package of Hadoop 2.7.7; after decompression, create an empty directory, and copy into it all the jar packages under share/hadoop in the installation package (and the other packages, except the kms directory) together with the corresponding packages from the source package ...
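The jar-collection step above can be sketched with find and cp: gather every share/hadoop jar except those under the kms directory into one folder for Eclipse's build path. A fake Hadoop 2.7.7 tree is created first so the snippet is self-contained; real paths will differ:

```shell
# Build a tiny fake Hadoop tree (one normal jar, one kms jar) for illustration.
HADOOP_HOME=/tmp/hadoop-2.7.7
mkdir -p "$HADOOP_HOME/share/hadoop/common" "$HADOOP_HOME/share/hadoop/kms"
touch "$HADOOP_HOME/share/hadoop/common/hadoop-common-2.7.7.jar" \
      "$HADOOP_HOME/share/hadoop/kms/kms.jar"

# Collect every jar under share/hadoop, skipping anything in a kms directory.
mkdir -p /tmp/hadoop-eclipse-libs
find "$HADOOP_HOME/share/hadoop" -name '*.jar' -not -path '*/kms/*' \
  -exec cp {} /tmp/hadoop-eclipse-libs/ \;
ls /tmp/hadoop-eclipse-libs   # hadoop-common-2.7.7.jar is copied; kms.jar is not
```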

Posted by ThunderVike on Tue, 19 Nov 2019 13:38:13 -0800