XML file error: the processing instruction target matching "[xX][mM][lL]" is not allowed

Problem description: In Eclipse, an XML file reported the following error: The processing instruction target matching "[xX][mM][lL]" is not allowed. The file in question is empty-configuration.xml in the hadoop-common project of the Hadoop 2.7.1 source code. The directory of this file in the project is as f ...
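This Eclipse error usually means something (even a blank line or stray whitespace) precedes the `<?xml ?>` declaration. The article's own fix is not shown in the teaser; as a hedged illustration only, Python's `ElementTree` rejects the same malformed input (its error message differs from Eclipse's):

```python
# Illustration (not the article's code): a parser rejects any content
# before the <?xml ?> declaration -- the usual cause of the
# "[xX][mM][lL]" processing-instruction error in Eclipse.
import xml.etree.ElementTree as ET

good = '<?xml version="1.0"?><configuration/>'
bad = '\n<?xml version="1.0"?><configuration/>'  # leading newline before the declaration

ET.fromstring(good)  # parses fine

try:
    ET.fromstring(bad)
except ET.ParseError as e:
    print("parse error:", e)
```

Removing everything before the `<?xml` declaration makes the file parse again.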

Posted by Rayman3.tk on Fri, 03 Jan 2020 21:27:30 -0800

18 19 20 Python if statements, comparison operators, and assertions

Lesson 4: Conditional statements (if, else, and elif) # coding:utf-8 — Python delimits code blocks by indentation # Conditional statements (if, else, and elif) ''' if logic_expression: statement1 statement2 statement3 ... ... statementn elif logic_expression: statement1 statement2 ... ... ...
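The skeleton in the teaser can be made runnable; this small sketch (the `classify` function and thresholds are illustrative, not from the lesson) shows indentation-delimited blocks and an if/elif/else chain:

```python
# A runnable version of the if/elif/else skeleton above:
# each indented block runs only when its condition is the first true one.
def classify(score):
    if score >= 90:
        grade = "A"
    elif score >= 60:
        grade = "B"
    else:
        grade = "C"
    return grade

print(classify(95))  # A
print(classify(72))  # B
print(classify(30))  # C
```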

Posted by berbbrown on Fri, 03 Jan 2020 16:59:05 -0800

Building Hadoop HA (high availability)

Step 1: cluster planning. Step 2: set hosts. Step 3: turn off the firewall. Step 4: turn off SELinux. Step 5: set up passwordless login. Step 6: install the JDK. # decompress tar -xvf jdk-8u131-linux-x64.tar.gz mv jdk1.8.0_131 /usr/local/jdk1.8 # Set environment variables vim /etc/profile JAVA_HOME=/usr/local/jdk1.8/ JAVA_BIN=/usr/local/ ...
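The environment-variable step above can be sanity-checked. As a sketch only (the `build_profile_env` helper is hypothetical; the paths mirror the excerpt's /etc/profile lines), this mimics what those exports produce:

```python
# Hypothetical helper mirroring the /etc/profile exports in the excerpt:
# JAVA_HOME points at the unpacked JDK, and its bin/ dir is prepended to PATH.
import os

def build_profile_env(java_home="/usr/local/jdk1.8", old_path="/usr/bin"):
    java_bin = os.path.join(java_home, "bin")
    return {
        "JAVA_HOME": java_home,
        "JAVA_BIN": java_bin,
        "PATH": java_bin + ":" + old_path,  # POSIX PATH separator
    }

env = build_profile_env()
print(env["PATH"])  # /usr/local/jdk1.8/bin:/usr/bin
```

After `source /etc/profile`, `java -version` should resolve to the new JDK because its bin/ directory comes first on PATH.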

Posted by scriptkiddie on Fri, 03 Jan 2020 09:30:56 -0800

Integrating the HBase library into PHP's Yii2 framework

HBase provides multi-language access through Thrift, a cross-language RPC framework. HBase has two sets of Thrift interfaces (thrift1 and thrift2), which are not compatible with each other; according to the official documentation, thrift1 is likely to be abandoned. This article takes thrift2 integration as its example. 1. Visit http://thrift.apache.org/download to ...

Posted by depojones on Tue, 31 Dec 2019 22:15:31 -0800

nginx deployment and installation

1. When learning nginx, there is no need to dwell on installation; it is actually very simple, and a single shell script can do it. Refer to the following: execute the nginx-install.sh script as the root user. The script is as follows: #!/bin/bash set -o nounset basedir=$(cd "$(dirname "$0")"; pwd) # Set user name and password sys_user=hadoop sy ...
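The script line `basedir=$(cd "$(dirname "$0")"; pwd)` resolves the directory the script itself lives in, so relative paths inside the script keep working regardless of where it is invoked from. The same idiom, sketched in Python for illustration (the helper name and path are mine, not the script's):

```python
# Python equivalent of the shell idiom  basedir=$(cd "$(dirname "$0")"; pwd):
# take the script's own path, drop the filename, and make it absolute.
import os

def script_basedir(script_path):
    return os.path.abspath(os.path.dirname(script_path))

print(script_basedir("/opt/scripts/nginx-install.sh"))  # /opt/scripts
```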

Posted by bk6662 on Mon, 30 Dec 2019 12:26:23 -0800

The Hadoop learning path: completing wordcount with a MapReduce program

Test text data used by the program: Dear River Dear River Bear Spark Car Dear Car Bear Car Dear Car River Car Spark Spark Dear Spark 1. Main classes (1) Mapper class The first is the custom Mapper class code public class WordCountMap extends Mapper<LongWritable, Text, Text, IntWritable> { public void map(LongWritable key, Text val ...
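The article's Mapper/Reducer pair is in Java; as a plain-Python sketch of the same logic (not the article's code), the map phase emits a (word, 1) pair per word and the reduce phase sums the counts per key, using the teaser's own test text:

```python
# Plain-Python sketch of wordcount: "map" emits (word, 1) pairs,
# "reduce" sums the pairs grouped by word.
from collections import Counter

text = ("Dear River Dear River Bear Spark Car Dear Car Bear "
        "Car Dear Car River Car Spark Spark Dear Spark")

# map phase: one (word, 1) pair per word
pairs = [(word, 1) for word in text.split()]

# reduce phase: sum counts per key
counts = Counter()
for word, one in pairs:
    counts[word] += one

print(dict(counts))  # {'Dear': 5, 'River': 3, 'Bear': 2, 'Spark': 4, 'Car': 5}
```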

Posted by SundayDriver on Fri, 27 Dec 2019 02:19:33 -0800

Writing UDFs for the STRUCT complex data type, and writing a GenericUDF

1. Background: With the upgrade to MaxCompute 2.0, the data types supported by Java UDFs have expanded from BIGINT, STRING, DOUBLE, and BOOLEAN to more basic data types, as well as complex types such as ARRAY, MAP, and STRUCT, and Writable parameters. When a Java UDF uses complex data types, STRUCT corresponds to com.aliyun.odps ...

Posted by mnewbegin on Mon, 23 Dec 2019 19:19:12 -0800

9. User-defined FileOutputFormat for classified output

1. Requirement: The output path of an MR job's map and reduce is the path specified by FileOutputFormat.setOutputPath(), but sometimes the code needs to output results by category, for example writing error information to one file and the correct output to another. To do this, you need to write a custom subclass of FileOutputFormat to cla ...
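The routing the custom FileOutputFormat is meant to do can be sketched outside Hadoop. This minimal stand-in in plain Python (file names, the "ERROR" marker, and the `split_by_class` helper are all illustrative assumptions) routes error records to one file and everything else to another:

```python
# Minimal stand-in for classified output: error records go to error.log,
# everything else goes to correct.log.
import os
import tempfile

def split_by_class(lines, out_dir):
    err_path = os.path.join(out_dir, "error.log")
    ok_path = os.path.join(out_dir, "correct.log")
    with open(err_path, "w") as err, open(ok_path, "w") as ok:
        for line in lines:
            target = err if "ERROR" in line else ok
            target.write(line + "\n")
    return err_path, ok_path

out_dir = tempfile.mkdtemp()
err_path, ok_path = split_by_class(
    ["INFO job started", "ERROR disk full", "INFO job done"], out_dir)
print(open(err_path).read())  # ERROR disk full
```

In real Hadoop code the same effect is usually achieved by overriding the output format's record writer (or via the MultipleOutputs helper class) so each record is written to the file chosen by its category.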

Posted by jonathanellis on Sat, 14 Dec 2019 12:29:26 -0800

Everything is a map: the essence of code

Dynamically execute simple code by generating Java source files, invoking javac to compile them, and executing them via reflection. Use an I/O stream (or, put another way, use reflection to get the program results to parse) to parse *.java files. You can then use Runtime to call the java compile command under DOS to comp ...
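The article's pipeline is Java-specific (write a .java file, shell out to javac, load and invoke via reflection). The same idea of turning source text into executable code at runtime can be sketched with Python's built-in `compile()` and `exec()` (an illustration of the concept, not the article's code):

```python
# Turn source text into executable code at runtime:
# compile() plays the role of javac, exec() the role of class loading,
# and the namespace lookup the role of reflection.
source = """
def greet(name):
    return "Hello, " + name
"""

namespace = {}
code = compile(source, "<generated>", "exec")  # "compile" the source text
exec(code, namespace)                          # "load" the compiled code
print(namespace["greet"]("world"))             # Hello, world
```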

Posted by olko on Thu, 12 Dec 2019 10:17:19 -0800

Using Docker to install Hadoop and Spark

Use Docker to configure and install Hadoop and Spark, installing the Hadoop and Spark images separately. Installing the Hadoop image: the Docker image selected (see Mirror Address); the version of Hadoop provided by this image is relatively new, and JDK 8 is installed, which supports installing the latest version of Spark. docker pull uhopp ...

Posted by shantred on Tue, 10 Dec 2019 22:46:53 -0800