Hadoop installation steps
Hadoop download address: http://mirrors.advancedhosters.com/apache/hadoop/common/
1. Install the JDK and Hadoop (Hadoop 3.x requires JDK 1.8 or later)
Install Hadoop as the root user.
Create the installation directories and unpack the archives:
mkdir /usr/local/java
mkdir /usr/local/hadoop
mkdir /usr/local/hadoop/hadoop_tmp
tar -zxvf jdk-8u201-linux-x64.tar.gz
cd jdk1.8.0_201
mv * /usr/local/java
cd ..
tar -zxvf hadoop-3.0.3.tar.gz
cd hadoop-3.0.3
mv * /usr/local/hadoop/
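The directory setup and unpacking above can be sketched as one script. This is a sketch under assumptions taken from this guide (the tarball names and target paths); it also assumes GNU tar for `--strip-components`. By default it dry-runs into a scratch directory; set PREFIX=/usr/local (as root) to install for real.

```shell
# Sketch of the steps above. PREFIX defaults to a scratch directory
# for a safe dry run; set PREFIX=/usr/local (as root) to install for real.
set -e
PREFIX="${PREFIX:-$(mktemp -d)}"
mkdir -p "$PREFIX/java" "$PREFIX/hadoop/hadoop_tmp"
for pair in jdk-8u201-linux-x64.tar.gz:java hadoop-3.0.3.tar.gz:hadoop; do
  file="${pair%%:*}"; dest="${pair##*:}"
  if [ -f "$file" ]; then
    # --strip-components=1 (GNU tar) drops the archive's top-level
    # directory, replacing the unpack-then-"mv *" steps above.
    tar -zxf "$file" -C "$PREFIX/$dest" --strip-components=1
  fi
done
echo "layout ready under $PREFIX"
```

Missing tarballs are skipped, so the directory layout can be checked before touching /usr/local.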
2. Configure environment variables
vi /etc/profile
Add the following at the end:
export JAVA_HOME=/usr/local/java
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib:$HADOOP_COMMON_LIB_NATIVE_DIR"
Apply the changes:
source /etc/profile
3. Edit the Hadoop configuration files
vi /usr/local/hadoop/etc/hadoop/hadoop-env.sh
Add at the end:
export JAVA_HOME=/usr/local/java
vi /usr/local/hadoop/etc/hadoop/core-site.xml
Add between the <configuration> and </configuration> tags:
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/usr/local/hadoop/hadoop_tmp</value>
</property>
Note: fs.default.name is deprecated; fs.defaultFS is the current name for the same setting, and either works here.
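Hand-editing repeated `<property>` stanzas is error-prone. A tiny hypothetical helper (hadoop_property is this guide's invention, not part of Hadoop) can print them consistently for pasting into the files below:

```shell
# Hypothetical helper: print one Hadoop-style <property> element.
hadoop_property() {  # hadoop_property <name> <value>
  printf '<property>\n  <name>%s</name>\n  <value>%s</value>\n</property>\n' "$1" "$2"
}

# Example settings from this guide:
hadoop_property fs.default.name hdfs://localhost:9000
hadoop_property dfs.replication 1
```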
vi /usr/local/hadoop/etc/hadoop/hdfs-site.xml
Add between the <configuration> and </configuration> tags:
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
<property>
  <name>dfs.http.address</name>
  <value>192.168.56.123:50070</value>
</property>
Note: 192.168.56.123 is this machine's IP address; change it to match your environment.
vi /usr/local/hadoop/etc/hadoop/yarn-site.xml
Add between the <configuration> and </configuration> tags:
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
vi /usr/local/hadoop/etc/hadoop/mapred-site.xml
Add between the <configuration> and </configuration> tags:
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
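After editing all four files, a quick sanity check can confirm each key setting landed. The check function and the CONF default below are this guide's assumptions, not Hadoop tooling:

```shell
# Report whether each config file contains its key setting.
CONF="${CONF:-/usr/local/hadoop/etc/hadoop}"
check() {  # check <file> <expected-string>
  if grep -q "$2" "$CONF/$1" 2>/dev/null; then
    echo "$1: OK"
  else
    echo "$1: missing '$2'"
  fi
}
check core-site.xml   'hdfs://localhost:9000'
check hdfs-site.xml   'dfs.replication'
check yarn-site.xml   'mapreduce_shuffle'
check mapred-site.xml 'mapreduce.framework.name'
```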
Add the following lines to the top of /usr/local/hadoop/sbin/start-dfs.sh and stop-dfs.sh:
HDFS_DATANODE_USER=root
HDFS_DATANODE_SECURE_USER=hdfs
HDFS_NAMENODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root
Add the following lines to the top of /usr/local/hadoop/sbin/start-yarn.sh and stop-yarn.sh:
YARN_RESOURCEMANAGER_USER=root
HADOOP_SECURE_DN_USER=yarn
YARN_NODEMANAGER_USER=root
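The four edits above can also be made non-interactively. This sketch assumes GNU sed's `-i` and uses `1a` (append after line 1) so each script's shebang stays on the first line; SBIN defaults to a scratch directory with stand-in files so it is safe to dry-run, and you would set SBIN=/usr/local/hadoop/sbin to apply it for real.

```shell
SBIN="${SBIN:-$(mktemp -d)}"
# Create empty stand-ins if the real scripts are absent (dry run).
for f in start-dfs.sh stop-dfs.sh start-yarn.sh stop-yarn.sh; do
  [ -f "$SBIN/$f" ] || echo '#!/usr/bin/env bash' > "$SBIN/$f"
done
# Insert after line 1 so the shebang stays first.
for f in start-dfs.sh stop-dfs.sh; do
  sed -i '1a\
HDFS_DATANODE_USER=root\
HDFS_DATANODE_SECURE_USER=hdfs\
HDFS_NAMENODE_USER=root\
HDFS_SECONDARYNAMENODE_USER=root' "$SBIN/$f"
done
for f in start-yarn.sh stop-yarn.sh; do
  sed -i '1a\
YARN_RESOURCEMANAGER_USER=root\
HADOOP_SECURE_DN_USER=yarn\
YARN_NODEMANAGER_USER=root' "$SBIN/$f"
done
```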
4. Verify hadoop
1. Set up passwordless SSH login to the local machine
ssh-keygen -t rsa
Press Enter at each prompt to accept the defaults.
cat /root/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
2. Format the file system
hdfs namenode -format
3. Start the daemons
start-dfs.sh
start-yarn.sh
4. Open the web UIs in a browser:
localhost:50070
localhost:8088
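The two pages can also be probed from the command line. The probe helper below is a sketch that assumes curl is installed and the ports above are in use; adjust the host and ports to match your hdfs-site.xml settings.

```shell
# Print whether an HTTP endpoint answers within 5 seconds.
probe() {  # probe <host:port>
  if curl -sf -o /dev/null --max-time 5 "http://$1/"; then
    echo "$1 up"
  else
    echo "$1 down"
  fi
}
probe localhost:50070   # HDFS NameNode web UI
probe localhost:8088    # YARN ResourceManager web UI
```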
5. Problem Resolution
1. Warning: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Resolution: edit /usr/local/hadoop/etc/hadoop/log4j.properties and add at the end:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
2. Warning: a "Permanently added ... (RSA) to the list of known hosts" message appears on every passwordless SSH login, which is noisy. Resolve it as follows:
vim /etc/ssh/ssh_config (set this on both master and slave1)
Find the line #StrictHostKeyChecking ask, uncomment it, and change ask to no.
3. Start Hadoop automatically at boot
vi /etc/rc.d/rc.local
Add the line:
/usr/local/hadoop/sbin/start-all.sh
Make sure the file is executable: chmod +x /etc/rc.d/rc.local
Hbase installation steps
HBase Download: https://mirrors.tuna.tsinghua.edu.cn/apache/hbase/
1. Create a hadoop user
useradd hadoop
mkdir /usr/local/hbase
chown hadoop:hadoop /usr/local/hbase
Install HBase as the hadoop user.
2. Create installation directory and unzip files
su - hadoop
cd /usr/local/hbase/
cp /usr/hbase-2.0.4-bin.tar.gz .
tar -zxvf hbase-2.0.4-bin.tar.gz
3. Set environment variables
export HBASE_HOME=/usr/local/hbase
vi /usr/local/hbase/conf/hbase-env.sh
Add at the end:
export JAVA_HOME=/usr/local/java
export HBASE_MANAGES_ZK=true
su - root
vi /etc/profile
Add at the end:
export HBASE_HOME=/usr/local/hbase
export HBASE_CONF_DIR=$HBASE_HOME/conf
export HBASE_CLASS_PATH=$HBASE_CONF_DIR
export PATH=$PATH:$HBASE_HOME/bin
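The profile additions above can be appended in one step with a here-document. PROFILE is parameterised here (a scratch file by default) so the snippet is safe to try anywhere; set PROFILE=/etc/profile as root to apply it, then run source /etc/profile.

```shell
PROFILE="${PROFILE:-$(mktemp)}"
# Single-quoted EOF keeps $HBASE_HOME etc. literal in the file.
cat >> "$PROFILE" <<'EOF'
export HBASE_HOME=/usr/local/hbase
export HBASE_CONF_DIR=$HBASE_HOME/conf
export HBASE_CLASS_PATH=$HBASE_CONF_DIR
export PATH=$PATH:$HBASE_HOME/bin
EOF
echo "appended HBase variables to $PROFILE"
```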
4. Create a directory for HBase to store data and set it in the HBase configuration file
mkdir -p /usr/local/hbase/var/hbase
vi /usr/local/hbase/conf/hbase-site.xml
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///usr/local/hbase/var/hbase</value>
  </property>
</configuration>
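Equivalently, the file can be written in one step. HBASE_CONF defaults to a scratch directory here for a safe dry run; use /usr/local/hbase/conf on the real install.

```shell
HBASE_CONF="${HBASE_CONF:-$(mktemp -d)}"
# Write the standalone-mode hbase-site.xml shown above.
cat > "$HBASE_CONF/hbase-site.xml" <<'EOF'
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///usr/local/hbase/var/hbase</value>
  </property>
</configuration>
EOF
echo "wrote $HBASE_CONF/hbase-site.xml"
```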
5. Start HBase in standalone mode
cd $HBASE_HOME/bin
./start-hbase.sh
vi .bash_profile
Add:
export HBASE_HOME=/usr/local/hbase
export PATH=$HBASE_HOME/bin:$PATH
export HBASE_CLASSPATH=/usr/local/hbase/conf
6. Open the web UI in a browser to verify:
localhost:16010
7. Log on to HBase
$ hbase shell
2017-07-20 09:33:19,959 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 1.2.6, rUnknown, Mon May 29 02:25:32 CDT 2017
hbase(main):001:0>