Prerequisites
- Log in to MySQL as the hive user (mysql -uhive -phive) and create the Hive metastore database:
mysql> create database hive character set utf8;
mysql> flush privileges;
mysql> show databases;
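If the hive MySQL account does not exist yet, it has to be created first. A sketch of the typical statements follows (an assumption, not part of the original steps — the user name and password 'hive' are chosen to match the ConnectionUserName/ConnectionPassword set later in hive-site.xml; pipe the output into `mysql -uroot -p` on your MySQL host):

```shell
# Sketch (assumption): statements that create the hive account used above.
# Adjust the host pattern '%' and the password to your environment.
sql="CREATE USER IF NOT EXISTS 'hive'@'%' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';
FLUSH PRIVILEGES;"
printf '%s\n' "$sql"
```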
Installation
- Start installing hive on hadoop1
tar -zxvf apache-hive-1.2.1-bin.tar.gz -C /usr/local/src/
cd /usr/local/src/
mv apache-hive-1.2.1-bin/ hive-1.2.1
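The unpack-and-rename flow above can be sketched end to end on throwaway paths (a dummy archive stands in for the real apache-hive-1.2.1-bin.tar.gz, and temp directories stand in for /usr/local/src/):

```shell
# Sketch: simulate the unpack-and-rename step with a dummy archive.
src="$(mktemp -d)"; dst="$(mktemp -d)"
mkdir -p "$src/apache-hive-1.2.1-bin/lib"
tar -czf "$src/hive.tar.gz" -C "$src" apache-hive-1.2.1-bin

tar -zxf "$src/hive.tar.gz" -C "$dst"        # stands in for: tar -zxvf ... -C /usr/local/src/
mv "$dst/apache-hive-1.2.1-bin" "$dst/hive-1.2.1"
ls "$dst"                                    # only hive-1.2.1 remains
```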
- Copy mysql-connector-java-5.1.25.jar to hive-1.2.1/lib/
- Copy the jline jar shipped with Hive into Hadoop's YARN lib directory (Hive 1.2 requires jline 2.12, which conflicts with the older jline Hadoop bundles):
cp /usr/local/src/hive-1.2.1/lib/jline-2.12.jar /home/hadoop/apps/hadoop-2.8.0/share/hadoop/yarn/lib/
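The jar-copy step can be sketched on throwaway directories (stand-ins for hive-1.2.1/lib and Hadoop's yarn/lib; removing the stale jline jar afterwards is a common extra step, not stated in the original text):

```shell
# Sketch: replace Hadoop's old jline with Hive's jline-2.12 (throwaway paths).
hive_lib="$(mktemp -d)"; yarn_lib="$(mktemp -d)"
touch "$hive_lib/jline-2.12.jar" "$yarn_lib/jline-0.9.94.jar"

cp "$hive_lib/jline-2.12.jar" "$yarn_lib/"
rm -f "$yarn_lib/jline-0.9.94.jar"   # drop the stale jline (assumed cleanup step)
ls "$yarn_lib"
```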
- Configure environment variables in /etc/profile. Open the file with:
sudo vi /etc/profile
Set the following parameters:
export HIVE_HOME=/usr/local/src/hive-1.2.1
export PATH=$PATH:$HIVE_HOME/bin
export CLASSPATH=$CLASSPATH:$HIVE_HOME/bin
Run source /etc/profile to make the changes take effect.
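How the corrected exports compose can be checked in a subshell, so the real environment is untouched (illustrative only; the $ signs before PATH, CLASSPATH, and HIVE_HOME on the right-hand side are what make the appending work):

```shell
# Sketch: append Hive's bin directory to PATH and verify the result.
HIVE_HOME=/usr/local/src/hive-1.2.1
PATH="$PATH:$HIVE_HOME/bin"
CLASSPATH="${CLASSPATH:-}:$HIVE_HOME/bin"

# The last PATH entry should now be Hive's bin directory.
last_path_entry="$(printf '%s' "$PATH" | tr ':' '\n' | tail -n 1)"
echo "$last_path_entry"
```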
Modify hive's configuration file
cd /usr/local/src/hive-1.2.1/conf/
cp hive-env.sh.template hive-env.sh
cp hive-default.xml.template hive-site.xml
- Hive defaults to the embedded Derby database, which allows only a single user connection at a time, so the metastore must be switched to MySQL. Replace 10.3.26.113 below with the actual IP of your MySQL server.
vi hive-site.xml
Modify the following
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://10.3.26.113:3306/hive?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>Username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
  <description>password to use against metastore database</description>
</property>
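A malformed JDBC URL (for example a stray character between ? and the first parameter) is a common cause of metastore startup failures. A quick sanity check of the URL string can be sketched like this (a hypothetical check, not part of the Hive tooling):

```shell
# Sketch: verify the metastore JDBC URL carries createDatabaseIfNotExist=true.
url="jdbc:mysql://10.3.26.113:3306/hive?createDatabaseIfNotExist=true"
case "$url" in
  *\?createDatabaseIfNotExist=true*) verdict="ok" ;;
  *) verdict="malformed" ;;
esac
echo "$verdict"
```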
- Create a directory for Hive's temporary files and logs:
mkdir /usr/local/src/hive-1.2.1/tmp
vi hive-site.xml
Modify the following
<property>
  <name>hive.querylog.location</name>
  <value>/usr/local/src/hive-1.2.1/tmp</value>
  <description>Location of Hive run time structured log file</description>
</property>
<property>
  <name>hive.exec.scratchdir</name>
  <value>/usr/local/src/hive-1.2.1/tmp</value>
</property>
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/usr/local/src/hive-1.2.1/tmp</value>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/usr/local/src/hive-1.2.1/tmp</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>
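All four properties above point at the same scratch directory, so it is worth confirming the directory exists and is writable by the user that will run Hive before starting anything. A sketch on a throwaway path (stand-in for /usr/local/src/hive-1.2.1/tmp):

```shell
# Sketch: create the scratch directory and confirm it is writable.
scratch="$(mktemp -d)/tmp"   # stand-in for /usr/local/src/hive-1.2.1/tmp
mkdir -p "$scratch"
if [ -w "$scratch" ]; then status="writable"; else status="not writable"; fi
echo "$status"
```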
- Start Hadoop (Hive needs HDFS and YARN running before it can start):
start-all.sh