HDFS delegation token could not be found in cache during Hive group by execution

When I run a Hive query, an error occurs. As soon as the query contains a group by statement, it throws: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 8, vertexId=vertex_1530387194612_0030_4_00, diagnostics=[Vertex ...
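Return code 2 from TezTask is only a generic wrapper; the underlying cause (here, presumably the missing HDFS delegation token) shows up in the YARN application logs. A sketch of pulling them, assuming the application id can be read off the vertex id in the error:

    # vertex_1530387194612_0030_4_00 implies application_1530387194612_0030
    yarn logs -applicationId application_1530387194612_0030 | grep -i token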

Posted by centerwork on Tue, 12 Mar 2019 04:57:25 -0700

Examples of Sqoop Incremental Import, Export, and Job Operations

Incremental import: append mode on an incremental column. # First import [root@node222 ~]# /usr/local/sqoop-1.4.7/bin/sqoop import --connect jdbc:mysql://192.168.0.200:3306/sakila?useSSL=false --table actor --where "actor_id < 50" --username sakila -P --num-mappers 1 --target-dir /tmp/hive/sqoop/actor_all ... 18/10/1 ...
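A minimal sketch of the follow-up incremental run, reusing the connection details from the excerpt; --last-value 49 assumes the first import stopped at actor_id 49, per the --where "actor_id < 50" filter:

    # Second import: append only rows whose actor_id exceeds the last value
    /usr/local/sqoop-1.4.7/bin/sqoop import \
      --connect "jdbc:mysql://192.168.0.200:3306/sakila?useSSL=false" \
      --table actor --username sakila -P \
      --num-mappers 1 --target-dir /tmp/hive/sqoop/actor_all \
      --incremental append --check-column actor_id --last-value 49

Sqoop prints the new upper bound for --last-value at the end of the run; a saved sqoop job records it automatically for the next invocation.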

Posted by zak on Sat, 02 Feb 2019 12:21:16 -0800

Hive Integration with HBase in Detail

Reproduced from: https://www.cnblogs.com/MOBIN/p/5704001.html 1. Create HBase tables from Hive. Create a Hive table pointing to HBase using the HQL statement CREATE TABLE hbase_table_1(key int, value string) //Table name hbase_table_1 in Hive STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' //Designated Storage P ...
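For context, a sketch of the complete statement following the standard pattern from the Hive wiki; the cf1 column family and the xyz HBase table name are illustrative:

    hive -e '
      CREATE TABLE hbase_table_1(key int, value string)
      STORED BY "org.apache.hadoop.hive.hbase.HBaseStorageHandler"
      WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
      TBLPROPERTIES ("hbase.table.name" = "xyz");'

In hbase.columns.mapping, :key binds the Hive key column to the HBase row key, and each remaining entry maps a Hive column to a family:qualifier pair.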

Posted by maxpagels on Sat, 02 Feb 2019 02:45:15 -0800

Hive Metastore database initialization failure: java.io.IOException: Schema script failed, errorcode 2

Traceback (most recent call last): File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 259, in <module> HiveMetastore().execute() File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute method(env) File ...
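The failing step corresponds to Hive's schema initialization, which can be inspected and rerun by hand with schematool; a sketch assuming a MySQL-backed metastore configured in hive-site.xml:

    # report the schema version and connection settings currently in effect
    $HIVE_HOME/bin/schematool -dbType mysql -info
    # rerun the schema scripts; a failed run can leave a partial schema behind,
    # so drop and recreate the metastore database before retrying
    $HIVE_HOME/bin/schematool -dbType mysql -initSchema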

Posted by aceconcepts on Thu, 31 Jan 2019 16:00:15 -0800

Hive MR execution error: Call From hadoop01/192.168.80.128 to 0.0.0.0:10020 failed on connection exception

Today, 42 jobs were launched while using Hive to run MR. Halfway through execution, the following errors were suddenly reported; I had never encountered them before and don't know whether too many jobs is the cause. The error says that port 10020 could not be accessed from the host. Check the reason on ...
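Port 10020 is the default RPC port of the MapReduce JobHistory server (mapreduce.jobhistory.address in mapred-site.xml); a sketch of the usual fix, assuming the history server simply is not running:

    # start the JobHistory server on the node named in mapreduce.jobhistory.address
    $HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver
    # confirm something is now listening on 10020
    netstat -tnlp | grep 10020

The 0.0.0.0 in the error also suggests the client has no mapreduce.jobhistory.address configured and is falling back to the default.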

Posted by pha3dr0n on Thu, 31 Jan 2019 13:30:15 -0800

Operating Hive through the Java API, from the Big Data Introduction Tutorial Series

To access Hive from Java, you connect through hiveserver2. hiveserver2 provides a new command-line tool, beeline, and upgrades the previous Hive CLI with more powerful functions, including permission control. To use beeline, you need to start hiveserver2 first and then connect with it. Operation steps: (1) Modi ...
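A minimal sketch of that flow; the hadoop user name and the default port 10000 are assumptions:

    # start hiveserver2 first (it listens on port 10000 by default)
    $HIVE_HOME/bin/hiveserver2 &
    # connect with beeline; Java programs use the same jdbc:hive2:// URL
    # through the org.apache.hive.jdbc.HiveDriver JDBC driver
    $HIVE_HOME/bin/beeline -u jdbc:hive2://localhost:10000/default -n hadoop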

Posted by blacklotus on Wed, 30 Jan 2019 02:54:15 -0800

Installing Hive 2.3.3 for Hadoop 2.9.1 on Ubuntu 16.04

Installing Hive 2.3.3 for Hadoop 2.9.1 on Ubuntu 16.04. Preface: http://hive.apache.org/downloads.html has the compatibility notes: Hadoop 3.x needs Hive 3.0.0, and Hadoop 2.x needs Hive 2.3.3. Since my Hadoop is 2.9, I chose to download Hive 2.3.3. Hive is a Hadoop tool, so it only needs to be installed on the NameNode, not on the DataN ...
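A condensed sketch of the install on the NameNode; the download URL points at the Apache archive and the paths are illustrative:

    wget https://archive.apache.org/dist/hive/hive-2.3.3/apache-hive-2.3.3-bin.tar.gz
    tar -zxf apache-hive-2.3.3-bin.tar.gz -C /usr/local
    export HIVE_HOME=/usr/local/apache-hive-2.3.3-bin
    export PATH=$PATH:$HIVE_HOME/bin
    # Hive 2.x requires an explicit metastore init before first use
    schematool -dbType derby -initSchema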

Posted by longtone on Fri, 25 Jan 2019 04:45:13 -0800

Developing and Using Hive UDFs

Recently, a data mining requirement came up: counting the number of items within n kilometers of a given longitude and latitude. Since this involves calculating the distance between two points on the earth, a UDF had to be written to compute that distance. I. UDF Writing A ...
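A sketch of compiling and registering such a distance UDF; the GeoDistance class name, the jar path, and the shops table are all illustrative:

    # compile against the hive-exec jar, which provides the UDF base class
    javac -cp $HIVE_HOME/lib/hive-exec-2.3.3.jar GeoDistance.java
    jar -cf /tmp/geo-udf.jar GeoDistance.class
    hive -e 'ADD JAR /tmp/geo-udf.jar;
      CREATE TEMPORARY FUNCTION geo_distance AS "GeoDistance";
      SELECT count(*) FROM shops
      WHERE geo_distance(lat, lng, 39.9042, 116.4074) < 5.0;'

The comparison against 5.0 assumes the UDF returns its result in kilometers.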

Posted by HairyScotsman on Sun, 20 Jan 2019 08:12:12 -0800

Using Sqoop to import data from MySQL into HDFS, HBase, and Hive

The first part is a reprinted article that introduces the ways and means of transferring data from MySQL to HDFS, HBase, and Hive. The rest is an example from my own projects of importing MySQL data into HDFS, for readers' reference. 1. Testing MySQL connections bin/sqoop list-databases --connect jdbc:mysql://192.16 ...
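A sketch of both steps with placeholder connection details; <mysql-host>, the testdb database, and the orders table are illustrative:

    # 1. verify the connection by listing databases
    bin/sqoop list-databases \
      --connect jdbc:mysql://<mysql-host>:3306 --username root -P
    # 2. import a table directly into a Hive table
    bin/sqoop import \
      --connect jdbc:mysql://<mysql-host>:3306/testdb --username root -P \
      --table orders --hive-import --hive-table default.orders -m 1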

Posted by varun_146100 on Mon, 07 Jan 2019 12:06:09 -0800