Hive (6) Hive DML (Data Manipulation Language)


  DML is used to manipulate the data in Hive tables (insert, update, delete). However, because of the way Hadoop stores data, single-row updates and deletes perform very poorly, so row-level operations are not supported.

  The most commonly used ways of bulk-loading data are described below:

  1. Loading data from a file

Syntax: LOAD DATA [LOCAL] INPATH 'filepath' [OVERWRITE] INTO TABLE tablename [PARTITION (partcol1=val1, partcol2=val2 ...)]

  Example:

load data local inpath '/opt/data.txt' overwrite into table table1;
-- If the file is stored in HDFS, omit the LOCAL keyword
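
  When the source file already resides in HDFS, LOCAL is omitted and the file is moved (not copied) into the table's storage directory. A minimal sketch of loading an HDFS file into a specific partition (the path, table name, and partition column are assumptions for illustration):

-- file is already in HDFS, so LOCAL is omitted; LOAD DATA will move it
-- into the table's directory under the dt='2023-01-01' partition
load data inpath '/user/hive/staging/data.txt'
overwrite into table table1 partition (dt='2023-01-01');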

  

  2. Inserting data from another table

  

Standard syntax:
INSERT OVERWRITE TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...) [IF NOT EXISTS]] select_statement1 FROM from_statement;
INSERT INTO TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...)] select_statement1 FROM from_statement;
 
Hive extension (multiple inserts):
FROM from_statement
INSERT OVERWRITE TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...) [IF NOT EXISTS]] select_statement1
[INSERT OVERWRITE TABLE tablename2 [PARTITION ... [IF NOT EXISTS]] select_statement2]
[INSERT INTO TABLE tablename2 [PARTITION ...] select_statement2] ...;
FROM from_statement
INSERT INTO TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...)] select_statement1
[INSERT INTO TABLE tablename2 [PARTITION ...] select_statement2]
[INSERT OVERWRITE TABLE tablename2 [PARTITION ... [IF NOT EXISTS]] select_statement2] ...;
 
Hive extension (dynamic partition inserts):
INSERT OVERWRITE TABLE tablename PARTITION (partcol1[=val1], partcol2[=val2] ...) select_statement FROM from_statement;
INSERT INTO TABLE tablename PARTITION (partcol1[=val1], partcol2[=val2] ...) select_statement FROM from_statement;

  Example:

FROM page_view_stg pvs
INSERT OVERWRITE TABLE page_view PARTITION(dt='2008-06-08', country)
SELECT pvs.viewTime, pvs.userid, pvs.page_url, pvs.referrer_url, null, null, pvs.ip, pvs.cnt;
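
  In the example above, dt is given a static value while country is resolved dynamically from the query output (a dynamic partition insert). The multi-insert form reads the source table once and writes to several targets in a single job; a minimal sketch, where the target table names and filter conditions are assumptions for illustration:

-- scan page_view_stg once and populate two tables in one pass
FROM page_view_stg pvs
INSERT OVERWRITE TABLE page_view_us PARTITION (dt='2008-06-08')
  SELECT pvs.viewTime, pvs.userid, pvs.page_url, pvs.referrer_url, pvs.ip
  WHERE pvs.country = 'US'
INSERT INTO TABLE page_view_other PARTITION (dt='2008-06-08')
  SELECT pvs.viewTime, pvs.userid, pvs.page_url, pvs.referrer_url, pvs.ip
  WHERE pvs.country <> 'US';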

 
