Hive Troubleshooting: No enum constant org.apache.parquet.hadoop.metadata.CompressionCodecName.LZOP

Posted by 扫地增

Background

While creating a table and running a test insert during routine development, the job suddenly failed.
The query:

INSERT OVERWRITE TABLE dim_common.dim_common_product_..._sku
SELECT
  id,
  .....
FROM ods_common.ods_common_product_..._sku

The CREATE TABLE statement:

CREATE EXTERNAL TABLE dim_common.dim_common_product_..._sku(
  `id` bigint COMMENT 'primary key',
......)
COMMENT 
STORED AS parquet
LOCATION '/big-data/dim/common/dim_common_product_..._sku'
TBLPROPERTIES ('parquet.compression'='lzop');

The error output:

Query ID = xpc_20220713182549_fd51fa52-1d3d-469b-bea6-a74d74f8dce3
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Job = job_1656493749069_1202, Tracking URL = http://dw-name-1:8088/proxy/application_1656493749069_1202/
Kill Command = /opt/module/hadoop-3.1.3/bin/mapred job  -kill job_1656493749069_1202
Hadoop job information for Stage-1: number of mappers: 2; number of reducers: 1
2022-07-13 18:25:56,196 Stage-1 map = 0%,  reduce = 0%
2022-07-13 18:26:18,627 Stage-1 map = 100%,  reduce = 100%
Ended Job = job_1656493749069_1202 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1656493749069_1202_m_000000 (and more) from job job_1656493749069_1202

Task with the most failures(4):
-----
Task ID:
  task_1656493749069_1202_m_000000

URL:
  http://0.0.0.0:8088/taskdetails.jsp?jobid=job_1656493749069_1202&tipid=task_1656493749069_1202_m_000000
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: Hive Runtime Error while closing operators
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:211)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapRunner.run(ExecMapRunner.java:37)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:465)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:349)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: No enum constant org.apache.parquet.hadoop.metadata.CompressionCodecName.LZOP
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:742)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:1260)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:733)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:757)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:757)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:757)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:193)
	... 9 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: No enum constant org.apache.parquet.hadoop.metadata.CompressionCodecName.LZOP
	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:285)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:780)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:731)
	... 15 more
Caused by: java.lang.IllegalArgumentException: No enum constant org.apache.parquet.hadoop.metadata.CompressionCodecName.LZOP
	at java.lang.Enum.valueOf(Enum.java:238)
	at org.apache.parquet.hadoop.metadata.CompressionCodecName.valueOf(CompressionCodecName.java:26)
	at org.apache.parquet.hadoop.metadata.CompressionCodecName.fromConf(CompressionCodecName.java:39)
	at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.initializeSerProperties(ParquetRecordWriterWrapper.java:119)
	at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.<init>(ParquetRecordWriterWrapper.java:65)
	at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getParquerRecordWriterWrapper(MapredParquetOutputFormat.java:137)
	at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getHiveRecordWriter(MapredParquetOutputFormat.java:126)
	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:297)
	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:282)
	... 17 more


FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: Map: 2  Reduce: 1   HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

Error analysis:

First I removed the INSERT and ran the SELECT on its own; it executed normally, ruling out a problem with the query itself.
Adding the INSERT back reproduced the error, so the suspicion fell on either the cluster's compression configuration or the table's compression configuration. Since this was not the first table created this way, and all the other tables were fine, I went through the CREATE TABLE statement carefully and finally found that the compression property was wrong. The stack trace above shows why: Parquet resolves the property through CompressionCodecName.fromConf, which upper-cases the value and hands it to Enum.valueOf, and the enum defines an LZO constant but no LZOP.
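If the original DDL is not at hand, the property can also be read off the live table. A minimal sketch, assuming your Hive version accepts a db-qualified table name here (the table name is the elided one from above):

-- Print the stored value of the table's compression property.
SHOW TBLPROPERTIES dim_common.dim_common_product_..._sku('parquet.compression');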
The incorrect setting:

TBLPROPERTIES ('parquet.compression'='lzop');

After changing it to the correct value, the problem was resolved:

TBLPROPERTIES ('parquet.compression'='lzo');
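A table that already exists with the bad value does not have to be rebuilt; a minimal sketch of patching the property in place (same elided table name as above):

-- Overwrite the bad property; subsequent writes pick up the new codec,
-- while Parquet files already on HDFS are left untouched.
ALTER TABLE dim_common.dim_common_product_..._sku
  SET TBLPROPERTIES ('parquet.compression'='lzo');

The value must match a constant of org.apache.parquet.hadoop.metadata.CompressionCodecName: UNCOMPRESSED, SNAPPY, GZIP and LZO, with BROTLI, LZ4 and ZSTD added in newer Parquet releases. The match is case-insensitive, so 'lzo', 'Lzo' and 'LZO' all work, but 'lzop' does not.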

Just a short note here to record a lesson learned from carelessness, and hopefully to give others a starting point when they hit the same problem.
