Handling the beeline error after upgrading Hive 2.1.1 to 2.3.9 on CDH 6.2.1

After upgrading Hive from 2.1.1 to 2.3.9 on CDH 6.2.1, queries submitted through beeline fail with the error below; this post records how it was handled.

ERROR [main] org.apache.hadoop.hive.ql.exec.Utilities: Failed to load plan: hdfs://nameservice1/tmp/hive/hdfs/51655939-06db-4020-b7b8-122cc3af9c97/hive_2022-11-25_12-52-19_741_7496984142017234486-3/-mr-10022/99a58a0d-7839-4039-aeba-208a0f923425/map.xml
org.apache.hive.com.esotericsoftware.kryo.KryoException: java.lang.ArrayIndexOutOfBoundsException: -2
Serialization trace:
pathToAliases (org.apache.hadoop.hive.ql.plan.MapWork)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:144)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:686)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:206)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:607)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:494)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:471)
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:442)
at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:313)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:394)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:665)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:658)
at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:692)
at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:175)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:444)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:349)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Caused by: java.lang.ArrayIndexOutOfBoundsException: -2
at java.util.ArrayList.elementData(ArrayList.java:422)
at java.util.ArrayList.get(ArrayList.java:435)
at org.apache.hive.com.esotericsoftware.kryo.util.MapReferenceResolver.getReadObject(MapReferenceResolver.java:60)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readReferenceOrNull(Kryo.java:834)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:788)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:176)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:153)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:176)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:153)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:214)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)

Cause: Hue connects through beeline, and the beeline being invoked was still the pre-upgrade client; running the same query with the hive command does not produce the error.
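
A quick way to confirm which client the shell actually resolves; the paths shown are only what a standard CDH parcel layout would look like, so treat this as a sketch:

which hive
which beeline
readlink -f "$(which beeline)"    # show the real path if beeline is a symlink
hive --version                    # should report the upgraded 2.3.9 build

If beeline still resolves to the pre-upgrade installation while the hive CLI reports 2.3.9, that mismatch is consistent with the Kryo deserialization failure above.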

Solution:

Add the environment variables for the upgraded Hive to /etc/profile so that the latest beeline is referenced:

export JAVA_HOME=/usr/java/jdk1.8.0_181-cloudera
export JRE_HOME=/usr/java/jdk1.8.0_181-cloudera/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_CLASSPATH=$(hadoop classpath)
export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive
export HIVE_CONF_DIR=/etc/hive/conf
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$HIVE_HOME/bin:$PATH
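
After saving /etc/profile, reload it and re-run a query through the new beeline to verify the fix; the JDBC URL below is a placeholder for your own HiveServer2 host, port, and user:

source /etc/profile
which beeline        # should now point to the beeline under /opt/cloudera/parcels/CDH/lib/hive/bin
beeline -u "jdbc:hive2://<hiveserver2-host>:10000" -n hdfs -e "SELECT 1"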

