After running a MERGE operation on a Hive managed (ACID) table, we are facing the following problem on HDP (Hive 3.1.0.3.1.0.0-78).
When we run a SELECT on the table, it throws this error:
Caused by: java.lang.RuntimeException: java.io.IOException: java.io.IOException: Two readers for {originalWriteId: 20,bucket: 536870912(1.0.0),row: 397640,currentWriteId 35}: new [key={originalWriteId: 20,currentWriteId 35},nextRecord={2,20,536870912,397640,35,null},reader=Hive ORC Reader(hdfs://dl/warehouse/tablespace/managed/hive/dl_prod.db/sourcetbl/delete_delta_0000035_0000044/bucket_00001,9223372036854775807)],old [key={originalWriteId: 20,reader=Hive ORC Reader(hdfs://dl/warehouse/tablespace/managed/hive/dl_prod.db/sourcetbl/delete_delta_0000035_0000044/bucket_00000,9223372036854775807)]
INFO - at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.initNextRecordReader(TezGroupedSplitsInputFormat.java:206)
INFO - at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.<init>(TezGroupedSplitsInputFormat.java:145)
INFO - at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat.getRecordReader(TezGroupedSplitsInputFormat.java:111)
INFO - at org.apache.tez.mapreduce.lib.MRReaderMapred.setupOldRecordReader(MRReaderMapred.java:157)
INFO - at org.apache.tez.mapreduce.lib.MRReaderMapred.setSplit(MRReaderMapred.java:83)
INFO - at org.apache.tez.mapreduce.input.MRInput.initFromEventInternal(MRInput.java:703)
INFO - at org.apache.tez.mapreduce.input.MRInput.initFromEvent(MRInput.java:662)
INFO - at org.apache.tez.mapreduce.input.MRInputLegacy.checkAndAwaitRecordReaderInitialization(MRInputLegacy.java:150)
INFO - at org.apache.tez.mapreduce.input.MRInputLegacy.init(MRInputLegacy.java:114)
INFO - at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.getMRInput(MapRecordProcessor.java:532)
INFO - at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.init(MapRecordProcessor.java:178)
INFO - at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:266)
This is also logged as a bug in the Apache JIRA:
https://issues.apache.org/jira/browse/HIVE-22318
Has anyone run into this issue? Is there any workaround?
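For context, one workaround that is often suggested for this class of delete-delta read errors is to force a major compaction, so the accumulated delta and delete_delta directories are merged into a single new base file that the reader can process cleanly. This is a hedged sketch, not a confirmed fix for HIVE-22318; it assumes the Hive compactor is enabled on the cluster, and the table name below is taken from the path in the error message (substitute your own):

```sql
-- Sketch of a possible workaround: trigger a major compaction on the
-- affected table so delta/delete_delta files are rewritten into a base file.
-- Table name dl_prod.sourcetbl is taken from the HDFS path in the error.
ALTER TABLE dl_prod.sourcetbl COMPACT 'major';

-- Monitor progress; the compaction should eventually reach 'succeeded'.
SHOW COMPACTIONS;
```

After the compaction succeeds, re-running the failing SELECT is worth trying, since the problematic delete_delta_0000035_0000044 directories should no longer be read directly.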