bigdata-docker-compose

Hive COUNT query fails with both Spark and MapReduce execution engines: Spark client creation failed and MapRedTask returns error code 2

Open · fwv opened this issue 4 months ago · 1 comment

With the default Spark execution engine, a simple COUNT query fails because Hive cannot create a Spark client:

```
hive> SELECT COUNT(*) FROM grades;
Query ID = root_20251008102316_5a87bba2-fac6-467e-8d5c-07233e835b4d
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session f362e80f-06ec-4f77-b641-5778c9bfc6bb)'
FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session f362e80f-06ec-4f77-b641-5778c9bfc6bb
```
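Return code 30041 means Hive-on-Spark could not start the remote Spark driver; common causes are a Hive/Spark version mismatch or Spark jars missing from Hive's classpath, but the CLI output above hides the underlying stack trace. A minimal diagnostic sketch, assuming the stock Hive CLI inside the master container (`hive.root.logger` and the `set` command are standard Hive features, nothing project-specific):

```
# Re-run the Hive shell with console logging to surface the stack trace
# behind "Failed to create Spark client" (return code 30041).
hive --hiveconf hive.root.logger=DEBUG,console

# Inside the session, print the effective engine and Spark master settings.
hive> set hive.execution.engine;
hive> set spark.master;
```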

Switching to the (deprecated) MapReduce engine fails as well, with the job dying before any mappers or reducers start:

```
hive> set hive.execution.engine=mr;
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> SELECT COUNT(*) FROM grades;
Query ID = root_20251008102445_b12027f7-5bf6-4510-9d64-3dd19073cb7b
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Job = job_1759918824194_0001, Tracking URL = http://master:8088/proxy/application_1759918824194_0001/
Kill Command = /usr/hadoop/bin/mapred job -kill job_1759918824194_0001
Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2025-10-08 10:24:51,155 Stage-1 map = 0%, reduce = 0%
Ended Job = job_1759918824194_0001 with errors
Error during job, obtaining debugging information...
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
```
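Since Stage-1 launched zero mappers and zero reducers, the failure happens on the YARN side before any task runs, so the Hive console output won't show the cause. The YARN logs for the application behind job_1759918824194_0001 (visible in the tracking URL above) should; a sketch assuming the standard Hadoop `yarn` CLI is available in the container:

```
# Fetch aggregated container logs for the failed MapReduce job; the job id
# job_1759918824194_0001 corresponds to this YARN application id.
yarn logs -applicationId application_1759918824194_0001

# Print the application's final status and ResourceManager diagnostics.
yarn application -status application_1759918824194_0001
```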

fwv · Oct 08 '25 10:10