Venkatasubrahmanian Narayanan
> If my understanding is correct, Hive/Pig would use the value from `mapreduce.parent.job.id` to set the correct committer UUID, right?

Yes, that was the plan. The property name was just...
@shameersss1 We could actually just set `fs.s3a.committer.uuid` directly instead of the indirection through the other setting.
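To illustrate the difference: rather than the engine publishing a parent job id and the committer deriving a UUID from it, the orchestrating engine could set the S3A committer UUID itself. A minimal sketch of the direct approach, using `java.util.Properties` as a stand-in for Hadoop's `Configuration` (the property name `fs.s3a.committer.uuid` is from the discussion above; the surrounding logic is hypothetical):

```java
import java.util.Properties;
import java.util.UUID;

public class CommitterUuidSketch {
    // S3A property discussed above; in hadoop-aws it is a committer constant.
    static final String FS_S3A_COMMITTER_UUID = "fs.s3a.committer.uuid";

    public static void main(String[] args) {
        // Stand-in for org.apache.hadoop.conf.Configuration, which is not
        // available in a stdlib-only sketch.
        Properties conf = new Properties();

        // The engine (Hive/Pig) generates one UUID per query and sets it
        // directly, instead of publishing a parent job id for the committer
        // to translate into a UUID.
        String jobUuid = UUID.randomUUID().toString();
        conf.setProperty(FS_S3A_COMMITTER_UUID, jobUuid);

        // Every task committer launched for this query reads the same value,
        // so their pending commits are grouped under one job.
        System.out.println("committer uuid = "
                + conf.getProperty(FS_S3A_COMMITTER_UUID));
    }
}
```

The upside of the direct setting is that the committer needs no knowledge of engine-specific job-id conventions; the downside is that every engine must remember to set it consistently across all the jobs of one query.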
@abstractdog @shameersss1 Is there anything else needed?
@steveloughran Most of my test failures were fixed once I rebased onto current trunk. I ran the hadoop-aws ITests on an AWS EMR cluster in us-east-1 against S3 in us-east-1, ...
@yigress In my implementation I simply tried to match what Hive already does. If a sequential job wants to use different extensions for different tables, it...
> @VenkatSNarayanan Please check the failed tests. https://ci.hive.apache.org/blue/organizations/jenkins/hive-precommit/detail/PR-5591/2/tests. Thx

I can't seem to access that page; do I need to do something to see it?
@deniskuzZ Fixed the MSCK test failures. There is an Iceberg test failure but it seems unrelated to the changes I've made (it complains about some output being too long).
@steveloughran Then it might just be an issue with my local setup? There is nothing else from my end to report if the automated tests are all passing.
@steveloughran I did post the stack traces in the JIRA earlier; I'd assumed you were responding after seeing those, but I guess it fell through the cracks. I'll have to change my setup...
@steveloughran Managed to run the ITests with the `SimpleAWSCredentialsProvider`, and the aforementioned failures don't show up (aside from the timeout error). There is a different test that fails with this...
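For anyone reproducing this run: switching the hadoop-aws ITests to the simple provider is done through the standard S3A credential-provider setting in `auth-keys.xml`. A sketch of the fragment (property names are the standard S3A ones; the key values are placeholders, and the exact provider class name should be checked against the Hadoop version in use):

```xml
<!-- Illustrative fragment only; replace the placeholder key values. -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
</property>
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
```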