
S3 Bucket not found

Open aayushsahni opened this issue 6 years ago • 3 comments

Hi @sicarul,

I have been working on a solution to push tables from R to Redshift. However, I have run into an issue with the rs_upsert_table command: it fails with the error below. I have the credentials and AWS bucket access in place, yet the call still errors out.

ERROR**
Client error: (403) Forbidden
Error in uploadToS3(df, bucket, split_files, access_key, secret_key, region) :
  Bucket does not exist
Calls: rs_upsert_table -> uploadToS3
Execution halted
***ERROR END
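For reference, the "Bucket does not exist" message comes from the bucket check inside uploadToS3 (an aws.s3::bucket_exists call), so the same 403 can be reproduced outside of rs_upsert_table. A minimal sketch, with the bucket name, region and credential lookups as placeholders:

library(aws.s3)

# Hypothetical check, mirroring what uploadToS3 does before uploading:
# if this returns FALSE (or raises a 403), rs_upsert_table stops with "Bucket does not exist".
bucket_exists("my-bucket",
              key    = Sys.getenv("AWS_ACCESS_KEY_ID"),
              secret = Sys.getenv("AWS_SECRET_ACCESS_KEY"),
              region = "us-east-1")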

Any inputs would be great here. Thanks!

Best, Aayush Sahni

aayushsahni avatar Oct 30 '19 17:10 aayushsahni

I've run into this issue a couple of times, and fixed it by forking the package internally and slightly modifying the uploadToS3 function to make the bucket checks less stringent. The changes that fixed the issue for me were:

  1. Setting the check_region argument to FALSE
  2. Splitting the bucket argument (which I need to pass a bucket/subfolder string into to get the tool to write files to the proper location) so that the existence check only uses the bucket part
uploadToS3 = function(data, bucket, split_files, key, secret, session, region){
  prefix = paste0(sample(rep(letters, 10), 50), collapse = "")
  # Check only the bucket name (the part before the first "/"), with check_region = FALSE:
  if(!bucket_exists(strsplit(bucket, "/")[[1]][1], key = key, secret = secret, session = session, region = region, check_region = FALSE)){
    stop("Bucket does not exist")
  }
  # ... rest of uploadToS3 unchanged from the package source
}
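With that change in the fork, the bucket argument can carry a "bucket/subfolder" string and the split files land under that prefix. A hypothetical call (argument names follow the package README and may differ in your version; bucket, table and key names are placeholders):

# Upsert via the forked package, staging files under a subfolder of the bucket:
rs_upsert_table(df, dbcon = con,
                table_name = "my_table",
                keys = c("id"),
                bucket = "my-bucket/redshift-staging",  # bucket plus subfolder prefix
                region = "us-east-1",
                access_key = Sys.getenv("AWS_ACCESS_KEY_ID"),
                secret_key = Sys.getenv("AWS_SECRET_ACCESS_KEY"))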

andmatt avatar Nov 08 '19 21:11 andmatt

I also cannot write to the main bucket directory.

Would it be possible to specify a subfolder within the bucket? Otherwise I will have to use the "pandas_redshift" Python library :(
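In case it helps narrow things down: S3 "subfolders" are just key prefixes, so write access to a prefix can be tested directly with aws.s3. A minimal sketch, with the bucket name and prefix as placeholders:

library(aws.s3)

# Hypothetical write test: upload a tiny file under a "subfolder" (key prefix).
tmp <- tempfile(fileext = ".txt")
writeLines("write test", tmp)
put_object(file   = tmp,
           object = "redshift-staging/write_test.txt",  # the prefix acts as the subfolder
           bucket = "my-bucket",
           key    = Sys.getenv("AWS_ACCESS_KEY_ID"),
           secret = Sys.getenv("AWS_SECRET_ACCESS_KEY"),
           region = "us-east-1")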

Thanks! :)

jtelleriar avatar Dec 12 '19 09:12 jtelleriar

Hi everyone,

I ended up using pandas_redshift to solve my issue. Thanks @jtelleriar!

aayushsahni avatar Jan 11 '20 08:01 aayushsahni