Content-Range header is not calculated correctly when the resumable uploader uses multiple requests
This is the console.log output:
LOG bytes 0-262143/*
LOG upload chunk 1 {"isComplete": false, "transferredByteCount": 262144}
LOG bytes 262144-524287/*
LOG upload chunk 2 {"isComplete": false, "transferredByteCount": 524288}
LOG bytes 786432-1048575/*
LOG Big error in importing: HttpError: Invalid request. According to the Content-Range header, the upload offset is 786432 byte(s), which exceeds already uploaded size of 524288 byte(s).!
I think the problem lies in this line:
this.__transferredByteCount += transferredByteCount
in ResumableUploader, where __transferredByteCount keeps accumulating. I corrected it to
this.__transferredByteCount = transferredByteCount
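The log above suggests why: the value reported after each chunk is already cumulative (262144, then 524288), so adding it to a running total double-counts. A minimal self-contained sketch of the arithmetic (the variable names mirror the snippet above, but this is an illustration, not the library's actual code):

```javascript
// Simulated per-chunk reports: the uploader reports the CUMULATIVE
// number of bytes transferred so far, not the size of the last chunk.
const cumulativeReports = [262144, 524288];

// Buggy tracking: aggregating an already-cumulative value double-counts.
let buggyCount = 0;
for (const transferredByteCount of cumulativeReports) {
  buggyCount += transferredByteCount; // this.__transferredByteCount += ...
}
// The next Content-Range would start at 786432 even though only
// 524288 bytes were actually uploaded -- exactly the error in the log.
console.log(buggyCount); // 786432

// Fixed tracking: just store the latest cumulative value.
let fixedCount = 0;
for (const transferredByteCount of cumulativeReports) {
  fixedCount = transferredByteCount; // this.__transferredByteCount = ...
}
console.log(fixedCount); // 524288
```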
const uploadFileInChunks = async (filePath, gdrive, name) => {
  const CHUNK_SIZE = 256 * 1024;

  try {
    // Get the file size first
    const fileStats = await RNFS.stat(filePath);
    const fileSize = fileStats.size;

    // Initialize the uploader
    const uploader = await gdrive.files
      .newResumableUploader()
      .setDataType(MimeTypes.BINARY)
      .setShouldUseMultipleRequests(true)
      .setRequestBody({
        name: name || `${filePath.split('/').pop()}_${Date.now()}`,
        // Uncomment and set a folder ID if needed
        // parents: ["folder_id"]
      })
      .execute();

    // Set the total content length
    uploader.setContentLength(fileSize);

    // Upload the file in chunks
    let offset = 0;
    let uploadComplete = false;

    while (!uploadComplete) {
      // Read a base64-encoded chunk from the file
      const chunk = await RNFS.read(filePath, CHUNK_SIZE, offset, 'base64');

      if (!chunk || chunk.length === 0) break;

      // Convert the base64 chunk to a buffer
      const chunkBuffer = Buffer.from(chunk, 'base64');

      // Upload the chunk
      console.log(`Uploading chunk at offset ${offset}`);
      const uploadResponse = await uploader.uploadChunk(chunkBuffer);
      console.log('Chunk upload response:', uploadResponse);

      // Check the upload status
      const status = await uploader.requestUploadStatus();
      console.log('Upload status:', status);

      // Advance the offset by the number of bytes actually read
      offset += chunkBuffer.length;

      // Check whether the upload is complete
      uploadComplete = offset >= fileSize;
    }

    // Get the final status
    const finalStatus = await uploader.requestUploadStatus();
    console.log('Final upload status:', finalStatus);

    // Get the file metadata to retrieve the ID
    const metaData = await gdrive.files.getMetadata();
    const id = metaData.files[0].id;

    return { id, status: finalStatus };
  } catch (error) {
    console.error('Error uploading file:', error);
    return {
      success: false,
      message: 'Upload failed',
      error: error.message
    };
  }
};
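The offset arithmetic of that loop can be exercised on its own, without react-native-fs or Google Drive. A small self-contained sketch (hypothetical `chunkOffsets` helper, assuming the same 256 KiB chunk size) showing which offsets the loop visits for a given file size:

```javascript
const CHUNK_SIZE = 256 * 1024;

// Return the sequence of read offsets the chunked-upload loop would use,
// mirroring the `offset += chunkBuffer.length` arithmetic above.
function chunkOffsets(fileSize) {
  const offsets = [];
  let offset = 0;
  while (offset < fileSize) {
    offsets.push(offset);
    // The last chunk may be shorter than CHUNK_SIZE
    offset += Math.min(CHUNK_SIZE, fileSize - offset);
  }
  return offsets;
}

// A 600000-byte file needs three chunks: two full ones and a short tail.
console.log(chunkOffsets(600000)); // [ 0, 262144, 524288 ]
```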
I'm getting the same error: ERROR Error uploading file: [HttpError: Invalid request. According to the Content-Range header, the upload offset is 786432 byte(s), which exceeds already uploaded size of 524288 byte(s).]
Hi guys! Apologies if I messed things up there. I currently have no time for this project, sorry.
this.__transferredByteCount = transferredByteCount
This line fixed my issue @RobinBobin
Thanks @cip123
Guys, I've patched the library, please check it out: 2.2.3
Closing as stale