
RpcHttpRequestDataSource cannot be cast to java.lang.String #255

Open ravikanth534 opened this issue 6 years ago • 2 comments

This issue is still reproducible for HttpTrigger.

I'm trying to upload a file to Azure Storage through an Azure Function. I was able to upload a plain text file successfully, but files of any other type are getting corrupted. What I observed is that the number of bytes I receive is smaller than the actual size (bodyLength < contentLength).

I tried changing the request data type to HttpRequestMessage<Optional<byte[]>>, HttpRequestMessage<byte[]>, and Byte[], each of which throws the "cannot convert" cast error reported in https://github.com/Azure/azure-functions-java-worker/issues/239

@FunctionName("UploadFile") public HttpResponseMessage run(@HttpTrigger(name = "req", methods = { HttpMethod.GET, HttpMethod.POST }, authLevel = AuthorizationLevel.FUNCTION) HttpRequestMessage<Optional<'String'>> request, final ExecutionContext context) throws InvalidKeyException, URISyntaxException, StorageException, IOException {

    CloudStorageAccount storageAccount = CloudStorageAccount.parse(_storageConnString);

    CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
    CloudBlobContainer blobContainer = blobClient.getContainerReference(_containerName);
        
    CloudBlockBlob blob = blobContainer.getBlockBlobReference(fileName);
    try {
       
        String body = request.getBody().get(); 
        long bodyLength = body.length();
        String contentLength = request.getHeaders().get("content-length");
        InputStream inputStream = new ByteArrayInputStream(body.getBytes());
        blob.upload(inputStream, Integer.parseInt(bodyLength));
       
    } catch (Exception ex) {
        return request.createResponseBuilder(HttpStatus.BAD_REQUEST).body(ex.getMessage()).build();
    }

    return request.createResponseBuilder(HttpStatus.OK).body("File uploaded successfully").build();

}
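
For what it's worth, here is a small standalone sketch (not my function, and assuming the worker decodes the body with UTF-8 for a String binding) of why I think the content shrinks and gets corrupted when the body is bound as a String:

import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Random;

public class LossyRoundTripDemo {
    public static void main(String[] args) {
        // Stand-in for a binary request body (e.g. a docx or image).
        byte[] original = new byte[1024];
        new Random(42).nextBytes(original);

        // What the String binding effectively does to the body, followed by
        // the getBytes() call in the function above.
        String body = new String(original, StandardCharsets.UTF_8);
        byte[] uploaded = body.getBytes(StandardCharsets.UTF_8);

        // body.length() counts chars, not bytes, so it comes out smaller
        // than the Content-Length, and the uploaded bytes no longer match.
        System.out.println("original bytes:  " + original.length);
        System.out.println("body.length():   " + body.length());
        System.out.println("bytes preserved: " + Arrays.equals(original, uploaded));
    }
}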

My requirement is to upload large files to storage. Any help would be appreciated.

ravikanth534 commented Sep 06 '19 22:09

This error STILL happens on the latest version (4.0.4785). What's the fix for accepting byte[] as input?

For reference, here is the Trigger:

@HttpTrigger(name = "request", dataType = "binary", methods = {
                    HttpMethod.POST,
                    HttpMethod.PUT }, authLevel = AuthorizationLevel.ANONYMOUS) Optional<byte[]> request,

Also tried with HttpRequestMessage<>, no difference.
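
For completeness, here is roughly how the whole function around that trigger looks; everything except the trigger annotation (class name, function name, logging) is a placeholder, not the real code:

import java.util.Optional;

import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.HttpMethod;
import com.microsoft.azure.functions.annotation.AuthorizationLevel;
import com.microsoft.azure.functions.annotation.FunctionName;
import com.microsoft.azure.functions.annotation.HttpTrigger;

public class UploadBinaryFunction {
    @FunctionName("UploadBinary")   // placeholder name
    public void run(
            @HttpTrigger(name = "request", dataType = "binary", methods = {
                    HttpMethod.POST,
                    HttpMethod.PUT }, authLevel = AuthorizationLevel.ANONYMOUS) Optional<byte[]> request,
            final ExecutionContext context) {

        // Placeholder body: just log how many bytes arrived.
        int received = request.map(bytes -> bytes.length).orElse(0);
        context.getLogger().info("Received " + received + " bytes");
    }
}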

dviry commented Sep 27 '22 21:09

I'm experiencing the same issue and haven't found a way to receive a plain old byte[] in the request body. The conversion between String and byte[] is corrupting several binary file types.

@FunctionName("UploadFile")
public HttpResponseMessage run(
    @HttpTrigger(name = "req", 
        methods = {HttpMethod.POST }, 
        authLevel = AuthorizationLevel.ANONYMOUS, 
        dataType="binary") 
        HttpRequestMessage<byte[]> request,
        final ExecutionContext context) throws IOException {

When sending a docx file as octet-stream in the request body, I get:

Failure Exception: ClassCastException: Cannot convert com.microsoft.azure.functions.worker.binding.RpcHttpRequestDataSource@3510fa54 to type com.microsoft.azure.functions.HttpRequestMessage<byte[]>
Stack: java.lang.ClassCastException: Cannot convert com.microsoft.azure.functions.worker.binding.RpcHttpRequestDataSource@3510fa54 to type com.microsoft.azure.functions.HttpRequestMessage<byte[]>
    at com.microsoft.azure.functions.worker.binding.DataOperations.generalAssignment(DataOperations.java:191)
    at com.microsoft.azure.functions.worker.binding.DataOperations.apply(DataOperations.java:120)
    at com.microsoft.azure.functions.worker.binding.DataSource.computeByType(DataSource.java:56)
    at com.microsoft.azure.functions.worker.binding.RpcHttpRequestDataSource.computeByType(RpcHttpRequestDataSource.java:20)
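
The only workaround I can think of for now is to have the client Base64-encode the file so the request body stays plain text and the String binding can't corrupt it. A rough, untested sketch (class and function names are placeholders, and the blob upload would be the same SDK code as at the top of this issue):

import java.util.Base64;
import java.util.Optional;

import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.HttpMethod;
import com.microsoft.azure.functions.HttpRequestMessage;
import com.microsoft.azure.functions.HttpResponseMessage;
import com.microsoft.azure.functions.HttpStatus;
import com.microsoft.azure.functions.annotation.AuthorizationLevel;
import com.microsoft.azure.functions.annotation.FunctionName;
import com.microsoft.azure.functions.annotation.HttpTrigger;

public class UploadFileBase64Function {
    @FunctionName("UploadFileBase64")   // placeholder name
    public HttpResponseMessage run(
            @HttpTrigger(name = "req",
                methods = { HttpMethod.POST },
                authLevel = AuthorizationLevel.ANONYMOUS)
            HttpRequestMessage<Optional<String>> request,
            final ExecutionContext context) {

        // The body is Base64 text, so binding it as a String is lossless.
        String encoded = request.getBody().orElse("");
        byte[] fileBytes = Base64.getDecoder().decode(encoded);

        // ... upload fileBytes to blob storage here ...

        return request.createResponseBuilder(HttpStatus.OK)
                .body("Received " + fileBytes.length + " bytes")
                .build();
    }
}

The obvious downside is the roughly 33% size increase on the wire, which matters for the large-file scenario in the original post.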

raheel-aidrus-usps commented Apr 04 '24 03:04