Incompatibility with createUploadUrl and dev app server
I'm using App Engine SDK 1.9.48 with a Maven project and this library:

```xml
<dependency>
  <groupId>com.google.appengine.tools</groupId>
  <artifactId>appengine-gcs-client</artifactId>
  <version>0.6</version>
</dependency>
```
I'm implementing a flow that uses an upload URL generated like this:
```java
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
import com.google.appengine.api.blobstore.UploadOptions;

String bucket = ...;
String callback = ...;
UploadOptions uo = UploadOptions.Builder.withGoogleStorageBucketName(bucket);
BlobstoreService bs = BlobstoreServiceFactory.getBlobstoreService();
return bs.createUploadUrl(callback, uo);
```
The generated URL is something like this:

```
http://localhost:8888/_ah/upload/aghhcGUtYmFzZXIiCxIVX19CbG9iVXBsb2FkU2Vzc2lvbl9fGICAgICAgMAJDA
```
The callback is implemented like this:

```java
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
import com.google.appengine.api.blobstore.FileInfo;
import com.google.appengine.tools.cloudstorage.GcsFileMetadata;
import com.google.appengine.tools.cloudstorage.GcsFilename;
import com.google.appengine.tools.cloudstorage.GcsInputChannel;
import com.google.appengine.tools.cloudstorage.GcsService;
import com.google.appengine.tools.cloudstorage.GcsServiceFactory;

Map<String, List<FileInfo>> fileInfos = BlobstoreServiceFactory.getBlobstoreService().getFileInfos(request);
FileInfo file = ...; // get the FileInfo from the map
String gsObjectName = file.getGsObjectName();
GcsFilename uploadedFile = ...; // parse gsObjectName into a GcsFilename object
```
The toString() of the fileInfos map looks like this:

```
{file=[<FileInfo: contentType = image/jpeg, creation = Fri Dec 23 10:42:59 CET 2016, filename = animals_030.jpg, size = 84015, md5Hash = ad40a7e4a6028bb4192e8189a7f09ab1, gsObjectName = /gs/my_bucket/hmgtHXafqTDbYkY6XkfWNw>]}
```
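As the toString() shows, gsObjectName has the form `/gs/<bucket>/<object>`, so the parsing step I elided above is plain string handling. A minimal sketch (the class and method names here are my own, not part of the library) that yields the two arguments for `new GcsFilename(bucket, objectName)`:

```java
public class GsObjectNameParser {
    // Splits a gsObjectName such as "/gs/my_bucket/hmgtHXafqTDbYkY6XkfWNw"
    // into { bucket, objectName }; the result feeds straight into
    // new GcsFilename(bucket, objectName).
    public static String[] parse(String gsObjectName) {
        final String prefix = "/gs/";
        if (!gsObjectName.startsWith(prefix)) {
            throw new IllegalArgumentException("Not a /gs/ object name: " + gsObjectName);
        }
        String path = gsObjectName.substring(prefix.length());
        int slash = path.indexOf('/'); // first '/' separates bucket from object name
        return new String[] { path.substring(0, slash), path.substring(slash + 1) };
    }
}
```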
I tried reading the file into memory with the openReadChannel method, but I get this error:

```
java.io.FileNotFoundException: com.google.appengine.tools.cloudstorage.dev.LocalRawGcsService@1dc2b580: No such file: GcsFilename(my_bucket, hmgtHXafqTDbYkY6XkfWNw)
```
I also tried getting the file metadata with the getMetadata method, but the returned GcsFileMetadata object is null.
Is the createUploadUrl flow fully integrated with the local GcsService, or does it work only in the production environment?
While I wait for an answer, I have temporarily solved the problem with the following workaround: I load the file into memory using BlobstoreService and then write it back using GcsService, as follows:
```java
private void localhostCopyFile(FileInfo fileInfo, GcsFilename destinationFile) throws IOException {
    if (SystemProperty.Environment.Value.Development == SystemProperty.environment.value()) {
        LOG.info("File " + fileInfo.getGsObjectName() + " will be copied");
        BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();
        BlobKey blobKey = blobstoreService.createGsBlobKey(fileInfo.getGsObjectName());
        LOG.info("BlobKey: " + blobKey);
        // Read the file into memory; the end index of fetchData() is inclusive
        byte[] data = blobstoreService.fetchData(blobKey, 0, fileInfo.getSize() - 1);
        // Code copied from https://cloud.google.com/appengine/docs/java/googlecloudstorageclient/app-engine-cloud-storage-sample#writing_a_file_to_cloud_storage
        GcsFileOptions instance = GcsFileOptions.getDefaultInstance();
        GcsOutputChannel outputChannel = GCS_SERVICE.createOrReplace(destinationFile, instance);
        // copy() method available from https://github.com/GoogleCloudPlatform/appengine-gcs-client/blob/master/java/example/src/main/java/com/google/appengine/demos/GcsExampleServlet.java
        copy(new ByteArrayInputStream(data), Channels.newOutputStream(outputChannel));
        // Optional: remove the original uploaded file
        blobstoreService.delete(blobKey);
    } else {
        // Production environment, nothing to do here
    }
}
```
Because this is only for the local dev server, I'm using only small test files, so loading the entire file into memory is not a blocking issue.
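For reference, the copy() helper used in the workaround is just a buffered stream copy; a minimal version along the lines of the linked GcsExampleServlet (wrapped in a class here so it stands alone) looks like this:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Buffered stream copy, along the lines of the copy() helper in the
    // linked GcsExampleServlet; closes both streams when done.
    public static void copy(InputStream input, OutputStream output) throws IOException {
        try {
            byte[] buffer = new byte[2 * 1024 * 1024];
            int bytesRead = input.read(buffer);
            while (bytesRead != -1) {
                output.write(buffer, 0, bytesRead);
                bytesRead = input.read(buffer);
            }
        } finally {
            input.close();
            output.close();
        }
    }
}
```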