Can't read Google Cloud Storage file in upload success handler - google-app-engine

I'm trying to get the file content from the 'upload success' handler in GAE. The file is uploaded to the URL created by:
blobstoreService.createUploadUrl("/onupload", uploadOptions));
So, in /onupload, I do:
BlobKey myFile = context.getRequestBlobs().get("myFile").get(0);
and then I've tried:
InputStream is = new BlobstoreInputStream(myFile);
// .. read the stream
which failed with com.google.appengine.api.blobstore.BlobstoreInputStream$BlobstoreIOException: BlobstoreInputStream received an invalid blob key: =?ISO-8859-1?Q?AMIfv96J=2DsyIbhm5=5FET?=
and
FileReadChannel ch = fileService.openReadChannel(myFile, false);
which failed with
java.io.IOException
at com.google.appengine.api.files.FileServiceImpl.translateException(FileServiceImpl.java:615)
at com.google.appengine.api.files.FileServiceImpl.makeSyncCall(FileServiceImpl.java:588)
at com.google.appengine.api.files.FileServiceImpl.open(FileServiceImpl.java:521)
at com.google.appengine.api.files.FileServiceImpl.openForRead(FileServiceImpl.java:481)
at com.google.appengine.api.files.FileServiceImpl.openForRead(FileServiceImpl.java:473)
at com.google.appengine.api.files.FileServiceImpl.openReadChannel(FileServiceImpl.java:197)
Any thoughts on what I'm doing wrong? Is it possible at all to read the file's content in the upload handler?
Note that for the blobstore filesystem (not GCS) this was working fine.

I think you are trying to read a file from Google Cloud Storage.
Have you seen the example [1] in the docs?
Particularly this part:
// At this point, the file is visible in App Engine as:
// "/gs/BUCKETNAME/FILENAME"
// and to anybody on the Internet through Cloud Storage as:
// (http://storage.googleapis.com/BUCKETNAME/FILENAME)
// We can now read the file through the API:
String filename = "/gs/" + BUCKETNAME + "/" + FILENAME;
AppEngineFile readableFile = new AppEngineFile(filename);
FileReadChannel readChannel =
    fileService.openReadChannel(readableFile, false);
// Again, different standard Java ways of reading from the channel.
BufferedReader reader =
    new BufferedReader(Channels.newReader(readChannel, "UTF8"));
String line = reader.readLine();
resp.getWriter().println("READ:" + line);
// line = "The woods are lovely, dark, and deep."
readChannel.close();
[1] https://developers.google.com/appengine/docs/java/googlestorage/overview#Complete_Sample_App
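As a follow-up on the question itself: for uploads that land in Cloud Storage (rather than the Blobstore), the upload handler should look the file up through FileInfo instead of BlobKey. A minimal sketch, assuming the "myFile" form field from the question and that getFileInfos/getGsObjectName behave as documented for GCS uploads:
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
import com.google.appengine.api.blobstore.FileInfo;
import com.google.appengine.api.files.AppEngineFile;
import com.google.appengine.api.files.FileReadChannel;
import com.google.appengine.api.files.FileService;
import com.google.appengine.api.files.FileServiceFactory;
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.channels.Channels;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class OnUploadServlet extends HttpServlet {
    private final BlobstoreService blobstoreService =
        BlobstoreServiceFactory.getBlobstoreService();
    private final FileService fileService = FileServiceFactory.getFileService();

    @Override
    public void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // For GCS uploads, FileInfo (not BlobKey) carries the object name;
        // getGsObjectName() should yield a "/gs/BUCKETNAME/objectName" path.
        FileInfo info = blobstoreService.getFileInfos(req).get("myFile").get(0);
        AppEngineFile readableFile = new AppEngineFile(info.getGsObjectName());
        FileReadChannel readChannel =
            fileService.openReadChannel(readableFile, false);
        BufferedReader reader =
            new BufferedReader(Channels.newReader(readChannel, "UTF8"));
        String line = reader.readLine();
        resp.getWriter().println("READ:" + line);
        readChannel.close();
    }
}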

Related

Downloads folder shared file write permissions problem

I am trying to get an offline backup function working on Android 12. It has worked for years on previous versions of Android (6 and 8). It is required because the size of the backup can often exceed 25 MB. I am using a Samsung A7 Lite for this testing to ensure Android 12 compliance.

Essentially, the function first creates a backup folder in the Downloads folder if it does not exist, then writes a backup file to that folder. All goes well: I can repeat the function any number of times without there being a problem, and it retains father and grandfather versions for security. However, if I try to use the same function where there are existing files the following day, I am presented with a java.io.FileNotFoundException: open failed EACCES (Permission denied). This whole situation appears very illogical and does not appear to follow the documentation on accessing the Downloads folder. If I manually delete the backup file from the previous day, the process succeeds; similarly, if I delete the backup directory within the Downloads folder, the backup proceeds successfully.

The app asks the user for the appropriate permissions, which I believe are read and write external storage. Can anybody identify what I am doing wrong in this environment?
The code is below.
String path = "";
// if no external, set to download
if (path.equals("")) {
File systemPath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS);
path = systemPath.getAbsolutePath();
}
// set up backup subdirectory
path = path + "/backup";
// check if path exists
File backupDir = new File(path);
if (!backupDir.exists()) {
try {
backupDir.mkdirs();
MediaScannerConnection.scanFile(this, new String[]{backupDir.getAbsolutePath()}, null, null);
}
catch(Exception e){
e.printStackTrace();
}
}
// first get rid of old backup files leaving at least 2 older versions
File backupFile = new File(path,"backup3.bkp");
if (backupFile.exists())
backupFile.delete();
for (int i = 3;i > 1;i--){
File renameBackupFile = new File(path,"backup" + i + ".bkp");
File existBackupFile = null;
if (i == 2)
existBackupFile = new File(path,"backup.bkp");
else
existBackupFile = new File(path,"backup" + (i - 1) + ".bkp");
if (existBackupFile.exists()) {
try {
existBackupFile.renameTo(renameBackupFile);
} catch (Exception e) {
String message = e.toString();
}
}
}
// create a new backup
String fileName = "backup.bkp";
String backup = path + "/" + fileName;
FileInputStream dataBaseFile = new FileInputStream(DB_PATH);
File newBackupFile = new File(backup);
newBackupFile.createNewFile();
FileOutputStream backupStream = new FileOutputStream(newBackupFile);
//transfer bytes from the inputfile to the outputfile
byte[] buffer = new byte[1024];
int length;
while ((length = dataBaseFile.read(buffer)) > 0) {
backupStream.write(buffer, 0, length);
}
//Close the streams
backupStream.flush();
backupStream.close();
dataBaseFile.close();
MediaScannerConnection.scanFile(this, new String[]{newBackupFile.getAbsolutePath()}, null, null);
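No answer is recorded here, but the symptom is consistent with scoped storage on Android 10+, where direct-path writes into shared folders can fail with EACCES once the app no longer owns the existing file. One commonly suggested alternative is writing through MediaStore instead of a raw file path. A minimal sketch, assuming API 29+, code running inside an Activity, and DB_PATH reused from the question:
import android.content.ContentResolver;
import android.content.ContentValues;
import android.net.Uri;
import android.os.Environment;
import android.provider.MediaStore;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.OutputStream;

void writeBackupViaMediaStore() throws IOException {
    ContentResolver resolver = getContentResolver();
    ContentValues values = new ContentValues();
    values.put(MediaStore.MediaColumns.DISPLAY_NAME, "backup.bkp");
    values.put(MediaStore.MediaColumns.MIME_TYPE, "application/octet-stream");
    // Places the file under Download/backup without needing direct path access.
    values.put(MediaStore.MediaColumns.RELATIVE_PATH,
            Environment.DIRECTORY_DOWNLOADS + "/backup");
    Uri uri = resolver.insert(MediaStore.Downloads.EXTERNAL_CONTENT_URI, values);
    try (OutputStream out = resolver.openOutputStream(uri);
         FileInputStream in = new FileInputStream(DB_PATH)) {
        byte[] buffer = new byte[1024];
        int length;
        while ((length = in.read(buffer)) > 0) {
            out.write(buffer, 0, length);
        }
    }
}
Rotating the older backup generations would likewise go through MediaStore queries and updates rather than File.renameTo.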

Google Cloud Storage using stream instead of ByteBuffer - java

I'm using the following code:
GcsService gcsService = GcsServiceFactory.createGcsService();
GcsFilename filename = new GcsFilename(BUCKETNAME, fileName);
GcsFileOptions options = new GcsFileOptions.Builder()
    .mimeType(contentType)
    .acl("public-read")
    .addUserMetadata("myfield1", "my field value")
    .build();
@SuppressWarnings("resource")
GcsOutputChannel outputChannel =
    gcsService.createOrReplace(filename, options);
outputChannel.write(ByteBuffer.wrap(byteArray));
outputChannel.close();
The problem is that when I try to store video files, I have to hold the whole file in byteArray, which could cause memory issues.
But I cannot find any interface that does the same with a stream.
Questions:
Should I worry about memory issues on the App Engine server, or can it hold a one-minute video in memory?
Is it possible to use a stream instead of a byte array? How?
I'm reading the bytes as byte[] byteArray = IOUtils.toByteArray(stream); should I instead use the byte array as a real buffer, reading chunks and uploading them to GCS? How do I do that?
The amount of memory available depends on the App Engine instance type you've configured. Streaming this data seems like a good idea if you can.
I'm not sure about the GcsService API, but it looks like you can do this using the gcloud Storage API:
https://github.com/GoogleCloudPlatform/gcloud-java/blob/master/gcloud-java-storage/src/main/java/com/google/cloud/storage/Storage.java
This code might work (untested)...
final BlobInfo info = BlobInfo.builder(bucket.getBucketName(), "name").contentType("image/png").build();
final ReadableByteChannel src = Channels.newChannel(stream);
final WriteChannel dst = gcsStorage.writer(info);
fastChannelCopy(src, dst);
dst.close(); // the object is not finalized until the channel is closed

private void fastChannelCopy(final ReadableByteChannel src, final WritableByteChannel dest) throws IOException {
    final ByteBuffer buffer = ByteBuffer.allocateDirect(16 * 1024);
    while (src.read(buffer) != -1) {
        buffer.flip();      // prepare the buffer to be drained
        dest.write(buffer); // write to the channel, may block
        // If partial transfer, shift remainder down.
        // If buffer is empty, this is the same as clear().
        buffer.compact();
    }
    // EOF will leave buffer in fill state
    buffer.flip();
    // make sure the buffer is fully drained
    while (buffer.hasRemaining()) {
        dest.write(buffer);
    }
}
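For what it's worth, the GcsService API from the question can also stream without building the full byte array, since GcsOutputChannel is a WritableByteChannel. A sketch, reusing stream, filename, and options from the question:
import java.io.OutputStream;
import java.nio.channels.Channels;
import com.google.appengine.tools.cloudstorage.GcsOutputChannel;

GcsOutputChannel outputChannel = gcsService.createOrReplace(filename, options);
try (OutputStream out = Channels.newOutputStream(outputChannel)) {
    byte[] buffer = new byte[8 * 1024];
    int n;
    // Copy in small chunks so only one buffer's worth is in memory at a time.
    while ((n = stream.read(buffer)) != -1) {
        out.write(buffer, 0, n);
    }
} // closing the stream closes the channel and finalizes the object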

Read Mail Attachments in App Engine using Java

I am using an App Engine application to receive email: messages sent to any address ending with @my-app-id.appspotmail.com are delivered to the application.
Multipart multiPart = (Multipart) message.getContent();
BodyPart bp = multiPart.getBodyPart(0);
log.info("count is " + multiPart.getCount());
String attachFiles = "";
String messageContent = "";
for (int i = 0; i < multiPart.getCount(); i++) {
    MimeBodyPart part = (MimeBodyPart) multiPart.getBodyPart(i);
    if (Part.ATTACHMENT.equalsIgnoreCase(part.getDisposition())) {
        // this part is an attachment
        String fileName = part.getFileName();
        log.info("file name is " + fileName);
    } else {
        // this part may be the message content
        messageContent = part.getContent().toString();
    }
}
I want to store the file in the blobstore but I did not find an API for it. Execution does go into the if branch and I am able to get the attachment file name. Any help will be appreciated.
You can read all the data inside the attachment part using the MimeBodyPart.getInputStream method, but you'll need to read the data yourself and create the Blob.
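A minimal sketch of that, using the (since-deprecated) FileService, which was the era's API for creating blobs programmatically; the buffer size and variable names are illustrative:
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.files.AppEngineFile;
import com.google.appengine.api.files.FileService;
import com.google.appengine.api.files.FileServiceFactory;
import com.google.appengine.api.files.FileWriteChannel;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.channels.Channels;

FileService fileService = FileServiceFactory.getFileService();
AppEngineFile file = fileService.createNewBlobFile(part.getContentType(), part.getFileName());
FileWriteChannel writeChannel = fileService.openWriteChannel(file, true); // lock so we can finalize
OutputStream out = Channels.newOutputStream(writeChannel);
InputStream in = part.getInputStream();
byte[] buffer = new byte[8192];
int n;
// Copy the attachment stream into the blob file chunk by chunk.
while ((n = in.read(buffer)) != -1) {
    out.write(buffer, 0, n);
}
out.flush();
in.close();
writeChannel.closeFinally(); // finalizes the blob
BlobKey blobKey = fileService.getBlobKey(file);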

read cloud storage content with "gzip" encoding for "application/octet-stream" type content

We're using "Google Cloud Storage Client Library" for app engine, with simply "GcsFileOptions.Builder.contentEncoding("gzip")" at file creation time, we got the following problem when reading the file:
com.google.appengine.tools.cloudstorage.NonRetriableException: java.lang.RuntimeException: com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1@1c07d21: Unexpected cause of ExecutionException
at com.google.appengine.tools.cloudstorage.RetryHelper.doRetry(RetryHelper.java:87)
at com.google.appengine.tools.cloudstorage.RetryHelper.runWithRetries(RetryHelper.java:129)
at com.google.appengine.tools.cloudstorage.RetryHelper.runWithRetries(RetryHelper.java:123)
at com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl.read(SimpleGcsInputChannelImpl.java:81)
...
Caused by: java.lang.RuntimeException: com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1@1c07d21: Unexpected cause of ExecutionException
at com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1.call(SimpleGcsInputChannelImpl.java:101)
at com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1.call(SimpleGcsInputChannelImpl.java:81)
at com.google.appengine.tools.cloudstorage.RetryHelper.doRetry(RetryHelper.java:75)
... 56 more
Caused by: java.lang.IllegalStateException: com.google.appengine.tools.cloudstorage.oauth.OauthRawGcsService$2@1d8c25d: got 46483 > wanted 19823
at com.google.common.base.Preconditions.checkState(Preconditions.java:177)
at com.google.appengine.tools.cloudstorage.oauth.OauthRawGcsService$2.wrap(OauthRawGcsService.java:418)
at com.google.appengine.tools.cloudstorage.oauth.OauthRawGcsService$2.wrap(OauthRawGcsService.java:398)
at com.google.appengine.api.utils.FutureWrapper.wrapAndCache(FutureWrapper.java:53)
at com.google.appengine.api.utils.FutureWrapper.get(FutureWrapper.java:90)
at com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1.call(SimpleGcsInputChannelImpl.java:86)
... 58 more
What else needs to be done to read files stored with "gzip" content encoding in App Engine? (Fetching the Cloud Storage URL with curl from the client side works fine for both compressed and uncompressed files.)
This is the code that works for uncompressed object:
byte[] blobContent = new byte[0];
try
{
    GcsFileMetadata metaData = gcsService.getMetadata(fileName);
    int fileSize = (int) metaData.getLength();
    final int chunkSize = BlobstoreService.MAX_BLOB_FETCH_SIZE;
    LOG.info("content encoding: " + metaData.getOptions().getContentEncoding()); // "gzip" here
    LOG.info("input size " + fileSize); // the size is obviously the compressed size!
    for (long offset = 0; offset < fileSize;)
    {
        if (offset != 0)
        {
            LOG.info("Handling extra size for " + filePath + " at " + offset);
        }
        final int size = Math.min(chunkSize, fileSize);
        ByteBuffer result = ByteBuffer.allocate(size);
        GcsInputChannel readChannel = gcsService.openReadChannel(fileName, offset);
        try
        {
            readChannel.read(result); // <<<< here the exception was thrown
        }
        finally
        {
            ......
It is now compressed by:
GcsFilename filename = new GcsFilename(bucketName, filePath);
GcsFileOptions.Builder builder = new GcsFileOptions.Builder().mimeType(image_type);
builder = builder.contentEncoding("gzip");
GcsOutputChannel writeChannel = gcsService.createOrReplace(filename, builder.build());
ByteArrayOutputStream byteStream = new ByteArrayOutputStream(blob_content.length);
try
{
    GZIPOutputStream zipStream = new GZIPOutputStream(byteStream);
    try
    {
        zipStream.write(blob_content);
    }
    finally
    {
        zipStream.close();
    }
}
finally
{
    byteStream.close();
}
byte[] compressedData = byteStream.toByteArray();
writeChannel.write(ByteBuffer.wrap(compressedData));
The blob_content is compressed from 46483 bytes to 19823 bytes.
I think this is a bug in the Google code:
https://code.google.com/p/appengine-gcs-client/source/browse/trunk/java/src/main/java/com/google/appengine/tools/cloudstorage/oauth/OauthRawGcsService.java, L418:
Preconditions.checkState(content.length <= want, "%s: got %s > wanted %s", this, content.length, want);
The HTTPResponse has already decoded the blob, so the precondition is wrong here.
If I understand you correctly, you have to set the mimeType:
GcsFileOptions options = new GcsFileOptions.Builder().mimeType("text/html")
Google Cloud Storage does not compress or decompress objects:
https://developers.google.com/storage/docs/reference-headers?csw=1#contentencoding
I hope that's what you want to do.
Looking at your code, it seems like there is a mismatch between what is stored and what is read. The documentation specifies that compression is not done for you (https://developers.google.com/storage/docs/reference-headers?csw=1#contentencoding). You will need to do the actual compression manually.
Also, if you look at the implementation of the class that throws the exception (https://code.google.com/p/appengine-gcs-client/source/browse/trunk/java/src/main/java/com/google/appengine/tools/cloudstorage/oauth/OauthRawGcsService.java?r=81&spec=svn134), you will notice that you get the original contents back but you're actually expecting compressed content. Check the method readObjectAsync in the above-mentioned class.
It looks like the content persisted might not be gzipped, or the content length is not set properly. What you should do is verify the length of the compressed stream just before writing it into the channel. You should also verify that the content length is set correctly when doing the HTTP request. It would be useful to see the actual HTTP request headers and make sure that the content length header matches the actual content length in the HTTP response.
Also, it looks like contentEncoding could be set incorrectly. Try using .contentEncoding("Content-Encoding: gzip") as used in this TCK test. Although still the best thing to do is inspect the HTTP request and response. You can use Wireshark to do that easily.
Also, you need to make sure that the GcsOutputChannel is closed, as that's when the file is finalized.
Hope this puts you on the right track. To gzip your contents you can use Java's GZIPOutputStream.
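Putting those suggestions together, a sketch of the write path with the length check and the explicit close (untested; LOG, blob_content, and writeChannel are reused from the question's code):
// Compress manually, since GCS will not gzip the content for you.
ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
GZIPOutputStream zipStream = new GZIPOutputStream(byteStream);
try {
    zipStream.write(blob_content);
} finally {
    zipStream.close(); // flushes the gzip trailer
}
byte[] compressedData = byteStream.toByteArray();
// Verify the length actually being written, per the advice above.
LOG.info("uncompressed " + blob_content.length + " -> compressed " + compressedData.length);
writeChannel.write(ByteBuffer.wrap(compressedData));
writeChannel.close(); // the object is only finalized when the channel is closed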
I'm seeing the same issue, easily reproducible by uploading a file with "gsutil cp -Z" and then trying to open it with the following:
ByteArrayOutputStream output = new ByteArrayOutputStream();
try (GcsInputChannel readChannel = svc.openReadChannel(filename, 0)) {
    try (InputStream input = Channels.newInputStream(readChannel)) {
        IOUtils.copy(input, output);
    }
}
This causes an exception like this:
java.lang.IllegalStateException:
....oauth.OauthRawGcsService$2@1883798: got 64303 > wanted 4096
at ....Preconditions.checkState(Preconditions.java:199)
at ....oauth.OauthRawGcsService$2.wrap(OauthRawGcsService.java:519)
at ....oauth.OauthRawGcsService$2.wrap(OauthRawGcsService.java:499)
The only workaround I've found is to read the entire file into memory using readChannel.read:
int fileSize = 64303;
ByteBuffer result = ByteBuffer.allocate(fileSize);
try (GcsInputChannel readChannel = gcs.openReadChannel(new GcsFilename("mybucket", "mygzippedfile.xml"), 0)) {
    readChannel.read(result);
}
Unfortunately, this only works if the size of the ByteBuffer is greater than or equal to the uncompressed size of the file, which is not possible to get via the API.
I've also posted my comment to an issue registered with Google: https://code.google.com/p/googleappengine/issues/detail?id=10445
This is my function for reading compressed gzip files:
public byte[] getUpdate(String fileName) throws IOException
{
    GcsFilename fileNameObj = new GcsFilename(defaultBucketName, fileName);
    try (GcsInputChannel readChannel = gcsService.openReadChannel(fileNameObj, 0))
    {
        maxSizeBuffer.clear();
        readChannel.read(maxSizeBuffer);
    }
    // Copy out only the bytes actually read; array() alone would return
    // the entire backing array, padded with stale data.
    maxSizeBuffer.flip();
    byte[] result = new byte[maxSizeBuffer.remaining()];
    maxSizeBuffer.get(result);
    return result;
}
The crux is that you cannot use the size of the saved file, because Google Cloud Storage will hand you the content at its original (uncompressed) size; the client then compares the size it expected with the real size, and they are different:
Preconditions.checkState(content.length <= want, "%s: got %s > wanted
%s", this, content.length, want);
So I solved it by allocating the biggest buffer possible for these files, using BlobstoreService.MAX_BLOB_FETCH_SIZE. Note that maxSizeBuffer is allocated only once, outside the function:
ByteBuffer maxSizeBuffer = ByteBuffer.allocate(BlobstoreService.MAX_BLOB_FETCH_SIZE);
And with maxSizeBuffer.clear(), the buffer is reset for each new read.

Getting path of audio file from sdcard

In my app I tried to pass the file path from one activity to another activity using an Intent. In my receiving activity I got the file path as "null", but when I print the file in the first activity it prints the path. From my second activity I attach that file to mail using GmailSender. This is the code I tried:
private void startRecord()
{
    File file = new File(Environment.getExternalStorageDirectory(), "test.pcm");
    try
    {
        file.createNewFile();
        OutputStream outputStream = new FileOutputStream(file);
        BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(outputStream);
        DataOutputStream dataOutputStream = new DataOutputStream(bufferedOutputStream);
        int minBufferSize = AudioRecord.getMinBufferSize(8000,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        short[] audioData = new short[minBufferSize];
        AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                8000,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                minBufferSize);
        audioRecord.startRecording();
        while (recording)
        {
            int numberOfShort = audioRecord.read(audioData, 0, minBufferSize);
            for (int i = 0; i < numberOfShort; i++)
            {
                dataOutputStream.writeShort(audioData[i]);
            }
        }
        audioRecord.stop();
        audioRecord.release();
        dataOutputStream.close();
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }
    String audiofile;
    audiofile = file.getAbsolutePath();
    System.out.println("File Path::::" + audiofile);
}
The Intent is:
Intent sigout = new Intent(getApplicationContext(), WeeklyendActivity.class);
sigout.putExtra("mnt/sdcard-test.pcm", audiofile);
startActivity(sigout);
In my receiving activity:
String patty = getIntent().getStringExtra("mnt/sdcard-text.pcm");
System.out.println("paathhhy frfom ::" + patty);
It prints null. Can anyone help me get the file path? One more thing: I am not sure whether the audio is saved to that file correctly.
Thanks in advance!
Based on your information that audiofile is a variable of type File, when you do this:
sigout.putExtra("mnt/sdcard-test.pcm", audiofile);
you are putting a File object in the extras Bundle. Then, when you try to get the extra from the Bundle, you do this:
String patty = getIntent().getStringExtra("mnt/sdcard-text.pcm");
However, the object in this extra is of type File, not type String. This is why you are getting null.
Note also that the two keys don't even match: the extra is put with "mnt/sdcard-test.pcm" but read with "mnt/sdcard-text.pcm" ("test" vs "text"), which would return null on its own.
If you only want to pass the name of the file, then put the extra like this:
sigout.putExtra("mnt/sdcard-test.pcm", audiofile.getAbsolutePath());