Akka file streaming throws akka.http.scaladsl.model.EntityStreamException: Entity stream truncation - akka-stream

We are streaming a file from S3 and processing it; after processing completes, we upload the file back to S3 as an error/archive file. While streaming from S3, it streams some data and then stops mid-process with the error "akka.http.scaladsl.model.EntityStreamException: Entity stream truncation". I'm not sure whether this depends on the size of the file streamed from S3 or on a corrupt file?
val source = s3Client.download(baseConfig.bucketName.get, content.key)._1
  .via(Gzip.decoderFlow)
  .via(Framing.delimiter(ByteString("\n"), 256, byeFormatterFlag)
    .map(_.utf8String))
val flow = flowDefintion(list)
val resp = source.via(flow).runWith(Sink.seq)
akka {
  loglevel = "INFO"
  stdout-loglevel = "INFO"
  logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
  http {
    routing {
      decode-max-size = 25m
    }
    parsing {
      max-to-strict-bytes = 20m
      max-content-length = 20m
      max-chunk-size = 10m
    }
  }
}


Kotlin, how can I base64Encode the file I get back from the OnActivityResult

I need to implement the logic for adding attachments to an email client.
I have all the needed permissions, and I create an intent for files:
val intent = Intent(Intent.ACTION_GET_CONTENT)
intent.type = "*/*"
activity?.startActivityForResult(intent, ActivityResultHandler.PICK_FILE)
I choose a file, and I get result ok in my onActivityResult:
public override fun onActivityResult(reqCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(reqCode, resultCode, data)
    ActivityResultHandler().onActivityResult(this, reqCode, resultCode, data)
    if (reqCode == REQUEST_CODE_SET_DEFAULT_DIALER) {
        activityToFragmentCommunicationCallback?.sendData("refresh")
    } else if (reqCode == ActivityResultHandler.PICK_FILE && resultCode == RESULT_OK && data != null) {
        val uri = data.data
        val cr = this.contentResolver
        val mime = cr.getType(uri)
        var file = uri.toFile()
        if (file.exists()) {
            var base64 = convertToBase64(file)
            DialogFullScreenEmailComposer.addAttachment()
        }
    }
}
But then it crashes at this: var file = uri.toFile()
Telling me that:
2019-09-18 16:12:12.300 20608-20608/com.xelion.android.debug E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.xelion.android.debug, PID: 20608
java.lang.RuntimeException: Failure delivering result ResultInfo{who=null, request=3, result=-1, data=Intent { dat=content://com.android.providers.downloads.documents/document/raw:/storage/emulated/0/Download/106.apk flg=0x1 }} to activity {com.xelion.android.debug/com.xelion.android.activity.MainActivity}: java.lang.IllegalArgumentException: Uri lacks 'file' scheme: content://com.android.providers.downloads.documents/document/raw%3A%2Fstorage%2Femulated%2F0%2FDownload%2F106.apk
at android.app.ActivityThread.deliverResults(ActivityThread.java:4398)
at android.app.ActivityThread.handleSendResult(ActivityThread.java:4440)
at android.app.servertransaction.ActivityResultItem.execute(ActivityResultItem.java:49)
at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:108)
at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:68)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1816)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loop(Looper.java:193)
at android.app.ActivityThread.main(ActivityThread.java:6718)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:493)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:858)
Caused by: java.lang.IllegalArgumentException: Uri lacks 'file' scheme: content://com.android.providers.downloads.documents/document/raw%3A%2Fstorage%2Femulated%2F0%2FDownload%2F106.apk
at androidx.core.net.UriKt.toFile(Uri.kt:40)
at com.xelion.android.activity.MainActivity.onActivityResult(MainActivity.kt:312)
at android.app.Activity.dispatchActivityResult(Activity.java:7462)
at android.app.ActivityThread.deliverResults(ActivityThread.java:4391)
at android.app.ActivityThread.handleSendResult(ActivityThread.java:4440) 
at android.app.servertransaction.ActivityResultItem.execute(ActivityResultItem.java:49) 
at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:108) 
at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:68) 
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1816) 
at android.os.Handler.dispatchMessage(Handler.java:106) 
at android.os.Looper.loop(Looper.java:193) 
at android.app.ActivityThread.main(ActivityThread.java:6718) 
at java.lang.reflect.Method.invoke(Native Method) 
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:493) 
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:858) 
What can be done so that it can create my file? Or can I skip that step and just create the Base64-encoded content directly?
The error message states that the Uri isn't pointing to a file (it doesn't start with file://). In newer Android versions you don't have direct access to the file you are requesting, but you can read its content through the ContentResolver. You can do that by opening an InputStream with cr.openInputStream(uri). When you have it you can read from that stream and convert the result to Base64.
Here's an example of how to read the entire content of the file into a ByteArray (note that only the bytes actually returned by each read() call are written out, otherwise stale buffer bytes would corrupt the result):
private fun readFile(cr: ContentResolver, uri: Uri): ByteArray {
    val inStream = cr.openInputStream(uri) ?: return ByteArray(0)
    val outStream = ByteArrayOutputStream()
    val buffer = ByteArray(8192)
    var len = inStream.read(buffer)
    while (len > 0) {
        outStream.write(buffer, 0, len)
        len = inStream.read(buffer)
    }
    inStream.close()
    return outStream.toByteArray()
}
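The stream-to-Base64 step described above can be sketched in plain Java with only the standard library (the class and method names here are hypothetical; in the Android code the InputStream would come from cr.openInputStream(uri)):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Base64;

public class StreamBase64 {
    // Read an InputStream fully, then Base64-encode the collected bytes.
    static String toBase64(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[8192];
        int len;
        while ((len = in.read(buffer)) > 0) {
            out.write(buffer, 0, len); // write only the bytes actually read
        }
        return Base64.getEncoder().encodeToString(out.toByteArray());
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("hello".getBytes("UTF-8"));
        System.out.println(toBase64(in)); // aGVsbG8=
    }
}
```

On Android, java.util.Base64 requires API 26+; android.util.Base64 offers the same encoding on older versions.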

How to get native code of flink from apache beam build file?

I defined a pipeline for stream processing with Apache Beam and built it for the Flink runner.
I got a jar file, but when I decompiled the class files, the Java code is not native Flink code.
Though I found some generated code in my class:
// $FF: synthetic method
private static Object $deserializeLambda$(SerializedLambda lambda) {
    String var1 = lambda.getImplMethodName();
    byte var2 = -1;
    switch(var1.hashCode()) {
    case -30470307:
        if (var1.equals("lambda$main$fd9fc9ef$1")) {
            var2 = 1;
        }
        break;
    case -30470306:
        if (var1.equals("lambda$main$fd9fc9ef$2")) {
            var2 = 0;
        }
    }
Is there any generated file with Flink code?
Regards,
Ali

PDF opening error after retrieving it from database to virtual folder

I am trying to retrieve binary data [PDF] from SQL Server into my virtual directory:
string filePaths = System.Web.HttpContext.Current.Server.MapPath("~/TempPDF/");
And I have the following code to write the data into filePath. It downloads fine, but when I try to open the PDF I get the error: "Adobe Reader could not open 'FileName.pdf' because it is either not a supported file type or because the file has been damaged".
My code:
string last = fileName.Substring(fileName.LastIndexOf('.') + 1);
if (last == "pdf")
{
    using (System.IO.FileStream fs = new System.IO.FileStream(filePaths + fileName, System.IO.FileMode.CreateNew))
    {
        // use a binary writer to write the bytes to disk
        using (System.IO.BinaryWriter bw = new System.IO.BinaryWriter(fs))
        {
            bw.Write(Data, 0, Data.Length);
            //bw.Write(Data);
            bw.Flush();
            bw.Close();
        }
    }
}

read cloud storage content with "gzip" encoding for "application/octet-stream" type content

We're using the "Google Cloud Storage Client Library" for App Engine, with simply "GcsFileOptions.Builder.contentEncoding("gzip")" at file creation time; we get the following problem when reading the file:
com.google.appengine.tools.cloudstorage.NonRetriableException: java.lang.RuntimeException: com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1#1c07d21: Unexpected cause of ExecutionException
at com.google.appengine.tools.cloudstorage.RetryHelper.doRetry(RetryHelper.java:87)
at com.google.appengine.tools.cloudstorage.RetryHelper.runWithRetries(RetryHelper.java:129)
at com.google.appengine.tools.cloudstorage.RetryHelper.runWithRetries(RetryHelper.java:123)
at com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl.read(SimpleGcsInputChannelImpl.java:81)
...
Caused by: java.lang.RuntimeException: com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1#1c07d21: Unexpected cause of ExecutionException
at com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1.call(SimpleGcsInputChannelImpl.java:101)
at com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1.call(SimpleGcsInputChannelImpl.java:81)
at com.google.appengine.tools.cloudstorage.RetryHelper.doRetry(RetryHelper.java:75)
... 56 more
Caused by: java.lang.IllegalStateException: com.google.appengine.tools.cloudstorage.oauth.OauthRawGcsService$2#1d8c25d: got 46483 > wanted 19823
at com.google.common.base.Preconditions.checkState(Preconditions.java:177)
at com.google.appengine.tools.cloudstorage.oauth.OauthRawGcsService$2.wrap(OauthRawGcsService.java:418)
at com.google.appengine.tools.cloudstorage.oauth.OauthRawGcsService$2.wrap(OauthRawGcsService.java:398)
at com.google.appengine.api.utils.FutureWrapper.wrapAndCache(FutureWrapper.java:53)
at com.google.appengine.api.utils.FutureWrapper.get(FutureWrapper.java:90)
at com.google.appengine.tools.cloudstorage.SimpleGcsInputChannelImpl$1.call(SimpleGcsInputChannelImpl.java:86)
... 58 more
What else should be added to read files with "gzip" compression so that the content can be read in App Engine? (Fetching the Cloud Storage URL with curl from the client side works fine for both compressed and uncompressed files.)
This is the code that works for uncompressed object:
byte[] blobContent = new byte[0];
try
{
    GcsFileMetadata metaData = gcsService.getMetadata(fileName);
    int fileSize = (int) metaData.getLength();
    final int chunkSize = BlobstoreService.MAX_BLOB_FETCH_SIZE;
    LOG.info("content encoding: " + metaData.getOptions().getContentEncoding()); // "gzip" here
    LOG.info("input size " + fileSize); // the size is obviously the compressed size!
    for (long offset = 0; offset < fileSize;)
    {
        if (offset != 0)
        {
            LOG.info("Handling extra size for " + filePath + " at " + offset);
        }
        final int size = Math.min(chunkSize, fileSize);
        ByteBuffer result = ByteBuffer.allocate(size);
        GcsInputChannel readChannel = gcsService.openReadChannel(fileName, offset);
        try
        {
            readChannel.read(result); // <<<< here the exception was thrown
        }
        finally
        {
            ......
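As an aside, a chunked-read loop should size each chunk from the bytes remaining, not from the total size (the snippet above computes Math.min(chunkSize, fileSize) without subtracting the offset). A minimal sketch over a plain InputStream, using only the standard library (class and method names are hypothetical):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ChunkedRead {
    // Read fileSize bytes from the stream in chunks of at most chunkSize.
    static byte[] readChunked(InputStream in, int fileSize, int chunkSize) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (int offset = 0; offset < fileSize; ) {
            // Size each chunk from the bytes remaining, not the total size.
            int size = Math.min(chunkSize, fileSize - offset);
            byte[] chunk = new byte[size];
            int read = in.read(chunk, 0, size);
            if (read <= 0) break; // stream ended early
            out.write(chunk, 0, read);
            offset += read;
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        byte[] copy = readChunked(new ByteArrayInputStream(data), data.length, 4);
        System.out.println(copy.length); // 10
    }
}
```

This loop alone does not fix the GCS size-mismatch exception, which stems from the compressed vs. uncompressed length discrepancy discussed in the answers below; it only illustrates correct offset handling.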
It is now compressed by:
GcsFilename filename = new GcsFilename(bucketName, filePath);
GcsFileOptions.Builder builder = new GcsFileOptions.Builder().mimeType(image_type);
builder = builder.contentEncoding("gzip");
GcsOutputChannel writeChannel = gcsService.createOrReplace(filename, builder.build());
ByteArrayOutputStream byteStream = new ByteArrayOutputStream(blob_content.length);
try
{
    GZIPOutputStream zipStream = new GZIPOutputStream(byteStream);
    try
    {
        zipStream.write(blob_content);
    }
    finally
    {
        zipStream.close();
    }
}
finally
{
    byteStream.close();
}
byte[] compressedData = byteStream.toByteArray();
writeChannel.write(ByteBuffer.wrap(compressedData));
The blob_content is compressed from 46483 bytes to 19823 bytes.
I think it is a bug in the Google code:
https://code.google.com/p/appengine-gcs-client/source/browse/trunk/java/src/main/java/com/google/appengine/tools/cloudstorage/oauth/OauthRawGcsService.java, L418:
    Preconditions.checkState(content.length <= want, "%s: got %s > wanted %s", this, content.length, want);
The HTTPResponse has already decoded the blob, so the precondition is wrong here.
If I understand correctly, you have to set the mimeType:
GcsFileOptions options = new GcsFileOptions.Builder().mimeType("text/html")
Google Cloud Storage does not compress or decompress objects:
https://developers.google.com/storage/docs/reference-headers?csw=1#contentencoding
I hope that's what you want to do.
Looking at your code it seems like there is a mismatch between what is stored and what is read. The documentation specifies that compression is not done for you (https://developers.google.com/storage/docs/reference-headers?csw=1#contentencoding). You will need to do the actual compression manually.
Also if you look at the implementation of the class that throws the exception (https://code.google.com/p/appengine-gcs-client/source/browse/trunk/java/src/main/java/com/google/appengine/tools/cloudstorage/oauth/OauthRawGcsService.java?r=81&spec=svn134) you will notice that you get the original contents back but you're actually expecting compressed content. Check the method readObjectAsync in the above mentioned class.
It looks like the content persisted might not be gzipped or the content-length is not set properly. What you should do is verify length of the compressed stream just before writing it into the channel. You should also verify that the content length is set correctly when doing the http request. It would be useful to see the actual http request headers and make sure that content length header matches the actual content length in the http response.
Also it looks like contentEncoding could be set incorrectly. Try using:.contentEncoding("Content-Encoding: gzip") as used in this TCK test. Although still the best thing to do is inspect the HTTP request and response. You can use wireshark to do that easily.
Also you need to make sure that GCSOutputChannel is closed as that's when the file is finalized.
Hope this puts you on the right track. To gzip your contents you can use java GZIPInputStream.
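To illustrate the manual compression the answer above recommends, here is a minimal round trip with the standard java.util.zip classes (class and method names are hypothetical; the compressed byte count is simply whatever GZIP produces for the sample input):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Arrays;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    // Compress a byte array with GZIP.
    static byte[] gzip(byte[] data) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream zip = new GZIPOutputStream(out)) {
            zip.write(data);
        } // closing the stream finalizes the GZIP trailer
        return out.toByteArray();
    }

    // Decompress a GZIP byte array back to the original bytes.
    static byte[] gunzip(byte[] data) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPInputStream zip = new GZIPInputStream(new ByteArrayInputStream(data))) {
            byte[] buffer = new byte[8192];
            int len;
            while ((len = zip.read(buffer)) > 0) {
                out.write(buffer, 0, len);
            }
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] original = "some repetitive content content content".getBytes();
        byte[] compressed = gzip(original);
        byte[] restored = gunzip(compressed);
        // Compressed and original lengths differ, which is exactly the
        // size mismatch the GCS precondition complains about.
        System.out.println(Arrays.equals(original, restored)); // true
    }
}
```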
I'm seeing the same issue, easily reproducible by uploading a file with "gsutil cp -Z", then trying to open it with the following:
ByteArrayOutputStream output = new ByteArrayOutputStream();
try (GcsInputChannel readChannel = svc.openReadChannel(filename, 0)) {
    try (InputStream input = Channels.newInputStream(readChannel))
    {
        IOUtils.copy(input, output);
    }
}
This causes an exception like this:
java.lang.IllegalStateException:
....oauth.OauthRawGcsService$2#1883798: got 64303 > wanted 4096
at ....Preconditions.checkState(Preconditions.java:199)
at ....oauth.OauthRawGcsService$2.wrap(OauthRawGcsService.java:519)
at ....oauth.OauthRawGcsService$2.wrap(OauthRawGcsService.java:499)
The only workaround I've found is to read the entire file into memory using readChannel.read:
int fileSize = 64303;
ByteBuffer result = ByteBuffer.allocate(fileSize);
try (GcsInputChannel readChannel = gcs.openReadChannel(new GcsFilename("mybucket", "mygzippedfile.xml"), 0)) {
    readChannel.read(result);
}
Unfortunately, this only works if the size of the ByteBuffer is greater than or equal to the uncompressed size of the file, which is not possible to get via the API.
I've also posted my comment to an issue registered with google: https://code.google.com/p/googleappengine/issues/detail?id=10445
This is my function for reading compressed gzip files:
public byte[] getUpdate(String fileName) throws IOException
{
    GcsFilename fileNameObj = new GcsFilename(defaultBucketName, fileName);
    try (GcsInputChannel readChannel = gcsService.openReadChannel(fileNameObj, 0))
    {
        maxSizeBuffer.clear();
        readChannel.read(maxSizeBuffer);
    }
    byte[] result = maxSizeBuffer.array();
    return result;
}
The core issue is that you cannot use the size of the saved file, because Google Storage reports the original (uncompressed) size; the library compares the size it expected with the real size, and they are different:
    Preconditions.checkState(content.length <= want, "%s: got %s > wanted %s", this, content.length, want);
So I solved it by allocating the biggest amount possible for these files using BlobstoreService.MAX_BLOB_FETCH_SIZE. maxSizeBuffer is allocated only once, outside of the function:
    ByteBuffer maxSizeBuffer = ByteBuffer.allocate(BlobstoreService.MAX_BLOB_FETCH_SIZE);
And with maxSizeBuffer.clear() all data is flushed again.
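The allocate-once-then-clear() pattern used above can be demonstrated with the standard java.nio.ByteBuffer alone (the values here are illustrative, not from the GCS API):

```java
import java.nio.ByteBuffer;

public class BufferReuse {
    public static void main(String[] args) {
        // Allocate one buffer up front and reuse it across reads,
        // as the answer above does with MAX_BLOB_FETCH_SIZE.
        ByteBuffer buffer = ByteBuffer.allocate(16);

        buffer.put("first read".getBytes());
        System.out.println(buffer.position()); // 10 bytes written

        // clear() resets position and limit so the buffer can be reused;
        // it does not erase the bytes, it just makes them overwritable.
        buffer.clear();
        System.out.println(buffer.position()); // 0
    }
}
```

One caveat: after a partial read, array() still returns the full backing array, so callers should track how many bytes were actually read rather than using the array length.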

Can't read Google Storage Cloud file in upload success handler

I'm trying to get the file content from the 'upload success' handler in GAE. The file is uploaded to the URL:
blobstoreService.createUploadUrl("/onupload", uploadOptions));
So, in /onupload, I do:
BlobKey myFile = context.getRequestBlobs().get("myFile").get(0);
and then I've tried:
InputStream is = new BlobstoreInputStream(myFile);
// .. read the stream
which failed with com.google.appengine.api.blobstore.BlobstoreInputStream$BlobstoreIOException: BlobstoreInputStream received an invalid blob key: =?ISO-8859-1?Q?AMIfv96J=2DsyIbhm5=5FET?=
and
FileReadChannel ch = fileService.openReadChannel(myFile, false);
which failed with
java.io.IOException
at com.google.appengine.api.files.FileServiceImpl.translateException(FileServiceImpl.java:615)
at com.google.appengine.api.files.FileServiceImpl.makeSyncCall(FileServiceImpl.java:588)
at com.google.appengine.api.files.FileServiceImpl.open(FileServiceImpl.java:521)
at com.google.appengine.api.files.FileServiceImpl.openForRead(FileServiceImpl.java:481)
at com.google.appengine.api.files.FileServiceImpl.openForRead(FileServiceImpl.java:473)
at com.google.appengine.api.files.FileServiceImpl.openReadChannel(FileServiceImpl.java:197)
Any thoughts on what I am doing wrong, and is it possible at all to read the file's content in the upload handler?
Note that for the blobstore filesystem (not GS) it was working fine.
I think you are trying to read a file from Google Cloud Storage.
Have you seen the example [1] in the docs?
Particularly this part:
// At this point, the file is visible in App Engine as:
// "/gs/BUCKETNAME/FILENAME"
// and to anybody on the Internet through Cloud Storage as:
// (http://storage.googleapis.com/BUCKETNAME/FILENAME)
// We can now read the file through the API:
String filename = "/gs/" + BUCKETNAME + "/" + FILENAME;
AppEngineFile readableFile = new AppEngineFile(filename);
FileReadChannel readChannel =
    fileService.openReadChannel(readableFile, false);
// Again, different standard Java ways of reading from the channel.
BufferedReader reader =
    new BufferedReader(Channels.newReader(readChannel, "UTF8"));
String line = reader.readLine();
resp.getWriter().println("READ:" + line);
// line = "The woods are lovely, dark, and deep."
readChannel.close();
[1] https://developers.google.com/appengine/docs/java/googlestorage/overview#Complete_Sample_App
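The channel-to-reader pattern in that sample works with any ReadableByteChannel, so it can be sketched with only the standard library (the class and the sample text here are hypothetical; in the GAE code the channel would come from fileService.openReadChannel):

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;

public class ChannelRead {
    // Read the first line from a channel, mirroring the
    // BufferedReader + Channels.newReader pattern in the snippet above.
    static String firstLine(ReadableByteChannel channel) throws IOException {
        BufferedReader reader =
            new BufferedReader(Channels.newReader(channel, "UTF8"));
        return reader.readLine();
    }

    public static void main(String[] args) throws IOException {
        ReadableByteChannel channel = Channels.newChannel(
            new ByteArrayInputStream("The woods are lovely\nsecond line".getBytes("UTF-8")));
        System.out.println(firstLine(channel)); // The woods are lovely
    }
}
```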
